Oct  2 06:45:40 np0005465988 kernel: Linux version 5.14.0-620.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Fri Sep 26 01:13:23 UTC 2025
Oct  2 06:45:40 np0005465988 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct  2 06:45:40 np0005465988 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  2 06:45:40 np0005465988 kernel: BIOS-provided physical RAM map:
Oct  2 06:45:40 np0005465988 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct  2 06:45:40 np0005465988 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct  2 06:45:40 np0005465988 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct  2 06:45:40 np0005465988 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct  2 06:45:40 np0005465988 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct  2 06:45:40 np0005465988 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct  2 06:45:40 np0005465988 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct  2 06:45:40 np0005465988 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Oct  2 06:45:40 np0005465988 kernel: NX (Execute Disable) protection: active
Oct  2 06:45:40 np0005465988 kernel: APIC: Static calls initialized
Oct  2 06:45:40 np0005465988 kernel: SMBIOS 2.8 present.
Oct  2 06:45:40 np0005465988 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct  2 06:45:40 np0005465988 kernel: Hypervisor detected: KVM
Oct  2 06:45:40 np0005465988 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct  2 06:45:40 np0005465988 kernel: kvm-clock: using sched offset of 9248419340 cycles
Oct  2 06:45:40 np0005465988 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct  2 06:45:40 np0005465988 kernel: tsc: Detected 2800.000 MHz processor
Oct  2 06:45:40 np0005465988 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Oct  2 06:45:40 np0005465988 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct  2 06:45:40 np0005465988 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct  2 06:45:40 np0005465988 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct  2 06:45:40 np0005465988 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct  2 06:45:40 np0005465988 kernel: Using GB pages for direct mapping
Oct  2 06:45:40 np0005465988 kernel: RAMDISK: [mem 0x2d7c4000-0x32bd9fff]
Oct  2 06:45:40 np0005465988 kernel: ACPI: Early table checksum verification disabled
Oct  2 06:45:40 np0005465988 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct  2 06:45:40 np0005465988 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:45:40 np0005465988 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:45:40 np0005465988 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:45:40 np0005465988 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct  2 06:45:40 np0005465988 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:45:40 np0005465988 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:45:40 np0005465988 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct  2 06:45:40 np0005465988 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct  2 06:45:40 np0005465988 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct  2 06:45:40 np0005465988 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct  2 06:45:40 np0005465988 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct  2 06:45:40 np0005465988 kernel: No NUMA configuration found
Oct  2 06:45:40 np0005465988 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Oct  2 06:45:40 np0005465988 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Oct  2 06:45:40 np0005465988 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Oct  2 06:45:40 np0005465988 kernel: Zone ranges:
Oct  2 06:45:40 np0005465988 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct  2 06:45:40 np0005465988 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct  2 06:45:40 np0005465988 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Oct  2 06:45:40 np0005465988 kernel:  Device   empty
Oct  2 06:45:40 np0005465988 kernel: Movable zone start for each node
Oct  2 06:45:40 np0005465988 kernel: Early memory node ranges
Oct  2 06:45:40 np0005465988 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct  2 06:45:40 np0005465988 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct  2 06:45:40 np0005465988 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Oct  2 06:45:40 np0005465988 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Oct  2 06:45:40 np0005465988 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct  2 06:45:40 np0005465988 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct  2 06:45:40 np0005465988 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct  2 06:45:40 np0005465988 kernel: ACPI: PM-Timer IO Port: 0x608
Oct  2 06:45:40 np0005465988 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct  2 06:45:40 np0005465988 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct  2 06:45:40 np0005465988 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct  2 06:45:40 np0005465988 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct  2 06:45:40 np0005465988 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct  2 06:45:40 np0005465988 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct  2 06:45:40 np0005465988 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct  2 06:45:40 np0005465988 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct  2 06:45:40 np0005465988 kernel: TSC deadline timer available
Oct  2 06:45:40 np0005465988 kernel: CPU topo: Max. logical packages:   8
Oct  2 06:45:40 np0005465988 kernel: CPU topo: Max. logical dies:       8
Oct  2 06:45:40 np0005465988 kernel: CPU topo: Max. dies per package:   1
Oct  2 06:45:40 np0005465988 kernel: CPU topo: Max. threads per core:   1
Oct  2 06:45:40 np0005465988 kernel: CPU topo: Num. cores per package:     1
Oct  2 06:45:40 np0005465988 kernel: CPU topo: Num. threads per package:   1
Oct  2 06:45:40 np0005465988 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct  2 06:45:40 np0005465988 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct  2 06:45:40 np0005465988 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct  2 06:45:40 np0005465988 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct  2 06:45:40 np0005465988 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct  2 06:45:40 np0005465988 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct  2 06:45:40 np0005465988 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct  2 06:45:40 np0005465988 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct  2 06:45:40 np0005465988 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct  2 06:45:40 np0005465988 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct  2 06:45:40 np0005465988 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct  2 06:45:40 np0005465988 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct  2 06:45:40 np0005465988 kernel: Booting paravirtualized kernel on KVM
Oct  2 06:45:40 np0005465988 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct  2 06:45:40 np0005465988 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct  2 06:45:40 np0005465988 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct  2 06:45:40 np0005465988 kernel: kvm-guest: PV spinlocks disabled, no host support
Oct  2 06:45:40 np0005465988 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  2 06:45:40 np0005465988 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64", will be passed to user space.
Oct  2 06:45:40 np0005465988 kernel: random: crng init done
Oct  2 06:45:40 np0005465988 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct  2 06:45:40 np0005465988 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Oct  2 06:45:40 np0005465988 kernel: Fallback order for Node 0: 0 
Oct  2 06:45:40 np0005465988 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Oct  2 06:45:40 np0005465988 kernel: Policy zone: Normal
Oct  2 06:45:40 np0005465988 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct  2 06:45:40 np0005465988 kernel: software IO TLB: area num 8.
Oct  2 06:45:40 np0005465988 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct  2 06:45:40 np0005465988 kernel: ftrace: allocating 49370 entries in 193 pages
Oct  2 06:45:40 np0005465988 kernel: ftrace: allocated 193 pages with 3 groups
Oct  2 06:45:40 np0005465988 kernel: Dynamic Preempt: voluntary
Oct  2 06:45:40 np0005465988 kernel: rcu: Preemptible hierarchical RCU implementation.
Oct  2 06:45:40 np0005465988 kernel: rcu: 	RCU event tracing is enabled.
Oct  2 06:45:40 np0005465988 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct  2 06:45:40 np0005465988 kernel: 	Trampoline variant of Tasks RCU enabled.
Oct  2 06:45:40 np0005465988 kernel: 	Rude variant of Tasks RCU enabled.
Oct  2 06:45:40 np0005465988 kernel: 	Tracing variant of Tasks RCU enabled.
Oct  2 06:45:40 np0005465988 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct  2 06:45:40 np0005465988 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct  2 06:45:40 np0005465988 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  2 06:45:40 np0005465988 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  2 06:45:40 np0005465988 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  2 06:45:40 np0005465988 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct  2 06:45:40 np0005465988 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct  2 06:45:40 np0005465988 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct  2 06:45:40 np0005465988 kernel: Console: colour VGA+ 80x25
Oct  2 06:45:40 np0005465988 kernel: printk: console [ttyS0] enabled
Oct  2 06:45:40 np0005465988 kernel: ACPI: Core revision 20230331
Oct  2 06:45:40 np0005465988 kernel: APIC: Switch to symmetric I/O mode setup
Oct  2 06:45:40 np0005465988 kernel: x2apic enabled
Oct  2 06:45:40 np0005465988 kernel: APIC: Switched APIC routing to: physical x2apic
Oct  2 06:45:40 np0005465988 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct  2 06:45:40 np0005465988 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Oct  2 06:45:40 np0005465988 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct  2 06:45:40 np0005465988 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct  2 06:45:40 np0005465988 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct  2 06:45:40 np0005465988 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct  2 06:45:40 np0005465988 kernel: Spectre V2 : Mitigation: Retpolines
Oct  2 06:45:40 np0005465988 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct  2 06:45:40 np0005465988 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct  2 06:45:40 np0005465988 kernel: RETBleed: Mitigation: untrained return thunk
Oct  2 06:45:40 np0005465988 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct  2 06:45:40 np0005465988 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct  2 06:45:40 np0005465988 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct  2 06:45:40 np0005465988 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct  2 06:45:40 np0005465988 kernel: x86/bugs: return thunk changed
Oct  2 06:45:40 np0005465988 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct  2 06:45:40 np0005465988 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct  2 06:45:40 np0005465988 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct  2 06:45:40 np0005465988 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct  2 06:45:40 np0005465988 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct  2 06:45:40 np0005465988 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct  2 06:45:40 np0005465988 kernel: Freeing SMP alternatives memory: 40K
Oct  2 06:45:40 np0005465988 kernel: pid_max: default: 32768 minimum: 301
Oct  2 06:45:40 np0005465988 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct  2 06:45:40 np0005465988 kernel: landlock: Up and running.
Oct  2 06:45:40 np0005465988 kernel: Yama: becoming mindful.
Oct  2 06:45:40 np0005465988 kernel: SELinux:  Initializing.
Oct  2 06:45:40 np0005465988 kernel: LSM support for eBPF active
Oct  2 06:45:40 np0005465988 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct  2 06:45:40 np0005465988 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct  2 06:45:40 np0005465988 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct  2 06:45:40 np0005465988 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct  2 06:45:40 np0005465988 kernel: ... version:                0
Oct  2 06:45:40 np0005465988 kernel: ... bit width:              48
Oct  2 06:45:40 np0005465988 kernel: ... generic registers:      6
Oct  2 06:45:40 np0005465988 kernel: ... value mask:             0000ffffffffffff
Oct  2 06:45:40 np0005465988 kernel: ... max period:             00007fffffffffff
Oct  2 06:45:40 np0005465988 kernel: ... fixed-purpose events:   0
Oct  2 06:45:40 np0005465988 kernel: ... event mask:             000000000000003f
Oct  2 06:45:40 np0005465988 kernel: signal: max sigframe size: 1776
Oct  2 06:45:40 np0005465988 kernel: rcu: Hierarchical SRCU implementation.
Oct  2 06:45:40 np0005465988 kernel: rcu: 	Max phase no-delay instances is 400.
Oct  2 06:45:40 np0005465988 kernel: smp: Bringing up secondary CPUs ...
Oct  2 06:45:40 np0005465988 kernel: smpboot: x86: Booting SMP configuration:
Oct  2 06:45:40 np0005465988 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct  2 06:45:40 np0005465988 kernel: smp: Brought up 1 node, 8 CPUs
Oct  2 06:45:40 np0005465988 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Oct  2 06:45:40 np0005465988 kernel: node 0 deferred pages initialised in 26ms
Oct  2 06:45:40 np0005465988 kernel: Memory: 7765388K/8388068K available (16384K kernel code, 5784K rwdata, 13996K rodata, 4068K init, 7304K bss, 616504K reserved, 0K cma-reserved)
Oct  2 06:45:40 np0005465988 kernel: devtmpfs: initialized
Oct  2 06:45:40 np0005465988 kernel: x86/mm: Memory block size: 128MB
Oct  2 06:45:40 np0005465988 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct  2 06:45:40 np0005465988 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct  2 06:45:40 np0005465988 kernel: pinctrl core: initialized pinctrl subsystem
Oct  2 06:45:40 np0005465988 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct  2 06:45:40 np0005465988 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Oct  2 06:45:40 np0005465988 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct  2 06:45:40 np0005465988 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct  2 06:45:40 np0005465988 kernel: audit: initializing netlink subsys (disabled)
Oct  2 06:45:40 np0005465988 kernel: audit: type=2000 audit(1759401939.439:1): state=initialized audit_enabled=0 res=1
Oct  2 06:45:40 np0005465988 kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct  2 06:45:40 np0005465988 kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct  2 06:45:40 np0005465988 kernel: thermal_sys: Registered thermal governor 'user_space'
Oct  2 06:45:40 np0005465988 kernel: cpuidle: using governor menu
Oct  2 06:45:40 np0005465988 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct  2 06:45:40 np0005465988 kernel: PCI: Using configuration type 1 for base access
Oct  2 06:45:40 np0005465988 kernel: PCI: Using configuration type 1 for extended access
Oct  2 06:45:40 np0005465988 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct  2 06:45:40 np0005465988 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct  2 06:45:40 np0005465988 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct  2 06:45:40 np0005465988 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct  2 06:45:40 np0005465988 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct  2 06:45:40 np0005465988 kernel: Demotion targets for Node 0: null
Oct  2 06:45:40 np0005465988 kernel: cryptd: max_cpu_qlen set to 1000
Oct  2 06:45:40 np0005465988 kernel: ACPI: Added _OSI(Module Device)
Oct  2 06:45:40 np0005465988 kernel: ACPI: Added _OSI(Processor Device)
Oct  2 06:45:40 np0005465988 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct  2 06:45:40 np0005465988 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct  2 06:45:40 np0005465988 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct  2 06:45:40 np0005465988 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct  2 06:45:40 np0005465988 kernel: ACPI: Interpreter enabled
Oct  2 06:45:40 np0005465988 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct  2 06:45:40 np0005465988 kernel: ACPI: Using IOAPIC for interrupt routing
Oct  2 06:45:40 np0005465988 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct  2 06:45:40 np0005465988 kernel: PCI: Using E820 reservations for host bridge windows
Oct  2 06:45:40 np0005465988 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct  2 06:45:40 np0005465988 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct  2 06:45:40 np0005465988 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [3] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [4] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [5] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [6] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [7] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [8] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [9] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [10] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [11] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [12] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [13] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [14] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [15] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [16] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [17] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [18] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [19] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [20] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [21] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [22] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [23] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [24] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [25] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [26] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [27] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [28] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [29] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [30] registered
Oct  2 06:45:40 np0005465988 kernel: acpiphp: Slot [31] registered
Oct  2 06:45:40 np0005465988 kernel: PCI host bridge to bus 0000:00
Oct  2 06:45:40 np0005465988 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct  2 06:45:40 np0005465988 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct  2 06:45:40 np0005465988 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct  2 06:45:40 np0005465988 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct  2 06:45:40 np0005465988 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Oct  2 06:45:40 np0005465988 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct  2 06:45:40 np0005465988 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct  2 06:45:40 np0005465988 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct  2 06:45:40 np0005465988 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct  2 06:45:40 np0005465988 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct  2 06:45:40 np0005465988 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct  2 06:45:40 np0005465988 kernel: iommu: Default domain type: Translated
Oct  2 06:45:40 np0005465988 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct  2 06:45:40 np0005465988 kernel: SCSI subsystem initialized
Oct  2 06:45:40 np0005465988 kernel: ACPI: bus type USB registered
Oct  2 06:45:40 np0005465988 kernel: usbcore: registered new interface driver usbfs
Oct  2 06:45:40 np0005465988 kernel: usbcore: registered new interface driver hub
Oct  2 06:45:40 np0005465988 kernel: usbcore: registered new device driver usb
Oct  2 06:45:40 np0005465988 kernel: pps_core: LinuxPPS API ver. 1 registered
Oct  2 06:45:40 np0005465988 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct  2 06:45:40 np0005465988 kernel: PTP clock support registered
Oct  2 06:45:40 np0005465988 kernel: EDAC MC: Ver: 3.0.0
Oct  2 06:45:40 np0005465988 kernel: NetLabel: Initializing
Oct  2 06:45:40 np0005465988 kernel: NetLabel:  domain hash size = 128
Oct  2 06:45:40 np0005465988 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct  2 06:45:40 np0005465988 kernel: NetLabel:  unlabeled traffic allowed by default
Oct  2 06:45:40 np0005465988 kernel: PCI: Using ACPI for IRQ routing
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct  2 06:45:40 np0005465988 kernel: vgaarb: loaded
Oct  2 06:45:40 np0005465988 kernel: clocksource: Switched to clocksource kvm-clock
Oct  2 06:45:40 np0005465988 kernel: VFS: Disk quotas dquot_6.6.0
Oct  2 06:45:40 np0005465988 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct  2 06:45:40 np0005465988 kernel: pnp: PnP ACPI init
Oct  2 06:45:40 np0005465988 kernel: pnp: PnP ACPI: found 5 devices
Oct  2 06:45:40 np0005465988 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct  2 06:45:40 np0005465988 kernel: NET: Registered PF_INET protocol family
Oct  2 06:45:40 np0005465988 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct  2 06:45:40 np0005465988 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Oct  2 06:45:40 np0005465988 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct  2 06:45:40 np0005465988 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Oct  2 06:45:40 np0005465988 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct  2 06:45:40 np0005465988 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Oct  2 06:45:40 np0005465988 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Oct  2 06:45:40 np0005465988 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct  2 06:45:40 np0005465988 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct  2 06:45:40 np0005465988 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct  2 06:45:40 np0005465988 kernel: NET: Registered PF_XDP protocol family
Oct  2 06:45:40 np0005465988 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct  2 06:45:40 np0005465988 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct  2 06:45:40 np0005465988 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct  2 06:45:40 np0005465988 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct  2 06:45:40 np0005465988 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct  2 06:45:40 np0005465988 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct  2 06:45:40 np0005465988 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 80772 usecs
Oct  2 06:45:40 np0005465988 kernel: PCI: CLS 0 bytes, default 64
Oct  2 06:45:40 np0005465988 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct  2 06:45:40 np0005465988 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct  2 06:45:40 np0005465988 kernel: ACPI: bus type thunderbolt registered
Oct  2 06:45:40 np0005465988 kernel: Trying to unpack rootfs image as initramfs...
Oct  2 06:45:40 np0005465988 kernel: Initialise system trusted keyrings
Oct  2 06:45:40 np0005465988 kernel: Key type blacklist registered
Oct  2 06:45:40 np0005465988 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Oct  2 06:45:40 np0005465988 kernel: zbud: loaded
Oct  2 06:45:40 np0005465988 kernel: integrity: Platform Keyring initialized
Oct  2 06:45:40 np0005465988 kernel: integrity: Machine keyring initialized
Oct  2 06:45:40 np0005465988 kernel: Freeing initrd memory: 86104K
Oct  2 06:45:40 np0005465988 kernel: NET: Registered PF_ALG protocol family
Oct  2 06:45:40 np0005465988 kernel: xor: automatically using best checksumming function   avx       
Oct  2 06:45:40 np0005465988 kernel: Key type asymmetric registered
Oct  2 06:45:40 np0005465988 kernel: Asymmetric key parser 'x509' registered
Oct  2 06:45:40 np0005465988 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct  2 06:45:40 np0005465988 kernel: io scheduler mq-deadline registered
Oct  2 06:45:40 np0005465988 kernel: io scheduler kyber registered
Oct  2 06:45:40 np0005465988 kernel: io scheduler bfq registered
Oct  2 06:45:40 np0005465988 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct  2 06:45:40 np0005465988 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct  2 06:45:40 np0005465988 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct  2 06:45:40 np0005465988 kernel: ACPI: button: Power Button [PWRF]
Oct  2 06:45:40 np0005465988 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct  2 06:45:40 np0005465988 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct  2 06:45:40 np0005465988 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct  2 06:45:40 np0005465988 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct  2 06:45:40 np0005465988 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct  2 06:45:40 np0005465988 kernel: Non-volatile memory driver v1.3
Oct  2 06:45:40 np0005465988 kernel: rdac: device handler registered
Oct  2 06:45:40 np0005465988 kernel: hp_sw: device handler registered
Oct  2 06:45:40 np0005465988 kernel: emc: device handler registered
Oct  2 06:45:40 np0005465988 kernel: alua: device handler registered
Oct  2 06:45:40 np0005465988 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct  2 06:45:40 np0005465988 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct  2 06:45:40 np0005465988 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct  2 06:45:40 np0005465988 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct  2 06:45:40 np0005465988 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct  2 06:45:40 np0005465988 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct  2 06:45:40 np0005465988 kernel: usb usb1: Product: UHCI Host Controller
Oct  2 06:45:40 np0005465988 kernel: usb usb1: Manufacturer: Linux 5.14.0-620.el9.x86_64 uhci_hcd
Oct  2 06:45:40 np0005465988 kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct  2 06:45:40 np0005465988 kernel: hub 1-0:1.0: USB hub found
Oct  2 06:45:40 np0005465988 kernel: hub 1-0:1.0: 2 ports detected
Oct  2 06:45:40 np0005465988 kernel: usbcore: registered new interface driver usbserial_generic
Oct  2 06:45:40 np0005465988 kernel: usbserial: USB Serial support registered for generic
Oct  2 06:45:40 np0005465988 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct  2 06:45:40 np0005465988 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct  2 06:45:40 np0005465988 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct  2 06:45:40 np0005465988 kernel: mousedev: PS/2 mouse device common for all mice
Oct  2 06:45:40 np0005465988 kernel: rtc_cmos 00:04: RTC can wake from S4
Oct  2 06:45:40 np0005465988 kernel: rtc_cmos 00:04: registered as rtc0
Oct  2 06:45:40 np0005465988 kernel: rtc_cmos 00:04: setting system clock to 2025-10-02T10:45:39 UTC (1759401939)
Oct  2 06:45:40 np0005465988 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct  2 06:45:40 np0005465988 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct  2 06:45:40 np0005465988 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct  2 06:45:40 np0005465988 kernel: hid: raw HID events driver (C) Jiri Kosina
Oct  2 06:45:40 np0005465988 kernel: usbcore: registered new interface driver usbhid
Oct  2 06:45:40 np0005465988 kernel: usbhid: USB HID core driver
Oct  2 06:45:40 np0005465988 kernel: drop_monitor: Initializing network drop monitor service
Oct  2 06:45:40 np0005465988 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct  2 06:45:40 np0005465988 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct  2 06:45:40 np0005465988 kernel: Initializing XFRM netlink socket
Oct  2 06:45:40 np0005465988 kernel: NET: Registered PF_INET6 protocol family
Oct  2 06:45:40 np0005465988 kernel: Segment Routing with IPv6
Oct  2 06:45:40 np0005465988 kernel: NET: Registered PF_PACKET protocol family
Oct  2 06:45:40 np0005465988 kernel: mpls_gso: MPLS GSO support
Oct  2 06:45:40 np0005465988 kernel: IPI shorthand broadcast: enabled
Oct  2 06:45:40 np0005465988 kernel: AVX2 version of gcm_enc/dec engaged.
Oct  2 06:45:40 np0005465988 kernel: AES CTR mode by8 optimization enabled
Oct  2 06:45:40 np0005465988 kernel: sched_clock: Marking stable (1224007370, 138045710)->(1484063600, -122010520)
Oct  2 06:45:40 np0005465988 kernel: registered taskstats version 1
Oct  2 06:45:40 np0005465988 kernel: Loading compiled-in X.509 certificates
Oct  2 06:45:40 np0005465988 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  2 06:45:40 np0005465988 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct  2 06:45:40 np0005465988 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct  2 06:45:40 np0005465988 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct  2 06:45:40 np0005465988 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct  2 06:45:40 np0005465988 kernel: Demotion targets for Node 0: null
Oct  2 06:45:40 np0005465988 kernel: page_owner is disabled
Oct  2 06:45:40 np0005465988 kernel: Key type .fscrypt registered
Oct  2 06:45:40 np0005465988 kernel: Key type fscrypt-provisioning registered
Oct  2 06:45:40 np0005465988 kernel: Key type big_key registered
Oct  2 06:45:40 np0005465988 kernel: Key type encrypted registered
Oct  2 06:45:40 np0005465988 kernel: ima: No TPM chip found, activating TPM-bypass!
Oct  2 06:45:40 np0005465988 kernel: Loading compiled-in module X.509 certificates
Oct  2 06:45:40 np0005465988 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  2 06:45:40 np0005465988 kernel: ima: Allocated hash algorithm: sha256
Oct  2 06:45:40 np0005465988 kernel: ima: No architecture policies found
Oct  2 06:45:40 np0005465988 kernel: evm: Initialising EVM extended attributes:
Oct  2 06:45:40 np0005465988 kernel: evm: security.selinux
Oct  2 06:45:40 np0005465988 kernel: evm: security.SMACK64 (disabled)
Oct  2 06:45:40 np0005465988 kernel: evm: security.SMACK64EXEC (disabled)
Oct  2 06:45:40 np0005465988 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct  2 06:45:40 np0005465988 kernel: evm: security.SMACK64MMAP (disabled)
Oct  2 06:45:40 np0005465988 kernel: evm: security.apparmor (disabled)
Oct  2 06:45:40 np0005465988 kernel: evm: security.ima
Oct  2 06:45:40 np0005465988 kernel: evm: security.capability
Oct  2 06:45:40 np0005465988 kernel: evm: HMAC attrs: 0x1
Oct  2 06:45:40 np0005465988 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct  2 06:45:40 np0005465988 kernel: Running certificate verification RSA selftest
Oct  2 06:45:40 np0005465988 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct  2 06:45:40 np0005465988 kernel: Running certificate verification ECDSA selftest
Oct  2 06:45:40 np0005465988 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct  2 06:45:40 np0005465988 kernel: clk: Disabling unused clocks
Oct  2 06:45:40 np0005465988 kernel: Freeing unused decrypted memory: 2028K
Oct  2 06:45:40 np0005465988 kernel: Freeing unused kernel image (initmem) memory: 4068K
Oct  2 06:45:40 np0005465988 kernel: Write protecting the kernel read-only data: 30720k
Oct  2 06:45:40 np0005465988 kernel: Freeing unused kernel image (rodata/data gap) memory: 340K
Oct  2 06:45:40 np0005465988 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct  2 06:45:40 np0005465988 kernel: Run /init as init process
Oct  2 06:45:40 np0005465988 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  2 06:45:40 np0005465988 systemd: Detected virtualization kvm.
Oct  2 06:45:40 np0005465988 systemd: Detected architecture x86-64.
Oct  2 06:45:40 np0005465988 systemd: Running in initrd.
Oct  2 06:45:40 np0005465988 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct  2 06:45:40 np0005465988 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct  2 06:45:40 np0005465988 kernel: usb 1-1: Product: QEMU USB Tablet
Oct  2 06:45:40 np0005465988 kernel: usb 1-1: Manufacturer: QEMU
Oct  2 06:45:40 np0005465988 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct  2 06:45:40 np0005465988 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct  2 06:45:40 np0005465988 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct  2 06:45:40 np0005465988 systemd: No hostname configured, using default hostname.
Oct  2 06:45:40 np0005465988 systemd: Hostname set to <localhost>.
Oct  2 06:45:40 np0005465988 systemd: Initializing machine ID from VM UUID.
Oct  2 06:45:40 np0005465988 systemd: Queued start job for default target Initrd Default Target.
Oct  2 06:45:40 np0005465988 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  2 06:45:40 np0005465988 systemd: Reached target Local Encrypted Volumes.
Oct  2 06:45:40 np0005465988 systemd: Reached target Initrd /usr File System.
Oct  2 06:45:40 np0005465988 systemd: Reached target Local File Systems.
Oct  2 06:45:40 np0005465988 systemd: Reached target Path Units.
Oct  2 06:45:40 np0005465988 systemd: Reached target Slice Units.
Oct  2 06:45:40 np0005465988 systemd: Reached target Swaps.
Oct  2 06:45:40 np0005465988 systemd: Reached target Timer Units.
Oct  2 06:45:40 np0005465988 systemd: Listening on D-Bus System Message Bus Socket.
Oct  2 06:45:40 np0005465988 systemd: Listening on Journal Socket (/dev/log).
Oct  2 06:45:40 np0005465988 systemd: Listening on Journal Socket.
Oct  2 06:45:40 np0005465988 systemd: Listening on udev Control Socket.
Oct  2 06:45:40 np0005465988 systemd: Listening on udev Kernel Socket.
Oct  2 06:45:40 np0005465988 systemd: Reached target Socket Units.
Oct  2 06:45:40 np0005465988 systemd: Starting Create List of Static Device Nodes...
Oct  2 06:45:40 np0005465988 systemd: Starting Journal Service...
Oct  2 06:45:40 np0005465988 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  2 06:45:40 np0005465988 systemd: Starting Apply Kernel Variables...
Oct  2 06:45:40 np0005465988 systemd: Starting Create System Users...
Oct  2 06:45:40 np0005465988 systemd: Starting Setup Virtual Console...
Oct  2 06:45:40 np0005465988 systemd: Finished Create List of Static Device Nodes.
Oct  2 06:45:40 np0005465988 systemd: Finished Apply Kernel Variables.
Oct  2 06:45:40 np0005465988 systemd: Finished Create System Users.
Oct  2 06:45:40 np0005465988 systemd-journald[309]: Journal started
Oct  2 06:45:40 np0005465988 systemd-journald[309]: Runtime Journal (/run/log/journal/932782131c3c4fb49fd1d481e0b53ce1) is 8.0M, max 153.5M, 145.5M free.
Oct  2 06:45:40 np0005465988 systemd-sysusers[313]: Creating group 'users' with GID 100.
Oct  2 06:45:40 np0005465988 systemd-sysusers[313]: Creating group 'dbus' with GID 81.
Oct  2 06:45:40 np0005465988 systemd-sysusers[313]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct  2 06:45:41 np0005465988 systemd: Started Journal Service.
Oct  2 06:45:41 np0005465988 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct  2 06:45:41 np0005465988 systemd[1]: Starting Create Volatile Files and Directories...
Oct  2 06:45:41 np0005465988 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  2 06:45:41 np0005465988 systemd[1]: Finished Setup Virtual Console.
Oct  2 06:45:41 np0005465988 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct  2 06:45:41 np0005465988 systemd[1]: Starting dracut cmdline hook...
Oct  2 06:45:41 np0005465988 systemd[1]: Finished Create Volatile Files and Directories.
Oct  2 06:45:41 np0005465988 dracut-cmdline[330]: dracut-9 dracut-057-102.git20250818.el9
Oct  2 06:45:41 np0005465988 dracut-cmdline[330]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  2 06:45:41 np0005465988 systemd[1]: Finished dracut cmdline hook.
Oct  2 06:45:41 np0005465988 systemd[1]: Starting dracut pre-udev hook...
Oct  2 06:45:41 np0005465988 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct  2 06:45:41 np0005465988 kernel: device-mapper: uevent: version 1.0.3
Oct  2 06:45:41 np0005465988 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct  2 06:45:41 np0005465988 kernel: RPC: Registered named UNIX socket transport module.
Oct  2 06:45:41 np0005465988 kernel: RPC: Registered udp transport module.
Oct  2 06:45:41 np0005465988 kernel: RPC: Registered tcp transport module.
Oct  2 06:45:41 np0005465988 kernel: RPC: Registered tcp-with-tls transport module.
Oct  2 06:45:41 np0005465988 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct  2 06:45:41 np0005465988 rpc.statd[447]: Version 2.5.4 starting
Oct  2 06:45:41 np0005465988 rpc.statd[447]: Initializing NSM state
Oct  2 06:45:41 np0005465988 rpc.idmapd[452]: Setting log level to 0
Oct  2 06:45:41 np0005465988 systemd[1]: Finished dracut pre-udev hook.
Oct  2 06:45:41 np0005465988 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  2 06:45:41 np0005465988 systemd-udevd[465]: Using default interface naming scheme 'rhel-9.0'.
Oct  2 06:45:41 np0005465988 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  2 06:45:41 np0005465988 systemd[1]: Starting dracut pre-trigger hook...
Oct  2 06:45:41 np0005465988 systemd[1]: Finished dracut pre-trigger hook.
Oct  2 06:45:41 np0005465988 systemd[1]: Starting Coldplug All udev Devices...
Oct  2 06:45:41 np0005465988 systemd[1]: Created slice Slice /system/modprobe.
Oct  2 06:45:41 np0005465988 systemd[1]: Starting Load Kernel Module configfs...
Oct  2 06:45:41 np0005465988 systemd[1]: Finished Coldplug All udev Devices.
Oct  2 06:45:41 np0005465988 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  2 06:45:41 np0005465988 systemd[1]: Finished Load Kernel Module configfs.
Oct  2 06:45:41 np0005465988 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  2 06:45:41 np0005465988 systemd[1]: Reached target Network.
Oct  2 06:45:41 np0005465988 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  2 06:45:41 np0005465988 systemd[1]: Starting dracut initqueue hook...
Oct  2 06:45:41 np0005465988 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct  2 06:45:41 np0005465988 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Oct  2 06:45:41 np0005465988 kernel: vda: vda1
Oct  2 06:45:41 np0005465988 systemd-udevd[480]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 06:45:41 np0005465988 kernel: scsi host0: ata_piix
Oct  2 06:45:41 np0005465988 kernel: scsi host1: ata_piix
Oct  2 06:45:41 np0005465988 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Oct  2 06:45:41 np0005465988 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Oct  2 06:45:41 np0005465988 systemd[1]: Found device /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  2 06:45:41 np0005465988 systemd[1]: Reached target Initrd Root Device.
Oct  2 06:45:41 np0005465988 systemd[1]: Mounting Kernel Configuration File System...
Oct  2 06:45:41 np0005465988 systemd[1]: Mounted Kernel Configuration File System.
Oct  2 06:45:41 np0005465988 systemd[1]: Reached target System Initialization.
Oct  2 06:45:41 np0005465988 systemd[1]: Reached target Basic System.
Oct  2 06:45:42 np0005465988 kernel: ata1: found unknown device (class 0)
Oct  2 06:45:42 np0005465988 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct  2 06:45:42 np0005465988 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct  2 06:45:42 np0005465988 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct  2 06:45:42 np0005465988 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct  2 06:45:42 np0005465988 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct  2 06:45:42 np0005465988 systemd[1]: Finished dracut initqueue hook.
Oct  2 06:45:42 np0005465988 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  2 06:45:42 np0005465988 systemd[1]: Reached target Remote Encrypted Volumes.
Oct  2 06:45:42 np0005465988 systemd[1]: Reached target Remote File Systems.
Oct  2 06:45:42 np0005465988 systemd[1]: Starting dracut pre-mount hook...
Oct  2 06:45:42 np0005465988 systemd[1]: Finished dracut pre-mount hook.
Oct  2 06:45:42 np0005465988 systemd[1]: Starting File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458...
Oct  2 06:45:42 np0005465988 systemd-fsck[559]: /usr/sbin/fsck.xfs: XFS file system.
Oct  2 06:45:42 np0005465988 systemd[1]: Finished File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  2 06:45:42 np0005465988 systemd[1]: Mounting /sysroot...
Oct  2 06:45:42 np0005465988 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct  2 06:45:42 np0005465988 kernel: XFS (vda1): Mounting V5 Filesystem 1631a6ad-43b8-436d-ae76-16fa14b94458
Oct  2 06:45:43 np0005465988 kernel: XFS (vda1): Ending clean mount
Oct  2 06:45:43 np0005465988 systemd[1]: Mounted /sysroot.
Oct  2 06:45:43 np0005465988 systemd[1]: Reached target Initrd Root File System.
Oct  2 06:45:43 np0005465988 systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct  2 06:45:43 np0005465988 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct  2 06:45:43 np0005465988 systemd[1]: Reached target Initrd File Systems.
Oct  2 06:45:43 np0005465988 systemd[1]: Reached target Initrd Default Target.
Oct  2 06:45:43 np0005465988 systemd[1]: Starting dracut mount hook...
Oct  2 06:45:43 np0005465988 systemd[1]: Finished dracut mount hook.
Oct  2 06:45:43 np0005465988 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct  2 06:45:43 np0005465988 rpc.idmapd[452]: exiting on signal 15
Oct  2 06:45:43 np0005465988 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct  2 06:45:43 np0005465988 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Network.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Remote Encrypted Volumes.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Timer Units.
Oct  2 06:45:43 np0005465988 systemd[1]: dbus.socket: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Closed D-Bus System Message Bus Socket.
Oct  2 06:45:43 np0005465988 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Initrd Default Target.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Basic System.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Initrd Root Device.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Initrd /usr File System.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Path Units.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Remote File Systems.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Preparation for Remote File Systems.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Slice Units.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Socket Units.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target System Initialization.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Local File Systems.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Swaps.
Oct  2 06:45:43 np0005465988 systemd[1]: dracut-mount.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped dracut mount hook.
Oct  2 06:45:43 np0005465988 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped dracut pre-mount hook.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped target Local Encrypted Volumes.
Oct  2 06:45:43 np0005465988 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct  2 06:45:43 np0005465988 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped dracut initqueue hook.
Oct  2 06:45:43 np0005465988 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped Apply Kernel Variables.
Oct  2 06:45:43 np0005465988 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped Create Volatile Files and Directories.
Oct  2 06:45:43 np0005465988 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped Coldplug All udev Devices.
Oct  2 06:45:43 np0005465988 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped dracut pre-trigger hook.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct  2 06:45:43 np0005465988 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped Setup Virtual Console.
Oct  2 06:45:43 np0005465988 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct  2 06:45:43 np0005465988 systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct  2 06:45:43 np0005465988 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Closed udev Control Socket.
Oct  2 06:45:43 np0005465988 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Closed udev Kernel Socket.
Oct  2 06:45:43 np0005465988 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped dracut pre-udev hook.
Oct  2 06:45:43 np0005465988 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped dracut cmdline hook.
Oct  2 06:45:43 np0005465988 systemd[1]: Starting Cleanup udev Database...
Oct  2 06:45:43 np0005465988 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct  2 06:45:43 np0005465988 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped Create List of Static Device Nodes.
Oct  2 06:45:43 np0005465988 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Stopped Create System Users.
Oct  2 06:45:43 np0005465988 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct  2 06:45:43 np0005465988 systemd[1]: Finished Cleanup udev Database.
Oct  2 06:45:43 np0005465988 systemd[1]: Reached target Switch Root.
Oct  2 06:45:43 np0005465988 systemd[1]: Starting Switch Root...
Oct  2 06:45:43 np0005465988 systemd[1]: Switching root.
Oct  2 06:45:43 np0005465988 systemd-journald[309]: Journal stopped
Oct  2 06:45:45 np0005465988 systemd-journald: Received SIGTERM from PID 1 (systemd).
Oct  2 06:45:45 np0005465988 kernel: audit: type=1404 audit(1759401944.170:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct  2 06:45:45 np0005465988 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 06:45:45 np0005465988 kernel: SELinux:  policy capability open_perms=1
Oct  2 06:45:45 np0005465988 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 06:45:45 np0005465988 kernel: SELinux:  policy capability always_check_network=0
Oct  2 06:45:45 np0005465988 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 06:45:45 np0005465988 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 06:45:45 np0005465988 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 06:45:45 np0005465988 kernel: audit: type=1403 audit(1759401944.434:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct  2 06:45:45 np0005465988 systemd: Successfully loaded SELinux policy in 276.967ms.
Oct  2 06:45:45 np0005465988 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 97.873ms.
Oct  2 06:45:45 np0005465988 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  2 06:45:45 np0005465988 systemd: Detected virtualization kvm.
Oct  2 06:45:45 np0005465988 systemd: Detected architecture x86-64.
Oct  2 06:45:45 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 06:45:45 np0005465988 systemd: initrd-switch-root.service: Deactivated successfully.
Oct  2 06:45:45 np0005465988 systemd: Stopped Switch Root.
Oct  2 06:45:45 np0005465988 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct  2 06:45:45 np0005465988 systemd: Created slice Slice /system/getty.
Oct  2 06:45:45 np0005465988 systemd: Created slice Slice /system/serial-getty.
Oct  2 06:45:45 np0005465988 systemd: Created slice Slice /system/sshd-keygen.
Oct  2 06:45:45 np0005465988 systemd: Created slice User and Session Slice.
Oct  2 06:45:45 np0005465988 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  2 06:45:45 np0005465988 systemd: Started Forward Password Requests to Wall Directory Watch.
Oct  2 06:45:45 np0005465988 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct  2 06:45:45 np0005465988 systemd: Reached target Local Encrypted Volumes.
Oct  2 06:45:45 np0005465988 systemd: Stopped target Switch Root.
Oct  2 06:45:45 np0005465988 systemd: Stopped target Initrd File Systems.
Oct  2 06:45:45 np0005465988 systemd: Stopped target Initrd Root File System.
Oct  2 06:45:45 np0005465988 systemd: Reached target Local Integrity Protected Volumes.
Oct  2 06:45:45 np0005465988 systemd: Reached target Path Units.
Oct  2 06:45:45 np0005465988 systemd: Reached target rpc_pipefs.target.
Oct  2 06:45:45 np0005465988 systemd: Reached target Slice Units.
Oct  2 06:45:45 np0005465988 systemd: Reached target Swaps.
Oct  2 06:45:45 np0005465988 systemd: Reached target Local Verity Protected Volumes.
Oct  2 06:45:45 np0005465988 systemd: Listening on RPCbind Server Activation Socket.
Oct  2 06:45:45 np0005465988 systemd: Reached target RPC Port Mapper.
Oct  2 06:45:45 np0005465988 systemd: Listening on Process Core Dump Socket.
Oct  2 06:45:45 np0005465988 systemd: Listening on initctl Compatibility Named Pipe.
Oct  2 06:45:45 np0005465988 systemd: Listening on udev Control Socket.
Oct  2 06:45:45 np0005465988 systemd: Listening on udev Kernel Socket.
Oct  2 06:45:45 np0005465988 systemd: Mounting Huge Pages File System...
Oct  2 06:45:45 np0005465988 systemd: Mounting POSIX Message Queue File System...
Oct  2 06:45:45 np0005465988 systemd: Mounting Kernel Debug File System...
Oct  2 06:45:45 np0005465988 systemd: Mounting Kernel Trace File System...
Oct  2 06:45:45 np0005465988 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  2 06:45:45 np0005465988 systemd: Starting Create List of Static Device Nodes...
Oct  2 06:45:45 np0005465988 systemd: Starting Load Kernel Module configfs...
Oct  2 06:45:45 np0005465988 systemd: Starting Load Kernel Module drm...
Oct  2 06:45:45 np0005465988 systemd: Starting Load Kernel Module efi_pstore...
Oct  2 06:45:45 np0005465988 systemd: Starting Load Kernel Module fuse...
Oct  2 06:45:45 np0005465988 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct  2 06:45:45 np0005465988 systemd: systemd-fsck-root.service: Deactivated successfully.
Oct  2 06:45:45 np0005465988 systemd: Stopped File System Check on Root Device.
Oct  2 06:45:45 np0005465988 systemd: Stopped Journal Service.
Oct  2 06:45:45 np0005465988 systemd: Starting Journal Service...
Oct  2 06:45:45 np0005465988 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  2 06:45:45 np0005465988 systemd: Starting Generate network units from Kernel command line...
Oct  2 06:45:45 np0005465988 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  2 06:45:45 np0005465988 systemd: Starting Remount Root and Kernel File Systems...
Oct  2 06:45:45 np0005465988 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct  2 06:45:45 np0005465988 systemd: Starting Apply Kernel Variables...
Oct  2 06:45:45 np0005465988 kernel: fuse: init (API version 7.37)
Oct  2 06:45:45 np0005465988 systemd: Starting Coldplug All udev Devices...
Oct  2 06:45:45 np0005465988 systemd: Mounted Huge Pages File System.
Oct  2 06:45:45 np0005465988 systemd: Mounted POSIX Message Queue File System.
Oct  2 06:45:45 np0005465988 systemd: Mounted Kernel Debug File System.
Oct  2 06:45:45 np0005465988 systemd: Mounted Kernel Trace File System.
Oct  2 06:45:45 np0005465988 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct  2 06:45:45 np0005465988 systemd: Finished Create List of Static Device Nodes.
Oct  2 06:45:45 np0005465988 systemd: modprobe@configfs.service: Deactivated successfully.
Oct  2 06:45:45 np0005465988 systemd: Finished Load Kernel Module configfs.
Oct  2 06:45:45 np0005465988 systemd: modprobe@efi_pstore.service: Deactivated successfully.
Oct  2 06:45:45 np0005465988 systemd: Finished Load Kernel Module efi_pstore.
Oct  2 06:45:45 np0005465988 systemd-journald[684]: Journal started
Oct  2 06:45:45 np0005465988 systemd-journald[684]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct  2 06:45:45 np0005465988 systemd[1]: Queued start job for default target Multi-User System.
Oct  2 06:45:45 np0005465988 systemd[1]: systemd-journald.service: Deactivated successfully.
Oct  2 06:45:45 np0005465988 systemd: Started Journal Service.
Oct  2 06:45:45 np0005465988 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct  2 06:45:45 np0005465988 systemd[1]: Finished Load Kernel Module fuse.
Oct  2 06:45:45 np0005465988 kernel: ACPI: bus type drm_connector registered
Oct  2 06:45:45 np0005465988 systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct  2 06:45:45 np0005465988 systemd[1]: Finished Load Kernel Module drm.
Oct  2 06:45:45 np0005465988 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct  2 06:45:45 np0005465988 systemd[1]: Finished Generate network units from Kernel command line.
Oct  2 06:45:45 np0005465988 systemd[1]: Finished Remount Root and Kernel File Systems.
Oct  2 06:45:45 np0005465988 systemd[1]: Finished Apply Kernel Variables.
Oct  2 06:45:45 np0005465988 systemd[1]: Mounting FUSE Control File System...
Oct  2 06:45:45 np0005465988 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  2 06:45:45 np0005465988 systemd[1]: Starting Rebuild Hardware Database...
Oct  2 06:45:45 np0005465988 systemd[1]: Starting Flush Journal to Persistent Storage...
Oct  2 06:45:45 np0005465988 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct  2 06:45:45 np0005465988 systemd[1]: Starting Load/Save OS Random Seed...
Oct  2 06:45:45 np0005465988 systemd[1]: Starting Create System Users...
Oct  2 06:45:45 np0005465988 systemd[1]: Mounted FUSE Control File System.
Oct  2 06:45:45 np0005465988 systemd-journald[684]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct  2 06:45:45 np0005465988 systemd-journald[684]: Received client request to flush runtime journal.
Oct  2 06:45:45 np0005465988 systemd[1]: Finished Flush Journal to Persistent Storage.
Oct  2 06:45:45 np0005465988 systemd[1]: Finished Coldplug All udev Devices.
Oct  2 06:45:45 np0005465988 systemd[1]: Finished Load/Save OS Random Seed.
Oct  2 06:45:45 np0005465988 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  2 06:45:45 np0005465988 systemd[1]: Finished Create System Users.
Oct  2 06:45:45 np0005465988 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct  2 06:45:45 np0005465988 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  2 06:45:45 np0005465988 systemd[1]: Reached target Preparation for Local File Systems.
Oct  2 06:45:45 np0005465988 systemd[1]: Reached target Local File Systems.
Oct  2 06:45:45 np0005465988 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct  2 06:45:45 np0005465988 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct  2 06:45:45 np0005465988 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct  2 06:45:45 np0005465988 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct  2 06:45:45 np0005465988 systemd[1]: Starting Automatic Boot Loader Update...
Oct  2 06:45:45 np0005465988 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct  2 06:45:45 np0005465988 systemd[1]: Starting Create Volatile Files and Directories...
Oct  2 06:45:45 np0005465988 bootctl[702]: Couldn't find EFI system partition, skipping.
Oct  2 06:45:45 np0005465988 systemd[1]: Finished Automatic Boot Loader Update.
Oct  2 06:45:45 np0005465988 systemd[1]: Finished Create Volatile Files and Directories.
Oct  2 06:45:45 np0005465988 systemd[1]: Starting Security Auditing Service...
Oct  2 06:45:45 np0005465988 systemd[1]: Starting RPC Bind...
Oct  2 06:45:45 np0005465988 systemd[1]: Starting Rebuild Journal Catalog...
Oct  2 06:45:45 np0005465988 auditd[708]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct  2 06:45:45 np0005465988 auditd[708]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct  2 06:45:45 np0005465988 systemd[1]: Finished Rebuild Journal Catalog.
Oct  2 06:45:45 np0005465988 systemd[1]: Started RPC Bind.
Oct  2 06:45:45 np0005465988 augenrules[713]: /sbin/augenrules: No change
Oct  2 06:45:45 np0005465988 augenrules[728]: No rules
Oct  2 06:45:45 np0005465988 augenrules[728]: enabled 1
Oct  2 06:45:45 np0005465988 augenrules[728]: failure 1
Oct  2 06:45:45 np0005465988 augenrules[728]: pid 708
Oct  2 06:45:45 np0005465988 augenrules[728]: rate_limit 0
Oct  2 06:45:45 np0005465988 augenrules[728]: backlog_limit 8192
Oct  2 06:45:45 np0005465988 augenrules[728]: lost 0
Oct  2 06:45:45 np0005465988 augenrules[728]: backlog 3
Oct  2 06:45:45 np0005465988 augenrules[728]: backlog_wait_time 60000
Oct  2 06:45:45 np0005465988 augenrules[728]: backlog_wait_time_actual 0
Oct  2 06:45:45 np0005465988 augenrules[728]: enabled 1
Oct  2 06:45:45 np0005465988 augenrules[728]: failure 1
Oct  2 06:45:45 np0005465988 augenrules[728]: pid 708
Oct  2 06:45:45 np0005465988 augenrules[728]: rate_limit 0
Oct  2 06:45:45 np0005465988 augenrules[728]: backlog_limit 8192
Oct  2 06:45:45 np0005465988 augenrules[728]: lost 0
Oct  2 06:45:45 np0005465988 augenrules[728]: backlog 3
Oct  2 06:45:45 np0005465988 augenrules[728]: backlog_wait_time 60000
Oct  2 06:45:45 np0005465988 augenrules[728]: backlog_wait_time_actual 0
Oct  2 06:45:45 np0005465988 augenrules[728]: enabled 1
Oct  2 06:45:45 np0005465988 augenrules[728]: failure 1
Oct  2 06:45:45 np0005465988 augenrules[728]: pid 708
Oct  2 06:45:45 np0005465988 augenrules[728]: rate_limit 0
Oct  2 06:45:45 np0005465988 augenrules[728]: backlog_limit 8192
Oct  2 06:45:45 np0005465988 augenrules[728]: lost 0
Oct  2 06:45:45 np0005465988 augenrules[728]: backlog 4
Oct  2 06:45:45 np0005465988 augenrules[728]: backlog_wait_time 60000
Oct  2 06:45:45 np0005465988 augenrules[728]: backlog_wait_time_actual 0
Oct  2 06:45:45 np0005465988 systemd[1]: Started Security Auditing Service.
Oct  2 06:45:45 np0005465988 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct  2 06:45:46 np0005465988 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct  2 06:45:46 np0005465988 systemd[1]: Finished Rebuild Hardware Database.
Oct  2 06:45:46 np0005465988 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  2 06:45:46 np0005465988 systemd-udevd[736]: Using default interface naming scheme 'rhel-9.0'.
Oct  2 06:45:46 np0005465988 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  2 06:45:46 np0005465988 systemd[1]: Starting Load Kernel Module configfs...
Oct  2 06:45:46 np0005465988 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  2 06:45:46 np0005465988 systemd[1]: Finished Load Kernel Module configfs.
Oct  2 06:45:46 np0005465988 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct  2 06:45:46 np0005465988 systemd-udevd[754]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 06:45:46 np0005465988 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct  2 06:45:46 np0005465988 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct  2 06:45:46 np0005465988 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct  2 06:45:46 np0005465988 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct  2 06:45:46 np0005465988 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct  2 06:45:46 np0005465988 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct  2 06:45:46 np0005465988 kernel: Console: switching to colour dummy device 80x25
Oct  2 06:45:46 np0005465988 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct  2 06:45:46 np0005465988 kernel: [drm] features: -context_init
Oct  2 06:45:46 np0005465988 kernel: [drm] number of scanouts: 1
Oct  2 06:45:46 np0005465988 kernel: [drm] number of cap sets: 0
Oct  2 06:45:46 np0005465988 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct  2 06:45:46 np0005465988 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct  2 06:45:46 np0005465988 kernel: Console: switching to colour frame buffer device 128x48
Oct  2 06:45:46 np0005465988 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct  2 06:45:47 np0005465988 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct  2 06:45:47 np0005465988 kernel: kvm_amd: TSC scaling supported
Oct  2 06:45:47 np0005465988 kernel: kvm_amd: Nested Virtualization enabled
Oct  2 06:45:47 np0005465988 kernel: kvm_amd: Nested Paging enabled
Oct  2 06:45:47 np0005465988 kernel: kvm_amd: LBR virtualization supported
Oct  2 06:45:47 np0005465988 systemd[1]: Starting Update is Completed...
Oct  2 06:45:47 np0005465988 systemd[1]: Finished Update is Completed.
Oct  2 06:45:47 np0005465988 systemd[1]: Reached target System Initialization.
Oct  2 06:45:47 np0005465988 systemd[1]: Started dnf makecache --timer.
Oct  2 06:45:47 np0005465988 systemd[1]: Started Daily rotation of log files.
Oct  2 06:45:47 np0005465988 systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct  2 06:45:47 np0005465988 systemd[1]: Reached target Timer Units.
Oct  2 06:45:47 np0005465988 systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct  2 06:45:47 np0005465988 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct  2 06:45:47 np0005465988 systemd[1]: Reached target Socket Units.
Oct  2 06:45:47 np0005465988 systemd[1]: Starting D-Bus System Message Bus...
Oct  2 06:45:47 np0005465988 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  2 06:45:47 np0005465988 systemd[1]: Started D-Bus System Message Bus.
Oct  2 06:45:47 np0005465988 systemd[1]: Reached target Basic System.
Oct  2 06:45:47 np0005465988 dbus-broker-lau[814]: Ready
Oct  2 06:45:47 np0005465988 systemd[1]: Starting NTP client/server...
Oct  2 06:45:47 np0005465988 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct  2 06:45:47 np0005465988 systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct  2 06:45:47 np0005465988 systemd[1]: Starting IPv4 firewall with iptables...
Oct  2 06:45:47 np0005465988 systemd[1]: Started irqbalance daemon.
Oct  2 06:45:47 np0005465988 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct  2 06:45:47 np0005465988 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 06:45:47 np0005465988 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 06:45:47 np0005465988 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 06:45:47 np0005465988 systemd[1]: Reached target sshd-keygen.target.
Oct  2 06:45:47 np0005465988 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct  2 06:45:47 np0005465988 systemd[1]: Reached target User and Group Name Lookups.
Oct  2 06:45:47 np0005465988 systemd[1]: Starting User Login Management...
Oct  2 06:45:47 np0005465988 systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct  2 06:45:47 np0005465988 chronyd[834]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  2 06:45:47 np0005465988 chronyd[834]: Loaded 0 symmetric keys
Oct  2 06:45:47 np0005465988 chronyd[834]: Using right/UTC timezone to obtain leap second data
Oct  2 06:45:47 np0005465988 chronyd[834]: Loaded seccomp filter (level 2)
Oct  2 06:45:47 np0005465988 systemd[1]: Started NTP client/server.
Oct  2 06:45:47 np0005465988 systemd-logind[827]: New seat seat0.
Oct  2 06:45:47 np0005465988 systemd-logind[827]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  2 06:45:47 np0005465988 systemd-logind[827]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  2 06:45:47 np0005465988 systemd[1]: Started User Login Management.
Oct  2 06:45:47 np0005465988 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct  2 06:45:47 np0005465988 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct  2 06:45:47 np0005465988 iptables.init[821]: iptables: Applying firewall rules: [  OK  ]
Oct  2 06:45:47 np0005465988 systemd[1]: Finished IPv4 firewall with iptables.
Oct  2 06:45:48 np0005465988 cloud-init[844]: Cloud-init v. 24.4-7.el9 running 'init-local' at Thu, 02 Oct 2025 10:45:48 +0000. Up 10.35 seconds.
Oct  2 06:45:49 np0005465988 systemd[1]: run-cloud\x2dinit-tmp-tmp1utprm0_.mount: Deactivated successfully.
Oct  2 06:45:49 np0005465988 systemd[1]: Starting Hostname Service...
Oct  2 06:45:49 np0005465988 systemd[1]: Started Hostname Service.
Oct  2 06:45:49 np0005465988 systemd-hostnamed[858]: Hostname set to <np0005465988.novalocal> (static)
Oct  2 06:45:49 np0005465988 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct  2 06:45:49 np0005465988 systemd[1]: Reached target Preparation for Network.
Oct  2 06:45:49 np0005465988 systemd[1]: Starting Network Manager...
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.4817] NetworkManager (version 1.54.1-1.el9) is starting... (boot:56c3487b-f235-4e9d-84a7-a894185724de)
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.4823] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5153] manager[0x56545bd94080]: monitoring kernel firmware directory '/lib/firmware'.
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5226] hostname: hostname: using hostnamed
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5227] hostname: static hostname changed from (none) to "np0005465988.novalocal"
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5235] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5437] manager[0x56545bd94080]: rfkill: Wi-Fi hardware radio set enabled
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5439] manager[0x56545bd94080]: rfkill: WWAN hardware radio set enabled
Oct  2 06:45:49 np0005465988 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5664] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5665] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5665] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5665] manager: Networking is enabled by state file
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5667] settings: Loaded settings plugin: keyfile (internal)
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5729] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5758] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5833] dhcp: init: Using DHCP client 'internal'
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5836] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  2 06:45:49 np0005465988 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5848] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5868] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5877] device (lo): Activation: starting connection 'lo' (c72ddb6c-2533-4269-990c-c2ee08946507)
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5886] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5889] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 06:45:49 np0005465988 systemd[1]: Started Network Manager.
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5909] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5914] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5917] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5919] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5921] device (eth0): carrier: link connected
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5923] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  2 06:45:49 np0005465988 systemd[1]: Reached target Network.
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5930] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5941] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5946] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5947] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5948] manager: NetworkManager state is now CONNECTING
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5950] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5956] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 06:45:49 np0005465988 systemd[1]: Starting Network Manager Wait Online...
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.5959] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 06:45:49 np0005465988 systemd[1]: Starting GSSAPI Proxy Daemon...
Oct  2 06:45:49 np0005465988 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.6106] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.6109] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  2 06:45:49 np0005465988 NetworkManager[862]: <info>  [1759401949.6119] device (lo): Activation: successful, device activated.
Oct  2 06:45:49 np0005465988 systemd[1]: Started GSSAPI Proxy Daemon.
Oct  2 06:45:49 np0005465988 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  2 06:45:49 np0005465988 systemd[1]: Reached target NFS client services.
Oct  2 06:45:49 np0005465988 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  2 06:45:49 np0005465988 systemd[1]: Reached target Remote File Systems.
Oct  2 06:45:49 np0005465988 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  2 06:45:52 np0005465988 NetworkManager[862]: <info>  [1759401952.5727] dhcp4 (eth0): state changed new lease, address=38.129.56.216
Oct  2 06:45:52 np0005465988 NetworkManager[862]: <info>  [1759401952.5741] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  2 06:45:52 np0005465988 NetworkManager[862]: <info>  [1759401952.5769] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 06:45:52 np0005465988 NetworkManager[862]: <info>  [1759401952.5807] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 06:45:52 np0005465988 NetworkManager[862]: <info>  [1759401952.5808] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 06:45:52 np0005465988 NetworkManager[862]: <info>  [1759401952.5812] manager: NetworkManager state is now CONNECTED_SITE
Oct  2 06:45:52 np0005465988 NetworkManager[862]: <info>  [1759401952.5815] device (eth0): Activation: successful, device activated.
Oct  2 06:45:52 np0005465988 NetworkManager[862]: <info>  [1759401952.5819] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  2 06:45:52 np0005465988 NetworkManager[862]: <info>  [1759401952.5822] manager: startup complete
Oct  2 06:45:52 np0005465988 systemd[1]: Finished Network Manager Wait Online.
Oct  2 06:45:52 np0005465988 systemd[1]: Starting Cloud-init: Network Stage...
Oct  2 06:45:52 np0005465988 cloud-init[925]: Cloud-init v. 24.4-7.el9 running 'init' at Thu, 02 Oct 2025 10:45:52 +0000. Up 14.58 seconds.
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: |  eth0  | True |        38.129.56.216         | 255.255.255.0 | global | fa:16:3e:4d:29:f8 |
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: |  eth0  | True | fe80::f816:3eff:fe4d:29f8/64 |       .       |  link  | fa:16:3e:4d:29:f8 |
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info++++++++++++++++++++++++++++++++
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: | Route |   Destination   |   Gateway   |     Genmask     | Interface | Flags |
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: |   0   |     0.0.0.0     | 38.129.56.1 |     0.0.0.0     |    eth0   |   UG  |
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: |   1   |   38.129.56.0   |   0.0.0.0   |  255.255.255.0  |    eth0   |   U   |
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: |   2   | 169.254.169.254 | 38.129.56.5 | 255.255.255.255 |    eth0   |  UGH  |
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Oct  2 06:45:52 np0005465988 cloud-init[925]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Oct  2 06:45:53 np0005465988 cloud-init[925]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Oct  2 06:45:53 np0005465988 cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  2 06:45:56 np0005465988 cloud-init[925]: Generating public/private rsa key pair.
Oct  2 06:45:56 np0005465988 cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct  2 06:45:56 np0005465988 cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct  2 06:45:56 np0005465988 cloud-init[925]: The key fingerprint is:
Oct  2 06:45:56 np0005465988 cloud-init[925]: SHA256:Jbf+FD5KP8+91WD14MwI72D71CINa0Z9Yf4pOpslJkQ root@np0005465988.novalocal
Oct  2 06:45:56 np0005465988 cloud-init[925]: The key's randomart image is:
Oct  2 06:45:56 np0005465988 cloud-init[925]: +---[RSA 3072]----+
Oct  2 06:45:56 np0005465988 cloud-init[925]: |                 |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |                 |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |        .E+   + .|
Oct  2 06:45:56 np0005465988 cloud-init[925]: |        .+ = B +.|
Oct  2 06:45:56 np0005465988 cloud-init[925]: |        S.* = O .|
Oct  2 06:45:56 np0005465988 cloud-init[925]: |        .+ O = oo|
Oct  2 06:45:56 np0005465988 cloud-init[925]: |         .OoOo..+|
Oct  2 06:45:56 np0005465988 cloud-init[925]: |         +oX*+...|
Oct  2 06:45:56 np0005465988 cloud-init[925]: |          .+*oo.o|
Oct  2 06:45:56 np0005465988 cloud-init[925]: +----[SHA256]-----+
Oct  2 06:45:56 np0005465988 cloud-init[925]: Generating public/private ecdsa key pair.
Oct  2 06:45:56 np0005465988 cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct  2 06:45:56 np0005465988 cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct  2 06:45:56 np0005465988 cloud-init[925]: The key fingerprint is:
Oct  2 06:45:56 np0005465988 cloud-init[925]: SHA256:dJYwCaurNyr1iohkpHzrCd1kMm5CreYj6q4Lm/hyYfs root@np0005465988.novalocal
Oct  2 06:45:56 np0005465988 cloud-init[925]: The key's randomart image is:
Oct  2 06:45:56 np0005465988 cloud-init[925]: +---[ECDSA 256]---+
Oct  2 06:45:56 np0005465988 cloud-init[925]: |      ..o.       |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |       ..o .     |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |      . . +      |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |  .  . . o       |
Oct  2 06:45:56 np0005465988 cloud-init[925]: | o +.o  S        |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |= B *.           |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |o@ O..           |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |&=B.*            |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |^XBOE.           |
Oct  2 06:45:56 np0005465988 cloud-init[925]: +----[SHA256]-----+
Oct  2 06:45:56 np0005465988 cloud-init[925]: Generating public/private ed25519 key pair.
Oct  2 06:45:56 np0005465988 cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct  2 06:45:56 np0005465988 cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct  2 06:45:56 np0005465988 cloud-init[925]: The key fingerprint is:
Oct  2 06:45:56 np0005465988 cloud-init[925]: SHA256:RBjbFBfSI+rfrSraoIP0xWo1C69LYQoIZIR/DZaWv5M root@np0005465988.novalocal
Oct  2 06:45:56 np0005465988 cloud-init[925]: The key's randomart image is:
Oct  2 06:45:56 np0005465988 cloud-init[925]: +--[ED25519 256]--+
Oct  2 06:45:56 np0005465988 cloud-init[925]: |o+   o.o=oo.     |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |+   * .=.oo      |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |.. o +..o. .     |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |o . . +.         |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |o  +.. oS        |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |..o..=E          |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |.o..B oo . .     |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |. o= =. . . .    |
Oct  2 06:45:56 np0005465988 cloud-init[925]: |  o++.......     |
Oct  2 06:45:56 np0005465988 cloud-init[925]: +----[SHA256]-----+
Oct  2 06:45:56 np0005465988 sm-notify[1007]: Version 2.5.4 starting
Oct  2 06:45:56 np0005465988 systemd[1]: Finished Cloud-init: Network Stage.
Oct  2 06:45:56 np0005465988 systemd[1]: Reached target Cloud-config availability.
Oct  2 06:45:56 np0005465988 systemd[1]: Reached target Network is Online.
Oct  2 06:45:56 np0005465988 systemd[1]: Starting Cloud-init: Config Stage...
Oct  2 06:45:56 np0005465988 systemd[1]: Starting Notify NFS peers of a restart...
Oct  2 06:45:56 np0005465988 systemd[1]: Starting System Logging Service...
Oct  2 06:45:56 np0005465988 systemd[1]: Starting OpenSSH server daemon...
Oct  2 06:45:56 np0005465988 systemd[1]: Starting Permit User Sessions...
Oct  2 06:45:56 np0005465988 systemd[1]: Started Notify NFS peers of a restart.
Oct  2 06:45:56 np0005465988 systemd[1]: Finished Permit User Sessions.
Oct  2 06:45:56 np0005465988 systemd[1]: Started Command Scheduler.
Oct  2 06:45:56 np0005465988 systemd[1]: Started Getty on tty1.
Oct  2 06:45:56 np0005465988 systemd[1]: Started Serial Getty on ttyS0.
Oct  2 06:45:56 np0005465988 systemd[1]: Reached target Login Prompts.
Oct  2 06:45:56 np0005465988 systemd[1]: Started OpenSSH server daemon.
Oct  2 06:45:56 np0005465988 rsyslogd[1008]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1008" x-info="https://www.rsyslog.com"] start
Oct  2 06:45:56 np0005465988 systemd[1]: Started System Logging Service.
Oct  2 06:45:56 np0005465988 rsyslogd[1008]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Oct  2 06:45:56 np0005465988 systemd[1]: Reached target Multi-User System.
Oct  2 06:45:56 np0005465988 systemd[1]: Starting Record Runlevel Change in UTMP...
Oct  2 06:45:56 np0005465988 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct  2 06:45:56 np0005465988 systemd[1]: Finished Record Runlevel Change in UTMP.
Oct  2 06:45:56 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 06:45:56 np0005465988 cloud-init[1020]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Thu, 02 Oct 2025 10:45:56 +0000. Up 18.56 seconds.
Oct  2 06:45:57 np0005465988 systemd[1]: Finished Cloud-init: Config Stage.
Oct  2 06:45:57 np0005465988 systemd[1]: Starting Cloud-init: Final Stage...
Oct  2 06:45:57 np0005465988 chronyd[834]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Oct  2 06:45:58 np0005465988 chronyd[834]: System clock wrong by 1.293041 seconds
Oct  2 06:45:58 np0005465988 chronyd[834]: System clock was stepped by 1.293041 seconds
Oct  2 06:45:58 np0005465988 chronyd[834]: System clock TAI offset set to 37 seconds
Oct  2 06:45:58 np0005465988 cloud-init[1024]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Thu, 02 Oct 2025 10:45:58 +0000. Up 18.96 seconds.
Oct  2 06:45:58 np0005465988 irqbalance[825]: Cannot change IRQ 25 affinity: Operation not permitted
Oct  2 06:45:58 np0005465988 irqbalance[825]: IRQ 25 affinity is now unmanaged
Oct  2 06:45:58 np0005465988 irqbalance[825]: Cannot change IRQ 31 affinity: Operation not permitted
Oct  2 06:45:58 np0005465988 irqbalance[825]: IRQ 31 affinity is now unmanaged
Oct  2 06:45:58 np0005465988 irqbalance[825]: Cannot change IRQ 28 affinity: Operation not permitted
Oct  2 06:45:58 np0005465988 irqbalance[825]: IRQ 28 affinity is now unmanaged
Oct  2 06:45:58 np0005465988 irqbalance[825]: Cannot change IRQ 32 affinity: Operation not permitted
Oct  2 06:45:58 np0005465988 irqbalance[825]: IRQ 32 affinity is now unmanaged
Oct  2 06:45:58 np0005465988 irqbalance[825]: Cannot change IRQ 30 affinity: Operation not permitted
Oct  2 06:45:58 np0005465988 irqbalance[825]: IRQ 30 affinity is now unmanaged
Oct  2 06:45:58 np0005465988 irqbalance[825]: Cannot change IRQ 29 affinity: Operation not permitted
Oct  2 06:45:58 np0005465988 irqbalance[825]: IRQ 29 affinity is now unmanaged
Oct  2 06:45:58 np0005465988 cloud-init[1027]: #############################################################
Oct  2 06:45:58 np0005465988 cloud-init[1028]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct  2 06:45:58 np0005465988 cloud-init[1030]: 256 SHA256:dJYwCaurNyr1iohkpHzrCd1kMm5CreYj6q4Lm/hyYfs root@np0005465988.novalocal (ECDSA)
Oct  2 06:45:58 np0005465988 cloud-init[1032]: 256 SHA256:RBjbFBfSI+rfrSraoIP0xWo1C69LYQoIZIR/DZaWv5M root@np0005465988.novalocal (ED25519)
Oct  2 06:45:58 np0005465988 cloud-init[1034]: 3072 SHA256:Jbf+FD5KP8+91WD14MwI72D71CINa0Z9Yf4pOpslJkQ root@np0005465988.novalocal (RSA)
Oct  2 06:45:58 np0005465988 cloud-init[1035]: -----END SSH HOST KEY FINGERPRINTS-----
Oct  2 06:45:58 np0005465988 cloud-init[1036]: #############################################################
Oct  2 06:45:58 np0005465988 cloud-init[1024]: Cloud-init v. 24.4-7.el9 finished at Thu, 02 Oct 2025 10:45:58 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 19.18 seconds
Oct  2 06:45:58 np0005465988 systemd[1]: Finished Cloud-init: Final Stage.
Oct  2 06:45:58 np0005465988 systemd[1]: Reached target Cloud-init target.
Oct  2 06:45:58 np0005465988 systemd[1]: Startup finished in 1.670s (kernel) + 4.179s (initrd) + 13.447s (userspace) = 19.297s.
Oct  2 06:46:03 np0005465988 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 06:46:20 np0005465988 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 06:46:26 np0005465988 systemd[1]: Created slice User Slice of UID 1000.
Oct  2 06:46:26 np0005465988 systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct  2 06:46:26 np0005465988 systemd-logind[827]: New session 1 of user zuul.
Oct  2 06:46:26 np0005465988 systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct  2 06:46:26 np0005465988 systemd[1]: Starting User Manager for UID 1000...
Oct  2 06:46:26 np0005465988 systemd[1063]: Queued start job for default target Main User Target.
Oct  2 06:46:26 np0005465988 systemd[1063]: Created slice User Application Slice.
Oct  2 06:46:26 np0005465988 systemd[1063]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 06:46:26 np0005465988 systemd[1063]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 06:46:26 np0005465988 systemd[1063]: Reached target Paths.
Oct  2 06:46:26 np0005465988 systemd[1063]: Reached target Timers.
Oct  2 06:46:26 np0005465988 systemd[1063]: Starting D-Bus User Message Bus Socket...
Oct  2 06:46:26 np0005465988 systemd[1063]: Starting Create User's Volatile Files and Directories...
Oct  2 06:46:26 np0005465988 systemd[1063]: Finished Create User's Volatile Files and Directories.
Oct  2 06:46:26 np0005465988 systemd[1063]: Listening on D-Bus User Message Bus Socket.
Oct  2 06:46:26 np0005465988 systemd[1063]: Reached target Sockets.
Oct  2 06:46:26 np0005465988 systemd[1063]: Reached target Basic System.
Oct  2 06:46:26 np0005465988 systemd[1063]: Reached target Main User Target.
Oct  2 06:46:26 np0005465988 systemd[1063]: Startup finished in 118ms.
Oct  2 06:46:26 np0005465988 systemd[1]: Started User Manager for UID 1000.
Oct  2 06:46:26 np0005465988 systemd[1]: Started Session 1 of User zuul.
Oct  2 06:46:27 np0005465988 python3[1146]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 06:46:35 np0005465988 python3[1174]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 06:46:49 np0005465988 python3[1232]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 06:46:51 np0005465988 python3[1272]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct  2 06:46:54 np0005465988 python3[1298]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC4HLvw7SnHU3ZU9fE0jIv/0FBRDzTnHfUtei1LSOQahXMfp0JTrMJ0Rj7BbYXxImr2WcDV3bv5FU3LkNsWWkvKZ+/YTg55vh88jhcTTSwOPfyQ0NCsgJ787HDXojmkTKqvsS4ZyAP6VcPlCZbUWNnTSSbJUSyaHZMV5ihm0q6iSgctKks2z5A9UayATNjnXUmG/mYZF8TjRztR4mgHBNFbBBfNYFztb1B2fe+vxBnNa4ls2O1rLzC/5crDuKj3ook8+1X4UTHys4s5ONjn9jxIkB3P5jnGl8ibSdVQRN46RVP9p93WUUJmVLRZdoaq0MrLEwnG1poWzqFMv/cX91UaFWkVQBmiX85KvlSQJqVymdz6LOcPNL+U30yMk55hrLdmeOMl1B9DW4VKD/rr+vtK+HpbaJYYHv9A+woTHFawd+Lkd7oFFavN5+ce0qiO8pg3vZYBLUXgFNDIcMMuu3Th/9xHdxwKfNaQJ9ESqYe+DAE5UBQd/CAmWzb1hdJnFX8= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:46:55 np0005465988 python3[1322]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:46:56 np0005465988 python3[1421]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 06:46:56 np0005465988 python3[1492]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759402015.9076605-254-57742130126639/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=ce6939fdbe5b42ab81e6b7883c8c12f2_id_rsa follow=False checksum=205f22b2d368cfc213016f6ec99460e215b81c8a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:46:57 np0005465988 python3[1615]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 06:46:57 np0005465988 python3[1686]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759402017.0680578-309-24112174075904/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=ce6939fdbe5b42ab81e6b7883c8c12f2_id_rsa.pub follow=False checksum=e1273b820e94a75c0885ab32202b04dec1652e4b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:46:58 np0005465988 irqbalance[825]: Cannot change IRQ 27 affinity: Operation not permitted
Oct  2 06:46:58 np0005465988 irqbalance[825]: IRQ 27 affinity is now unmanaged
Oct  2 06:46:59 np0005465988 python3[1734]: ansible-ping Invoked with data=pong
Oct  2 06:47:01 np0005465988 python3[1758]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 06:47:03 np0005465988 python3[1816]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct  2 06:47:05 np0005465988 python3[1848]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:47:05 np0005465988 python3[1872]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:47:05 np0005465988 python3[1896]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:47:06 np0005465988 python3[1920]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:47:06 np0005465988 python3[1944]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:47:06 np0005465988 python3[1968]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:47:08 np0005465988 python3[1994]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:47:09 np0005465988 python3[2072]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 06:47:10 np0005465988 python3[2145]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759402029.1740036-33-213968070595594/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:47:11 np0005465988 python3[2193]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:11 np0005465988 python3[2217]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:11 np0005465988 python3[2241]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:12 np0005465988 python3[2265]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:12 np0005465988 python3[2289]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:12 np0005465988 python3[2313]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:13 np0005465988 python3[2337]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:13 np0005465988 python3[2361]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:13 np0005465988 python3[2385]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:14 np0005465988 python3[2409]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:14 np0005465988 python3[2433]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:14 np0005465988 python3[2457]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:14 np0005465988 python3[2481]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:15 np0005465988 python3[2505]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:15 np0005465988 python3[2529]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:15 np0005465988 python3[2553]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:16 np0005465988 python3[2577]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:16 np0005465988 python3[2601]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:16 np0005465988 python3[2625]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:17 np0005465988 python3[2649]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:17 np0005465988 python3[2673]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:17 np0005465988 python3[2697]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:18 np0005465988 python3[2721]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:18 np0005465988 python3[2745]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:18 np0005465988 python3[2769]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:19 np0005465988 python3[2793]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:47:21 np0005465988 python3[2819]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  2 06:47:21 np0005465988 systemd[1]: Starting Time & Date Service...
Oct  2 06:47:21 np0005465988 systemd[1]: Started Time & Date Service.
Oct  2 06:47:21 np0005465988 systemd-timedated[2821]: Changed time zone to 'UTC' (UTC).
Oct  2 06:47:22 np0005465988 python3[2850]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:47:22 np0005465988 python3[2926]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 06:47:23 np0005465988 python3[2997]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759402042.5739489-256-272679656848947/source _original_basename=tmp5hczfr78 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:47:23 np0005465988 python3[3097]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 06:47:24 np0005465988 python3[3168]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759402043.5952349-304-132968008690295/source _original_basename=tmpxji7oa9w follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:47:25 np0005465988 python3[3270]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 06:47:26 np0005465988 python3[3343]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759402045.4843142-384-163498672753787/source _original_basename=tmpqdnmvmxh follow=False checksum=b49866d430a7b489292e1326bd69be908cd569bf backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:47:27 np0005465988 python3[3391]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 06:47:27 np0005465988 python3[3417]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 06:47:28 np0005465988 python3[3497]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 06:47:28 np0005465988 python3[3570]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759402047.65666-454-104681347024826/source _original_basename=tmpov9y3gfq follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:47:29 np0005465988 python3[3621]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-2436-8b23-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 06:47:30 np0005465988 python3[3649]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-2436-8b23-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct  2 06:47:32 np0005465988 python3[3677]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:47:51 np0005465988 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  2 06:47:54 np0005465988 python3[3705]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:48:54 np0005465988 systemd-logind[827]: Session 1 logged out. Waiting for processes to exit.
Oct  2 06:49:18 np0005465988 systemd[1063]: Starting Mark boot as successful...
Oct  2 06:49:18 np0005465988 systemd[1063]: Finished Mark boot as successful.
Oct  2 06:49:43 np0005465988 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  2 06:49:43 np0005465988 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Oct  2 06:49:43 np0005465988 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Oct  2 06:49:43 np0005465988 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Oct  2 06:49:43 np0005465988 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Oct  2 06:49:43 np0005465988 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Oct  2 06:49:43 np0005465988 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Oct  2 06:49:43 np0005465988 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Oct  2 06:49:43 np0005465988 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Oct  2 06:49:43 np0005465988 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct  2 06:49:43 np0005465988 NetworkManager[862]: <info>  [1759402183.5869] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  2 06:49:43 np0005465988 systemd-udevd[3707]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 06:49:43 np0005465988 NetworkManager[862]: <info>  [1759402183.6051] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 06:49:43 np0005465988 NetworkManager[862]: <info>  [1759402183.6077] settings: (eth1): created default wired connection 'Wired connection 1'
Oct  2 06:49:43 np0005465988 NetworkManager[862]: <info>  [1759402183.6080] device (eth1): carrier: link connected
Oct  2 06:49:43 np0005465988 NetworkManager[862]: <info>  [1759402183.6082] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  2 06:49:43 np0005465988 NetworkManager[862]: <info>  [1759402183.6087] policy: auto-activating connection 'Wired connection 1' (7ba3a9ff-c329-39b0-9fbb-821e12c36ef0)
Oct  2 06:49:43 np0005465988 NetworkManager[862]: <info>  [1759402183.6090] device (eth1): Activation: starting connection 'Wired connection 1' (7ba3a9ff-c329-39b0-9fbb-821e12c36ef0)
Oct  2 06:49:43 np0005465988 NetworkManager[862]: <info>  [1759402183.6091] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 06:49:43 np0005465988 NetworkManager[862]: <info>  [1759402183.6093] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 06:49:43 np0005465988 NetworkManager[862]: <info>  [1759402183.6097] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 06:49:43 np0005465988 NetworkManager[862]: <info>  [1759402183.6100] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  2 06:49:44 np0005465988 systemd-logind[827]: New session 3 of user zuul.
Oct  2 06:49:44 np0005465988 systemd[1]: Started Session 3 of User zuul.
Oct  2 06:49:44 np0005465988 python3[3738]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-3fa1-89d7-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 06:49:54 np0005465988 python3[3818]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 06:49:55 np0005465988 python3[3891]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759402194.5106916-206-79379610941740/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=92a35283006dda49a92cab35792372469851aa6a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:49:55 np0005465988 python3[3941]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 06:49:55 np0005465988 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct  2 06:49:55 np0005465988 systemd[1]: Stopped Network Manager Wait Online.
Oct  2 06:49:55 np0005465988 systemd[1]: Stopping Network Manager Wait Online...
Oct  2 06:49:55 np0005465988 systemd[1]: Stopping Network Manager...
Oct  2 06:49:55 np0005465988 NetworkManager[862]: <info>  [1759402195.8361] caught SIGTERM, shutting down normally.
Oct  2 06:49:55 np0005465988 NetworkManager[862]: <info>  [1759402195.8372] dhcp4 (eth0): canceled DHCP transaction
Oct  2 06:49:55 np0005465988 NetworkManager[862]: <info>  [1759402195.8372] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 06:49:55 np0005465988 NetworkManager[862]: <info>  [1759402195.8372] dhcp4 (eth0): state changed no lease
Oct  2 06:49:55 np0005465988 NetworkManager[862]: <info>  [1759402195.8374] manager: NetworkManager state is now CONNECTING
Oct  2 06:49:55 np0005465988 NetworkManager[862]: <info>  [1759402195.8524] dhcp4 (eth1): canceled DHCP transaction
Oct  2 06:49:55 np0005465988 NetworkManager[862]: <info>  [1759402195.8525] dhcp4 (eth1): state changed no lease
Oct  2 06:49:55 np0005465988 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 06:49:55 np0005465988 NetworkManager[862]: <info>  [1759402195.8580] exiting (success)
Oct  2 06:49:55 np0005465988 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 06:49:55 np0005465988 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct  2 06:49:55 np0005465988 systemd[1]: Stopped Network Manager.
Oct  2 06:49:55 np0005465988 systemd[1]: NetworkManager.service: Consumed 1.345s CPU time, 10.0M memory peak.
Oct  2 06:49:55 np0005465988 systemd[1]: Starting Network Manager...
Oct  2 06:49:55 np0005465988 NetworkManager[3957]: <info>  [1759402195.9234] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:56c3487b-f235-4e9d-84a7-a894185724de)
Oct  2 06:49:55 np0005465988 NetworkManager[3957]: <info>  [1759402195.9236] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  2 06:49:55 np0005465988 NetworkManager[3957]: <info>  [1759402195.9283] manager[0x56397b077070]: monitoring kernel firmware directory '/lib/firmware'.
Oct  2 06:49:55 np0005465988 systemd[1]: Starting Hostname Service...
Oct  2 06:49:56 np0005465988 systemd[1]: Started Hostname Service.
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0030] hostname: hostname: using hostnamed
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0030] hostname: static hostname changed from (none) to "np0005465988.novalocal"
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0035] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0041] manager[0x56397b077070]: rfkill: Wi-Fi hardware radio set enabled
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0041] manager[0x56397b077070]: rfkill: WWAN hardware radio set enabled
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0065] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0066] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0066] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0067] manager: Networking is enabled by state file
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0068] settings: Loaded settings plugin: keyfile (internal)
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0072] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0097] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0106] dhcp: init: Using DHCP client 'internal'
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0108] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0112] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0118] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0124] device (lo): Activation: starting connection 'lo' (c72ddb6c-2533-4269-990c-c2ee08946507)
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0129] device (eth0): carrier: link connected
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0132] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0136] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0137] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0142] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0147] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0152] device (eth1): carrier: link connected
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0155] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0160] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (7ba3a9ff-c329-39b0-9fbb-821e12c36ef0) (indicated)
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0160] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0165] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0170] device (eth1): Activation: starting connection 'Wired connection 1' (7ba3a9ff-c329-39b0-9fbb-821e12c36ef0)
Oct  2 06:49:56 np0005465988 systemd[1]: Started Network Manager.
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0175] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0178] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0180] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0183] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0185] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0187] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0189] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0191] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0192] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0198] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0200] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0213] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0218] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0237] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0243] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0251] device (lo): Activation: successful, device activated.
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0260] dhcp4 (eth0): state changed new lease, address=38.129.56.216
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0269] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0321] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0348] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0350] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0352] manager: NetworkManager state is now CONNECTED_SITE
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0354] device (eth0): Activation: successful, device activated.
Oct  2 06:49:56 np0005465988 NetworkManager[3957]: <info>  [1759402196.0357] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  2 06:49:56 np0005465988 systemd[1]: Starting Network Manager Wait Online...
Oct  2 06:49:56 np0005465988 python3[4025]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-3fa1-89d7-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 06:50:06 np0005465988 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 06:50:26 np0005465988 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6029] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  2 06:50:41 np0005465988 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 06:50:41 np0005465988 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6323] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6330] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6351] device (eth1): Activation: successful, device activated.
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6363] manager: startup complete
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6366] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <warn>  [1759402241.6384] device (eth1): Activation: failed for connection 'Wired connection 1'
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6402] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Oct  2 06:50:41 np0005465988 systemd[1]: Finished Network Manager Wait Online.
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6484] dhcp4 (eth1): canceled DHCP transaction
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6486] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6486] dhcp4 (eth1): state changed no lease
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6506] policy: auto-activating connection 'ci-private-network' (5250b2c0-1115-5385-8a00-befdeda027ca)
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6512] device (eth1): Activation: starting connection 'ci-private-network' (5250b2c0-1115-5385-8a00-befdeda027ca)
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6513] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6518] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6527] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6540] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6606] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6610] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 06:50:41 np0005465988 NetworkManager[3957]: <info>  [1759402241.6619] device (eth1): Activation: successful, device activated.
Oct  2 06:50:51 np0005465988 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 06:50:56 np0005465988 systemd[1]: session-3.scope: Deactivated successfully.
Oct  2 06:50:56 np0005465988 systemd[1]: session-3.scope: Consumed 1.648s CPU time.
Oct  2 06:50:56 np0005465988 systemd-logind[827]: Session 3 logged out. Waiting for processes to exit.
Oct  2 06:50:56 np0005465988 systemd-logind[827]: Removed session 3.
Oct  2 06:51:13 np0005465988 systemd-logind[827]: New session 4 of user zuul.
Oct  2 06:51:13 np0005465988 systemd[1]: Started Session 4 of User zuul.
Oct  2 06:51:13 np0005465988 python3[4134]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 06:51:14 np0005465988 python3[4207]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759402273.5195613-373-66008759830325/source _original_basename=tmpxwhx6dqk follow=False checksum=21cbaa2be6010ff877e8cd89353655f79ca4b790 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:51:16 np0005465988 systemd[1]: session-4.scope: Deactivated successfully.
Oct  2 06:51:16 np0005465988 systemd-logind[827]: Session 4 logged out. Waiting for processes to exit.
Oct  2 06:51:16 np0005465988 systemd-logind[827]: Removed session 4.
Oct  2 06:52:18 np0005465988 systemd[1063]: Created slice User Background Tasks Slice.
Oct  2 06:52:18 np0005465988 systemd[1063]: Starting Cleanup of User's Temporary Files and Directories...
Oct  2 06:52:18 np0005465988 systemd[1063]: Finished Cleanup of User's Temporary Files and Directories.
Oct  2 06:56:45 np0005465988 systemd-logind[827]: New session 5 of user zuul.
Oct  2 06:56:45 np0005465988 systemd[1]: Started Session 5 of User zuul.
Oct  2 06:56:46 np0005465988 python3[4269]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-bd1e-1dbd-000000000ca2-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 06:56:46 np0005465988 python3[4297]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:56:46 np0005465988 python3[4324]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:56:47 np0005465988 python3[4350]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:56:47 np0005465988 python3[4376]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:56:48 np0005465988 python3[4402]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:56:48 np0005465988 python3[4402]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct  2 06:56:49 np0005465988 python3[4428]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 06:56:49 np0005465988 systemd[1]: Reloading.
Oct  2 06:56:49 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 06:56:50 np0005465988 python3[4484]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Oct  2 06:56:51 np0005465988 python3[4511]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 06:56:51 np0005465988 python3[4540]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 06:56:52 np0005465988 python3[4568]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 06:56:52 np0005465988 python3[4596]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 06:56:52 np0005465988 python3[4623]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-bd1e-1dbd-000000000ca8-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 06:56:53 np0005465988 python3[4653]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 06:56:56 np0005465988 systemd[1]: session-5.scope: Deactivated successfully.
Oct  2 06:56:56 np0005465988 systemd[1]: session-5.scope: Consumed 3.569s CPU time.
Oct  2 06:56:56 np0005465988 systemd-logind[827]: Session 5 logged out. Waiting for processes to exit.
Oct  2 06:56:56 np0005465988 systemd-logind[827]: Removed session 5.
Oct  2 06:56:58 np0005465988 systemd-logind[827]: New session 6 of user zuul.
Oct  2 06:56:58 np0005465988 systemd[1]: Started Session 6 of User zuul.
Oct  2 06:56:58 np0005465988 python3[4688]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct  2 06:57:15 np0005465988 kernel: SELinux:  Converting 363 SID table entries...
Oct  2 06:57:15 np0005465988 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 06:57:15 np0005465988 kernel: SELinux:  policy capability open_perms=1
Oct  2 06:57:15 np0005465988 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 06:57:15 np0005465988 kernel: SELinux:  policy capability always_check_network=0
Oct  2 06:57:15 np0005465988 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 06:57:15 np0005465988 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 06:57:15 np0005465988 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 06:57:26 np0005465988 kernel: SELinux:  Converting 363 SID table entries...
Oct  2 06:57:26 np0005465988 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 06:57:26 np0005465988 kernel: SELinux:  policy capability open_perms=1
Oct  2 06:57:26 np0005465988 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 06:57:26 np0005465988 kernel: SELinux:  policy capability always_check_network=0
Oct  2 06:57:26 np0005465988 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 06:57:26 np0005465988 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 06:57:26 np0005465988 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 06:57:35 np0005465988 kernel: SELinux:  Converting 363 SID table entries...
Oct  2 06:57:35 np0005465988 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 06:57:35 np0005465988 kernel: SELinux:  policy capability open_perms=1
Oct  2 06:57:35 np0005465988 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 06:57:35 np0005465988 kernel: SELinux:  policy capability always_check_network=0
Oct  2 06:57:35 np0005465988 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 06:57:35 np0005465988 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 06:57:35 np0005465988 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 06:57:36 np0005465988 setsebool[4756]: The virt_use_nfs policy boolean was changed to 1 by root
Oct  2 06:57:36 np0005465988 setsebool[4756]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Oct  2 06:57:48 np0005465988 kernel: SELinux:  Converting 366 SID table entries...
Oct  2 06:57:48 np0005465988 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 06:57:48 np0005465988 kernel: SELinux:  policy capability open_perms=1
Oct  2 06:57:48 np0005465988 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 06:57:48 np0005465988 kernel: SELinux:  policy capability always_check_network=0
Oct  2 06:57:48 np0005465988 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 06:57:48 np0005465988 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 06:57:48 np0005465988 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 06:58:10 np0005465988 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  2 06:58:10 np0005465988 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 06:58:10 np0005465988 systemd[1]: Starting man-db-cache-update.service...
Oct  2 06:58:10 np0005465988 systemd[1]: Reloading.
Oct  2 06:58:10 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 06:58:10 np0005465988 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 06:58:11 np0005465988 systemd[1]: Starting PackageKit Daemon...
Oct  2 06:58:11 np0005465988 systemd[1]: Starting Authorization Manager...
Oct  2 06:58:11 np0005465988 polkitd[6311]: Started polkitd version 0.117
Oct  2 06:58:11 np0005465988 systemd[1]: Started Authorization Manager.
Oct  2 06:58:11 np0005465988 systemd[1]: Started PackageKit Daemon.
Oct  2 06:58:14 np0005465988 python3[9146]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-e981-cfdf-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 06:58:15 np0005465988 kernel: evm: overlay not supported
Oct  2 06:58:15 np0005465988 systemd[1063]: Starting D-Bus User Message Bus...
Oct  2 06:58:15 np0005465988 dbus-broker-launch[9857]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct  2 06:58:15 np0005465988 dbus-broker-launch[9857]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct  2 06:58:15 np0005465988 systemd[1063]: Started D-Bus User Message Bus.
Oct  2 06:58:15 np0005465988 dbus-broker-lau[9857]: Ready
Oct  2 06:58:15 np0005465988 systemd[1063]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  2 06:58:15 np0005465988 systemd[1063]: Created slice Slice /user.
Oct  2 06:58:15 np0005465988 systemd[1063]: podman-9745.scope: unit configures an IP firewall, but not running as root.
Oct  2 06:58:15 np0005465988 systemd[1063]: (This warning is only shown for the first unit using IP firewalling.)
Oct  2 06:58:15 np0005465988 systemd[1063]: Started podman-9745.scope.
Oct  2 06:58:16 np0005465988 systemd[1063]: Started podman-pause-926350d6.scope.
Oct  2 06:58:16 np0005465988 python3[10446]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.59:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.59:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:58:17 np0005465988 systemd[1]: session-6.scope: Deactivated successfully.
Oct  2 06:58:17 np0005465988 systemd[1]: session-6.scope: Consumed 1min 3.404s CPU time.
Oct  2 06:58:17 np0005465988 systemd-logind[827]: Session 6 logged out. Waiting for processes to exit.
Oct  2 06:58:17 np0005465988 systemd-logind[827]: Removed session 6.
Oct  2 06:58:41 np0005465988 systemd-logind[827]: New session 7 of user zuul.
Oct  2 06:58:41 np0005465988 systemd[1]: Started Session 7 of User zuul.
Oct  2 06:58:42 np0005465988 python3[18574]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNoWb+X4zjyiL1C8i00X7uqnMpK2nqXgv8anwAVqGS5xltCc1+WIIAtSsS512sgaVFfbCTDrCO98+/EA3bNR7Uo= zuul@np0005465985.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:58:42 np0005465988 python3[18774]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNoWb+X4zjyiL1C8i00X7uqnMpK2nqXgv8anwAVqGS5xltCc1+WIIAtSsS512sgaVFfbCTDrCO98+/EA3bNR7Uo= zuul@np0005465985.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:58:44 np0005465988 python3[19206]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005465988.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct  2 06:58:44 np0005465988 python3[19519]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNoWb+X4zjyiL1C8i00X7uqnMpK2nqXgv8anwAVqGS5xltCc1+WIIAtSsS512sgaVFfbCTDrCO98+/EA3bNR7Uo= zuul@np0005465985.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 06:58:45 np0005465988 python3[19729]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 06:58:45 np0005465988 python3[19915]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759402724.872754-170-8567510711917/source _original_basename=tmpxiapsjxu follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 06:58:46 np0005465988 python3[20226]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Oct  2 06:58:46 np0005465988 systemd[1]: Starting Hostname Service...
Oct  2 06:58:46 np0005465988 systemd[1]: Started Hostname Service.
Oct  2 06:58:46 np0005465988 systemd-hostnamed[20327]: Changed pretty hostname to 'compute-2'
Oct  2 06:58:46 np0005465988 systemd-hostnamed[20327]: Hostname set to <compute-2> (static)
Oct  2 06:58:46 np0005465988 NetworkManager[3957]: <info>  [1759402726.7159] hostname: static hostname changed from "np0005465988.novalocal" to "compute-2"
Oct  2 06:58:46 np0005465988 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 06:58:46 np0005465988 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 06:58:47 np0005465988 systemd[1]: session-7.scope: Deactivated successfully.
Oct  2 06:58:47 np0005465988 systemd[1]: session-7.scope: Consumed 2.306s CPU time.
Oct  2 06:58:47 np0005465988 systemd-logind[827]: Session 7 logged out. Waiting for processes to exit.
Oct  2 06:58:47 np0005465988 systemd-logind[827]: Removed session 7.
Oct  2 06:58:56 np0005465988 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 06:59:12 np0005465988 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 06:59:12 np0005465988 systemd[1]: Finished man-db-cache-update.service.
Oct  2 06:59:12 np0005465988 systemd[1]: man-db-cache-update.service: Consumed 1min 6.201s CPU time.
Oct  2 06:59:12 np0005465988 systemd[1]: run-re1e52232e0d14b95bf0ff1c789176c73.service: Deactivated successfully.
Oct  2 06:59:16 np0005465988 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 07:01:01 np0005465988 systemd[1]: Starting Cleanup of Temporary Directories...
Oct  2 07:01:02 np0005465988 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct  2 07:01:02 np0005465988 systemd[1]: Finished Cleanup of Temporary Directories.
Oct  2 07:01:02 np0005465988 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct  2 07:03:16 np0005465988 systemd[1]: packagekit.service: Deactivated successfully.
Oct  2 07:03:50 np0005465988 systemd-logind[827]: New session 8 of user zuul.
Oct  2 07:03:50 np0005465988 systemd[1]: Started Session 8 of User zuul.
Oct  2 07:03:51 np0005465988 python3[26670]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:03:54 np0005465988 python3[26786]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:03:54 np0005465988 python3[26859]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403033.8182964-30615-212795378560985/source mode=0755 _original_basename=delorean.repo follow=False checksum=bb4c2ff9dad546f135d54d9729ea11b84117755d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:03:55 np0005465988 python3[26885]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:03:55 np0005465988 python3[26958]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403033.8182964-30615-212795378560985/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:03:55 np0005465988 python3[26984]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:03:56 np0005465988 python3[27057]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403033.8182964-30615-212795378560985/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:03:56 np0005465988 python3[27083]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:03:56 np0005465988 python3[27156]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403033.8182964-30615-212795378560985/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:03:57 np0005465988 python3[27182]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:03:57 np0005465988 python3[27255]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403033.8182964-30615-212795378560985/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:03:58 np0005465988 python3[27281]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:03:58 np0005465988 python3[27354]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403033.8182964-30615-212795378560985/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:03:58 np0005465988 python3[27380]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:03:59 np0005465988 python3[27453]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403033.8182964-30615-212795378560985/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=d911291791b114a72daf18f370e91cb1ae300933 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:04:11 np0005465988 python3[27501]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:09:10 np0005465988 systemd-logind[827]: Session 8 logged out. Waiting for processes to exit.
Oct  2 07:09:10 np0005465988 systemd[1]: session-8.scope: Deactivated successfully.
Oct  2 07:09:10 np0005465988 systemd[1]: session-8.scope: Consumed 6.234s CPU time.
Oct  2 07:09:10 np0005465988 systemd-logind[827]: Removed session 8.
Oct  2 07:18:18 np0005465988 systemd[1]: Starting dnf makecache...
Oct  2 07:18:18 np0005465988 dnf[27509]: Failed determining last makecache time.
Oct  2 07:18:18 np0005465988 dnf[27509]: delorean-openstack-barbican-42b4c41831408a8e323 341 kB/s |  13 kB     00:00
Oct  2 07:18:18 np0005465988 dnf[27509]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 2.7 MB/s |  65 kB     00:00
Oct  2 07:18:18 np0005465988 dnf[27509]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.3 MB/s |  32 kB     00:00
Oct  2 07:18:18 np0005465988 dnf[27509]: delorean-python-stevedore-c4acc5639fd2329372142 5.3 MB/s | 131 kB     00:00
Oct  2 07:18:18 np0005465988 dnf[27509]: delorean-python-cloudkitty-tests-tempest-3961dc 973 kB/s |  25 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: delorean-os-net-config-28598c2978b9e2207dd19fc4  12 MB/s | 356 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 1.8 MB/s |  42 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: delorean-python-designate-tests-tempest-347fdbc 605 kB/s |  18 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: delorean-openstack-glance-1fd12c29b339f30fe823e 781 kB/s |  18 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.3 MB/s |  29 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: delorean-openstack-manila-3c01b7181572c95dac462 1.2 MB/s |  25 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: delorean-python-whitebox-neutron-tests-tempest- 5.3 MB/s | 154 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: delorean-openstack-octavia-ba397f07a7331190208c 1.2 MB/s |  26 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: delorean-openstack-watcher-c014f81a8647287f6dcc 809 kB/s |  16 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: delorean-edpm-image-builder-55ba53cf215b14ed95b 381 kB/s | 7.4 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: delorean-puppet-ceph-b0c245ccde541a63fde0564366 4.7 MB/s | 144 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: delorean-openstack-swift-dc98a8463506ac520c469a 664 kB/s |  14 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: delorean-python-tempestconf-8515371b7cceebd4282 2.4 MB/s |  53 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: delorean-openstack-heat-ui-013accbfd179753bc3f0 3.8 MB/s |  96 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: CentOS Stream 9 - BaseOS                         55 kB/s | 6.7 kB     00:00
Oct  2 07:18:19 np0005465988 dnf[27509]: CentOS Stream 9 - AppStream                      73 kB/s | 6.8 kB     00:00
Oct  2 07:18:20 np0005465988 dnf[27509]: CentOS Stream 9 - CRB                            65 kB/s | 6.6 kB     00:00
Oct  2 07:18:20 np0005465988 dnf[27509]: CentOS Stream 9 - Extras packages                66 kB/s | 8.0 kB     00:00
Oct  2 07:18:20 np0005465988 dnf[27509]: dlrn-antelope-testing                            17 MB/s | 1.1 MB     00:00
Oct  2 07:18:20 np0005465988 dnf[27509]: dlrn-antelope-build-deps                         15 MB/s | 461 kB     00:00
Oct  2 07:18:20 np0005465988 dnf[27509]: centos9-rabbitmq                                7.1 MB/s | 123 kB     00:00
Oct  2 07:18:21 np0005465988 dnf[27509]: centos9-storage                                  19 MB/s | 415 kB     00:00
Oct  2 07:18:21 np0005465988 dnf[27509]: centos9-opstools                                1.2 MB/s |  51 kB     00:00
Oct  2 07:18:21 np0005465988 dnf[27509]: NFV SIG OpenvSwitch                             5.2 MB/s | 447 kB     00:00
Oct  2 07:18:22 np0005465988 dnf[27509]: repo-setup-centos-appstream                      43 MB/s |  25 MB     00:00
Oct  2 07:18:29 np0005465988 dnf[27509]: repo-setup-centos-baseos                        5.7 MB/s | 8.8 MB     00:01
Oct  2 07:18:30 np0005465988 dnf[27509]: repo-setup-centos-highavailability              1.3 MB/s | 744 kB     00:00
Oct  2 07:18:32 np0005465988 dnf[27509]: repo-setup-centos-powertools                    6.5 MB/s | 7.1 MB     00:01
Oct  2 07:18:38 np0005465988 dnf[27509]: Extra Packages for Enterprise Linux 9 - x86_64  4.7 MB/s |  20 MB     00:04
Oct  2 07:18:51 np0005465988 dnf[27509]: Metadata cache created.
Oct  2 07:18:51 np0005465988 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct  2 07:18:51 np0005465988 systemd[1]: Finished dnf makecache.
Oct  2 07:18:51 np0005465988 systemd[1]: dnf-makecache.service: Consumed 23.730s CPU time.
Oct  2 07:19:32 np0005465988 systemd-logind[827]: New session 9 of user zuul.
Oct  2 07:19:32 np0005465988 systemd[1]: Started Session 9 of User zuul.
Oct  2 07:19:33 np0005465988 python3.9[27769]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:19:35 np0005465988 python3.9[27950]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:19:48 np0005465988 systemd[1]: session-9.scope: Deactivated successfully.
Oct  2 07:19:48 np0005465988 systemd[1]: session-9.scope: Consumed 8.853s CPU time.
Oct  2 07:19:48 np0005465988 systemd-logind[827]: Session 9 logged out. Waiting for processes to exit.
Oct  2 07:19:48 np0005465988 systemd-logind[827]: Removed session 9.
Oct  2 07:20:04 np0005465988 systemd-logind[827]: New session 10 of user zuul.
Oct  2 07:20:04 np0005465988 systemd[1]: Started Session 10 of User zuul.
Oct  2 07:20:04 np0005465988 python3.9[28163]: ansible-ansible.legacy.ping Invoked with data=pong
Oct  2 07:20:06 np0005465988 python3.9[28337]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:20:07 np0005465988 python3.9[28489]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:20:08 np0005465988 python3.9[28642]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:20:09 np0005465988 python3.9[28794]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:20:09 np0005465988 python3.9[28946]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:20:10 np0005465988 python3.9[29069]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759404009.3494365-184-265161242115703/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:20:11 np0005465988 python3.9[29221]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:20:12 np0005465988 python3.9[29377]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:20:13 np0005465988 python3.9[29527]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:20:17 np0005465988 python3.9[29782]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:20:18 np0005465988 python3.9[29932]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:20:19 np0005465988 python3.9[30086]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:20:21 np0005465988 python3.9[30244]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:20:21 np0005465988 python3.9[30328]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:21:10 np0005465988 systemd[1]: Reloading.
Oct  2 07:21:11 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:21:11 np0005465988 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct  2 07:21:12 np0005465988 systemd[1]: Reloading.
Oct  2 07:21:12 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:21:12 np0005465988 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct  2 07:21:12 np0005465988 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct  2 07:21:12 np0005465988 systemd[1]: Reloading.
Oct  2 07:21:12 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:21:12 np0005465988 systemd[1]: Listening on LVM2 poll daemon socket.
Oct  2 07:21:13 np0005465988 dbus-broker-launch[814]: Noticed file-system modification, trigger reload.
Oct  2 07:21:13 np0005465988 dbus-broker-launch[814]: Noticed file-system modification, trigger reload.
Oct  2 07:21:13 np0005465988 dbus-broker-launch[814]: Noticed file-system modification, trigger reload.
Oct  2 07:22:44 np0005465988 kernel: SELinux:  Converting 2716 SID table entries...
Oct  2 07:22:44 np0005465988 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:22:44 np0005465988 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:22:44 np0005465988 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:22:44 np0005465988 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:22:44 np0005465988 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:22:44 np0005465988 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:22:44 np0005465988 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:22:45 np0005465988 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Oct  2 07:22:45 np0005465988 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:22:45 np0005465988 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:22:45 np0005465988 systemd[1]: Reloading.
Oct  2 07:22:45 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:22:45 np0005465988 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:22:46 np0005465988 systemd[1]: Starting PackageKit Daemon...
Oct  2 07:22:46 np0005465988 systemd[1]: Started PackageKit Daemon.
Oct  2 07:22:48 np0005465988 python3.9[31847]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:22:52 np0005465988 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:22:52 np0005465988 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:22:52 np0005465988 systemd[1]: man-db-cache-update.service: Consumed 1.498s CPU time.
Oct  2 07:22:52 np0005465988 systemd[1]: run-rf1ffb0b7294341128997750bd2806b2f.service: Deactivated successfully.
Oct  2 07:22:52 np0005465988 python3.9[32128]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct  2 07:22:53 np0005465988 python3.9[32282]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct  2 07:22:59 np0005465988 python3.9[32436]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:23:07 np0005465988 python3.9[32588]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct  2 07:23:08 np0005465988 python3.9[32740]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:23:09 np0005465988 python3.9[32892]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:23:10 np0005465988 python3.9[33015]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404188.854187-646-131718454821036/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=131465f77d8d4faa1442e1beada7324e1814ff9f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:23:11 np0005465988 python3.9[33167]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct  2 07:23:12 np0005465988 python3.9[33320]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:23:12 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:23:13 np0005465988 python3.9[33479]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  2 07:23:14 np0005465988 python3.9[33639]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct  2 07:23:15 np0005465988 python3.9[33792]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:23:16 np0005465988 python3.9[33950]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct  2 07:23:17 np0005465988 python3.9[34102]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:23:18 np0005465988 irqbalance[825]: Cannot change IRQ 26 affinity: Operation not permitted
Oct  2 07:23:18 np0005465988 irqbalance[825]: IRQ 26 affinity is now unmanaged
Oct  2 07:23:20 np0005465988 python3.9[34255]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:23:21 np0005465988 python3.9[34407]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:23:21 np0005465988 python3.9[34530]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759404200.7811162-932-279563564205712/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:23:23 np0005465988 python3.9[34682]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:23:23 np0005465988 systemd[1]: Starting Load Kernel Modules...
Oct  2 07:23:23 np0005465988 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct  2 07:23:23 np0005465988 kernel: Bridge firewalling registered
Oct  2 07:23:23 np0005465988 systemd-modules-load[34686]: Inserted module 'br_netfilter'
Oct  2 07:23:23 np0005465988 systemd[1]: Finished Load Kernel Modules.
Oct  2 07:23:24 np0005465988 python3.9[34843]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:23:24 np0005465988 python3.9[34966]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759404203.6100175-1001-100190640371680/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:23:25 np0005465988 python3.9[35118]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:23:31 np0005465988 dbus-broker-launch[814]: Noticed file-system modification, trigger reload.
Oct  2 07:23:31 np0005465988 dbus-broker-launch[814]: Noticed file-system modification, trigger reload.
Oct  2 07:23:32 np0005465988 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:23:32 np0005465988 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:23:32 np0005465988 systemd[1]: Reloading.
Oct  2 07:23:32 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:23:32 np0005465988 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:23:35 np0005465988 python3.9[37218]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:23:36 np0005465988 python3.9[38220]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct  2 07:23:37 np0005465988 python3.9[38869]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:23:38 np0005465988 python3.9[39314]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:23:38 np0005465988 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  2 07:23:39 np0005465988 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  2 07:23:40 np0005465988 python3.9[39687]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:23:40 np0005465988 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct  2 07:23:40 np0005465988 systemd[1]: tuned.service: Deactivated successfully.
Oct  2 07:23:40 np0005465988 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct  2 07:23:40 np0005465988 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  2 07:23:40 np0005465988 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  2 07:23:41 np0005465988 python3.9[39848]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct  2 07:23:42 np0005465988 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:23:42 np0005465988 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:23:42 np0005465988 systemd[1]: man-db-cache-update.service: Consumed 5.831s CPU time.
Oct  2 07:23:42 np0005465988 systemd[1]: run-rf29cf46d30674cb19dac9e013a1651e7.service: Deactivated successfully.
Oct  2 07:23:45 np0005465988 python3.9[40001]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:23:45 np0005465988 systemd[1]: Reloading.
Oct  2 07:23:45 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:23:46 np0005465988 python3.9[40191]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:23:46 np0005465988 systemd[1]: Reloading.
Oct  2 07:23:46 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:23:47 np0005465988 python3.9[40380]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:23:48 np0005465988 python3.9[40533]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:23:48 np0005465988 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct  2 07:23:49 np0005465988 python3.9[40686]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:23:52 np0005465988 python3.9[40848]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:23:53 np0005465988 python3.9[41001]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:23:53 np0005465988 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  2 07:23:53 np0005465988 systemd[1]: Stopped Apply Kernel Variables.
Oct  2 07:23:53 np0005465988 systemd[1]: Stopping Apply Kernel Variables...
Oct  2 07:23:53 np0005465988 systemd[1]: Starting Apply Kernel Variables...
Oct  2 07:23:53 np0005465988 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  2 07:23:53 np0005465988 systemd[1]: Finished Apply Kernel Variables.
Oct  2 07:23:53 np0005465988 systemd[1]: session-10.scope: Deactivated successfully.
Oct  2 07:23:53 np0005465988 systemd[1]: session-10.scope: Consumed 2min 17.272s CPU time.
Oct  2 07:23:53 np0005465988 systemd-logind[827]: Session 10 logged out. Waiting for processes to exit.
Oct  2 07:23:53 np0005465988 systemd-logind[827]: Removed session 10.
Oct  2 07:23:58 np0005465988 systemd-logind[827]: New session 11 of user zuul.
Oct  2 07:23:58 np0005465988 systemd[1]: Started Session 11 of User zuul.
Oct  2 07:24:00 np0005465988 python3.9[41184]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:24:01 np0005465988 python3.9[41340]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct  2 07:24:02 np0005465988 python3.9[41493]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:24:03 np0005465988 python3.9[41651]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  2 07:24:05 np0005465988 python3.9[41811]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:24:06 np0005465988 python3.9[41895]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:24:11 np0005465988 python3.9[42058]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:24:23 np0005465988 kernel: SELinux:  Converting 2726 SID table entries...
Oct  2 07:24:23 np0005465988 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:24:23 np0005465988 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:24:23 np0005465988 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:24:23 np0005465988 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:24:23 np0005465988 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:24:23 np0005465988 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:24:23 np0005465988 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:24:23 np0005465988 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Oct  2 07:24:23 np0005465988 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct  2 07:24:24 np0005465988 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:24:24 np0005465988 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:24:24 np0005465988 systemd[1]: Reloading.
Oct  2 07:24:24 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:24:24 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:24:25 np0005465988 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:24:26 np0005465988 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:24:26 np0005465988 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:24:26 np0005465988 systemd[1]: run-r8550b50ae3c44cb2aadd875d7e298223.service: Deactivated successfully.
Oct  2 07:24:28 np0005465988 python3.9[43161]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:24:28 np0005465988 systemd[1]: Reloading.
Oct  2 07:24:28 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:24:28 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:24:28 np0005465988 systemd[1]: Starting Open vSwitch Database Unit...
Oct  2 07:24:28 np0005465988 chown[43203]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct  2 07:24:28 np0005465988 ovs-ctl[43208]: /etc/openvswitch/conf.db does not exist ... (warning).
Oct  2 07:24:28 np0005465988 ovs-ctl[43208]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Oct  2 07:24:28 np0005465988 ovs-ctl[43208]: Starting ovsdb-server [  OK  ]
Oct  2 07:24:28 np0005465988 ovs-vsctl[43257]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct  2 07:24:29 np0005465988 ovs-vsctl[43277]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"90028908-5ebc-4bb4-8a1f-92ec79bb27aa\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct  2 07:24:29 np0005465988 ovs-ctl[43208]: Configuring Open vSwitch system IDs [  OK  ]
Oct  2 07:24:29 np0005465988 ovs-ctl[43208]: Enabling remote OVSDB managers [  OK  ]
Oct  2 07:24:29 np0005465988 systemd[1]: Started Open vSwitch Database Unit.
Oct  2 07:24:29 np0005465988 ovs-vsctl[43283]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Oct  2 07:24:29 np0005465988 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct  2 07:24:29 np0005465988 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct  2 07:24:29 np0005465988 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct  2 07:24:29 np0005465988 kernel: openvswitch: Open vSwitch switching datapath
Oct  2 07:24:29 np0005465988 ovs-ctl[43327]: Inserting openvswitch module [  OK  ]
Oct  2 07:24:29 np0005465988 ovs-ctl[43296]: Starting ovs-vswitchd [  OK  ]
Oct  2 07:24:29 np0005465988 ovs-vsctl[43345]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Oct  2 07:24:29 np0005465988 ovs-ctl[43296]: Enabling remote OVSDB managers [  OK  ]
Oct  2 07:24:29 np0005465988 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct  2 07:24:29 np0005465988 systemd[1]: Starting Open vSwitch...
Oct  2 07:24:29 np0005465988 systemd[1]: Finished Open vSwitch.
Oct  2 07:24:30 np0005465988 python3.9[43496]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:24:31 np0005465988 python3.9[43648]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct  2 07:24:32 np0005465988 kernel: SELinux:  Converting 2740 SID table entries...
Oct  2 07:24:32 np0005465988 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:24:32 np0005465988 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:24:32 np0005465988 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:24:32 np0005465988 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:24:32 np0005465988 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:24:32 np0005465988 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:24:32 np0005465988 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:24:35 np0005465988 python3.9[43803]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:24:36 np0005465988 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Oct  2 07:24:36 np0005465988 python3.9[43961]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:24:38 np0005465988 python3.9[44114]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:24:40 np0005465988 python3.9[44401]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  2 07:24:41 np0005465988 python3.9[44551]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:24:41 np0005465988 python3.9[44705]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:24:44 np0005465988 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:24:44 np0005465988 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:24:44 np0005465988 systemd[1]: Reloading.
Oct  2 07:24:44 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:24:44 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:24:44 np0005465988 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:24:45 np0005465988 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:24:45 np0005465988 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:24:45 np0005465988 systemd[1]: run-r9c26d779792c4bae80412b7806a4b067.service: Deactivated successfully.
Oct  2 07:24:46 np0005465988 python3.9[45023]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:24:46 np0005465988 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct  2 07:24:46 np0005465988 systemd[1]: Stopped Network Manager Wait Online.
Oct  2 07:24:46 np0005465988 systemd[1]: Stopping Network Manager Wait Online...
Oct  2 07:24:46 np0005465988 systemd[1]: Stopping Network Manager...
Oct  2 07:24:46 np0005465988 NetworkManager[3957]: <info>  [1759404286.2192] caught SIGTERM, shutting down normally.
Oct  2 07:24:46 np0005465988 NetworkManager[3957]: <info>  [1759404286.2210] dhcp4 (eth0): canceled DHCP transaction
Oct  2 07:24:46 np0005465988 NetworkManager[3957]: <info>  [1759404286.2210] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:24:46 np0005465988 NetworkManager[3957]: <info>  [1759404286.2210] dhcp4 (eth0): state changed no lease
Oct  2 07:24:46 np0005465988 NetworkManager[3957]: <info>  [1759404286.2214] manager: NetworkManager state is now CONNECTED_SITE
Oct  2 07:24:46 np0005465988 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:24:46 np0005465988 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:24:46 np0005465988 NetworkManager[3957]: <info>  [1759404286.3662] exiting (success)
Oct  2 07:24:46 np0005465988 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct  2 07:24:46 np0005465988 systemd[1]: Stopped Network Manager.
Oct  2 07:24:46 np0005465988 systemd[1]: NetworkManager.service: Consumed 13.066s CPU time, 4.1M memory peak, read 0B from disk, written 20.0K to disk.
Oct  2 07:24:46 np0005465988 systemd[1]: Starting Network Manager...
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.4210] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:56c3487b-f235-4e9d-84a7-a894185724de)
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.4212] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.4267] manager[0x5622396c1090]: monitoring kernel firmware directory '/lib/firmware'.
Oct  2 07:24:46 np0005465988 systemd[1]: Starting Hostname Service...
Oct  2 07:24:46 np0005465988 systemd[1]: Started Hostname Service.
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5496] hostname: hostname: using hostnamed
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5496] hostname: static hostname changed from (none) to "compute-2"
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5501] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5505] manager[0x5622396c1090]: rfkill: Wi-Fi hardware radio set enabled
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5505] manager[0x5622396c1090]: rfkill: WWAN hardware radio set enabled
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5526] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5534] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5535] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5535] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5536] manager: Networking is enabled by state file
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5538] settings: Loaded settings plugin: keyfile (internal)
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5541] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5567] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5578] dhcp: init: Using DHCP client 'internal'
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5581] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5586] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5591] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5599] device (lo): Activation: starting connection 'lo' (c72ddb6c-2533-4269-990c-c2ee08946507)
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5607] device (eth0): carrier: link connected
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5611] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5616] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5616] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5623] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5629] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5636] device (eth1): carrier: link connected
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5639] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5643] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (5250b2c0-1115-5385-8a00-befdeda027ca) (indicated)
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5643] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5648] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5655] device (eth1): Activation: starting connection 'ci-private-network' (5250b2c0-1115-5385-8a00-befdeda027ca)
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5662] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  2 07:24:46 np0005465988 systemd[1]: Started Network Manager.
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5669] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5672] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5674] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5676] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5679] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5681] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5684] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5689] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5707] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5709] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5717] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5729] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5737] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5739] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5745] device (lo): Activation: successful, device activated.
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5752] dhcp4 (eth0): state changed new lease, address=38.129.56.216
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.5760] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  2 07:24:46 np0005465988 systemd[1]: Starting Network Manager Wait Online...
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.6576] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.6583] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.6585] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.6587] manager: NetworkManager state is now CONNECTED_LOCAL
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.6590] device (eth1): Activation: successful, device activated.
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.6597] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.6598] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.6601] manager: NetworkManager state is now CONNECTED_SITE
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.6604] device (eth0): Activation: successful, device activated.
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.6608] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  2 07:24:46 np0005465988 NetworkManager[45041]: <info>  [1759404286.6610] manager: startup complete
Oct  2 07:24:46 np0005465988 systemd[1]: Finished Network Manager Wait Online.
Oct  2 07:24:47 np0005465988 python3.9[45249]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:24:52 np0005465988 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:24:52 np0005465988 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:24:52 np0005465988 systemd[1]: Reloading.
Oct  2 07:24:52 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:24:52 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:24:52 np0005465988 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:24:53 np0005465988 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:24:53 np0005465988 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:24:53 np0005465988 systemd[1]: run-r5022e31ab31044e290953a600069b154.service: Deactivated successfully.
Oct  2 07:24:55 np0005465988 python3.9[45711]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:24:55 np0005465988 python3.9[45863]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:24:56 np0005465988 python3.9[46017]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:24:56 np0005465988 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:24:57 np0005465988 python3.9[46169]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:24:58 np0005465988 python3.9[46321]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:24:58 np0005465988 python3.9[46473]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:24:59 np0005465988 python3.9[46625]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:25:00 np0005465988 python3.9[46748]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759404299.0838084-654-190116668369184/.source _original_basename=.69xje4dt follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:25:01 np0005465988 python3.9[46900]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:25:02 np0005465988 python3.9[47052]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct  2 07:25:03 np0005465988 python3.9[47204]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:25:05 np0005465988 python3.9[47631]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct  2 07:25:06 np0005465988 ansible-async_wrapper.py[47806]: Invoked with j702790357617 300 /home/zuul/.ansible/tmp/ansible-tmp-1759404305.7705326-852-73096452997763/AnsiballZ_edpm_os_net_config.py _
Oct  2 07:25:06 np0005465988 ansible-async_wrapper.py[47809]: Starting module and watcher
Oct  2 07:25:06 np0005465988 ansible-async_wrapper.py[47809]: Start watching 47810 (300)
Oct  2 07:25:06 np0005465988 ansible-async_wrapper.py[47810]: Start module (47810)
Oct  2 07:25:06 np0005465988 ansible-async_wrapper.py[47806]: Return async_wrapper task started.
Oct  2 07:25:06 np0005465988 python3.9[47811]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Oct  2 07:25:07 np0005465988 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct  2 07:25:07 np0005465988 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct  2 07:25:07 np0005465988 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct  2 07:25:07 np0005465988 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct  2 07:25:07 np0005465988 kernel: cfg80211: failed to load regulatory.db
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.7679] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.7697] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8339] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8340] audit: op="connection-add" uuid="bc59c6da-a657-4046-9310-b6068b4e9e52" name="br-ex-br" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8358] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8360] audit: op="connection-add" uuid="59ce69d2-9692-4c08-afd5-1c5dc5744330" name="br-ex-port" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8375] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8376] audit: op="connection-add" uuid="c4a5e099-4e8c-44ee-9ccb-7834c48427b2" name="eth1-port" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8395] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8396] audit: op="connection-add" uuid="1c75ca9e-b45c-4e36-a95d-3ccc4862bb05" name="vlan20-port" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8417] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8419] audit: op="connection-add" uuid="2056253a-61f4-4159-8350-57a2e9ae2607" name="vlan21-port" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8437] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8440] audit: op="connection-add" uuid="7be05f0a-83f6-4e6a-925d-6dbeacf5d756" name="vlan22-port" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8460] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8465] audit: op="connection-add" uuid="3c5067bf-1637-4346-8722-66be3ec1c81c" name="vlan23-port" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8487] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,connection.timestamp,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8504] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8505] audit: op="connection-add" uuid="d98d298d-f8b2-4262-84ad-6660786776fa" name="br-ex-if" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8895] audit: op="connection-update" uuid="5250b2c0-1115-5385-8a00-befdeda027ca" name="ci-private-network" args="ovs-external-ids.data,connection.master,connection.timestamp,connection.port-type,connection.controller,connection.slave-type,ipv4.routes,ipv4.dns,ipv4.never-default,ipv4.routing-rules,ipv4.addresses,ipv4.method,ovs-interface.type,ipv6.routes,ipv6.dns,ipv6.addr-gen-mode,ipv6.routing-rules,ipv6.addresses,ipv6.method" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8931] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8936] audit: op="connection-add" uuid="6d848826-eb31-46af-974d-280e815df650" name="vlan20-if" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8962] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8966] audit: op="connection-add" uuid="f6d301ad-9e4e-4abf-a2f9-a55d1ff6c41f" name="vlan21-if" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8996] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.8999] audit: op="connection-add" uuid="4effa8f7-87f7-4d13-afe9-8965f6e9d4f9" name="vlan22-if" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9028] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9032] audit: op="connection-add" uuid="b308ecc9-9d67-49e4-8ae9-2b032a51e445" name="vlan23-if" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9050] audit: op="connection-delete" uuid="7ba3a9ff-c329-39b0-9fbb-821e12c36ef0" name="Wired connection 1" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9069] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9083] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9089] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (bc59c6da-a657-4046-9310-b6068b4e9e52)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9091] audit: op="connection-activate" uuid="bc59c6da-a657-4046-9310-b6068b4e9e52" name="br-ex-br" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9094] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9106] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9113] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (59ce69d2-9692-4c08-afd5-1c5dc5744330)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9116] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9126] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9132] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (c4a5e099-4e8c-44ee-9ccb-7834c48427b2)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9134] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9143] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9150] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (1c75ca9e-b45c-4e36-a95d-3ccc4862bb05)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9152] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9162] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9168] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (2056253a-61f4-4159-8350-57a2e9ae2607)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9171] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9179] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9185] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (7be05f0a-83f6-4e6a-925d-6dbeacf5d756)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9187] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9196] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9202] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (3c5067bf-1637-4346-8722-66be3ec1c81c)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9203] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9207] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9210] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9220] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9229] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9237] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (d98d298d-f8b2-4262-84ad-6660786776fa)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9238] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9243] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9246] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9248] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9250] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9265] device (eth1): disconnecting for new activation request.
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9266] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9270] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9273] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9275] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9279] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9286] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9293] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (6d848826-eb31-46af-974d-280e815df650)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9295] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9299] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9302] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9304] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9308] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9315] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9322] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (f6d301ad-9e4e-4abf-a2f9-a55d1ff6c41f)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9323] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9328] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9332] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9334] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9339] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9346] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9353] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (4effa8f7-87f7-4d13-afe9-8965f6e9d4f9)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9355] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9359] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9362] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9365] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9370] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9377] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9384] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (b308ecc9-9d67-49e4-8ae9-2b032a51e445)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9385] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9389] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9392] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9394] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9397] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9414] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method" pid=47812 uid=0 result="success"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9417] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9423] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9426] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9435] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9440] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9445] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9451] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 kernel: ovs-system: entered promiscuous mode
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9455] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9462] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9468] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9472] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9474] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9481] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 kernel: Timeout policy base is empty
Oct  2 07:25:08 np0005465988 systemd-udevd[47817]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9495] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9499] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9500] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9508] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9514] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9518] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9521] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9525] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9531] dhcp4 (eth0): canceled DHCP transaction
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9531] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9531] dhcp4 (eth0): state changed no lease
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9533] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct  2 07:25:08 np0005465988 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9545] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9549] audit: op="device-reapply" interface="eth1" ifindex=3 pid=47812 uid=0 result="fail" reason="Device is not activated"
Oct  2 07:25:08 np0005465988 NetworkManager[45041]: <info>  [1759404308.9555] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct  2 07:25:08 np0005465988 kernel: br-ex: entered promiscuous mode
Oct  2 07:25:08 np0005465988 kernel: vlan20: entered promiscuous mode
Oct  2 07:25:08 np0005465988 systemd-udevd[47816]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:25:09 np0005465988 kernel: vlan21: entered promiscuous mode
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0097] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0105] dhcp4 (eth0): state changed new lease, address=38.129.56.216
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0117] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0125] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0136] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0142] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0148] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct  2 07:25:09 np0005465988 kernel: vlan22: entered promiscuous mode
Oct  2 07:25:09 np0005465988 kernel: vlan23: entered promiscuous mode
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0871] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0887] device (eth1): Activation: starting connection 'ci-private-network' (5250b2c0-1115-5385-8a00-befdeda027ca)
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0891] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0893] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0894] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0896] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0897] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0898] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0900] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0902] device (eth1): state change: disconnected -> deactivating (reason 'new-activation', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0907] device (eth1): disconnecting for new activation request.
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0908] audit: op="connection-activate" uuid="5250b2c0-1115-5385-8a00-befdeda027ca" name="ci-private-network" pid=47812 uid=0 result="success"
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0923] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0930] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0933] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0938] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0942] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0947] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0950] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0956] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0959] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0964] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0967] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0972] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0976] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0980] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0985] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.0989] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1009] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1016] device (eth1): Activation: starting connection 'ci-private-network' (5250b2c0-1115-5385-8a00-befdeda027ca)
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1018] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47812 uid=0 result="success"
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1021] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1039] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1043] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1050] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1056] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1071] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1077] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1080] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1086] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1095] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1101] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  2 07:25:09 np0005465988 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1110] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1113] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1117] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1119] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1125] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1130] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1136] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1142] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1144] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1147] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1152] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1157] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1163] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1169] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1175] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1177] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:25:09 np0005465988 NetworkManager[45041]: <info>  [1759404309.1184] device (eth1): Activation: successful, device activated.
Oct  2 07:25:10 np0005465988 NetworkManager[45041]: <info>  [1759404310.3196] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47812 uid=0 result="success"
Oct  2 07:25:10 np0005465988 NetworkManager[45041]: <info>  [1759404310.4936] checkpoint[0x562239697950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Oct  2 07:25:10 np0005465988 NetworkManager[45041]: <info>  [1759404310.4939] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47812 uid=0 result="success"
Oct  2 07:25:10 np0005465988 python3.9[48174]: ansible-ansible.legacy.async_status Invoked with jid=j702790357617.47806 mode=status _async_dir=/root/.ansible_async
Oct  2 07:25:10 np0005465988 NetworkManager[45041]: <info>  [1759404310.7567] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=47812 uid=0 result="success"
Oct  2 07:25:10 np0005465988 NetworkManager[45041]: <info>  [1759404310.7578] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=47812 uid=0 result="success"
Oct  2 07:25:11 np0005465988 NetworkManager[45041]: <info>  [1759404311.0414] audit: op="networking-control" arg="global-dns-configuration" pid=47812 uid=0 result="success"
Oct  2 07:25:11 np0005465988 NetworkManager[45041]: <info>  [1759404311.1558] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Oct  2 07:25:11 np0005465988 NetworkManager[45041]: <info>  [1759404311.1967] audit: op="networking-control" arg="global-dns-configuration" pid=47812 uid=0 result="success"
Oct  2 07:25:11 np0005465988 NetworkManager[45041]: <info>  [1759404311.2005] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=47812 uid=0 result="success"
Oct  2 07:25:11 np0005465988 NetworkManager[45041]: <info>  [1759404311.3971] checkpoint[0x562239697a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Oct  2 07:25:11 np0005465988 NetworkManager[45041]: <info>  [1759404311.3981] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=47812 uid=0 result="success"
Oct  2 07:25:11 np0005465988 ansible-async_wrapper.py[47810]: Module complete (47810)
Oct  2 07:25:11 np0005465988 ansible-async_wrapper.py[47809]: Done in kid B.
Oct  2 07:25:14 np0005465988 python3.9[48280]: ansible-ansible.legacy.async_status Invoked with jid=j702790357617.47806 mode=status _async_dir=/root/.ansible_async
Oct  2 07:25:14 np0005465988 python3.9[48380]: ansible-ansible.legacy.async_status Invoked with jid=j702790357617.47806 mode=cleanup _async_dir=/root/.ansible_async
Oct  2 07:25:15 np0005465988 python3.9[48532]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:25:16 np0005465988 python3.9[48655]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759404314.9762588-933-211101413085536/.source.returncode _original_basename=.6bq7b7ft follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:25:16 np0005465988 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 07:25:17 np0005465988 python3.9[48810]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:25:17 np0005465988 python3.9[48934]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759404316.531058-981-194304316229739/.source.cfg _original_basename=.g83lqtcd follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:25:18 np0005465988 python3.9[49086]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:25:18 np0005465988 systemd[1]: Reloading Network Manager...
Oct  2 07:25:18 np0005465988 NetworkManager[45041]: <info>  [1759404318.9900] audit: op="reload" arg="0" pid=49090 uid=0 result="success"
Oct  2 07:25:18 np0005465988 NetworkManager[45041]: <info>  [1759404318.9914] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Oct  2 07:25:19 np0005465988 systemd[1]: Reloaded Network Manager.
Oct  2 07:25:19 np0005465988 systemd[1]: session-11.scope: Deactivated successfully.
Oct  2 07:25:19 np0005465988 systemd[1]: session-11.scope: Consumed 52.653s CPU time.
Oct  2 07:25:19 np0005465988 systemd-logind[827]: Session 11 logged out. Waiting for processes to exit.
Oct  2 07:25:19 np0005465988 systemd-logind[827]: Removed session 11.
Oct  2 07:25:24 np0005465988 systemd-logind[827]: New session 12 of user zuul.
Oct  2 07:25:24 np0005465988 systemd[1]: Started Session 12 of User zuul.
Oct  2 07:25:26 np0005465988 python3.9[49274]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:25:27 np0005465988 python3.9[49428]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:25:28 np0005465988 python3.9[49622]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:25:28 np0005465988 systemd-logind[827]: Session 12 logged out. Waiting for processes to exit.
Oct  2 07:25:28 np0005465988 systemd[1]: session-12.scope: Deactivated successfully.
Oct  2 07:25:28 np0005465988 systemd[1]: session-12.scope: Consumed 2.759s CPU time.
Oct  2 07:25:28 np0005465988 systemd-logind[827]: Removed session 12.
Oct  2 07:25:29 np0005465988 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:25:35 np0005465988 systemd-logind[827]: New session 13 of user zuul.
Oct  2 07:25:35 np0005465988 systemd[1]: Started Session 13 of User zuul.
Oct  2 07:25:36 np0005465988 python3.9[49804]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:25:37 np0005465988 python3.9[49959]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:25:38 np0005465988 python3.9[50115]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:25:39 np0005465988 python3.9[50199]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:25:41 np0005465988 python3.9[50353]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:25:42 np0005465988 python3.9[50548]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:25:43 np0005465988 python3.9[50700]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:25:43 np0005465988 systemd[1]: var-lib-containers-storage-overlay-compat2958890847-merged.mount: Deactivated successfully.
Oct  2 07:25:43 np0005465988 podman[50701]: 2025-10-02 11:25:43.863961685 +0000 UTC m=+0.078984653 system refresh
Oct  2 07:25:44 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:25:45 np0005465988 python3.9[50863]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:25:46 np0005465988 python3.9[50986]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404344.592006-204-144797055200890/.source.json follow=False _original_basename=podman_network_config.j2 checksum=b8e9b1ed4f0544f43c742c6085b28002b7c2e6ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:25:46 np0005465988 python3.9[51138]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:25:47 np0005465988 python3.9[51261]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759404346.2339067-249-45087625496587/.source.conf follow=False _original_basename=registries.conf.j2 checksum=b0997da0dac7c72916bfa4feb1650346bde4dfbe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:25:48 np0005465988 python3.9[51413]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:25:49 np0005465988 python3.9[51565]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:25:49 np0005465988 python3.9[51717]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:25:50 np0005465988 python3.9[51869]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:25:51 np0005465988 python3.9[52021]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:25:53 np0005465988 python3.9[52174]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:25:54 np0005465988 python3.9[52328]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:25:55 np0005465988 python3.9[52480]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:25:56 np0005465988 python3.9[52632]: ansible-service_facts Invoked
Oct  2 07:25:56 np0005465988 network[52649]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:25:56 np0005465988 network[52650]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:25:56 np0005465988 network[52651]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:26:01 np0005465988 python3.9[53105]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:26:04 np0005465988 python3.9[53258]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct  2 07:26:05 np0005465988 python3.9[53410]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:26:06 np0005465988 python3.9[53535]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759404365.4558072-646-202209506076441/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:26:07 np0005465988 python3.9[53689]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:26:07 np0005465988 python3.9[53814]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759404366.8757956-691-48769842709941/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:26:09 np0005465988 python3.9[53968]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:26:11 np0005465988 python3.9[54122]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:26:12 np0005465988 python3.9[54206]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:26:13 np0005465988 python3.9[54360]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:26:14 np0005465988 python3.9[54444]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:26:14 np0005465988 chronyd[834]: chronyd exiting
Oct  2 07:26:14 np0005465988 systemd[1]: Stopping NTP client/server...
Oct  2 07:26:14 np0005465988 systemd[1]: chronyd.service: Deactivated successfully.
Oct  2 07:26:14 np0005465988 systemd[1]: Stopped NTP client/server.
Oct  2 07:26:14 np0005465988 systemd[1]: Starting NTP client/server...
Oct  2 07:26:14 np0005465988 chronyd[54453]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  2 07:26:14 np0005465988 chronyd[54453]: Frequency -28.764 +/- 0.126 ppm read from /var/lib/chrony/drift
Oct  2 07:26:14 np0005465988 chronyd[54453]: Loaded seccomp filter (level 2)
Oct  2 07:26:14 np0005465988 systemd[1]: Started NTP client/server.
Oct  2 07:26:15 np0005465988 systemd[1]: session-13.scope: Deactivated successfully.
Oct  2 07:26:15 np0005465988 systemd[1]: session-13.scope: Consumed 27.018s CPU time.
Oct  2 07:26:15 np0005465988 systemd-logind[827]: Session 13 logged out. Waiting for processes to exit.
Oct  2 07:26:15 np0005465988 systemd-logind[827]: Removed session 13.
Oct  2 07:26:21 np0005465988 systemd-logind[827]: New session 14 of user zuul.
Oct  2 07:26:21 np0005465988 systemd[1]: Started Session 14 of User zuul.
Oct  2 07:26:22 np0005465988 python3.9[54634]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:26:23 np0005465988 python3.9[54786]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:26:23 np0005465988 python3.9[54909]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759404382.3577082-69-81867724389637/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:26:24 np0005465988 systemd[1]: session-14.scope: Deactivated successfully.
Oct  2 07:26:24 np0005465988 systemd[1]: session-14.scope: Consumed 1.935s CPU time.
Oct  2 07:26:24 np0005465988 systemd-logind[827]: Session 14 logged out. Waiting for processes to exit.
Oct  2 07:26:24 np0005465988 systemd-logind[827]: Removed session 14.
Oct  2 07:26:30 np0005465988 systemd-logind[827]: New session 15 of user zuul.
Oct  2 07:26:30 np0005465988 systemd[1]: Started Session 15 of User zuul.
Oct  2 07:26:31 np0005465988 python3.9[55087]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:26:32 np0005465988 python3.9[55243]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:26:33 np0005465988 python3.9[55418]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:26:34 np0005465988 python3.9[55541]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759404392.6679337-90-102432922459163/.source.json _original_basename=.p6lqwgt0 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:26:35 np0005465988 python3.9[55693]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:26:35 np0005465988 python3.9[55816]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759404394.6593146-159-31790965546396/.source _original_basename=.rkjfrhjj follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:26:36 np0005465988 python3.9[55968]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:26:37 np0005465988 python3.9[56120]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:26:37 np0005465988 python3.9[56243]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759404396.6834383-231-268263912992730/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:26:38 np0005465988 python3.9[56395]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:26:39 np0005465988 python3.9[56518]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759404398.023798-231-250078556788384/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:26:39 np0005465988 python3.9[56670]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:26:40 np0005465988 python3.9[56822]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:26:41 np0005465988 python3.9[56945]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404400.142137-343-263159011505405/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:26:42 np0005465988 python3.9[57097]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:26:42 np0005465988 python3.9[57220]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404401.5585177-388-25258742728057/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:26:44 np0005465988 python3.9[57372]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:26:44 np0005465988 systemd[1]: Reloading.
Oct  2 07:26:44 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:26:44 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:26:44 np0005465988 systemd[1]: Reloading.
Oct  2 07:26:44 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:26:44 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:26:44 np0005465988 systemd[1]: Starting EDPM Container Shutdown...
Oct  2 07:26:44 np0005465988 systemd[1]: Finished EDPM Container Shutdown.
Oct  2 07:26:45 np0005465988 python3.9[57599]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:26:46 np0005465988 python3.9[57722]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404404.8884318-457-208918379812228/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:26:46 np0005465988 python3.9[57874]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:26:47 np0005465988 python3.9[57997]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404406.2898672-501-204531085366371/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:26:48 np0005465988 python3.9[58149]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:26:48 np0005465988 systemd[1]: Reloading.
Oct  2 07:26:48 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:26:48 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:26:48 np0005465988 systemd[1]: Reloading.
Oct  2 07:26:48 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:26:48 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:26:49 np0005465988 systemd[1]: Starting Create netns directory...
Oct  2 07:26:49 np0005465988 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:26:49 np0005465988 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:26:49 np0005465988 systemd[1]: Finished Create netns directory.
Oct  2 07:26:49 np0005465988 python3.9[58377]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:26:50 np0005465988 network[58394]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:26:50 np0005465988 network[58395]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:26:50 np0005465988 network[58396]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:26:54 np0005465988 python3.9[58660]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:26:54 np0005465988 systemd[1]: Reloading.
Oct  2 07:26:54 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:26:54 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:26:54 np0005465988 systemd[1]: Stopping IPv4 firewall with iptables...
Oct  2 07:26:54 np0005465988 iptables.init[58701]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Oct  2 07:26:54 np0005465988 iptables.init[58701]: iptables: Flushing firewall rules: [  OK  ]
Oct  2 07:26:54 np0005465988 systemd[1]: iptables.service: Deactivated successfully.
Oct  2 07:26:54 np0005465988 systemd[1]: Stopped IPv4 firewall with iptables.
Oct  2 07:26:55 np0005465988 python3.9[58897]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:26:56 np0005465988 python3.9[59051]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:26:56 np0005465988 systemd[1]: Reloading.
Oct  2 07:26:56 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:26:56 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:26:57 np0005465988 systemd[1]: Starting Netfilter Tables...
Oct  2 07:26:57 np0005465988 systemd[1]: Finished Netfilter Tables.
Oct  2 07:26:58 np0005465988 python3.9[59243]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:26:59 np0005465988 python3.9[59396]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:27:00 np0005465988 python3.9[59521]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759404418.7139866-709-160973337311953/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:27:01 np0005465988 python3.9[59672]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:27:26 np0005465988 systemd[1]: session-15.scope: Deactivated successfully.
Oct  2 07:27:26 np0005465988 systemd[1]: session-15.scope: Consumed 22.320s CPU time.
Oct  2 07:27:26 np0005465988 systemd-logind[827]: Session 15 logged out. Waiting for processes to exit.
Oct  2 07:27:26 np0005465988 systemd-logind[827]: Removed session 15.
Oct  2 07:27:39 np0005465988 systemd-logind[827]: New session 16 of user zuul.
Oct  2 07:27:39 np0005465988 systemd[1]: Started Session 16 of User zuul.
Oct  2 07:27:41 np0005465988 python3.9[59865]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:27:42 np0005465988 python3.9[60021]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:27:43 np0005465988 python3.9[60196]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:27:43 np0005465988 python3.9[60274]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.kbd27dmr recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:27:44 np0005465988 python3.9[60426]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:27:45 np0005465988 python3.9[60504]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.p4hkolbs recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:27:45 np0005465988 python3.9[60656]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:27:46 np0005465988 python3.9[60808]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:27:47 np0005465988 python3.9[60886]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:27:47 np0005465988 python3.9[61038]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:27:48 np0005465988 python3.9[61116]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:27:48 np0005465988 python3.9[61268]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:27:49 np0005465988 python3.9[61420]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:27:50 np0005465988 python3.9[61498]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:27:50 np0005465988 python3.9[61650]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:27:51 np0005465988 python3.9[61728]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:27:52 np0005465988 python3.9[61880]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:27:52 np0005465988 systemd[1]: Reloading.
Oct  2 07:27:52 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:27:52 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:27:53 np0005465988 python3.9[62070]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:27:54 np0005465988 python3.9[62148]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:27:54 np0005465988 python3.9[62300]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:27:55 np0005465988 python3.9[62378]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:27:56 np0005465988 python3.9[62530]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:27:56 np0005465988 systemd[1]: Reloading.
Oct  2 07:27:56 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:27:56 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:27:56 np0005465988 systemd[1]: Starting Create netns directory...
Oct  2 07:27:56 np0005465988 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:27:56 np0005465988 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:27:56 np0005465988 systemd[1]: Finished Create netns directory.
Oct  2 07:27:57 np0005465988 python3.9[62722]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:27:57 np0005465988 network[62739]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:27:57 np0005465988 network[62740]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:27:57 np0005465988 network[62741]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:28:02 np0005465988 python3.9[63004]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:28:03 np0005465988 python3.9[63082]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:04 np0005465988 python3.9[63234]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:05 np0005465988 python3.9[63386]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:28:05 np0005465988 python3.9[63509]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404484.467262-616-183874564495548/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:06 np0005465988 python3.9[63661]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  2 07:28:06 np0005465988 systemd[1]: Starting Time & Date Service...
Oct  2 07:28:06 np0005465988 systemd[1]: Started Time & Date Service.
Oct  2 07:28:07 np0005465988 python3.9[63817]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:08 np0005465988 python3.9[63969]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:28:09 np0005465988 python3.9[64092]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759404488.0881376-721-238135860540441/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:10 np0005465988 python3.9[64244]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:28:10 np0005465988 python3.9[64367]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759404489.6584601-766-35826793619632/.source.yaml _original_basename=.z_6tl7m5 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:11 np0005465988 python3.9[64519]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:28:12 np0005465988 python3.9[64642]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404491.147938-811-42443314511471/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:13 np0005465988 python3.9[64794]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:28:14 np0005465988 python3.9[64947]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:28:15 np0005465988 python3[65100]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  2 07:28:15 np0005465988 python3.9[65252]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:28:16 np0005465988 python3.9[65375]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404495.3988886-927-25308062420389/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:17 np0005465988 python3.9[65527]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:28:18 np0005465988 python3.9[65650]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404496.8785913-973-38341688124986/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:19 np0005465988 python3.9[65802]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:28:19 np0005465988 python3.9[65925]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404498.4779196-1017-115925923963942/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:20 np0005465988 python3.9[66077]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:28:21 np0005465988 python3.9[66200]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404499.9496527-1063-161574137543761/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:22 np0005465988 python3.9[66352]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:28:23 np0005465988 python3.9[66475]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404501.805385-1108-12448531250131/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:24 np0005465988 python3.9[66627]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:24 np0005465988 python3.9[66779]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:28:25 np0005465988 chronyd[54453]: Selected source 167.160.187.12 (pool.ntp.org)
Oct  2 07:28:25 np0005465988 python3.9[66938]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:26 np0005465988 python3.9[67091]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:27 np0005465988 python3.9[67243]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:28 np0005465988 python3.9[67395]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  2 07:28:29 np0005465988 python3.9[67548]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  2 07:28:29 np0005465988 systemd[1]: session-16.scope: Deactivated successfully.
Oct  2 07:28:29 np0005465988 systemd[1]: session-16.scope: Consumed 34.803s CPU time.
Oct  2 07:28:29 np0005465988 systemd-logind[827]: Session 16 logged out. Waiting for processes to exit.
Oct  2 07:28:29 np0005465988 systemd-logind[827]: Removed session 16.
Oct  2 07:28:35 np0005465988 systemd-logind[827]: New session 17 of user zuul.
Oct  2 07:28:35 np0005465988 systemd[1]: Started Session 17 of User zuul.
Oct  2 07:28:35 np0005465988 python3.9[67729]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct  2 07:28:36 np0005465988 python3.9[67881]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:28:36 np0005465988 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  2 07:28:37 np0005465988 python3.9[68035]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:28:38 np0005465988 python3.9[68187]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpTq9G8ymc65djWd0YMUA1/KMKQbBxw7LoyOCyAnPotUx6UyYfBtjYX5I4TzqzEugao1w+4AHDZ5XKSwr8sv9kaSGm0ERmNxz22+5cmKwWxcvUfNGQQXbk6gk6z5p0qpH/Ue9e19xDUC+RDUMGcwrysoGQ05aVcGDaEmNUxvYjj0UUfs45KX/pHPk5xQ4c0WjiL0BfzPJmphY2PAj6O9b4iFA3HjIJgvQ3+i3jEOkvA1FsXm5s7O1/wEjqwsdfKPlX0LUuCqXyxI4uhWY16Ofi89lEtsdQRwFyoZcDMJUDHMH8oJSopUNwwMEe7UBD1MHJSIzrd6NUGnvRjhqH6dE/IoT2X3f4JN/Six+J9ayDqiIkd1QNsJzPBr6G2Lj/dQbUusb3nXhPk5TXKMOXm5i+J940nYQv8/Y9rf2H1qltGaDEOS95ktKpcL6EVplOsQand/Qmb/ShKbiAo2dr3YC3v/FFE2AAj+0Dnh4xob14bhivkYHDhIF0zyzcVGhHZXc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICufzWCrq7lQCIqxq8UNP+WfGRQD+uOEPLr+ZneqofrM#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHP494uEOdMq07v1W25s7bKFki4bQuHkde7xWzYJuUT44SD4tSCrPbQiOkLCqtg9H5yxKL0Ovnl22PYLf1HMKAs=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2ry6wCZyHJmZsI4Z83U5DYzCaQhL5JmDbykEZokepEcnLFJt20bbnTU0eQzXJylkCgp7rhmpZo7V7qVNnZUMI3aLUHK30Yr5jzQVofHBRg6ZnAIq1MAwqwGH1s6vfNo+//zth4OMHvolMSEO6zSmOWeAsuHM2DTEJ6IdRasKfhOCc3oI/Tcf5vOUyVGg/BH+fFOHKzPiyJNXozsvw2u4ppfdkMJvVC9w2oTNHMIGcDxSsx0zD2bLdYe5l23tFIOaBM149ktg5KPPsKYyQFymOi5qJHHnf9027MqZ4N7Z9SYuQrqt2nY4C/XmaVFOmUIFNNMZ5qMWDsc38V9cHCgurSaMsQ4em1srXr9nzADLh9bw4WksIRfrtt3twMp7FG9fMsw8rdmFt0+4/IdHr/3wCmHeF07qp10kJPXa5z9dApoIKiQlbIl+UCzlaN5tHD6vb4q0MyhqAtU4mSA1zGz67c2lLSGbF4FTgU9yza15FZjHzQ0ArNu/1KIheA8nrpkE=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHwAEaDivXDvJkCgJw1MVhYQArg6qfdDb4SKBZRPdoOc#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNMghqQyWdigdn5yyuBSIQ3tHLq/tZwQO222aoRtckuDI9Ml6snE/xKJ7YWmTvRTsqj2tqCqXIllFFfreYY7Apw=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDFg5rufAFy1itLjBBGlAJUDsQsaZUavZeI3stNJBLolkBBMB4sBpwAvQFbu2iUhtVavUC7q9xD2LsX0DVBu9DCaQn6tETqUUvMQqzvmaXd34gwo5fH6vo+bjqVdZEih0pIVI1O2OfOUvnv2MFLdKx8MWLQd54beGjWQsC3xCnYVuh0W/aAQtRC2EA77nBo+r40u5V3HXOhdmUbFNvL0r6I8FwP4IvbKC5jkBTtqIzewh+/cyJrURCh0aCpeUjBqNqw3ADhtuR2h5n3ioq+IwPXbhHViJUWQyJ5XKmlSzupEEYA+RV8i1Y3eHJK2RuYlCXkpRP3MEsyBxmISTPhVdQwfxClvyi/mTQkl6k5XFGyZher7KbE6lx4qzp8iCOyOWkw32N3tG0AlnOtPI5HJw8uKbwWl2Apb7RncDQ5fpNOKNFcB1sg61g2Vvew7xJs62OxhkOTiSkEEUYoFfXAqNLiH8gC0+Go12qYleZKbfzL00BDT2boQ2UxYn2rWK7YifU=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIB29M+5Yr1BRNmm2RoLe921umFtraZRFTbdptrBdgsAV#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv7vUjfSNyE5eIqsBh1jfLF/N1YKOXT7KtCRIxAQ1i9+ljB9j4j/dQgL6TGk3m+hQRPyAVxTDwUpeBxHWIpFjU=#012 create=True mode=0644 path=/tmp/ansible.7uvc8ri3 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:39 np0005465988 python3.9[68339]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.7uvc8ri3' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:28:41 np0005465988 python3.9[68493]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.7uvc8ri3 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:28:41 np0005465988 systemd[1]: session-17.scope: Deactivated successfully.
Oct  2 07:28:41 np0005465988 systemd[1]: session-17.scope: Consumed 3.897s CPU time.
Oct  2 07:28:41 np0005465988 systemd-logind[827]: Session 17 logged out. Waiting for processes to exit.
Oct  2 07:28:41 np0005465988 systemd-logind[827]: Removed session 17.
Oct  2 07:28:57 np0005465988 systemd-logind[827]: New session 18 of user zuul.
Oct  2 07:28:57 np0005465988 systemd[1]: Started Session 18 of User zuul.
Oct  2 07:28:59 np0005465988 python3.9[68671]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:29:00 np0005465988 python3.9[68827]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  2 07:29:01 np0005465988 python3.9[68981]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:29:02 np0005465988 python3.9[69134]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:29:03 np0005465988 python3.9[69287]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:29:04 np0005465988 python3.9[69441]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:29:05 np0005465988 python3.9[69596]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:29:05 np0005465988 systemd[1]: session-18.scope: Deactivated successfully.
Oct  2 07:29:05 np0005465988 systemd[1]: session-18.scope: Consumed 5.247s CPU time.
Oct  2 07:29:05 np0005465988 systemd-logind[827]: Session 18 logged out. Waiting for processes to exit.
Oct  2 07:29:05 np0005465988 systemd-logind[827]: Removed session 18.
Oct  2 07:29:13 np0005465988 systemd-logind[827]: New session 19 of user zuul.
Oct  2 07:29:13 np0005465988 systemd[1]: Started Session 19 of User zuul.
Oct  2 07:29:15 np0005465988 python3.9[69774]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:29:16 np0005465988 python3.9[69930]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:29:17 np0005465988 python3.9[70014]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:29:19 np0005465988 python3.9[70165]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:29:21 np0005465988 python3.9[70316]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:29:22 np0005465988 python3.9[70466]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:29:22 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:29:22 np0005465988 python3.9[70617]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:29:23 np0005465988 systemd[1]: session-19.scope: Deactivated successfully.
Oct  2 07:29:23 np0005465988 systemd[1]: session-19.scope: Consumed 6.478s CPU time.
Oct  2 07:29:23 np0005465988 systemd-logind[827]: Session 19 logged out. Waiting for processes to exit.
Oct  2 07:29:23 np0005465988 systemd-logind[827]: Removed session 19.
Oct  2 07:29:32 np0005465988 systemd-logind[827]: New session 20 of user zuul.
Oct  2 07:29:32 np0005465988 systemd[1]: Started Session 20 of User zuul.
Oct  2 07:29:39 np0005465988 python3[71383]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:29:41 np0005465988 python3[71478]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct  2 07:29:43 np0005465988 python3[71505]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:29:43 np0005465988 python3[71531]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:29:43 np0005465988 kernel: loop: module loaded
Oct  2 07:29:43 np0005465988 kernel: loop3: detected capacity change from 0 to 14680064
Oct  2 07:29:43 np0005465988 python3[71566]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:29:43 np0005465988 lvm[71569]: PV /dev/loop3 not used.
Oct  2 07:29:44 np0005465988 lvm[71571]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct  2 07:29:44 np0005465988 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Oct  2 07:29:44 np0005465988 lvm[71581]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct  2 07:29:44 np0005465988 lvm[71581]: VG ceph_vg0 finished
Oct  2 07:29:44 np0005465988 lvm[71579]:  1 logical volume(s) in volume group "ceph_vg0" now active
Oct  2 07:29:44 np0005465988 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Oct  2 07:29:45 np0005465988 python3[71659]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:29:45 np0005465988 python3[71732]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759404584.8504577-33532-162076686242347/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:29:46 np0005465988 python3[71782]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:29:46 np0005465988 systemd[1]: Reloading.
Oct  2 07:29:46 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:29:46 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:29:46 np0005465988 systemd[1]: Starting Ceph OSD losetup...
Oct  2 07:29:46 np0005465988 bash[71822]: /dev/loop3: [64513]:4349018 (/var/lib/ceph-osd-0.img)
Oct  2 07:29:46 np0005465988 systemd[1]: Finished Ceph OSD losetup.
Oct  2 07:29:46 np0005465988 lvm[71824]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct  2 07:29:46 np0005465988 lvm[71824]: VG ceph_vg0 finished
Oct  2 07:29:48 np0005465988 python3[71848]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:29:56 np0005465988 systemd[1]: packagekit.service: Deactivated successfully.
Oct  2 07:32:16 np0005465988 systemd-logind[827]: New session 21 of user ceph-admin.
Oct  2 07:32:16 np0005465988 systemd[1]: Created slice User Slice of UID 42477.
Oct  2 07:32:16 np0005465988 systemd[1]: Starting User Runtime Directory /run/user/42477...
Oct  2 07:32:16 np0005465988 systemd[1]: Finished User Runtime Directory /run/user/42477.
Oct  2 07:32:16 np0005465988 systemd[1]: Starting User Manager for UID 42477...
Oct  2 07:32:16 np0005465988 systemd-logind[827]: New session 23 of user ceph-admin.
Oct  2 07:32:16 np0005465988 systemd[71899]: Queued start job for default target Main User Target.
Oct  2 07:32:16 np0005465988 systemd[71899]: Created slice User Application Slice.
Oct  2 07:32:16 np0005465988 systemd[71899]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 07:32:16 np0005465988 systemd[71899]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 07:32:16 np0005465988 systemd[71899]: Reached target Paths.
Oct  2 07:32:16 np0005465988 systemd[71899]: Reached target Timers.
Oct  2 07:32:16 np0005465988 systemd[71899]: Starting D-Bus User Message Bus Socket...
Oct  2 07:32:16 np0005465988 systemd[71899]: Starting Create User's Volatile Files and Directories...
Oct  2 07:32:17 np0005465988 systemd[71899]: Listening on D-Bus User Message Bus Socket.
Oct  2 07:32:17 np0005465988 systemd[71899]: Reached target Sockets.
Oct  2 07:32:17 np0005465988 systemd[71899]: Finished Create User's Volatile Files and Directories.
Oct  2 07:32:17 np0005465988 systemd[71899]: Reached target Basic System.
Oct  2 07:32:17 np0005465988 systemd[71899]: Reached target Main User Target.
Oct  2 07:32:17 np0005465988 systemd[71899]: Startup finished in 134ms.
Oct  2 07:32:17 np0005465988 systemd[1]: Started User Manager for UID 42477.
Oct  2 07:32:17 np0005465988 systemd[1]: Started Session 21 of User ceph-admin.
Oct  2 07:32:17 np0005465988 systemd[1]: Started Session 23 of User ceph-admin.
Oct  2 07:32:17 np0005465988 systemd-logind[827]: New session 24 of user ceph-admin.
Oct  2 07:32:17 np0005465988 systemd[1]: Started Session 24 of User ceph-admin.
Oct  2 07:32:17 np0005465988 systemd-logind[827]: New session 25 of user ceph-admin.
Oct  2 07:32:17 np0005465988 systemd[1]: Started Session 25 of User ceph-admin.
Oct  2 07:32:18 np0005465988 systemd-logind[827]: New session 26 of user ceph-admin.
Oct  2 07:32:18 np0005465988 systemd[1]: Started Session 26 of User ceph-admin.
Oct  2 07:32:18 np0005465988 systemd-logind[827]: New session 27 of user ceph-admin.
Oct  2 07:32:18 np0005465988 systemd[1]: Started Session 27 of User ceph-admin.
Oct  2 07:32:19 np0005465988 systemd-logind[827]: New session 28 of user ceph-admin.
Oct  2 07:32:19 np0005465988 systemd[1]: Started Session 28 of User ceph-admin.
Oct  2 07:32:19 np0005465988 systemd-logind[827]: New session 29 of user ceph-admin.
Oct  2 07:32:19 np0005465988 systemd[1]: Started Session 29 of User ceph-admin.
Oct  2 07:32:20 np0005465988 systemd-logind[827]: New session 30 of user ceph-admin.
Oct  2 07:32:20 np0005465988 systemd[1]: Started Session 30 of User ceph-admin.
Oct  2 07:32:20 np0005465988 systemd-logind[827]: New session 31 of user ceph-admin.
Oct  2 07:32:20 np0005465988 systemd[1]: Started Session 31 of User ceph-admin.
Oct  2 07:32:21 np0005465988 systemd-logind[827]: New session 32 of user ceph-admin.
Oct  2 07:32:21 np0005465988 systemd[1]: Started Session 32 of User ceph-admin.
Oct  2 07:32:21 np0005465988 systemd-logind[827]: New session 33 of user ceph-admin.
Oct  2 07:32:21 np0005465988 systemd[1]: Started Session 33 of User ceph-admin.
Oct  2 07:32:21 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:32:59 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:00 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:00 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:01 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:01 np0005465988 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 72922 (sysctl)
Oct  2 07:33:01 np0005465988 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct  2 07:33:01 np0005465988 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct  2 07:33:01 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:02 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:02 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:04 np0005465988 systemd[1]: var-lib-containers-storage-overlay-compat286450351-lower\x2dmapped.mount: Deactivated successfully.
Oct  2 07:33:19 np0005465988 podman[73197]: 2025-10-02 11:33:19.597314998 +0000 UTC m=+17.247916269 container create 926a20dee5cc42383398409164bb325132dbc9e6cfd71950c6464be4a9b1b282 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_gauss, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct  2 07:33:19 np0005465988 podman[73197]: 2025-10-02 11:33:19.582608603 +0000 UTC m=+17.233209884 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:33:19 np0005465988 systemd[1]: Created slice Virtual Machine and Container Slice.
Oct  2 07:33:19 np0005465988 systemd[1]: Started libpod-conmon-926a20dee5cc42383398409164bb325132dbc9e6cfd71950c6464be4a9b1b282.scope.
Oct  2 07:33:19 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:33:19 np0005465988 podman[73197]: 2025-10-02 11:33:19.691796418 +0000 UTC m=+17.342397709 container init 926a20dee5cc42383398409164bb325132dbc9e6cfd71950c6464be4a9b1b282 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_gauss, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 07:33:19 np0005465988 podman[73197]: 2025-10-02 11:33:19.698233544 +0000 UTC m=+17.348834815 container start 926a20dee5cc42383398409164bb325132dbc9e6cfd71950c6464be4a9b1b282 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_gauss, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Oct  2 07:33:19 np0005465988 podman[73197]: 2025-10-02 11:33:19.7019096 +0000 UTC m=+17.352510891 container attach 926a20dee5cc42383398409164bb325132dbc9e6cfd71950c6464be4a9b1b282 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_gauss, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct  2 07:33:19 np0005465988 systemd[1]: libpod-926a20dee5cc42383398409164bb325132dbc9e6cfd71950c6464be4a9b1b282.scope: Deactivated successfully.
Oct  2 07:33:19 np0005465988 gifted_gauss[73258]: 167 167
Oct  2 07:33:19 np0005465988 conmon[73258]: conmon 926a20dee5cc42383398 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-926a20dee5cc42383398409164bb325132dbc9e6cfd71950c6464be4a9b1b282.scope/container/memory.events
Oct  2 07:33:19 np0005465988 podman[73197]: 2025-10-02 11:33:19.704051272 +0000 UTC m=+17.354652543 container died 926a20dee5cc42383398409164bb325132dbc9e6cfd71950c6464be4a9b1b282 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_gauss, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Oct  2 07:33:19 np0005465988 systemd[1]: var-lib-containers-storage-overlay-f1e2110182788b50be2d2a7cafb06e6437d937fef3df9251ea148369c937934f-merged.mount: Deactivated successfully.
Oct  2 07:33:19 np0005465988 podman[73197]: 2025-10-02 11:33:19.736277413 +0000 UTC m=+17.386878684 container remove 926a20dee5cc42383398409164bb325132dbc9e6cfd71950c6464be4a9b1b282 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_gauss, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Oct  2 07:33:19 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:19 np0005465988 systemd[1]: libpod-conmon-926a20dee5cc42383398409164bb325132dbc9e6cfd71950c6464be4a9b1b282.scope: Deactivated successfully.
Oct  2 07:33:19 np0005465988 podman[73282]: 2025-10-02 11:33:19.902909307 +0000 UTC m=+0.056232235 container create 1ee10274bea023e1d6b7b8076737ffa2661d27352f3a9e140ec2143f937ce6a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_gates, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Oct  2 07:33:19 np0005465988 systemd[1]: Started libpod-conmon-1ee10274bea023e1d6b7b8076737ffa2661d27352f3a9e140ec2143f937ce6a9.scope.
Oct  2 07:33:19 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:33:19 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c00ca2c171536afa6b753b6d7bfd6554b1f5de912682de2aede95eabb5717a2/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:19 np0005465988 podman[73282]: 2025-10-02 11:33:19.885449593 +0000 UTC m=+0.038772541 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:33:19 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c00ca2c171536afa6b753b6d7bfd6554b1f5de912682de2aede95eabb5717a2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:19 np0005465988 podman[73282]: 2025-10-02 11:33:19.989050256 +0000 UTC m=+0.142373284 container init 1ee10274bea023e1d6b7b8076737ffa2661d27352f3a9e140ec2143f937ce6a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_gates, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct  2 07:33:19 np0005465988 podman[73282]: 2025-10-02 11:33:19.995578145 +0000 UTC m=+0.148901113 container start 1ee10274bea023e1d6b7b8076737ffa2661d27352f3a9e140ec2143f937ce6a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_gates, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Oct  2 07:33:20 np0005465988 podman[73282]: 2025-10-02 11:33:20.000013713 +0000 UTC m=+0.153336701 container attach 1ee10274bea023e1d6b7b8076737ffa2661d27352f3a9e140ec2143f937ce6a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_gates, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Oct  2 07:33:21 np0005465988 goofy_gates[73297]: [
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:    {
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:        "available": false,
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:        "ceph_device": false,
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:        "device_id": "QEMU_DVD-ROM_QM00001",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:        "lsm_data": {},
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:        "lvs": [],
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:        "path": "/dev/sr0",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:        "rejected_reasons": [
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "Has a FileSystem",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "Insufficient space (<5GB)"
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:        ],
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:        "sys_api": {
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "actuators": null,
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "device_nodes": "sr0",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "devname": "sr0",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "human_readable_size": "482.00 KB",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "id_bus": "ata",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "model": "QEMU DVD-ROM",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "nr_requests": "2",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "parent": "/dev/sr0",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "partitions": {},
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "path": "/dev/sr0",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "removable": "1",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "rev": "2.5+",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "ro": "0",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "rotational": "0",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "sas_address": "",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "sas_device_handle": "",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "scheduler_mode": "mq-deadline",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "sectors": 0,
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "sectorsize": "2048",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "size": 493568.0,
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "support_discard": "2048",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "type": "disk",
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:            "vendor": "QEMU"
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:        }
Oct  2 07:33:21 np0005465988 goofy_gates[73297]:    }
Oct  2 07:33:21 np0005465988 goofy_gates[73297]: ]
Oct  2 07:33:21 np0005465988 systemd[1]: libpod-1ee10274bea023e1d6b7b8076737ffa2661d27352f3a9e140ec2143f937ce6a9.scope: Deactivated successfully.
Oct  2 07:33:21 np0005465988 systemd[1]: libpod-1ee10274bea023e1d6b7b8076737ffa2661d27352f3a9e140ec2143f937ce6a9.scope: Consumed 1.118s CPU time.
Oct  2 07:33:21 np0005465988 podman[73282]: 2025-10-02 11:33:21.113714908 +0000 UTC m=+1.267037876 container died 1ee10274bea023e1d6b7b8076737ffa2661d27352f3a9e140ec2143f937ce6a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_gates, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:33:21 np0005465988 systemd[1]: var-lib-containers-storage-overlay-6c00ca2c171536afa6b753b6d7bfd6554b1f5de912682de2aede95eabb5717a2-merged.mount: Deactivated successfully.
Oct  2 07:33:21 np0005465988 podman[73282]: 2025-10-02 11:33:21.182623859 +0000 UTC m=+1.335946797 container remove 1ee10274bea023e1d6b7b8076737ffa2661d27352f3a9e140ec2143f937ce6a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_gates, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True)
Oct  2 07:33:21 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:21 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:21 np0005465988 systemd[1]: libpod-conmon-1ee10274bea023e1d6b7b8076737ffa2661d27352f3a9e140ec2143f937ce6a9.scope: Deactivated successfully.
Oct  2 07:33:26 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:26 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:26 np0005465988 podman[76008]: 2025-10-02 11:33:26.917426584 +0000 UTC m=+0.040203513 container create 91c25dd725bfb21664246cf9c31d3c831e170e4295c482b57ed7ae1002fd1291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_einstein, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef)
Oct  2 07:33:26 np0005465988 systemd[1]: Started libpod-conmon-91c25dd725bfb21664246cf9c31d3c831e170e4295c482b57ed7ae1002fd1291.scope.
Oct  2 07:33:26 np0005465988 podman[76008]: 2025-10-02 11:33:26.901669859 +0000 UTC m=+0.024446788 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:33:27 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:33:27 np0005465988 podman[76008]: 2025-10-02 11:33:27.02839781 +0000 UTC m=+0.151174809 container init 91c25dd725bfb21664246cf9c31d3c831e170e4295c482b57ed7ae1002fd1291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_einstein, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Oct  2 07:33:27 np0005465988 podman[76008]: 2025-10-02 11:33:27.037861523 +0000 UTC m=+0.160638472 container start 91c25dd725bfb21664246cf9c31d3c831e170e4295c482b57ed7ae1002fd1291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_einstein, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  2 07:33:27 np0005465988 podman[76008]: 2025-10-02 11:33:27.043706022 +0000 UTC m=+0.166482971 container attach 91c25dd725bfb21664246cf9c31d3c831e170e4295c482b57ed7ae1002fd1291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_einstein, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Oct  2 07:33:27 np0005465988 sleepy_einstein[76024]: 167 167
Oct  2 07:33:27 np0005465988 systemd[1]: libpod-91c25dd725bfb21664246cf9c31d3c831e170e4295c482b57ed7ae1002fd1291.scope: Deactivated successfully.
Oct  2 07:33:27 np0005465988 podman[76008]: 2025-10-02 11:33:27.046176124 +0000 UTC m=+0.168953073 container died 91c25dd725bfb21664246cf9c31d3c831e170e4295c482b57ed7ae1002fd1291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_einstein, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Oct  2 07:33:27 np0005465988 podman[76008]: 2025-10-02 11:33:27.113751776 +0000 UTC m=+0.236528725 container remove 91c25dd725bfb21664246cf9c31d3c831e170e4295c482b57ed7ae1002fd1291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_einstein, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:33:27 np0005465988 systemd[1]: libpod-conmon-91c25dd725bfb21664246cf9c31d3c831e170e4295c482b57ed7ae1002fd1291.scope: Deactivated successfully.
Oct  2 07:33:27 np0005465988 podman[76042]: 2025-10-02 11:33:27.202397837 +0000 UTC m=+0.045567778 container create 8f8ce58add0c7f3b7319f08768db2915975b7f1ef0ddf6ed39fea7a89f98f321 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_engelbart, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct  2 07:33:27 np0005465988 systemd[1]: Started libpod-conmon-8f8ce58add0c7f3b7319f08768db2915975b7f1ef0ddf6ed39fea7a89f98f321.scope.
Oct  2 07:33:27 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:33:27 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ee3c622614c47e20b6aacc83675fe88db8b505b2a2e28744ca86c323953bf56/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:27 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ee3c622614c47e20b6aacc83675fe88db8b505b2a2e28744ca86c323953bf56/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:27 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ee3c622614c47e20b6aacc83675fe88db8b505b2a2e28744ca86c323953bf56/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:27 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ee3c622614c47e20b6aacc83675fe88db8b505b2a2e28744ca86c323953bf56/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:27 np0005465988 podman[76042]: 2025-10-02 11:33:27.274121649 +0000 UTC m=+0.117291680 container init 8f8ce58add0c7f3b7319f08768db2915975b7f1ef0ddf6ed39fea7a89f98f321 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_engelbart, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0)
Oct  2 07:33:27 np0005465988 podman[76042]: 2025-10-02 11:33:27.181294087 +0000 UTC m=+0.024464078 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:33:27 np0005465988 podman[76042]: 2025-10-02 11:33:27.284699545 +0000 UTC m=+0.127869526 container start 8f8ce58add0c7f3b7319f08768db2915975b7f1ef0ddf6ed39fea7a89f98f321 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_engelbart, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Oct  2 07:33:27 np0005465988 podman[76042]: 2025-10-02 11:33:27.290683278 +0000 UTC m=+0.133853259 container attach 8f8ce58add0c7f3b7319f08768db2915975b7f1ef0ddf6ed39fea7a89f98f321 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_engelbart, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Oct  2 07:33:27 np0005465988 systemd[1]: libpod-8f8ce58add0c7f3b7319f08768db2915975b7f1ef0ddf6ed39fea7a89f98f321.scope: Deactivated successfully.
Oct  2 07:33:27 np0005465988 conmon[76058]: conmon 8f8ce58add0c7f3b7319 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-8f8ce58add0c7f3b7319f08768db2915975b7f1ef0ddf6ed39fea7a89f98f321.scope/container/memory.events
Oct  2 07:33:27 np0005465988 podman[76042]: 2025-10-02 11:33:27.390769899 +0000 UTC m=+0.233939880 container died 8f8ce58add0c7f3b7319f08768db2915975b7f1ef0ddf6ed39fea7a89f98f321 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_engelbart, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Oct  2 07:33:27 np0005465988 podman[76042]: 2025-10-02 11:33:27.45794039 +0000 UTC m=+0.301110371 container remove 8f8ce58add0c7f3b7319f08768db2915975b7f1ef0ddf6ed39fea7a89f98f321 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_engelbart, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Oct  2 07:33:27 np0005465988 systemd[1]: libpod-conmon-8f8ce58add0c7f3b7319f08768db2915975b7f1ef0ddf6ed39fea7a89f98f321.scope: Deactivated successfully.
Oct  2 07:33:27 np0005465988 systemd[1]: Reloading.
Oct  2 07:33:27 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:33:27 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:33:27 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:27 np0005465988 systemd[1]: Reloading.
Oct  2 07:33:27 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:33:27 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:33:27 np0005465988 systemd[1]: Reached target All Ceph clusters and services.
Oct  2 07:33:27 np0005465988 systemd[1]: Reloading.
Oct  2 07:33:28 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:33:28 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:33:28 np0005465988 systemd[1]: Reached target Ceph cluster fd4c5763-22d1-50ea-ad0b-96a3dc3040b2.
Oct  2 07:33:28 np0005465988 systemd[1]: Reloading.
Oct  2 07:33:28 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:33:28 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:33:28 np0005465988 systemd[1]: Reloading.
Oct  2 07:33:28 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:33:28 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:33:28 np0005465988 systemd[1]: Created slice Slice /system/ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2.
Oct  2 07:33:28 np0005465988 systemd[1]: Reached target System Time Set.
Oct  2 07:33:28 np0005465988 systemd[1]: Reached target System Time Synchronized.
Oct  2 07:33:28 np0005465988 systemd[1]: Starting Ceph mon.compute-2 for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2...
Oct  2 07:33:28 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:28 np0005465988 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:33:29 np0005465988 podman[76335]: 2025-10-02 11:33:29.077824899 +0000 UTC m=+0.043785146 container create 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Oct  2 07:33:29 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86a89f1867fef0a3fa258d7f8892744bb8ab5c59db97cdcb8eee7bbba361ebc1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:29 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86a89f1867fef0a3fa258d7f8892744bb8ab5c59db97cdcb8eee7bbba361ebc1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:29 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86a89f1867fef0a3fa258d7f8892744bb8ab5c59db97cdcb8eee7bbba361ebc1/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:29 np0005465988 podman[76335]: 2025-10-02 11:33:29.145504185 +0000 UTC m=+0.111464422 container init 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Oct  2 07:33:29 np0005465988 podman[76335]: 2025-10-02 11:33:29.150512519 +0000 UTC m=+0.116472726 container start 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 07:33:29 np0005465988 podman[76335]: 2025-10-02 11:33:29.057341678 +0000 UTC m=+0.023301925 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:33:29 np0005465988 bash[76335]: 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57
Oct  2 07:33:29 np0005465988 systemd[1]: Started Ceph mon.compute-2 for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2.
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: set uid:gid to 167:167 (ceph:ceph)
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: pidfile_write: ignore empty --pid-file
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: load: jerasure load: lrc 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: RocksDB version: 7.9.2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Git sha 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Compile date 2025-05-06 23:30:25
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: DB SUMMARY
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: DB Session ID:  MN3GOSCTDLK2IE3HZQ1Z
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: CURRENT file:  CURRENT
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: IDENTITY file:  IDENTITY
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                         Options.error_if_exists: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                       Options.create_if_missing: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                         Options.paranoid_checks: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.flush_verify_memtable_count: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                                     Options.env: 0x55b3e8b6ac40
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                                      Options.fs: PosixFileSystem
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                                Options.info_log: 0x55b3e97b4fc0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                Options.max_file_opening_threads: 16
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                              Options.statistics: (nil)
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                               Options.use_fsync: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                       Options.max_log_file_size: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                   Options.log_file_time_to_roll: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                       Options.keep_log_file_num: 1000
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                    Options.recycle_log_file_num: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                         Options.allow_fallocate: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                        Options.allow_mmap_reads: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                       Options.allow_mmap_writes: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                        Options.use_direct_reads: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:          Options.create_missing_column_families: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                              Options.db_log_dir: 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                                 Options.wal_dir: 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                Options.table_cache_numshardbits: 6
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                         Options.WAL_ttl_seconds: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                       Options.WAL_size_limit_MB: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.manifest_preallocation_size: 4194304
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                     Options.is_fd_close_on_exec: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                   Options.advise_random_on_open: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                    Options.db_write_buffer_size: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                    Options.write_buffer_manager: 0x55b3e97c4b40
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:         Options.access_hint_on_compaction_start: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                      Options.use_adaptive_mutex: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                            Options.rate_limiter: (nil)
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                       Options.wal_recovery_mode: 2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                  Options.enable_thread_tracking: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                  Options.enable_pipelined_write: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                  Options.unordered_write: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.write_thread_max_yield_usec: 100
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                               Options.row_cache: None
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                              Options.wal_filter: None
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.avoid_flush_during_recovery: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.allow_ingest_behind: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.two_write_queues: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.manual_wal_flush: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.wal_compression: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.atomic_flush: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                 Options.persist_stats_to_disk: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                 Options.write_dbid_to_manifest: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                 Options.log_readahead_size: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                 Options.best_efforts_recovery: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.allow_data_in_errors: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.db_host_id: __hostname__
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.enforce_single_del_contracts: true
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.max_background_jobs: 2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.max_background_compactions: -1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.max_subcompactions: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.delayed_write_rate : 16777216
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.max_total_wal_size: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                   Options.stats_dump_period_sec: 600
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                 Options.stats_persist_period_sec: 600
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                          Options.max_open_files: -1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                          Options.bytes_per_sync: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                      Options.wal_bytes_per_sync: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                   Options.strict_bytes_per_sync: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:       Options.compaction_readahead_size: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                  Options.max_background_flushes: -1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Compression algorithms supported:
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: 	kZSTD supported: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: 	kXpressCompression supported: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: 	kBZip2Compression supported: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: 	kLZ4Compression supported: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: 	kZlibCompression supported: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: 	kLZ4HCCompression supported: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: 	kSnappyCompression supported: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Fast CRC32 supported: Supported on x86
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: DMutex implementation: pthread_mutex_t
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:           Options.merge_operator: 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:        Options.compaction_filter: None
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b3e97b4c00)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55b3e97ad1f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:        Options.write_buffer_size: 33554432
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:  Options.max_write_buffer_number: 2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:          Options.compression: NoCompression
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.num_levels: 7
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 0f53a282-e834-4e21-912d-f4016b84b664
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759404809190127, "job": 1, "event": "recovery_started", "wal_files": [4]}
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759404809191834, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759404809191922, "job": 1, "event": "recovery_finished"}
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55b3e97d6e00
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: DB pointer 0x55b3e9860000
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b3e97ad1f0#2 capacity: 512.00 MB usage: 0.86 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.64 KB,0.00012219%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid fd4c5763-22d1-50ea-ad0b-96a3dc3040b2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(???) e0 preinit fsid fd4c5763-22d1-50ea-ad0b-96a3dc3040b2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).mds e1 new map
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).mds e1 print_map#012e1#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: -1#012 #012No filesystems configured
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 0 up, 2 in
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e14 crush map has features 3314933000852226048, adjusting msgr requires
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e14 crush map has features 288514051259236352, adjusting msgr requires
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e14 crush map has features 288514051259236352, adjusting msgr requires
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).osd e14 crush map has features 288514051259236352, adjusting msgr requires
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Added host compute-2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Saving service mon spec with placement compute-0;compute-1;compute-2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Saving service mgr spec with placement compute-0;compute-1;compute-2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Marking host: compute-0 for OSDSpec preview refresh.
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Marking host: compute-1 for OSDSpec preview refresh.
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Saving service osd.default_drive_group spec with placement compute-0;compute-1;compute-2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Updating compute-1:/etc/ceph/ceph.conf
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Updating compute-1:/var/lib/ceph/fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/config/ceph.conf
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Updating compute-1:/var/lib/ceph/fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/config/ceph.client.admin.keyring
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Failed to apply mon spec MONSpec.from_json(yaml.safe_load('''service_type: mon#012service_name: mon#012placement:#012  hosts:#012  - compute-0#012  - compute-1#012  - compute-2#012''')): Cannot place <MONSpec for service_name=mon> on compute-2: Unknown hosts
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Failed to apply mgr spec ServiceSpec.from_json(yaml.safe_load('''service_type: mgr#012service_name: mgr#012placement:#012  hosts:#012  - compute-0#012  - compute-1#012  - compute-2#012''')): Cannot place <ServiceSpec for service_name=mgr> on compute-2: Unknown hosts
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Deploying daemon crash.compute-1 on compute-1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Health check failed: Failed to apply 2 service(s): mon,mgr (CEPHADM_APPLY_SPEC_FAIL)
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.101:0/1556391589' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "a3ccd8b9-4533-4913-ba5b-13fae1978607"}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.101:0/1556391589' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "a3ccd8b9-4533-4913-ba5b-13fae1978607"}]': finished
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/3545942114' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "ca035294-3f2d-465d-b3e6-43971a2c0201"}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/3545942114' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "ca035294-3f2d-465d-b3e6-43971a2c0201"}]': finished
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Deploying daemon osd.0 on compute-1
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Deploying daemon osd.1 on compute-0
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='osd.0 [v2:192.168.122.101:6800/721300389,v1:192.168.122.101:6801/721300389]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='osd.0 [v2:192.168.122.101:6800/721300389,v1:192.168.122.101:6801/721300389]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='osd.0 [v2:192.168.122.101:6800/721300389,v1:192.168.122.101:6801/721300389]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0068, "args": ["host=compute-1", "root=default"]}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='osd.0 [v2:192.168.122.101:6800/721300389,v1:192.168.122.101:6801/721300389]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0068, "args": ["host=compute-1", "root=default"]}]': finished
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='osd.1 [v2:192.168.122.100:6802/1702934741,v1:192.168.122.100:6803/1702934741]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='osd.1 [v2:192.168.122.100:6802/1702934741,v1:192.168.122.100:6803/1702934741]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='osd.1 [v2:192.168.122.100:6802/1702934741,v1:192.168.122.100:6803/1702934741]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0068, "args": ["host=compute-0", "root=default"]}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='osd.1 [v2:192.168.122.100:6802/1702934741,v1:192.168.122.100:6803/1702934741]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0068, "args": ["host=compute-0", "root=default"]}]': finished
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: OSD bench result of 4356.019216 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: osd.0 [v2:192.168.122.101:6800/721300389,v1:192.168.122.101:6801/721300389] boot
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Adjusting osd_memory_target on compute-1 to  5247M
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: OSD bench result of 10020.515737 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Adjusting osd_memory_target on compute-0 to 127.8M
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Unable to set osd_memory_target on compute-0 to 134062899: error parsing value: Value '134062899' is below minimum 939524096
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: osd.1 [v2:192.168.122.100:6802/1702934741,v1:192.168.122.100:6803/1702934741] boot
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Updating compute-2:/etc/ceph/ceph.conf
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Updating compute-2:/var/lib/ceph/fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/config/ceph.conf
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Updating compute-2:/var/lib/ceph/fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/config/ceph.client.admin.keyring
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Deploying daemon mon.compute-2 on compute-2
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: Cluster is now healthy
Oct  2 07:33:29 np0005465988 ceph-mon[76355]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Oct  2 07:33:31 np0005465988 ceph-mon[76355]: mon.compute-2@-1(probing) e1  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Oct  2 07:33:31 np0005465988 ceph-mon[76355]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Oct  2 07:33:31 np0005465988 ceph-mon[76355]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Oct  2 07:33:31 np0005465988 ceph-mon[76355]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Oct  2 07:33:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct  2 07:33:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Oct  2 07:33:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Oct  2 07:33:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct  2 07:33:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Oct  2 07:33:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Oct  2 07:33:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct  2 07:33:34 np0005465988 ceph-mon[76355]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,ceph_version_when_created=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,created_at=2025-10-02T11:33:27.327644Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Sep 26 01:13:23 UTC 2025,kernel_version=5.14.0-620.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864104,os=Linux}
Oct  2 07:33:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e15 e15: 2 total, 2 up, 2 in
Oct  2 07:33:34 np0005465988 ceph-mon[76355]: Deploying daemon mon.compute-1 on compute-1
Oct  2 07:33:34 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/4219378585' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct  2 07:33:34 np0005465988 ceph-mon[76355]: mon.compute-0 calling monitor election
Oct  2 07:33:34 np0005465988 ceph-mon[76355]: mon.compute-2 calling monitor election
Oct  2 07:33:34 np0005465988 ceph-mon[76355]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Oct  2 07:33:34 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 07:33:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Oct  2 07:33:35 np0005465988 ceph-mon[76355]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Oct  2 07:33:35 np0005465988 ceph-mon[76355]: paxos.1).electionLogic(10) init, last seen epoch 10
Oct  2 07:33:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct  2 07:33:36 np0005465988 podman[76533]: 2025-10-02 11:33:36.194990852 +0000 UTC m=+0.078468849 container create aeeca1e045ab67d45d25c75d779f14be64e76a6b7a0aae3ccb3e953038863c6f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_murdock, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Oct  2 07:33:36 np0005465988 podman[76533]: 2025-10-02 11:33:36.140507177 +0000 UTC m=+0.023985204 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:33:36 np0005465988 systemd[1]: Started libpod-conmon-aeeca1e045ab67d45d25c75d779f14be64e76a6b7a0aae3ccb3e953038863c6f.scope.
Oct  2 07:33:36 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:33:36 np0005465988 podman[76533]: 2025-10-02 11:33:36.337349104 +0000 UTC m=+0.220827121 container init aeeca1e045ab67d45d25c75d779f14be64e76a6b7a0aae3ccb3e953038863c6f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_murdock, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Oct  2 07:33:36 np0005465988 podman[76533]: 2025-10-02 11:33:36.346031865 +0000 UTC m=+0.229509862 container start aeeca1e045ab67d45d25c75d779f14be64e76a6b7a0aae3ccb3e953038863c6f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_murdock, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Oct  2 07:33:36 np0005465988 busy_murdock[76550]: 167 167
Oct  2 07:33:36 np0005465988 systemd[1]: libpod-aeeca1e045ab67d45d25c75d779f14be64e76a6b7a0aae3ccb3e953038863c6f.scope: Deactivated successfully.
Oct  2 07:33:36 np0005465988 podman[76533]: 2025-10-02 11:33:36.411249989 +0000 UTC m=+0.294728016 container attach aeeca1e045ab67d45d25c75d779f14be64e76a6b7a0aae3ccb3e953038863c6f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_murdock, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef)
Oct  2 07:33:36 np0005465988 podman[76533]: 2025-10-02 11:33:36.413140523 +0000 UTC m=+0.296618520 container died aeeca1e045ab67d45d25c75d779f14be64e76a6b7a0aae3ccb3e953038863c6f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_murdock, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Oct  2 07:33:36 np0005465988 systemd[1]: var-lib-containers-storage-overlay-de1383cf6784b1c10764ee0e035773a44ff94b84eb0b7be8ce776e1f679b14c0-merged.mount: Deactivated successfully.
Oct  2 07:33:36 np0005465988 podman[76533]: 2025-10-02 11:33:36.733022995 +0000 UTC m=+0.616500992 container remove aeeca1e045ab67d45d25c75d779f14be64e76a6b7a0aae3ccb3e953038863c6f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_murdock, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Oct  2 07:33:36 np0005465988 systemd[1]: libpod-conmon-aeeca1e045ab67d45d25c75d779f14be64e76a6b7a0aae3ccb3e953038863c6f.scope: Deactivated successfully.
Oct  2 07:33:36 np0005465988 systemd[1]: Reloading.
Oct  2 07:33:37 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:33:37 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:33:37 np0005465988 systemd[1]: Reloading.
Oct  2 07:33:37 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:33:37 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:33:37 np0005465988 systemd[1]: Starting Ceph mgr.compute-2.rbjjpf for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2...
Oct  2 07:33:37 np0005465988 podman[76694]: 2025-10-02 11:33:37.738505135 +0000 UTC m=+0.107765445 container create 1848f12633a19fb48341ed14442652ef9ec60e7295cfc00a3840846a7d69ce5c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Oct  2 07:33:37 np0005465988 podman[76694]: 2025-10-02 11:33:37.654747585 +0000 UTC m=+0.024007885 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:33:37 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1782af3e8d960072299e22a7f125dc2e1a3e46d91436de3cb6ad34238f64f1d6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:37 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1782af3e8d960072299e22a7f125dc2e1a3e46d91436de3cb6ad34238f64f1d6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:37 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1782af3e8d960072299e22a7f125dc2e1a3e46d91436de3cb6ad34238f64f1d6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:37 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1782af3e8d960072299e22a7f125dc2e1a3e46d91436de3cb6ad34238f64f1d6/merged/var/lib/ceph/mgr/ceph-compute-2.rbjjpf supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:38 np0005465988 podman[76694]: 2025-10-02 11:33:38.003657225 +0000 UTC m=+0.372917605 container init 1848f12633a19fb48341ed14442652ef9ec60e7295cfc00a3840846a7d69ce5c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef)
Oct  2 07:33:38 np0005465988 podman[76694]: 2025-10-02 11:33:38.009266887 +0000 UTC m=+0.378527207 container start 1848f12633a19fb48341ed14442652ef9ec60e7295cfc00a3840846a7d69ce5c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct  2 07:33:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct  2 07:33:38 np0005465988 bash[76694]: 1848f12633a19fb48341ed14442652ef9ec60e7295cfc00a3840846a7d69ce5c
Oct  2 07:33:38 np0005465988 systemd[1]: Started Ceph mgr.compute-2.rbjjpf for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2.
Oct  2 07:33:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct  2 07:33:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct  2 07:33:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct  2 07:33:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct  2 07:33:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct  2 07:33:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Oct  2 07:33:41 np0005465988 ceph-mgr[76715]: set uid:gid to 167:167 (ceph:ceph)
Oct  2 07:33:41 np0005465988 ceph-mgr[76715]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Oct  2 07:33:41 np0005465988 ceph-mgr[76715]: pidfile_write: ignore empty --pid-file
Oct  2 07:33:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Oct  2 07:33:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e16 e16: 2 total, 2 up, 2 in
Oct  2 07:33:41 np0005465988 ceph-mon[76355]: mon.compute-0 calling monitor election
Oct  2 07:33:41 np0005465988 ceph-mon[76355]: mon.compute-2 calling monitor election
Oct  2 07:33:41 np0005465988 ceph-mon[76355]: mon.compute-1 calling monitor election
Oct  2 07:33:41 np0005465988 ceph-mon[76355]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Oct  2 07:33:41 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 07:33:41 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:41 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'alerts'
Oct  2 07:33:41 np0005465988 ceph-mgr[76715]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct  2 07:33:41 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'balancer'
Oct  2 07:33:41 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:41.455+0000 7f1a5cfaa140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct  2 07:33:41 np0005465988 ceph-mgr[76715]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct  2 07:33:41 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'cephadm'
Oct  2 07:33:41 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:41.739+0000 7f1a5cfaa140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct  2 07:33:42 np0005465988 ceph-mon[76355]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct  2 07:33:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.ypnrbl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Oct  2 07:33:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.ypnrbl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Oct  2 07:33:42 np0005465988 ceph-mon[76355]: Deploying daemon mgr.compute-1.ypnrbl on compute-1
Oct  2 07:33:42 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/1347565839' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct  2 07:33:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e17 e17: 2 total, 2 up, 2 in
Oct  2 07:33:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e18 e18: 2 total, 2 up, 2 in
Oct  2 07:33:43 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/1347565839' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct  2 07:33:43 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Oct  2 07:33:43 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'crash'
Oct  2 07:33:43 np0005465988 ceph-mgr[76715]: mgr[py] Module crash has missing NOTIFY_TYPES member
Oct  2 07:33:43 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'dashboard'
Oct  2 07:33:43 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:43.993+0000 7f1a5cfaa140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Oct  2 07:33:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e18 _set_new_cache_sizes cache_size:1019942543 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:33:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e19 e19: 2 total, 2 up, 2 in
Oct  2 07:33:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:44 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/2188088816' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct  2 07:33:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Oct  2 07:33:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Oct  2 07:33:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Oct  2 07:33:45 np0005465988 podman[76892]: 2025-10-02 11:33:45.152268055 +0000 UTC m=+0.055425792 container create d739d3aa4cf97feede403794716d4a1095c11964c38c672d7fe7ad7abd514248 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_lalande, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct  2 07:33:45 np0005465988 systemd[1]: Started libpod-conmon-d739d3aa4cf97feede403794716d4a1095c11964c38c672d7fe7ad7abd514248.scope.
Oct  2 07:33:45 np0005465988 podman[76892]: 2025-10-02 11:33:45.125039279 +0000 UTC m=+0.028197086 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:33:45 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:33:45 np0005465988 podman[76892]: 2025-10-02 11:33:45.261945424 +0000 UTC m=+0.165103241 container init d739d3aa4cf97feede403794716d4a1095c11964c38c672d7fe7ad7abd514248 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_lalande, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct  2 07:33:45 np0005465988 podman[76892]: 2025-10-02 11:33:45.276435553 +0000 UTC m=+0.179593320 container start d739d3aa4cf97feede403794716d4a1095c11964c38c672d7fe7ad7abd514248 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_lalande, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Oct  2 07:33:45 np0005465988 podman[76892]: 2025-10-02 11:33:45.281611882 +0000 UTC m=+0.184769659 container attach d739d3aa4cf97feede403794716d4a1095c11964c38c672d7fe7ad7abd514248 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_lalande, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Oct  2 07:33:45 np0005465988 bold_lalande[76908]: 167 167
Oct  2 07:33:45 np0005465988 systemd[1]: libpod-d739d3aa4cf97feede403794716d4a1095c11964c38c672d7fe7ad7abd514248.scope: Deactivated successfully.
Oct  2 07:33:45 np0005465988 podman[76892]: 2025-10-02 11:33:45.285804913 +0000 UTC m=+0.188962680 container died d739d3aa4cf97feede403794716d4a1095c11964c38c672d7fe7ad7abd514248 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_lalande, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct  2 07:33:45 np0005465988 systemd[1]: var-lib-containers-storage-overlay-fa477187647d2524db65db91167a9454a3143a659727fcb7bca4cbd148d1a019-merged.mount: Deactivated successfully.
Oct  2 07:33:45 np0005465988 podman[76892]: 2025-10-02 11:33:45.333991665 +0000 UTC m=+0.237149422 container remove d739d3aa4cf97feede403794716d4a1095c11964c38c672d7fe7ad7abd514248 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_lalande, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Oct  2 07:33:45 np0005465988 systemd[1]: libpod-conmon-d739d3aa4cf97feede403794716d4a1095c11964c38c672d7fe7ad7abd514248.scope: Deactivated successfully.
Oct  2 07:33:45 np0005465988 systemd[1]: Reloading.
Oct  2 07:33:45 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'devicehealth'
Oct  2 07:33:45 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:33:45 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:33:45 np0005465988 systemd[1]: Reloading.
Oct  2 07:33:45 np0005465988 ceph-mgr[76715]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct  2 07:33:45 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'diskprediction_local'
Oct  2 07:33:45 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:45.713+0000 7f1a5cfaa140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct  2 07:33:45 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:33:45 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:33:45 np0005465988 systemd[1]: Starting Ceph crash.compute-2 for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2...
Oct  2 07:33:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Oct  2 07:33:46 np0005465988 ceph-mon[76355]: Deploying daemon crash.compute-2 on compute-2
Oct  2 07:33:46 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/2188088816' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct  2 07:33:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Oct  2 07:33:46 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Oct  2 07:33:46 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Oct  2 07:33:46 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]:  from numpy import show_config as show_numpy_config
Oct  2 07:33:46 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:46.291+0000 7f1a5cfaa140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct  2 07:33:46 np0005465988 ceph-mgr[76715]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct  2 07:33:46 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'influx'
Oct  2 07:33:46 np0005465988 podman[77054]: 2025-10-02 11:33:46.228340183 +0000 UTC m=+0.022265874 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:33:46 np0005465988 podman[77054]: 2025-10-02 11:33:46.530581785 +0000 UTC m=+0.324507496 container create 1fd1f805ef646b78255ffe5f6fa72a3e53c36381297f637a51b97c534b7ae6d2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-crash-compute-2, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True)
Oct  2 07:33:46 np0005465988 ceph-mgr[76715]: mgr[py] Module influx has missing NOTIFY_TYPES member
Oct  2 07:33:46 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:46.549+0000 7f1a5cfaa140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Oct  2 07:33:46 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'insights'
Oct  2 07:33:46 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'iostat'
Oct  2 07:33:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e20 e20: 2 total, 2 up, 2 in
Oct  2 07:33:47 np0005465988 ceph-mgr[76715]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct  2 07:33:47 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'k8sevents'
Oct  2 07:33:47 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:47.053+0000 7f1a5cfaa140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct  2 07:33:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct  2 07:33:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct  2 07:33:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:47 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/4165571221' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct  2 07:33:48 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/338c2dfc73037c13bbc4687fb90d940eec91778dd11ee9492895d86fb03487b4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:48 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/338c2dfc73037c13bbc4687fb90d940eec91778dd11ee9492895d86fb03487b4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:48 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/338c2dfc73037c13bbc4687fb90d940eec91778dd11ee9492895d86fb03487b4/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:48 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/338c2dfc73037c13bbc4687fb90d940eec91778dd11ee9492895d86fb03487b4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e21 e21: 2 total, 2 up, 2 in
Oct  2 07:33:48 np0005465988 podman[77054]: 2025-10-02 11:33:48.360197108 +0000 UTC m=+2.154122879 container init 1fd1f805ef646b78255ffe5f6fa72a3e53c36381297f637a51b97c534b7ae6d2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-crash-compute-2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Oct  2 07:33:48 np0005465988 podman[77054]: 2025-10-02 11:33:48.36654859 +0000 UTC m=+2.160474311 container start 1fd1f805ef646b78255ffe5f6fa72a3e53c36381297f637a51b97c534b7ae6d2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-crash-compute-2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Oct  2 07:33:48 np0005465988 bash[77054]: 1fd1f805ef646b78255ffe5f6fa72a3e53c36381297f637a51b97c534b7ae6d2
Oct  2 07:33:48 np0005465988 systemd[1]: Started Ceph crash.compute-2 for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2.
Oct  2 07:33:48 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-crash-compute-2[77071]: INFO:ceph-crash:pinging cluster to exercise our key
Oct  2 07:33:48 np0005465988 ceph-mon[76355]: Health check update: 3 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct  2 07:33:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Oct  2 07:33:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Oct  2 07:33:48 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/4165571221' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct  2 07:33:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:33:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:33:48 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-crash-compute-2[77071]: 2025-10-02T11:33:48.770+0000 7f6d10148640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Oct  2 07:33:48 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-crash-compute-2[77071]: 2025-10-02T11:33:48.770+0000 7f6d10148640 -1 AuthRegistry(0x7f6d080675b0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Oct  2 07:33:48 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-crash-compute-2[77071]: 2025-10-02T11:33:48.771+0000 7f6d10148640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Oct  2 07:33:48 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-crash-compute-2[77071]: 2025-10-02T11:33:48.771+0000 7f6d10148640 -1 AuthRegistry(0x7f6d10147000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Oct  2 07:33:48 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-crash-compute-2[77071]: 2025-10-02T11:33:48.772+0000 7f6d0d6bc640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Oct  2 07:33:48 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-crash-compute-2[77071]: 2025-10-02T11:33:48.773+0000 7f6d0e6be640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Oct  2 07:33:48 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-crash-compute-2[77071]: 2025-10-02T11:33:48.773+0000 7f6d0debd640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Oct  2 07:33:48 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-crash-compute-2[77071]: 2025-10-02T11:33:48.773+0000 7f6d10148640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Oct  2 07:33:48 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-crash-compute-2[77071]: [errno 13] RADOS permission denied (error connecting to the cluster)
Oct  2 07:33:48 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-crash-compute-2[77071]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Oct  2 07:33:48 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'localpool'
Oct  2 07:33:49 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'mds_autoscaler'
Oct  2 07:33:49 np0005465988 podman[77227]: 2025-10-02 11:33:49.17815236 +0000 UTC m=+0.057019908 container create 8d8db7120c76725ef70b2cc72da6eec59a803d4bca39b5f4e598d5cc597eed0f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_lalande, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 07:33:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e22 e22: 2 total, 2 up, 2 in
Oct  2 07:33:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e22 _set_new_cache_sizes cache_size:1020053436 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:33:49 np0005465988 systemd[1]: Started libpod-conmon-8d8db7120c76725ef70b2cc72da6eec59a803d4bca39b5f4e598d5cc597eed0f.scope.
Oct  2 07:33:49 np0005465988 podman[77227]: 2025-10-02 11:33:49.151867839 +0000 UTC m=+0.030735397 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:33:49 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:33:49 np0005465988 podman[77227]: 2025-10-02 11:33:49.286792927 +0000 UTC m=+0.165660465 container init 8d8db7120c76725ef70b2cc72da6eec59a803d4bca39b5f4e598d5cc597eed0f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_lalande, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct  2 07:33:49 np0005465988 podman[77227]: 2025-10-02 11:33:49.29750965 +0000 UTC m=+0.176377158 container start 8d8db7120c76725ef70b2cc72da6eec59a803d4bca39b5f4e598d5cc597eed0f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_lalande, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Oct  2 07:33:49 np0005465988 optimistic_lalande[77243]: 167 167
Oct  2 07:33:49 np0005465988 systemd[1]: libpod-8d8db7120c76725ef70b2cc72da6eec59a803d4bca39b5f4e598d5cc597eed0f.scope: Deactivated successfully.
Oct  2 07:33:49 np0005465988 podman[77227]: 2025-10-02 11:33:49.327531058 +0000 UTC m=+0.206398576 container attach 8d8db7120c76725ef70b2cc72da6eec59a803d4bca39b5f4e598d5cc597eed0f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_lalande, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct  2 07:33:49 np0005465988 podman[77227]: 2025-10-02 11:33:49.327916097 +0000 UTC m=+0.206783615 container died 8d8db7120c76725ef70b2cc72da6eec59a803d4bca39b5f4e598d5cc597eed0f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_lalande, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Oct  2 07:33:49 np0005465988 systemd[1]: var-lib-containers-storage-overlay-258f742f328d548a4cdf953a128bc3ca2e3219a271f20c6f40b341d69f6ef89c-merged.mount: Deactivated successfully.
Oct  2 07:33:49 np0005465988 podman[77227]: 2025-10-02 11:33:49.448511148 +0000 UTC m=+0.327378656 container remove 8d8db7120c76725ef70b2cc72da6eec59a803d4bca39b5f4e598d5cc597eed0f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_lalande, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Oct  2 07:33:49 np0005465988 systemd[1]: libpod-conmon-8d8db7120c76725ef70b2cc72da6eec59a803d4bca39b5f4e598d5cc597eed0f.scope: Deactivated successfully.
Oct  2 07:33:49 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'mirroring'
Oct  2 07:33:49 np0005465988 podman[77270]: 2025-10-02 11:33:49.625727957 +0000 UTC m=+0.028773406 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:33:49 np0005465988 podman[77270]: 2025-10-02 11:33:49.740219713 +0000 UTC m=+0.143265142 container create 8ec62501ebe430c0208db7500a8f4f7bfc2a571c57b2800287517667d03016b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_dubinsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef)
Oct  2 07:33:49 np0005465988 systemd[1]: Started libpod-conmon-8ec62501ebe430c0208db7500a8f4f7bfc2a571c57b2800287517667d03016b0.scope.
Oct  2 07:33:49 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:33:49 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/63af7693035f749b57e7343ecf5dce06111d2ea644d309ef31d24d3eb20992c8/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:49 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/63af7693035f749b57e7343ecf5dce06111d2ea644d309ef31d24d3eb20992c8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:49 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/63af7693035f749b57e7343ecf5dce06111d2ea644d309ef31d24d3eb20992c8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:49 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/63af7693035f749b57e7343ecf5dce06111d2ea644d309ef31d24d3eb20992c8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:49 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/63af7693035f749b57e7343ecf5dce06111d2ea644d309ef31d24d3eb20992c8/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:49 np0005465988 podman[77270]: 2025-10-02 11:33:49.950807094 +0000 UTC m=+0.353852543 container init 8ec62501ebe430c0208db7500a8f4f7bfc2a571c57b2800287517667d03016b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_dubinsky, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct  2 07:33:49 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'nfs'
Oct  2 07:33:49 np0005465988 podman[77270]: 2025-10-02 11:33:49.968435775 +0000 UTC m=+0.371481204 container start 8ec62501ebe430c0208db7500a8f4f7bfc2a571c57b2800287517667d03016b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_dubinsky, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Oct  2 07:33:50 np0005465988 podman[77270]: 2025-10-02 11:33:50.104456521 +0000 UTC m=+0.507501970 container attach 8ec62501ebe430c0208db7500a8f4f7bfc2a571c57b2800287517667d03016b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_dubinsky, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct  2 07:33:50 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/3857097005' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct  2 07:33:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e23 e23: 2 total, 2 up, 2 in
Oct  2 07:33:50 np0005465988 ceph-mgr[76715]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct  2 07:33:50 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'orchestrator'
Oct  2 07:33:50 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:50.626+0000 7f1a5cfaa140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct  2 07:33:50 np0005465988 interesting_dubinsky[77284]: --> passed data devices: 0 physical, 1 LVM
Oct  2 07:33:50 np0005465988 interesting_dubinsky[77284]: --> relative data size: 1.0
Oct  2 07:33:50 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct  2 07:33:50 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 4adc6a3a-57df-44c5-8148-0263723a70e6
Oct  2 07:33:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "4adc6a3a-57df-44c5-8148-0263723a70e6"} v 0) v1
Oct  2 07:33:51 np0005465988 ceph-mon[76355]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1986060367' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4adc6a3a-57df-44c5-8148-0263723a70e6"}]: dispatch
Oct  2 07:33:51 np0005465988 ceph-mgr[76715]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct  2 07:33:51 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'osd_perf_query'
Oct  2 07:33:51 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:51.285+0000 7f1a5cfaa140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct  2 07:33:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e24 e24: 3 total, 2 up, 3 in
Oct  2 07:33:51 np0005465988 ceph-mgr[76715]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct  2 07:33:51 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'osd_support'
Oct  2 07:33:51 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:51.544+0000 7f1a5cfaa140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct  2 07:33:51 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/3857097005' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct  2 07:33:51 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.102:0/1986060367' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4adc6a3a-57df-44c5-8148-0263723a70e6"}]: dispatch
Oct  2 07:33:51 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4adc6a3a-57df-44c5-8148-0263723a70e6"}]: dispatch
Oct  2 07:33:51 np0005465988 ceph-mgr[76715]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct  2 07:33:51 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'pg_autoscaler'
Oct  2 07:33:51 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:51.782+0000 7f1a5cfaa140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct  2 07:33:51 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct  2 07:33:51 np0005465988 lvm[77333]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct  2 07:33:51 np0005465988 lvm[77333]: VG ceph_vg0 finished
Oct  2 07:33:51 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Oct  2 07:33:51 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Oct  2 07:33:51 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct  2 07:33:51 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Oct  2 07:33:51 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Oct  2 07:33:52 np0005465988 ceph-mgr[76715]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct  2 07:33:52 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'progress'
Oct  2 07:33:52 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:52.082+0000 7f1a5cfaa140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct  2 07:33:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0) v1
Oct  2 07:33:52 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3509981406' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Oct  2 07:33:52 np0005465988 interesting_dubinsky[77284]: stderr: got monmap epoch 3
Oct  2 07:33:52 np0005465988 interesting_dubinsky[77284]: --> Creating keyring file for osd.2
Oct  2 07:33:52 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Oct  2 07:33:52 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Oct  2 07:33:52 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid 4adc6a3a-57df-44c5-8148-0263723a70e6 --setuser ceph --setgroup ceph
Oct  2 07:33:52 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:52.346+0000 7f1a5cfaa140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Oct  2 07:33:52 np0005465988 ceph-mgr[76715]: mgr[py] Module progress has missing NOTIFY_TYPES member
Oct  2 07:33:52 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'prometheus'
Oct  2 07:33:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e25 e25: 3 total, 2 up, 3 in
Oct  2 07:33:52 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "4adc6a3a-57df-44c5-8148-0263723a70e6"}]': finished
Oct  2 07:33:52 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/4255282364' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct  2 07:33:52 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct  2 07:33:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:52 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct  2 07:33:53 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:53.385+0000 7f1a5cfaa140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct  2 07:33:53 np0005465988 ceph-mgr[76715]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct  2 07:33:53 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'rbd_support'
Oct  2 07:33:53 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:53.699+0000 7f1a5cfaa140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct  2 07:33:53 np0005465988 ceph-mgr[76715]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct  2 07:33:53 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'restful'
Oct  2 07:33:54 np0005465988 ceph-mon[76355]: Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct  2 07:33:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e26 e26: 3 total, 2 up, 3 in
Oct  2 07:33:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e26 _set_new_cache_sizes cache_size:1020054716 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:33:54 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'rgw'
Oct  2 07:33:55 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:55.147+0000 7f1a5cfaa140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct  2 07:33:55 np0005465988 ceph-mgr[76715]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct  2 07:33:55 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'rook'
Oct  2 07:33:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e27 e27: 3 total, 2 up, 3 in
Oct  2 07:33:55 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/3376646480' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Oct  2 07:33:55 np0005465988 interesting_dubinsky[77284]: stderr: 2025-10-02T11:33:52.354+0000 7fb50935c740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Oct  2 07:33:55 np0005465988 interesting_dubinsky[77284]: stderr: 2025-10-02T11:33:52.354+0000 7fb50935c740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Oct  2 07:33:55 np0005465988 interesting_dubinsky[77284]: stderr: 2025-10-02T11:33:52.355+0000 7fb50935c740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Oct  2 07:33:55 np0005465988 interesting_dubinsky[77284]: stderr: 2025-10-02T11:33:52.355+0000 7fb50935c740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Oct  2 07:33:55 np0005465988 interesting_dubinsky[77284]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Oct  2 07:33:55 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct  2 07:33:55 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Oct  2 07:33:55 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Oct  2 07:33:56 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Oct  2 07:33:56 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct  2 07:33:56 np0005465988 interesting_dubinsky[77284]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct  2 07:33:56 np0005465988 interesting_dubinsky[77284]: --> ceph-volume lvm activate successful for osd ID: 2
Oct  2 07:33:56 np0005465988 interesting_dubinsky[77284]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Oct  2 07:33:56 np0005465988 systemd[1]: libpod-8ec62501ebe430c0208db7500a8f4f7bfc2a571c57b2800287517667d03016b0.scope: Deactivated successfully.
Oct  2 07:33:56 np0005465988 systemd[1]: libpod-8ec62501ebe430c0208db7500a8f4f7bfc2a571c57b2800287517667d03016b0.scope: Consumed 2.540s CPU time.
Oct  2 07:33:56 np0005465988 podman[78239]: 2025-10-02 11:33:56.122452737 +0000 UTC m=+0.022685240 container died 8ec62501ebe430c0208db7500a8f4f7bfc2a571c57b2800287517667d03016b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_dubinsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Oct  2 07:33:56 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:56 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/3376646480' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Oct  2 07:33:56 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:33:56 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct  2 07:33:56 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct  2 07:33:56 np0005465988 systemd[1]: var-lib-containers-storage-overlay-63af7693035f749b57e7343ecf5dce06111d2ea644d309ef31d24d3eb20992c8-merged.mount: Deactivated successfully.
Oct  2 07:33:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e28 e28: 3 total, 2 up, 3 in
Oct  2 07:33:56 np0005465988 podman[78239]: 2025-10-02 11:33:56.994545852 +0000 UTC m=+0.894778385 container remove 8ec62501ebe430c0208db7500a8f4f7bfc2a571c57b2800287517667d03016b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_dubinsky, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:33:57 np0005465988 systemd[1]: libpod-conmon-8ec62501ebe430c0208db7500a8f4f7bfc2a571c57b2800287517667d03016b0.scope: Deactivated successfully.
Oct  2 07:33:57 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:57.287+0000 7f1a5cfaa140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Oct  2 07:33:57 np0005465988 ceph-mgr[76715]: mgr[py] Module rook has missing NOTIFY_TYPES member
Oct  2 07:33:57 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'selftest'
Oct  2 07:33:57 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/2736872602' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Oct  2 07:33:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Oct  2 07:33:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Oct  2 07:33:57 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:57.536+0000 7f1a5cfaa140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct  2 07:33:57 np0005465988 ceph-mgr[76715]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct  2 07:33:57 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'snap_schedule'
Oct  2 07:33:57 np0005465988 podman[78390]: 2025-10-02 11:33:57.695250709 +0000 UTC m=+0.031429114 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:33:57 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:57.792+0000 7f1a5cfaa140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct  2 07:33:57 np0005465988 ceph-mgr[76715]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct  2 07:33:57 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'stats'
Oct  2 07:33:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e29 e29: 3 total, 2 up, 3 in
Oct  2 07:33:57 np0005465988 podman[78390]: 2025-10-02 11:33:57.838746136 +0000 UTC m=+0.174924481 container create a338a87061dd3eb54a10bcd4dac80c9c7141ccd1e99bacafb10a295ce2afc137 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_lamport, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 07:33:58 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'status'
Oct  2 07:33:58 np0005465988 systemd[1]: Started libpod-conmon-a338a87061dd3eb54a10bcd4dac80c9c7141ccd1e99bacafb10a295ce2afc137.scope.
Oct  2 07:33:58 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:33:58 np0005465988 podman[78390]: 2025-10-02 11:33:58.205253992 +0000 UTC m=+0.541432357 container init a338a87061dd3eb54a10bcd4dac80c9c7141ccd1e99bacafb10a295ce2afc137 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_lamport, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Oct  2 07:33:58 np0005465988 podman[78390]: 2025-10-02 11:33:58.217848954 +0000 UTC m=+0.554027299 container start a338a87061dd3eb54a10bcd4dac80c9c7141ccd1e99bacafb10a295ce2afc137 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_lamport, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct  2 07:33:58 np0005465988 podman[78390]: 2025-10-02 11:33:58.223465167 +0000 UTC m=+0.559643562 container attach a338a87061dd3eb54a10bcd4dac80c9c7141ccd1e99bacafb10a295ce2afc137 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_lamport, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct  2 07:33:58 np0005465988 condescending_lamport[78406]: 167 167
Oct  2 07:33:58 np0005465988 systemd[1]: libpod-a338a87061dd3eb54a10bcd4dac80c9c7141ccd1e99bacafb10a295ce2afc137.scope: Deactivated successfully.
Oct  2 07:33:58 np0005465988 podman[78390]: 2025-10-02 11:33:58.226526765 +0000 UTC m=+0.562705110 container died a338a87061dd3eb54a10bcd4dac80c9c7141ccd1e99bacafb10a295ce2afc137 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_lamport, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:33:58 np0005465988 systemd[1]: var-lib-containers-storage-overlay-27725709d54d0742263f3c38be9f2522c9d035dd7fe0dabcfdad7f524a463ff6-merged.mount: Deactivated successfully.
Oct  2 07:33:58 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:58.293+0000 7f1a5cfaa140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Oct  2 07:33:58 np0005465988 ceph-mgr[76715]: mgr[py] Module status has missing NOTIFY_TYPES member
Oct  2 07:33:58 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'telegraf'
Oct  2 07:33:58 np0005465988 podman[78390]: 2025-10-02 11:33:58.319407489 +0000 UTC m=+0.655585844 container remove a338a87061dd3eb54a10bcd4dac80c9c7141ccd1e99bacafb10a295ce2afc137 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_lamport, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Oct  2 07:33:58 np0005465988 systemd[1]: libpod-conmon-a338a87061dd3eb54a10bcd4dac80c9c7141ccd1e99bacafb10a295ce2afc137.scope: Deactivated successfully.
Oct  2 07:33:58 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:58.533+0000 7f1a5cfaa140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct  2 07:33:58 np0005465988 ceph-mgr[76715]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct  2 07:33:58 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'telemetry'
Oct  2 07:33:58 np0005465988 podman[78430]: 2025-10-02 11:33:58.552179827 +0000 UTC m=+0.068048810 container create f249cf8c1e63a9bf87f6c27c39b8ef7333a7de81863bfe089a23bc729b5da60d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_diffie, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Oct  2 07:33:58 np0005465988 systemd[1]: Started libpod-conmon-f249cf8c1e63a9bf87f6c27c39b8ef7333a7de81863bfe089a23bc729b5da60d.scope.
Oct  2 07:33:58 np0005465988 podman[78430]: 2025-10-02 11:33:58.52096825 +0000 UTC m=+0.036837323 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:33:58 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:33:58 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5af44d2c908aa064a8fce4fe7712e174022d086856a6b5dc6d8b3f61e76e73c7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:58 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5af44d2c908aa064a8fce4fe7712e174022d086856a6b5dc6d8b3f61e76e73c7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:58 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5af44d2c908aa064a8fce4fe7712e174022d086856a6b5dc6d8b3f61e76e73c7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:58 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5af44d2c908aa064a8fce4fe7712e174022d086856a6b5dc6d8b3f61e76e73c7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct  2 07:33:58 np0005465988 podman[78430]: 2025-10-02 11:33:58.651325541 +0000 UTC m=+0.167194544 container init f249cf8c1e63a9bf87f6c27c39b8ef7333a7de81863bfe089a23bc729b5da60d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_diffie, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Oct  2 07:33:58 np0005465988 podman[78430]: 2025-10-02 11:33:58.663664376 +0000 UTC m=+0.179533359 container start f249cf8c1e63a9bf87f6c27c39b8ef7333a7de81863bfe089a23bc729b5da60d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_diffie, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True)
Oct  2 07:33:58 np0005465988 podman[78430]: 2025-10-02 11:33:58.667074453 +0000 UTC m=+0.182943436 container attach f249cf8c1e63a9bf87f6c27c39b8ef7333a7de81863bfe089a23bc729b5da60d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_diffie, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Oct  2 07:33:58 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/2736872602' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Oct  2 07:33:59 np0005465988 ceph-mgr[76715]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct  2 07:33:59 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'test_orchestrator'
Oct  2 07:33:59 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:59.128+0000 7f1a5cfaa140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct  2 07:33:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e29 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:33:59 np0005465988 elated_diffie[78446]: {
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:    "2": [
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:        {
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:            "devices": [
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:                "/dev/loop3"
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:            ],
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:            "lv_name": "ceph_lv0",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:            "lv_size": "7511998464",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=sxu8yl-OuMM-mDOr-uTZ2-7RoF-WruS-3MtCet,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=fd4c5763-22d1-50ea-ad0b-96a3dc3040b2,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=4adc6a3a-57df-44c5-8148-0263723a70e6,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:            "lv_uuid": "sxu8yl-OuMM-mDOr-uTZ2-7RoF-WruS-3MtCet",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:            "name": "ceph_lv0",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:            "path": "/dev/ceph_vg0/ceph_lv0",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:            "tags": {
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:                "ceph.block_uuid": "sxu8yl-OuMM-mDOr-uTZ2-7RoF-WruS-3MtCet",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:                "ceph.cephx_lockbox_secret": "",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:                "ceph.cluster_fsid": "fd4c5763-22d1-50ea-ad0b-96a3dc3040b2",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:                "ceph.cluster_name": "ceph",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:                "ceph.crush_device_class": "",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:                "ceph.encrypted": "0",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:                "ceph.osd_fsid": "4adc6a3a-57df-44c5-8148-0263723a70e6",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:                "ceph.osd_id": "2",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:                "ceph.osdspec_affinity": "default_drive_group",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:                "ceph.type": "block",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:                "ceph.vdo": "0"
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:            },
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:            "type": "block",
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:            "vg_name": "ceph_vg0"
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:        }
Oct  2 07:33:59 np0005465988 elated_diffie[78446]:    ]
Oct  2 07:33:59 np0005465988 elated_diffie[78446]: }
Oct  2 07:33:59 np0005465988 systemd[1]: libpod-f249cf8c1e63a9bf87f6c27c39b8ef7333a7de81863bfe089a23bc729b5da60d.scope: Deactivated successfully.
Oct  2 07:33:59 np0005465988 podman[78430]: 2025-10-02 11:33:59.502727058 +0000 UTC m=+1.018596041 container died f249cf8c1e63a9bf87f6c27c39b8ef7333a7de81863bfe089a23bc729b5da60d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_diffie, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 07:33:59 np0005465988 systemd[1]: var-lib-containers-storage-overlay-5af44d2c908aa064a8fce4fe7712e174022d086856a6b5dc6d8b3f61e76e73c7-merged.mount: Deactivated successfully.
Oct  2 07:33:59 np0005465988 podman[78430]: 2025-10-02 11:33:59.574036941 +0000 UTC m=+1.089905954 container remove f249cf8c1e63a9bf87f6c27c39b8ef7333a7de81863bfe089a23bc729b5da60d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_diffie, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:33:59 np0005465988 systemd[1]: libpod-conmon-f249cf8c1e63a9bf87f6c27c39b8ef7333a7de81863bfe089a23bc729b5da60d.scope: Deactivated successfully.
Oct  2 07:33:59 np0005465988 ceph-mon[76355]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct  2 07:33:59 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/2891127426' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Oct  2 07:33:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Oct  2 07:33:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e30 e30: 3 total, 2 up, 3 in
Oct  2 07:33:59 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:33:59.799+0000 7f1a5cfaa140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct  2 07:33:59 np0005465988 ceph-mgr[76715]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct  2 07:33:59 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'volumes'
Oct  2 07:34:00 np0005465988 podman[78609]: 2025-10-02 11:34:00.322401584 +0000 UTC m=+0.040913227 container create 40f8382802d695032b2e73eaa9ef51a296ba8df8e34a80ecfaf49d260b45f16f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_cartwright, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Oct  2 07:34:00 np0005465988 systemd[1]: Started libpod-conmon-40f8382802d695032b2e73eaa9ef51a296ba8df8e34a80ecfaf49d260b45f16f.scope.
Oct  2 07:34:00 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:34:00 np0005465988 podman[78609]: 2025-10-02 11:34:00.305981294 +0000 UTC m=+0.024493017 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:34:00 np0005465988 podman[78609]: 2025-10-02 11:34:00.405052036 +0000 UTC m=+0.123563769 container init 40f8382802d695032b2e73eaa9ef51a296ba8df8e34a80ecfaf49d260b45f16f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_cartwright, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef)
Oct  2 07:34:00 np0005465988 podman[78609]: 2025-10-02 11:34:00.413787659 +0000 UTC m=+0.132299342 container start 40f8382802d695032b2e73eaa9ef51a296ba8df8e34a80ecfaf49d260b45f16f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_cartwright, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 07:34:00 np0005465988 podman[78609]: 2025-10-02 11:34:00.418148991 +0000 UTC m=+0.136660674 container attach 40f8382802d695032b2e73eaa9ef51a296ba8df8e34a80ecfaf49d260b45f16f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_cartwright, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Oct  2 07:34:00 np0005465988 funny_cartwright[78625]: 167 167
Oct  2 07:34:00 np0005465988 systemd[1]: libpod-40f8382802d695032b2e73eaa9ef51a296ba8df8e34a80ecfaf49d260b45f16f.scope: Deactivated successfully.
Oct  2 07:34:00 np0005465988 podman[78609]: 2025-10-02 11:34:00.419462334 +0000 UTC m=+0.137974057 container died 40f8382802d695032b2e73eaa9ef51a296ba8df8e34a80ecfaf49d260b45f16f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_cartwright, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Oct  2 07:34:00 np0005465988 systemd[1]: var-lib-containers-storage-overlay-47fce22b64ef4bdbac57acaf0c575f3186b8e62bb962b7ad1310612f5b3b47c7-merged.mount: Deactivated successfully.
Oct  2 07:34:00 np0005465988 podman[78609]: 2025-10-02 11:34:00.468026925 +0000 UTC m=+0.186538618 container remove 40f8382802d695032b2e73eaa9ef51a296ba8df8e34a80ecfaf49d260b45f16f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_cartwright, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Oct  2 07:34:00 np0005465988 systemd[1]: libpod-conmon-40f8382802d695032b2e73eaa9ef51a296ba8df8e34a80ecfaf49d260b45f16f.scope: Deactivated successfully.
Oct  2 07:34:00 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:34:00.549+0000 7f1a5cfaa140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct  2 07:34:00 np0005465988 ceph-mgr[76715]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct  2 07:34:00 np0005465988 ceph-mgr[76715]: mgr[py] Loading python module 'zabbix'
Oct  2 07:34:00 np0005465988 podman[78656]: 2025-10-02 11:34:00.796470758 +0000 UTC m=+0.055807377 container create 26fa6140399a7a7999de1fe81ca5abc18f6ca35fb47bc707f5d3b312a2f1162b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate-test, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 07:34:00 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mgr-compute-2-rbjjpf[76711]: 2025-10-02T11:34:00.808+0000 7f1a5cfaa140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct  2 07:34:00 np0005465988 ceph-mgr[76715]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct  2 07:34:00 np0005465988 ceph-mgr[76715]: ms_deliver_dispatch: unhandled message 0x55a298e1f1e0 mon_map magic: 0 v1 from mon.0 v2:192.168.122.100:3300/0
Oct  2 07:34:00 np0005465988 ceph-mgr[76715]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3158772141
Oct  2 07:34:00 np0005465988 ceph-mon[76355]: Deploying daemon osd.2 on compute-2
Oct  2 07:34:00 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/2891127426' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Oct  2 07:34:00 np0005465988 systemd[1]: Started libpod-conmon-26fa6140399a7a7999de1fe81ca5abc18f6ca35fb47bc707f5d3b312a2f1162b.scope.
Oct  2 07:34:00 np0005465988 podman[78656]: 2025-10-02 11:34:00.774644931 +0000 UTC m=+0.033981560 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:34:00 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:34:00 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17514575eef17d7fcac9a699e0fe5837cf5c1d3c25be1e508713a4d2caffc578/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:00 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17514575eef17d7fcac9a699e0fe5837cf5c1d3c25be1e508713a4d2caffc578/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:00 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17514575eef17d7fcac9a699e0fe5837cf5c1d3c25be1e508713a4d2caffc578/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:00 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17514575eef17d7fcac9a699e0fe5837cf5c1d3c25be1e508713a4d2caffc578/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:00 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17514575eef17d7fcac9a699e0fe5837cf5c1d3c25be1e508713a4d2caffc578/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:00 np0005465988 podman[78656]: 2025-10-02 11:34:00.898694661 +0000 UTC m=+0.158031300 container init 26fa6140399a7a7999de1fe81ca5abc18f6ca35fb47bc707f5d3b312a2f1162b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate-test, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Oct  2 07:34:00 np0005465988 podman[78656]: 2025-10-02 11:34:00.914249258 +0000 UTC m=+0.173585847 container start 26fa6140399a7a7999de1fe81ca5abc18f6ca35fb47bc707f5d3b312a2f1162b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Oct  2 07:34:00 np0005465988 podman[78656]: 2025-10-02 11:34:00.91863413 +0000 UTC m=+0.177970719 container attach 26fa6140399a7a7999de1fe81ca5abc18f6ca35fb47bc707f5d3b312a2f1162b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:34:01 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate-test[78672]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Oct  2 07:34:01 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate-test[78672]:                            [--no-systemd] [--no-tmpfs]
Oct  2 07:34:01 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate-test[78672]: ceph-volume activate: error: unrecognized arguments: --bad-option
Oct  2 07:34:01 np0005465988 systemd[1]: libpod-26fa6140399a7a7999de1fe81ca5abc18f6ca35fb47bc707f5d3b312a2f1162b.scope: Deactivated successfully.
Oct  2 07:34:01 np0005465988 podman[78656]: 2025-10-02 11:34:01.553173296 +0000 UTC m=+0.812509895 container died 26fa6140399a7a7999de1fe81ca5abc18f6ca35fb47bc707f5d3b312a2f1162b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate-test, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 07:34:01 np0005465988 systemd[1]: var-lib-containers-storage-overlay-17514575eef17d7fcac9a699e0fe5837cf5c1d3c25be1e508713a4d2caffc578-merged.mount: Deactivated successfully.
Oct  2 07:34:01 np0005465988 podman[78656]: 2025-10-02 11:34:01.6069401 +0000 UTC m=+0.866276669 container remove 26fa6140399a7a7999de1fe81ca5abc18f6ca35fb47bc707f5d3b312a2f1162b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3)
Oct  2 07:34:01 np0005465988 systemd[1]: libpod-conmon-26fa6140399a7a7999de1fe81ca5abc18f6ca35fb47bc707f5d3b312a2f1162b.scope: Deactivated successfully.
Oct  2 07:34:01 np0005465988 ceph-mgr[76715]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3158772141
Oct  2 07:34:01 np0005465988 systemd[1]: Reloading.
Oct  2 07:34:01 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:34:01 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:34:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e31 e31: 3 total, 2 up, 3 in
Oct  2 07:34:02 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/3239065561' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Oct  2 07:34:02 np0005465988 systemd[1]: Reloading.
Oct  2 07:34:02 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:34:02 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:34:02 np0005465988 systemd[1]: Starting Ceph osd.2 for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2...
Oct  2 07:34:02 np0005465988 podman[78834]: 2025-10-02 11:34:02.69012506 +0000 UTC m=+0.057242884 container create 5a70b39fe66ce07da7f56b5c026abffff4d0e230e55259b96a31f02b7af1a5ad (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct  2 07:34:02 np0005465988 podman[78834]: 2025-10-02 11:34:02.658903842 +0000 UTC m=+0.026021746 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:34:02 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:34:02 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a1cfef0f88224c3ec0dcaedfe09afa4f6852ac4bfc9ac726405dc3fea4f750f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:02 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a1cfef0f88224c3ec0dcaedfe09afa4f6852ac4bfc9ac726405dc3fea4f750f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:02 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a1cfef0f88224c3ec0dcaedfe09afa4f6852ac4bfc9ac726405dc3fea4f750f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:02 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a1cfef0f88224c3ec0dcaedfe09afa4f6852ac4bfc9ac726405dc3fea4f750f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:02 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a1cfef0f88224c3ec0dcaedfe09afa4f6852ac4bfc9ac726405dc3fea4f750f/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:02 np0005465988 podman[78834]: 2025-10-02 11:34:02.806768361 +0000 UTC m=+0.173886265 container init 5a70b39fe66ce07da7f56b5c026abffff4d0e230e55259b96a31f02b7af1a5ad (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 07:34:02 np0005465988 podman[78834]: 2025-10-02 11:34:02.814388286 +0000 UTC m=+0.181506140 container start 5a70b39fe66ce07da7f56b5c026abffff4d0e230e55259b96a31f02b7af1a5ad (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct  2 07:34:02 np0005465988 podman[78834]: 2025-10-02 11:34:02.819057415 +0000 UTC m=+0.186175269 container attach 5a70b39fe66ce07da7f56b5c026abffff4d0e230e55259b96a31f02b7af1a5ad (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct  2 07:34:03 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/3239065561' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Oct  2 07:34:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:03 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate[78849]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct  2 07:34:03 np0005465988 bash[78834]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct  2 07:34:03 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate[78849]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Oct  2 07:34:03 np0005465988 bash[78834]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Oct  2 07:34:03 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate[78849]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Oct  2 07:34:03 np0005465988 bash[78834]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Oct  2 07:34:03 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate[78849]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct  2 07:34:03 np0005465988 bash[78834]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct  2 07:34:03 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate[78849]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Oct  2 07:34:03 np0005465988 bash[78834]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Oct  2 07:34:03 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate[78849]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct  2 07:34:03 np0005465988 bash[78834]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct  2 07:34:03 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate[78849]: --> ceph-volume raw activate successful for osd ID: 2
Oct  2 07:34:03 np0005465988 bash[78834]: --> ceph-volume raw activate successful for osd ID: 2
Oct  2 07:34:03 np0005465988 systemd[1]: libpod-5a70b39fe66ce07da7f56b5c026abffff4d0e230e55259b96a31f02b7af1a5ad.scope: Deactivated successfully.
Oct  2 07:34:03 np0005465988 podman[78834]: 2025-10-02 11:34:03.68802176 +0000 UTC m=+1.055139564 container died 5a70b39fe66ce07da7f56b5c026abffff4d0e230e55259b96a31f02b7af1a5ad (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 07:34:03 np0005465988 systemd[1]: var-lib-containers-storage-overlay-3a1cfef0f88224c3ec0dcaedfe09afa4f6852ac4bfc9ac726405dc3fea4f750f-merged.mount: Deactivated successfully.
Oct  2 07:34:03 np0005465988 podman[78834]: 2025-10-02 11:34:03.843480113 +0000 UTC m=+1.210597927 container remove 5a70b39fe66ce07da7f56b5c026abffff4d0e230e55259b96a31f02b7af1a5ad (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2-activate, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 07:34:04 np0005465988 podman[79019]: 2025-10-02 11:34:04.105999901 +0000 UTC m=+0.069154908 container create 5849831e42ec9a1a60aab716731dd0cb20382571625506aa1ccbf8250e9fcdad (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct  2 07:34:04 np0005465988 podman[79019]: 2025-10-02 11:34:04.063460034 +0000 UTC m=+0.026615061 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:34:04 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24971bf2efa530daf3b516788fc160d930de283f987eddb5115e624dacff06fe/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:04 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24971bf2efa530daf3b516788fc160d930de283f987eddb5115e624dacff06fe/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:04 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24971bf2efa530daf3b516788fc160d930de283f987eddb5115e624dacff06fe/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:04 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24971bf2efa530daf3b516788fc160d930de283f987eddb5115e624dacff06fe/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:04 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24971bf2efa530daf3b516788fc160d930de283f987eddb5115e624dacff06fe/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:04 np0005465988 podman[79019]: 2025-10-02 11:34:04.193290462 +0000 UTC m=+0.156445489 container init 5849831e42ec9a1a60aab716731dd0cb20382571625506aa1ccbf8250e9fcdad (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True)
Oct  2 07:34:04 np0005465988 podman[79019]: 2025-10-02 11:34:04.20298603 +0000 UTC m=+0.166141037 container start 5849831e42ec9a1a60aab716731dd0cb20382571625506aa1ccbf8250e9fcdad (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True)
Oct  2 07:34:04 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/817273948' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Oct  2 07:34:04 np0005465988 bash[79019]: 5849831e42ec9a1a60aab716731dd0cb20382571625506aa1ccbf8250e9fcdad
Oct  2 07:34:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: set uid:gid to 167:167 (ceph:ceph)
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: pidfile_write: ignore empty --pid-file
Oct  2 07:34:04 np0005465988 systemd[1]: Started Ceph osd.2 for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2.
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bdev(0x56127cf91c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bdev(0x56127cf91c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bdev(0x56127cf91c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bdev(0x56127dd9d000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bdev(0x56127dd9d000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bdev(0x56127dd9d000 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bdev(0x56127dd9d000 /var/lib/ceph/osd/ceph-2/block) close
Oct  2 07:34:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e32 e32: 3 total, 2 up, 3 in
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bdev(0x56127cf91c00 /var/lib/ceph/osd/ceph-2/block) close
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: load: jerasure load: lrc 
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bdev(0x56127de24c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bdev(0x56127de24c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bdev(0x56127de24c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Oct  2 07:34:04 np0005465988 ceph-osd[79039]: bdev(0x56127de24c00 /var/lib/ceph/osd/ceph-2/block) close
Oct  2 07:34:05 np0005465988 podman[79199]: 2025-10-02 11:34:04.937958512 +0000 UTC m=+0.029230978 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bdev(0x56127de24c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bdev(0x56127de24c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bdev(0x56127de24c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bdev(0x56127de24c00 /var/lib/ceph/osd/ceph-2/block) close
Oct  2 07:34:05 np0005465988 podman[79199]: 2025-10-02 11:34:05.136645499 +0000 UTC m=+0.227917945 container create 9b9b60e60ac92f31900c96c5472f6338fa5a3d82e39c5c4765c30b6da17f6611 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_dubinsky, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct  2 07:34:05 np0005465988 systemd[1]: Started libpod-conmon-9b9b60e60ac92f31900c96c5472f6338fa5a3d82e39c5c4765c30b6da17f6611.scope.
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bdev(0x56127de24c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bdev(0x56127de24c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bdev(0x56127de24c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bdev(0x56127de25400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bdev(0x56127de25400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bdev(0x56127de25400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluefs mount
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluefs mount shared_bdev_used = 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Oct  2 07:34:05 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: RocksDB version: 7.9.2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Git sha 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Compile date 2025-05-06 23:30:25
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: DB SUMMARY
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: DB Session ID:  C2HNWLOLFWI1DPDQ5AJT
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: CURRENT file:  CURRENT
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: IDENTITY file:  IDENTITY
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                         Options.error_if_exists: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.create_if_missing: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                         Options.paranoid_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.flush_verify_memtable_count: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                                     Options.env: 0x56127de27f10
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                                      Options.fs: LegacyFileSystem
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                                Options.info_log: 0x56127d01c720
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_file_opening_threads: 16
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                              Options.statistics: (nil)
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.use_fsync: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.max_log_file_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.log_file_time_to_roll: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.keep_log_file_num: 1000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.recycle_log_file_num: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                         Options.allow_fallocate: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.allow_mmap_reads: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.allow_mmap_writes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.use_direct_reads: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.create_missing_column_families: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                              Options.db_log_dir: 
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                                 Options.wal_dir: db.wal
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.table_cache_numshardbits: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                         Options.WAL_ttl_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.WAL_size_limit_MB: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.manifest_preallocation_size: 4194304
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                     Options.is_fd_close_on_exec: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.advise_random_on_open: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.db_write_buffer_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.write_buffer_manager: 0x56127df2e6e0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.access_hint_on_compaction_start: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                      Options.use_adaptive_mutex: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                            Options.rate_limiter: (nil)
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.wal_recovery_mode: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.enable_thread_tracking: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.enable_pipelined_write: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.unordered_write: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.write_thread_max_yield_usec: 100
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.row_cache: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                              Options.wal_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.avoid_flush_during_recovery: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.allow_ingest_behind: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.two_write_queues: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.manual_wal_flush: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.wal_compression: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.atomic_flush: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.persist_stats_to_disk: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.write_dbid_to_manifest: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.log_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.best_efforts_recovery: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.allow_data_in_errors: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.db_host_id: __hostname__
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.enforce_single_del_contracts: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.max_background_jobs: 4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.max_background_compactions: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.max_subcompactions: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.writable_file_max_buffer_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.delayed_write_rate : 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.max_total_wal_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.stats_dump_period_sec: 600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.stats_persist_period_sec: 600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.max_open_files: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.bytes_per_sync: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                      Options.wal_bytes_per_sync: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.strict_bytes_per_sync: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.compaction_readahead_size: 2097152
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.max_background_flushes: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Compression algorithms supported:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: 	kZSTD supported: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: 	kXpressCompression supported: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: 	kBZip2Compression supported: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: 	kLZ4Compression supported: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: 	kZlibCompression supported: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: 	kLZ4HCCompression supported: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: 	kSnappyCompression supported: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Fast CRC32 supported: Supported on x86
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: DMutex implementation: pthread_mutex_t
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127d01cd80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x56127d004430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127d01cd80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x56127d004430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127d01cd80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x56127d004430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127d01cd80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x56127d004430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127d01cd80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x56127d004430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127d01cd80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x56127d004430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127d01cd80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x56127d004430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127d01cd60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x56127d0042d0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127d01cd60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x56127d0042d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127d01cd60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x56127d0042d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 5e16aebc-f2e1-436a-9d14-0202849346d4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759404845368888, "job": 1, "event": "recovery_started", "wal_files": [31]}
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759404845369123, "job": 1, "event": "recovery_finished"}
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: freelist init
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: freelist _read_cfg
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluefs umount
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bdev(0x56127de25400 /var/lib/ceph/osd/ceph-2/block) close
Oct  2 07:34:05 np0005465988 podman[79199]: 2025-10-02 11:34:05.512532225 +0000 UTC m=+0.603804721 container init 9b9b60e60ac92f31900c96c5472f6338fa5a3d82e39c5c4765c30b6da17f6611 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_dubinsky, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef)
Oct  2 07:34:05 np0005465988 podman[79199]: 2025-10-02 11:34:05.520706754 +0000 UTC m=+0.611979200 container start 9b9b60e60ac92f31900c96c5472f6338fa5a3d82e39c5c4765c30b6da17f6611 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_dubinsky, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct  2 07:34:05 np0005465988 cranky_dubinsky[79219]: 167 167
Oct  2 07:34:05 np0005465988 systemd[1]: libpod-9b9b60e60ac92f31900c96c5472f6338fa5a3d82e39c5c4765c30b6da17f6611.scope: Deactivated successfully.
Oct  2 07:34:05 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/817273948' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Oct  2 07:34:05 np0005465988 ceph-mon[76355]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct  2 07:34:05 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:05 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bdev(0x56127de25400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bdev(0x56127de25400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bdev(0x56127de25400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluefs mount
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluefs mount shared_bdev_used = 4718592
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: RocksDB version: 7.9.2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Git sha 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Compile date 2025-05-06 23:30:25
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: DB SUMMARY
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: DB Session ID:  C2HNWLOLFWI1DPDQ5AJS
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: CURRENT file:  CURRENT
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: IDENTITY file:  IDENTITY
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                         Options.error_if_exists: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.create_if_missing: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                         Options.paranoid_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.flush_verify_memtable_count: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                                     Options.env: 0x56127dff0a10
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                                      Options.fs: LegacyFileSystem
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                                Options.info_log: 0x56127de1f660
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_file_opening_threads: 16
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                              Options.statistics: (nil)
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.use_fsync: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.max_log_file_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.log_file_time_to_roll: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.keep_log_file_num: 1000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.recycle_log_file_num: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                         Options.allow_fallocate: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.allow_mmap_reads: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.allow_mmap_writes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.use_direct_reads: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.create_missing_column_families: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                              Options.db_log_dir: 
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                                 Options.wal_dir: db.wal
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.table_cache_numshardbits: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                         Options.WAL_ttl_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.WAL_size_limit_MB: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.manifest_preallocation_size: 4194304
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                     Options.is_fd_close_on_exec: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.advise_random_on_open: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.db_write_buffer_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.write_buffer_manager: 0x56127df2e8c0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.access_hint_on_compaction_start: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                      Options.use_adaptive_mutex: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                            Options.rate_limiter: (nil)
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.wal_recovery_mode: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.enable_thread_tracking: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.enable_pipelined_write: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.unordered_write: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.write_thread_max_yield_usec: 100
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.row_cache: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                              Options.wal_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.avoid_flush_during_recovery: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.allow_ingest_behind: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.two_write_queues: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.manual_wal_flush: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.wal_compression: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.atomic_flush: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.persist_stats_to_disk: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.write_dbid_to_manifest: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.log_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.best_efforts_recovery: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.allow_data_in_errors: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.db_host_id: __hostname__
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.enforce_single_del_contracts: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.max_background_jobs: 4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.max_background_compactions: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.max_subcompactions: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.writable_file_max_buffer_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.delayed_write_rate : 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.max_total_wal_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.stats_dump_period_sec: 600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.stats_persist_period_sec: 600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.max_open_files: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.bytes_per_sync: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                      Options.wal_bytes_per_sync: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.strict_bytes_per_sync: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.compaction_readahead_size: 2097152
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.max_background_flushes: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Compression algorithms supported:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: #011kZSTD supported: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: #011kXpressCompression supported: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: #011kBZip2Compression supported: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: #011kLZ4Compression supported: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: #011kZlibCompression supported: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: #011kLZ4HCCompression supported: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: #011kSnappyCompression supported: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Fast CRC32 supported: Supported on x86
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: DMutex implementation: pthread_mutex_t
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127de1fce0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x56127d005770#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127de1fce0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x56127d005770#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127de1fce0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x56127d005770#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127de1fce0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x56127d005770#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127de1fce0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x56127d005770#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127de1fce0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x56127d005770#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127de1fce0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x56127d005770#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127de1f280)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x56127d0058d0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127de1f280)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x56127d0058d0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:           Options.merge_operator: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.compaction_filter_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.sst_partitioner_factory: None
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.table_factory: BlockBasedTable
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56127de1f280)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x56127d0058d0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.write_buffer_size: 16777216
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.max_write_buffer_number: 64
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.compression: LZ4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression: Disabled
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.num_levels: 7
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:            Options.compression_opts.window_bits: -14
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.level: 32767
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.compression_opts.strategy: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                  Options.compression_opts.enabled: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.target_file_size_base: 67108864
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:             Options.target_file_size_multiplier: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.arena_block_size: 1048576
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.disable_auto_compactions: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.inplace_update_support: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:   Options.memtable_huge_page_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.bloom_locality: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                    Options.max_successive_merges: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.paranoid_file_checks: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.force_consistency_checks: 1
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.report_bg_io_stats: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                               Options.ttl: 2592000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                       Options.enable_blob_files: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                           Options.min_blob_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                          Options.blob_file_size: 268435456
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb:                Options.blob_file_starting_level: 0
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 5e16aebc-f2e1-436a-9d14-0202849346d4
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759404845622392, "job": 1, "event": "recovery_started", "wal_files": [31]}
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Oct  2 07:34:05 np0005465988 podman[79199]: 2025-10-02 11:34:05.739090754 +0000 UTC m=+0.830363300 container attach 9b9b60e60ac92f31900c96c5472f6338fa5a3d82e39c5c4765c30b6da17f6611 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_dubinsky, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:34:05 np0005465988 podman[79199]: 2025-10-02 11:34:05.740197723 +0000 UTC m=+0.831470269 container died 9b9b60e60ac92f31900c96c5472f6338fa5a3d82e39c5c4765c30b6da17f6611 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_dubinsky, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759404845764644, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404845, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5e16aebc-f2e1-436a-9d14-0202849346d4", "db_session_id": "C2HNWLOLFWI1DPDQ5AJS", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759404845796169, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404845, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5e16aebc-f2e1-436a-9d14-0202849346d4", "db_session_id": "C2HNWLOLFWI1DPDQ5AJS", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759404845820829, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404845, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5e16aebc-f2e1-436a-9d14-0202849346d4", "db_session_id": "C2HNWLOLFWI1DPDQ5AJS", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759404845888806, "job": 1, "event": "recovery_finished"}
Oct  2 07:34:05 np0005465988 ceph-osd[79039]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Oct  2 07:34:06 np0005465988 systemd[1]: var-lib-containers-storage-overlay-64e160cb96b217bf29772e0e271a06e911d9b57cbc96442523837a288115f958-merged.mount: Deactivated successfully.
Oct  2 07:34:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e33 e33: 3 total, 2 up, 3 in
Oct  2 07:34:06 np0005465988 podman[79199]: 2025-10-02 11:34:06.515266389 +0000 UTC m=+1.606538835 container remove 9b9b60e60ac92f31900c96c5472f6338fa5a3d82e39c5c4765c30b6da17f6611 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_dubinsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct  2 07:34:06 np0005465988 systemd[1]: libpod-conmon-9b9b60e60ac92f31900c96c5472f6338fa5a3d82e39c5c4765c30b6da17f6611.scope: Deactivated successfully.
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x56127de4a700
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: rocksdb: DB pointer 0x56127d047a00
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1.0 total, 1.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1.0 total, 1.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x56127d005770#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1.0 total, 1.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x56127d005770#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.9 total, 0.9 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x56127d005770#2 capacity: 460.80 MB usag
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: _get_class not permitted to load lua
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: _get_class not permitted to load sdk
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: _get_class not permitted to load test_remote_reads
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: osd.2 0 load_pgs
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: osd.2 0 load_pgs opened 0 pgs
Oct  2 07:34:06 np0005465988 ceph-osd[79039]: osd.2 0 log_to_monitors true
Oct  2 07:34:06 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2[79035]: 2025-10-02T11:34:06.561+0000 7f905f1a3740 -1 osd.2 0 log_to_monitors true
Oct  2 07:34:06 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/1622950684' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Oct  2 07:34:06 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/1622950684' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Oct  2 07:34:06 np0005465988 podman[79653]: 2025-10-02 11:34:06.713921956 +0000 UTC m=+0.060579899 container create 31047783c3768e7508a83429be102952ec2d8f9fb8a264a994278e06e6776482 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_euler, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:34:06 np0005465988 systemd[1]: Started libpod-conmon-31047783c3768e7508a83429be102952ec2d8f9fb8a264a994278e06e6776482.scope.
Oct  2 07:34:06 np0005465988 podman[79653]: 2025-10-02 11:34:06.68081698 +0000 UTC m=+0.027475013 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:34:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0) v1
Oct  2 07:34:06 np0005465988 ceph-mon[76355]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/1863586281,v1:192.168.122.102:6801/1863586281]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Oct  2 07:34:06 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:34:06 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eda0b6472d86d5c10c5ec0e247af65e0d7b189a5a574027af44a1b5ec95cdc67/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:06 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eda0b6472d86d5c10c5ec0e247af65e0d7b189a5a574027af44a1b5ec95cdc67/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:06 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eda0b6472d86d5c10c5ec0e247af65e0d7b189a5a574027af44a1b5ec95cdc67/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:06 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eda0b6472d86d5c10c5ec0e247af65e0d7b189a5a574027af44a1b5ec95cdc67/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:06 np0005465988 podman[79653]: 2025-10-02 11:34:06.847679994 +0000 UTC m=+0.194338027 container init 31047783c3768e7508a83429be102952ec2d8f9fb8a264a994278e06e6776482 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_euler, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Oct  2 07:34:06 np0005465988 podman[79653]: 2025-10-02 11:34:06.86082614 +0000 UTC m=+0.207484093 container start 31047783c3768e7508a83429be102952ec2d8f9fb8a264a994278e06e6776482 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_euler, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Oct  2 07:34:06 np0005465988 podman[79653]: 2025-10-02 11:34:06.873926224 +0000 UTC m=+0.220584167 container attach 31047783c3768e7508a83429be102952ec2d8f9fb8a264a994278e06e6776482 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_euler, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Oct  2 07:34:07 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Oct  2 07:34:07 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Oct  2 07:34:07 np0005465988 ceph-mon[76355]: from='osd.2 [v2:192.168.122.102:6800/1863586281,v1:192.168.122.102:6801/1863586281]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Oct  2 07:34:07 np0005465988 ceph-mon[76355]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Oct  2 07:34:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e34 e34: 3 total, 2 up, 3 in
Oct  2 07:34:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]} v 0) v1
Oct  2 07:34:07 np0005465988 ceph-mon[76355]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/1863586281,v1:192.168.122.102:6801/1863586281]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Oct  2 07:34:07 np0005465988 funny_euler[79670]: {
Oct  2 07:34:07 np0005465988 funny_euler[79670]:    "4adc6a3a-57df-44c5-8148-0263723a70e6": {
Oct  2 07:34:07 np0005465988 funny_euler[79670]:        "ceph_fsid": "fd4c5763-22d1-50ea-ad0b-96a3dc3040b2",
Oct  2 07:34:07 np0005465988 funny_euler[79670]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Oct  2 07:34:07 np0005465988 funny_euler[79670]:        "osd_id": 2,
Oct  2 07:34:07 np0005465988 funny_euler[79670]:        "osd_uuid": "4adc6a3a-57df-44c5-8148-0263723a70e6",
Oct  2 07:34:07 np0005465988 funny_euler[79670]:        "type": "bluestore"
Oct  2 07:34:07 np0005465988 funny_euler[79670]:    }
Oct  2 07:34:07 np0005465988 funny_euler[79670]: }
Oct  2 07:34:07 np0005465988 systemd[1]: libpod-31047783c3768e7508a83429be102952ec2d8f9fb8a264a994278e06e6776482.scope: Deactivated successfully.
Oct  2 07:34:07 np0005465988 podman[79691]: 2025-10-02 11:34:07.770159176 +0000 UTC m=+0.023937492 container died 31047783c3768e7508a83429be102952ec2d8f9fb8a264a994278e06e6776482 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_euler, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct  2 07:34:07 np0005465988 systemd[1]: var-lib-containers-storage-overlay-eda0b6472d86d5c10c5ec0e247af65e0d7b189a5a574027af44a1b5ec95cdc67-merged.mount: Deactivated successfully.
Oct  2 07:34:07 np0005465988 podman[79691]: 2025-10-02 11:34:07.887130266 +0000 UTC m=+0.140908582 container remove 31047783c3768e7508a83429be102952ec2d8f9fb8a264a994278e06e6776482 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_euler, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Oct  2 07:34:07 np0005465988 systemd[1]: libpod-conmon-31047783c3768e7508a83429be102952ec2d8f9fb8a264a994278e06e6776482.scope: Deactivated successfully.
Oct  2 07:34:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e35 e35: 3 total, 2 up, 3 in
Oct  2 07:34:08 np0005465988 ceph-osd[79039]: osd.2 0 done with init, starting boot process
Oct  2 07:34:08 np0005465988 ceph-osd[79039]: osd.2 0 start_boot
Oct  2 07:34:08 np0005465988 ceph-osd[79039]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Oct  2 07:34:08 np0005465988 ceph-osd[79039]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Oct  2 07:34:08 np0005465988 ceph-osd[79039]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Oct  2 07:34:08 np0005465988 ceph-osd[79039]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Oct  2 07:34:08 np0005465988 ceph-osd[79039]: osd.2 0  bench count 12288000 bsize 4 KiB
Oct  2 07:34:08 np0005465988 ceph-mon[76355]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Oct  2 07:34:08 np0005465988 ceph-mon[76355]: from='osd.2 [v2:192.168.122.102:6800/1863586281,v1:192.168.122.102:6801/1863586281]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Oct  2 07:34:08 np0005465988 ceph-mon[76355]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Oct  2 07:34:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:09 np0005465988 podman[79928]: 2025-10-02 11:34:09.228514214 +0000 UTC m=+0.085578598 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 07:34:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:34:09 np0005465988 podman[79928]: 2025-10-02 11:34:09.57896833 +0000 UTC m=+0.436032714 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True)
Oct  2 07:34:10 np0005465988 ceph-mon[76355]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]': finished
Oct  2 07:34:10 np0005465988 ceph-mon[76355]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Oct  2 07:34:10 np0005465988 ceph-mon[76355]: Cluster is now healthy
Oct  2 07:34:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:11 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/220112243' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Oct  2 07:34:11 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/220112243' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Oct  2 07:34:11 np0005465988 podman[80289]: 2025-10-02 11:34:11.426555303 +0000 UTC m=+0.052444971 container create 621b90587021d8d46f57397f2f499089ee810c5ee598f72126d23ad1cd246917 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_colden, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct  2 07:34:11 np0005465988 systemd[1]: Started libpod-conmon-621b90587021d8d46f57397f2f499089ee810c5ee598f72126d23ad1cd246917.scope.
Oct  2 07:34:11 np0005465988 podman[80289]: 2025-10-02 11:34:11.403563876 +0000 UTC m=+0.029453524 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:34:11 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:34:11 np0005465988 podman[80289]: 2025-10-02 11:34:11.557063409 +0000 UTC m=+0.182953097 container init 621b90587021d8d46f57397f2f499089ee810c5ee598f72126d23ad1cd246917 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_colden, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Oct  2 07:34:11 np0005465988 podman[80289]: 2025-10-02 11:34:11.563898773 +0000 UTC m=+0.189788441 container start 621b90587021d8d46f57397f2f499089ee810c5ee598f72126d23ad1cd246917 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_colden, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:34:11 np0005465988 jovial_colden[80306]: 167 167
Oct  2 07:34:11 np0005465988 systemd[1]: libpod-621b90587021d8d46f57397f2f499089ee810c5ee598f72126d23ad1cd246917.scope: Deactivated successfully.
Oct  2 07:34:11 np0005465988 podman[80289]: 2025-10-02 11:34:11.575692255 +0000 UTC m=+0.201581913 container attach 621b90587021d8d46f57397f2f499089ee810c5ee598f72126d23ad1cd246917 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_colden, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 07:34:11 np0005465988 podman[80289]: 2025-10-02 11:34:11.576180217 +0000 UTC m=+0.202069875 container died 621b90587021d8d46f57397f2f499089ee810c5ee598f72126d23ad1cd246917 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_colden, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Oct  2 07:34:11 np0005465988 systemd[1]: var-lib-containers-storage-overlay-c0a459dcf4c494c8c0758dd490205eb9354907377ac16abfb22e9303cb68c583-merged.mount: Deactivated successfully.
Oct  2 07:34:11 np0005465988 podman[80289]: 2025-10-02 11:34:11.635814091 +0000 UTC m=+0.261703719 container remove 621b90587021d8d46f57397f2f499089ee810c5ee598f72126d23ad1cd246917 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_colden, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Oct  2 07:34:11 np0005465988 systemd[1]: libpod-conmon-621b90587021d8d46f57397f2f499089ee810c5ee598f72126d23ad1cd246917.scope: Deactivated successfully.
Oct  2 07:34:11 np0005465988 podman[80329]: 2025-10-02 11:34:11.801644259 +0000 UTC m=+0.047550676 container create e8fabf61f634d8eff3b4ce853d4aa030775428887835a3d14ebb32511c4d689c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_lichterman, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 07:34:11 np0005465988 systemd[1]: Started libpod-conmon-e8fabf61f634d8eff3b4ce853d4aa030775428887835a3d14ebb32511c4d689c.scope.
Oct  2 07:34:11 np0005465988 podman[80329]: 2025-10-02 11:34:11.774741031 +0000 UTC m=+0.020647478 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:34:11 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:34:11 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f5385516895916aca2b30f0dc9191f02ca2e5e9ce1a90ffb53394b6333e7531/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:11 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f5385516895916aca2b30f0dc9191f02ca2e5e9ce1a90ffb53394b6333e7531/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:11 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f5385516895916aca2b30f0dc9191f02ca2e5e9ce1a90ffb53394b6333e7531/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:11 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f5385516895916aca2b30f0dc9191f02ca2e5e9ce1a90ffb53394b6333e7531/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:11 np0005465988 podman[80329]: 2025-10-02 11:34:11.911119696 +0000 UTC m=+0.157026133 container init e8fabf61f634d8eff3b4ce853d4aa030775428887835a3d14ebb32511c4d689c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_lichterman, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct  2 07:34:11 np0005465988 podman[80329]: 2025-10-02 11:34:11.920519546 +0000 UTC m=+0.166425963 container start e8fabf61f634d8eff3b4ce853d4aa030775428887835a3d14ebb32511c4d689c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_lichterman, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct  2 07:34:11 np0005465988 podman[80329]: 2025-10-02 11:34:11.925555975 +0000 UTC m=+0.171462392 container attach e8fabf61f634d8eff3b4ce853d4aa030775428887835a3d14ebb32511c4d689c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_lichterman, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 23.551 iops: 6028.960 elapsed_sec: 0.498
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: log_channel(cluster) log [WRN] : OSD bench result of 6028.959553 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 0 waiting for initial osdmap
Oct  2 07:34:12 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2[79035]: 2025-10-02T11:34:12.087+0000 7f905b123640 -1 osd.2 0 waiting for initial osdmap
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 35 crush map has features 288514051259236352, adjusting msgr requires for clients
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 35 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 35 crush map has features 3314933000852226048, adjusting msgr requires for osds
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 35 check_osdmap_features require_osd_release unknown -> reef
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 35 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Oct  2 07:34:12 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-osd-2[79035]: 2025-10-02T11:34:12.125+0000 7f905674b640 -1 osd.2 35 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 35 set_numa_affinity not setting numa affinity
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 35 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Oct  2 07:34:12 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/2085867460' entity='client.admin' 
Oct  2 07:34:12 np0005465988 ceph-mon[76355]: OSD bench result of 6028.959553 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Oct  2 07:34:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e36 e36: 3 total, 3 up, 3 in
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 36 state: booting -> active
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[3.1b( empty local-lis/les=0/0 n=0 ec=21/17 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[5.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[3.8( empty local-lis/les=0/0 n=0 ec=21/17 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[3.0( empty local-lis/les=0/0 n=0 ec=17/17 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[2.a( empty local-lis/les=0/0 n=0 ec=21/15 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[2.d( empty local-lis/les=0/0 n=0 ec=21/15 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[2.10( empty local-lis/les=0/0 n=0 ec=21/15 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[2.13( empty local-lis/les=0/0 n=0 ec=21/15 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[2.15( empty local-lis/les=0/0 n=0 ec=21/15 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[2.c( empty local-lis/les=0/0 n=0 ec=21/15 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[2.1b( empty local-lis/les=0/0 n=0 ec=21/15 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]: [
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:    {
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:        "available": false,
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:        "ceph_device": false,
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:        "device_id": "QEMU_DVD-ROM_QM00001",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:        "lsm_data": {},
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:        "lvs": [],
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:        "path": "/dev/sr0",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:        "rejected_reasons": [
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "Insufficient space (<5GB)",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "Has a FileSystem"
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:        ],
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:        "sys_api": {
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "actuators": null,
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "device_nodes": "sr0",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "devname": "sr0",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "human_readable_size": "482.00 KB",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "id_bus": "ata",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "model": "QEMU DVD-ROM",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "nr_requests": "2",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "parent": "/dev/sr0",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "partitions": {},
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "path": "/dev/sr0",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "removable": "1",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "rev": "2.5+",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "ro": "0",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "rotational": "0",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "sas_address": "",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "sas_device_handle": "",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "scheduler_mode": "mq-deadline",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "sectors": 0,
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "sectorsize": "2048",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "size": 493568.0,
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "support_discard": "2048",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "type": "disk",
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:            "vendor": "QEMU"
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:        }
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]:    }
Oct  2 07:34:13 np0005465988 distracted_lichterman[80345]: ]
Oct  2 07:34:13 np0005465988 systemd[1]: libpod-e8fabf61f634d8eff3b4ce853d4aa030775428887835a3d14ebb32511c4d689c.scope: Deactivated successfully.
Oct  2 07:34:13 np0005465988 podman[80329]: 2025-10-02 11:34:13.129571413 +0000 UTC m=+1.375477900 container died e8fabf61f634d8eff3b4ce853d4aa030775428887835a3d14ebb32511c4d689c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_lichterman, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Oct  2 07:34:13 np0005465988 systemd[1]: libpod-e8fabf61f634d8eff3b4ce853d4aa030775428887835a3d14ebb32511c4d689c.scope: Consumed 1.203s CPU time.
Oct  2 07:34:13 np0005465988 systemd[1]: var-lib-containers-storage-overlay-6f5385516895916aca2b30f0dc9191f02ca2e5e9ce1a90ffb53394b6333e7531-merged.mount: Deactivated successfully.
Oct  2 07:34:13 np0005465988 podman[80329]: 2025-10-02 11:34:13.192504942 +0000 UTC m=+1.438411359 container remove e8fabf61f634d8eff3b4ce853d4aa030775428887835a3d14ebb32511c4d689c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_lichterman, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Oct  2 07:34:13 np0005465988 systemd[1]: libpod-conmon-e8fabf61f634d8eff3b4ce853d4aa030775428887835a3d14ebb32511c4d689c.scope: Deactivated successfully.
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[2.1c( empty local-lis/les=0/0 n=0 ec=21/15 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[3.1d( empty local-lis/les=0/0 n=0 ec=21/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[2.1d( empty local-lis/les=0/0 n=0 ec=21/15 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[2.5( empty local-lis/les=0/0 n=0 ec=21/15 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[2.b( empty local-lis/les=0/0 n=0 ec=21/15 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[3.9( empty local-lis/les=0/0 n=0 ec=21/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[2.f( empty local-lis/les=0/0 n=0 ec=21/15 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[3.11( empty local-lis/les=0/0 n=0 ec=21/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[3.15( empty local-lis/les=0/0 n=0 ec=21/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[2.12( empty local-lis/les=0/0 n=0 ec=21/15 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[3.e( empty local-lis/les=0/0 n=0 ec=21/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[2.18( empty local-lis/les=0/0 n=0 ec=21/15 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 36 pg[3.1a( empty local-lis/les=0/0 n=0 ec=21/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: osd.2 [v2:192.168.122.102:6800/1863586281,v1:192.168.122.102:6801/1863586281] boot
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: Saving service ingress.rgw.default spec with placement count:2
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: Adjusting osd_memory_target on compute-2 to 127.8M
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: Unable to set osd_memory_target on compute-2 to 134062899: error parsing value: Value '134062899' is below minimum 939524096
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: Updating compute-0:/etc/ceph/ceph.conf
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: Updating compute-1:/etc/ceph/ceph.conf
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: Updating compute-2:/etc/ceph/ceph.conf
Oct  2 07:34:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e37 e37: 3 total, 3 up, 3 in
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[2.12( empty local-lis/les=36/37 n=0 ec=21/15 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[3.15( empty local-lis/les=36/37 n=0 ec=21/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[2.f( empty local-lis/les=36/37 n=0 ec=21/15 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[3.9( empty local-lis/les=36/37 n=0 ec=21/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[2.b( empty local-lis/les=36/37 n=0 ec=21/15 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[3.e( empty local-lis/les=36/37 n=0 ec=21/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[2.5( empty local-lis/les=36/37 n=0 ec=21/15 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[2.1d( empty local-lis/les=36/37 n=0 ec=21/15 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[3.1b( empty local-lis/les=36/37 n=0 ec=21/17 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[3.11( empty local-lis/les=36/37 n=0 ec=21/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[2.10( empty local-lis/les=36/37 n=0 ec=21/15 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[3.0( empty local-lis/les=36/37 n=0 ec=17/17 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[2.a( empty local-lis/les=36/37 n=0 ec=21/15 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[5.0( empty local-lis/les=36/37 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[2.d( empty local-lis/les=36/37 n=0 ec=21/15 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[3.8( empty local-lis/les=36/37 n=0 ec=21/17 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[2.18( empty local-lis/les=36/37 n=0 ec=21/15 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[2.1c( empty local-lis/les=36/37 n=0 ec=21/15 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[3.1d( empty local-lis/les=36/37 n=0 ec=21/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[2.13( empty local-lis/les=36/37 n=0 ec=21/15 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[2.15( empty local-lis/les=36/37 n=0 ec=21/15 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[2.1b( empty local-lis/les=36/37 n=0 ec=21/15 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[2.c( empty local-lis/les=36/37 n=0 ec=21/15 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 37 pg[3.1a( empty local-lis/les=36/37 n=0 ec=21/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:34:14 np0005465988 ceph-mon[76355]: Updating compute-1:/var/lib/ceph/fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/config/ceph.conf
Oct  2 07:34:14 np0005465988 ceph-mon[76355]: Updating compute-0:/var/lib/ceph/fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/config/ceph.conf
Oct  2 07:34:15 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Oct  2 07:34:15 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Oct  2 07:34:15 np0005465988 ceph-mon[76355]: Updating compute-2:/var/lib/ceph/fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/config/ceph.conf
Oct  2 07:34:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e2 new map
Oct  2 07:34:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e2 print_map#012e2#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-02T11:34:16.058920+0000#012modified#0112025-10-02T11:34:16.058958+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 
Oct  2 07:34:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e38 e38: 3 total, 3 up, 3 in
Oct  2 07:34:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:34:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Oct  2 07:34:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Oct  2 07:34:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Oct  2 07:34:17 np0005465988 ceph-mon[76355]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Oct  2 07:34:17 np0005465988 ceph-mon[76355]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Oct  2 07:34:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Oct  2 07:34:17 np0005465988 ceph-mon[76355]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Oct  2 07:34:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:18 np0005465988 ceph-mon[76355]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Oct  2 07:34:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:18 np0005465988 systemd[71899]: Starting Mark boot as successful...
Oct  2 07:34:18 np0005465988 systemd[71899]: Finished Mark boot as successful.
Oct  2 07:34:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:34:20 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/3861259132' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Oct  2 07:34:20 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/3861259132' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Oct  2 07:34:23 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:23 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:23 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.mwuxwy", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct  2 07:34:23 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.mwuxwy", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct  2 07:34:23 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:23 np0005465988 podman[82388]: 2025-10-02 11:34:23.7428743 +0000 UTC m=+0.020332321 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:34:23 np0005465988 podman[82388]: 2025-10-02 11:34:23.876254998 +0000 UTC m=+0.153712999 container create 2def6221759391db70dab1f4232eaf1907bae45c9c7a3da60535d6011c00364c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_curie, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Oct  2 07:34:23 np0005465988 systemd[1]: Started libpod-conmon-2def6221759391db70dab1f4232eaf1907bae45c9c7a3da60535d6011c00364c.scope.
Oct  2 07:34:23 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:34:24 np0005465988 podman[82388]: 2025-10-02 11:34:24.018820711 +0000 UTC m=+0.296278802 container init 2def6221759391db70dab1f4232eaf1907bae45c9c7a3da60535d6011c00364c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_curie, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 07:34:24 np0005465988 podman[82388]: 2025-10-02 11:34:24.030254004 +0000 UTC m=+0.307712045 container start 2def6221759391db70dab1f4232eaf1907bae45c9c7a3da60535d6011c00364c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_curie, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Oct  2 07:34:24 np0005465988 systemd[1]: libpod-2def6221759391db70dab1f4232eaf1907bae45c9c7a3da60535d6011c00364c.scope: Deactivated successfully.
Oct  2 07:34:24 np0005465988 infallible_curie[82405]: 167 167
Oct  2 07:34:24 np0005465988 conmon[82405]: conmon 2def6221759391db70da <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2def6221759391db70dab1f4232eaf1907bae45c9c7a3da60535d6011c00364c.scope/container/memory.events
Oct  2 07:34:24 np0005465988 podman[82388]: 2025-10-02 11:34:24.08567925 +0000 UTC m=+0.363137301 container attach 2def6221759391db70dab1f4232eaf1907bae45c9c7a3da60535d6011c00364c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_curie, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 07:34:24 np0005465988 podman[82388]: 2025-10-02 11:34:24.086239304 +0000 UTC m=+0.363697345 container died 2def6221759391db70dab1f4232eaf1907bae45c9c7a3da60535d6011c00364c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_curie, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Oct  2 07:34:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:34:24 np0005465988 systemd[1]: var-lib-containers-storage-overlay-4d93c41c9b3378d5858c5e82e6b358b8e1bff7da51c92ff1e3c1946053035512-merged.mount: Deactivated successfully.
Oct  2 07:34:24 np0005465988 podman[82388]: 2025-10-02 11:34:24.611930388 +0000 UTC m=+0.889388469 container remove 2def6221759391db70dab1f4232eaf1907bae45c9c7a3da60535d6011c00364c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_curie, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Oct  2 07:34:24 np0005465988 systemd[1]: libpod-conmon-2def6221759391db70dab1f4232eaf1907bae45c9c7a3da60535d6011c00364c.scope: Deactivated successfully.
Oct  2 07:34:24 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Oct  2 07:34:24 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Oct  2 07:34:24 np0005465988 ceph-mon[76355]: Deploying daemon rgw.rgw.compute-2.mwuxwy on compute-2
Oct  2 07:34:25 np0005465988 systemd[1]: Reloading.
Oct  2 07:34:25 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:34:25 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:34:25 np0005465988 systemd[1]: Reloading.
Oct  2 07:34:25 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.a scrub starts
Oct  2 07:34:25 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.a scrub ok
Oct  2 07:34:25 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:34:25 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:34:26 np0005465988 systemd[1]: Starting Ceph rgw.rgw.compute-2.mwuxwy for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2...
Oct  2 07:34:26 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/3798139046' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Oct  2 07:34:26 np0005465988 podman[82552]: 2025-10-02 11:34:26.320292623 +0000 UTC m=+0.062501048 container create 49fe6b14c478e82172701f5be666ddd7705253abdb6dcd5fb900783366871901 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-rgw-rgw-compute-2-mwuxwy, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Oct  2 07:34:26 np0005465988 podman[82552]: 2025-10-02 11:34:26.278256309 +0000 UTC m=+0.020464784 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:34:26 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a833e7cbbb284a04f357477293cff09537fe93226cc328e49a3920f56305d03/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:26 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a833e7cbbb284a04f357477293cff09537fe93226cc328e49a3920f56305d03/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:26 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a833e7cbbb284a04f357477293cff09537fe93226cc328e49a3920f56305d03/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:26 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a833e7cbbb284a04f357477293cff09537fe93226cc328e49a3920f56305d03/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.mwuxwy supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:26 np0005465988 podman[82552]: 2025-10-02 11:34:26.458552286 +0000 UTC m=+0.200760711 container init 49fe6b14c478e82172701f5be666ddd7705253abdb6dcd5fb900783366871901 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-rgw-rgw-compute-2-mwuxwy, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct  2 07:34:26 np0005465988 podman[82552]: 2025-10-02 11:34:26.470253925 +0000 UTC m=+0.212462350 container start 49fe6b14c478e82172701f5be666ddd7705253abdb6dcd5fb900783366871901 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-rgw-rgw-compute-2-mwuxwy, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Oct  2 07:34:26 np0005465988 bash[82552]: 49fe6b14c478e82172701f5be666ddd7705253abdb6dcd5fb900783366871901
Oct  2 07:34:26 np0005465988 systemd[1]: Started Ceph rgw.rgw.compute-2.mwuxwy for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2.
Oct  2 07:34:26 np0005465988 radosgw[82571]: deferred set uid:gid to 167:167 (ceph:ceph)
Oct  2 07:34:26 np0005465988 radosgw[82571]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process radosgw, pid 2
Oct  2 07:34:26 np0005465988 radosgw[82571]: framework: beast
Oct  2 07:34:26 np0005465988 radosgw[82571]: framework conf key: endpoint, val: 192.168.122.102:8082
Oct  2 07:34:26 np0005465988 radosgw[82571]: init_numa not setting numa affinity
Oct  2 07:34:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.tijdss", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct  2 07:34:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.tijdss", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct  2 07:34:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e39 e39: 3 total, 3 up, 3 in
Oct  2 07:34:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0) v1
Oct  2 07:34:27 np0005465988 ceph-mon[76355]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3022269828' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Oct  2 07:34:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e40 e40: 3 total, 3 up, 3 in
Oct  2 07:34:28 np0005465988 ceph-mon[76355]: Deploying daemon rgw.rgw.compute-1.tijdss on compute-1
Oct  2 07:34:28 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.102:0/3022269828' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Oct  2 07:34:28 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Oct  2 07:34:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:34:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e41 e41: 3 total, 3 up, 3 in
Oct  2 07:34:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0) v1
Oct  2 07:34:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3022269828' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct  2 07:34:29 np0005465988 ceph-mon[76355]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct  2 07:34:29 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-2.mwuxwy' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Oct  2 07:34:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.mcnfdf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct  2 07:34:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.mcnfdf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct  2 07:34:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e42 e42: 3 total, 3 up, 3 in
Oct  2 07:34:30 np0005465988 ceph-mon[76355]: Deploying daemon rgw.rgw.compute-0.mcnfdf on compute-0
Oct  2 07:34:30 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.101:0/1957857602' entity='client.rgw.rgw.compute-1.tijdss' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct  2 07:34:30 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.102:0/3022269828' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct  2 07:34:30 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct  2 07:34:30 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-1.tijdss' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct  2 07:34:31 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.d deep-scrub starts
Oct  2 07:34:31 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.d deep-scrub ok
Oct  2 07:34:31 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-2.mwuxwy' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Oct  2 07:34:31 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-1.tijdss' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Oct  2 07:34:31 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:31 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:31 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:31 np0005465988 ceph-mon[76355]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Oct  2 07:34:31 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:31 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:31 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:31 np0005465988 ceph-mon[76355]: Deploying daemon haproxy.rgw.default.compute-0.qdmsoe on compute-0
Oct  2 07:34:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e43 e43: 3 total, 3 up, 3 in
Oct  2 07:34:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0) v1
Oct  2 07:34:31 np0005465988 ceph-mon[76355]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3022269828' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct  2 07:34:33 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/308578156' entity='client.rgw.rgw.compute-0.mcnfdf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct  2 07:34:33 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.101:0/1957857602' entity='client.rgw.rgw.compute-1.tijdss' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct  2 07:34:33 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-1.tijdss' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct  2 07:34:33 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.102:0/3022269828' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct  2 07:34:33 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct  2 07:34:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e44 e44: 3 total, 3 up, 3 in
Oct  2 07:34:33 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Oct  2 07:34:33 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Oct  2 07:34:34 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/308578156' entity='client.rgw.rgw.compute-0.mcnfdf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Oct  2 07:34:34 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-1.tijdss' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Oct  2 07:34:34 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-2.mwuxwy' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Oct  2 07:34:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Oct  2 07:34:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0) v1
Oct  2 07:34:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2401809657' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct  2 07:34:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:34:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Oct  2 07:34:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0) v1
Oct  2 07:34:35 np0005465988 ceph-mon[76355]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2401809657' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct  2 07:34:35 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/4060792636' entity='client.rgw.rgw.compute-0.mcnfdf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct  2 07:34:35 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.101:0/1317742668' entity='client.rgw.rgw.compute-1.tijdss' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct  2 07:34:35 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-1.tijdss' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct  2 07:34:35 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.102:0/2401809657' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct  2 07:34:35 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct  2 07:34:35 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Oct  2 07:34:35 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Oct  2 07:34:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Oct  2 07:34:36 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/4060792636' entity='client.rgw.rgw.compute-0.mcnfdf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Oct  2 07:34:36 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-1.tijdss' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Oct  2 07:34:36 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-2.mwuxwy' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Oct  2 07:34:36 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.102:0/2401809657' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct  2 07:34:36 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.101:0/1317742668' entity='client.rgw.rgw.compute-1.tijdss' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct  2 07:34:36 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-2.mwuxwy' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct  2 07:34:36 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-1.tijdss' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct  2 07:34:36 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/4060792636' entity='client.rgw.rgw.compute-0.mcnfdf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct  2 07:34:36 np0005465988 radosgw[82571]: LDAP not started since no server URIs were provided in the configuration.
Oct  2 07:34:36 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-rgw-rgw-compute-2-mwuxwy[82567]: 2025-10-02T11:34:36.508+0000 7fcdd31cf940 -1 LDAP not started since no server URIs were provided in the configuration.
Oct  2 07:34:36 np0005465988 radosgw[82571]: framework: beast
Oct  2 07:34:36 np0005465988 radosgw[82571]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Oct  2 07:34:36 np0005465988 radosgw[82571]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Oct  2 07:34:36 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Oct  2 07:34:36 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Oct  2 07:34:36 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Oct  2 07:34:36 np0005465988 radosgw[82571]: starting handler: beast
Oct  2 07:34:36 np0005465988 radosgw[82571]: set uid:gid to 167:167 (ceph:ceph)
Oct  2 07:34:36 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Oct  2 07:34:36 np0005465988 radosgw[82571]: mgrc service_daemon_register rgw.24139 metadata {arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.mwuxwy,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Sep 26 01:13:23 UTC 2025,kernel_version=5.14.0-620.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864104,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=d51d2c37-0c52-473d-a68f-705b7a7c6947,zone_name=default,zonegroup_id=46b6f139-f98e-4349-978c-7decf8294e94,zonegroup_name=default}
Oct  2 07:34:36 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.8 deep-scrub starts
Oct  2 07:34:36 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Oct  2 07:34:36 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Oct  2 07:34:36 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.8 deep-scrub ok
Oct  2 07:34:36 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Oct  2 07:34:36 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Oct  2 07:34:36 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Oct  2 07:34:37 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-2.mwuxwy' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Oct  2 07:34:37 np0005465988 ceph-mon[76355]: from='client.? ' entity='client.rgw.rgw.compute-1.tijdss' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Oct  2 07:34:37 np0005465988 ceph-mon[76355]: from='client.? 192.168.122.100:0/4060792636' entity='client.rgw.rgw.compute-0.mcnfdf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Oct  2 07:34:37 np0005465988 ceph-mon[76355]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Oct  2 07:34:37 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Oct  2 07:34:37 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Oct  2 07:34:38 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.c scrub starts
Oct  2 07:34:38 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.c scrub ok
Oct  2 07:34:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.005000144s ======
Oct  2 07:34:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:34:38.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.005000144s
Oct  2 07:34:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:38 np0005465988 ceph-mon[76355]: Deploying daemon haproxy.rgw.default.compute-2.jycvzz on compute-2
Oct  2 07:34:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:34:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:34:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:34:40.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:34:40 np0005465988 podman[83327]: 2025-10-02 11:34:40.998346036 +0000 UTC m=+2.627426846 container create b439b86425b941d62a3e0643f10493dbb12befc802d5ab5d6c4895ee23060804 (image=quay.io/ceph/haproxy:2.3, name=reverent_sutherland)
Oct  2 07:34:41 np0005465988 podman[83327]: 2025-10-02 11:34:40.916175266 +0000 UTC m=+2.545256116 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Oct  2 07:34:41 np0005465988 systemd[1]: Started libpod-conmon-b439b86425b941d62a3e0643f10493dbb12befc802d5ab5d6c4895ee23060804.scope.
Oct  2 07:34:41 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:34:41 np0005465988 podman[83327]: 2025-10-02 11:34:41.205717332 +0000 UTC m=+2.834798132 container init b439b86425b941d62a3e0643f10493dbb12befc802d5ab5d6c4895ee23060804 (image=quay.io/ceph/haproxy:2.3, name=reverent_sutherland)
Oct  2 07:34:41 np0005465988 podman[83327]: 2025-10-02 11:34:41.21289128 +0000 UTC m=+2.841972070 container start b439b86425b941d62a3e0643f10493dbb12befc802d5ab5d6c4895ee23060804 (image=quay.io/ceph/haproxy:2.3, name=reverent_sutherland)
Oct  2 07:34:41 np0005465988 reverent_sutherland[83440]: 0 0
Oct  2 07:34:41 np0005465988 systemd[1]: libpod-b439b86425b941d62a3e0643f10493dbb12befc802d5ab5d6c4895ee23060804.scope: Deactivated successfully.
Oct  2 07:34:41 np0005465988 conmon[83440]: conmon b439b86425b941d62a3e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b439b86425b941d62a3e0643f10493dbb12befc802d5ab5d6c4895ee23060804.scope/container/memory.events
Oct  2 07:34:41 np0005465988 podman[83327]: 2025-10-02 11:34:41.24431148 +0000 UTC m=+2.873392290 container attach b439b86425b941d62a3e0643f10493dbb12befc802d5ab5d6c4895ee23060804 (image=quay.io/ceph/haproxy:2.3, name=reverent_sutherland)
Oct  2 07:34:41 np0005465988 podman[83327]: 2025-10-02 11:34:41.245238737 +0000 UTC m=+2.874319517 container died b439b86425b941d62a3e0643f10493dbb12befc802d5ab5d6c4895ee23060804 (image=quay.io/ceph/haproxy:2.3, name=reverent_sutherland)
Oct  2 07:34:41 np0005465988 systemd[1]: var-lib-containers-storage-overlay-a7d5aa8333a852040d517ab13b2e8e359fef86d49043e6730361972b5c8897b8-merged.mount: Deactivated successfully.
Oct  2 07:34:41 np0005465988 podman[83327]: 2025-10-02 11:34:41.289433037 +0000 UTC m=+2.918513827 container remove b439b86425b941d62a3e0643f10493dbb12befc802d5ab5d6c4895ee23060804 (image=quay.io/ceph/haproxy:2.3, name=reverent_sutherland)
Oct  2 07:34:41 np0005465988 systemd[1]: libpod-conmon-b439b86425b941d62a3e0643f10493dbb12befc802d5ab5d6c4895ee23060804.scope: Deactivated successfully.
Oct  2 07:34:41 np0005465988 systemd[1]: Reloading.
Oct  2 07:34:41 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:34:41 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:34:41 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Oct  2 07:34:41 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Oct  2 07:34:41 np0005465988 systemd[1]: Reloading.
Oct  2 07:34:41 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:34:41 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:34:41 np0005465988 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.jycvzz for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2...
Oct  2 07:34:42 np0005465988 podman[83586]: 2025-10-02 11:34:42.154783549 +0000 UTC m=+0.030495654 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Oct  2 07:34:42 np0005465988 podman[83586]: 2025-10-02 11:34:42.425083537 +0000 UTC m=+0.300795652 container create 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:34:42 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ed3733414c23436bb41b3898d3c5478015f0a2b9454c8d2b962665667c00956/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:42 np0005465988 podman[83586]: 2025-10-02 11:34:42.63231366 +0000 UTC m=+0.508025835 container init 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:34:42 np0005465988 podman[83586]: 2025-10-02 11:34:42.641590638 +0000 UTC m=+0.517302703 container start 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:34:42 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz[83601]: [NOTICE] 274/113442 (2) : New worker #1 (4) forked
Oct  2 07:34:42 np0005465988 bash[83586]: 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4
Oct  2 07:34:42 np0005465988 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.jycvzz for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2.
Oct  2 07:34:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:34:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:34:42.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:34:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:34:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:34:43.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:34:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:44 np0005465988 ceph-mon[76355]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Oct  2 07:34:44 np0005465988 ceph-mon[76355]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Oct  2 07:34:44 np0005465988 ceph-mon[76355]: Deploying daemon keepalived.rgw.default.compute-2.ahfyxt on compute-2
Oct  2 07:34:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:34:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:34:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:34:44.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:34:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Oct  2 07:34:45 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Oct  2 07:34:45 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Oct  2 07:34:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Oct  2 07:34:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:34:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:34:45.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:34:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:34:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:34:46.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:34:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Oct  2 07:34:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Oct  2 07:34:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Oct  2 07:34:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct  2 07:34:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:34:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:34:47.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:34:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Oct  2 07:34:48 np0005465988 systemd[1]: session-20.scope: Deactivated successfully.
Oct  2 07:34:48 np0005465988 systemd[1]: session-20.scope: Consumed 8.491s CPU time.
Oct  2 07:34:48 np0005465988 systemd-logind[827]: Session 20 logged out. Waiting for processes to exit.
Oct  2 07:34:48 np0005465988 systemd-logind[827]: Removed session 20.
Oct  2 07:34:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:34:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:34:48.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:34:49 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Oct  2 07:34:49 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Oct  2 07:34:49 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]: dispatch
Oct  2 07:34:49 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Oct  2 07:34:49 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Oct  2 07:34:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:34:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Oct  2 07:34:49 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 51 pg[5.0( empty local-lis/les=36/37 n=0 ec=21/21 lis/c=36/36 les/c/f=37/37/0 sis=51 pruub=12.525800705s) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active pruub 55.294979095s@ mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [2], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:34:49 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 51 pg[5.0( empty local-lis/les=36/37 n=0 ec=21/21 lis/c=36/36 les/c/f=37/37/0 sis=51 pruub=12.525800705s) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown pruub 55.294979095s@ mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:49 np0005465988 podman[83757]: 2025-10-02 11:34:49.388572612 +0000 UTC m=+5.628455101 container create 85795b8e57b882058fd232a6967b6ad6d629a7731219717497944ef4ee36703c (image=quay.io/ceph/keepalived:2.2.4, name=stoic_villani, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, build-date=2023-02-22T09:23:20, vcs-type=git, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, description=keepalived for Ceph, name=keepalived)
Oct  2 07:34:49 np0005465988 podman[83757]: 2025-10-02 11:34:49.364871326 +0000 UTC m=+5.604753855 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Oct  2 07:34:49 np0005465988 systemd[1]: Started libpod-conmon-85795b8e57b882058fd232a6967b6ad6d629a7731219717497944ef4ee36703c.scope.
Oct  2 07:34:49 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:34:49 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Oct  2 07:34:49 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct  2 07:34:49 np0005465988 podman[83757]: 2025-10-02 11:34:49.491832233 +0000 UTC m=+5.731714732 container init 85795b8e57b882058fd232a6967b6ad6d629a7731219717497944ef4ee36703c (image=quay.io/ceph/keepalived:2.2.4, name=stoic_villani, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, description=keepalived for Ceph, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, release=1793, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, version=2.2.4, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 07:34:49 np0005465988 podman[83757]: 2025-10-02 11:34:49.502738729 +0000 UTC m=+5.742621198 container start 85795b8e57b882058fd232a6967b6ad6d629a7731219717497944ef4ee36703c (image=quay.io/ceph/keepalived:2.2.4, name=stoic_villani, architecture=x86_64, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1793, distribution-scope=public, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.buildah.version=1.28.2, version=2.2.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph)
Oct  2 07:34:49 np0005465988 podman[83757]: 2025-10-02 11:34:49.506421395 +0000 UTC m=+5.746303864 container attach 85795b8e57b882058fd232a6967b6ad6d629a7731219717497944ef4ee36703c (image=quay.io/ceph/keepalived:2.2.4, name=stoic_villani, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public)
Oct  2 07:34:49 np0005465988 stoic_villani[83855]: 0 0
Oct  2 07:34:49 np0005465988 systemd[1]: libpod-85795b8e57b882058fd232a6967b6ad6d629a7731219717497944ef4ee36703c.scope: Deactivated successfully.
Oct  2 07:34:49 np0005465988 podman[83757]: 2025-10-02 11:34:49.512235924 +0000 UTC m=+5.752118413 container died 85795b8e57b882058fd232a6967b6ad6d629a7731219717497944ef4ee36703c (image=quay.io/ceph/keepalived:2.2.4, name=stoic_villani, com.redhat.component=keepalived-container, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, vcs-type=git)
Oct  2 07:34:49 np0005465988 systemd[1]: var-lib-containers-storage-overlay-bb5bf0155b8eb5b550546d9eab94863d8f5dae0a938f1924a759a101817f8d23-merged.mount: Deactivated successfully.
Oct  2 07:34:49 np0005465988 podman[83757]: 2025-10-02 11:34:49.563672064 +0000 UTC m=+5.803554563 container remove 85795b8e57b882058fd232a6967b6ad6d629a7731219717497944ef4ee36703c (image=quay.io/ceph/keepalived:2.2.4, name=stoic_villani, io.buildah.version=1.28.2, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, name=keepalived, vcs-type=git, com.redhat.component=keepalived-container, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, description=keepalived for Ceph)
Oct  2 07:34:49 np0005465988 systemd[1]: libpod-conmon-85795b8e57b882058fd232a6967b6ad6d629a7731219717497944ef4ee36703c.scope: Deactivated successfully.
Oct  2 07:34:49 np0005465988 systemd[1]: Reloading.
Oct  2 07:34:49 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:34:49 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:34:49 np0005465988 systemd[1]: Reloading.
Oct  2 07:34:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:34:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:34:49.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:34:50 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:34:50 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:34:50 np0005465988 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.ahfyxt for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2...
Oct  2 07:34:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.18( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.1e( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.19( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.10( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.11( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.9( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.7( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.16( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.1c( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.12( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.b( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.1b( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.4( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.a( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.14( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.6( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.d( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.17( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.e( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.1d( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.1a( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.2( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.c( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.f( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.13( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.8( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.1f( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.15( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.1( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.3( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.5( empty local-lis/les=36/37 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.18( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.10( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.19( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.7( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.16( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.1e( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.11( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.1c( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.12( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.b( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.1b( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.9( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.4( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.a( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.14( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.0( empty local-lis/les=51/52 n=0 ec=21/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.6( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.17( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.1d( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.e( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.d( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.2( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.c( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.f( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.8( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.1f( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.1( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.1a( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.15( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.13( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.5( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 52 pg[5.3( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:34:50 np0005465988 podman[83998]: 2025-10-02 11:34:50.504338786 +0000 UTC m=+0.052265504 container create 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, version=2.2.4, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, architecture=x86_64)
Oct  2 07:34:50 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Oct  2 07:34:50 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Oct  2 07:34:50 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Oct  2 07:34:50 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Oct  2 07:34:50 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct  2 07:34:50 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Oct  2 07:34:50 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Oct  2 07:34:50 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Oct  2 07:34:50 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Oct  2 07:34:50 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82d0d38b26afda3c3e17001f57879979bdcfbb35e2b10488ca3af5a37586204b/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 07:34:50 np0005465988 podman[83998]: 2025-10-02 11:34:50.566068014 +0000 UTC m=+0.113994752 container init 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, com.redhat.component=keepalived-container, version=2.2.4, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, vcs-type=git, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.buildah.version=1.28.2, name=keepalived)
Oct  2 07:34:50 np0005465988 podman[83998]: 2025-10-02 11:34:50.573502199 +0000 UTC m=+0.121428897 container start 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, com.redhat.component=keepalived-container, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, architecture=x86_64, distribution-scope=public, name=keepalived, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 07:34:50 np0005465988 bash[83998]: 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4
Oct  2 07:34:50 np0005465988 podman[83998]: 2025-10-02 11:34:50.484274025 +0000 UTC m=+0.032200733 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Oct  2 07:34:50 np0005465988 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.ahfyxt for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2.
Oct  2 07:34:50 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt[84013]: Thu Oct  2 11:34:50 2025: Starting Keepalived v2.2.4 (08/21,2021)
Oct  2 07:34:50 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt[84013]: Thu Oct  2 11:34:50 2025: Running on Linux 5.14.0-620.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Sep 26 01:13:23 UTC 2025 (built for Linux 5.14.0)
Oct  2 07:34:50 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt[84013]: Thu Oct  2 11:34:50 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Oct  2 07:34:50 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt[84013]: Thu Oct  2 11:34:50 2025: Configuration file /etc/keepalived/keepalived.conf
Oct  2 07:34:50 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt[84013]: Thu Oct  2 11:34:50 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Oct  2 07:34:50 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt[84013]: Thu Oct  2 11:34:50 2025: Starting VRRP child process, pid=4
Oct  2 07:34:50 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt[84013]: Thu Oct  2 11:34:50 2025: Startup complete
Oct  2 07:34:50 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt[84013]: Thu Oct  2 11:34:50 2025: (VI_0) Entering BACKUP STATE (init)
Oct  2 07:34:50 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt[84013]: Thu Oct  2 11:34:50 2025: VRRP_Script(check_backend) succeeded
Oct  2 07:34:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:34:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:34:50.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:34:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Oct  2 07:34:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:34:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:34:51.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:34:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Oct  2 07:34:52 np0005465988 ceph-mon[76355]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Oct  2 07:34:52 np0005465988 ceph-mon[76355]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Oct  2 07:34:52 np0005465988 ceph-mon[76355]: Deploying daemon keepalived.rgw.default.compute-0.dcvgot on compute-0
Oct  2 07:34:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Oct  2 07:34:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Oct  2 07:34:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct  2 07:34:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct  2 07:34:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Oct  2 07:34:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Oct  2 07:34:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Oct  2 07:34:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Oct  2 07:34:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:34:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:34:52.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:34:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Oct  2 07:34:53 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Oct  2 07:34:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:34:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:34:53.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:34:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e55 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:34:54 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt[84013]: Thu Oct  2 11:34:54 2025: (VI_0) Entering MASTER STATE
Oct  2 07:34:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:34:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:34:54.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:34:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct  2 07:34:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct  2 07:34:55 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.b scrub starts
Oct  2 07:34:55 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.b scrub ok
Oct  2 07:34:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Oct  2 07:34:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:34:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:34:56.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:34:56 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Oct  2 07:34:56 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Oct  2 07:34:56 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.9 deep-scrub starts
Oct  2 07:34:56 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.9 deep-scrub ok
Oct  2 07:34:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Oct  2 07:34:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:34:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:34:56.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:34:57 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Oct  2 07:34:57 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Oct  2 07:34:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:34:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:34:58.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:34:58 np0005465988 podman[84297]: 2025-10-02 11:34:58.441697881 +0000 UTC m=+0.081665544 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Oct  2 07:34:58 np0005465988 podman[84297]: 2025-10-02 11:34:58.567543875 +0000 UTC m=+0.207511488 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Oct  2 07:34:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:34:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:34:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:34:58.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:34:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:34:59 np0005465988 podman[84429]: 2025-10-02 11:34:59.261991929 +0000 UTC m=+0.073440058 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:34:59 np0005465988 podman[84429]: 2025-10-02 11:34:59.310889022 +0000 UTC m=+0.122337141 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:34:59 np0005465988 podman[84493]: 2025-10-02 11:34:59.583968491 +0000 UTC m=+0.073030776 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, version=2.2.4, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, name=keepalived, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 07:34:59 np0005465988 podman[84493]: 2025-10-02 11:34:59.606803644 +0000 UTC m=+0.095865929 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, com.redhat.component=keepalived-container, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., description=keepalived for Ceph, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, release=1793, distribution-scope=public, io.openshift.expose-services=)
Oct  2 07:34:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:34:59 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt[84013]: Thu Oct  2 11:34:59 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Oct  2 07:34:59 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt[84013]: Thu Oct  2 11:34:59 2025: (VI_0) Entering BACKUP STATE
Oct  2 07:35:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:00.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.f scrub starts
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.f scrub ok
Oct  2 07:35:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:35:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:35:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct  2 07:35:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct  2 07:35:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct  2 07:35:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]: dispatch
Oct  2 07:35:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct  2 07:35:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Oct  2 07:35:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct  2 07:35:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct  2 07:35:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:00.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.18( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.520588875s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.839759827s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.1e( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.520601273s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.839805603s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.19( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.520564079s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.839782715s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.1e( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.520525932s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.839805603s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.18( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.520467758s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.839759827s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.16( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.520573616s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.839881897s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.19( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.520449638s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.839782715s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.16( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.520491600s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.839881897s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.11( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.524094582s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.843673706s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.11( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.524061203s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.843673706s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.9( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.524166107s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.843849182s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.9( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.524134636s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.843849182s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.7( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.520112038s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.839897156s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.1c( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523900032s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.843772888s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.7( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.520022392s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.839897156s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.1c( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523823738s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.843772888s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.1b( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523722649s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.843765259s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.1b( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523675919s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.843765259s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.14( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523824692s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.844093323s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.14( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523787498s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.844093323s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.6( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523773193s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.844123840s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.a( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523690224s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.844100952s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.6( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523587227s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.844123840s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.10( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.519268990s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.839782715s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.17( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523386002s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.844139099s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.1d( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523530960s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.844322205s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.17( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523349762s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.844139099s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.10( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.519008636s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.839782715s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.a( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523264885s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.844100952s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.1d( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523474693s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.844322205s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.2( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523208618s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.844329834s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.f( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523170471s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.844360352s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.2( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523150444s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.844329834s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.f( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.523142815s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.844360352s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.c( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.522952080s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.844352722s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.1f( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.522840500s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.844375610s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.c( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.522850990s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.844352722s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.1f( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.522813797s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.844375610s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.3( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.522980690s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.844581604s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.3( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.522933960s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.844581604s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.5( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.522682190s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.844482422s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.5( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.522642136s) [0] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.844482422s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.1( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.522542000s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.844383240s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.15( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.522592545s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active pruub 67.844474792s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.15( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.522562027s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.844474792s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[5.1( empty local-lis/les=51/52 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=58 pruub=13.522488594s) [1] r=-1 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.844383240s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[11.13( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[4.1c( empty local-lis/les=0/0 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.11( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[4.1d( empty local-lis/les=0/0 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.16( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.1f( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[4.14( empty local-lis/les=0/0 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.2( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.6( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[4.9( empty local-lis/les=0/0 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.5( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[11.17( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[11.19( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[11.16( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[4.19( empty local-lis/les=0/0 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.3( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.15( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[4.2( empty local-lis/les=0/0 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[4.3( empty local-lis/les=0/0 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.f( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.9( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[6.1( empty local-lis/les=0/0 n=0 ec=51/23 lis/c=51/51 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[4.6( empty local-lis/les=0/0 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.a( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[11.e( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[4.1( empty local-lis/les=0/0 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.d( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[11.a( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.c( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.b( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[11.3( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[4.8( empty local-lis/les=0/0 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[4.15( empty local-lis/les=0/0 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[8.1c( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[4.1f( empty local-lis/les=0/0 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[11.8( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[7.1d( empty local-lis/les=0/0 n=0 ec=52/25 lis/c=52/52 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[52,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[10.10( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[10.11( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[7.16( empty local-lis/les=0/0 n=0 ec=52/25 lis/c=52/52 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[52,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[7.a( empty local-lis/les=0/0 n=0 ec=52/25 lis/c=52/52 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[52,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[10.1( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[10.3( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[7.5( empty local-lis/les=0/0 n=0 ec=52/25 lis/c=52/52 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[52,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[10.4( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[7.14( empty local-lis/les=0/0 n=0 ec=52/25 lis/c=52/52 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[52,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[10.f( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[7.11( empty local-lis/les=0/0 n=0 ec=52/25 lis/c=52/52 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[52,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[10.1e( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[7.1f( empty local-lis/les=0/0 n=0 ec=52/25 lis/c=52/52 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[52,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:00 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 58 pg[10.12( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:01 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Oct  2 07:35:01 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Oct  2 07:35:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Oct  2 07:35:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Oct  2 07:35:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Oct  2 07:35:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Oct  2 07:35:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Oct  2 07:35:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Oct  2 07:35:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Oct  2 07:35:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Oct  2 07:35:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:02.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.c( v 40'6 (0'0,40'6] local-lis/les=58/59 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[7.1d( empty local-lis/les=58/59 n=0 ec=52/25 lis/c=52/52 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[52,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[4.2( empty local-lis/les=58/59 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[7.11( empty local-lis/les=58/59 n=0 ec=52/25 lis/c=52/52 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[52,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[4.14( empty local-lis/les=58/59 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[7.a( empty local-lis/les=58/59 n=0 ec=52/25 lis/c=52/52 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[52,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[10.10( v 44'48 (0'0,44'48] local-lis/les=58/59 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.5( v 40'6 (0'0,40'6] local-lis/les=58/59 n=1 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[4.9( empty local-lis/les=58/59 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[10.3( v 57'51 lc 44'42 (0'0,57'51] local-lis/les=58/59 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=57'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[10.12( v 44'48 (0'0,44'48] local-lis/les=58/59 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.2( v 40'6 (0'0,40'6] local-lis/les=58/59 n=1 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[7.1f( empty local-lis/les=58/59 n=0 ec=52/25 lis/c=52/52 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[52,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[11.13( empty local-lis/les=58/59 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.f( v 40'6 lc 0'0 (0'0,40'6] local-lis/les=58/59 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[4.1c( empty local-lis/les=58/59 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[10.1( v 44'48 (0'0,44'48] local-lis/les=58/59 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[6.1( empty local-lis/les=58/59 n=0 ec=51/23 lis/c=51/51 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[51,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[10.f( v 44'48 (0'0,44'48] local-lis/les=58/59 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.d( v 40'6 (0'0,40'6] local-lis/les=58/59 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[11.e( empty local-lis/les=58/59 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.3( v 40'6 (0'0,40'6] local-lis/les=58/59 n=1 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[11.19( empty local-lis/les=58/59 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[11.8( empty local-lis/les=58/59 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[4.3( empty local-lis/les=58/59 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.b( v 40'6 (0'0,40'6] local-lis/les=58/59 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[7.16( empty local-lis/les=58/59 n=0 ec=52/25 lis/c=52/52 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[52,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[11.a( empty local-lis/les=58/59 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.9( v 40'6 (0'0,40'6] local-lis/les=58/59 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[4.15( empty local-lis/les=58/59 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[4.1( empty local-lis/les=58/59 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[11.3( empty local-lis/les=58/59 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.1f( v 40'6 (0'0,40'6] local-lis/les=58/59 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.16( v 40'6 (0'0,40'6] local-lis/les=58/59 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[10.4( v 44'48 (0'0,44'48] local-lis/les=58/59 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.6( v 40'6 (0'0,40'6] local-lis/les=58/59 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[4.1d( empty local-lis/les=58/59 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.a( v 40'6 (0'0,40'6] local-lis/les=58/59 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[4.8( empty local-lis/les=58/59 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[7.5( empty local-lis/les=58/59 n=0 ec=52/25 lis/c=52/52 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[52,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.11( v 40'6 (0'0,40'6] local-lis/les=58/59 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.1c( v 40'6 (0'0,40'6] local-lis/les=58/59 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[10.1e( v 44'48 (0'0,44'48] local-lis/les=58/59 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[8.15( v 40'6 (0'0,40'6] local-lis/les=58/59 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=40'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[7.14( empty local-lis/les=58/59 n=0 ec=52/25 lis/c=52/52 les/c/f=53/53/0 sis=58) [2] r=0 lpr=58 pi=[52,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[4.19( empty local-lis/les=58/59 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[11.16( empty local-lis/les=58/59 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[4.6( empty local-lis/les=58/59 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[11.17( empty local-lis/les=58/59 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[4.1f( empty local-lis/les=58/59 n=0 ec=49/19 lis/c=49/49 les/c/f=50/50/0 sis=58) [2] r=0 lpr=58 pi=[49,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 59 pg[10.11( v 44'48 (0'0,44'48] local-lis/les=58/59 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Oct  2 07:35:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:02.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Oct  2 07:35:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Oct  2 07:35:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:04.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:35:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:04.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Oct  2 07:35:04 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 61 pg[9.17( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:04 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 61 pg[9.1b( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:04 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 61 pg[9.1f( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:04 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 61 pg[9.3( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:04 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 61 pg[9.7( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:04 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 61 pg[9.f( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:04 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 61 pg[9.b( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:04 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 61 pg[9.13( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:04 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Oct  2 07:35:05 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Oct  2 07:35:05 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Oct  2 07:35:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:06.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.f( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.f( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.3( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.3( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.17( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.17( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.7( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.7( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.b( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 62 pg[9.b( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.e scrub starts
Oct  2 07:35:06 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.e scrub ok
Oct  2 07:35:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:06.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:07 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Oct  2 07:35:07 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Oct  2 07:35:07 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Oct  2 07:35:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:08.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Oct  2 07:35:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Oct  2 07:35:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Oct  2 07:35:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Oct  2 07:35:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:35:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:08.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:35:09 np0005465988 podman[84671]: 2025-10-02 11:35:09.213672992 +0000 UTC m=+0.063891102 container create 2d4382be2099a35825cc1aa6c0362606e9e2ec7efee8b97f61087d10ef7b165a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bardeen, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef)
Oct  2 07:35:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e63 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:35:09 np0005465988 systemd[1]: Started libpod-conmon-2d4382be2099a35825cc1aa6c0362606e9e2ec7efee8b97f61087d10ef7b165a.scope.
Oct  2 07:35:09 np0005465988 podman[84671]: 2025-10-02 11:35:09.177106818 +0000 UTC m=+0.027325008 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:35:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.1f( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:09 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.13( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.1f( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.13( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.3( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.3( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.f( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.f( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.1b( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.1b( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.7( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.7( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.b( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.b( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.17( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.17( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.15( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.d( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 64 pg[9.5( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:09 np0005465988 podman[84671]: 2025-10-02 11:35:09.323806707 +0000 UTC m=+0.174024827 container init 2d4382be2099a35825cc1aa6c0362606e9e2ec7efee8b97f61087d10ef7b165a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bardeen, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef)
Oct  2 07:35:09 np0005465988 podman[84671]: 2025-10-02 11:35:09.336152486 +0000 UTC m=+0.186370586 container start 2d4382be2099a35825cc1aa6c0362606e9e2ec7efee8b97f61087d10ef7b165a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bardeen, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct  2 07:35:09 np0005465988 podman[84671]: 2025-10-02 11:35:09.340173466 +0000 UTC m=+0.190391596 container attach 2d4382be2099a35825cc1aa6c0362606e9e2ec7efee8b97f61087d10ef7b165a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bardeen, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef)
Oct  2 07:35:09 np0005465988 systemd[1]: libpod-2d4382be2099a35825cc1aa6c0362606e9e2ec7efee8b97f61087d10ef7b165a.scope: Deactivated successfully.
Oct  2 07:35:09 np0005465988 admiring_bardeen[84688]: 167 167
Oct  2 07:35:09 np0005465988 podman[84671]: 2025-10-02 11:35:09.345452744 +0000 UTC m=+0.195670854 container died 2d4382be2099a35825cc1aa6c0362606e9e2ec7efee8b97f61087d10ef7b165a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bardeen, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Oct  2 07:35:09 np0005465988 conmon[84688]: conmon 2d4382be2099a35825cc <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2d4382be2099a35825cc1aa6c0362606e9e2ec7efee8b97f61087d10ef7b165a.scope/container/memory.events
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Oct  2 07:35:09 np0005465988 systemd[1]: var-lib-containers-storage-overlay-66970637f22e8d356d005087e2ecc5e363437f19ef2ee41c8e811e7e7f0dfd75-merged.mount: Deactivated successfully.
Oct  2 07:35:09 np0005465988 podman[84671]: 2025-10-02 11:35:09.403629024 +0000 UTC m=+0.253847144 container remove 2d4382be2099a35825cc1aa6c0362606e9e2ec7efee8b97f61087d10ef7b165a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bardeen, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Oct  2 07:35:09 np0005465988 systemd[1]: libpod-conmon-2d4382be2099a35825cc1aa6c0362606e9e2ec7efee8b97f61087d10ef7b165a.scope: Deactivated successfully.
Oct  2 07:35:09 np0005465988 systemd[1]: Reloading.
Oct  2 07:35:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[54,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[54,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[54,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[54,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[54,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[54,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[54,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[54,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.13( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=5 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.3( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=6 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.1f( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=5 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.f( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=6 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.1b( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=5 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.17( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=5 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.b( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=6 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:09 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 65 pg[9.7( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=6 ec=54/41 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:09 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:35:09 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:35:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.gpiyct", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Oct  2 07:35:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.gpiyct", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Oct  2 07:35:09 np0005465988 ceph-mon[76355]: Deploying daemon mds.cephfs.compute-2.gpiyct on compute-2
Oct  2 07:35:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Oct  2 07:35:09 np0005465988 systemd[1]: Reloading.
Oct  2 07:35:09 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:35:09 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:35:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:10.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:10 np0005465988 systemd[1]: Starting Ceph mds.cephfs.compute-2.gpiyct for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2...
Oct  2 07:35:10 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.18 deep-scrub starts
Oct  2 07:35:10 np0005465988 podman[84832]: 2025-10-02 11:35:10.339869091 +0000 UTC m=+0.038361258 container create c5018155171af3117c4f1c452ee1c5386c17e340816708d3569a5437818f7f51 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mds-cephfs-compute-2-gpiyct, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Oct  2 07:35:10 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 2.18 deep-scrub ok
Oct  2 07:35:10 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3321f9778db7608158efc1f52efb87d14596d881f74b4f8e0f9f58b6d309c4a1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 07:35:10 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3321f9778db7608158efc1f52efb87d14596d881f74b4f8e0f9f58b6d309c4a1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 07:35:10 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3321f9778db7608158efc1f52efb87d14596d881f74b4f8e0f9f58b6d309c4a1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct  2 07:35:10 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3321f9778db7608158efc1f52efb87d14596d881f74b4f8e0f9f58b6d309c4a1/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.gpiyct supports timestamps until 2038 (0x7fffffff)
Oct  2 07:35:10 np0005465988 podman[84832]: 2025-10-02 11:35:10.405416362 +0000 UTC m=+0.103908549 container init c5018155171af3117c4f1c452ee1c5386c17e340816708d3569a5437818f7f51 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mds-cephfs-compute-2-gpiyct, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Oct  2 07:35:10 np0005465988 podman[84832]: 2025-10-02 11:35:10.414563356 +0000 UTC m=+0.113055503 container start c5018155171af3117c4f1c452ee1c5386c17e340816708d3569a5437818f7f51 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mds-cephfs-compute-2-gpiyct, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef)
Oct  2 07:35:10 np0005465988 podman[84832]: 2025-10-02 11:35:10.321997987 +0000 UTC m=+0.020490134 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:35:10 np0005465988 bash[84832]: c5018155171af3117c4f1c452ee1c5386c17e340816708d3569a5437818f7f51
Oct  2 07:35:10 np0005465988 systemd[1]: Started Ceph mds.cephfs.compute-2.gpiyct for fd4c5763-22d1-50ea-ad0b-96a3dc3040b2.
Oct  2 07:35:10 np0005465988 ceph-mds[84851]: set uid:gid to 167:167 (ceph:ceph)
Oct  2 07:35:10 np0005465988 ceph-mds[84851]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mds, pid 2
Oct  2 07:35:10 np0005465988 ceph-mds[84851]: main not setting numa affinity
Oct  2 07:35:10 np0005465988 ceph-mds[84851]: pidfile_write: ignore empty --pid-file
Oct  2 07:35:10 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mds-cephfs-compute-2-gpiyct[84847]: starting mds.cephfs.compute-2.gpiyct at 
Oct  2 07:35:10 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct Updating MDS map to version 2 from mon.1
Oct  2 07:35:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Oct  2 07:35:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Oct  2 07:35:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Oct  2 07:35:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.odxjnj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Oct  2 07:35:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.odxjnj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Oct  2 07:35:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:10.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e3 new map
Oct  2 07:35:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e3 print_map#012e3#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-02T11:34:16.058920+0000#012modified#0112025-10-02T11:34:16.058958+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.gpiyct{-1:24148} state up:standby seq 1 addr [v2:192.168.122.102:6804/1224979121,v1:192.168.122.102:6805/1224979121] compat {c=[1],r=[1],i=[7ff]}]
Oct  2 07:35:10 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct Updating MDS map to version 3 from mon.1
Oct  2 07:35:10 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct Monitors have assigned me to become a standby.
Oct  2 07:35:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e4 new map
Oct  2 07:35:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e4 print_map#012e4#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-02T11:34:16.058920+0000#012modified#0112025-10-02T11:35:10.977247+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24148}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.gpiyct{0:24148} state up:creating seq 1 addr [v2:192.168.122.102:6804/1224979121,v1:192.168.122.102:6805/1224979121] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct Updating MDS map to version 4 from mon.1
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.4 handle_mds_map i am now mds.0.4
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.cache creating system inode with ino:0x1
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.cache creating system inode with ino:0x100
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.cache creating system inode with ino:0x600
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.cache creating system inode with ino:0x601
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.cache creating system inode with ino:0x602
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.cache creating system inode with ino:0x603
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.cache creating system inode with ino:0x604
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.cache creating system inode with ino:0x605
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.cache creating system inode with ino:0x606
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.cache creating system inode with ino:0x607
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.cache creating system inode with ino:0x608
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.cache creating system inode with ino:0x609
Oct  2 07:35:11 np0005465988 ceph-mds[84851]: mds.0.4 creating_done
Oct  2 07:35:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Oct  2 07:35:11 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 67 pg[9.d( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=65/54 les/c/f=66/55/0 sis=67) [2] r=0 lpr=67 pi=[54,67)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:11 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 67 pg[9.d( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=65/54 les/c/f=66/55/0 sis=67) [2] r=0 lpr=67 pi=[54,67)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:11 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 67 pg[9.1d( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=65/54 les/c/f=66/55/0 sis=67) [2] r=0 lpr=67 pi=[54,67)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:11 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 67 pg[9.5( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=65/54 les/c/f=66/55/0 sis=67) [2] r=0 lpr=67 pi=[54,67)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:11 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 67 pg[9.15( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=65/54 les/c/f=66/55/0 sis=67) [2] r=0 lpr=67 pi=[54,67)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:11 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 67 pg[9.15( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=65/54 les/c/f=66/55/0 sis=67) [2] r=0 lpr=67 pi=[54,67)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:11 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 67 pg[9.1d( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=65/54 les/c/f=66/55/0 sis=67) [2] r=0 lpr=67 pi=[54,67)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:11 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 67 pg[9.5( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=65/54 les/c/f=66/55/0 sis=67) [2] r=0 lpr=67 pi=[54,67)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:11 np0005465988 ceph-mon[76355]: Deploying daemon mds.cephfs.compute-0.odxjnj on compute-0
Oct  2 07:35:11 np0005465988 ceph-mon[76355]: daemon mds.cephfs.compute-2.gpiyct assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Oct  2 07:35:11 np0005465988 ceph-mon[76355]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Oct  2 07:35:11 np0005465988 ceph-mon[76355]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Oct  2 07:35:11 np0005465988 ceph-mon[76355]: Cluster is now healthy
Oct  2 07:35:11 np0005465988 ceph-mon[76355]: daemon mds.cephfs.compute-2.gpiyct is now active in filesystem cephfs as rank 0
Oct  2 07:35:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:12.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e5 new map
Oct  2 07:35:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e5 print_map#012e5#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-02T11:34:16.058920+0000#012modified#0112025-10-02T11:35:12.043439+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24148}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.gpiyct{0:24148} state up:active seq 2 addr [v2:192.168.122.102:6804/1224979121,v1:192.168.122.102:6805/1224979121] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Oct  2 07:35:12 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct Updating MDS map to version 5 from mon.1
Oct  2 07:35:12 np0005465988 ceph-mds[84851]: mds.0.4 handle_mds_map i am now mds.0.4
Oct  2 07:35:12 np0005465988 ceph-mds[84851]: mds.0.4 handle_mds_map state change up:creating --> up:active
Oct  2 07:35:12 np0005465988 ceph-mds[84851]: mds.0.4 recovery_done -- successful recovery!
Oct  2 07:35:12 np0005465988 ceph-mds[84851]: mds.0.4 active_start
Oct  2 07:35:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Oct  2 07:35:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 68 pg[9.15( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=5 ec=54/41 lis/c=65/54 les/c/f=66/55/0 sis=67) [2] r=0 lpr=67 pi=[54,67)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 68 pg[9.1d( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=5 ec=54/41 lis/c=65/54 les/c/f=66/55/0 sis=67) [2] r=0 lpr=67 pi=[54,67)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 68 pg[9.5( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=6 ec=54/41 lis/c=65/54 les/c/f=66/55/0 sis=67) [2] r=0 lpr=67 pi=[54,67)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 68 pg[9.d( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=6 ec=54/41 lis/c=65/54 les/c/f=66/55/0 sis=67) [2] r=0 lpr=67 pi=[54,67)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:12.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e6 new map
Oct  2 07:35:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e6 print_map#012e6#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-02T11:34:16.058920+0000#012modified#0112025-10-02T11:35:12.043439+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24148}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.gpiyct{0:24148} state up:active seq 2 addr [v2:192.168.122.102:6804/1224979121,v1:192.168.122.102:6805/1224979121] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.odxjnj{-1:14421} state up:standby seq 1 addr [v2:192.168.122.100:6806/3281017148,v1:192.168.122.100:6807/3281017148] compat {c=[1],r=[1],i=[7ff]}]
Oct  2 07:35:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e7 new map
Oct  2 07:35:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e7 print_map#012e7#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-02T11:34:16.058920+0000#012modified#0112025-10-02T11:35:12.043439+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24148}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.gpiyct{0:24148} state up:active seq 2 addr [v2:192.168.122.102:6804/1224979121,v1:192.168.122.102:6805/1224979121] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.odxjnj{-1:14421} state up:standby seq 1 addr [v2:192.168.122.100:6806/3281017148,v1:192.168.122.100:6807/3281017148] compat {c=[1],r=[1],i=[7ff]}]
Oct  2 07:35:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Oct  2 07:35:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.zfhmgy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Oct  2 07:35:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Oct  2 07:35:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.zfhmgy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Oct  2 07:35:13 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Oct  2 07:35:13 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Oct  2 07:35:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Oct  2 07:35:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:14.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:14 np0005465988 ceph-mon[76355]: Deploying daemon mds.cephfs.compute-1.zfhmgy on compute-1
Oct  2 07:35:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e69 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:35:14 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Oct  2 07:35:14 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Oct  2 07:35:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Oct  2 07:35:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:14.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:14 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 70 pg[9.8( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=70) [2] r=0 lpr=70 pi=[54,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:14 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 70 pg[9.18( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=70) [2] r=0 lpr=70 pi=[54,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e8 new map
Oct  2 07:35:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e8 print_map#012e8#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-02T11:34:16.058920+0000#012modified#0112025-10-02T11:35:12.043439+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24148}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.gpiyct{0:24148} state up:active seq 2 addr [v2:192.168.122.102:6804/1224979121,v1:192.168.122.102:6805/1224979121] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.odxjnj{-1:14421} state up:standby seq 1 addr [v2:192.168.122.100:6806/3281017148,v1:192.168.122.100:6807/3281017148] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.zfhmgy{-1:24158} state up:standby seq 1 addr [v2:192.168.122.101:6804/2486237496,v1:192.168.122.101:6805/2486237496] compat {c=[1],r=[1],i=[7ff]}]
Oct  2 07:35:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Oct  2 07:35:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Oct  2 07:35:15 np0005465988 podman[85107]: 2025-10-02 11:35:15.579644332 +0000 UTC m=+0.076902561 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Oct  2 07:35:15 np0005465988 podman[85107]: 2025-10-02 11:35:15.681696825 +0000 UTC m=+0.178955034 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct  2 07:35:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Oct  2 07:35:15 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 71 pg[9.18( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[54,71)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:15 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 71 pg[9.8( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[54,71)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:15 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 71 pg[9.8( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[54,71)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:15 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 71 pg[9.18( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[54,71)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:15 np0005465988 ceph-mds[84851]: mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Oct  2 07:35:15 np0005465988 ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mds-cephfs-compute-2-gpiyct[84847]: 2025-10-02T11:35:15.990+0000 7f8913f84640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Oct  2 07:35:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:16.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e9 new map
Oct  2 07:35:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e9 print_map#012e9#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0119#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-02T11:34:16.058920+0000#012modified#0112025-10-02T11:35:16.189190+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24148}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.gpiyct{0:24148} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1224979121,v1:192.168.122.102:6805/1224979121] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.odxjnj{-1:14421} state up:standby seq 1 addr [v2:192.168.122.100:6806/3281017148,v1:192.168.122.100:6807/3281017148] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.zfhmgy{-1:24158} state up:standby seq 1 addr [v2:192.168.122.101:6804/2486237496,v1:192.168.122.101:6805/2486237496] compat {c=[1],r=[1],i=[7ff]}]
Oct  2 07:35:16 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct Updating MDS map to version 9 from mon.1
Oct  2 07:35:16 np0005465988 podman[85240]: 2025-10-02 11:35:16.327783132 +0000 UTC m=+0.055592784 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:35:16 np0005465988 podman[85240]: 2025-10-02 11:35:16.339595365 +0000 UTC m=+0.067405007 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:35:16 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.b deep-scrub starts
Oct  2 07:35:16 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.b deep-scrub ok
Oct  2 07:35:16 np0005465988 podman[85305]: 2025-10-02 11:35:16.756081604 +0000 UTC m=+0.251170094 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1793, description=keepalived for Ceph, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, version=2.2.4)
Oct  2 07:35:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:16.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Oct  2 07:35:17 np0005465988 podman[85328]: 2025-10-02 11:35:17.016591217 +0000 UTC m=+0.235427514 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, release=1793, version=2.2.4, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=keepalived for Ceph)
Oct  2 07:35:17 np0005465988 podman[85305]: 2025-10-02 11:35:17.087118057 +0000 UTC m=+0.582206547 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, vcs-type=git, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, io.buildah.version=1.28.2, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Oct  2 07:35:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Oct  2 07:35:17 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 72 pg[9.19( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=72) [2] r=0 lpr=72 pi=[54,72)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:17 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 72 pg[9.9( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=72) [2] r=0 lpr=72 pi=[54,72)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e10 new map
Oct  2 07:35:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e10 print_map#012e10#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0119#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-02T11:34:16.058920+0000#012modified#0112025-10-02T11:35:16.189190+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24148}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.gpiyct{0:24148} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1224979121,v1:192.168.122.102:6805/1224979121] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.odxjnj{-1:14421} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/3281017148,v1:192.168.122.100:6807/3281017148] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.zfhmgy{-1:24158} state up:standby seq 1 addr [v2:192.168.122.101:6804/2486237496,v1:192.168.122.101:6805/2486237496] compat {c=[1],r=[1],i=[7ff]}]
Oct  2 07:35:17 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.d deep-scrub starts
Oct  2 07:35:17 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.d deep-scrub ok
Oct  2 07:35:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Oct  2 07:35:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:18.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Oct  2 07:35:18 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 73 pg[9.9( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=73) [2]/[0] r=-1 lpr=73 pi=[54,73)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:18 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 73 pg[9.9( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=73) [2]/[0] r=-1 lpr=73 pi=[54,73)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:18 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 73 pg[9.19( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=73) [2]/[0] r=-1 lpr=73 pi=[54,73)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:18 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 73 pg[9.19( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=54/54 les/c/f=55/55/0 sis=73) [2]/[0] r=-1 lpr=73 pi=[54,73)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:18 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 73 pg[9.18( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=71/54 les/c/f=72/55/0 sis=73) [2] r=0 lpr=73 pi=[54,73)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:18 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 73 pg[9.18( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=71/54 les/c/f=72/55/0 sis=73) [2] r=0 lpr=73 pi=[54,73)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:18 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 73 pg[9.8( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=71/54 les/c/f=72/55/0 sis=73) [2] r=0 lpr=73 pi=[54,73)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:18 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 73 pg[9.8( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=71/54 les/c/f=72/55/0 sis=73) [2] r=0 lpr=73 pi=[54,73)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:35:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:35:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:18.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e73 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:35:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e11 new map
Oct  2 07:35:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).mds e11 print_map#012e11#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0119#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-02T11:34:16.058920+0000#012modified#0112025-10-02T11:35:16.189190+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24148}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.gpiyct{0:24148} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1224979121,v1:192.168.122.102:6805/1224979121] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.odxjnj{-1:14421} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/3281017148,v1:192.168.122.100:6807/3281017148] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.zfhmgy{-1:24158} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/2486237496,v1:192.168.122.101:6805/2486237496] compat {c=[1],r=[1],i=[7ff]}]
Oct  2 07:35:19 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.e scrub starts
Oct  2 07:35:19 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.e scrub ok
Oct  2 07:35:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Oct  2 07:35:19 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 74 pg[9.18( v 47'1065 (0'0,47'1065] local-lis/les=73/74 n=5 ec=54/41 lis/c=71/54 les/c/f=72/55/0 sis=73) [2] r=0 lpr=73 pi=[54,73)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:19 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 74 pg[9.8( v 47'1065 (0'0,47'1065] local-lis/les=73/74 n=6 ec=54/41 lis/c=71/54 les/c/f=72/55/0 sis=73) [2] r=0 lpr=73 pi=[54,73)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:20.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Oct  2 07:35:20 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 75 pg[9.9( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=73/54 les/c/f=74/55/0 sis=75) [2] r=0 lpr=75 pi=[54,75)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:20 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 75 pg[9.9( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=6 ec=54/41 lis/c=73/54 les/c/f=74/55/0 sis=75) [2] r=0 lpr=75 pi=[54,75)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:20 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 75 pg[9.19( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=73/54 les/c/f=74/55/0 sis=75) [2] r=0 lpr=75 pi=[54,75)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:20 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 75 pg[9.19( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=73/54 les/c/f=74/55/0 sis=75) [2] r=0 lpr=75 pi=[54,75)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:35:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:20.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:35:21 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Oct  2 07:35:21 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Oct  2 07:35:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Oct  2 07:35:21 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 76 pg[9.19( v 47'1065 (0'0,47'1065] local-lis/les=75/76 n=5 ec=54/41 lis/c=73/54 les/c/f=74/55/0 sis=75) [2] r=0 lpr=75 pi=[54,75)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:21 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 76 pg[9.9( v 47'1065 (0'0,47'1065] local-lis/les=75/76 n=6 ec=54/41 lis/c=73/54 les/c/f=74/55/0 sis=75) [2] r=0 lpr=75 pi=[54,75)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:22.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:22.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:23 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Oct  2 07:35:23 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Oct  2 07:35:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:24.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e76 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:35:24 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Oct  2 07:35:24 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Oct  2 07:35:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Oct  2 07:35:24 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Oct  2 07:35:24 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:24 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:24.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:25 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.13 deep-scrub starts
Oct  2 07:35:25 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.13 deep-scrub ok
Oct  2 07:35:25 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Oct  2 07:35:25 np0005465988 ceph-mon[76355]: Reconfiguring mon.compute-0 (monmap changed)...
Oct  2 07:35:25 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Oct  2 07:35:25 np0005465988 ceph-mon[76355]: Reconfiguring daemon mon.compute-0 on compute-0
Oct  2 07:35:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Oct  2 07:35:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:26.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:35:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:26.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:35:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:27 np0005465988 ceph-mon[76355]: Reconfiguring mgr.compute-0.fmcstn (monmap changed)...
Oct  2 07:35:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.fmcstn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Oct  2 07:35:27 np0005465988 ceph-mon[76355]: Reconfiguring daemon mgr.compute-0.fmcstn on compute-0
Oct  2 07:35:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Oct  2 07:35:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Oct  2 07:35:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:28.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Oct  2 07:35:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Oct  2 07:35:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Oct  2 07:35:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:28.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:29 np0005465988 ceph-mon[76355]: Reconfiguring crash.compute-0 (monmap changed)...
Oct  2 07:35:29 np0005465988 ceph-mon[76355]: Reconfiguring daemon crash.compute-0 on compute-0
Oct  2 07:35:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Oct  2 07:35:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:29 np0005465988 ceph-mon[76355]: Reconfiguring osd.1 (monmap changed)...
Oct  2 07:35:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Oct  2 07:35:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Oct  2 07:35:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:35:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Oct  2 07:35:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:30.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:30 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Oct  2 07:35:30 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Oct  2 07:35:30 np0005465988 ceph-mon[76355]: Reconfiguring daemon osd.1 on compute-0
Oct  2 07:35:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:30 np0005465988 ceph-mon[76355]: Reconfiguring crash.compute-1 (monmap changed)...
Oct  2 07:35:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Oct  2 07:35:30 np0005465988 ceph-mon[76355]: Reconfiguring daemon crash.compute-1 on compute-1
Oct  2 07:35:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Oct  2 07:35:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:30.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Oct  2 07:35:31 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 82 pg[9.d( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=6 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=82 pruub=12.762773514s) [1] r=-1 lpr=82 pi=[67,82)/1 crt=47'1065 mlcod 0'0 active pruub 98.135528564s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:31 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 82 pg[9.d( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=6 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=82 pruub=12.762704849s) [1] r=-1 lpr=82 pi=[67,82)/1 crt=47'1065 mlcod 0'0 unknown NOTIFY pruub 98.135528564s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:31 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 82 pg[9.1d( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=5 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=82 pruub=12.760921478s) [1] r=-1 lpr=82 pi=[67,82)/1 crt=47'1065 mlcod 0'0 active pruub 98.135528564s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:31 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 82 pg[9.1d( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=5 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=82 pruub=12.760855675s) [1] r=-1 lpr=82 pi=[67,82)/1 crt=47'1065 mlcod 0'0 unknown NOTIFY pruub 98.135528564s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:32 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:32 np0005465988 ceph-mon[76355]: Reconfiguring osd.0 (monmap changed)...
Oct  2 07:35:32 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Oct  2 07:35:32 np0005465988 ceph-mon[76355]: Reconfiguring daemon osd.0 on compute-1
Oct  2 07:35:32 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:32 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Oct  2 07:35:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:32.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:32.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Oct  2 07:35:33 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 83 pg[9.1d( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=5 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=83) [1]/[2] r=0 lpr=83 pi=[67,83)/1 crt=47'1065 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:33 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 83 pg[9.1d( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=5 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=83) [1]/[2] r=0 lpr=83 pi=[67,83)/1 crt=47'1065 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:33 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 83 pg[9.d( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=6 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=83) [1]/[2] r=0 lpr=83 pi=[67,83)/1 crt=47'1065 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:33 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 83 pg[9.d( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=6 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=83) [1]/[2] r=0 lpr=83 pi=[67,83)/1 crt=47'1065 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:33 np0005465988 ceph-mon[76355]: Reconfiguring mon.compute-1 (monmap changed)...
Oct  2 07:35:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Oct  2 07:35:33 np0005465988 ceph-mon[76355]: Reconfiguring daemon mon.compute-1 on compute-1
Oct  2 07:35:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Oct  2 07:35:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Oct  2 07:35:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Oct  2 07:35:33 np0005465988 podman[85715]: 2025-10-02 11:35:33.313784217 +0000 UTC m=+0.027318488 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 07:35:33 np0005465988 podman[85715]: 2025-10-02 11:35:33.672029903 +0000 UTC m=+0.385564074 container create d29088715bd0db4c2f9eed48e6eb30a1fc9c2cf4d4de44a057fc155e0b88bd2d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_joliot, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Oct  2 07:35:33 np0005465988 systemd[1]: Started libpod-conmon-d29088715bd0db4c2f9eed48e6eb30a1fc9c2cf4d4de44a057fc155e0b88bd2d.scope.
Oct  2 07:35:33 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:35:33 np0005465988 podman[85715]: 2025-10-02 11:35:33.844168703 +0000 UTC m=+0.557702884 container init d29088715bd0db4c2f9eed48e6eb30a1fc9c2cf4d4de44a057fc155e0b88bd2d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_joliot, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct  2 07:35:33 np0005465988 podman[85715]: 2025-10-02 11:35:33.853322417 +0000 UTC m=+0.566856588 container start d29088715bd0db4c2f9eed48e6eb30a1fc9c2cf4d4de44a057fc155e0b88bd2d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_joliot, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct  2 07:35:33 np0005465988 objective_joliot[85732]: 167 167
Oct  2 07:35:33 np0005465988 podman[85715]: 2025-10-02 11:35:33.858378088 +0000 UTC m=+0.571912269 container attach d29088715bd0db4c2f9eed48e6eb30a1fc9c2cf4d4de44a057fc155e0b88bd2d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_joliot, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Oct  2 07:35:33 np0005465988 systemd[1]: libpod-d29088715bd0db4c2f9eed48e6eb30a1fc9c2cf4d4de44a057fc155e0b88bd2d.scope: Deactivated successfully.
Oct  2 07:35:33 np0005465988 conmon[85732]: conmon d29088715bd0db4c2f9e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d29088715bd0db4c2f9eed48e6eb30a1fc9c2cf4d4de44a057fc155e0b88bd2d.scope/container/memory.events
Oct  2 07:35:33 np0005465988 podman[85715]: 2025-10-02 11:35:33.859905094 +0000 UTC m=+0.573439255 container died d29088715bd0db4c2f9eed48e6eb30a1fc9c2cf4d4de44a057fc155e0b88bd2d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_joliot, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Oct  2 07:35:33 np0005465988 systemd[1]: var-lib-containers-storage-overlay-3a9c5b7bbe35ba30be759113ec404eee00f555824963258473b56f9df8aaab78-merged.mount: Deactivated successfully.
Oct  2 07:35:33 np0005465988 podman[85715]: 2025-10-02 11:35:33.915419775 +0000 UTC m=+0.628953976 container remove d29088715bd0db4c2f9eed48e6eb30a1fc9c2cf4d4de44a057fc155e0b88bd2d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_joliot, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Oct  2 07:35:33 np0005465988 systemd[1]: libpod-conmon-d29088715bd0db4c2f9eed48e6eb30a1fc9c2cf4d4de44a057fc155e0b88bd2d.scope: Deactivated successfully.
Oct  2 07:35:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:34.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:35:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Oct  2 07:35:34 np0005465988 ceph-mon[76355]: Reconfiguring mon.compute-2 (monmap changed)...
Oct  2 07:35:34 np0005465988 ceph-mon[76355]: Reconfiguring daemon mon.compute-2 on compute-2
Oct  2 07:35:34 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:34 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:34 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:34 np0005465988 podman[85921]: 2025-10-02 11:35:34.885670069 +0000 UTC m=+0.083938952 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef)
Oct  2 07:35:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:34.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:34 np0005465988 podman[85921]: 2025-10-02 11:35:34.998743561 +0000 UTC m=+0.197012334 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Oct  2 07:35:35 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 84 pg[9.1d( v 47'1065 (0'0,47'1065] local-lis/les=83/84 n=5 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=83) [1]/[2] async=[1] r=0 lpr=83 pi=[67,83)/1 crt=47'1065 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:35 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 84 pg[9.d( v 47'1065 (0'0,47'1065] local-lis/les=83/84 n=6 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=83) [1]/[2] async=[1] r=0 lpr=83 pi=[67,83)/1 crt=47'1065 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:35 np0005465988 podman[86057]: 2025-10-02 11:35:35.615626985 +0000 UTC m=+0.058118760 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:35:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Oct  2 07:35:35 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 85 pg[9.f( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=6 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=85 pruub=13.891147614s) [1] r=-1 lpr=85 pi=[64,85)/1 crt=47'1065 mlcod 0'0 active pruub 102.964538574s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:35 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 85 pg[9.f( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=6 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=85 pruub=13.891083717s) [1] r=-1 lpr=85 pi=[64,85)/1 crt=47'1065 mlcod 0'0 unknown NOTIFY pruub 102.964538574s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:35 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 85 pg[9.1f( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=5 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=85 pruub=13.890727043s) [1] r=-1 lpr=85 pi=[64,85)/1 crt=47'1065 mlcod 0'0 active pruub 102.964363098s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:35 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 85 pg[9.1f( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=5 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=85 pruub=13.890692711s) [1] r=-1 lpr=85 pi=[64,85)/1 crt=47'1065 mlcod 0'0 unknown NOTIFY pruub 102.964363098s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:35 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 85 pg[9.d( v 47'1065 (0'0,47'1065] local-lis/les=83/84 n=6 ec=54/41 lis/c=83/67 les/c/f=84/68/0 sis=85 pruub=15.569364548s) [1] async=[1] r=-1 lpr=85 pi=[67,85)/1 crt=47'1065 mlcod 47'1065 active pruub 104.643074036s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:35 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 85 pg[9.d( v 47'1065 (0'0,47'1065] local-lis/les=83/84 n=6 ec=54/41 lis/c=83/67 les/c/f=84/68/0 sis=85 pruub=15.569201469s) [1] r=-1 lpr=85 pi=[67,85)/1 crt=47'1065 mlcod 0'0 unknown NOTIFY pruub 104.643074036s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:35 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 85 pg[9.1d( v 47'1065 (0'0,47'1065] local-lis/les=83/84 n=5 ec=54/41 lis/c=83/67 les/c/f=84/68/0 sis=85 pruub=15.563771248s) [1] async=[1] r=-1 lpr=85 pi=[67,85)/1 crt=47'1065 mlcod 47'1065 active pruub 104.638618469s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:35 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 85 pg[9.1d( v 47'1065 (0'0,47'1065] local-lis/les=83/84 n=5 ec=54/41 lis/c=83/67 les/c/f=84/68/0 sis=85 pruub=15.563423157s) [1] r=-1 lpr=85 pi=[67,85)/1 crt=47'1065 mlcod 0'0 unknown NOTIFY pruub 104.638618469s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:35 np0005465988 podman[86057]: 2025-10-02 11:35:35.632580382 +0000 UTC m=+0.075072187 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:35:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Oct  2 07:35:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:35 np0005465988 podman[86125]: 2025-10-02 11:35:35.915401612 +0000 UTC m=+0.069101578 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, vendor=Red Hat, Inc., vcs-type=git, build-date=2023-02-22T09:23:20, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, name=keepalived, version=2.2.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph)
Oct  2 07:35:35 np0005465988 podman[86125]: 2025-10-02 11:35:35.931822263 +0000 UTC m=+0.085522199 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, name=keepalived, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, distribution-scope=public, release=1793, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4)
Oct  2 07:35:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:36.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:36 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.11 deep-scrub starts
Oct  2 07:35:36 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.11 deep-scrub ok
Oct  2 07:35:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Oct  2 07:35:36 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 86 pg[9.f( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=6 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=86) [1]/[2] r=0 lpr=86 pi=[64,86)/1 crt=47'1065 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:36 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 86 pg[9.f( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=6 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=86) [1]/[2] r=0 lpr=86 pi=[64,86)/1 crt=47'1065 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:36 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 86 pg[9.1f( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=5 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=86) [1]/[2] r=0 lpr=86 pi=[64,86)/1 crt=47'1065 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:36 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 86 pg[9.1f( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=5 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=86) [1]/[2] r=0 lpr=86 pi=[64,86)/1 crt=47'1065 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct  2 07:35:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Oct  2 07:35:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:35:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:35:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Oct  2 07:35:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Oct  2 07:35:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:36.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Oct  2 07:35:37 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 87 pg[9.1f( v 47'1065 (0'0,47'1065] local-lis/les=86/87 n=5 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=86) [1]/[2] async=[1] r=0 lpr=86 pi=[64,86)/1 crt=47'1065 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:37 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 87 pg[9.f( v 47'1065 (0'0,47'1065] local-lis/les=86/87 n=6 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=86) [1]/[2] async=[1] r=0 lpr=86 pi=[64,86)/1 crt=47'1065 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:35:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:38.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Oct  2 07:35:38 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 88 pg[9.f( v 47'1065 (0'0,47'1065] local-lis/les=86/87 n=6 ec=54/41 lis/c=86/64 les/c/f=87/65/0 sis=88 pruub=14.992271423s) [1] async=[1] r=-1 lpr=88 pi=[64,88)/1 crt=47'1065 mlcod 47'1065 active pruub 107.112464905s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:38 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 88 pg[9.1f( v 47'1065 (0'0,47'1065] local-lis/les=86/87 n=5 ec=54/41 lis/c=86/64 les/c/f=87/65/0 sis=88 pruub=14.992101669s) [1] async=[1] r=-1 lpr=88 pi=[64,88)/1 crt=47'1065 mlcod 47'1065 active pruub 107.112312317s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:35:38 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 88 pg[9.f( v 47'1065 (0'0,47'1065] local-lis/les=86/87 n=6 ec=54/41 lis/c=86/64 les/c/f=87/65/0 sis=88 pruub=14.992177963s) [1] r=-1 lpr=88 pi=[64,88)/1 crt=47'1065 mlcod 0'0 unknown NOTIFY pruub 107.112464905s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:38 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 88 pg[9.1f( v 47'1065 (0'0,47'1065] local-lis/les=86/87 n=5 ec=54/41 lis/c=86/64 les/c/f=87/65/0 sis=88 pruub=14.991994858s) [1] r=-1 lpr=88 pi=[64,88)/1 crt=47'1065 mlcod 0'0 unknown NOTIFY pruub 107.112312317s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:35:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Oct  2 07:35:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Oct  2 07:35:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:35:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:38.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:35:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:35:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Oct  2 07:35:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:40.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:40 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Oct  2 07:35:40 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Oct  2 07:35:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Oct  2 07:35:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:40.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Oct  2 07:35:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:42.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:35:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:42.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Oct  2 07:35:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:44.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:35:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:44.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:46.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Oct  2 07:35:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:46.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:47 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.1d deep-scrub starts
Oct  2 07:35:47 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.1d deep-scrub ok
Oct  2 07:35:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Oct  2 07:35:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Oct  2 07:35:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:48.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Oct  2 07:35:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Oct  2 07:35:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:48.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:35:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:50.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Oct  2 07:35:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:50.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:52.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:52.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Oct  2 07:35:53 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Oct  2 07:35:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:54.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:35:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000060s ======
Oct  2 07:35:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:54.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000060s
Oct  2 07:35:55 np0005465988 systemd-logind[827]: New session 34 of user zuul.
Oct  2 07:35:55 np0005465988 systemd[1]: Started Session 34 of User zuul.
Oct  2 07:35:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Oct  2 07:35:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:56.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:56 np0005465988 python3.9[86440]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:35:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:35:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:56.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:35:57 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.16 scrub starts
Oct  2 07:35:57 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.16 scrub ok
Oct  2 07:35:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:35:58.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:58 np0005465988 python3.9[86706]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:35:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:35:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:35:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:35:58.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:35:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:36:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:36:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:00.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:36:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:00.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Oct  2 07:36:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Oct  2 07:36:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:02.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Oct  2 07:36:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Oct  2 07:36:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 99 pg[9.15( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=5 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=99 pruub=13.770298958s) [1] r=-1 lpr=99 pi=[67,99)/1 crt=47'1065 mlcod 0'0 active pruub 130.131103516s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:36:02 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 99 pg[9.15( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=5 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=99 pruub=13.770224571s) [1] r=-1 lpr=99 pi=[67,99)/1 crt=47'1065 mlcod 0'0 unknown NOTIFY pruub 130.131103516s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:36:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:02.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Oct  2 07:36:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Oct  2 07:36:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Oct  2 07:36:03 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 100 pg[9.15( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=5 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=100) [1]/[2] r=0 lpr=100 pi=[67,100)/1 crt=47'1065 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:36:03 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 100 pg[9.15( v 47'1065 (0'0,47'1065] local-lis/les=67/68 n=5 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=100) [1]/[2] r=0 lpr=100 pi=[67,100)/1 crt=47'1065 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct  2 07:36:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:04.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:36:04 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Oct  2 07:36:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:04.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Oct  2 07:36:05 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 101 pg[9.16( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=69/69 les/c/f=70/70/0 sis=101) [2] r=0 lpr=101 pi=[69,101)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:36:05 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 101 pg[9.15( v 47'1065 (0'0,47'1065] local-lis/les=100/101 n=5 ec=54/41 lis/c=67/67 les/c/f=68/68/0 sis=100) [1]/[2] async=[1] r=0 lpr=100 pi=[67,100)/1 crt=47'1065 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:36:05 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Oct  2 07:36:05 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.6 scrub starts
Oct  2 07:36:05 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.6 scrub ok
Oct  2 07:36:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Oct  2 07:36:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 102 pg[9.16( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=69/69 les/c/f=70/70/0 sis=102) [2]/[1] r=-1 lpr=102 pi=[69,102)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:36:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 102 pg[9.16( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=69/69 les/c/f=70/70/0 sis=102) [2]/[1] r=-1 lpr=102 pi=[69,102)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:36:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 102 pg[9.15( v 47'1065 (0'0,47'1065] local-lis/les=100/101 n=5 ec=54/41 lis/c=100/67 les/c/f=101/68/0 sis=102 pruub=15.091874123s) [1] async=[1] r=-1 lpr=102 pi=[67,102)/1 crt=47'1065 mlcod 47'1065 active pruub 134.601470947s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:36:06 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 102 pg[9.15( v 47'1065 (0'0,47'1065] local-lis/les=100/101 n=5 ec=54/41 lis/c=100/67 les/c/f=101/68/0 sis=102 pruub=15.091648102s) [1] r=-1 lpr=102 pi=[67,102)/1 crt=47'1065 mlcod 0'0 unknown NOTIFY pruub 134.601470947s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:36:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:36:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:06.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:36:06 np0005465988 systemd[1]: session-34.scope: Deactivated successfully.
Oct  2 07:36:06 np0005465988 systemd[1]: session-34.scope: Consumed 8.637s CPU time.
Oct  2 07:36:06 np0005465988 systemd-logind[827]: Session 34 logged out. Waiting for processes to exit.
Oct  2 07:36:06 np0005465988 systemd-logind[827]: Removed session 34.
Oct  2 07:36:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Oct  2 07:36:06 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.14 deep-scrub starts
Oct  2 07:36:06 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.14 deep-scrub ok
Oct  2 07:36:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:06.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Oct  2 07:36:07 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Oct  2 07:36:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:08.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:08 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.5 scrub starts
Oct  2 07:36:08 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.5 scrub ok
Oct  2 07:36:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Oct  2 07:36:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Oct  2 07:36:08 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 104 pg[9.16( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=102/69 les/c/f=103/70/0 sis=104) [2] r=0 lpr=104 pi=[69,104)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:36:08 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 104 pg[9.16( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=102/69 les/c/f=103/70/0 sis=104) [2] r=0 lpr=104 pi=[69,104)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:36:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:08.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:36:09 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.9 deep-scrub starts
Oct  2 07:36:09 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.9 deep-scrub ok
Oct  2 07:36:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Oct  2 07:36:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Oct  2 07:36:10 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 105 pg[9.16( v 47'1065 (0'0,47'1065] local-lis/les=104/105 n=5 ec=54/41 lis/c=102/69 les/c/f=103/70/0 sis=104) [2] r=0 lpr=104 pi=[69,104)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:36:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:10.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:10 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Oct  2 07:36:10 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Oct  2 07:36:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:10.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Oct  2 07:36:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Oct  2 07:36:11 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 106 pg[9.19( v 47'1065 (0'0,47'1065] local-lis/les=75/76 n=5 ec=54/41 lis/c=75/75 les/c/f=76/76/0 sis=106 pruub=14.448824883s) [0] r=-1 lpr=106 pi=[75,106)/1 crt=47'1065 mlcod 0'0 active pruub 138.972396851s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:36:11 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 106 pg[9.19( v 47'1065 (0'0,47'1065] local-lis/les=75/76 n=5 ec=54/41 lis/c=75/75 les/c/f=76/76/0 sis=106 pruub=14.448763847s) [0] r=-1 lpr=106 pi=[75,106)/1 crt=47'1065 mlcod 0'0 unknown NOTIFY pruub 138.972396851s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:36:11 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.16 scrub starts
Oct  2 07:36:11 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.16 scrub ok
Oct  2 07:36:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Oct  2 07:36:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 107 pg[9.19( v 47'1065 (0'0,47'1065] local-lis/les=75/76 n=5 ec=54/41 lis/c=75/75 les/c/f=76/76/0 sis=107) [0]/[2] r=0 lpr=107 pi=[75,107)/1 crt=47'1065 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:36:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:36:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:12.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:36:12 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 107 pg[9.19( v 47'1065 (0'0,47'1065] local-lis/les=75/76 n=5 ec=54/41 lis/c=75/75 les/c/f=76/76/0 sis=107) [0]/[2] r=0 lpr=107 pi=[75,107)/1 crt=47'1065 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct  2 07:36:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Oct  2 07:36:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:12.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Oct  2 07:36:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Oct  2 07:36:13 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Oct  2 07:36:13 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Oct  2 07:36:13 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 108 pg[9.19( v 47'1065 (0'0,47'1065] local-lis/les=107/108 n=5 ec=54/41 lis/c=75/75 les/c/f=76/76/0 sis=107) [0]/[2] async=[0] r=0 lpr=107 pi=[75,107)/1 crt=47'1065 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:36:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:14.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:36:14 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Oct  2 07:36:14 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Oct  2 07:36:14 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Oct  2 07:36:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:14.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Oct  2 07:36:15 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 109 pg[9.19( v 47'1065 (0'0,47'1065] local-lis/les=107/108 n=5 ec=54/41 lis/c=107/75 les/c/f=108/76/0 sis=109 pruub=14.659156799s) [0] async=[0] r=-1 lpr=109 pi=[75,109)/1 crt=47'1065 mlcod 47'1065 active pruub 143.290695190s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:36:15 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 109 pg[9.19( v 47'1065 (0'0,47'1065] local-lis/les=107/108 n=5 ec=54/41 lis/c=107/75 les/c/f=108/76/0 sis=109 pruub=14.659019470s) [0] r=-1 lpr=109 pi=[75,109)/1 crt=47'1065 mlcod 0'0 unknown NOTIFY pruub 143.290695190s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:36:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:16.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Oct  2 07:36:16 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.2 scrub starts
Oct  2 07:36:16 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.2 scrub ok
Oct  2 07:36:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:16.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Oct  2 07:36:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:18.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Oct  2 07:36:18 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Oct  2 07:36:18 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Oct  2 07:36:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:18.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:36:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:20.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Oct  2 07:36:20 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 113 pg[9.1b( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=5 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=113 pruub=9.012292862s) [0] r=-1 lpr=113 pi=[64,113)/1 crt=47'1065 mlcod 0'0 active pruub 142.965423584s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:36:20 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 113 pg[9.1b( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=5 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=113 pruub=9.012001991s) [0] r=-1 lpr=113 pi=[64,113)/1 crt=47'1065 mlcod 0'0 unknown NOTIFY pruub 142.965423584s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:36:20 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Oct  2 07:36:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:36:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:20.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:36:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Oct  2 07:36:21 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 114 pg[9.1b( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=5 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=114) [0]/[2] r=0 lpr=114 pi=[64,114)/1 crt=47'1065 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:36:21 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 114 pg[9.1b( v 47'1065 (0'0,47'1065] local-lis/les=64/65 n=5 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=114) [0]/[2] r=0 lpr=114 pi=[64,114)/1 crt=47'1065 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct  2 07:36:21 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Oct  2 07:36:21 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Oct  2 07:36:21 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Oct  2 07:36:21 np0005465988 systemd-logind[827]: New session 35 of user zuul.
Oct  2 07:36:21 np0005465988 systemd[1]: Started Session 35 of User zuul.
Oct  2 07:36:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:22.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Oct  2 07:36:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Oct  2 07:36:22 np0005465988 python3.9[86977]: ansible-ansible.legacy.ping Invoked with data=pong
Oct  2 07:36:22 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 115 pg[9.1b( v 47'1065 (0'0,47'1065] local-lis/les=114/115 n=5 ec=54/41 lis/c=64/64 les/c/f=65/65/0 sis=114) [0]/[2] async=[0] r=0 lpr=114 pi=[64,114)/1 crt=47'1065 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:36:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:36:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:22.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:36:23 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Oct  2 07:36:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Oct  2 07:36:23 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 116 pg[9.1b( v 47'1065 (0'0,47'1065] local-lis/les=114/115 n=5 ec=54/41 lis/c=114/64 les/c/f=115/65/0 sis=116 pruub=15.214699745s) [0] async=[0] r=-1 lpr=116 pi=[64,116)/1 crt=47'1065 mlcod 47'1065 active pruub 152.265426636s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:36:23 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 116 pg[9.1b( v 47'1065 (0'0,47'1065] local-lis/les=114/115 n=5 ec=54/41 lis/c=114/64 les/c/f=115/65/0 sis=116 pruub=15.214571953s) [0] r=-1 lpr=116 pi=[64,116)/1 crt=47'1065 mlcod 0'0 unknown NOTIFY pruub 152.265426636s@ mbc={}] state<Start>: transitioning to Stray
Oct  2 07:36:23 np0005465988 python3.9[87152]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:36:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:24.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:36:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Oct  2 07:36:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:24.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:25 np0005465988 python3.9[87309]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:26.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:26 np0005465988 python3.9[87462]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:36:26 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.f scrub starts
Oct  2 07:36:26 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.f scrub ok
Oct  2 07:36:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:27.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:27 np0005465988 python3.9[87617]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:28.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:28 np0005465988 python3.9[87767]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:36:28 np0005465988 network[87784]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:36:28 np0005465988 network[87785]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:36:28 np0005465988 network[87786]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:36:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:29.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:36:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:36:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:30.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:36:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Oct  2 07:36:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Oct  2 07:36:30 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 118 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=85/85 les/c/f=86/86/0 sis=118) [2] r=0 lpr=118 pi=[85,118)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:36:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:31.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Oct  2 07:36:31 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 119 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=85/85 les/c/f=86/86/0 sis=119) [2]/[1] r=-1 lpr=119 pi=[85,119)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:36:31 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 119 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/41 lis/c=85/85 les/c/f=86/86/0 sis=119) [2]/[1] r=-1 lpr=119 pi=[85,119)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct  2 07:36:31 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Oct  2 07:36:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:32.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Oct  2 07:36:32 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Oct  2 07:36:32 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Oct  2 07:36:32 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Oct  2 07:36:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:33.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Oct  2 07:36:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Oct  2 07:36:33 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 121 pg[9.1d( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=119/85 les/c/f=120/86/0 sis=121) [2] r=0 lpr=121 pi=[85,121)/1 luod=0'0 crt=47'1065 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Oct  2 07:36:33 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 121 pg[9.1d( v 47'1065 (0'0,47'1065] local-lis/les=0/0 n=5 ec=54/41 lis/c=119/85 les/c/f=120/86/0 sis=121) [2] r=0 lpr=121 pi=[85,121)/1 crt=47'1065 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct  2 07:36:33 np0005465988 python3.9[88052]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:36:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:34.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:36:34 np0005465988 python3.9[88202]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:36:34 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct  2 07:36:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Oct  2 07:36:34 np0005465988 ceph-osd[79039]: osd.2 pg_epoch: 122 pg[9.1d( v 47'1065 (0'0,47'1065] local-lis/les=121/122 n=5 ec=54/41 lis/c=119/85 les/c/f=120/86/0 sis=121) [2] r=0 lpr=121 pi=[85,121)/1 crt=47'1065 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct  2 07:36:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:35.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Oct  2 07:36:35 np0005465988 python3.9[88357]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:36:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Oct  2 07:36:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:36.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:36 np0005465988 python3.9[88515]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:36:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Oct  2 07:36:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:37.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:37 np0005465988 python3.9[88600]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:36:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Oct  2 07:36:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:36:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:38.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:36:38 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.6 deep-scrub starts
Oct  2 07:36:38 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.6 deep-scrub ok
Oct  2 07:36:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Oct  2 07:36:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:36:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:39.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:36:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:36:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:40.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:40 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.e scrub starts
Oct  2 07:36:40 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.e scrub ok
Oct  2 07:36:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:41.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:42.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:36:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:43.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:36:43 np0005465988 podman[88894]: 2025-10-02 11:36:43.871599721 +0000 UTC m=+0.097643611 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Oct  2 07:36:43 np0005465988 podman[88894]: 2025-10-02 11:36:43.995210382 +0000 UTC m=+0.221254322 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Oct  2 07:36:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:44.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:36:44 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Oct  2 07:36:44 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Oct  2 07:36:44 np0005465988 podman[89033]: 2025-10-02 11:36:44.680919187 +0000 UTC m=+0.050166401 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:36:44 np0005465988 podman[89033]: 2025-10-02 11:36:44.692870142 +0000 UTC m=+0.062117306 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:36:44 np0005465988 podman[89100]: 2025-10-02 11:36:44.976403722 +0000 UTC m=+0.069281039 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, release=1793, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=Ceph keepalived, version=2.2.4, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., io.openshift.expose-services=)
Oct  2 07:36:44 np0005465988 podman[89100]: 2025-10-02 11:36:44.991893572 +0000 UTC m=+0.084770829 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, description=keepalived for Ceph, vendor=Red Hat, Inc., io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, version=2.2.4, name=keepalived, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793)
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:36:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:36:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:45.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:36:45.338809) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405005338914, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7265, "num_deletes": 256, "total_data_size": 13604801, "memory_usage": 13816848, "flush_reason": "Manual Compaction"}
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Oct  2 07:36:45 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.a scrub starts
Oct  2 07:36:45 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.a scrub ok
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405005419343, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 7990441, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 241, "largest_seqno": 7270, "table_properties": {"data_size": 7962251, "index_size": 18475, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8645, "raw_key_size": 81079, "raw_average_key_size": 23, "raw_value_size": 7895206, "raw_average_value_size": 2295, "num_data_blocks": 817, "num_entries": 3440, "num_filter_entries": 3440, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 1759404809, "file_creation_time": 1759405005, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 80594 microseconds, and 13728 cpu microseconds.
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:36:45.419410) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 7990441 bytes OK
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:36:45.419430) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:36:45.424568) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:36:45.424587) EVENT_LOG_v1 {"time_micros": 1759405005424582, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:36:45.424598) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 13566962, prev total WAL file size 13627530, number of live WAL files 2.
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:36:45.427082) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(7803KB) 8(1648B)]
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405005427247, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 7992089, "oldest_snapshot_seqno": -1}
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3187 keys, 7986665 bytes, temperature: kUnknown
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405005580085, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 7986665, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7959159, "index_size": 18436, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8005, "raw_key_size": 76863, "raw_average_key_size": 24, "raw_value_size": 7895264, "raw_average_value_size": 2477, "num_data_blocks": 817, "num_entries": 3187, "num_filter_entries": 3187, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759405005, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:36:45.580485) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 7986665 bytes
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:36:45.587031) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 52.3 rd, 52.2 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(7.6, 0.0 +0.0 blob) out(7.6 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3445, records dropped: 258 output_compression: NoCompression
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:36:45.587071) EVENT_LOG_v1 {"time_micros": 1759405005587052, "job": 4, "event": "compaction_finished", "compaction_time_micros": 152930, "compaction_time_cpu_micros": 32133, "output_level": 6, "num_output_files": 1, "total_output_size": 7986665, "num_input_records": 3445, "num_output_records": 3187, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405005595889, "job": 4, "event": "table_file_deletion", "file_number": 14}
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405005595992, "job": 4, "event": "table_file_deletion", "file_number": 8}
Oct  2 07:36:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:36:45.426950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:36:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:46.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:36:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:36:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:36:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:36:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:47.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:36:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:36:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:36:47 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.d scrub starts
Oct  2 07:36:47 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.d scrub ok
Oct  2 07:36:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:48.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:49.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:36:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:36:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:50.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:36:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:51.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:52.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:52 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.c scrub starts
Oct  2 07:36:52 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.c scrub ok
Oct  2 07:36:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:36:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:53.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:36:53 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:36:53 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:36:53 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.b scrub starts
Oct  2 07:36:53 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.b scrub ok
Oct  2 07:36:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:54.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:36:54 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Oct  2 07:36:54 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Oct  2 07:36:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:55.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:36:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:56.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:36:56 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Oct  2 07:36:56 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Oct  2 07:36:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:57.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:57 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.a scrub starts
Oct  2 07:36:57 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.a scrub ok
Oct  2 07:36:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:36:58.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:58 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.1c deep-scrub starts
Oct  2 07:36:58 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 8.1c deep-scrub ok
Oct  2 07:36:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:36:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:36:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:36:59.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:36:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:36:59 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.15 deep-scrub starts
Oct  2 07:36:59 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.15 deep-scrub ok
Oct  2 07:37:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:00.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:01.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:02.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:02 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Oct  2 07:37:02 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Oct  2 07:37:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:03.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:03 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Oct  2 07:37:03 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Oct  2 07:37:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:04.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:37:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:37:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:05.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:37:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:06.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:06 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Oct  2 07:37:06 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Oct  2 07:37:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:37:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:07.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:37:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:08.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:09.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:37:09 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.10 scrub starts
Oct  2 07:37:09 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.10 scrub ok
Oct  2 07:37:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:10.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:10 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Oct  2 07:37:10 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Oct  2 07:37:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:11.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:12.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:37:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:13.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:37:13 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 7.a scrub starts
Oct  2 07:37:13 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 7.a scrub ok
Oct  2 07:37:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:14.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:37:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:15.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:15 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Oct  2 07:37:15 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Oct  2 07:37:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:16.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:16 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.1 scrub starts
Oct  2 07:37:16 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.1 scrub ok
Oct  2 07:37:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:17.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:17 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.4 scrub starts
Oct  2 07:37:17 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.4 scrub ok
Oct  2 07:37:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:18.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:18 np0005465988 systemd[71899]: Created slice User Background Tasks Slice.
Oct  2 07:37:18 np0005465988 systemd[71899]: Starting Cleanup of User's Temporary Files and Directories...
Oct  2 07:37:18 np0005465988 systemd[71899]: Finished Cleanup of User's Temporary Files and Directories.
Oct  2 07:37:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:19.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:37:19 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Oct  2 07:37:19 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Oct  2 07:37:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:20.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:21.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:22 np0005465988 python3.9[89675]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:37:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:22.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:22 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 7.14 deep-scrub starts
Oct  2 07:37:22 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 7.14 deep-scrub ok
Oct  2 07:37:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:23.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:24 np0005465988 python3.9[89963]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct  2 07:37:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:24.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:37:24 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Oct  2 07:37:24 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Oct  2 07:37:25 np0005465988 python3.9[90116]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct  2 07:37:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:25.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:25 np0005465988 python3.9[90268]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:37:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:26.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:26 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.f scrub starts
Oct  2 07:37:26 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.f scrub ok
Oct  2 07:37:26 np0005465988 python3.9[90420]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct  2 07:37:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:27.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:37:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:28.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:37:28 np0005465988 python3.9[90573]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:37:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:29.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:29 np0005465988 python3.9[90726]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:37:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:37:29 np0005465988 python3.9[90804]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:37:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:30.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:31.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:31 np0005465988 python3.9[90957]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct  2 07:37:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:32.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:32 np0005465988 python3.9[91110]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct  2 07:37:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:33.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:33 np0005465988 python3.9[91264]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:37:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:34.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:34 np0005465988 python3.9[91416]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct  2 07:37:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:37:34 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Oct  2 07:37:34 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Oct  2 07:37:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:35.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:35 np0005465988 python3.9[91569]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:37:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:36.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:37.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:37 np0005465988 python3.9[91723]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:37:37 np0005465988 python3.9[91875]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:37:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:38.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:38 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.1e scrub starts
Oct  2 07:37:38 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.1e scrub ok
Oct  2 07:37:38 np0005465988 python3.9[91953]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:37:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:39.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:37:39 np0005465988 python3.9[92156]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:37:39 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Oct  2 07:37:39 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Oct  2 07:37:39 np0005465988 python3.9[92234]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:37:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:40.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:40 np0005465988 python3.9[92387]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:37:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:41.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:41 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.12 scrub starts
Oct  2 07:37:41 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 10.12 scrub ok
Oct  2 07:37:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:37:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:42.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:37:43 np0005465988 python3.9[92539]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:37:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:43.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:44 np0005465988 python3.9[92691]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct  2 07:37:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:44.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:37:45 np0005465988 python3.9[92842]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:37:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:45.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:45 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.13 deep-scrub starts
Oct  2 07:37:45 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.13 deep-scrub ok
Oct  2 07:37:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:46.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:46 np0005465988 python3.9[92994]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:37:46 np0005465988 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct  2 07:37:46 np0005465988 systemd[1]: tuned.service: Deactivated successfully.
Oct  2 07:37:46 np0005465988 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct  2 07:37:46 np0005465988 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  2 07:37:46 np0005465988 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  2 07:37:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:47.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:47 np0005465988 python3.9[93156]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct  2 07:37:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:48.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:48 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.b scrub starts
Oct  2 07:37:48 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.b scrub ok
Oct  2 07:37:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:49.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:37:49 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Oct  2 07:37:49 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Oct  2 07:37:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:50.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:51.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:51 np0005465988 python3.9[93310]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:37:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:52.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:52 np0005465988 python3.9[93464]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:37:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:53.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:53 np0005465988 systemd[1]: session-35.scope: Deactivated successfully.
Oct  2 07:37:53 np0005465988 systemd[1]: session-35.scope: Consumed 1min 5.211s CPU time.
Oct  2 07:37:53 np0005465988 systemd-logind[827]: Session 35 logged out. Waiting for processes to exit.
Oct  2 07:37:53 np0005465988 systemd-logind[827]: Removed session 35.
Oct  2 07:37:54 np0005465988 podman[93664]: 2025-10-02 11:37:54.072481247 +0000 UTC m=+0.073882930 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Oct  2 07:37:54 np0005465988 podman[93664]: 2025-10-02 11:37:54.174139779 +0000 UTC m=+0.175541452 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct  2 07:37:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:54.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:37:54 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.17 scrub starts
Oct  2 07:37:54 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.17 scrub ok
Oct  2 07:37:54 np0005465988 podman[93800]: 2025-10-02 11:37:54.8863791 +0000 UTC m=+0.058655525 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:37:54 np0005465988 podman[93800]: 2025-10-02 11:37:54.895709117 +0000 UTC m=+0.067985532 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:37:55 np0005465988 podman[93866]: 2025-10-02 11:37:55.11723819 +0000 UTC m=+0.065489760 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, io.buildah.version=1.28.2, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, release=1793, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public)
Oct  2 07:37:55 np0005465988 podman[93866]: 2025-10-02 11:37:55.12985013 +0000 UTC m=+0.078101590 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, version=2.2.4, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Oct  2 07:37:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:37:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:55.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:37:55 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.7 scrub starts
Oct  2 07:37:55 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.7 scrub ok
Oct  2 07:37:56 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:37:56 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:37:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:56.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:57.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:37:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:37:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:37:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:37:58.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:59 np0005465988 systemd-logind[827]: New session 36 of user zuul.
Oct  2 07:37:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:37:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:37:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:37:59.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:37:59 np0005465988 systemd[1]: Started Session 36 of User zuul.
Oct  2 07:37:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:38:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:38:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:00.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:38:00 np0005465988 python3.9[94250]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:38:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:01.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:01 np0005465988 python3.9[94407]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct  2 07:38:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:02.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:02 np0005465988 python3.9[94593]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:38:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:38:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:03.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:38:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:38:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:38:03 np0005465988 python3.9[94695]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:38:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:04.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:38:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:05.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:05 np0005465988 python3.9[94849]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:38:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:06.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:07.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:08.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:08 np0005465988 python3.9[95003]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:38:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:09.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:38:09 np0005465988 python3.9[95157]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:38:09 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.5 scrub starts
Oct  2 07:38:09 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.5 scrub ok
Oct  2 07:38:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:10.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:10 np0005465988 python3.9[95309]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct  2 07:38:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:11.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:11 np0005465988 python3.9[95460]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:38:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:12.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:12 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.8 scrub starts
Oct  2 07:38:12 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.8 scrub ok
Oct  2 07:38:12 np0005465988 python3.9[95619]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:38:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:13.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:14.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:38:14 np0005465988 python3.9[95774]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:38:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:15.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:16.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:16 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Oct  2 07:38:16 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Oct  2 07:38:16 np0005465988 python3.9[96061]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  2 07:38:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:17.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:17 np0005465988 python3.9[96212]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:38:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:38:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:18.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:38:18 np0005465988 python3.9[96366]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:38:18 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.9 scrub starts
Oct  2 07:38:18 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.9 scrub ok
Oct  2 07:38:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:19.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:38:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:20.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:20 np0005465988 python3.9[96570]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:38:20 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.16 deep-scrub starts
Oct  2 07:38:20 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.16 deep-scrub ok
Oct  2 07:38:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:21.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:38:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:22.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:38:22 np0005465988 python3.9[96724]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:38:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:23.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:23 np0005465988 python3.9[96879]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Oct  2 07:38:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:24.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:38:24 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Oct  2 07:38:24 np0005465988 ceph-osd[79039]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Oct  2 07:38:24 np0005465988 systemd[1]: session-36.scope: Deactivated successfully.
Oct  2 07:38:24 np0005465988 systemd[1]: session-36.scope: Consumed 18.882s CPU time.
Oct  2 07:38:24 np0005465988 systemd-logind[827]: Session 36 logged out. Waiting for processes to exit.
Oct  2 07:38:24 np0005465988 systemd-logind[827]: Removed session 36.
Oct  2 07:38:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:25.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:26.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:27.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:28.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:29.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:38:29 np0005465988 systemd-logind[827]: New session 37 of user zuul.
Oct  2 07:38:29 np0005465988 systemd[1]: Started Session 37 of User zuul.
Oct  2 07:38:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:30.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:30 np0005465988 python3.9[97061]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:38:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:31.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:32 np0005465988 python3.9[97215]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:38:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:32.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:33.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:33 np0005465988 python3.9[97409]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:38:33 np0005465988 systemd[1]: session-37.scope: Deactivated successfully.
Oct  2 07:38:33 np0005465988 systemd[1]: session-37.scope: Consumed 2.682s CPU time.
Oct  2 07:38:33 np0005465988 systemd-logind[827]: Session 37 logged out. Waiting for processes to exit.
Oct  2 07:38:33 np0005465988 systemd-logind[827]: Removed session 37.
Oct  2 07:38:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:34.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:38:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:35.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:36.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:37.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:38.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:39.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:39 np0005465988 systemd-logind[827]: New session 38 of user zuul.
Oct  2 07:38:39 np0005465988 systemd[1]: Started Session 38 of User zuul.
Oct  2 07:38:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:38:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:40.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:40 np0005465988 python3.9[97641]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:38:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:41.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:41 np0005465988 python3.9[97796]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:38:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:42.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:42 np0005465988 python3.9[97952]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:38:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:43.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:43 np0005465988 python3.9[98037]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:38:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:44.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:38:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:45.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:45 np0005465988 python3.9[98191]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:38:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:46.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:46 np0005465988 python3.9[98386]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:38:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:47.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:47 np0005465988 python3.9[98539]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:38:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:48.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:48 np0005465988 python3.9[98705]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:38:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:49.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:38:49 np0005465988 python3.9[98783]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:38:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:50.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:50 np0005465988 python3.9[98935]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:38:50 np0005465988 python3.9[99014]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:38:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:51.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:51 np0005465988 python3.9[99166]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:38:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:52.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:52 np0005465988 python3.9[99318]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:38:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:53.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:53 np0005465988 python3.9[99471]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:38:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:54.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:38:54 np0005465988 python3.9[99623]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:38:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:55.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:55 np0005465988 python3.9[99776]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:38:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:56.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:57.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:38:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:38:58.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:38:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:38:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:38:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:38:59.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:38:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:39:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:00.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:01.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:02.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:03 np0005465988 python3.9[100085]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:39:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:03.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:03 np0005465988 podman[100182]: 2025-10-02 11:39:03.436642867 +0000 UTC m=+0.073219512 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Oct  2 07:39:03 np0005465988 podman[100182]: 2025-10-02 11:39:03.546753163 +0000 UTC m=+0.183329768 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct  2 07:39:04 np0005465988 python3.9[100396]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:39:04 np0005465988 podman[100445]: 2025-10-02 11:39:04.184753176 +0000 UTC m=+0.058788373 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:39:04 np0005465988 podman[100445]: 2025-10-02 11:39:04.199597948 +0000 UTC m=+0.073633125 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:39:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:04.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:39:04 np0005465988 podman[100551]: 2025-10-02 11:39:04.411774364 +0000 UTC m=+0.054299972 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, vcs-type=git, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., name=keepalived)
Oct  2 07:39:04 np0005465988 podman[100551]: 2025-10-02 11:39:04.424797163 +0000 UTC m=+0.067322751 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, name=keepalived, version=2.2.4, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, vendor=Red Hat, Inc.)
Oct  2 07:39:04 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:39:04 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:39:04 np0005465988 python3.9[100739]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:39:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:05.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:05 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:39:05 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:39:05 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:39:05 np0005465988 python3.9[100991]: ansible-service_facts Invoked
Oct  2 07:39:05 np0005465988 network[101008]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:39:05 np0005465988 network[101009]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:39:05 np0005465988 network[101010]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:39:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:39:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:06.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:39:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:07.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:08.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:09.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:39:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:10.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:11.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:11 np0005465988 python3.9[101469]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:39:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:12.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:13.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:39:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:39:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:14.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:39:14 np0005465988 python3.9[101673]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct  2 07:39:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:15.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:16 np0005465988 python3.9[101826]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:16.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:16 np0005465988 python3.9[101904]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:17.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:17 np0005465988 python3.9[102057]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:17 np0005465988 python3.9[102135]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:18.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:19.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:39:19 np0005465988 python3.9[102338]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:20.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:21 np0005465988 python3.9[102491]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:39:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:21.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:22 np0005465988 python3.9[102575]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:39:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:22.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:23 np0005465988 systemd[1]: session-38.scope: Deactivated successfully.
Oct  2 07:39:23 np0005465988 systemd[1]: session-38.scope: Consumed 25.987s CPU time.
Oct  2 07:39:23 np0005465988 systemd-logind[827]: Session 38 logged out. Waiting for processes to exit.
Oct  2 07:39:23 np0005465988 systemd-logind[827]: Removed session 38.
Oct  2 07:39:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:23.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:24.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:39:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:25.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:26.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:27.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:28.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:28 np0005465988 systemd-logind[827]: New session 39 of user zuul.
Oct  2 07:39:28 np0005465988 systemd[1]: Started Session 39 of User zuul.
Oct  2 07:39:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:29.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:39:29 np0005465988 python3.9[102761]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:30.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:30 np0005465988 python3.9[102913]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:31 np0005465988 python3.9[102992]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:31.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:31 np0005465988 systemd[1]: session-39.scope: Deactivated successfully.
Oct  2 07:39:31 np0005465988 systemd[1]: session-39.scope: Consumed 1.871s CPU time.
Oct  2 07:39:31 np0005465988 systemd-logind[827]: Session 39 logged out. Waiting for processes to exit.
Oct  2 07:39:31 np0005465988 systemd-logind[827]: Removed session 39.
Oct  2 07:39:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:32.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:39:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:33.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:39:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:34.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:39:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:35.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:36.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:36 np0005465988 systemd-logind[827]: New session 40 of user zuul.
Oct  2 07:39:36 np0005465988 systemd[1]: Started Session 40 of User zuul.
Oct  2 07:39:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:37.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:37 np0005465988 python3.9[103174]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:39:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:38.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:38 np0005465988 python3.9[103330]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:39.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:39:39 np0005465988 python3.9[103556]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:40 np0005465988 python3.9[103634]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.e_whqfmu recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:40.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:41.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:41 np0005465988 python3.9[103787]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:42 np0005465988 python3.9[103865]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.qob0jyul recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:42.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:42 np0005465988 python3.9[104018]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:39:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:43.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:43 np0005465988 python3.9[104170]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:44.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:39:44 np0005465988 python3.9[104248]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:39:45 np0005465988 python3.9[104401]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:45.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:45 np0005465988 python3.9[104479]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:39:46 np0005465988 python3.9[104631]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:46.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:46 np0005465988 python3.9[104784]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:47.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:47 np0005465988 python3.9[104862]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:48 np0005465988 python3.9[105014]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:39:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:48.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:39:48 np0005465988 python3.9[105092]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:39:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:49.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:49 np0005465988 python3.9[105245]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:39:49 np0005465988 systemd[1]: Reloading.
Oct  2 07:39:50 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:39:50 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:39:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:50.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:51 np0005465988 python3.9[105434]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:51.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:51 np0005465988 python3.9[105512]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:52.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:52 np0005465988 python3.9[105664]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:53 np0005465988 python3.9[105743]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:53.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:53 np0005465988 python3.9[105895]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:39:53 np0005465988 systemd[1]: Reloading.
Oct  2 07:39:53 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:39:53 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:39:54 np0005465988 systemd[1]: Starting Create netns directory...
Oct  2 07:39:54 np0005465988 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:39:54 np0005465988 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:39:54 np0005465988 systemd[1]: Finished Create netns directory.
Oct  2 07:39:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:54.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:39:55 np0005465988 python3.9[106090]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:39:55 np0005465988 network[106107]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:39:55 np0005465988 network[106108]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:39:55 np0005465988 network[106109]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:39:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:55.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:56.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:57.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:39:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:39:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:39:58.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:39:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:39:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:39:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:39:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:39:59.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:00 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 07:40:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:40:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:00.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:40:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:40:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:01.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:40:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:40:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:02.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:40:02 np0005465988 python3.9[106427]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.020485) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405203020559, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 2507, "num_deletes": 251, "total_data_size": 5465617, "memory_usage": 5544160, "flush_reason": "Manual Compaction"}
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405203131022, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 3561289, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7275, "largest_seqno": 9777, "table_properties": {"data_size": 3551728, "index_size": 5671, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2821, "raw_key_size": 23520, "raw_average_key_size": 20, "raw_value_size": 3530765, "raw_average_value_size": 3149, "num_data_blocks": 256, "num_entries": 1121, "num_filter_entries": 1121, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759405005, "oldest_key_time": 1759405005, "file_creation_time": 1759405203, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 110581 microseconds, and 12185 cpu microseconds.
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.131077) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 3561289 bytes OK
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.131101) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.135913) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.135959) EVENT_LOG_v1 {"time_micros": 1759405203135946, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.136003) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 5454201, prev total WAL file size 5455927, number of live WAL files 2.
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.139660) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(3477KB)], [15(7799KB)]
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405203139758, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 11547954, "oldest_snapshot_seqno": -1}
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3791 keys, 9851841 bytes, temperature: kUnknown
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405203234026, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 9851841, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9820344, "index_size": 20891, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9541, "raw_key_size": 91315, "raw_average_key_size": 24, "raw_value_size": 9745795, "raw_average_value_size": 2570, "num_data_blocks": 917, "num_entries": 3791, "num_filter_entries": 3791, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759405203, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.234418) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 9851841 bytes
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.236176) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 122.4 rd, 104.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 7.6 +0.0 blob) out(9.4 +0.0 blob), read-write-amplify(6.0) write-amplify(2.8) OK, records in: 4308, records dropped: 517 output_compression: NoCompression
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.236217) EVENT_LOG_v1 {"time_micros": 1759405203236197, "job": 6, "event": "compaction_finished", "compaction_time_micros": 94374, "compaction_time_cpu_micros": 45127, "output_level": 6, "num_output_files": 1, "total_output_size": 9851841, "num_input_records": 4308, "num_output_records": 3791, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405203237984, "job": 6, "event": "table_file_deletion", "file_number": 17}
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405203240407, "job": 6, "event": "table_file_deletion", "file_number": 15}
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.139525) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.240467) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.240473) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.240475) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.240477) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:40:03 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:03.240480) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:40:03 np0005465988 python3.9[106506]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:03.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:04 np0005465988 python3.9[106658]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:04.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:40:04 np0005465988 python3.9[106811]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:05.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:05 np0005465988 python3.9[106889]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:40:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:06.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:40:06 np0005465988 python3.9[107041]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  2 07:40:06 np0005465988 systemd[1]: Starting Time & Date Service...
Oct  2 07:40:06 np0005465988 systemd[1]: Started Time & Date Service.
Oct  2 07:40:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:07.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:07 np0005465988 python3.9[107198]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:08.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:08 np0005465988 python3.9[107350]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:09 np0005465988 python3.9[107429]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:40:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:40:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:09.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:40:09 np0005465988 python3.9[107581]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:10.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:10 np0005465988 python3.9[107659]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.k4eyg14i recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:11 np0005465988 python3.9[107812]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:40:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:11.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:40:11 np0005465988 python3.9[107890]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:40:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:12.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:40:12 np0005465988 python3.9[108042]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:40:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:40:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:13.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:40:13 np0005465988 python3[108335]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  2 07:40:13 np0005465988 podman[108366]: 2025-10-02 11:40:13.825225782 +0000 UTC m=+0.059568932 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Oct  2 07:40:13 np0005465988 podman[108366]: 2025-10-02 11:40:13.942672694 +0000 UTC m=+0.177015844 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct  2 07:40:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:40:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:14.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:40:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:40:14 np0005465988 python3.9[108633]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:14 np0005465988 podman[108651]: 2025-10-02 11:40:14.79566166 +0000 UTC m=+0.339698318 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:40:15 np0005465988 podman[108733]: 2025-10-02 11:40:15.006554193 +0000 UTC m=+0.177602881 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:40:15 np0005465988 podman[108651]: 2025-10-02 11:40:15.01261005 +0000 UTC m=+0.556646718 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:40:15 np0005465988 python3.9[108762]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:15 np0005465988 podman[108802]: 2025-10-02 11:40:15.227868281 +0000 UTC m=+0.049787276 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, io.openshift.tags=Ceph keepalived, release=1793, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, version=2.2.4, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., com.redhat.component=keepalived-container)
Oct  2 07:40:15 np0005465988 podman[108802]: 2025-10-02 11:40:15.241935092 +0000 UTC m=+0.063854077 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, name=keepalived, vendor=Red Hat, Inc., version=2.2.4, distribution-scope=public, io.buildah.version=1.28.2, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, vcs-type=git)
Oct  2 07:40:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:15.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:15 np0005465988 python3.9[109098]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:16 np0005465988 python3.9[109207]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:16.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:40:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:40:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:40:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:40:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:40:17 np0005465988 python3.9[109360]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:40:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:17.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:40:17 np0005465988 python3.9[109438]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:18 np0005465988 python3.9[109590]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:40:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:18.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:40:18 np0005465988 python3.9[109669]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:40:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:40:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:19.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:40:19 np0005465988 python3.9[109821]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:20 np0005465988 python3.9[109949]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:20.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:21 np0005465988 python3.9[110102]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:40:21 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:40:21 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:40:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:21.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:22 np0005465988 python3.9[110307]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:22.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:22 np0005465988 python3.9[110460]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:23.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:23 np0005465988 python3.9[110613]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:40:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:24.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:24 np0005465988 python3.9[110765]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  2 07:40:25 np0005465988 python3.9[110918]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  2 07:40:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:25.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:25 np0005465988 systemd[1]: session-40.scope: Deactivated successfully.
Oct  2 07:40:25 np0005465988 systemd[1]: session-40.scope: Consumed 33.582s CPU time.
Oct  2 07:40:25 np0005465988 systemd-logind[827]: Session 40 logged out. Waiting for processes to exit.
Oct  2 07:40:25 np0005465988 systemd-logind[827]: Removed session 40.
Oct  2 07:40:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:26.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:27.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:28.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:40:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:29.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:30.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:31 np0005465988 systemd-logind[827]: New session 41 of user zuul.
Oct  2 07:40:31 np0005465988 systemd[1]: Started Session 41 of User zuul.
Oct  2 07:40:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:31.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:32 np0005465988 python3.9[111101]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct  2 07:40:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:32.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:33 np0005465988 python3.9[111254]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:40:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000027s ======
Oct  2 07:40:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:33.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Oct  2 07:40:33 np0005465988 python3.9[111408]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Oct  2 07:40:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:40:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:34.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:34 np0005465988 python3.9[111560]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.8tl62v8_ follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:40:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:35.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:40:35 np0005465988 python3.9[111686]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.8tl62v8_ mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405234.170369-109-35339556252290/.source.8tl62v8_ _original_basename=.6qm3u5c1 follow=False checksum=279e85be4c03ef167cd5d1085e11bbf46760f3f2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:36.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:36 np0005465988 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  2 07:40:36 np0005465988 python3.9[111838]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:40:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:40:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:37.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:40:37 np0005465988 python3.9[111993]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDFg5rufAFy1itLjBBGlAJUDsQsaZUavZeI3stNJBLolkBBMB4sBpwAvQFbu2iUhtVavUC7q9xD2LsX0DVBu9DCaQn6tETqUUvMQqzvmaXd34gwo5fH6vo+bjqVdZEih0pIVI1O2OfOUvnv2MFLdKx8MWLQd54beGjWQsC3xCnYVuh0W/aAQtRC2EA77nBo+r40u5V3HXOhdmUbFNvL0r6I8FwP4IvbKC5jkBTtqIzewh+/cyJrURCh0aCpeUjBqNqw3ADhtuR2h5n3ioq+IwPXbhHViJUWQyJ5XKmlSzupEEYA+RV8i1Y3eHJK2RuYlCXkpRP3MEsyBxmISTPhVdQwfxClvyi/mTQkl6k5XFGyZher7KbE6lx4qzp8iCOyOWkw32N3tG0AlnOtPI5HJw8uKbwWl2Apb7RncDQ5fpNOKNFcB1sg61g2Vvew7xJs62OxhkOTiSkEEUYoFfXAqNLiH8gC0+Go12qYleZKbfzL00BDT2boQ2UxYn2rWK7YifU=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIB29M+5Yr1BRNmm2RoLe921umFtraZRFTbdptrBdgsAV#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv7vUjfSNyE5eIqsBh1jfLF/N1YKOXT7KtCRIxAQ1i9+ljB9j4j/dQgL6TGk3m+hQRPyAVxTDwUpeBxHWIpFjU=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2ry6wCZyHJmZsI4Z83U5DYzCaQhL5JmDbykEZokepEcnLFJt20bbnTU0eQzXJylkCgp7rhmpZo7V7qVNnZUMI3aLUHK30Yr5jzQVofHBRg6ZnAIq1MAwqwGH1s6vfNo+//zth4OMHvolMSEO6zSmOWeAsuHM2DTEJ6IdRasKfhOCc3oI/Tcf5vOUyVGg/BH+fFOHKzPiyJNXozsvw2u4ppfdkMJvVC9w2oTNHMIGcDxSsx0zD2bLdYe5l23tFIOaBM149ktg5KPPsKYyQFymOi5qJHHnf9027MqZ4N7Z9SYuQrqt2nY4C/XmaVFOmUIFNNMZ5qMWDsc38V9cHCgurSaMsQ4em1srXr9nzADLh9bw4WksIRfrtt3twMp7FG9fMsw8rdmFt0+4/IdHr/3wCmHeF07qp10kJPXa5z9dApoIKiQlbIl+UCzlaN5tHD6vb4q0MyhqAtU4mSA1zGz67c2lLSGbF4FTgU9yza15FZjHzQ0ArNu/1KIheA8nrpkE=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHwAEaDivXDvJkCgJw1MVhYQArg6qfdDb4SKBZRPdoOc#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNMghqQyWdigdn5yyuBSIQ3tHLq/tZwQO222aoRtckuDI9Ml6snE/xKJ7YWmTvRTsqj2tqCqXIllFFfreYY7Apw=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpTq9G8ymc65djWd0YMUA1/KMKQbBxw7LoyOCyAnPotUx6UyYfBtjYX5I4TzqzEugao1w+4AHDZ5XKSwr8sv9kaSGm0ERmNxz22+5cmKwWxcvUfNGQQXbk6gk6z5p0qpH/Ue9e19xDUC+RDUMGcwrysoGQ05aVcGDaEmNUxvYjj0UUfs45KX/pHPk5xQ4c0WjiL0BfzPJmphY2PAj6O9b4iFA3HjIJgvQ3+i3jEOkvA1FsXm5s7O1/wEjqwsdfKPlX0LUuCqXyxI4uhWY16Ofi89lEtsdQRwFyoZcDMJUDHMH8oJSopUNwwMEe7UBD1MHJSIzrd6NUGnvRjhqH6dE/IoT2X3f4JN/Six+J9ayDqiIkd1QNsJzPBr6G2Lj/dQbUusb3nXhPk5TXKMOXm5i+J940nYQv8/Y9rf2H1qltGaDEOS95ktKpcL6EVplOsQand/Qmb/ShKbiAo2dr3YC3v/FFE2AAj+0Dnh4xob14bhivkYHDhIF0zyzcVGhHZXc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICufzWCrq7lQCIqxq8UNP+WfGRQD+uOEPLr+ZneqofrM#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHP494uEOdMq07v1W25s7bKFki4bQuHkde7xWzYJuUT44SD4tSCrPbQiOkLCqtg9H5yxKL0Ovnl22PYLf1HMKAs=#012 create=True mode=0644 path=/tmp/ansible.8tl62v8_ state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:38.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:38 np0005465988 python3.9[112145]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.8tl62v8_' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:40:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:39.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:39 np0005465988 python3.9[112300]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.8tl62v8_ state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.657060) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405239657099, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 617, "num_deletes": 251, "total_data_size": 1040793, "memory_usage": 1051560, "flush_reason": "Manual Compaction"}
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405239661245, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 503823, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9782, "largest_seqno": 10394, "table_properties": {"data_size": 500929, "index_size": 866, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7335, "raw_average_key_size": 19, "raw_value_size": 494999, "raw_average_value_size": 1334, "num_data_blocks": 37, "num_entries": 371, "num_filter_entries": 371, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759405203, "oldest_key_time": 1759405203, "file_creation_time": 1759405239, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 4219 microseconds, and 1597 cpu microseconds.
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.661288) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 503823 bytes OK
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.661300) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.662386) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.662400) EVENT_LOG_v1 {"time_micros": 1759405239662395, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.662415) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1037339, prev total WAL file size 1037339, number of live WAL files 2.
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.663019) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323532' seq:0, type:0; will stop at (end)
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(492KB)], [18(9620KB)]
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405239663129, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 10355664, "oldest_snapshot_seqno": -1}
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 3657 keys, 7502146 bytes, temperature: kUnknown
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405239718576, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 7502146, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7474906, "index_size": 17018, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9157, "raw_key_size": 89020, "raw_average_key_size": 24, "raw_value_size": 7405924, "raw_average_value_size": 2025, "num_data_blocks": 747, "num_entries": 3657, "num_filter_entries": 3657, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759405239, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.718860) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 7502146 bytes
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.720712) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 186.5 rd, 135.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 9.4 +0.0 blob) out(7.2 +0.0 blob), read-write-amplify(35.4) write-amplify(14.9) OK, records in: 4162, records dropped: 505 output_compression: NoCompression
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.720767) EVENT_LOG_v1 {"time_micros": 1759405239720744, "job": 8, "event": "compaction_finished", "compaction_time_micros": 55520, "compaction_time_cpu_micros": 22520, "output_level": 6, "num_output_files": 1, "total_output_size": 7502146, "num_input_records": 4162, "num_output_records": 3657, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405239721124, "job": 8, "event": "table_file_deletion", "file_number": 20}
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405239724596, "job": 8, "event": "table_file_deletion", "file_number": 18}
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.662868) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.724709) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.724717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.724720) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.724724) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:40:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:40:39.724727) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:40:40 np0005465988 systemd[1]: session-41.scope: Deactivated successfully.
Oct  2 07:40:40 np0005465988 systemd[1]: session-41.scope: Consumed 5.670s CPU time.
Oct  2 07:40:40 np0005465988 systemd-logind[827]: Session 41 logged out. Waiting for processes to exit.
Oct  2 07:40:40 np0005465988 systemd-logind[827]: Removed session 41.
Oct  2 07:40:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:40.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:41.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:42.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:43.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:40:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:44.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:45 np0005465988 systemd-logind[827]: New session 42 of user zuul.
Oct  2 07:40:45 np0005465988 systemd[1]: Started Session 42 of User zuul.
Oct  2 07:40:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:40:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:45.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:40:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:40:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:46.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:40:46 np0005465988 python3.9[112532]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:40:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:40:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:47.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:40:47 np0005465988 python3.9[112689]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  2 07:40:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:48.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:48 np0005465988 python3.9[112843]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:40:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:40:49 np0005465988 python3.9[112997]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:40:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:40:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:49.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:40:50 np0005465988 python3.9[113150]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:40:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:50.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:51 np0005465988 python3.9[113303]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:51.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:51 np0005465988 systemd-logind[827]: Session 42 logged out. Waiting for processes to exit.
Oct  2 07:40:51 np0005465988 systemd[1]: session-42.scope: Deactivated successfully.
Oct  2 07:40:51 np0005465988 systemd[1]: session-42.scope: Consumed 4.151s CPU time.
Oct  2 07:40:51 np0005465988 systemd-logind[827]: Removed session 42.
Oct  2 07:40:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:40:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:52.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:40:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:53.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:40:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:40:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:54.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:40:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:55.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:55 np0005465988 systemd-logind[827]: New session 43 of user zuul.
Oct  2 07:40:55 np0005465988 systemd[1]: Started Session 43 of User zuul.
Oct  2 07:40:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:40:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:56.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:40:57 np0005465988 python3.9[113484]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:40:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:40:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:57.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:40:58 np0005465988 python3.9[113640]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:40:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000027s ======
Oct  2 07:40:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:40:58.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Oct  2 07:40:58 np0005465988 python3.9[113725]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:40:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:40:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:40:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:40:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:40:59.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:00.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:01 np0005465988 python3.9[113927]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:41:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:01.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:02 np0005465988 python3.9[114078]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:41:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:02.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:03 np0005465988 python3.9[114229]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:41:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:03.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:04 np0005465988 python3.9[114379]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:41:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:41:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:04.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:04 np0005465988 systemd[1]: session-43.scope: Deactivated successfully.
Oct  2 07:41:04 np0005465988 systemd[1]: session-43.scope: Consumed 6.203s CPU time.
Oct  2 07:41:04 np0005465988 systemd-logind[827]: Session 43 logged out. Waiting for processes to exit.
Oct  2 07:41:04 np0005465988 systemd-logind[827]: Removed session 43.
Oct  2 07:41:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:05.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:06.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:07.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:08.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:41:09 np0005465988 systemd-logind[827]: New session 44 of user zuul.
Oct  2 07:41:09 np0005465988 systemd[1]: Started Session 44 of User zuul.
Oct  2 07:41:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:09.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:10.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:10 np0005465988 python3.9[114560]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:41:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:11.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:12 np0005465988 python3.9[114717]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:12.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:12 np0005465988 python3.9[114870]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:13.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:13 np0005465988 python3.9[115022]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:41:14 np0005465988 python3.9[115145]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405273.2126558-166-114236271207204/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=8ca9eb936a585b0cd8eed42783dcc42460c48bb9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000027s ======
Oct  2 07:41:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:14.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Oct  2 07:41:15 np0005465988 python3.9[115298]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:15.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:15 np0005465988 python3.9[115421]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405274.5929804-166-242632320393936/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=7abde3fe8c59ace3eb9cb99b75c7e56e5fa913ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:16.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:16 np0005465988 python3.9[115573]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:17 np0005465988 python3.9[115697]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405275.9721909-166-168337680734951/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=8a9ea9924997f1616c677094afd61aeba71ef0fe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:17.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:17 np0005465988 python3.9[115849]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:18.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:18 np0005465988 python3.9[116001]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:41:19 np0005465988 python3.9[116154]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:19.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:20 np0005465988 python3.9[116277]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405278.8280892-353-242496197653212/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=37f181b54f9e902e292d90ad6372cba2d771efd0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:20.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:20 np0005465988 python3.9[116479]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:21 np0005465988 python3.9[116603]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405280.2821856-353-163861554881338/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=c2cb2163d8d8d6ce387c5934cff1fd20c9d087eb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:21.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:22 np0005465988 python3.9[116853]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:22.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:41:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:41:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:41:22 np0005465988 python3.9[117010]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405281.7136154-353-235934118860284/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=b3e70257df736f41dc20f42993353c56022d7935 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:23 np0005465988 python3.9[117163]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:23.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:24 np0005465988 python3.9[117315]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:41:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:24.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:24 np0005465988 python3.9[117468]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:25 np0005465988 python3.9[117591]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405284.4450088-547-222049342947481/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=cf49a220d3c603a3c42004ec0b22c02809571c50 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:25.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:26 np0005465988 python3.9[117743]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:26.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:26 np0005465988 python3.9[117866]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405285.6765437-547-9046569129579/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=c2cb2163d8d8d6ce387c5934cff1fd20c9d087eb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:27 np0005465988 python3.9[118019]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:27.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:27 np0005465988 python3.9[118142]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405286.9053626-547-26216255193203/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=1ef21d8a5025da8a60f3deddff9830d6c5b70d6a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:28.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:29 np0005465988 python3.9[118295]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:41:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:29.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:29 np0005465988 python3.9[118447]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:30.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:30 np0005465988 python3.9[118618]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405289.4523637-760-10174801116012/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=131465f77d8d4faa1442e1beada7324e1814ff9f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:31 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:41:31 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:41:31 np0005465988 python3.9[118773]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:41:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:31.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:41:32 np0005465988 python3.9[118925]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:32.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:32 np0005465988 python3.9[119048]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405291.5611045-836-90784373521963/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=131465f77d8d4faa1442e1beada7324e1814ff9f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:33 np0005465988 python3.9[119201]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:33.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:34 np0005465988 python3.9[119353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:41:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:34.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:34 np0005465988 python3.9[119476]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405293.6981418-913-57002539313063/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=131465f77d8d4faa1442e1beada7324e1814ff9f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:35.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:35 np0005465988 python3.9[119629]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:36.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:36 np0005465988 python3.9[119781]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:37 np0005465988 python3.9[119905]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405296.0217633-992-100566805135866/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=131465f77d8d4faa1442e1beada7324e1814ff9f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:41:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:37.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:41:37 np0005465988 python3.9[120057]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:38.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:38 np0005465988 python3.9[120209]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:39 np0005465988 python3.9[120333]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405298.181757-1068-203365201782579/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=131465f77d8d4faa1442e1beada7324e1814ff9f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:41:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:39.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:40 np0005465988 python3.9[120485]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:40.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:40 np0005465988 python3.9[120687]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:41 np0005465988 python3.9[120811]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405300.2689009-1108-199242608865169/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=131465f77d8d4faa1442e1beada7324e1814ff9f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:41.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:41 np0005465988 systemd-logind[827]: Session 44 logged out. Waiting for processes to exit.
Oct  2 07:41:41 np0005465988 systemd[1]: session-44.scope: Deactivated successfully.
Oct  2 07:41:41 np0005465988 systemd[1]: session-44.scope: Consumed 25.528s CPU time.
Oct  2 07:41:41 np0005465988 systemd-logind[827]: Removed session 44.
Oct  2 07:41:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:42.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:43.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:41:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:41:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:44.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:41:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:41:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:45.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:41:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:46.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:47.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:48.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:49 np0005465988 systemd-logind[827]: New session 45 of user zuul.
Oct  2 07:41:49 np0005465988 systemd[1]: Started Session 45 of User zuul.
Oct  2 07:41:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:41:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:41:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:49.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:41:50 np0005465988 python3.9[120995]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:50.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:50 np0005465988 python3.9[121148]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:51.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:51 np0005465988 python3.9[121271]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405310.2639706-68-147809517783433/.source.conf _original_basename=ceph.conf follow=False checksum=a063547ed46c9b567daa2ad5bf469f0aff0b35ec backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:52 np0005465988 python3.9[121423]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:52.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:53 np0005465988 python3.9[121547]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405311.8380117-68-235437156234857/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=879f4ae20801e566b8dfcda89b2df304e135843d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:53 np0005465988 systemd[1]: session-45.scope: Deactivated successfully.
Oct  2 07:41:53 np0005465988 systemd[1]: session-45.scope: Consumed 2.870s CPU time.
Oct  2 07:41:53 np0005465988 systemd-logind[827]: Session 45 logged out. Waiting for processes to exit.
Oct  2 07:41:53 np0005465988 systemd-logind[827]: Removed session 45.
Oct  2 07:41:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:53.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:41:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:54.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:41:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:55.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:41:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:56.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:41:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:57.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:41:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:41:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:41:58.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:41:59 np0005465988 systemd-logind[827]: New session 46 of user zuul.
Oct  2 07:41:59 np0005465988 systemd[1]: Started Session 46 of User zuul.
Oct  2 07:41:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:41:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:41:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:41:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:41:59.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:42:00 np0005465988 python3.9[121728]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:42:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:00.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:01 np0005465988 python3.9[121935]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:42:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:01.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:02 np0005465988 python3.9[122087]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:42:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:02.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:02 np0005465988 python3.9[122238]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:42:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:03.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:03 np0005465988 python3.9[122390]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct  2 07:42:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:42:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:42:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:04.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:42:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:42:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:05.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:42:05 np0005465988 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Oct  2 07:42:06 np0005465988 python3.9[122547]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:42:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:42:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:06.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:42:07 np0005465988 python3.9[122632]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:42:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:07.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:08.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:42:09 np0005465988 python3.9[122786]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:42:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:09.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:10 np0005465988 python3[122942]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct  2 07:42:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:10.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:11 np0005465988 python3.9[123095]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:11.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:12 np0005465988 python3.9[123247]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:42:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:12.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:42:12 np0005465988 python3.9[123325]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:13 np0005465988 python3.9[123478]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:42:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:13.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:42:13 np0005465988 python3.9[123556]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.ltxg7ymi recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:42:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:42:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:14.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:42:14 np0005465988 python3.9[123708]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:15 np0005465988 python3.9[123787]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:15.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:16 np0005465988 python3.9[123939]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:16.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:17 np0005465988 python3[124093]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  2 07:42:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:42:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:17.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:42:17 np0005465988 python3.9[124245]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:18.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:18 np0005465988 python3.9[124370]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405337.3469634-439-19566884911145/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:42:19 np0005465988 python3.9[124523]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:19.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:20 np0005465988 python3.9[124648]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405338.857552-484-143375110431515/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:20.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:21 np0005465988 python3.9[124851]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:21.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:21 np0005465988 python3.9[124976]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405340.4015942-528-32350101718010/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:22 np0005465988 python3.9[125128]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:22.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:23 np0005465988 python3.9[125254]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405341.8877637-574-99290687329907/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:23.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:23 np0005465988 python3.9[125406]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:42:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:42:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:24.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:42:24 np0005465988 python3.9[125531]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405343.276126-618-86774684602226/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:25 np0005465988 python3.9[125684]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:42:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:25.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:42:26 np0005465988 python3.9[125836]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:42:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:26.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:42:26 np0005465988 python3.9[125992]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000026s ======
Oct  2 07:42:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:27.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Oct  2 07:42:27 np0005465988 python3.9[126144]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:28 np0005465988 python3.9[126297]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:42:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:28.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:29 np0005465988 python3.9[126452]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:42:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000026s ======
Oct  2 07:42:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:29.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Oct  2 07:42:30 np0005465988 python3.9[126607]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:30.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:31 np0005465988 python3.9[126876]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:42:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:31.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:32.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:32 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:42:32 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:42:32 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:42:32 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:42:32 np0005465988 python3.9[127041]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:d8:76:c8:90" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:32 np0005465988 ovs-vsctl[127043]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:d8:76:c8:90 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct  2 07:42:33 np0005465988 python3.9[127195]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000027s ======
Oct  2 07:42:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:33.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Oct  2 07:42:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:42:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:42:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:42:34 np0005465988 python3.9[127350]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:34 np0005465988 ovs-vsctl[127351]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct  2 07:42:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:42:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:34.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:35 np0005465988 python3.9[127502]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:42:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000026s ======
Oct  2 07:42:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:35.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Oct  2 07:42:35 np0005465988 python3.9[127656]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:42:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000026s ======
Oct  2 07:42:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:36.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Oct  2 07:42:36 np0005465988 python3.9[127808]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:37 np0005465988 python3.9[127887]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:42:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:37.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:37 np0005465988 python3.9[128039]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:38 np0005465988 python3.9[128117]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:42:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:38.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:38 np0005465988 python3.9[128270]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:42:39 np0005465988 python3.9[128447]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:39.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:39 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:42:39 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:42:40 np0005465988 python3.9[128550]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000026s ======
Oct  2 07:42:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:40.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Oct  2 07:42:40 np0005465988 python3.9[128702]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:41 np0005465988 python3.9[128831]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:41.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:42 np0005465988 python3.9[128983]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:42:42 np0005465988 systemd[1]: Reloading.
Oct  2 07:42:42 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:42:42 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:42:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:42.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:43 np0005465988 python3.9[129173]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000026s ======
Oct  2 07:42:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:43.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Oct  2 07:42:44 np0005465988 python3.9[129251]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:42:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:44.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:44 np0005465988 python3.9[129403]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:45 np0005465988 python3.9[129482]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:45.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:46 np0005465988 python3.9[129634]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:42:46 np0005465988 systemd[1]: Reloading.
Oct  2 07:42:46 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:42:46 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:42:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:46.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:46 np0005465988 systemd[1]: Starting Create netns directory...
Oct  2 07:42:46 np0005465988 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:42:46 np0005465988 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:42:46 np0005465988 systemd[1]: Finished Create netns directory.
Oct  2 07:42:47 np0005465988 python3.9[129830]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:42:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:47.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:48 np0005465988 python3.9[129982]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000026s ======
Oct  2 07:42:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:48.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Oct  2 07:42:49 np0005465988 python3.9[130106]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405367.8061674-1372-85851706923107/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:42:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:42:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:49.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:50 np0005465988 python3.9[130258]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:42:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000026s ======
Oct  2 07:42:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:50.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Oct  2 07:42:50 np0005465988 python3.9[130410]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:51 np0005465988 python3.9[130534]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405370.322727-1446-205239936142692/.source.json _original_basename=.ynlyrtw6 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:51.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:52 np0005465988 python3.9[130686]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:52.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:53.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:42:54 np0005465988 python3.9[131114]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct  2 07:42:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:54.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:55 np0005465988 python3.9[131267]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:42:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:55.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:56 np0005465988 python3.9[131419]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  2 07:42:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:56.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:57.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:58 np0005465988 python3[131598]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:42:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:42:58.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:42:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:42:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:42:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:42:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:42:59.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:00.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:01.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:02.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000026s ======
Oct  2 07:43:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:03.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Oct  2 07:43:03 np0005465988 podman[131611]: 2025-10-02 11:43:03.891820669 +0000 UTC m=+5.407741250 image pull ae232aa720979600656d94fc26ba957f1cdf5bca825fe9b57990f60c6534611f quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct  2 07:43:04 np0005465988 podman[131783]: 2025-10-02 11:43:04.013970672 +0000 UTC m=+0.023657762 image pull ae232aa720979600656d94fc26ba957f1cdf5bca825fe9b57990f60c6534611f quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct  2 07:43:04 np0005465988 podman[131783]: 2025-10-02 11:43:04.148755855 +0000 UTC m=+0.158442955 container create 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 07:43:04 np0005465988 python3[131598]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct  2 07:43:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:43:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000026s ======
Oct  2 07:43:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:04.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Oct  2 07:43:05 np0005465988 python3.9[131974]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:43:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000026s ======
Oct  2 07:43:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:05.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Oct  2 07:43:05 np0005465988 python3.9[132128]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:06 np0005465988 python3.9[132204]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:43:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000026s ======
Oct  2 07:43:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:06.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Oct  2 07:43:07 np0005465988 python3.9[132356]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759405386.4830704-1710-115217571121282/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:07.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:07 np0005465988 python3.9[132432]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:43:07 np0005465988 systemd[1]: Reloading.
Oct  2 07:43:07 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:43:07 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:43:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000027s ======
Oct  2 07:43:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:08.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Oct  2 07:43:08 np0005465988 python3.9[132543]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:43:08 np0005465988 systemd[1]: Reloading.
Oct  2 07:43:08 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:43:08 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:43:09 np0005465988 systemd[1]: Starting ovn_controller container...
Oct  2 07:43:09 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:43:09 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ba545f80d758371d149a074830e82eecfe878182150236bc21052df85e86247/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  2 07:43:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:43:09 np0005465988 systemd[1]: Started /usr/bin/podman healthcheck run 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0.
Oct  2 07:43:09 np0005465988 podman[132586]: 2025-10-02 11:43:09.572094902 +0000 UTC m=+0.467846587 container init 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 07:43:09 np0005465988 ovn_controller[132601]: + sudo -E kolla_set_configs
Oct  2 07:43:09 np0005465988 podman[132586]: 2025-10-02 11:43:09.606781522 +0000 UTC m=+0.502533227 container start 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Oct  2 07:43:09 np0005465988 systemd[1]: Created slice User Slice of UID 0.
Oct  2 07:43:09 np0005465988 edpm-start-podman-container[132586]: ovn_controller
Oct  2 07:43:09 np0005465988 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  2 07:43:09 np0005465988 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  2 07:43:09 np0005465988 systemd[1]: Starting User Manager for UID 0...
Oct  2 07:43:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:09.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:09 np0005465988 podman[132608]: 2025-10-02 11:43:09.724304183 +0000 UTC m=+0.096370578 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller)
Oct  2 07:43:09 np0005465988 edpm-start-podman-container[132585]: Creating additional drop-in dependency for "ovn_controller" (3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0)
Oct  2 07:43:09 np0005465988 systemd[1]: 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0-6690ff5a707c6045.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:43:09 np0005465988 systemd[1]: 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0-6690ff5a707c6045.service: Failed with result 'exit-code'.
Oct  2 07:43:09 np0005465988 systemd[1]: Reloading.
Oct  2 07:43:09 np0005465988 systemd[132635]: Queued start job for default target Main User Target.
Oct  2 07:43:09 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:43:09 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:43:09 np0005465988 systemd[132635]: Created slice User Application Slice.
Oct  2 07:43:09 np0005465988 systemd[132635]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  2 07:43:09 np0005465988 systemd[132635]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 07:43:09 np0005465988 systemd[132635]: Reached target Paths.
Oct  2 07:43:09 np0005465988 systemd[132635]: Reached target Timers.
Oct  2 07:43:09 np0005465988 systemd[132635]: Starting D-Bus User Message Bus Socket...
Oct  2 07:43:09 np0005465988 systemd[132635]: Starting Create User's Volatile Files and Directories...
Oct  2 07:43:09 np0005465988 systemd[132635]: Finished Create User's Volatile Files and Directories.
Oct  2 07:43:09 np0005465988 systemd[132635]: Listening on D-Bus User Message Bus Socket.
Oct  2 07:43:09 np0005465988 systemd[132635]: Reached target Sockets.
Oct  2 07:43:09 np0005465988 systemd[132635]: Reached target Basic System.
Oct  2 07:43:09 np0005465988 systemd[132635]: Reached target Main User Target.
Oct  2 07:43:09 np0005465988 systemd[132635]: Startup finished in 157ms.
Oct  2 07:43:10 np0005465988 systemd[1]: Started User Manager for UID 0.
Oct  2 07:43:10 np0005465988 systemd[1]: Started ovn_controller container.
Oct  2 07:43:10 np0005465988 systemd[1]: Started Session c1 of User root.
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: INFO:__main__:Validating config file
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: INFO:__main__:Writing out command to execute
Oct  2 07:43:10 np0005465988 systemd[1]: session-c1.scope: Deactivated successfully.
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: ++ cat /run_command
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: + ARGS=
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: + sudo kolla_copy_cacerts
Oct  2 07:43:10 np0005465988 systemd[1]: Started Session c2 of User root.
Oct  2 07:43:10 np0005465988 systemd[1]: session-c2.scope: Deactivated successfully.
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: + [[ ! -n '' ]]
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: + . kolla_extend_start
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: + umask 0022
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Oct  2 07:43:10 np0005465988 NetworkManager[45041]: <info>  [1759405390.2192] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Oct  2 07:43:10 np0005465988 NetworkManager[45041]: <info>  [1759405390.2200] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:43:10 np0005465988 NetworkManager[45041]: <info>  [1759405390.2212] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Oct  2 07:43:10 np0005465988 NetworkManager[45041]: <info>  [1759405390.2219] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Oct  2 07:43:10 np0005465988 NetworkManager[45041]: <info>  [1759405390.2223] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  2 07:43:10 np0005465988 kernel: br-int: entered promiscuous mode
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00014|main|INFO|OVS feature set changed, force recompute.
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00022|main|INFO|OVS feature set changed, force recompute.
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  2 07:43:10 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:10Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  2 07:43:10 np0005465988 NetworkManager[45041]: <info>  [1759405390.2430] manager: (ovn-6718a9-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Oct  2 07:43:10 np0005465988 NetworkManager[45041]: <info>  [1759405390.2444] manager: (ovn-672825-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Oct  2 07:43:10 np0005465988 NetworkManager[45041]: <info>  [1759405390.2456] manager: (ovn-e4c887-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Oct  2 07:43:10 np0005465988 systemd-udevd[132748]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:43:10 np0005465988 NetworkManager[45041]: <info>  [1759405390.2787] device (genev_sys_6081): carrier: link connected
Oct  2 07:43:10 np0005465988 NetworkManager[45041]: <info>  [1759405390.2789] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Oct  2 07:43:10 np0005465988 kernel: genev_sys_6081: entered promiscuous mode
Oct  2 07:43:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:10.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:10 np0005465988 python3.9[132868]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:43:10 np0005465988 ovs-vsctl[132870]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct  2 07:43:11 np0005465988 python3.9[133022]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:43:11 np0005465988 ovs-vsctl[133024]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct  2 07:43:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:11.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:12.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:12 np0005465988 python3.9[133177]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:43:12 np0005465988 ovs-vsctl[133179]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct  2 07:43:13 np0005465988 systemd[1]: session-46.scope: Deactivated successfully.
Oct  2 07:43:13 np0005465988 systemd[1]: session-46.scope: Consumed 1min 402ms CPU time.
Oct  2 07:43:13 np0005465988 systemd-logind[827]: Session 46 logged out. Waiting for processes to exit.
Oct  2 07:43:13 np0005465988 systemd-logind[827]: Removed session 46.
Oct  2 07:43:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:13.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:43:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000026s ======
Oct  2 07:43:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:14.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Oct  2 07:43:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:15.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:16.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:17.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:18.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:18 np0005465988 systemd-logind[827]: New session 48 of user zuul.
Oct  2 07:43:18 np0005465988 systemd[1]: Started Session 48 of User zuul.
Oct  2 07:43:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:43:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:19.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:19 np0005465988 python3.9[133361]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:43:20 np0005465988 systemd[1]: Stopping User Manager for UID 0...
Oct  2 07:43:20 np0005465988 systemd[132635]: Activating special unit Exit the Session...
Oct  2 07:43:20 np0005465988 systemd[132635]: Stopped target Main User Target.
Oct  2 07:43:20 np0005465988 systemd[132635]: Stopped target Basic System.
Oct  2 07:43:20 np0005465988 systemd[132635]: Stopped target Paths.
Oct  2 07:43:20 np0005465988 systemd[132635]: Stopped target Sockets.
Oct  2 07:43:20 np0005465988 systemd[132635]: Stopped target Timers.
Oct  2 07:43:20 np0005465988 systemd[132635]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 07:43:20 np0005465988 systemd[132635]: Closed D-Bus User Message Bus Socket.
Oct  2 07:43:20 np0005465988 systemd[132635]: Stopped Create User's Volatile Files and Directories.
Oct  2 07:43:20 np0005465988 systemd[132635]: Removed slice User Application Slice.
Oct  2 07:43:20 np0005465988 systemd[132635]: Reached target Shutdown.
Oct  2 07:43:20 np0005465988 systemd[132635]: Finished Exit the Session.
Oct  2 07:43:20 np0005465988 systemd[132635]: Reached target Exit the Session.
Oct  2 07:43:20 np0005465988 systemd[1]: user@0.service: Deactivated successfully.
Oct  2 07:43:20 np0005465988 systemd[1]: Stopped User Manager for UID 0.
Oct  2 07:43:20 np0005465988 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  2 07:43:20 np0005465988 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  2 07:43:20 np0005465988 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  2 07:43:20 np0005465988 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  2 07:43:20 np0005465988 systemd[1]: Removed slice User Slice of UID 0.
Oct  2 07:43:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:20.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:20 np0005465988 python3.9[133522]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:21 np0005465988 python3.9[133724]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:43:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:21.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:43:22 np0005465988 python3.9[133876]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:43:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:22.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:43:23 np0005465988 python3.9[134029]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:23.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:23 np0005465988 python3.9[134181]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:43:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:24.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:24 np0005465988 python3.9[134331]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:43:25 np0005465988 python3.9[134484]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct  2 07:43:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:25.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:26.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:27 np0005465988 python3.9[134635]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:27.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:27 np0005465988 python3.9[134756]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405406.4761553-225-186107444456863/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:28.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:28 np0005465988 python3.9[134907]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 07:43:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2116 writes, 11K keys, 2116 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s#012Cumulative WAL: 2116 writes, 2116 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2116 writes, 11K keys, 2116 commit groups, 1.0 writes per commit group, ingest: 23.24 MB, 0.04 MB/s#012Interval WAL: 2116 writes, 2116 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     58.3      0.20              0.03         4    0.049       0      0       0.0       0.0#012  L6      1/0    7.15 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.1     94.1     79.8      0.30              0.10         3    0.101     11K   1280       0.0       0.0#012 Sum      1/0    7.15 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.1     57.0     71.3      0.50              0.13         7    0.071     11K   1280       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.1     57.2     71.6      0.50              0.13         6    0.083     11K   1280       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0     94.1     79.8      0.30              0.10         3    0.101     11K   1280       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     58.8      0.20              0.03         3    0.065       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.011, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.03 GB write, 0.06 MB/s write, 0.03 GB read, 0.05 MB/s read, 0.5 seconds#012Interval compaction: 0.03 GB write, 0.06 MB/s write, 0.03 GB read, 0.05 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b3e97ad1f0#2 capacity: 308.00 MB usage: 1.01 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 5.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(52,905.30 KB,0.287039%) FilterBlock(7,40.30 KB,0.0127768%) IndexBlock(7,91.45 KB,0.0289967%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct  2 07:43:29 np0005465988 python3.9[135029]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405408.1198754-270-83754053776736/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:43:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:29.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:30.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:31.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:32 np0005465988 python3.9[135182]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:43:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:32.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:33.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:34.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:43:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:35.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:36.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:37.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:38.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:38 np0005465988 python3.9[135267]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:43:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:39.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:39 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:39Z|00025|memory|INFO|16000 kB peak resident set size after 29.7 seconds
Oct  2 07:43:39 np0005465988 ovn_controller[132601]: 2025-10-02T11:43:39Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Oct  2 07:43:39 np0005465988 podman[135374]: 2025-10-02 11:43:39.956129048 +0000 UTC m=+0.120727499 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 07:43:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:40.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:43:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct  2 07:43:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:43:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:43:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct  2 07:43:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Oct  2 07:43:41 np0005465988 python3.9[135704]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:43:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:41.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:42 np0005465988 python3.9[135907]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:43:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:43:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:43:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:42.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:42 np0005465988 python3.9[136028]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405421.6975477-382-225544207457932/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:43 np0005465988 python3.9[136179]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:43:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:43.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:43:44 np0005465988 python3.9[136300]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405423.010386-382-220873986950125/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:44.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:45 np0005465988 python3.9[136451]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:43:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:45.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:46 np0005465988 python3.9[136572]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405425.0020485-513-188852116602071/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:46.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:46 np0005465988 python3.9[136722]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:47 np0005465988 python3.9[136844]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405426.2850654-513-220130000389197/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:47.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:48 np0005465988 python3.9[136994]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:43:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:48.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:49 np0005465988 python3.9[137149]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:49.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:49 np0005465988 python3.9[137301]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:50 np0005465988 python3.9[137379]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:50.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:50.827225) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405430827304, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1970, "num_deletes": 252, "total_data_size": 4958282, "memory_usage": 5020192, "flush_reason": "Manual Compaction"}
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405430917899, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 3253241, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10399, "largest_seqno": 12364, "table_properties": {"data_size": 3245113, "index_size": 5007, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16060, "raw_average_key_size": 19, "raw_value_size": 3228885, "raw_average_value_size": 3966, "num_data_blocks": 224, "num_entries": 814, "num_filter_entries": 814, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759405240, "oldest_key_time": 1759405240, "file_creation_time": 1759405430, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 90761 microseconds, and 12428 cpu microseconds.
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:50.917995) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 3253241 bytes OK
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:50.918023) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:50.924708) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:50.924740) EVENT_LOG_v1 {"time_micros": 1759405430924731, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:50.924766) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 4949544, prev total WAL file size 4949808, number of live WAL files 2.
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:50.926978) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(3176KB)], [21(7326KB)]
Oct  2 07:43:50 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405430927275, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 10755387, "oldest_snapshot_seqno": -1}
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 3949 keys, 8581673 bytes, temperature: kUnknown
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405431179818, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 8581673, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8552182, "index_size": 18501, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9925, "raw_key_size": 95743, "raw_average_key_size": 24, "raw_value_size": 8477724, "raw_average_value_size": 2146, "num_data_blocks": 802, "num_entries": 3949, "num_filter_entries": 3949, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759405430, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:51.180175) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 8581673 bytes
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:51.185461) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 42.6 rd, 34.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 7.2 +0.0 blob) out(8.2 +0.0 blob), read-write-amplify(5.9) write-amplify(2.6) OK, records in: 4471, records dropped: 522 output_compression: NoCompression
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:51.185495) EVENT_LOG_v1 {"time_micros": 1759405431185478, "job": 10, "event": "compaction_finished", "compaction_time_micros": 252640, "compaction_time_cpu_micros": 30916, "output_level": 6, "num_output_files": 1, "total_output_size": 8581673, "num_input_records": 4471, "num_output_records": 3949, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405431186555, "job": 10, "event": "table_file_deletion", "file_number": 23}
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405431188942, "job": 10, "event": "table_file_deletion", "file_number": 21}
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:50.926823) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:51.189037) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:51.189043) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:51.189045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:51.189047) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:43:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:43:51.189048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:43:51 np0005465988 python3.9[137532]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:51.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:51 np0005465988 python3.9[137610]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:52.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:52 np0005465988 python3.9[137812]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:43:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:43:53 np0005465988 python3.9[137965]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:53.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:53 np0005465988 python3.9[138043]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:54.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:54 np0005465988 python3.9[138195]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:55 np0005465988 python3.9[138274]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:43:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:43:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:55.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:43:56 np0005465988 python3.9[138426]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:43:56 np0005465988 systemd[1]: Reloading.
Oct  2 07:43:56 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:43:56 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:43:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:56.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:57 np0005465988 python3.9[138616]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:43:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:57.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:43:57 np0005465988 python3.9[138694]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:43:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:43:58.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:43:58 np0005465988 python3.9[138846]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:59 np0005465988 python3.9[138925]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:43:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:43:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:43:59.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:44:00 np0005465988 python3.9[139077]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:00 np0005465988 systemd[1]: Reloading.
Oct  2 07:44:00 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:00 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:00 np0005465988 systemd[1]: Starting Create netns directory...
Oct  2 07:44:00 np0005465988 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:44:00 np0005465988 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:44:00 np0005465988 systemd[1]: Finished Create netns directory.
Oct  2 07:44:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:00.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:44:01 np0005465988 python3.9[139271]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:01.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:02 np0005465988 python3.9[139473]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:02.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:02 np0005465988 python3.9[139596]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405441.7214882-966-131982713500329/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:03.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:03 np0005465988 python3.9[139749]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:04 np0005465988 python3.9[139901]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:04.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:05 np0005465988 python3.9[140025]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405444.1055088-1041-122217680238892/.source.json _original_basename=.z_hj2ph4 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:44:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:44:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:05.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:44:06 np0005465988 python3.9[140177]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 07:44:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 601.0 total, 600.0 interval#012Cumulative writes: 5106 writes, 22K keys, 5106 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5106 writes, 731 syncs, 6.98 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5106 writes, 22K keys, 5106 commit groups, 1.0 writes per commit group, ingest: 18.20 MB, 0.03 MB/s#012Interval WAL: 5106 writes, 731 syncs, 6.98 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 601.0 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x56127d005770#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 601.0 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x56127d005770#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 601.0 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowd
Oct  2 07:44:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:44:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:06.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:44:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:07.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:08 np0005465988 python3.9[140605]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct  2 07:44:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:08.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:09 np0005465988 python3.9[140758]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:44:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:09.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:10 np0005465988 podman[140882]: 2025-10-02 11:44:10.304689708 +0000 UTC m=+0.102591128 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 07:44:10 np0005465988 python3.9[140930]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  2 07:44:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:10.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:44:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:11.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:12 np0005465988 python3[141115]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:44:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:44:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:12.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:44:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:13.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:14.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:44:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:15.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:16.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:17.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:18.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:19.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:20 np0005465988 podman[141129]: 2025-10-02 11:44:20.159521539 +0000 UTC m=+7.883273597 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 07:44:20 np0005465988 podman[141251]: 2025-10-02 11:44:20.353343648 +0000 UTC m=+0.055524606 container create daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:44:20 np0005465988 podman[141251]: 2025-10-02 11:44:20.324122088 +0000 UTC m=+0.026303086 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 07:44:20 np0005465988 python3[141115]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 07:44:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:44:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:20.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:44:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:44:21 np0005465988 python3.9[141442]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:44:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:21.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:22 np0005465988 python3.9[141646]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:22 np0005465988 python3.9[141722]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:44:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:22.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:23 np0005465988 python3.9[141874]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759405462.6609402-1305-43435679754210/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:23.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:23 np0005465988 python3.9[141950]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:44:23 np0005465988 systemd[1]: Reloading.
Oct  2 07:44:24 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:24 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:44:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:24.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:44:24 np0005465988 python3.9[142062]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:24 np0005465988 systemd[1]: Reloading.
Oct  2 07:44:24 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:24 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:25 np0005465988 systemd[1]: Starting ovn_metadata_agent container...
Oct  2 07:44:25 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:44:25 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eb7108148d2d41f2f8ba18bebdfe8d331e30458116f44d0477a48687c250876/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct  2 07:44:25 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eb7108148d2d41f2f8ba18bebdfe8d331e30458116f44d0477a48687c250876/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 07:44:25 np0005465988 systemd[1]: Started /usr/bin/podman healthcheck run daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501.
Oct  2 07:44:25 np0005465988 podman[142103]: 2025-10-02 11:44:25.261065763 +0000 UTC m=+0.139233722 container init daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: + sudo -E kolla_set_configs
Oct  2 07:44:25 np0005465988 podman[142103]: 2025-10-02 11:44:25.288132071 +0000 UTC m=+0.166300030 container start daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 07:44:25 np0005465988 edpm-start-podman-container[142103]: ovn_metadata_agent
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Validating config file
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Copying service configuration files
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Writing out command to execute
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Setting permission for /var/lib/neutron
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: ++ cat /run_command
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: + CMD=neutron-ovn-metadata-agent
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: + ARGS=
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: + sudo kolla_copy_cacerts
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: + [[ ! -n '' ]]
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: + . kolla_extend_start
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: Running command: 'neutron-ovn-metadata-agent'
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: + umask 0022
Oct  2 07:44:25 np0005465988 ovn_metadata_agent[142119]: + exec neutron-ovn-metadata-agent
Oct  2 07:44:25 np0005465988 edpm-start-podman-container[142102]: Creating additional drop-in dependency for "ovn_metadata_agent" (daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501)
Oct  2 07:44:25 np0005465988 podman[142126]: 2025-10-02 11:44:25.396204156 +0000 UTC m=+0.097449021 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 07:44:25 np0005465988 systemd[1]: Reloading.
Oct  2 07:44:25 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:25 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:25 np0005465988 systemd[1]: Started ovn_metadata_agent container.
Oct  2 07:44:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:44:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:44:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:25.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:44:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:26.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:26 np0005465988 systemd[1]: session-48.scope: Deactivated successfully.
Oct  2 07:44:26 np0005465988 systemd[1]: session-48.scope: Consumed 57.334s CPU time.
Oct  2 07:44:26 np0005465988 systemd-logind[827]: Session 48 logged out. Waiting for processes to exit.
Oct  2 07:44:26 np0005465988 systemd-logind[827]: Removed session 48.
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.267 142124 INFO neutron.common.config [-] Logging enabled!#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.267 142124 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.267 142124 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.268 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.268 142124 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.268 142124 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.268 142124 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.268 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.269 142124 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.269 142124 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.269 142124 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.269 142124 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.269 142124 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.269 142124 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.269 142124 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.270 142124 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.270 142124 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.270 142124 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.270 142124 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.270 142124 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.270 142124 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.270 142124 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.271 142124 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.271 142124 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.271 142124 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.271 142124 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.271 142124 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.271 142124 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.271 142124 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.271 142124 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.272 142124 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.272 142124 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.272 142124 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.272 142124 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.272 142124 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.272 142124 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.272 142124 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.273 142124 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.273 142124 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.273 142124 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.273 142124 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.273 142124 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.273 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.274 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.274 142124 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.274 142124 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.274 142124 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.274 142124 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.274 142124 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.274 142124 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.274 142124 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.274 142124 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.275 142124 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.275 142124 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.275 142124 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.275 142124 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.275 142124 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.275 142124 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.275 142124 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.276 142124 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.276 142124 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.276 142124 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.276 142124 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.276 142124 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.276 142124 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.276 142124 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.277 142124 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.277 142124 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.277 142124 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.277 142124 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.277 142124 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.277 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.277 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.278 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.278 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.278 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.278 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.278 142124 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.278 142124 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.278 142124 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.278 142124 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.279 142124 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.279 142124 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.279 142124 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.279 142124 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.279 142124 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.279 142124 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.279 142124 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.280 142124 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.280 142124 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.280 142124 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.280 142124 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.280 142124 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.280 142124 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.280 142124 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.280 142124 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.281 142124 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.281 142124 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.281 142124 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.281 142124 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.281 142124 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.281 142124 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.281 142124 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.281 142124 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.282 142124 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.282 142124 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.282 142124 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.282 142124 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.282 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.282 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.282 142124 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.283 142124 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.283 142124 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.283 142124 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.283 142124 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.283 142124 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.283 142124 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.283 142124 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.284 142124 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.284 142124 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.284 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.284 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.284 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.284 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.284 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.285 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.285 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.285 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.285 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.286 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.286 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.286 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.286 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.287 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.287 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.287 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.287 142124 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.288 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.288 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.288 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.288 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.289 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.289 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.289 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.289 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.290 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.290 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.290 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.290 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.290 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.290 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.291 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.291 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.291 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.291 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.291 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.291 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.291 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.291 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.291 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.291 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.292 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.292 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.292 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.292 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.292 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.292 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.292 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.292 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.292 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.293 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.293 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.293 142124 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.293 142124 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.293 142124 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.293 142124 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.293 142124 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.293 142124 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.294 142124 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.294 142124 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.294 142124 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.294 142124 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.294 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.294 142124 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.294 142124 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.294 142124 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.294 142124 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.295 142124 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.295 142124 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.295 142124 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.295 142124 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.295 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.295 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.295 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.295 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.295 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.296 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.296 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.296 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.296 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.296 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.296 142124 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.296 142124 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.296 142124 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.296 142124 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.297 142124 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.297 142124 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.297 142124 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.297 142124 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.297 142124 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.297 142124 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.297 142124 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.297 142124 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.297 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.298 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.298 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.298 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.298 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.298 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.298 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.298 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.298 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.298 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.298 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.299 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.299 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.299 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.299 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.299 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.299 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.299 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.299 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.299 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.299 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.300 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.300 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.300 142124 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.300 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.300 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.300 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.300 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.300 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.301 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.301 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.301 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.301 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.301 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.301 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.301 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.301 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.301 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.302 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.302 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.302 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.302 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.302 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.302 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.302 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.302 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.303 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.303 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.303 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.303 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.303 142124 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.303 142124 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.303 142124 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.303 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.303 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.303 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.304 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.304 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.304 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.304 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.304 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.304 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.304 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.304 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.304 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.304 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.305 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.305 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.305 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.305 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.305 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.305 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.305 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.305 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.305 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.306 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.306 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.306 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.306 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.306 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.306 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.306 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.306 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.307 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.307 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.307 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.307 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.307 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.307 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.307 142124 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.307 142124 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.316 142124 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.316 142124 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.316 142124 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.316 142124 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.317 142124 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Oct  2 07:44:27 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.330 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 90028908-5ebc-4bb4-8a1f-92ec79bb27aa (UUID: 90028908-5ebc-4bb4-8a1f-92ec79bb27aa) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.352 142124 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.352 142124 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.352 142124 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.352 142124 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.355 142124 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.361 142124 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.367 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '90028908-5ebc-4bb4-8a1f-92ec79bb27aa'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], external_ids={}, name=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, nb_cfg_timestamp=1759405398289, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.367 142124 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fcdc7032f40>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.368 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.368 142124 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.369 142124 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.369 142124 INFO oslo_service.service [-] Starting 1 workers
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.374 142124 DEBUG oslo_service.service [-] Started child 142241 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.376 142241 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-168418'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.378 142124 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmps0mqvzyk/privsep.sock']
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.398 142241 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.399 142241 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.399 142241 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.402 142241 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.408 142241 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Oct  2 07:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.413 142241 INFO eventlet.wsgi.server [-] (142241) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Oct  2 07:44:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:44:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:27.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:44:27 np0005465988 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct  2 07:44:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:28.042 142124 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 07:44:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:28.043 142124 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmps0mqvzyk/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  2 07:44:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.926 142246 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 07:44:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.930 142246 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 07:44:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.933 142246 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Oct  2 07:44:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:27.933 142246 INFO oslo.privsep.daemon [-] privsep daemon running as pid 142246#033[00m
Oct  2 07:44:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:28.046 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[1682fee7-84ab-4123-bc1e-fc389bcd0094]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:44:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:28.529 142246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:44:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:28.529 142246 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:44:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:28.529 142246 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:44:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:28.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.059 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[2caac389-a9fc-4389-9fba-d03bf2264c2c]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.064 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, column=external_ids, values=({'neutron:ovn-metadata-id': '33f29854-6de8-5ced-8f89-4043e33bbaff'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.076 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.083 142124 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.083 142124 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.084 142124 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.084 142124 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.084 142124 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.084 142124 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.084 142124 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.085 142124 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.085 142124 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.085 142124 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.085 142124 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.085 142124 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.086 142124 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.086 142124 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.086 142124 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.086 142124 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.086 142124 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.087 142124 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.087 142124 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.087 142124 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.087 142124 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.087 142124 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.087 142124 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.088 142124 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.088 142124 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.088 142124 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.088 142124 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.088 142124 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.089 142124 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.089 142124 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.089 142124 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.089 142124 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.089 142124 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.090 142124 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.090 142124 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.090 142124 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.090 142124 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.091 142124 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.091 142124 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.091 142124 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.091 142124 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.091 142124 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.092 142124 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.092 142124 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.092 142124 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.092 142124 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.093 142124 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.093 142124 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.093 142124 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.093 142124 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.094 142124 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.094 142124 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.094 142124 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.094 142124 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.094 142124 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.094 142124 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.094 142124 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.095 142124 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.095 142124 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.095 142124 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.095 142124 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.095 142124 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.095 142124 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.096 142124 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.096 142124 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.096 142124 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.096 142124 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.097 142124 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.097 142124 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.097 142124 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.097 142124 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.097 142124 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.097 142124 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.097 142124 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.098 142124 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.098 142124 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.098 142124 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.098 142124 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.098 142124 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.098 142124 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.099 142124 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.099 142124 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.099 142124 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.099 142124 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.099 142124 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.099 142124 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.099 142124 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.100 142124 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.100 142124 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.100 142124 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.100 142124 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.100 142124 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.100 142124 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.101 142124 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.101 142124 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.101 142124 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.101 142124 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.101 142124 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.101 142124 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.102 142124 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.102 142124 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.102 142124 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.102 142124 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.102 142124 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.102 142124 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.103 142124 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.103 142124 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.103 142124 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.103 142124 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.104 142124 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.104 142124 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.104 142124 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.104 142124 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.104 142124 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.104 142124 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.105 142124 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.105 142124 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.105 142124 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.105 142124 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.105 142124 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.106 142124 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.106 142124 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.106 142124 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.106 142124 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.106 142124 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.107 142124 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.107 142124 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.107 142124 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.107 142124 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.107 142124 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.107 142124 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.108 142124 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.108 142124 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.108 142124 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.108 142124 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.108 142124 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.109 142124 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.109 142124 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.109 142124 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.109 142124 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.109 142124 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.110 142124 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.110 142124 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.110 142124 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.110 142124 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.110 142124 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.110 142124 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.110 142124 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.111 142124 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.111 142124 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.111 142124 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.111 142124 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.111 142124 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.112 142124 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.112 142124 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.112 142124 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.112 142124 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.112 142124 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.113 142124 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.113 142124 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.113 142124 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.113 142124 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.113 142124 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.114 142124 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.114 142124 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.114 142124 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.114 142124 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.114 142124 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.115 142124 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.115 142124 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.115 142124 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.115 142124 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.116 142124 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.116 142124 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.116 142124 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.116 142124 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.116 142124 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.117 142124 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.117 142124 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.117 142124 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.117 142124 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.117 142124 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.118 142124 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.118 142124 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.118 142124 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.118 142124 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.118 142124 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.119 142124 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.119 142124 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.119 142124 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.119 142124 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.119 142124 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.120 142124 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.120 142124 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.120 142124 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.120 142124 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.121 142124 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.121 142124 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.121 142124 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.121 142124 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.122 142124 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.122 142124 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.122 142124 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.122 142124 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.122 142124 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.123 142124 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.123 142124 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.123 142124 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.123 142124 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.123 142124 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.123 142124 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.124 142124 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.124 142124 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.124 142124 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.124 142124 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.124 142124 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.124 142124 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.125 142124 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.125 142124 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.125 142124 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.125 142124 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.125 142124 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.125 142124 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.126 142124 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.126 142124 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.126 142124 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.126 142124 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.126 142124 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.126 142124 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.126 142124 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.127 142124 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.127 142124 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.127 142124 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.127 142124 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.127 142124 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.127 142124 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.128 142124 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.128 142124 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.128 142124 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.128 142124 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.128 142124 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.129 142124 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.129 142124 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.129 142124 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.129 142124 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.129 142124 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.130 142124 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.130 142124 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.130 142124 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.130 142124 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.130 142124 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.131 142124 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.131 142124 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.131 142124 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.131 142124 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.132 142124 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.132 142124 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.132 142124 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.132 142124 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.132 142124 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.132 142124 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.133 142124 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.133 142124 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.133 142124 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.133 142124 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.134 142124 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.134 142124 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.134 142124 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.134 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.134 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.134 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.135 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.135 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.135 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.135 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.135 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.135 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.136 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.136 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.136 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.136 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.136 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.136 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.136 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.137 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.137 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.137 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.137 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.137 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.137 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.138 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.138 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.138 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.138 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.138 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.139 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.139 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.139 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.139 142124 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.139 142124 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.139 142124 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.140 142124 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.140 142124 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:44:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:44:29.140 142124 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  2 07:44:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:29.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:30.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:44:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:44:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:31.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:44:32 np0005465988 systemd-logind[827]: New session 49 of user zuul.
Oct  2 07:44:32 np0005465988 systemd[1]: Started Session 49 of User zuul.
Oct  2 07:44:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:32.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:33 np0005465988 python3.9[142407]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:44:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:33.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:34 np0005465988 python3.9[142563]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:34.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:44:35 np0005465988 python3.9[142729]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:44:35 np0005465988 systemd[1]: Reloading.
Oct  2 07:44:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:35.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:35 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:35 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:36.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:36 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Oct  2 07:44:36 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Oct  2 07:44:37 np0005465988 python3.9[142914]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:44:37 np0005465988 network[142931]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:44:37 np0005465988 network[142932]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:44:37 np0005465988 network[142933]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:44:37 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Oct  2 07:44:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:37.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:38.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:39.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:40 np0005465988 podman[142980]: 2025-10-02 11:44:40.610535797 +0000 UTC m=+0.145017516 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 07:44:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:40.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:44:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:41.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:42.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:43 np0005465988 python3.9[143277]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:43.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:44 np0005465988 python3.9[143430]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:44:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:44.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:44:44 np0005465988 python3.9[143583]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:45 np0005465988 python3.9[143737]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:45.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:44:46 np0005465988 python3.9[143890]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:46.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:47 np0005465988 python3.9[144044]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:47.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:48 np0005465988 python3.9[144197]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:48.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:49 np0005465988 python3.9[144351]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:49.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:50 np0005465988 python3.9[144503]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:44:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:50.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:44:50 np0005465988 python3.9[144655]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:44:51 np0005465988 python3.9[144808]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:51.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:52 np0005465988 python3.9[144960]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:52.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:52 np0005465988 python3.9[145167]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:53 np0005465988 python3.9[145396]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:53.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:54 np0005465988 python3.9[145548]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:54.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:54 np0005465988 python3.9[145700]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct  2 07:44:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:44:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:44:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:44:55 np0005465988 podman[145854]: 2025-10-02 11:44:55.542449567 +0000 UTC m=+0.072038954 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 07:44:55 np0005465988 python3.9[145853]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:55.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:44:56 np0005465988 python3.9[146025]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:56.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:56 np0005465988 python3.9[146178]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:57 np0005465988 python3.9[146330]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:57.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:44:58 np0005465988 python3.9[146482]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:44:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:44:58.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:44:59 np0005465988 python3.9[146635]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:44:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:44:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:44:59.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:00 np0005465988 python3.9[146787]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:45:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:00.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:45:01 np0005465988 python3.9[146989]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:45:01 np0005465988 systemd[1]: Reloading.
Oct  2 07:45:01 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:45:01 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:45:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:45:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:45:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:01.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:02 np0005465988 python3.9[147227]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:45:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:02.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:03 np0005465988 python3.9[147381]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:45:03 np0005465988 python3.9[147534]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:45:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:03.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:04 np0005465988 python3.9[147687]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:45:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:04.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:05 np0005465988 python3.9[147841]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:45:05 np0005465988 python3.9[147994]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:45:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:05.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:45:06 np0005465988 python3.9[148147]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:45:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:06.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:07 np0005465988 python3.9[148301]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct  2 07:45:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:07.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:08.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:08 np0005465988 python3.9[148454]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:45:09 np0005465988 python3.9[148613]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  2 07:45:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:09.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:10.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:10 np0005465988 podman[148746]: 2025-10-02 11:45:10.872268623 +0000 UTC m=+0.099919715 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 07:45:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:45:11 np0005465988 python3.9[148793]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:45:11 np0005465988 python3.9[148883]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:45:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:11.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:12.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:13.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:14.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:45:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:15.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:16.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:17.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:18.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:19.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:20.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:45:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:21.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:22.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:23.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:24.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:45:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:25.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:26 np0005465988 podman[149124]: 2025-10-02 11:45:26.545113968 +0000 UTC m=+0.074914018 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 07:45:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:26.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:45:27.309 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:45:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:45:27.309 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:45:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:45:27.310 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:45:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:27.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:28.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:29.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:30.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:45:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:31.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:32.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:33.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:34.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:45:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:35.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:36.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:37.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:38.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:39 np0005465988 kernel: SELinux:  Converting 2768 SID table entries...
Oct  2 07:45:39 np0005465988 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:45:39 np0005465988 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:45:39 np0005465988 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:45:39 np0005465988 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:45:39 np0005465988 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:45:39 np0005465988 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:45:39 np0005465988 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:45:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:39.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:40.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:45:41 np0005465988 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Oct  2 07:45:41 np0005465988 podman[149165]: 2025-10-02 11:45:41.630702788 +0000 UTC m=+0.148486971 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 07:45:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:41.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:42.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:43.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:44.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:45:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:45.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:46.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:47.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:48.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:49 np0005465988 kernel: SELinux:  Converting 2768 SID table entries...
Oct  2 07:45:49 np0005465988 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:45:49 np0005465988 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:45:49 np0005465988 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:45:49 np0005465988 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:45:49 np0005465988 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:45:49 np0005465988 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:45:49 np0005465988 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:45:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:49.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:50.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:45:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:51.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:52.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:53.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:54.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:45:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:55.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:56.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:57 np0005465988 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Oct  2 07:45:57 np0005465988 podman[149258]: 2025-10-02 11:45:57.543217611 +0000 UTC m=+0.070893073 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 07:45:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:45:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:57.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:45:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:45:58.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:45:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:45:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:45:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:45:59.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:00.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:46:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:01.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:46:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:46:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:46:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:02.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:03.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:04.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:46:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:06.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:06.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:08.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:08.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:10.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:10.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:46:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:46:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:46:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:12.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:12 np0005465988 podman[156297]: 2025-10-02 11:46:12.55327513 +0000 UTC m=+0.097699013 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:46:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:12.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:14.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:14.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:46:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:16.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:16.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:18.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:18.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:20.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:20.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:46:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:22.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:22.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:24.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:24.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:46:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:26.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:26.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:46:27.309 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:46:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:46:27.310 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:46:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:46:27.310 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:46:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:28.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:28 np0005465988 podman[165760]: 2025-10-02 11:46:28.526112028 +0000 UTC m=+0.060510222 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:46:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:28.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:30.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:30.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:46:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:32.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:32.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:34.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:34.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:46:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:36.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:36.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:38.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:38.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:40.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:40.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:46:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:42.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:42.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:42 np0005465988 podman[166424]: 2025-10-02 11:46:42.828194898 +0000 UTC m=+0.144801213 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2)
Oct  2 07:46:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:44.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:44 np0005465988 kernel: SELinux:  Converting 2769 SID table entries...
Oct  2 07:46:44 np0005465988 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:46:44 np0005465988 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:46:44 np0005465988 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:46:44 np0005465988 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:46:44 np0005465988 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:46:44 np0005465988 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:46:44 np0005465988 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:46:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:44.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:46:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:46.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:46 np0005465988 dbus-broker-launch[814]: Noticed file-system modification, trigger reload.
Oct  2 07:46:46 np0005465988 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Oct  2 07:46:46 np0005465988 dbus-broker-launch[814]: Noticed file-system modification, trigger reload.
Oct  2 07:46:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:46.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:48.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:46:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:48.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:46:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:50.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:50.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:46:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:52.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:52.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:46:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:54.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:46:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:54.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:54 np0005465988 systemd[1]: Stopping OpenSSH server daemon...
Oct  2 07:46:54 np0005465988 systemd[1]: sshd.service: Deactivated successfully.
Oct  2 07:46:54 np0005465988 systemd[1]: Stopped OpenSSH server daemon.
Oct  2 07:46:54 np0005465988 systemd[1]: sshd.service: Consumed 2.452s CPU time, read 0B from disk, written 8.0K to disk.
Oct  2 07:46:54 np0005465988 systemd[1]: Stopped target sshd-keygen.target.
Oct  2 07:46:54 np0005465988 systemd[1]: Stopping sshd-keygen.target...
Oct  2 07:46:54 np0005465988 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 07:46:54 np0005465988 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 07:46:54 np0005465988 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 07:46:54 np0005465988 systemd[1]: Reached target sshd-keygen.target.
Oct  2 07:46:55 np0005465988 systemd[1]: Starting OpenSSH server daemon...
Oct  2 07:46:55 np0005465988 systemd[1]: Started OpenSSH server daemon.
Oct  2 07:46:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:46:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:46:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:56.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:46:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:56.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:57 np0005465988 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:46:57 np0005465988 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:46:57 np0005465988 systemd[1]: Reloading.
Oct  2 07:46:57 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:46:57 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:46:57 np0005465988 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:46:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:46:58.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:46:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:46:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:46:58.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:46:59 np0005465988 podman[168748]: 2025-10-02 11:46:59.898063173 +0000 UTC m=+0.439261838 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator 
team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 07:47:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:47:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:00.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:47:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:00.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:47:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:47:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:02.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:47:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:02.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:04.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:04 np0005465988 systemd[1]: Starting PackageKit Daemon...
Oct  2 07:47:04 np0005465988 systemd[1]: Started PackageKit Daemon.
Oct  2 07:47:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:04.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:06.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:06 np0005465988 python3.9[175132]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:47:06 np0005465988 systemd[1]: Reloading.
Oct  2 07:47:06 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:47:06 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:47:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:06.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:47:07 np0005465988 python3.9[176226]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:47:07 np0005465988 systemd[1]: Reloading.
Oct  2 07:47:07 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:47:07 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:47:07 np0005465988 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:47:07 np0005465988 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:47:07 np0005465988 systemd[1]: man-db-cache-update.service: Consumed 11.458s CPU time.
Oct  2 07:47:07 np0005465988 systemd[1]: run-r68d383b3c7b64de08f0659cd9a81291f.service: Deactivated successfully.
Oct  2 07:47:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:47:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:08.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:47:08 np0005465988 python3.9[176440]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:47:08 np0005465988 systemd[1]: Reloading.
Oct  2 07:47:08 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:47:08 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:47:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:08.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:09 np0005465988 python3.9[176631]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:47:09 np0005465988 systemd[1]: Reloading.
Oct  2 07:47:09 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:47:09 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:47:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:10.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:10.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:47:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:12.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:12 np0005465988 python3.9[176952]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:12.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:12 np0005465988 systemd[1]: Reloading.
Oct  2 07:47:12 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:47:12 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:47:13 np0005465988 podman[176956]: 2025-10-02 11:47:13.102072992 +0000 UTC m=+0.225815315 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 07:47:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:47:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:47:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:47:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:47:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:14.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:14 np0005465988 python3.9[177169]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:14 np0005465988 systemd[1]: Reloading.
Oct  2 07:47:14 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:47:14 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:47:14 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:47:14 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:47:14 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:47:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:14.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.303107) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405635303179, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2042, "num_deletes": 250, "total_data_size": 5242214, "memory_usage": 5310192, "flush_reason": "Manual Compaction"}
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405635320468, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 3431016, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12370, "largest_seqno": 14406, "table_properties": {"data_size": 3422590, "index_size": 5241, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 15298, "raw_average_key_size": 18, "raw_value_size": 3406123, "raw_average_value_size": 4108, "num_data_blocks": 235, "num_entries": 829, "num_filter_entries": 829, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759405430, "oldest_key_time": 1759405430, "file_creation_time": 1759405635, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 17413 microseconds, and 8199 cpu microseconds.
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.320528) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 3431016 bytes OK
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.320549) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.322252) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.322265) EVENT_LOG_v1 {"time_micros": 1759405635322260, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.322282) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 5233226, prev total WAL file size 5233226, number of live WAL files 2.
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.323614) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323531' seq:0, type:0; will stop at (end)
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(3350KB)], [24(8380KB)]
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405635323666, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 12012689, "oldest_snapshot_seqno": -1}
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4263 keys, 11472743 bytes, temperature: kUnknown
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405635392760, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 11472743, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11438515, "index_size": 22470, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10693, "raw_key_size": 103726, "raw_average_key_size": 24, "raw_value_size": 11355845, "raw_average_value_size": 2663, "num_data_blocks": 960, "num_entries": 4263, "num_filter_entries": 4263, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759405635, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.393100) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 11472743 bytes
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.394241) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 173.4 rd, 165.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.3, 8.2 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(6.8) write-amplify(3.3) OK, records in: 4778, records dropped: 515 output_compression: NoCompression
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.394255) EVENT_LOG_v1 {"time_micros": 1759405635394248, "job": 12, "event": "compaction_finished", "compaction_time_micros": 69296, "compaction_time_cpu_micros": 23477, "output_level": 6, "num_output_files": 1, "total_output_size": 11472743, "num_input_records": 4778, "num_output_records": 4263, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405635394824, "job": 12, "event": "table_file_deletion", "file_number": 26}
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405635395980, "job": 12, "event": "table_file_deletion", "file_number": 24}
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.323485) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.396020) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.396026) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.396029) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.396031) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:47:15 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:47:15.396032) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:47:15 np0005465988 python3.9[177359]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:15 np0005465988 systemd[1]: Reloading.
Oct  2 07:47:15 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:47:15 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:47:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:16.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:16 np0005465988 python3.9[177550]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:47:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:16.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:17 np0005465988 python3.9[177706]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:17 np0005465988 systemd[1]: Reloading.
Oct  2 07:47:17 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:47:17 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:47:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:47:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:18.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:47:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:18.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:19 np0005465988 python3.9[177898]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:47:19 np0005465988 systemd[1]: Reloading.
Oct  2 07:47:19 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:47:19 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:47:19 np0005465988 systemd[1]: Listening on libvirt proxy daemon socket.
Oct  2 07:47:19 np0005465988 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Oct  2 07:47:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:20.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:20 np0005465988 python3.9[178091]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:20.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:21 np0005465988 python3.9[178295]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:21 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:47:21 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:47:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:47:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:47:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:22.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:47:22 np0005465988 python3.9[178452]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:22.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:23 np0005465988 python3.9[178658]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:24.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:24 np0005465988 python3.9[178813]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:24.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:25 np0005465988 python3.9[178969]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:26.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:26 np0005465988 python3.9[179124]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:47:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:26.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:27 np0005465988 python3.9[179280]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:47:27.311 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:47:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:47:27.312 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:47:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:47:27.312 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:47:27 np0005465988 python3.9[179435]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:28.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:28 np0005465988 python3.9[179590]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:47:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:28.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:47:29 np0005465988 python3.9[179746]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:30.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:30 np0005465988 podman[179873]: 2025-10-02 11:47:30.451418702 +0000 UTC m=+0.064857601 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 07:47:30 np0005465988 python3.9[179920]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:30.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:31 np0005465988 python3.9[180076]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:47:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:47:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:32.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:47:32 np0005465988 python3.9[180231]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:47:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:32.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:33 np0005465988 python3.9[180387]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:47:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:34.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:34 np0005465988 python3.9[180539]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:47:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:34.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:35 np0005465988 python3.9[180692]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:47:35 np0005465988 python3.9[180844]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:47:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:47:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:36.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:47:36 np0005465988 python3.9[180996]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:47:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:47:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:47:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:36.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:47:37 np0005465988 python3.9[181149]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:47:37 np0005465988 python3.9[181301]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:47:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:47:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:38.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:47:38 np0005465988 python3.9[181426]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405657.2753358-1629-252942520663695/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:38.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:39 np0005465988 python3.9[181579]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:47:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:40.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:40 np0005465988 python3.9[181704]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405658.9532077-1629-36400791267177/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:40.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:40 np0005465988 python3.9[181857]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:47:41 np0005465988 python3.9[181982]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405660.4179971-1629-202783359567640/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:47:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:47:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:42.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:47:42 np0005465988 python3.9[182134]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:47:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:47:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:42.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:47:43 np0005465988 python3.9[182260]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405661.9296002-1629-185923175872729/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:43 np0005465988 podman[182394]: 2025-10-02 11:47:43.565852257 +0000 UTC m=+0.104034948 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 07:47:43 np0005465988 python3.9[182488]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:47:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:44.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:44 np0005465988 python3.9[182613]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405663.2941573-1629-200215642055825/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:44.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:45 np0005465988 python3.9[182766]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:47:45 np0005465988 python3.9[182891]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405664.6943533-1629-130508370214796/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:46.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:46 np0005465988 python3.9[183043]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:47:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:47:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:47:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:46.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:47:47 np0005465988 python3.9[183167]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405666.078885-1629-95795553788376/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:48 np0005465988 python3.9[183319]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:47:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:48.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:48 np0005465988 python3.9[183444]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405667.3720334-1629-270575652678707/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:47:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:48.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:47:49 np0005465988 python3.9[183597]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Oct  2 07:47:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:47:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:50.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:47:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:47:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:50.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:47:51 np0005465988 ceph-mds[84851]: mds.beacon.cephfs.compute-2.gpiyct missed beacon ack from the monitors
Oct  2 07:47:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:47:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:52.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:52.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:53 np0005465988 python3.9[183752]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:53 np0005465988 python3.9[183904]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:47:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:54.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:47:54 np0005465988 python3.9[184056]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:54.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:55 np0005465988 python3.9[184209]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:55 np0005465988 python3.9[184361]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:56.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:56 np0005465988 python3.9[184513]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:47:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:56.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:57 np0005465988 python3.9[184666]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:58 np0005465988 python3.9[184818]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:47:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:47:58.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:47:58 np0005465988 python3.9[184970]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:47:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:47:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:47:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:47:58.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:47:59 np0005465988 python3.9[185123]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:00 np0005465988 python3.9[185275]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:00.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:00 np0005465988 podman[185399]: 2025-10-02 11:48:00.55805638 +0000 UTC m=+0.058304839 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 07:48:00 np0005465988 python3.9[185446]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:00.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:01 np0005465988 python3.9[185599]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:48:02 np0005465988 python3.9[185751]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:02.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:02.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:04 np0005465988 python3.9[185954]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:04.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:04 np0005465988 python3.9[186077]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405683.626068-2292-51604613221716/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:04.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:05 np0005465988 python3.9[186230]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:06 np0005465988 python3.9[186353]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405684.9027002-2292-41590873378173/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:06.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:06 np0005465988 python3.9[186505]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:48:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:06.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:07 np0005465988 python3.9[186629]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405686.1614869-2292-206538734137795/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:07 np0005465988 python3.9[186781]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:08.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:08 np0005465988 python3.9[186904]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405687.3734965-2292-274502808518031/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:08.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:09 np0005465988 python3.9[187057]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:09 np0005465988 python3.9[187180]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405688.6968539-2292-191502997438426/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:48:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:10.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:48:10 np0005465988 python3.9[187332]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:10.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:11 np0005465988 python3.9[187456]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405689.8860488-2292-132665126306241/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:11 np0005465988 python3.9[187608]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:48:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:12.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:12 np0005465988 python3.9[187731]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405691.2149389-2292-80915579633908/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:12.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:13 np0005465988 python3.9[187884]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:13 np0005465988 podman[188007]: 2025-10-02 11:48:13.755022453 +0000 UTC m=+0.109709694 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 07:48:13 np0005465988 python3.9[188008]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405692.5973842-2292-83082909618783/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:14.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:14 np0005465988 python3.9[188184]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:14.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:15 np0005465988 python3.9[188308]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405694.0476992-2292-172613001800884/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:15 np0005465988 python3.9[188460]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:48:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:16.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:48:16 np0005465988 python3.9[188583]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405695.3665013-2292-39778827879371/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:48:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:16.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:17 np0005465988 python3.9[188736]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:18 np0005465988 python3.9[188859]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405696.652334-2292-57174152002310/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:18.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:18 np0005465988 python3.9[189011]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:18.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:19 np0005465988 python3.9[189135]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405698.1700149-2292-210070118750068/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:19 np0005465988 python3.9[189287]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:20.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:20 np0005465988 python3.9[189410]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405699.4854362-2292-192399369991304/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:48:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:20.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:48:21 np0005465988 python3.9[189563]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:48:21 np0005465988 python3.9[189805]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405700.7573035-2292-250337040296345/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:22.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:22 np0005465988 python3.9[189969]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:48:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:48:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:48:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:48:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:22.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:23 np0005465988 python3.9[190175]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct  2 07:48:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:24.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:48:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:24.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:48:25 np0005465988 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Oct  2 07:48:25 np0005465988 python3.9[190332]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:26.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:26 np0005465988 python3.9[190484]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:48:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:26.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:27 np0005465988 python3.9[190637]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:48:27.312 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:48:27.313 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:48:27.313 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:48:27 np0005465988 python3.9[190789]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:28.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:28 np0005465988 python3.9[190941]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:28 np0005465988 auditd[708]: Audit daemon rotating log files
Oct  2 07:48:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:28.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:29 np0005465988 python3.9[191142]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:48:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:48:30 np0005465988 python3.9[191296]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:48:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:30.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:48:30 np0005465988 python3.9[191448]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:30 np0005465988 podman[191450]: 2025-10-02 11:48:30.893112724 +0000 UTC m=+0.057807334 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:48:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:30.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:31 np0005465988 python3.9[191620]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:48:32 np0005465988 python3.9[191772]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:32.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:48:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:32.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:48:33 np0005465988 python3.9[191925]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:48:33 np0005465988 systemd[1]: Reloading.
Oct  2 07:48:33 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:33 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:33 np0005465988 systemd[1]: Starting libvirt logging daemon socket...
Oct  2 07:48:33 np0005465988 systemd[1]: Listening on libvirt logging daemon socket.
Oct  2 07:48:33 np0005465988 systemd[1]: Starting libvirt logging daemon admin socket...
Oct  2 07:48:33 np0005465988 systemd[1]: Listening on libvirt logging daemon admin socket.
Oct  2 07:48:33 np0005465988 systemd[1]: Starting libvirt logging daemon...
Oct  2 07:48:33 np0005465988 systemd[1]: Started libvirt logging daemon.
Oct  2 07:48:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:34.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:34 np0005465988 python3.9[192119]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:48:34 np0005465988 systemd[1]: Reloading.
Oct  2 07:48:34 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:34 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:34 np0005465988 systemd[1]: Starting libvirt nodedev daemon socket...
Oct  2 07:48:34 np0005465988 systemd[1]: Listening on libvirt nodedev daemon socket.
Oct  2 07:48:34 np0005465988 systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct  2 07:48:34 np0005465988 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct  2 07:48:34 np0005465988 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct  2 07:48:34 np0005465988 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct  2 07:48:34 np0005465988 systemd[1]: Starting libvirt nodedev daemon...
Oct  2 07:48:34 np0005465988 systemd[1]: Started libvirt nodedev daemon.
Oct  2 07:48:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:34.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:35 np0005465988 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct  2 07:48:35 np0005465988 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct  2 07:48:35 np0005465988 python3.9[192336]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:48:35 np0005465988 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct  2 07:48:35 np0005465988 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct  2 07:48:35 np0005465988 systemd[1]: Reloading.
Oct  2 07:48:35 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:35 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:35 np0005465988 systemd[1]: Starting libvirt proxy daemon admin socket...
Oct  2 07:48:35 np0005465988 systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct  2 07:48:35 np0005465988 systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct  2 07:48:35 np0005465988 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct  2 07:48:36 np0005465988 systemd[1]: Starting libvirt proxy daemon...
Oct  2 07:48:36 np0005465988 systemd[1]: Started libvirt proxy daemon.
Oct  2 07:48:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:36.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:36 np0005465988 setroubleshoot[192260]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 77b8b18a-36e9-4116-b9e4-6b8a4f1db259
Oct  2 07:48:36 np0005465988 setroubleshoot[192260]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct  2 07:48:36 np0005465988 setroubleshoot[192260]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 77b8b18a-36e9-4116-b9e4-6b8a4f1db259
Oct  2 07:48:36 np0005465988 setroubleshoot[192260]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct  2 07:48:36 np0005465988 python3.9[192553]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:48:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:48:36 np0005465988 systemd[1]: Reloading.
Oct  2 07:48:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:36.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:36 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:36 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:37 np0005465988 systemd[1]: Listening on libvirt locking daemon socket.
Oct  2 07:48:37 np0005465988 systemd[1]: Starting libvirt QEMU daemon socket...
Oct  2 07:48:37 np0005465988 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct  2 07:48:37 np0005465988 systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct  2 07:48:37 np0005465988 systemd[1]: Listening on libvirt QEMU daemon socket.
Oct  2 07:48:37 np0005465988 systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct  2 07:48:37 np0005465988 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct  2 07:48:37 np0005465988 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct  2 07:48:37 np0005465988 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct  2 07:48:37 np0005465988 systemd[1]: Started Virtual Machine and Container Registration Service.
Oct  2 07:48:37 np0005465988 systemd[1]: Starting libvirt QEMU daemon...
Oct  2 07:48:37 np0005465988 systemd[1]: Started libvirt QEMU daemon.
Oct  2 07:48:38 np0005465988 python3.9[192767]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:48:38 np0005465988 systemd[1]: Reloading.
Oct  2 07:48:38 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:38 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:38.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:38 np0005465988 systemd[1]: Starting libvirt secret daemon socket...
Oct  2 07:48:38 np0005465988 systemd[1]: Listening on libvirt secret daemon socket.
Oct  2 07:48:38 np0005465988 systemd[1]: Starting libvirt secret daemon admin socket...
Oct  2 07:48:38 np0005465988 systemd[1]: Starting libvirt secret daemon read-only socket...
Oct  2 07:48:38 np0005465988 systemd[1]: Listening on libvirt secret daemon admin socket.
Oct  2 07:48:38 np0005465988 systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct  2 07:48:38 np0005465988 systemd[1]: Starting libvirt secret daemon...
Oct  2 07:48:38 np0005465988 systemd[1]: Started libvirt secret daemon.
Oct  2 07:48:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:38.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:39 np0005465988 python3.9[192978]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:40.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:40 np0005465988 python3.9[193130]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:48:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:40.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:41 np0005465988 python3.9[193283]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:48:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:48:41 np0005465988 python3.9[193437]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:48:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:42.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:42 np0005465988 python3.9[193587]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:42.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:43 np0005465988 python3.9[193709]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405722.4405575-3367-175237982117867/.source.xml follow=False _original_basename=secret.xml.j2 checksum=63af5286395175bfc9ebaef2783b75cb37ce0b72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:43 np0005465988 podman[193883]: 2025-10-02 11:48:43.990508121 +0000 UTC m=+0.075856483 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 07:48:44 np0005465988 python3.9[193932]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine fd4c5763-22d1-50ea-ad0b-96a3dc3040b2#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:48:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:44.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:44.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:45 np0005465988 python3.9[194099]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:46.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:46 np0005465988 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct  2 07:48:46 np0005465988 systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct  2 07:48:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:48:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:46.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:48 np0005465988 python3.9[194563]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:48.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:48.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:48 np0005465988 python3.9[194716]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:49 np0005465988 python3.9[194839]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405728.4524221-3532-227709843713581/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:50.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:50 np0005465988 python3.9[194991]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:48:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:50.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:48:51 np0005465988 python3.9[195144]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:51 np0005465988 python3.9[195222]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:48:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:52.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:52 np0005465988 python3.9[195374]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:52.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:53 np0005465988 python3.9[195453]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.a0bh1v4_ recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:53 np0005465988 python3.9[195605]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:54.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:54 np0005465988 python3.9[195683]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:54.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:55 np0005465988 python3.9[195836]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:48:56 np0005465988 python3[195989]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  2 07:48:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:56.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:48:56 np0005465988 python3.9[196141]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:56.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:57 np0005465988 python3.9[196220]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:58 np0005465988 python3.9[196372]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:48:58.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:58 np0005465988 python3.9[196450]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:48:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:48:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:48:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:48:58.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:48:59 np0005465988 python3.9[196603]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:48:59 np0005465988 python3.9[196681]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:00.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:00 np0005465988 python3.9[196833]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:00.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:01 np0005465988 podman[196884]: 2025-10-02 11:49:01.199158486 +0000 UTC m=+0.090475933 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:49:01 np0005465988 python3.9[196923]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:49:02 np0005465988 python3.9[197082]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:02.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:02 np0005465988 ceph-mgr[76715]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3158772141
Oct  2 07:49:02 np0005465988 python3.9[197207]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405741.6221347-3907-206690440551290/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:02.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:03 np0005465988 python3.9[197360]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:04.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:04 np0005465988 python3.9[197562]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:49:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:04.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:05 np0005465988 python3.9[197718]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:06 np0005465988 python3.9[197870]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:49:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:06.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:49:06 np0005465988 python3.9[198023]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:49:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:06.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:07 np0005465988 python3.9[198178]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:49:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:08.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:08 np0005465988 python3.9[198333]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:08.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:09 np0005465988 python3.9[198486]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:09 np0005465988 python3.9[198609]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405748.7174087-4123-220495572495869/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:10.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:10 np0005465988 python3.9[198761]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.569830) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405750569899, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 1470, "num_deletes": 502, "total_data_size": 2846076, "memory_usage": 2889032, "flush_reason": "Manual Compaction"}
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405750577185, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 1118591, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14411, "largest_seqno": 15876, "table_properties": {"data_size": 1113801, "index_size": 1738, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 14678, "raw_average_key_size": 19, "raw_value_size": 1101709, "raw_average_value_size": 1430, "num_data_blocks": 79, "num_entries": 770, "num_filter_entries": 770, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759405635, "oldest_key_time": 1759405635, "file_creation_time": 1759405750, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 7386 microseconds, and 3711 cpu microseconds.
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.577226) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 1118591 bytes OK
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.577242) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.578981) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.578994) EVENT_LOG_v1 {"time_micros": 1759405750578990, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.579011) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 2838296, prev total WAL file size 2838296, number of live WAL files 2.
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.579805) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323531' seq:72057594037927935, type:22 .. '6D67727374617400353033' seq:0, type:0; will stop at (end)
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(1092KB)], [27(10MB)]
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405750579840, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 12591334, "oldest_snapshot_seqno": -1}
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4064 keys, 7663918 bytes, temperature: kUnknown
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405750632182, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 7663918, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7635578, "index_size": 17094, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10181, "raw_key_size": 100796, "raw_average_key_size": 24, "raw_value_size": 7560782, "raw_average_value_size": 1860, "num_data_blocks": 719, "num_entries": 4064, "num_filter_entries": 4064, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759405750, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.632331) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 7663918 bytes
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.633392) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 240.3 rd, 146.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 10.9 +0.0 blob) out(7.3 +0.0 blob), read-write-amplify(18.1) write-amplify(6.9) OK, records in: 5033, records dropped: 969 output_compression: NoCompression
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.633406) EVENT_LOG_v1 {"time_micros": 1759405750633399, "job": 14, "event": "compaction_finished", "compaction_time_micros": 52388, "compaction_time_cpu_micros": 16526, "output_level": 6, "num_output_files": 1, "total_output_size": 7663918, "num_input_records": 5033, "num_output_records": 4064, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405750633630, "job": 14, "event": "table_file_deletion", "file_number": 29}
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405750635305, "job": 14, "event": "table_file_deletion", "file_number": 27}
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.579743) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.635396) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.635415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.635417) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.635419) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:49:10 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:49:10.635421) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:49:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:10.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:11 np0005465988 python3.9[198885]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405750.0999563-4168-241353298364614/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:49:11 np0005465988 python3.9[199037]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:12.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:12 np0005465988 python3.9[199160]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405751.3746223-4213-178390155204885/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:12.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:13 np0005465988 python3.9[199313]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:49:13 np0005465988 systemd[1]: Reloading.
Oct  2 07:49:13 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:13 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:13 np0005465988 systemd[1]: Reached target edpm_libvirt.target.
Oct  2 07:49:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:14.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:14 np0005465988 podman[199478]: 2025-10-02 11:49:14.412390382 +0000 UTC m=+0.111230245 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 07:49:14 np0005465988 python3.9[199523]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  2 07:49:14 np0005465988 systemd[1]: Reloading.
Oct  2 07:49:14 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:14 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:14.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:15 np0005465988 systemd[1]: Reloading.
Oct  2 07:49:15 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:15 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:15 np0005465988 systemd[1]: session-49.scope: Deactivated successfully.
Oct  2 07:49:15 np0005465988 systemd[1]: session-49.scope: Consumed 3min 40.834s CPU time.
Oct  2 07:49:15 np0005465988 systemd-logind[827]: Session 49 logged out. Waiting for processes to exit.
Oct  2 07:49:15 np0005465988 systemd-logind[827]: Removed session 49.
Oct  2 07:49:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:16.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:49:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:16.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:18.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:18.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:20.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:20.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:21 np0005465988 systemd-logind[827]: New session 50 of user zuul.
Oct  2 07:49:21 np0005465988 systemd[1]: Started Session 50 of User zuul.
Oct  2 07:49:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:49:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:22.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:22 np0005465988 python3.9[199787]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:49:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:22.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:24 np0005465988 python3.9[199994]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:24.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:24 np0005465988 python3.9[200146]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:24.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:25 np0005465988 python3.9[200299]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:26 np0005465988 python3.9[200451]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  2 07:49:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:26.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:26 np0005465988 python3.9[200603]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:49:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:26.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:49:27.313 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:49:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:49:27.314 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:49:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:49:27.314 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:49:27 np0005465988 python3.9[200756]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:49:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:28.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:28.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:29 np0005465988 python3.9[200911]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:49:29 np0005465988 systemd[1]: Reloading.
Oct  2 07:49:29 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:29 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:30.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:30 np0005465988 python3.9[201230]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:49:30 np0005465988 network[201249]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:49:30 np0005465988 network[201250]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:49:30 np0005465988 network[201251]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:49:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:30.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:31 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:49:31 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:49:31 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:49:31 np0005465988 podman[201259]: 2025-10-02 11:49:31.424263001 +0000 UTC m=+0.097450316 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 07:49:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:49:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:32.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:32.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:34.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:34.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:35 np0005465988 python3.9[201546]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:49:35 np0005465988 systemd[1]: Reloading.
Oct  2 07:49:36 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:36 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:36.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:49:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:36.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:37 np0005465988 python3.9[201733]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:49:37 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:49:37 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:49:38 np0005465988 python3.9[201935]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  2 07:49:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:38 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:49:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:38.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.004000116s ======
Oct  2 07:49:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:38.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000116s
Oct  2 07:49:39 np0005465988 podman[201950]: 2025-10-02 11:49:39.597659884 +0000 UTC m=+1.153756393 image pull 1b3fd7f2436e5c6f2e28c01b83721476c7b295789c77b3d63e30f49404389ea1 quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  2 07:49:39 np0005465988 podman[202011]: 2025-10-02 11:49:39.804628952 +0000 UTC m=+0.063358977 container create ae0db6508346f28fe67d08ef0fe06ccb0e263da76f68729d5fb7710eeffccbfb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:49:39 np0005465988 NetworkManager[45041]: <info>  [1759405779.8414] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/23)
Oct  2 07:49:39 np0005465988 kernel: podman0: port 1(veth0) entered blocking state
Oct  2 07:49:39 np0005465988 kernel: podman0: port 1(veth0) entered disabled state
Oct  2 07:49:39 np0005465988 kernel: veth0: entered allmulticast mode
Oct  2 07:49:39 np0005465988 kernel: veth0: entered promiscuous mode
Oct  2 07:49:39 np0005465988 NetworkManager[45041]: <info>  [1759405779.8619] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/24)
Oct  2 07:49:39 np0005465988 kernel: podman0: port 1(veth0) entered blocking state
Oct  2 07:49:39 np0005465988 kernel: podman0: port 1(veth0) entered forwarding state
Oct  2 07:49:39 np0005465988 NetworkManager[45041]: <info>  [1759405779.8642] device (veth0): carrier: link connected
Oct  2 07:49:39 np0005465988 NetworkManager[45041]: <info>  [1759405779.8646] device (podman0): carrier: link connected
Oct  2 07:49:39 np0005465988 podman[202011]: 2025-10-02 11:49:39.771966306 +0000 UTC m=+0.030696401 image pull 1b3fd7f2436e5c6f2e28c01b83721476c7b295789c77b3d63e30f49404389ea1 quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  2 07:49:39 np0005465988 systemd-udevd[202034]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:49:39 np0005465988 systemd-udevd[202036]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:49:39 np0005465988 NetworkManager[45041]: <info>  [1759405779.8963] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:49:39 np0005465988 NetworkManager[45041]: <info>  [1759405779.8981] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:49:39 np0005465988 NetworkManager[45041]: <info>  [1759405779.8998] device (podman0): Activation: starting connection 'podman0' (50b4f1e7-f800-486d-8372-56f9a14257a3)
Oct  2 07:49:39 np0005465988 NetworkManager[45041]: <info>  [1759405779.9000] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  2 07:49:39 np0005465988 NetworkManager[45041]: <info>  [1759405779.9005] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  2 07:49:39 np0005465988 NetworkManager[45041]: <info>  [1759405779.9008] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  2 07:49:39 np0005465988 NetworkManager[45041]: <info>  [1759405779.9036] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  2 07:49:39 np0005465988 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:49:39 np0005465988 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:49:39 np0005465988 NetworkManager[45041]: <info>  [1759405779.9315] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  2 07:49:39 np0005465988 NetworkManager[45041]: <info>  [1759405779.9318] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  2 07:49:39 np0005465988 NetworkManager[45041]: <info>  [1759405779.9324] device (podman0): Activation: successful, device activated.
Oct  2 07:49:39 np0005465988 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Oct  2 07:49:40 np0005465988 systemd[1]: Started libpod-conmon-ae0db6508346f28fe67d08ef0fe06ccb0e263da76f68729d5fb7710eeffccbfb.scope.
Oct  2 07:49:40 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:49:40 np0005465988 podman[202011]: 2025-10-02 11:49:40.158059257 +0000 UTC m=+0.416789272 container init ae0db6508346f28fe67d08ef0fe06ccb0e263da76f68729d5fb7710eeffccbfb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:49:40 np0005465988 podman[202011]: 2025-10-02 11:49:40.166962695 +0000 UTC m=+0.425692690 container start ae0db6508346f28fe67d08ef0fe06ccb0e263da76f68729d5fb7710eeffccbfb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 07:49:40 np0005465988 podman[202011]: 2025-10-02 11:49:40.169749505 +0000 UTC m=+0.428479530 container attach ae0db6508346f28fe67d08ef0fe06ccb0e263da76f68729d5fb7710eeffccbfb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 07:49:40 np0005465988 iscsid_config[202167]: iqn.1994-05.com.redhat:7daf2c659dfe#015
Oct  2 07:49:40 np0005465988 systemd[1]: libpod-ae0db6508346f28fe67d08ef0fe06ccb0e263da76f68729d5fb7710eeffccbfb.scope: Deactivated successfully.
Oct  2 07:49:40 np0005465988 podman[202011]: 2025-10-02 11:49:40.173471923 +0000 UTC m=+0.432201938 container died ae0db6508346f28fe67d08ef0fe06ccb0e263da76f68729d5fb7710eeffccbfb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 07:49:40 np0005465988 kernel: podman0: port 1(veth0) entered disabled state
Oct  2 07:49:40 np0005465988 kernel: veth0 (unregistering): left allmulticast mode
Oct  2 07:49:40 np0005465988 kernel: veth0 (unregistering): left promiscuous mode
Oct  2 07:49:40 np0005465988 kernel: podman0: port 1(veth0) entered disabled state
Oct  2 07:49:40 np0005465988 NetworkManager[45041]: <info>  [1759405780.2311] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 07:49:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:40.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:40 np0005465988 systemd[1]: run-netns-netns\x2dde26dc1d\x2de48d\x2dff97\x2d7d48\x2d9d1dd06b0865.mount: Deactivated successfully.
Oct  2 07:49:40 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ae0db6508346f28fe67d08ef0fe06ccb0e263da76f68729d5fb7710eeffccbfb-userdata-shm.mount: Deactivated successfully.
Oct  2 07:49:40 np0005465988 podman[202011]: 2025-10-02 11:49:40.59704307 +0000 UTC m=+0.855773065 container remove ae0db6508346f28fe67d08ef0fe06ccb0e263da76f68729d5fb7710eeffccbfb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:49:40 np0005465988 python3.9[201935]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True quay.io/podified-antelope-centos9/openstack-iscsid:current-podified /usr/sbin/iscsi-iname
Oct  2 07:49:40 np0005465988 systemd[1]: libpod-conmon-ae0db6508346f28fe67d08ef0fe06ccb0e263da76f68729d5fb7710eeffccbfb.scope: Deactivated successfully.
Oct  2 07:49:40 np0005465988 python3.9[201935]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: #012DEPRECATED command:#012It is recommended to use Quadlets for running containers and pods under systemd.#012#012Please refer to podman-systemd.unit(5) for details.#012Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Oct  2 07:49:40 np0005465988 systemd[1]: var-lib-containers-storage-overlay-733a8994baa2a28c054bb2e44eb5a1f543e77e9f094fa4799c1c22ddd6a4e43e-merged.mount: Deactivated successfully.
Oct  2 07:49:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:40.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:41 np0005465988 python3.9[202409]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:49:42 np0005465988 python3.9[202532]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405780.9558463-325-44665408676713/.source.iscsi _original_basename=.8x_q69ht follow=False checksum=06c783e54d67abbe02cdac26dd69a2d638fa45c6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:42.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:42.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:43 np0005465988 python3.9[202685]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:43 np0005465988 python3.9[202835]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:49:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:44.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:44 np0005465988 podman[202964]: 2025-10-02 11:49:44.547120421 +0000 UTC m=+0.084784058 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 07:49:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:44.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:45 np0005465988 python3.9[203067]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:46 np0005465988 python3.9[203219]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:46.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:46.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:49:47 np0005465988 python3.9[203372]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:47 np0005465988 python3.9[203450]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:48 np0005465988 python3.9[203602]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:48.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:48 np0005465988 python3.9[203680]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:48.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:49 np0005465988 python3.9[203833]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:50 np0005465988 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:49:50 np0005465988 python3.9[203985]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:50.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:50 np0005465988 python3.9[204063]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:50.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:51 np0005465988 python3.9[204216]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:49:52 np0005465988 python3.9[204294]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:52.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:52.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:53 np0005465988 python3.9[204446]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:49:53 np0005465988 systemd[1]: Reloading.
Oct  2 07:49:53 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:53 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:54 np0005465988 python3.9[204636]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:54.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:54 np0005465988 python3.9[204714]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:54.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:55 np0005465988 python3.9[204867]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:55 np0005465988 python3.9[204945]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:56.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:56 np0005465988 python3.9[205097]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:49:56 np0005465988 systemd[1]: Reloading.
Oct  2 07:49:56 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:56 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:49:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:57.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:57 np0005465988 systemd[1]: Starting Create netns directory...
Oct  2 07:49:57 np0005465988 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:49:57 np0005465988 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:49:57 np0005465988 systemd[1]: Finished Create netns directory.
Oct  2 07:49:58 np0005465988 python3.9[205291]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:49:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:49:58.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:49:58 np0005465988 python3.9[205444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:49:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:49:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:49:59.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:49:59 np0005465988 python3.9[205567]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405798.416203-787-221082180573171/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:00.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:00 np0005465988 python3.9[205719]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:00 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 07:50:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:01.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:01 np0005465988 python3.9[205872]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:01 np0005465988 podman[205873]: 2025-10-02 11:50:01.545309052 +0000 UTC m=+0.069590128 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 07:50:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:50:02 np0005465988 python3.9[206014]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405800.8478-861-237077409703417/.source.json _original_basename=.uiy9whlm follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:02.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:02 np0005465988 python3.9[206166]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:03.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:04.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:05.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:05 np0005465988 python3.9[206645]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct  2 07:50:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:06.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:06 np0005465988 python3.9[206797]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:50:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:50:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:07.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:07 np0005465988 python3.9[206950]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  2 07:50:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:08.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:09.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:09 np0005465988 python3[207129]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:50:09 np0005465988 podman[207167]: 2025-10-02 11:50:09.883341875 +0000 UTC m=+0.073521262 container create 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 07:50:09 np0005465988 podman[207167]: 2025-10-02 11:50:09.837240638 +0000 UTC m=+0.027420115 image pull 1b3fd7f2436e5c6f2e28c01b83721476c7b295789c77b3d63e30f49404389ea1 quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  2 07:50:09 np0005465988 python3[207129]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  2 07:50:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:10.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:10 np0005465988 python3.9[207358]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:50:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:50:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:11.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:50:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:50:12 np0005465988 python3.9[207512]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:12.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:12 np0005465988 python3.9[207588]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:50:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:13.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:13 np0005465988 python3.9[207740]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759405812.8933213-1125-101996769895427/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:14 np0005465988 python3.9[207816]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:50:14 np0005465988 systemd[1]: Reloading.
Oct  2 07:50:14 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:14 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:50:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:14.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:50:14 np0005465988 podman[207901]: 2025-10-02 11:50:14.843533431 +0000 UTC m=+0.085458488 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:50:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:15.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:15 np0005465988 python3.9[207949]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:50:15 np0005465988 systemd[1]: Reloading.
Oct  2 07:50:15 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:15 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:15 np0005465988 systemd[1]: Starting iscsid container...
Oct  2 07:50:15 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:50:15 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3342106da86387ee2f41359f8dc5b16481070d1f7f8baba936fe0f7826395534/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:50:15 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3342106da86387ee2f41359f8dc5b16481070d1f7f8baba936fe0f7826395534/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct  2 07:50:15 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3342106da86387ee2f41359f8dc5b16481070d1f7f8baba936fe0f7826395534/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:50:15 np0005465988 systemd[1]: Started /usr/bin/podman healthcheck run 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017.
Oct  2 07:50:15 np0005465988 podman[207994]: 2025-10-02 11:50:15.60390055 +0000 UTC m=+0.130960587 container init 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 07:50:15 np0005465988 iscsid[208010]: + sudo -E kolla_set_configs
Oct  2 07:50:15 np0005465988 podman[207994]: 2025-10-02 11:50:15.631533651 +0000 UTC m=+0.158593688 container start 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 07:50:15 np0005465988 podman[207994]: iscsid
Oct  2 07:50:15 np0005465988 systemd[1]: Started iscsid container.
Oct  2 07:50:15 np0005465988 systemd[1]: Created slice User Slice of UID 0.
Oct  2 07:50:15 np0005465988 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  2 07:50:15 np0005465988 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  2 07:50:15 np0005465988 systemd[1]: Starting User Manager for UID 0...
Oct  2 07:50:15 np0005465988 podman[208017]: 2025-10-02 11:50:15.721049726 +0000 UTC m=+0.076235771 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:50:15 np0005465988 systemd[1]: 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017-b45b5eb2043fee6.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:50:15 np0005465988 systemd[1]: 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017-b45b5eb2043fee6.service: Failed with result 'exit-code'.
Oct  2 07:50:15 np0005465988 systemd[208037]: Queued start job for default target Main User Target.
Oct  2 07:50:15 np0005465988 systemd[208037]: Created slice User Application Slice.
Oct  2 07:50:15 np0005465988 systemd[208037]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  2 07:50:15 np0005465988 systemd[208037]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 07:50:15 np0005465988 systemd[208037]: Reached target Paths.
Oct  2 07:50:15 np0005465988 systemd[208037]: Reached target Timers.
Oct  2 07:50:15 np0005465988 systemd[208037]: Starting D-Bus User Message Bus Socket...
Oct  2 07:50:15 np0005465988 systemd[208037]: Starting Create User's Volatile Files and Directories...
Oct  2 07:50:15 np0005465988 systemd[208037]: Listening on D-Bus User Message Bus Socket.
Oct  2 07:50:15 np0005465988 systemd[208037]: Reached target Sockets.
Oct  2 07:50:15 np0005465988 systemd[208037]: Finished Create User's Volatile Files and Directories.
Oct  2 07:50:15 np0005465988 systemd[208037]: Reached target Basic System.
Oct  2 07:50:15 np0005465988 systemd[208037]: Reached target Main User Target.
Oct  2 07:50:15 np0005465988 systemd[208037]: Startup finished in 140ms.
Oct  2 07:50:15 np0005465988 systemd[1]: Started User Manager for UID 0.
Oct  2 07:50:15 np0005465988 systemd[1]: Started Session c3 of User root.
Oct  2 07:50:15 np0005465988 iscsid[208010]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:50:15 np0005465988 iscsid[208010]: INFO:__main__:Validating config file
Oct  2 07:50:15 np0005465988 iscsid[208010]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:50:15 np0005465988 iscsid[208010]: INFO:__main__:Writing out command to execute
Oct  2 07:50:15 np0005465988 systemd[1]: session-c3.scope: Deactivated successfully.
Oct  2 07:50:15 np0005465988 iscsid[208010]: ++ cat /run_command
Oct  2 07:50:15 np0005465988 iscsid[208010]: + CMD='/usr/sbin/iscsid -f'
Oct  2 07:50:15 np0005465988 iscsid[208010]: + ARGS=
Oct  2 07:50:15 np0005465988 iscsid[208010]: + sudo kolla_copy_cacerts
Oct  2 07:50:16 np0005465988 systemd[1]: Started Session c4 of User root.
Oct  2 07:50:16 np0005465988 systemd[1]: session-c4.scope: Deactivated successfully.
Oct  2 07:50:16 np0005465988 iscsid[208010]: + [[ ! -n '' ]]
Oct  2 07:50:16 np0005465988 iscsid[208010]: + . kolla_extend_start
Oct  2 07:50:16 np0005465988 iscsid[208010]: Running command: '/usr/sbin/iscsid -f'
Oct  2 07:50:16 np0005465988 iscsid[208010]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct  2 07:50:16 np0005465988 iscsid[208010]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct  2 07:50:16 np0005465988 iscsid[208010]: + umask 0022
Oct  2 07:50:16 np0005465988 iscsid[208010]: + exec /usr/sbin/iscsid -f
Oct  2 07:50:16 np0005465988 kernel: Loading iSCSI transport class v2.0-870.
Oct  2 07:50:16 np0005465988 python3.9[208215]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:50:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:16.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:50:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:17.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:17 np0005465988 python3.9[208368]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:18 np0005465988 python3.9[208520]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:50:18 np0005465988 network[208537]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:50:18 np0005465988 network[208538]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:50:18 np0005465988 network[208539]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:50:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:18.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:19.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:20.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:21.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:50:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:22.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:23.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:23 np0005465988 python3.9[208817]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  2 07:50:24 np0005465988 python3.9[208969]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct  2 07:50:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:50:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:24.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:50:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:25.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:25 np0005465988 python3.9[209176]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:25 np0005465988 python3.9[209299]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405824.5228493-1348-193972781282068/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:26 np0005465988 systemd[1]: Stopping User Manager for UID 0...
Oct  2 07:50:26 np0005465988 systemd[208037]: Activating special unit Exit the Session...
Oct  2 07:50:26 np0005465988 systemd[208037]: Stopped target Main User Target.
Oct  2 07:50:26 np0005465988 systemd[208037]: Stopped target Basic System.
Oct  2 07:50:26 np0005465988 systemd[208037]: Stopped target Paths.
Oct  2 07:50:26 np0005465988 systemd[208037]: Stopped target Sockets.
Oct  2 07:50:26 np0005465988 systemd[208037]: Stopped target Timers.
Oct  2 07:50:26 np0005465988 systemd[208037]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 07:50:26 np0005465988 systemd[208037]: Closed D-Bus User Message Bus Socket.
Oct  2 07:50:26 np0005465988 systemd[208037]: Stopped Create User's Volatile Files and Directories.
Oct  2 07:50:26 np0005465988 systemd[208037]: Removed slice User Application Slice.
Oct  2 07:50:26 np0005465988 systemd[208037]: Reached target Shutdown.
Oct  2 07:50:26 np0005465988 systemd[208037]: Finished Exit the Session.
Oct  2 07:50:26 np0005465988 systemd[208037]: Reached target Exit the Session.
Oct  2 07:50:26 np0005465988 systemd[1]: user@0.service: Deactivated successfully.
Oct  2 07:50:26 np0005465988 systemd[1]: Stopped User Manager for UID 0.
Oct  2 07:50:26 np0005465988 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  2 07:50:26 np0005465988 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  2 07:50:26 np0005465988 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  2 07:50:26 np0005465988 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  2 07:50:26 np0005465988 systemd[1]: Removed slice User Slice of UID 0.
Oct  2 07:50:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:26.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:26 np0005465988 python3.9[209452]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:50:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:27.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:50:27.317 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:50:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:50:27.317 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:50:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:50:27.317 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:50:27 np0005465988 python3.9[209605]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:50:27 np0005465988 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct  2 07:50:27 np0005465988 systemd[1]: Stopped Load Kernel Modules.
Oct  2 07:50:27 np0005465988 systemd[1]: Stopping Load Kernel Modules...
Oct  2 07:50:27 np0005465988 systemd[1]: Starting Load Kernel Modules...
Oct  2 07:50:27 np0005465988 systemd[1]: Finished Load Kernel Modules.
Oct  2 07:50:28 np0005465988 python3.9[209761]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:50:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:28.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:50:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:29.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:29 np0005465988 python3.9[209914]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:50:29 np0005465988 python3.9[210066]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:50:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:30.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:30 np0005465988 python3.9[210218]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:31.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:31 np0005465988 python3.9[210342]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405830.3789756-1522-242622997924634/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:50:32 np0005465988 podman[210466]: 2025-10-02 11:50:32.146562859 +0000 UTC m=+0.073015878 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 07:50:32 np0005465988 python3.9[210513]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:50:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:32.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:33.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:33 np0005465988 python3.9[210667]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:34 np0005465988 python3.9[210819]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:34.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:34 np0005465988 systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct  2 07:50:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:35.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:35 np0005465988 python3.9[210973]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:36 np0005465988 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct  2 07:50:36 np0005465988 python3.9[211125]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:36.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:36 np0005465988 python3.9[211278]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:50:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:37.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:37 np0005465988 python3.9[211479]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:37 np0005465988 podman[211716]: 2025-10-02 11:50:37.908624699 +0000 UTC m=+0.056041496 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct  2 07:50:38 np0005465988 podman[211716]: 2025-10-02 11:50:38.019703138 +0000 UTC m=+0.167119935 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3)
Oct  2 07:50:38 np0005465988 python3.9[211775]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:38.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:38 np0005465988 podman[211941]: 2025-10-02 11:50:38.536087035 +0000 UTC m=+0.049686291 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:50:38 np0005465988 podman[211941]: 2025-10-02 11:50:38.543243773 +0000 UTC m=+0.056843009 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 07:50:38 np0005465988 podman[212087]: 2025-10-02 11:50:38.792732793 +0000 UTC m=+0.054129230 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=keepalived, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, distribution-scope=public, release=1793, version=2.2.4, com.redhat.component=keepalived-container, vcs-type=git, description=keepalived for Ceph)
Oct  2 07:50:38 np0005465988 podman[212087]: 2025-10-02 11:50:38.80468171 +0000 UTC m=+0.066078107 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, io.openshift.expose-services=, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, release=1793, name=keepalived, vcs-type=git, architecture=x86_64, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Oct  2 07:50:38 np0005465988 python3.9[212122]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:50:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:39.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:39 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:50:39 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:50:39 np0005465988 python3.9[212431]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:50:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:40.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:50:40 np0005465988 python3.9[212597]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:50:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:50:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:50:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:41.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:41 np0005465988 python3.9[212750]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:50:42 np0005465988 python3.9[212828]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:50:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:42.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:50:42 np0005465988 python3.9[212980]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:43.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:43 np0005465988 python3.9[213059]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:43 np0005465988 python3.9[213211]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:44.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:44 np0005465988 python3.9[213413]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:45.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:45 np0005465988 podman[213464]: 2025-10-02 11:50:45.206683619 +0000 UTC m=+0.153782588 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct  2 07:50:45 np0005465988 python3.9[213505]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:45 np0005465988 podman[213642]: 2025-10-02 11:50:45.871656343 +0000 UTC m=+0.082833042 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 07:50:46 np0005465988 python3.9[213690]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:46.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:46 np0005465988 python3.9[213786]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:50:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:50:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:50:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:47.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:47 np0005465988 python3.9[213971]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:50:47 np0005465988 systemd[1]: virtqemud.service: Deactivated successfully.
Oct  2 07:50:47 np0005465988 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct  2 07:50:47 np0005465988 systemd[1]: Reloading.
Oct  2 07:50:47 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:47 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:50:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:48.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:50:48 np0005465988 python3.9[214161]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:49.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:49 np0005465988 python3.9[214240]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:49 np0005465988 python3.9[214392]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:50.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:50 np0005465988 python3.9[214470]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:51.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:51 np0005465988 python3.9[214623]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:50:51 np0005465988 systemd[1]: Reloading.
Oct  2 07:50:51 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:51 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:51 np0005465988 systemd[1]: Starting Create netns directory...
Oct  2 07:50:51 np0005465988 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:50:51 np0005465988 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:50:51 np0005465988 systemd[1]: Finished Create netns directory.
Oct  2 07:50:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:50:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:52.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:52 np0005465988 python3.9[214816]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:53.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:53 np0005465988 python3.9[214969]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:54 np0005465988 python3.9[215092]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405853.1925578-2143-269002917957215/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:54.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:55.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:55 np0005465988 python3.9[215245]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:56 np0005465988 python3.9[215397]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:56.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:56 np0005465988 python3.9[215520]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405855.689382-2217-205752821132781/.source.json _original_basename=.7lgd8hfi follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:50:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:57.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:57 np0005465988 python3.9[215673]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:50:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:50:58.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:50:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:50:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:50:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:50:59.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:51:00 np0005465988 python3.9[216101]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct  2 07:51:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:00.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:01 np0005465988 python3.9[216254]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:51:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:01.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:01 np0005465988 python3.9[216406]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  2 07:51:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:51:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:02.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:02 np0005465988 podman[216458]: 2025-10-02 11:51:02.522677082 +0000 UTC m=+0.061331180 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:51:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:03.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:03 np0005465988 python3[216604]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:51:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:51:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:04.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:51:04 np0005465988 podman[216616]: 2025-10-02 11:51:04.992116286 +0000 UTC m=+0.977721222 image pull d8d739f82a6fecf9df690e49539b589e74665b54e36448657b874630717d5bd1 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  2 07:51:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:05.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:05 np0005465988 podman[216723]: 2025-10-02 11:51:05.19413002 +0000 UTC m=+0.078004998 container create 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 07:51:05 np0005465988 podman[216723]: 2025-10-02 11:51:05.154312558 +0000 UTC m=+0.038187576 image pull d8d739f82a6fecf9df690e49539b589e74665b54e36448657b874630717d5bd1 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  2 07:51:05 np0005465988 python3[216604]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  2 07:51:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:06.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:06 np0005465988 python3.9[216911]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:51:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:07.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:07 np0005465988 python3.9[217066]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:08 np0005465988 python3.9[217142]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:08.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:09.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:09 np0005465988 python3.9[217294]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759405868.4987776-2482-120714081534846/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:09 np0005465988 python3.9[217370]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:51:09 np0005465988 systemd[1]: Reloading.
Oct  2 07:51:09 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:09 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:51:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:10.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:51:10 np0005465988 python3.9[217481]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:10 np0005465988 systemd[1]: Reloading.
Oct  2 07:51:11 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:11 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:11.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:11 np0005465988 systemd[1]: Starting multipathd container...
Oct  2 07:51:11 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:51:11 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4eb59eacbe2fcbf2d7dc4ab635ff39148f07e25f698f93684722fee2ea88c10/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  2 07:51:11 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4eb59eacbe2fcbf2d7dc4ab635ff39148f07e25f698f93684722fee2ea88c10/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:51:11 np0005465988 systemd[1]: Started /usr/bin/podman healthcheck run 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0.
Oct  2 07:51:11 np0005465988 podman[217522]: 2025-10-02 11:51:11.456465856 +0000 UTC m=+0.170702797 container init 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20251001)
Oct  2 07:51:11 np0005465988 multipathd[217537]: + sudo -E kolla_set_configs
Oct  2 07:51:11 np0005465988 podman[217522]: 2025-10-02 11:51:11.501807066 +0000 UTC m=+0.216043967 container start 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 07:51:11 np0005465988 podman[217522]: multipathd
Oct  2 07:51:11 np0005465988 systemd[1]: Started multipathd container.
Oct  2 07:51:11 np0005465988 multipathd[217537]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:51:11 np0005465988 multipathd[217537]: INFO:__main__:Validating config file
Oct  2 07:51:11 np0005465988 multipathd[217537]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:51:11 np0005465988 multipathd[217537]: INFO:__main__:Writing out command to execute
Oct  2 07:51:11 np0005465988 multipathd[217537]: ++ cat /run_command
Oct  2 07:51:11 np0005465988 multipathd[217537]: + CMD='/usr/sbin/multipathd -d'
Oct  2 07:51:11 np0005465988 multipathd[217537]: + ARGS=
Oct  2 07:51:11 np0005465988 multipathd[217537]: + sudo kolla_copy_cacerts
Oct  2 07:51:11 np0005465988 multipathd[217537]: + [[ ! -n '' ]]
Oct  2 07:51:11 np0005465988 multipathd[217537]: + . kolla_extend_start
Oct  2 07:51:11 np0005465988 multipathd[217537]: Running command: '/usr/sbin/multipathd -d'
Oct  2 07:51:11 np0005465988 multipathd[217537]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct  2 07:51:11 np0005465988 multipathd[217537]: + umask 0022
Oct  2 07:51:11 np0005465988 multipathd[217537]: + exec /usr/sbin/multipathd -d
Oct  2 07:51:11 np0005465988 multipathd[217537]: 3931.991684 | --------start up--------
Oct  2 07:51:11 np0005465988 multipathd[217537]: 3931.991701 | read /etc/multipath.conf
Oct  2 07:51:11 np0005465988 podman[217544]: 2025-10-02 11:51:11.598072157 +0000 UTC m=+0.075566558 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd)
Oct  2 07:51:11 np0005465988 multipathd[217537]: 3931.998596 | path checkers start up
Oct  2 07:51:11 np0005465988 systemd[1]: 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0-7bfab19f6ea6bc63.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:51:11 np0005465988 systemd[1]: 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0-7bfab19f6ea6bc63.service: Failed with result 'exit-code'.
Oct  2 07:51:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:51:12 np0005465988 python3.9[217724]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:12.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:51:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:13.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:51:13 np0005465988 python3.9[217879]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:51:14 np0005465988 python3.9[218044]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:51:14 np0005465988 systemd[1]: Stopping multipathd container...
Oct  2 07:51:14 np0005465988 multipathd[217537]: 3934.759592 | exit (signal)
Oct  2 07:51:14 np0005465988 multipathd[217537]: 3934.760212 | --------shut down-------
Oct  2 07:51:14 np0005465988 systemd[1]: libpod-91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0.scope: Deactivated successfully.
Oct  2 07:51:14 np0005465988 podman[218048]: 2025-10-02 11:51:14.403802607 +0000 UTC m=+0.152032871 container died 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:51:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:51:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:14.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:51:14 np0005465988 systemd[1]: 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0-7bfab19f6ea6bc63.timer: Deactivated successfully.
Oct  2 07:51:14 np0005465988 systemd[1]: Stopped /usr/bin/podman healthcheck run 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0.
Oct  2 07:51:14 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0-userdata-shm.mount: Deactivated successfully.
Oct  2 07:51:14 np0005465988 systemd[1]: var-lib-containers-storage-overlay-b4eb59eacbe2fcbf2d7dc4ab635ff39148f07e25f698f93684722fee2ea88c10-merged.mount: Deactivated successfully.
Oct  2 07:51:14 np0005465988 podman[218048]: 2025-10-02 11:51:14.750688105 +0000 UTC m=+0.498918329 container cleanup 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd)
Oct  2 07:51:14 np0005465988 podman[218048]: multipathd
Oct  2 07:51:14 np0005465988 podman[218079]: multipathd
Oct  2 07:51:14 np0005465988 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct  2 07:51:14 np0005465988 systemd[1]: Stopped multipathd container.
Oct  2 07:51:14 np0005465988 systemd[1]: Starting multipathd container...
Oct  2 07:51:14 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:51:14 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4eb59eacbe2fcbf2d7dc4ab635ff39148f07e25f698f93684722fee2ea88c10/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  2 07:51:14 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4eb59eacbe2fcbf2d7dc4ab635ff39148f07e25f698f93684722fee2ea88c10/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:51:14 np0005465988 systemd[1]: Started /usr/bin/podman healthcheck run 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0.
Oct  2 07:51:14 np0005465988 podman[218092]: 2025-10-02 11:51:14.988460635 +0000 UTC m=+0.125313296 container init 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 07:51:15 np0005465988 multipathd[218107]: + sudo -E kolla_set_configs
Oct  2 07:51:15 np0005465988 podman[218092]: 2025-10-02 11:51:15.018248289 +0000 UTC m=+0.155100920 container start 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 07:51:15 np0005465988 podman[218092]: multipathd
Oct  2 07:51:15 np0005465988 systemd[1]: Started multipathd container.
Oct  2 07:51:15 np0005465988 multipathd[218107]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:51:15 np0005465988 multipathd[218107]: INFO:__main__:Validating config file
Oct  2 07:51:15 np0005465988 multipathd[218107]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:51:15 np0005465988 multipathd[218107]: INFO:__main__:Writing out command to execute
Oct  2 07:51:15 np0005465988 multipathd[218107]: ++ cat /run_command
Oct  2 07:51:15 np0005465988 multipathd[218107]: + CMD='/usr/sbin/multipathd -d'
Oct  2 07:51:15 np0005465988 multipathd[218107]: + ARGS=
Oct  2 07:51:15 np0005465988 multipathd[218107]: + sudo kolla_copy_cacerts
Oct  2 07:51:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:51:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:15.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:51:15 np0005465988 podman[218114]: 2025-10-02 11:51:15.099790588 +0000 UTC m=+0.066855279 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true)
Oct  2 07:51:15 np0005465988 multipathd[218107]: + [[ ! -n '' ]]
Oct  2 07:51:15 np0005465988 multipathd[218107]: + . kolla_extend_start
Oct  2 07:51:15 np0005465988 multipathd[218107]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct  2 07:51:15 np0005465988 multipathd[218107]: Running command: '/usr/sbin/multipathd -d'
Oct  2 07:51:15 np0005465988 multipathd[218107]: + umask 0022
Oct  2 07:51:15 np0005465988 multipathd[218107]: + exec /usr/sbin/multipathd -d
Oct  2 07:51:15 np0005465988 systemd[1]: 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0-6fa61046879ef3d5.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:51:15 np0005465988 systemd[1]: 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0-6fa61046879ef3d5.service: Failed with result 'exit-code'.
Oct  2 07:51:15 np0005465988 multipathd[218107]: 3935.519668 | --------start up--------
Oct  2 07:51:15 np0005465988 multipathd[218107]: 3935.519687 | read /etc/multipath.conf
Oct  2 07:51:15 np0005465988 multipathd[218107]: 3935.525246 | path checkers start up
Oct  2 07:51:15 np0005465988 podman[218223]: 2025-10-02 11:51:15.59254369 +0000 UTC m=+0.118776968 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:51:15 np0005465988 python3.9[218324]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:16.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:16 np0005465988 podman[218354]: 2025-10-02 11:51:16.518310631 +0000 UTC m=+0.050727986 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 07:51:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:51:17 np0005465988 python3.9[218497]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  2 07:51:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:17.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:17 np0005465988 python3.9[218649]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct  2 07:51:17 np0005465988 kernel: Key type psk registered
Oct  2 07:51:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:18.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:18 np0005465988 python3.9[218811]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:19.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:19 np0005465988 python3.9[218935]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405878.2397945-2722-29095913616706/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:20 np0005465988 python3.9[219087]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:20.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:21.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:21 np0005465988 python3.9[219240]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:51:21 np0005465988 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct  2 07:51:21 np0005465988 systemd[1]: Stopped Load Kernel Modules.
Oct  2 07:51:21 np0005465988 systemd[1]: Stopping Load Kernel Modules...
Oct  2 07:51:21 np0005465988 systemd[1]: Starting Load Kernel Modules...
Oct  2 07:51:21 np0005465988 systemd[1]: Finished Load Kernel Modules.
Oct  2 07:51:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:51:22 np0005465988 python3.9[219396]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:51:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:51:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:22.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:51:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:23.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:23 np0005465988 python3.9[219481]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:51:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:24.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:25.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:26.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:51:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:27.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:51:27.318 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:51:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:51:27.320 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:51:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:51:27.320 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:51:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:28.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:29.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:29 np0005465988 systemd[1]: Reloading.
Oct  2 07:51:29 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:29 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:30 np0005465988 systemd[1]: Reloading.
Oct  2 07:51:30 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:30 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:30.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:30 np0005465988 systemd-logind[827]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  2 07:51:30 np0005465988 systemd-logind[827]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  2 07:51:30 np0005465988 lvm[219651]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct  2 07:51:30 np0005465988 lvm[219651]: VG ceph_vg0 finished
Oct  2 07:51:30 np0005465988 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:51:30 np0005465988 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:51:30 np0005465988 systemd[1]: Reloading.
Oct  2 07:51:31 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:31 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:31.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:31 np0005465988 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:51:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:51:32 np0005465988 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:51:32 np0005465988 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:51:32 np0005465988 systemd[1]: man-db-cache-update.service: Consumed 1.864s CPU time.
Oct  2 07:51:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:32 np0005465988 systemd[1]: run-rf56a5575784644d18e0475d160aeef65.service: Deactivated successfully.
Oct  2 07:51:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:51:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:32.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:51:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:51:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:33.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:51:33 np0005465988 podman[220866]: 2025-10-02 11:51:33.551526108 +0000 UTC m=+0.079940034 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:51:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:51:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:34.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:51:34 np0005465988 python3.9[221013]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:51:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:35.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:51:35 np0005465988 python3.9[221164]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:51:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:36.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:51:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:37.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:37 np0005465988 python3.9[221321]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:38.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:38 np0005465988 python3.9[221473]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:51:38 np0005465988 systemd[1]: Reloading.
Oct  2 07:51:38 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:38 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:51:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:39.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:51:40 np0005465988 python3.9[221659]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:51:40 np0005465988 network[221676]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:51:40 np0005465988 network[221677]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:51:40 np0005465988 network[221678]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:51:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:40.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:51:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:41.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:51:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:51:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:42.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:43.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:44.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:45.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:45 np0005465988 podman[221905]: 2025-10-02 11:51:45.538422307 +0000 UTC m=+0.076126894 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3)
Oct  2 07:51:45 np0005465988 podman[222003]: 2025-10-02 11:51:45.961236304 +0000 UTC m=+0.102959884 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:51:46 np0005465988 python3.9[222053]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:51:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:46.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:51:46 np0005465988 podman[222158]: 2025-10-02 11:51:46.611489314 +0000 UTC m=+0.058352915 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.878133) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405906878162, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1644, "num_deletes": 251, "total_data_size": 4071668, "memory_usage": 4124656, "flush_reason": "Manual Compaction"}
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405906892275, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 2678237, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15881, "largest_seqno": 17520, "table_properties": {"data_size": 2671331, "index_size": 4041, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 13830, "raw_average_key_size": 19, "raw_value_size": 2657679, "raw_average_value_size": 3785, "num_data_blocks": 182, "num_entries": 702, "num_filter_entries": 702, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759405751, "oldest_key_time": 1759405751, "file_creation_time": 1759405906, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 14173 microseconds, and 5785 cpu microseconds.
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.892306) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 2678237 bytes OK
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.892321) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.893886) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.893896) EVENT_LOG_v1 {"time_micros": 1759405906893893, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.893910) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 4064253, prev total WAL file size 4064253, number of live WAL files 2.
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.894836) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(2615KB)], [30(7484KB)]
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405906894919, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 10342155, "oldest_snapshot_seqno": -1}
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4249 keys, 8313406 bytes, temperature: kUnknown
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405906955067, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 8313406, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8283426, "index_size": 18286, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10629, "raw_key_size": 105056, "raw_average_key_size": 24, "raw_value_size": 8204897, "raw_average_value_size": 1931, "num_data_blocks": 769, "num_entries": 4249, "num_filter_entries": 4249, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759405906, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.955358) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 8313406 bytes
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.957198) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 171.7 rd, 138.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 7.3 +0.0 blob) out(7.9 +0.0 blob), read-write-amplify(7.0) write-amplify(3.1) OK, records in: 4766, records dropped: 517 output_compression: NoCompression
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.957218) EVENT_LOG_v1 {"time_micros": 1759405906957208, "job": 16, "event": "compaction_finished", "compaction_time_micros": 60245, "compaction_time_cpu_micros": 32756, "output_level": 6, "num_output_files": 1, "total_output_size": 8313406, "num_input_records": 4766, "num_output_records": 4249, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405906957742, "job": 16, "event": "table_file_deletion", "file_number": 32}
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405906958998, "job": 16, "event": "table_file_deletion", "file_number": 30}
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.894722) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.959057) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.959061) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.959063) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.959065) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:51:46 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:46.959067) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:51:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:51:47 np0005465988 python3.9[222331]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:51:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:47.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:51:47 np0005465988 python3.9[222514]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:51:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:51:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:51:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:48.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:48 np0005465988 python3.9[222667]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:49.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:49 np0005465988 python3.9[222821]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:50.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:50 np0005465988 python3.9[222974]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:50.951113) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405910951151, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 297, "num_deletes": 254, "total_data_size": 79383, "memory_usage": 86280, "flush_reason": "Manual Compaction"}
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405910953629, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 52058, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17525, "largest_seqno": 17817, "table_properties": {"data_size": 50156, "index_size": 130, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4579, "raw_average_key_size": 16, "raw_value_size": 46301, "raw_average_value_size": 165, "num_data_blocks": 6, "num_entries": 279, "num_filter_entries": 279, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759405907, "oldest_key_time": 1759405907, "file_creation_time": 1759405910, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 2558 microseconds, and 945 cpu microseconds.
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:50.953671) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 52058 bytes OK
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:50.953688) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:50.954886) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:50.954898) EVENT_LOG_v1 {"time_micros": 1759405910954895, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:50.954910) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 77170, prev total WAL file size 77170, number of live WAL files 2.
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:50.955440) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323531' seq:0, type:0; will stop at (end)
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(50KB)], [33(8118KB)]
Oct  2 07:51:50 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405910955479, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 8365464, "oldest_snapshot_seqno": -1}
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4011 keys, 8017939 bytes, temperature: kUnknown
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405911005718, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 8017939, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7989760, "index_size": 17061, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10053, "raw_key_size": 101427, "raw_average_key_size": 25, "raw_value_size": 7915473, "raw_average_value_size": 1973, "num_data_blocks": 704, "num_entries": 4011, "num_filter_entries": 4011, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759405910, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:51.006010) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 8017939 bytes
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:51.007911) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 166.2 rd, 159.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.0, 7.9 +0.0 blob) out(7.6 +0.0 blob), read-write-amplify(314.7) write-amplify(154.0) OK, records in: 4528, records dropped: 517 output_compression: NoCompression
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:51.007932) EVENT_LOG_v1 {"time_micros": 1759405911007923, "job": 18, "event": "compaction_finished", "compaction_time_micros": 50325, "compaction_time_cpu_micros": 17683, "output_level": 6, "num_output_files": 1, "total_output_size": 8017939, "num_input_records": 4528, "num_output_records": 4011, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405911008060, "job": 18, "event": "table_file_deletion", "file_number": 35}
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759405911010035, "job": 18, "event": "table_file_deletion", "file_number": 33}
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:50.955168) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:51.010098) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:51.010106) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:51.010110) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:51.010115) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:51:51 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:51:51.010119) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:51:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:51.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:51 np0005465988 python3.9[223128]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:51:52 np0005465988 python3.9[223281]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:52.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:51:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:53.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:51:53 np0005465988 python3.9[223435]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000057s ======
Oct  2 07:51:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:54.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Oct  2 07:51:54 np0005465988 python3.9[223637]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:54 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:51:54 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:51:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:55.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:55 np0005465988 python3.9[223790]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:56 np0005465988 python3.9[223942]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:51:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:56.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:51:56 np0005465988 python3.9[224094]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:51:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:57.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:57 np0005465988 python3.9[224247]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:58 np0005465988 python3.9[224399]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:51:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:51:58.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:51:59 np0005465988 python3.9[224552]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:51:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:51:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:51:59.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:51:59 np0005465988 python3.9[224704]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:00.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:00 np0005465988 python3.9[224856]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:01.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:01 np0005465988 python3.9[225009]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:52:02 np0005465988 python3.9[225161]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:02.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:02 np0005465988 python3.9[225313]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:03.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:03 np0005465988 python3.9[225466]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:04 np0005465988 podman[225590]: 2025-10-02 11:52:04.104573783 +0000 UTC m=+0.064161662 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 07:52:04 np0005465988 python3.9[225637]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:52:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:04.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:52:05 np0005465988 python3.9[225791]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:05.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:06 np0005465988 python3.9[225993]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:52:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:06.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:07 np0005465988 python3.9[226146]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:52:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:52:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:52:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:07.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:52:08 np0005465988 python3.9[226298]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:52:08 np0005465988 systemd[1]: Reloading.
Oct  2 07:52:08 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:52:08 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:52:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:08.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:09.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:09 np0005465988 python3.9[226486]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:52:10 np0005465988 python3.9[226639]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:52:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:52:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:10.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:52:10 np0005465988 python3.9[226792]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:52:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:11.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:11 np0005465988 python3.9[226946]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:52:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:52:12 np0005465988 python3.9[227099]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:52:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:12.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:12 np0005465988 python3.9[227253]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:52:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:13.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:13 np0005465988 python3.9[227406]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:52:14 np0005465988 python3.9[227559]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:52:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:14.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:15.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:16 np0005465988 podman[227685]: 2025-10-02 11:52:16.361512386 +0000 UTC m=+0.098178777 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 07:52:16 np0005465988 podman[227686]: 2025-10-02 11:52:16.379772719 +0000 UTC m=+0.109874792 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:52:16 np0005465988 python3.9[227748]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:16.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:52:17 np0005465988 podman[227885]: 2025-10-02 11:52:17.081244768 +0000 UTC m=+0.077061521 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, config_id=iscsid, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 07:52:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:17.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:17 np0005465988 python3.9[227931]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:18 np0005465988 python3.9[228083]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:18.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:18 np0005465988 python3.9[228235]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:19.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:19 np0005465988 python3.9[228388]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:20 np0005465988 python3.9[228540]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:20.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:20 np0005465988 python3.9[228692]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:21.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:21 np0005465988 python3.9[228845]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:52:22 np0005465988 python3.9[228997]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:52:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:22.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:52:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:23.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:23 np0005465988 python3.9[229150]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:24 np0005465988 python3.9[229302]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:24.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:24 np0005465988 python3.9[229454]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:25.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:26.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:52:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:27.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:52:27.319 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:52:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:52:27.320 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:52:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:52:27.320 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:52:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:28.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:29.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:30.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:31.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:31 np0005465988 python3.9[229660]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct  2 07:52:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:52:32 np0005465988 python3.9[229813]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:52:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:32.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:33.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:33 np0005465988 python3.9[229972]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  2 07:52:34 np0005465988 podman[230006]: 2025-10-02 11:52:34.510576287 +0000 UTC m=+0.049644965 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 07:52:34 np0005465988 systemd-logind[827]: New session 52 of user zuul.
Oct  2 07:52:34 np0005465988 systemd[1]: Started Session 52 of User zuul.
Oct  2 07:52:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:34.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:34 np0005465988 systemd[1]: session-52.scope: Deactivated successfully.
Oct  2 07:52:34 np0005465988 systemd-logind[827]: Session 52 logged out. Waiting for processes to exit.
Oct  2 07:52:34 np0005465988 systemd-logind[827]: Removed session 52.
Oct  2 07:52:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:52:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:35.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:52:35 np0005465988 python3.9[230178]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:52:36 np0005465988 python3.9[230299]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405954.9566708-4359-5345997543866/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:36.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:36 np0005465988 python3.9[230450]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:52:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:52:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:37.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:37 np0005465988 python3.9[230526]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:38 np0005465988 python3.9[230676]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:52:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:38.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:38 np0005465988 python3.9[230797]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405957.6363492-4359-39984715143112/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:52:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:39.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:52:39 np0005465988 python3.9[230948]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:52:40 np0005465988 python3.9[231069]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405959.0172038-4359-281401968222105/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:40.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:40 np0005465988 python3.9[231219]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:52:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:52:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:41.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:52:41 np0005465988 python3.9[231341]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405960.206823-4359-151040570185263/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:52:42 np0005465988 python3.9[231493]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:42.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:43 np0005465988 python3.9[231646]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:43.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:43 np0005465988 python3.9[231798]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:52:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:44.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:44 np0005465988 python3.9[231950]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:52:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:45.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:45 np0005465988 python3.9[232074]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759405964.0561373-4639-33717632896546/.source _original_basename=.9oz6_vmw follow=False checksum=8bafaae38fc3a6e6a30f0f0828436f79edca3ec6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Oct  2 07:52:46 np0005465988 python3.9[232276]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:52:46 np0005465988 podman[232339]: 2025-10-02 11:52:46.548299033 +0000 UTC m=+0.076584178 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 07:52:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:46.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:46 np0005465988 podman[232329]: 2025-10-02 11:52:46.621350968 +0000 UTC m=+0.153692349 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 07:52:46 np0005465988 python3.9[232474]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:52:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:52:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:47.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:47 np0005465988 podman[232570]: 2025-10-02 11:52:47.350280584 +0000 UTC m=+0.074503238 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 07:52:47 np0005465988 python3.9[232609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405966.4246285-4715-5157915187831/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=f022386746472553146d29f689b545df70fa8a60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:48 np0005465988 python3.9[232766]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:52:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:48.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:48 np0005465988 python3.9[232888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405967.7967987-4761-53551676243488/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:49.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:49 np0005465988 python3.9[233040]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct  2 07:52:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:50.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:51 np0005465988 python3.9[233193]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:52:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:52:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:51.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:52:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:52:52 np0005465988 python3[233345]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:52:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:52:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:52.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:52:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:53.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:54.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:55.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:56.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:52:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:57.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:52:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:52:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:52:58.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:52:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:52:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:52:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:52:59.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:53:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:00.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:53:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:01.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:01 np0005465988 podman[233358]: 2025-10-02 11:53:01.250841228 +0000 UTC m=+9.054985912 image pull e36f31143f26011980def9337d375f895bea59b742a3a2b372b996aa8ad58eba quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  2 07:53:01 np0005465988 podman[233559]: 2025-10-02 11:53:01.388527477 +0000 UTC m=+0.049062598 container create 99d60efbe895442362003c2757c7516e8f2fac4a157d84867710565285b685e1 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 07:53:01 np0005465988 podman[233559]: 2025-10-02 11:53:01.361497982 +0000 UTC m=+0.022033123 image pull e36f31143f26011980def9337d375f895bea59b742a3a2b372b996aa8ad58eba quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  2 07:53:01 np0005465988 python3[233345]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct  2 07:53:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:53:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:53:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:02.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:53:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:03.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:53:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:53:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:53:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:53:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:53:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:53:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:53:04 np0005465988 python3.9[233764]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:04.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:05.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:05 np0005465988 podman[233891]: 2025-10-02 11:53:05.261282309 +0000 UTC m=+0.069427923 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 07:53:05 np0005465988 python3.9[233938]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct  2 07:53:06 np0005465988 python3.9[234140]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:53:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:06.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:53:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:07.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:07 np0005465988 python3[234293]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:53:07 np0005465988 podman[234331]: 2025-10-02 11:53:07.596757001 +0000 UTC m=+0.090178438 container create 9fb7350c200a8f3713bdcf4324e86b76e6f7678092d3989b19172001f4ff439d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute)
Oct  2 07:53:07 np0005465988 podman[234331]: 2025-10-02 11:53:07.543479493 +0000 UTC m=+0.036900990 image pull e36f31143f26011980def9337d375f895bea59b742a3a2b372b996aa8ad58eba quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  2 07:53:07 np0005465988 python3[234293]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Oct  2 07:53:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:08.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:08 np0005465988 python3.9[234521]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:09.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:09 np0005465988 python3.9[234676]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:10 np0005465988 python3.9[234827]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759405989.6799273-5036-158099325415465/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:10.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:11 np0005465988 python3.9[234954]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:53:11 np0005465988 systemd[1]: Reloading.
Oct  2 07:53:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:53:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:53:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:11.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:11 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:53:11 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:53:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:53:12 np0005465988 python3.9[235064]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:53:12 np0005465988 systemd[1]: Reloading.
Oct  2 07:53:12 np0005465988 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:53:12 np0005465988 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:53:12 np0005465988 systemd[1]: Starting nova_compute container...
Oct  2 07:53:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:53:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:12.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:53:12 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:53:12 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b625f64cb750c5b158a1b0209098e8289a7038c3662d94b95232fe9248815fb7/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:12 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b625f64cb750c5b158a1b0209098e8289a7038c3662d94b95232fe9248815fb7/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:12 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b625f64cb750c5b158a1b0209098e8289a7038c3662d94b95232fe9248815fb7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:12 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b625f64cb750c5b158a1b0209098e8289a7038c3662d94b95232fe9248815fb7/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:12 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b625f64cb750c5b158a1b0209098e8289a7038c3662d94b95232fe9248815fb7/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:12 np0005465988 podman[235104]: 2025-10-02 11:53:12.687272966 +0000 UTC m=+0.134175259 container init 9fb7350c200a8f3713bdcf4324e86b76e6f7678092d3989b19172001f4ff439d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 07:53:12 np0005465988 podman[235104]: 2025-10-02 11:53:12.699036524 +0000 UTC m=+0.145938797 container start 9fb7350c200a8f3713bdcf4324e86b76e6f7678092d3989b19172001f4ff439d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 07:53:12 np0005465988 podman[235104]: nova_compute
Oct  2 07:53:12 np0005465988 nova_compute[235120]: + sudo -E kolla_set_configs
Oct  2 07:53:12 np0005465988 systemd[1]: Started nova_compute container.
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Validating config file
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Copying service configuration files
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Deleting /etc/ceph
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Creating directory /etc/ceph
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /etc/ceph
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Writing out command to execute
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:53:12 np0005465988 nova_compute[235120]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  2 07:53:12 np0005465988 nova_compute[235120]: ++ cat /run_command
Oct  2 07:53:12 np0005465988 nova_compute[235120]: + CMD=nova-compute
Oct  2 07:53:12 np0005465988 nova_compute[235120]: + ARGS=
Oct  2 07:53:12 np0005465988 nova_compute[235120]: + sudo kolla_copy_cacerts
Oct  2 07:53:12 np0005465988 nova_compute[235120]: + [[ ! -n '' ]]
Oct  2 07:53:12 np0005465988 nova_compute[235120]: + . kolla_extend_start
Oct  2 07:53:12 np0005465988 nova_compute[235120]: Running command: 'nova-compute'
Oct  2 07:53:12 np0005465988 nova_compute[235120]: + echo 'Running command: '\''nova-compute'\'''
Oct  2 07:53:12 np0005465988 nova_compute[235120]: + umask 0022
Oct  2 07:53:12 np0005465988 nova_compute[235120]: + exec nova-compute
Oct  2 07:53:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:13.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:13 np0005465988 python3.9[235282]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:14.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:15.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:15 np0005465988 python3.9[235434]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:15 np0005465988 nova_compute[235120]: 2025-10-02 11:53:15.574 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:53:15 np0005465988 nova_compute[235120]: 2025-10-02 11:53:15.575 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:53:15 np0005465988 nova_compute[235120]: 2025-10-02 11:53:15.575 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:53:15 np0005465988 nova_compute[235120]: 2025-10-02 11:53:15.575 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct  2 07:53:15 np0005465988 nova_compute[235120]: 2025-10-02 11:53:15.721 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:53:15 np0005465988 nova_compute[235120]: 2025-10-02 11:53:15.758 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:53:16 np0005465988 python3.9[235588]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.244 2 INFO nova.virt.driver [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.475 2 INFO nova.compute.provider_config [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.494 2 DEBUG oslo_concurrency.lockutils [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.495 2 DEBUG oslo_concurrency.lockutils [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.495 2 DEBUG oslo_concurrency.lockutils [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.495 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.496 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.496 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.496 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.496 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.497 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.497 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.497 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.497 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.497 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.498 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.498 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.498 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.498 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.498 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.499 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.499 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.499 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.499 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.500 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.500 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.500 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.500 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.500 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.501 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.501 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.501 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.501 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.502 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.502 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.502 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.502 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.503 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.503 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.503 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.503 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.503 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.504 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.504 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.504 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.504 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.504 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.505 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.505 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.505 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.505 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.506 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.506 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.506 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.506 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.507 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.507 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.507 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.507 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.507 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.507 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.508 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.508 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.509 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.509 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.509 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.510 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.510 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.510 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.511 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.511 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.512 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.512 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.513 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.513 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.514 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.514 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.515 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.515 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.516 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.516 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.517 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.517 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.518 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.518 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.519 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.519 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.520 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.520 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.521 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.521 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.522 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.522 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.523 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.523 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.524 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.524 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.525 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.525 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.526 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.526 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.527 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.527 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.528 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.528 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.528 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.529 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.529 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.530 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.530 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.531 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.531 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.532 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.532 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.533 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.533 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.534 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.534 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.534 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.535 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.535 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.536 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.536 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.537 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.537 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.538 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.538 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.539 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.539 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.540 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.540 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.540 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.541 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.541 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.541 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.541 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.542 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.542 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.542 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.543 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.543 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.543 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.544 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.544 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.544 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.545 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.545 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.545 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.546 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.546 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.546 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.547 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.547 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.547 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.548 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.548 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.548 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.549 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.549 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.549 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.550 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.550 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.550 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.551 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.551 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.551 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.552 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.552 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.552 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.553 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.553 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.553 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.554 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.554 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.554 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.554 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.555 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.555 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.555 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.556 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.556 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.556 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.557 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.557 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.557 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.558 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.558 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.558 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.559 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.559 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.560 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.560 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.560 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.561 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.561 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.561 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.562 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.562 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.562 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.562 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.563 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.563 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.563 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.564 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.564 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.564 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.564 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.564 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.565 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.565 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.565 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.565 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.565 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.565 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.566 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.566 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.566 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.566 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.566 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.567 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.567 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.567 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.567 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.567 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.568 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.568 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.568 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.568 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.568 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.568 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.569 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.569 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.569 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.569 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.569 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.570 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.570 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.570 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.570 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.570 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.570 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.571 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.571 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.571 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.571 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.571 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.572 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.572 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.572 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.572 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.572 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.573 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.573 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.573 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.573 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.573 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.573 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.574 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.574 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.574 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.574 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.574 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.575 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.575 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.575 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.575 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.575 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.575 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.576 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.576 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.576 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.576 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.576 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.577 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.577 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.577 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.577 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.577 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.578 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.578 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.578 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.578 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.578 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.579 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.579 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.579 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.579 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.579 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.579 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.580 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.580 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.580 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.580 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.581 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.581 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.581 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.581 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.581 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.582 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.582 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.582 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.582 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.582 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.583 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.583 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.583 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.583 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.583 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.584 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.584 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.584 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.584 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.584 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.585 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.585 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.585 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.586 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.586 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.586 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.586 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.587 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.587 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.587 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.587 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.588 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.588 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.588 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.588 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.588 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.589 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.589 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.589 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.589 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.589 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.590 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.590 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.590 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.590 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.591 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.591 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.591 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.591 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.591 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.592 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.592 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.592 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.592 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.593 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.593 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.593 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.593 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.594 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.594 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.594 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.594 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.595 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.595 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.595 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.595 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.595 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.596 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.596 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.596 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.596 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.597 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.597 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.597 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.597 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.597 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.598 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.598 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.598 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.598 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.598 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.599 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.599 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.599 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.599 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.599 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.600 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.600 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.600 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.600 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.600 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.600 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.601 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.601 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.601 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.601 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.601 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.601 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.601 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.602 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.602 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.602 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.602 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.602 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.602 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.602 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.603 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.603 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.603 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.603 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.603 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.603 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.603 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.604 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.604 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.604 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.604 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.604 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.604 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.605 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.605 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.605 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.605 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.605 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.605 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.605 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.606 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.606 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.606 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.606 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.606 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.607 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.607 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.607 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.607 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.607 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.607 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.607 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.608 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.608 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.608 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.608 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.608 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.608 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.608 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.609 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.609 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.609 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.609 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.609 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.609 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.609 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.610 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.610 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.610 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.610 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.610 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.610 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.610 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.611 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.611 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.611 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.611 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.611 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.611 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.611 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.612 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.612 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.612 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.612 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.612 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.612 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.612 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.613 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.613 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.613 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.613 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.613 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.613 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.614 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.614 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.614 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.614 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.614 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.615 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.615 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.615 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.615 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.615 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.615 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.615 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.616 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.616 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.616 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.616 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.616 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.616 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.616 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.617 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.617 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.617 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.617 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.617 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.618 2 WARNING oslo_config.cfg [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct  2 07:53:16 np0005465988 nova_compute[235120]: live_migration_uri is deprecated for removal in favor of two other options that
Oct  2 07:53:16 np0005465988 nova_compute[235120]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct  2 07:53:16 np0005465988 nova_compute[235120]: and ``live_migration_inbound_addr`` respectively.
Oct  2 07:53:16 np0005465988 nova_compute[235120]: ).  Its value may be silently ignored in the future.#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.618 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.618 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.618 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.618 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.619 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.619 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.619 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.619 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.619 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.619 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.619 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.620 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.620 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.620 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.620 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.620 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.620 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.621 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.621 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.rbd_secret_uuid        = fd4c5763-22d1-50ea-ad0b-96a3dc3040b2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.621 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.621 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.621 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.621 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.621 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.621 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.622 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.622 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.622 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.622 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.622 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.623 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.623 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.623 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.623 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.623 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.623 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.624 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.624 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.624 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.624 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.624 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.624 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.624 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.625 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.625 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.625 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.625 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.625 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.626 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.626 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.626 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.626 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.626 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.626 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.626 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.627 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.627 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.627 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.627 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.627 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.627 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.628 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.628 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.628 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.628 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.628 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.628 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.628 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:53:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:16.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.629 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.629 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.629 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.629 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.629 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.629 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.629 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.630 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.630 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.630 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.630 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.630 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.630 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.630 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.631 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.631 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.631 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.631 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.631 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.631 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.631 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.632 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.632 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.632 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.632 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.632 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.632 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.632 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.633 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.633 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.633 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.633 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.633 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.633 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.633 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.634 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.634 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.634 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.634 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.634 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.634 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.634 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.635 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.635 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.635 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.635 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.635 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.635 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.635 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.635 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.636 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.636 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.636 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.636 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.636 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.636 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.636 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.637 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.637 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.637 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.637 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.637 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.637 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.637 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.638 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.638 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.638 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.638 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.638 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.638 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.638 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.639 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.639 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.639 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.639 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.639 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.639 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.640 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.640 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.640 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.640 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.640 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.640 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.640 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.641 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.641 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.641 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.641 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.641 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.641 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.641 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.642 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.642 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.642 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.642 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.642 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.642 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.642 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.643 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.643 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.643 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.643 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.643 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.643 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.643 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.643 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.644 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.644 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.644 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.644 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.644 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.644 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.645 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.645 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.645 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.645 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.645 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.645 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.645 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.645 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.646 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.646 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.646 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.646 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.646 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.646 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.646 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.647 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.647 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.647 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.647 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.647 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.647 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.648 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.648 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.648 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.648 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.648 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.648 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.648 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.648 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.649 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.649 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.649 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.649 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.649 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.649 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.649 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.650 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.650 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.650 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.650 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.650 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.650 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.650 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.650 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.651 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.651 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.651 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.651 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.651 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.651 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.651 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.652 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.652 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.652 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.652 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.652 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.652 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.652 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.652 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.653 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.653 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.653 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.653 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.653 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.653 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.653 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.654 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.654 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.654 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.654 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.654 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.654 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.655 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.655 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.655 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.655 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.655 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.655 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.655 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.656 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.656 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.656 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.656 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.656 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.656 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.656 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.657 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.657 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.657 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.657 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.657 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.657 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.658 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.658 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.658 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.658 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.658 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.658 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.658 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.659 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.659 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.659 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.659 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.659 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.659 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.659 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.659 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.660 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.660 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.660 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.660 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.660 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.661 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.661 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.661 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.661 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.661 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.661 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.662 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.662 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.662 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.662 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.662 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.663 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.663 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.663 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.663 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.663 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.663 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.664 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.664 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.664 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.664 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.664 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.664 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.665 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.665 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.665 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.665 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.665 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.665 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.666 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.666 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.666 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.666 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.666 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.666 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.667 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.667 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.667 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.667 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.667 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.667 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.667 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.668 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.668 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.668 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.668 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.668 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.668 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.669 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.669 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.669 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.669 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.669 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.669 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.669 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.670 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.670 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.670 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.670 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.670 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.671 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.671 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.671 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.671 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.671 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.671 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.671 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.672 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.672 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.672 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.672 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.672 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.672 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.672 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.673 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.673 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.673 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.673 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.673 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.673 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.673 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.674 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.674 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.674 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.674 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.674 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.674 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.674 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.675 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.675 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.675 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.675 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.675 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.675 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.675 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.675 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.676 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.676 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.676 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.676 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.676 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.676 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.676 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.677 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.677 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.677 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.677 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.677 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.677 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.677 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.678 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.678 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.678 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.678 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.678 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.678 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.678 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.679 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.679 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.679 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.679 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.679 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.679 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.679 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.680 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.680 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.680 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.680 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.680 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.680 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.680 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.681 2 DEBUG oslo_service.service [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.682 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.699 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.700 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.700 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.700 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Oct  2 07:53:16 np0005465988 systemd[1]: Starting libvirt QEMU daemon...
Oct  2 07:53:16 np0005465988 systemd[1]: Started libvirt QEMU daemon.
Oct  2 07:53:16 np0005465988 podman[235688]: 2025-10-02 11:53:16.804477919 +0000 UTC m=+0.065809054 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.831 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f25f84d1b20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.835 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f25f84d1b20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.836 2 INFO nova.virt.libvirt.driver [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Connection event '1' reason 'None'#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.855 2 WARNING nova.virt.libvirt.driver [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Oct  2 07:53:16 np0005465988 nova_compute[235120]: 2025-10-02 11:53:16.856 2 DEBUG nova.virt.libvirt.volume.mount [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Oct  2 07:53:16 np0005465988 podman[235686]: 2025-10-02 11:53:16.870462587 +0000 UTC m=+0.122600557 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 07:53:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:53:17 np0005465988 python3.9[235833]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  2 07:53:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:17 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:53:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:17.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:17 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:53:17 np0005465988 podman[235881]: 2025-10-02 11:53:17.529069559 +0000 UTC m=+0.093131700 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible)
Oct  2 07:53:17 np0005465988 nova_compute[235120]: 2025-10-02 11:53:17.801 2 INFO nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Libvirt host capabilities <capabilities>
Oct  2 07:53:17 np0005465988 nova_compute[235120]: 
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <host>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <uuid>93278213-1c3c-4fb4-9fd1-d481e0b53ce1</uuid>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <cpu>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <arch>x86_64</arch>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model>EPYC-Rome-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <vendor>AMD</vendor>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <microcode version='16777317'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <signature family='23' model='49' stepping='0'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <maxphysaddr mode='emulate' bits='40'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='x2apic'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='tsc-deadline'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='osxsave'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='hypervisor'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='tsc_adjust'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='spec-ctrl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='stibp'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='arch-capabilities'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='ssbd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='cmp_legacy'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='topoext'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='virt-ssbd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='lbrv'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='tsc-scale'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='vmcb-clean'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='pause-filter'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='pfthreshold'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='svme-addr-chk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='rdctl-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='skip-l1dfl-vmentry'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='mds-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature name='pschange-mc-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <pages unit='KiB' size='4'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <pages unit='KiB' size='2048'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <pages unit='KiB' size='1048576'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </cpu>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <power_management>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <suspend_mem/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </power_management>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <iommu support='no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <migration_features>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <live/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <uri_transports>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <uri_transport>tcp</uri_transport>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <uri_transport>rdma</uri_transport>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </uri_transports>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </migration_features>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <topology>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <cells num='1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <cell id='0'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:          <memory unit='KiB'>7864104</memory>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:          <pages unit='KiB' size='4'>1966026</pages>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:          <pages unit='KiB' size='2048'>0</pages>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:          <pages unit='KiB' size='1048576'>0</pages>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:          <distances>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:            <sibling id='0' value='10'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:          </distances>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:          <cpus num='8'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:          </cpus>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        </cell>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </cells>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </topology>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <cache>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </cache>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <secmodel>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model>selinux</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <doi>0</doi>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </secmodel>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <secmodel>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model>dac</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <doi>0</doi>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </secmodel>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  </host>
Oct  2 07:53:17 np0005465988 nova_compute[235120]: 
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <guest>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <os_type>hvm</os_type>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <arch name='i686'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <wordsize>32</wordsize>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <domain type='qemu'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <domain type='kvm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </arch>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <features>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <pae/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <nonpae/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <acpi default='on' toggle='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <apic default='on' toggle='no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <cpuselection/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <deviceboot/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <disksnapshot default='on' toggle='no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <externalSnapshot/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </features>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  </guest>
Oct  2 07:53:17 np0005465988 nova_compute[235120]: 
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <guest>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <os_type>hvm</os_type>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <arch name='x86_64'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <wordsize>64</wordsize>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <domain type='qemu'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <domain type='kvm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </arch>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <features>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <acpi default='on' toggle='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <apic default='on' toggle='no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <cpuselection/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <deviceboot/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <disksnapshot default='on' toggle='no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <externalSnapshot/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </features>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  </guest>
Oct  2 07:53:17 np0005465988 nova_compute[235120]: 
Oct  2 07:53:17 np0005465988 nova_compute[235120]: </capabilities>
Oct  2 07:53:17 np0005465988 nova_compute[235120]: 
Oct  2 07:53:17 np0005465988 nova_compute[235120]: 2025-10-02 11:53:17.808 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  2 07:53:17 np0005465988 nova_compute[235120]: 2025-10-02 11:53:17.840 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct  2 07:53:17 np0005465988 nova_compute[235120]: <domainCapabilities>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <domain>kvm</domain>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <arch>i686</arch>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <vcpu max='240'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <iothreads supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <os supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <enum name='firmware'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <loader supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>rom</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>pflash</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='readonly'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>yes</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>no</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='secure'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>no</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </loader>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  </os>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <cpu>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>on</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>off</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='maximumMigratable'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>on</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>off</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <vendor>AMD</vendor>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='succor'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <mode name='custom' supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cooperlake'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Denverton'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Denverton-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Denverton-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Denverton-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amd-psfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amd-psfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amd-psfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='GraniteRapids'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='prefetchiti'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='prefetchiti'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx10'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx10-128'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx10-256'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx10-512'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='prefetchiti'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='KnightsMill'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512er'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512pf'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512er'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512pf'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G5'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tbm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tbm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='SierraForest'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Snowridge'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='athlon'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='athlon-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='core2duo'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='core2duo-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='coreduo'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='coreduo-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='n270'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='n270-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='phenom'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='phenom-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  </cpu>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <memoryBacking supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <enum name='sourceType'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <value>file</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <value>anonymous</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <value>memfd</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  </memoryBacking>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <devices>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <disk supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='diskDevice'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>disk</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>cdrom</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>floppy</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>lun</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='bus'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>ide</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>fdc</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>scsi</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>usb</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>sata</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio-transitional</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </disk>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <graphics supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>vnc</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>egl-headless</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>dbus</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </graphics>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <video supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='modelType'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>vga</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>cirrus</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>none</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>bochs</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>ramfb</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </video>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <hostdev supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='mode'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>subsystem</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='startupPolicy'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>default</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>mandatory</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>requisite</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>optional</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='subsysType'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>usb</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>pci</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>scsi</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='capsType'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='pciBackend'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </hostdev>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <rng supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio-transitional</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='backendModel'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>random</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>egd</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>builtin</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </rng>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <filesystem supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='driverType'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>path</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>handle</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtiofs</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </filesystem>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <tpm supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>tpm-tis</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>tpm-crb</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='backendModel'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>emulator</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>external</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='backendVersion'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>2.0</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </tpm>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <redirdev supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='bus'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>usb</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </redirdev>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <channel supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>pty</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>unix</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </channel>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <crypto supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='model'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>qemu</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='backendModel'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>builtin</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </crypto>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <interface supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='backendType'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>default</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>passt</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </interface>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <panic supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>isa</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>hyperv</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </panic>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  </devices>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <features>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <gic supported='no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <genid supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <backup supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <async-teardown supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <ps2 supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <sev supported='no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <sgx supported='no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <hyperv supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='features'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>relaxed</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>vapic</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>spinlocks</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>vpindex</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>runtime</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>synic</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>stimer</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>reset</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>vendor_id</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>frequencies</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>reenlightenment</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>tlbflush</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>ipi</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>avic</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>emsr_bitmap</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>xmm_input</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </hyperv>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <launchSecurity supported='no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  </features>
Oct  2 07:53:17 np0005465988 nova_compute[235120]: </domainCapabilities>
Oct  2 07:53:17 np0005465988 nova_compute[235120]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  2 07:53:17 np0005465988 nova_compute[235120]: 2025-10-02 11:53:17.846 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct  2 07:53:17 np0005465988 nova_compute[235120]: <domainCapabilities>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <domain>kvm</domain>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <arch>i686</arch>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <vcpu max='4096'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <iothreads supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <os supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <enum name='firmware'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <loader supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>rom</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>pflash</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='readonly'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>yes</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>no</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='secure'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>no</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </loader>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  </os>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <cpu>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>on</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>off</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='maximumMigratable'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>on</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>off</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <vendor>AMD</vendor>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='succor'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <mode name='custom' supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cooperlake'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Denverton'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Denverton-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Denverton-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Denverton-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amd-psfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amd-psfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amd-psfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='EPYC-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='GraniteRapids'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='prefetchiti'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='prefetchiti'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx10'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx10-128'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx10-256'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx10-512'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='prefetchiti'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='KnightsMill'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512er'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512pf'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512er'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512pf'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G5'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tbm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tbm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='SierraForest'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-ifma'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Snowridge'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='athlon'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='athlon-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='core2duo'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='core2duo-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='coreduo'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='coreduo-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='n270'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='n270-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='phenom'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='phenom-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  </cpu>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <memoryBacking supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <enum name='sourceType'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <value>file</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <value>anonymous</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <value>memfd</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  </memoryBacking>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <devices>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <disk supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='diskDevice'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>disk</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>cdrom</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>floppy</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>lun</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='bus'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>fdc</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>scsi</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>usb</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>sata</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio-transitional</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </disk>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <graphics supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>vnc</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>egl-headless</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>dbus</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </graphics>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <video supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='modelType'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>vga</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>cirrus</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>none</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>bochs</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>ramfb</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </video>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <hostdev supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='mode'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>subsystem</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='startupPolicy'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>default</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>mandatory</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>requisite</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>optional</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='subsysType'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>usb</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>pci</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>scsi</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='capsType'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='pciBackend'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </hostdev>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <rng supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio-transitional</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='backendModel'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>random</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>egd</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>builtin</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </rng>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <filesystem supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='driverType'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>path</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>handle</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>virtiofs</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </filesystem>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <tpm supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>tpm-tis</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>tpm-crb</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='backendModel'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>emulator</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>external</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='backendVersion'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>2.0</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </tpm>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <redirdev supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='bus'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>usb</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </redirdev>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <channel supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>pty</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>unix</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </channel>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <crypto supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='model'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>qemu</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='backendModel'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>builtin</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </crypto>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <interface supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='backendType'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>default</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>passt</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </interface>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <panic supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>isa</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>hyperv</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </panic>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  </devices>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <features>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <gic supported='no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <genid supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <backup supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <async-teardown supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <ps2 supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <sev supported='no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <sgx supported='no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <hyperv supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='features'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>relaxed</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>vapic</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>spinlocks</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>vpindex</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>runtime</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>synic</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>stimer</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>reset</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>vendor_id</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>frequencies</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>reenlightenment</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>tlbflush</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>ipi</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>avic</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>emsr_bitmap</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>xmm_input</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </hyperv>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <launchSecurity supported='no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  </features>
Oct  2 07:53:17 np0005465988 nova_compute[235120]: </domainCapabilities>
Oct  2 07:53:17 np0005465988 nova_compute[235120]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  2 07:53:17 np0005465988 nova_compute[235120]: 2025-10-02 11:53:17.886 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  2 07:53:17 np0005465988 nova_compute[235120]: 2025-10-02 11:53:17.892 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct  2 07:53:17 np0005465988 nova_compute[235120]: <domainCapabilities>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <domain>kvm</domain>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <arch>x86_64</arch>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <vcpu max='240'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <iothreads supported='yes'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <os supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <enum name='firmware'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <loader supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>rom</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>pflash</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='readonly'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>yes</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>no</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='secure'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>no</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </loader>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  </os>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:  <cpu>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>on</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>off</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <enum name='maximumMigratable'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>on</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <value>off</value>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <vendor>AMD</vendor>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='succor'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:    <mode name='custom' supported='yes'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cooperlake'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:17 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Denverton'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Denverton-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Denverton-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Denverton-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amd-psfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amd-psfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amd-psfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-v4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='GraniteRapids'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='prefetchiti'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='prefetchiti'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx10'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx10-128'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx10-256'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx10-512'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='prefetchiti'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='KnightsMill'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512er'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512pf'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512er'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512pf'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G5'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tbm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tbm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='SierraForest'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Snowridge'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='athlon'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='athlon-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='core2duo'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='core2duo-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='coreduo'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='coreduo-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='n270'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='n270-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='phenom'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='phenom-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  </cpu>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <memoryBacking supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <enum name='sourceType'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <value>file</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <value>anonymous</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <value>memfd</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  </memoryBacking>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <devices>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <disk supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='diskDevice'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>disk</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>cdrom</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>floppy</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>lun</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='bus'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>ide</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>fdc</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>scsi</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>usb</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>sata</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio-transitional</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </disk>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <graphics supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>vnc</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>egl-headless</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>dbus</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </graphics>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <video supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='modelType'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>vga</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>cirrus</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>none</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>bochs</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>ramfb</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </video>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <hostdev supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='mode'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>subsystem</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='startupPolicy'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>default</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>mandatory</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>requisite</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>optional</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='subsysType'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>usb</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>pci</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>scsi</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='capsType'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='pciBackend'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </hostdev>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <rng supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio-transitional</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='backendModel'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>random</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>egd</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>builtin</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </rng>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <filesystem supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='driverType'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>path</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>handle</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtiofs</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </filesystem>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <tpm supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>tpm-tis</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>tpm-crb</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='backendModel'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>emulator</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>external</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='backendVersion'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>2.0</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </tpm>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <redirdev supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='bus'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>usb</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </redirdev>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <channel supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>pty</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>unix</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </channel>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <crypto supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='model'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>qemu</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='backendModel'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>builtin</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </crypto>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <interface supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='backendType'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>default</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>passt</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </interface>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <panic supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>isa</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>hyperv</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </panic>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  </devices>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <features>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <gic supported='no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <genid supported='yes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <backup supported='yes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <async-teardown supported='yes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <ps2 supported='yes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <sev supported='no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <sgx supported='no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <hyperv supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='features'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>relaxed</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>vapic</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>spinlocks</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>vpindex</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>runtime</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>synic</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>stimer</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>reset</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>vendor_id</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>frequencies</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>reenlightenment</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>tlbflush</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>ipi</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>avic</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>emsr_bitmap</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>xmm_input</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </hyperv>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <launchSecurity supported='no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  </features>
Oct  2 07:53:18 np0005465988 nova_compute[235120]: </domainCapabilities>
Oct  2 07:53:18 np0005465988 nova_compute[235120]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:17.957 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct  2 07:53:18 np0005465988 nova_compute[235120]: <domainCapabilities>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <domain>kvm</domain>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <arch>x86_64</arch>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <vcpu max='4096'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <iothreads supported='yes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <os supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <enum name='firmware'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <value>efi</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <loader supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>rom</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>pflash</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='readonly'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>yes</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>no</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='secure'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>yes</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>no</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </loader>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  </os>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <cpu>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>on</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>off</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='maximumMigratable'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>on</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>off</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <vendor>AMD</vendor>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='succor'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <mode name='custom' supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Broadwell'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Cooperlake'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Denverton'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Denverton-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Denverton-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Denverton-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amd-psfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amd-psfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amd-psfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='EPYC-v4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='GraniteRapids'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='prefetchiti'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='prefetchiti'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx10'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx10-128'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx10-256'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx10-512'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='prefetchiti'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Haswell-v4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='KnightsMill'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512er'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512pf'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512er'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512pf'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G5'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tbm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fma4'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tbm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xop'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='amx-tile'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrc'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fzrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='la57'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='taa-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xfd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='SierraForest'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-ifma'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='fsrs'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ibrs-all'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mcdt-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='psdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='serialize'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vaes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='hle'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='rtm'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512bw'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512cd'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512dq'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512f'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='avx512vl'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='invpcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pcid'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='pku'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Snowridge'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='mpx'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='core-capability'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='cldemote'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='erms'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='gfni'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdir64b'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='movdiri'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='xsaves'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='athlon'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='athlon-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='core2duo'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='core2duo-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='coreduo'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='coreduo-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='n270'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='n270-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='ss'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='phenom'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <blockers model='phenom-v1'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnow'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <feature name='3dnowext'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </blockers>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </mode>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  </cpu>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <memoryBacking supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <enum name='sourceType'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <value>file</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <value>anonymous</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <value>memfd</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  </memoryBacking>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <devices>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <disk supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='diskDevice'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>disk</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>cdrom</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>floppy</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>lun</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='bus'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>fdc</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>scsi</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>usb</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>sata</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio-transitional</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </disk>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <graphics supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>vnc</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>egl-headless</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>dbus</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </graphics>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <video supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='modelType'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>vga</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>cirrus</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>none</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>bochs</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>ramfb</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </video>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <hostdev supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='mode'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>subsystem</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='startupPolicy'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>default</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>mandatory</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>requisite</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>optional</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='subsysType'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>usb</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>pci</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>scsi</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='capsType'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='pciBackend'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </hostdev>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <rng supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio-transitional</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='backendModel'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>random</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>egd</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>builtin</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </rng>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <filesystem supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='driverType'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>path</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>handle</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>virtiofs</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </filesystem>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <tpm supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>tpm-tis</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>tpm-crb</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='backendModel'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>emulator</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>external</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='backendVersion'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>2.0</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </tpm>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <redirdev supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='bus'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>usb</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </redirdev>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <channel supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>pty</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>unix</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </channel>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <crypto supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='model'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='type'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>qemu</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='backendModel'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>builtin</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </crypto>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <interface supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='backendType'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>default</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>passt</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </interface>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <panic supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='model'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>isa</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>hyperv</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </panic>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  </devices>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <features>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <gic supported='no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <genid supported='yes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <backup supported='yes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <async-teardown supported='yes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <ps2 supported='yes'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <sev supported='no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <sgx supported='no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <hyperv supported='yes'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      <enum name='features'>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>relaxed</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>vapic</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>spinlocks</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>vpindex</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>runtime</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>synic</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>stimer</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>reset</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>vendor_id</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>frequencies</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>reenlightenment</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>tlbflush</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>ipi</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>avic</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>emsr_bitmap</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:        <value>xmm_input</value>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:      </enum>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    </hyperv>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:    <launchSecurity supported='no'/>
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  </features>
Oct  2 07:53:18 np0005465988 nova_compute[235120]: </domainCapabilities>
Oct  2 07:53:18 np0005465988 nova_compute[235120]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.024 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.024 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.025 2 DEBUG nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.025 2 INFO nova.virt.libvirt.host [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Secure Boot support detected#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.027 2 INFO nova.virt.libvirt.driver [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.028 2 INFO nova.virt.libvirt.driver [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.057 2 DEBUG nova.virt.libvirt.driver [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] cpu compare xml: <cpu match="exact">
Oct  2 07:53:18 np0005465988 nova_compute[235120]:  <model>Nehalem</model>
Oct  2 07:53:18 np0005465988 nova_compute[235120]: </cpu>
Oct  2 07:53:18 np0005465988 nova_compute[235120]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.060 2 DEBUG nova.virt.libvirt.driver [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.097 2 INFO nova.virt.node [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Determined node identity 5abd2871-a992-42ab-bb6a-594a92f77d4d from /var/lib/nova/compute_id#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.126 2 WARNING nova.compute.manager [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Compute nodes ['5abd2871-a992-42ab-bb6a-594a92f77d4d'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.183 2 INFO nova.compute.manager [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.220 2 WARNING nova.compute.manager [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.221 2 DEBUG oslo_concurrency.lockutils [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.221 2 DEBUG oslo_concurrency.lockutils [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.221 2 DEBUG oslo_concurrency.lockutils [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.221 2 DEBUG nova.compute.resource_tracker [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.222 2 DEBUG oslo_concurrency.processutils [None req-7eea1a00-6ac1-4c03-b0a8-1a458c27e915 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:53:18 np0005465988 python3.9[236041]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:53:18 np0005465988 systemd[1]: Stopping nova_compute container...
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.422 2 DEBUG oslo_concurrency.lockutils [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.422 2 DEBUG oslo_concurrency.lockutils [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:53:18 np0005465988 nova_compute[235120]: 2025-10-02 11:53:18.423 2 DEBUG oslo_concurrency.lockutils [None req-20e07223-03d1-45c7-88aa-6b7f44c97463 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:53:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:18.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:53:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4205628115' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:53:18 np0005465988 virtqemud[235689]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct  2 07:53:18 np0005465988 virtqemud[235689]: hostname: compute-2
Oct  2 07:53:18 np0005465988 virtqemud[235689]: End of file while reading data: Input/output error
Oct  2 07:53:18 np0005465988 systemd[1]: libpod-9fb7350c200a8f3713bdcf4324e86b76e6f7678092d3989b19172001f4ff439d.scope: Deactivated successfully.
Oct  2 07:53:18 np0005465988 systemd[1]: libpod-9fb7350c200a8f3713bdcf4324e86b76e6f7678092d3989b19172001f4ff439d.scope: Consumed 3.734s CPU time.
Oct  2 07:53:18 np0005465988 podman[236065]: 2025-10-02 11:53:18.828343237 +0000 UTC m=+0.445715630 container died 9fb7350c200a8f3713bdcf4324e86b76e6f7678092d3989b19172001f4ff439d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Oct  2 07:53:18 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9fb7350c200a8f3713bdcf4324e86b76e6f7678092d3989b19172001f4ff439d-userdata-shm.mount: Deactivated successfully.
Oct  2 07:53:18 np0005465988 systemd[1]: var-lib-containers-storage-overlay-b625f64cb750c5b158a1b0209098e8289a7038c3662d94b95232fe9248815fb7-merged.mount: Deactivated successfully.
Oct  2 07:53:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:53:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:19.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:53:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:20.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:20 np0005465988 podman[236065]: 2025-10-02 11:53:20.929491877 +0000 UTC m=+2.546864270 container cleanup 9fb7350c200a8f3713bdcf4324e86b76e6f7678092d3989b19172001f4ff439d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 07:53:20 np0005465988 podman[236065]: nova_compute
Oct  2 07:53:21 np0005465988 podman[236098]: nova_compute
Oct  2 07:53:21 np0005465988 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct  2 07:53:21 np0005465988 systemd[1]: Stopped nova_compute container.
Oct  2 07:53:21 np0005465988 systemd[1]: Starting nova_compute container...
Oct  2 07:53:21 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:53:21 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b625f64cb750c5b158a1b0209098e8289a7038c3662d94b95232fe9248815fb7/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:21 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b625f64cb750c5b158a1b0209098e8289a7038c3662d94b95232fe9248815fb7/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:21 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b625f64cb750c5b158a1b0209098e8289a7038c3662d94b95232fe9248815fb7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:21 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b625f64cb750c5b158a1b0209098e8289a7038c3662d94b95232fe9248815fb7/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:21 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b625f64cb750c5b158a1b0209098e8289a7038c3662d94b95232fe9248815fb7/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:21 np0005465988 podman[236111]: 2025-10-02 11:53:21.167027379 +0000 UTC m=+0.107738490 container init 9fb7350c200a8f3713bdcf4324e86b76e6f7678092d3989b19172001f4ff439d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251001)
Oct  2 07:53:21 np0005465988 podman[236111]: 2025-10-02 11:53:21.181241277 +0000 UTC m=+0.121952358 container start 9fb7350c200a8f3713bdcf4324e86b76e6f7678092d3989b19172001f4ff439d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 07:53:21 np0005465988 nova_compute[236126]: + sudo -E kolla_set_configs
Oct  2 07:53:21 np0005465988 podman[236111]: nova_compute
Oct  2 07:53:21 np0005465988 systemd[1]: Started nova_compute container.
Oct  2 07:53:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:21.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Validating config file
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Copying service configuration files
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Deleting /etc/ceph
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Creating directory /etc/ceph
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /etc/ceph
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Writing out command to execute
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:53:21 np0005465988 nova_compute[236126]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  2 07:53:21 np0005465988 nova_compute[236126]: ++ cat /run_command
Oct  2 07:53:21 np0005465988 nova_compute[236126]: + CMD=nova-compute
Oct  2 07:53:21 np0005465988 nova_compute[236126]: + ARGS=
Oct  2 07:53:21 np0005465988 nova_compute[236126]: + sudo kolla_copy_cacerts
Oct  2 07:53:21 np0005465988 nova_compute[236126]: Running command: 'nova-compute'
Oct  2 07:53:21 np0005465988 nova_compute[236126]: + [[ ! -n '' ]]
Oct  2 07:53:21 np0005465988 nova_compute[236126]: + . kolla_extend_start
Oct  2 07:53:21 np0005465988 nova_compute[236126]: + echo 'Running command: '\''nova-compute'\'''
Oct  2 07:53:21 np0005465988 nova_compute[236126]: + umask 0022
Oct  2 07:53:21 np0005465988 nova_compute[236126]: + exec nova-compute
Oct  2 07:53:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:53:22 np0005465988 python3.9[236290]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  2 07:53:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:22.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:22 np0005465988 systemd[1]: Started libpod-conmon-99d60efbe895442362003c2757c7516e8f2fac4a157d84867710565285b685e1.scope.
Oct  2 07:53:22 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:53:22 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/442ea3658f3ed3913cd34a845f7fcca17ce00879b88ed778f49a7c6d47e47aef/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:22 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/442ea3658f3ed3913cd34a845f7fcca17ce00879b88ed778f49a7c6d47e47aef/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:22 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/442ea3658f3ed3913cd34a845f7fcca17ce00879b88ed778f49a7c6d47e47aef/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:22 np0005465988 podman[236315]: 2025-10-02 11:53:22.81131704 +0000 UTC m=+0.163688199 container init 99d60efbe895442362003c2757c7516e8f2fac4a157d84867710565285b685e1 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=nova_compute_init, org.label-schema.build-date=20251001)
Oct  2 07:53:22 np0005465988 podman[236315]: 2025-10-02 11:53:22.820661309 +0000 UTC m=+0.173032478 container start 99d60efbe895442362003c2757c7516e8f2fac4a157d84867710565285b685e1 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251001)
Oct  2 07:53:22 np0005465988 python3.9[236290]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct  2 07:53:22 np0005465988 nova_compute_init[236338]: INFO:nova_statedir:Applying nova statedir ownership
Oct  2 07:53:22 np0005465988 nova_compute_init[236338]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct  2 07:53:22 np0005465988 nova_compute_init[236338]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct  2 07:53:22 np0005465988 nova_compute_init[236338]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct  2 07:53:22 np0005465988 nova_compute_init[236338]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct  2 07:53:22 np0005465988 nova_compute_init[236338]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct  2 07:53:22 np0005465988 nova_compute_init[236338]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct  2 07:53:22 np0005465988 nova_compute_init[236338]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct  2 07:53:22 np0005465988 nova_compute_init[236338]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct  2 07:53:22 np0005465988 nova_compute_init[236338]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct  2 07:53:22 np0005465988 nova_compute_init[236338]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct  2 07:53:22 np0005465988 nova_compute_init[236338]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:53:22 np0005465988 nova_compute_init[236338]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct  2 07:53:22 np0005465988 nova_compute_init[236338]: INFO:nova_statedir:Nova statedir ownership complete
Oct  2 07:53:22 np0005465988 systemd[1]: libpod-99d60efbe895442362003c2757c7516e8f2fac4a157d84867710565285b685e1.scope: Deactivated successfully.
Oct  2 07:53:22 np0005465988 podman[236353]: 2025-10-02 11:53:22.95909426 +0000 UTC m=+0.028208962 container died 99d60efbe895442362003c2757c7516e8f2fac4a157d84867710565285b685e1 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=edpm)
Oct  2 07:53:22 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99d60efbe895442362003c2757c7516e8f2fac4a157d84867710565285b685e1-userdata-shm.mount: Deactivated successfully.
Oct  2 07:53:22 np0005465988 systemd[1]: var-lib-containers-storage-overlay-442ea3658f3ed3913cd34a845f7fcca17ce00879b88ed778f49a7c6d47e47aef-merged.mount: Deactivated successfully.
Oct  2 07:53:23 np0005465988 podman[236353]: 2025-10-02 11:53:23.007190053 +0000 UTC m=+0.076304755 container cleanup 99d60efbe895442362003c2757c7516e8f2fac4a157d84867710565285b685e1 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 07:53:23 np0005465988 systemd[1]: libpod-conmon-99d60efbe895442362003c2757c7516e8f2fac4a157d84867710565285b685e1.scope: Deactivated successfully.
Oct  2 07:53:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:23.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:23 np0005465988 nova_compute[236126]: 2025-10-02 11:53:23.305 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:53:23 np0005465988 nova_compute[236126]: 2025-10-02 11:53:23.305 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:53:23 np0005465988 nova_compute[236126]: 2025-10-02 11:53:23.306 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:53:23 np0005465988 nova_compute[236126]: 2025-10-02 11:53:23.306 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct  2 07:53:23 np0005465988 nova_compute[236126]: 2025-10-02 11:53:23.432 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:53:23 np0005465988 nova_compute[236126]: 2025-10-02 11:53:23.463 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:53:23 np0005465988 systemd[1]: session-50.scope: Deactivated successfully.
Oct  2 07:53:23 np0005465988 systemd[1]: session-50.scope: Consumed 2min 58.504s CPU time.
Oct  2 07:53:23 np0005465988 systemd-logind[827]: Session 50 logged out. Waiting for processes to exit.
Oct  2 07:53:23 np0005465988 systemd-logind[827]: Removed session 50.
Oct  2 07:53:23 np0005465988 nova_compute[236126]: 2025-10-02 11:53:23.894 2 INFO nova.virt.driver [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.025 2 INFO nova.compute.provider_config [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.034 2 DEBUG oslo_concurrency.lockutils [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.035 2 DEBUG oslo_concurrency.lockutils [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.035 2 DEBUG oslo_concurrency.lockutils [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.035 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.035 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.035 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.035 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.036 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.036 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.036 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.036 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.036 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.036 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.036 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.037 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.037 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.037 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.037 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.037 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.037 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.037 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.038 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.038 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.038 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.038 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.038 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.038 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.038 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.038 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.039 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.039 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.039 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.039 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.039 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.039 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.040 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.040 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.040 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.040 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.040 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.040 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.040 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.041 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.041 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.041 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.041 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.041 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.041 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.041 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.042 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.042 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.042 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.042 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.042 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.042 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.042 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.043 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.043 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.043 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.043 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.043 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.043 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.043 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.044 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.044 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.044 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.044 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.044 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.044 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.044 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.044 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.045 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.045 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.045 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.045 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.045 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.045 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.045 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.046 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.046 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.046 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.046 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.046 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.046 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.047 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.047 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.047 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.047 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.047 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.047 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.047 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.048 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.048 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.048 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.048 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.048 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.048 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.048 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.049 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.049 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.049 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.049 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.049 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.050 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.050 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.050 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.050 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.050 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.050 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.050 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.050 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.051 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.051 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.051 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.051 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.051 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.051 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.051 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.052 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.052 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.052 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.052 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.052 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.052 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.052 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.053 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.053 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.053 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.053 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.053 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.053 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.054 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.054 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.054 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.054 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.054 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.055 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.055 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.055 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.055 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.055 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.055 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.056 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.056 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.056 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.056 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.056 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.056 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.057 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.057 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.057 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.057 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.057 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.057 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.057 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.058 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.058 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.058 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.058 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.058 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.058 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.059 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.059 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.059 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.059 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.059 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.059 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.059 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.060 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.060 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.060 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.060 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.060 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.060 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.061 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.061 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.061 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.061 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.061 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.061 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.061 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.062 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.062 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.062 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.062 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.062 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.062 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.063 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.063 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.063 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.063 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.063 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.063 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.063 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.064 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.064 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.064 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.064 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.064 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.064 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.065 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.065 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.065 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.065 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.065 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.065 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.065 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.066 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.066 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.066 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.066 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.066 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.066 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.066 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.067 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.067 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.067 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.067 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.067 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.068 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.068 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.068 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.068 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.068 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.069 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.069 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.069 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.069 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.069 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.069 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.070 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.070 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.070 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.070 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.070 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.070 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.070 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.071 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.071 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.071 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.071 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.071 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.071 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.071 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.072 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.072 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.072 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.072 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.072 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.073 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.073 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.073 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.073 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.073 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.074 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.074 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.074 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.075 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.075 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.075 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.075 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.076 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.076 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.076 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.076 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.076 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.076 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.077 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.077 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.077 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.077 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.077 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.077 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.077 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.078 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.078 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.078 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.078 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.078 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.078 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.079 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.079 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.079 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.079 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.079 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.079 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.079 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.080 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.080 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.080 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.080 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.080 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.080 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.080 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.081 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.081 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.081 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.081 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.081 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.081 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.081 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.082 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.082 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.082 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.082 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.082 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.082 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.083 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.083 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.083 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.083 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.083 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.083 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.083 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.084 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.084 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.084 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.084 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.084 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.084 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.084 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.085 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.085 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.085 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.085 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.085 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.085 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.086 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.086 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.086 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.086 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.086 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.086 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.087 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.087 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.087 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.087 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.087 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.087 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.087 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.088 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.088 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.088 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.088 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.088 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.088 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.089 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.089 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.089 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.089 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.089 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.089 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.090 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.090 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.090 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.090 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.090 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.090 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.091 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.091 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.091 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.091 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.092 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.092 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.092 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.092 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.092 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.092 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.093 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.093 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.093 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.093 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.093 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.093 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.094 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.094 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.094 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.094 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.094 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.094 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.095 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.095 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.095 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.095 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.095 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.096 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.096 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.096 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.096 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.096 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.096 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.097 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.097 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.097 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.097 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.097 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.097 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.097 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.098 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.098 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.098 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.098 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.098 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.098 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.099 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.099 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.099 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.099 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.099 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.099 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.099 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.100 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.100 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.100 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.100 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.100 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.100 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.100 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.101 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.101 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.101 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.101 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.101 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.101 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.102 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.102 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.102 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.102 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.102 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.102 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.102 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.103 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.103 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.103 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.103 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.103 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.103 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.103 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.104 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.104 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.104 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.104 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.104 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.104 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.104 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.105 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.105 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.105 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.105 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.105 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.105 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.106 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.106 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.106 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.106 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.106 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.106 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.106 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.107 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.107 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.107 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.107 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.107 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.107 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.108 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.108 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.108 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.108 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.108 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.108 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.109 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.109 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.109 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.109 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.109 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.109 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.109 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.110 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.110 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.110 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.110 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.110 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.110 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.110 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.111 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.111 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.111 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.111 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.111 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.111 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.112 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.112 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.112 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.112 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.112 2 WARNING oslo_config.cfg [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct  2 07:53:24 np0005465988 nova_compute[236126]: live_migration_uri is deprecated for removal in favor of two other options that
Oct  2 07:53:24 np0005465988 nova_compute[236126]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct  2 07:53:24 np0005465988 nova_compute[236126]: and ``live_migration_inbound_addr`` respectively.
Oct  2 07:53:24 np0005465988 nova_compute[236126]: ).  Its value may be silently ignored in the future.#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.112 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.113 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.113 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.113 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.113 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.113 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.113 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.114 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.114 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.114 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.114 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.114 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.114 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.114 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.115 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.115 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.115 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.115 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.115 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.rbd_secret_uuid        = fd4c5763-22d1-50ea-ad0b-96a3dc3040b2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.115 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.115 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.116 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.116 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.116 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.116 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.116 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.116 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.116 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.117 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.117 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.117 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.117 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.117 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.117 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.118 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.118 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.118 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.118 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.118 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.118 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.118 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.119 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.119 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.119 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.119 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.119 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.119 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.119 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.120 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.120 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.120 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.120 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.120 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.120 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.120 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.121 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.121 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.121 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.121 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.121 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.121 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.121 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.121 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.122 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.122 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.122 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.122 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.122 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.122 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.122 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.122 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.123 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.123 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.123 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.123 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.123 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.123 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.123 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.124 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.124 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.124 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.124 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.124 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.124 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.124 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.125 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.125 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.125 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.125 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.125 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.125 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.125 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.126 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.126 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.126 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.126 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.126 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.126 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.126 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.126 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.127 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.127 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.127 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.127 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.127 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.127 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.127 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.128 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.128 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.128 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.128 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.128 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.128 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.128 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.128 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.129 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.129 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.129 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.129 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.129 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.129 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.129 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.130 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.130 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.130 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.130 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.130 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.130 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.130 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.130 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.131 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.131 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.131 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.131 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.131 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.131 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.131 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.132 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.132 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.132 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.132 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.132 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.132 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.133 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.133 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.133 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.133 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.133 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.133 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.133 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.134 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.134 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.134 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.134 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.134 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.134 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.134 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.135 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.135 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.135 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.135 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.135 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.135 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.135 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.135 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.136 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.136 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.136 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.136 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.136 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.136 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.136 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.137 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.137 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.137 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.137 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.137 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.137 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.137 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.138 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.138 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.138 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.138 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.138 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.138 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.138 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.139 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.139 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.139 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.139 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.139 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.139 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.140 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.140 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.140 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.140 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.140 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.140 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.140 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.141 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.141 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.141 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.141 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.141 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.141 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.141 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.142 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.142 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.142 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.142 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.142 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.142 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.142 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.142 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.143 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.143 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.143 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.143 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.143 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.143 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.143 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.144 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.144 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.144 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.144 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.144 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.144 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.144 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.144 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.145 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.145 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.145 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.145 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.145 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.145 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.145 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.146 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.146 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.146 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.146 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.146 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.146 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.146 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.146 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.147 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.147 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.147 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.147 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.147 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.147 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.148 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.148 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.148 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.148 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.148 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.148 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.149 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.149 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.149 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.149 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.149 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.149 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.150 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.150 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.150 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.150 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.150 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.150 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.150 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.150 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.151 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.151 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.151 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.151 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.151 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.151 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.151 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.152 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.152 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.152 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.152 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.152 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.152 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.152 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.153 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.153 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.153 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.153 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.153 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.153 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.153 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.154 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.154 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.154 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.154 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.154 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.154 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.154 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.155 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.155 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.155 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.155 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.155 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.155 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.156 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.156 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.156 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.156 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.156 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.156 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.156 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.156 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.157 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.157 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.157 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.157 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.157 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.157 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.157 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.158 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.158 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.158 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.158 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.158 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.158 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.158 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.159 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.159 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.159 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.159 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.159 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.159 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.159 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.160 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.160 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.160 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.160 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.160 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.160 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.160 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.161 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.161 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.161 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.161 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.161 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.161 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.161 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.162 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.162 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.162 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.162 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.162 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.162 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.162 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.163 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.163 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.163 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.163 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.163 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.163 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.163 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.164 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.164 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.164 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.164 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.164 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.164 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.165 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.165 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.165 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.165 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.165 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.165 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.165 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.166 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.166 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.166 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.166 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.166 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.166 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.166 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.167 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.167 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.167 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.167 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.167 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.167 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.167 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.168 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.168 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.168 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.168 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.168 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.168 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.168 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.169 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.169 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.169 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.169 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.169 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.169 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.169 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.170 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.170 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.170 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.170 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.170 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.170 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.170 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.171 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.171 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.171 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.171 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.171 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.171 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.171 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.172 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.172 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.172 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.172 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.172 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.172 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.172 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.173 2 DEBUG oslo_service.service [None req-4dfaca92-806d-4907-9cf0-7bb2a16a695e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.173 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.186 2 INFO nova.virt.node [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Determined node identity 5abd2871-a992-42ab-bb6a-594a92f77d4d from /var/lib/nova/compute_id#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.186 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.187 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.187 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.187 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.204 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f67b2fa38b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.206 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f67b2fa38b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.207 2 INFO nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Connection event '1' reason 'None'#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.219 2 DEBUG nova.virt.libvirt.volume.mount [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.220 2 INFO nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Libvirt host capabilities <capabilities>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <host>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <uuid>93278213-1c3c-4fb4-9fd1-d481e0b53ce1</uuid>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <cpu>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <arch>x86_64</arch>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model>EPYC-Rome-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <vendor>AMD</vendor>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <microcode version='16777317'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <signature family='23' model='49' stepping='0'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <maxphysaddr mode='emulate' bits='40'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='x2apic'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='tsc-deadline'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='osxsave'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='hypervisor'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='tsc_adjust'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='spec-ctrl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='stibp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='arch-capabilities'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='ssbd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='cmp_legacy'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='topoext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='virt-ssbd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='lbrv'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='tsc-scale'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='vmcb-clean'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='pause-filter'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='pfthreshold'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='svme-addr-chk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='rdctl-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='skip-l1dfl-vmentry'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='mds-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature name='pschange-mc-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <pages unit='KiB' size='4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <pages unit='KiB' size='2048'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <pages unit='KiB' size='1048576'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </cpu>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <power_management>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <suspend_mem/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </power_management>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <iommu support='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <migration_features>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <live/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <uri_transports>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <uri_transport>tcp</uri_transport>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <uri_transport>rdma</uri_transport>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </uri_transports>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </migration_features>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <topology>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <cells num='1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <cell id='0'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:          <memory unit='KiB'>7864104</memory>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:          <pages unit='KiB' size='4'>1966026</pages>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:          <pages unit='KiB' size='2048'>0</pages>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:          <pages unit='KiB' size='1048576'>0</pages>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:          <distances>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:            <sibling id='0' value='10'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:          </distances>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:          <cpus num='8'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:          </cpus>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        </cell>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </cells>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </topology>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <cache>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </cache>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <secmodel>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model>selinux</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <doi>0</doi>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </secmodel>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <secmodel>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model>dac</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <doi>0</doi>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </secmodel>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </host>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <guest>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <os_type>hvm</os_type>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <arch name='i686'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <wordsize>32</wordsize>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <domain type='qemu'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <domain type='kvm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </arch>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <features>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <pae/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <nonpae/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <acpi default='on' toggle='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <apic default='on' toggle='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <cpuselection/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <deviceboot/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <disksnapshot default='on' toggle='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <externalSnapshot/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </features>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </guest>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <guest>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <os_type>hvm</os_type>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <arch name='x86_64'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <wordsize>64</wordsize>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <domain type='qemu'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <domain type='kvm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </arch>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <features>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <acpi default='on' toggle='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <apic default='on' toggle='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <cpuselection/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <deviceboot/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <disksnapshot default='on' toggle='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <externalSnapshot/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </features>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </guest>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 
Oct  2 07:53:24 np0005465988 nova_compute[236126]: </capabilities>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: #033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.226 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.230 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct  2 07:53:24 np0005465988 nova_compute[236126]: <domainCapabilities>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <domain>kvm</domain>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <arch>i686</arch>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <vcpu max='4096'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <iothreads supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <os supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <enum name='firmware'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <loader supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>rom</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>pflash</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='readonly'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>yes</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>no</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='secure'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>no</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </loader>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </os>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <cpu>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>on</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>off</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='maximumMigratable'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>on</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>off</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <vendor>AMD</vendor>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='succor'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='custom' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cooperlake'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amd-psfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='auto-ibrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='stibp-always-on'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amd-psfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='auto-ibrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='stibp-always-on'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amd-psfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='stibp-always-on'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='GraniteRapids'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='prefetchiti'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='prefetchiti'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10-128'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10-256'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10-512'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='prefetchiti'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='KnightsMill'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512er'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512pf'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512er'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512pf'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tbm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tbm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SierraForest'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cmpccxadd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cmpccxadd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='athlon'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='athlon-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='core2duo'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='core2duo-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='coreduo'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='coreduo-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='n270'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='n270-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='phenom'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='phenom-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <memoryBacking supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <enum name='sourceType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>file</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>anonymous</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>memfd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </memoryBacking>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <devices>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <disk supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='diskDevice'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>disk</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>cdrom</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>floppy</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>lun</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='bus'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>fdc</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>scsi</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>usb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>sata</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-non-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </disk>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <graphics supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vnc</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>egl-headless</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>dbus</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </graphics>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <video supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='modelType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vga</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>cirrus</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>none</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>bochs</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>ramfb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </video>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <hostdev supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='mode'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>subsystem</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='startupPolicy'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>default</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>mandatory</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>requisite</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>optional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='subsysType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>usb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>pci</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>scsi</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='capsType'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='pciBackend'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </hostdev>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <rng supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-non-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendModel'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>random</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>egd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>builtin</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </rng>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <filesystem supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='driverType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>path</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>handle</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtiofs</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </filesystem>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <tpm supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>tpm-tis</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>tpm-crb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendModel'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>emulator</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>external</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendVersion'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>2.0</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </tpm>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <redirdev supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='bus'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>usb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </redirdev>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <channel supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>pty</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>unix</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </channel>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <crypto supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>qemu</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendModel'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>builtin</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </crypto>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <interface supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>default</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>passt</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </interface>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <panic supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>isa</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>hyperv</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </panic>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </devices>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <features>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <gic supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <genid supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <backup supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <async-teardown supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <ps2 supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <sev supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <sgx supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <hyperv supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='features'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>relaxed</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vapic</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>spinlocks</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vpindex</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>runtime</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>synic</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>stimer</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>reset</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vendor_id</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>frequencies</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>reenlightenment</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>tlbflush</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>ipi</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>avic</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>emsr_bitmap</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>xmm_input</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </hyperv>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <launchSecurity supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </features>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: </domainCapabilities>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.240 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct  2 07:53:24 np0005465988 nova_compute[236126]: <domainCapabilities>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <domain>kvm</domain>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <arch>i686</arch>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <vcpu max='240'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <iothreads supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <os supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <enum name='firmware'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <loader supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>rom</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>pflash</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='readonly'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>yes</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>no</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='secure'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>no</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </loader>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </os>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <cpu>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>on</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>off</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='maximumMigratable'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>on</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>off</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <vendor>AMD</vendor>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='succor'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='custom' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cooperlake'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amd-psfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='auto-ibrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='stibp-always-on'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amd-psfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='auto-ibrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='stibp-always-on'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amd-psfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='stibp-always-on'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='GraniteRapids'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='prefetchiti'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='prefetchiti'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10-128'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10-256'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10-512'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='prefetchiti'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='KnightsMill'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512er'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512pf'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512er'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512pf'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tbm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tbm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SierraForest'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cmpccxadd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cmpccxadd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='athlon'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='athlon-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='core2duo'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='core2duo-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='coreduo'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='coreduo-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='n270'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='n270-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='phenom'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='phenom-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <memoryBacking supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <enum name='sourceType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>file</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>anonymous</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>memfd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </memoryBacking>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <devices>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <disk supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='diskDevice'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>disk</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>cdrom</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>floppy</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>lun</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='bus'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>ide</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>fdc</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>scsi</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>usb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>sata</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-non-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </disk>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <graphics supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vnc</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>egl-headless</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>dbus</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </graphics>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <video supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='modelType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vga</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>cirrus</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>none</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>bochs</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>ramfb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </video>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <hostdev supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='mode'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>subsystem</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='startupPolicy'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>default</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>mandatory</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>requisite</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>optional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='subsysType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>usb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>pci</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>scsi</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='capsType'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='pciBackend'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </hostdev>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <rng supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-non-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendModel'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>random</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>egd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>builtin</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </rng>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <filesystem supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='driverType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>path</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>handle</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtiofs</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </filesystem>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <tpm supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>tpm-tis</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>tpm-crb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendModel'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>emulator</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>external</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendVersion'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>2.0</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </tpm>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <redirdev supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='bus'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>usb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </redirdev>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <channel supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>pty</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>unix</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </channel>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <crypto supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>qemu</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendModel'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>builtin</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </crypto>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <interface supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>default</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>passt</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </interface>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <panic supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>isa</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>hyperv</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </panic>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </devices>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <features>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <gic supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <genid supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <backup supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <async-teardown supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <ps2 supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <sev supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <sgx supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <hyperv supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='features'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>relaxed</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vapic</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>spinlocks</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vpindex</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>runtime</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>synic</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>stimer</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>reset</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vendor_id</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>frequencies</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>reenlightenment</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>tlbflush</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>ipi</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>avic</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>emsr_bitmap</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>xmm_input</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </hyperv>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <launchSecurity supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </features>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: </domainCapabilities>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.263 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.267 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct  2 07:53:24 np0005465988 nova_compute[236126]: <domainCapabilities>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <domain>kvm</domain>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <arch>x86_64</arch>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <vcpu max='4096'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <iothreads supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <os supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <enum name='firmware'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>efi</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <loader supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>rom</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>pflash</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='readonly'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>yes</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>no</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='secure'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>yes</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>no</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </loader>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </os>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <cpu>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>on</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>off</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='maximumMigratable'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>on</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>off</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <vendor>AMD</vendor>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='succor'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='custom' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cooperlake'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amd-psfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='auto-ibrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='stibp-always-on'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amd-psfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='auto-ibrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='stibp-always-on'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amd-psfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='stibp-always-on'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='GraniteRapids'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='prefetchiti'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='prefetchiti'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10-128'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10-256'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10-512'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='prefetchiti'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='KnightsMill'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512er'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512pf'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512er'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512pf'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tbm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tbm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SierraForest'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cmpccxadd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cmpccxadd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='athlon'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='athlon-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='core2duo'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='core2duo-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='coreduo'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='coreduo-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='n270'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='n270-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='phenom'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='phenom-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <memoryBacking supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <enum name='sourceType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>file</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>anonymous</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>memfd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </memoryBacking>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <devices>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <disk supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='diskDevice'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>disk</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>cdrom</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>floppy</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>lun</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='bus'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>fdc</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>scsi</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>usb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>sata</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-non-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </disk>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <graphics supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vnc</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>egl-headless</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>dbus</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </graphics>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <video supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='modelType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vga</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>cirrus</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>none</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>bochs</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>ramfb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </video>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <hostdev supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='mode'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>subsystem</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='startupPolicy'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>default</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>mandatory</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>requisite</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>optional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='subsysType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>usb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>pci</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>scsi</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='capsType'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='pciBackend'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </hostdev>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <rng supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-non-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendModel'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>random</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>egd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>builtin</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </rng>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <filesystem supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='driverType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>path</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>handle</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtiofs</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </filesystem>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <tpm supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>tpm-tis</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>tpm-crb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendModel'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>emulator</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>external</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendVersion'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>2.0</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </tpm>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <redirdev supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='bus'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>usb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </redirdev>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <channel supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>pty</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>unix</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </channel>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <crypto supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>qemu</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendModel'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>builtin</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </crypto>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <interface supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>default</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>passt</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </interface>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <panic supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>isa</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>hyperv</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </panic>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </devices>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <features>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <gic supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <genid supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <backup supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <async-teardown supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <ps2 supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <sev supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <sgx supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <hyperv supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='features'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>relaxed</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vapic</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>spinlocks</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vpindex</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>runtime</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>synic</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>stimer</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>reset</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vendor_id</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>frequencies</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>reenlightenment</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>tlbflush</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>ipi</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>avic</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>emsr_bitmap</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>xmm_input</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </hyperv>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <launchSecurity supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </features>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: </domainCapabilities>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.335 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct  2 07:53:24 np0005465988 nova_compute[236126]: <domainCapabilities>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <domain>kvm</domain>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <arch>x86_64</arch>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <vcpu max='240'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <iothreads supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <os supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <enum name='firmware'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <loader supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>rom</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>pflash</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='readonly'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>yes</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>no</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='secure'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>no</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </loader>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </os>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <cpu>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>on</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>off</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='maximumMigratable'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>on</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>off</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <vendor>AMD</vendor>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='succor'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <mode name='custom' supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cooperlake'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Denverton-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amd-psfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='auto-ibrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='stibp-always-on'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amd-psfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='auto-ibrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='stibp-always-on'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amd-psfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='stibp-always-on'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='EPYC-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='GraniteRapids'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='prefetchiti'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='prefetchiti'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10-128'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10-256'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx10-512'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='prefetchiti'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Haswell-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='KnightsMill'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512er'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512pf'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512er'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512pf'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tbm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fma4'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tbm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xop'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='amx-tile'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-bf16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-fp16'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bitalg'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrc'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fzrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='la57'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='taa-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xfd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SierraForest'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cmpccxadd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ifma'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cmpccxadd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fbsdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='fsrs'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ibrs-all'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mcdt-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pbrsb-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='psdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='serialize'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vaes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='hle'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='rtm'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512bw'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512cd'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512dq'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512f'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='avx512vl'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='invpcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pcid'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='pku'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='mpx'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='core-capability'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='split-lock-detect'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='cldemote'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='erms'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='gfni'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdir64b'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='movdiri'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='xsaves'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='athlon'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='athlon-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='core2duo'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='core2duo-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='coreduo'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='coreduo-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='n270'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='n270-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='ss'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='phenom'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <blockers model='phenom-v1'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnow'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <feature name='3dnowext'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </blockers>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </mode>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <memoryBacking supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <enum name='sourceType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>file</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>anonymous</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <value>memfd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </memoryBacking>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <devices>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <disk supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='diskDevice'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>disk</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>cdrom</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>floppy</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>lun</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='bus'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>ide</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>fdc</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>scsi</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>usb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>sata</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-non-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </disk>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <graphics supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vnc</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>egl-headless</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>dbus</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </graphics>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <video supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='modelType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vga</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>cirrus</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>none</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>bochs</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>ramfb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </video>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <hostdev supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='mode'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>subsystem</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='startupPolicy'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>default</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>mandatory</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>requisite</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>optional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='subsysType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>usb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>pci</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>scsi</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='capsType'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='pciBackend'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </hostdev>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <rng supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtio-non-transitional</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendModel'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>random</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>egd</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>builtin</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </rng>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <filesystem supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='driverType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>path</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>handle</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>virtiofs</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </filesystem>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <tpm supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>tpm-tis</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>tpm-crb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendModel'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>emulator</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>external</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendVersion'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>2.0</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </tpm>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <redirdev supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='bus'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>usb</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </redirdev>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <channel supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>pty</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>unix</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </channel>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <crypto supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='type'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>qemu</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendModel'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>builtin</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </crypto>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <interface supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='backendType'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>default</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>passt</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </interface>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <panic supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='model'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>isa</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>hyperv</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </panic>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </devices>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <features>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <gic supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <genid supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <backup supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <async-teardown supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <ps2 supported='yes'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <sev supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <sgx supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <hyperv supported='yes'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      <enum name='features'>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>relaxed</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vapic</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>spinlocks</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vpindex</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>runtime</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>synic</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>stimer</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>reset</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>vendor_id</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>frequencies</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>reenlightenment</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>tlbflush</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>ipi</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>avic</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>emsr_bitmap</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:        <value>xmm_input</value>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:      </enum>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    </hyperv>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:    <launchSecurity supported='no'/>
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  </features>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: </domainCapabilities>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.395 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.396 2 INFO nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Secure Boot support detected#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.399 2 INFO nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.414 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] cpu compare xml: <cpu match="exact">
Oct  2 07:53:24 np0005465988 nova_compute[236126]:  <model>Nehalem</model>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: </cpu>
Oct  2 07:53:24 np0005465988 nova_compute[236126]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.417 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.466 2 INFO nova.virt.node [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Determined node identity 5abd2871-a992-42ab-bb6a-594a92f77d4d from /var/lib/nova/compute_id#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.487 2 WARNING nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Compute nodes ['5abd2871-a992-42ab-bb6a-594a92f77d4d'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.511 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.550 2 WARNING nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.550 2 DEBUG oslo_concurrency.lockutils [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.551 2 DEBUG oslo_concurrency.lockutils [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.551 2 DEBUG oslo_concurrency.lockutils [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.551 2 DEBUG nova.compute.resource_tracker [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.552 2 DEBUG oslo_concurrency.processutils [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:53:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:24.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:53:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2836184901' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:53:24 np0005465988 nova_compute[236126]: 2025-10-02 11:53:24.980 2 DEBUG oslo_concurrency.processutils [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:53:25 np0005465988 systemd[1]: Starting libvirt nodedev daemon...
Oct  2 07:53:25 np0005465988 systemd[1]: Started libvirt nodedev daemon.
Oct  2 07:53:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:53:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:25.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:53:25 np0005465988 nova_compute[236126]: 2025-10-02 11:53:25.409 2 WARNING nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:53:25 np0005465988 nova_compute[236126]: 2025-10-02 11:53:25.411 2 DEBUG nova.compute.resource_tracker [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5240MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:53:25 np0005465988 nova_compute[236126]: 2025-10-02 11:53:25.411 2 DEBUG oslo_concurrency.lockutils [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:53:25 np0005465988 nova_compute[236126]: 2025-10-02 11:53:25.411 2 DEBUG oslo_concurrency.lockutils [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:53:25 np0005465988 nova_compute[236126]: 2025-10-02 11:53:25.439 2 WARNING nova.compute.resource_tracker [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] No compute node record for compute-2.ctlplane.example.com:5abd2871-a992-42ab-bb6a-594a92f77d4d: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 5abd2871-a992-42ab-bb6a-594a92f77d4d could not be found.#033[00m
Oct  2 07:53:25 np0005465988 nova_compute[236126]: 2025-10-02 11:53:25.465 2 INFO nova.compute.resource_tracker [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: 5abd2871-a992-42ab-bb6a-594a92f77d4d#033[00m
Oct  2 07:53:25 np0005465988 nova_compute[236126]: 2025-10-02 11:53:25.519 2 DEBUG nova.compute.resource_tracker [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:53:25 np0005465988 nova_compute[236126]: 2025-10-02 11:53:25.520 2 DEBUG nova.compute.resource_tracker [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.015 2 INFO nova.scheduler.client.report [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [req-feaf3d3d-28f0-42fe-9254-8d8e078adb22] Created resource provider record via placement API for resource provider with UUID 5abd2871-a992-42ab-bb6a-594a92f77d4d and name compute-2.ctlplane.example.com.#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.038 2 DEBUG oslo_concurrency.processutils [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:53:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:53:26 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2103042759' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.478 2 DEBUG oslo_concurrency.processutils [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.485 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct  2 07:53:26 np0005465988 nova_compute[236126]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.485 2 INFO nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] kernel doesn't support AMD SEV#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.487 2 DEBUG nova.compute.provider_tree [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.488 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.492 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Libvirt baseline CPU <cpu>
Oct  2 07:53:26 np0005465988 nova_compute[236126]:  <arch>x86_64</arch>
Oct  2 07:53:26 np0005465988 nova_compute[236126]:  <model>Nehalem</model>
Oct  2 07:53:26 np0005465988 nova_compute[236126]:  <vendor>AMD</vendor>
Oct  2 07:53:26 np0005465988 nova_compute[236126]:  <topology sockets="8" cores="1" threads="1"/>
Oct  2 07:53:26 np0005465988 nova_compute[236126]: </cpu>
Oct  2 07:53:26 np0005465988 nova_compute[236126]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.616 2 DEBUG nova.scheduler.client.report [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Updated inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.616 2 DEBUG nova.compute.provider_tree [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Updating resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.617 2 DEBUG nova.compute.provider_tree [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 07:53:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:53:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:26.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.719 2 DEBUG nova.compute.provider_tree [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Updating resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.757 2 DEBUG nova.compute.resource_tracker [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.758 2 DEBUG oslo_concurrency.lockutils [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.347s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.758 2 DEBUG nova.service [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.852 2 DEBUG nova.service [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Oct  2 07:53:26 np0005465988 nova_compute[236126]: 2025-10-02 11:53:26.852 2 DEBUG nova.servicegroup.drivers.db [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Oct  2 07:53:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:53:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:27.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:53:27.321 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:53:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:53:27.321 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:53:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:53:27.322 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:53:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:28.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 07:53:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 3490 writes, 18K keys, 3490 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.03 MB/s#012Cumulative WAL: 3490 writes, 3490 syncs, 1.00 writes per sync, written: 0.04 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1374 writes, 6843 keys, 1374 commit groups, 1.0 writes per commit group, ingest: 14.77 MB, 0.02 MB/s#012Interval WAL: 1374 writes, 1374 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     65.4      0.33              0.06         9    0.037       0      0       0.0       0.0#012  L6      1/0    7.65 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.1    101.7     84.0      0.79              0.22         8    0.098     35K   4320       0.0       0.0#012 Sum      1/0    7.65 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.1     71.7     78.5      1.12              0.28        17    0.066     35K   4320       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   5.2     83.5     84.3      0.62              0.15        10    0.062     23K   3040       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    101.7     84.0      0.79              0.22         8    0.098     35K   4320       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     65.7      0.33              0.06         8    0.041       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.021, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.09 GB write, 0.07 MB/s write, 0.08 GB read, 0.07 MB/s read, 1.1 seconds#012Interval compaction: 0.05 GB write, 0.09 MB/s write, 0.05 GB read, 0.09 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b3e97ad1f0#2 capacity: 308.00 MB usage: 4.69 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 5.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(259,4.38 MB,1.42267%) FilterBlock(17,103.86 KB,0.0329303%) IndexBlock(17,213.12 KB,0.0675746%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct  2 07:53:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:29.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:53:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:30.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:53:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:31.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:53:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:32.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:33.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:34.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:35.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:35 np0005465988 podman[236552]: 2025-10-02 11:53:35.545233965 +0000 UTC m=+0.072448025 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 07:53:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:36.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:53:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:37.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:38.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:39.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:40.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:41.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:53:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:42.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:53:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:43.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:53:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:53:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:44.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:53:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:45.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:46.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:53:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:53:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:47.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:53:47 np0005465988 podman[236628]: 2025-10-02 11:53:47.55628992 +0000 UTC m=+0.074165784 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251001, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 07:53:47 np0005465988 podman[236627]: 2025-10-02 11:53:47.619465397 +0000 UTC m=+0.140687758 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:53:47 np0005465988 podman[236666]: 2025-10-02 11:53:47.635435036 +0000 UTC m=+0.051690838 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 07:53:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:48.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:49.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:50.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:51.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:53:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:53:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:52.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:53:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:53.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:54.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:53:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:55.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:53:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:56.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:53:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:53:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:57.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:53:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:53:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:53:58.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:53:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:53:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:53:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:53:59.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:54:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:00.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:01.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:54:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:02.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:03.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:04.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:05.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:06 np0005465988 podman[236724]: 2025-10-02 11:54:06.037421587 +0000 UTC m=+0.091529473 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 07:54:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 07:54:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1201.0 total, 600.0 interval
Cumulative writes: 5513 writes, 23K keys, 5513 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 5513 writes, 905 syncs, 6.09 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 407 writes, 686 keys, 407 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
Interval WAL: 407 writes, 174 syncs, 2.34 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1201.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x56127d005770#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1201.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x56127d005770#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1201.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Oct  2 07:54:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:06.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:54:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:07.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:08.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:09.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:10.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:10 np0005465988 nova_compute[236126]: 2025-10-02 11:54:10.854 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:11.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:54:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:54:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct  2 07:54:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct  2 07:54:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:54:12 np0005465988 nova_compute[236126]: 2025-10-02 11:54:12.433 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:12.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:54:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:54:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:54:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:13.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:14.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:15.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:16.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:54:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:17.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:18 np0005465988 podman[237028]: 2025-10-02 11:54:18.536296882 +0000 UTC m=+0.062836078 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:54:18 np0005465988 podman[237027]: 2025-10-02 11:54:18.558315865 +0000 UTC m=+0.088745333 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3)
Oct  2 07:54:18 np0005465988 podman[237026]: 2025-10-02 11:54:18.564431181 +0000 UTC m=+0.093957103 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:54:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:18.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:19.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:54:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:54:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:54:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:20.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:54:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:21.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:54:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:22.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:23.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.476 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.477 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.478 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.478 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.527 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.528 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.528 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.529 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.529 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.529 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.529 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.529 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.530 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.595 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.595 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.596 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.596 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:54:23 np0005465988 nova_compute[236126]: 2025-10-02 11:54:23.597 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:54:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:54:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4026187952' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:54:24 np0005465988 nova_compute[236126]: 2025-10-02 11:54:24.027 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:54:24 np0005465988 nova_compute[236126]: 2025-10-02 11:54:24.243 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:54:24 np0005465988 nova_compute[236126]: 2025-10-02 11:54:24.245 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5305MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:54:24 np0005465988 nova_compute[236126]: 2025-10-02 11:54:24.245 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:54:24 np0005465988 nova_compute[236126]: 2025-10-02 11:54:24.245 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:54:24 np0005465988 nova_compute[236126]: 2025-10-02 11:54:24.468 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:54:24 np0005465988 nova_compute[236126]: 2025-10-02 11:54:24.469 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:54:24 np0005465988 nova_compute[236126]: 2025-10-02 11:54:24.592 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:54:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:24.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:54:25 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/885902181' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:54:25 np0005465988 nova_compute[236126]: 2025-10-02 11:54:25.055 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:54:25 np0005465988 nova_compute[236126]: 2025-10-02 11:54:25.064 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:54:25 np0005465988 nova_compute[236126]: 2025-10-02 11:54:25.106 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:54:25 np0005465988 nova_compute[236126]: 2025-10-02 11:54:25.108 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:54:25 np0005465988 nova_compute[236126]: 2025-10-02 11:54:25.109 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.864s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:54:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:25.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:26.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:54:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:27.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:54:27.322 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:54:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:54:27.322 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:54:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:54:27.323 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:54:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:28.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:29.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:30.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:31.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:54:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:32.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:33.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:34.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:35.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:36 np0005465988 podman[237244]: 2025-10-02 11:54:36.537828975 +0000 UTC m=+0.069852620 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 07:54:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:36.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:54:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:37.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:37 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Oct  2 07:54:37 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Oct  2 07:54:37 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Oct  2 07:54:37 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Oct  2 07:54:37 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Oct  2 07:54:38 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Oct  2 07:54:38 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Oct  2 07:54:38 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Oct  2 07:54:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:38.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:39.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:54:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:40.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:54:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:41.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:54:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:54:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:42.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:54:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:43.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:44.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:45.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:46.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:54:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:47.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:48.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:49.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:49 np0005465988 podman[237322]: 2025-10-02 11:54:49.53218652 +0000 UTC m=+0.067597555 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 07:54:49 np0005465988 podman[237320]: 2025-10-02 11:54:49.550104275 +0000 UTC m=+0.092535332 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 07:54:49 np0005465988 podman[237321]: 2025-10-02 11:54:49.553695828 +0000 UTC m=+0.088031302 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  2 07:54:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:50.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:51.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:54:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:52.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:53.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:54.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:55.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:54:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 07:54:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3545203692' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 07:54:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 07:54:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3545203692' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 07:54:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:54:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:56.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:54:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:54:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:57.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:54:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:54:58.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:54:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:54:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:54:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:54:59.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:00.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:01.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:55:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:55:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:02.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:55:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:03.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:04.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:55:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:05.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:55:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:06.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:55:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:55:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:07.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:07 np0005465988 podman[237443]: 2025-10-02 11:55:07.515751408 +0000 UTC m=+0.058920295 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:55:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:08.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:09.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:10.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:55:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:11.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:55:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:12.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:13.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:14.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:15.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:16.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:55:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:55:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:17.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:18.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:55:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:19.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:20 np0005465988 podman[237602]: 2025-10-02 11:55:20.543512058 +0000 UTC m=+0.072064817 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0)
Oct  2 07:55:20 np0005465988 podman[237601]: 2025-10-02 11:55:20.567071647 +0000 UTC m=+0.091657372 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 07:55:20 np0005465988 podman[237600]: 2025-10-02 11:55:20.592029556 +0000 UTC m=+0.118263528 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:55:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:20.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:55:20 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct  2 07:55:20 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:55:20 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:55:20 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:55:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:21.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:55:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:22.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:23.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:24.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.100 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.100 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.123 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.123 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.123 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.143 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.144 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.144 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.144 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.144 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.145 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.145 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.145 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.177 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.177 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.178 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.178 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.178 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:55:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:25.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:55:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:55:25 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/206332093' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.594 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.804 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.805 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5306MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.806 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.806 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.960 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.960 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:55:25 np0005465988 nova_compute[236126]: 2025-10-02 11:55:25.981 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:55:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:55:26 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1013949214' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:55:26 np0005465988 nova_compute[236126]: 2025-10-02 11:55:26.449 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:55:26 np0005465988 nova_compute[236126]: 2025-10-02 11:55:26.453 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:55:26 np0005465988 nova_compute[236126]: 2025-10-02 11:55:26.475 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:55:26 np0005465988 nova_compute[236126]: 2025-10-02 11:55:26.477 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:55:26 np0005465988 nova_compute[236126]: 2025-10-02 11:55:26.477 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:55:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:26.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:26 np0005465988 nova_compute[236126]: 2025-10-02 11:55:26.807 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:26 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:55:26 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:55:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:55:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:55:27.322 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:55:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:55:27.323 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:55:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:55:27.323 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:55:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:27.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:28.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:29.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:30.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:55:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:31.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:55:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:55:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:32.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:33.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.440912) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406133440960, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 2316, "num_deletes": 251, "total_data_size": 5925768, "memory_usage": 5997520, "flush_reason": "Manual Compaction"}
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406133526628, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 3870551, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17822, "largest_seqno": 20133, "table_properties": {"data_size": 3861083, "index_size": 6026, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 18835, "raw_average_key_size": 20, "raw_value_size": 3842313, "raw_average_value_size": 4096, "num_data_blocks": 269, "num_entries": 938, "num_filter_entries": 938, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759405910, "oldest_key_time": 1759405910, "file_creation_time": 1759406133, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 85780 microseconds, and 11894 cpu microseconds.
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.526689) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 3870551 bytes OK
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.526714) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.586152) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.586206) EVENT_LOG_v1 {"time_micros": 1759406133586193, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.586234) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 5915605, prev total WAL file size 5915605, number of live WAL files 2.
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.588821) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(3779KB)], [36(7830KB)]
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406133588974, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 11888490, "oldest_snapshot_seqno": -1}
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4430 keys, 9816538 bytes, temperature: kUnknown
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406133688595, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 9816538, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9784120, "index_size": 20212, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11141, "raw_key_size": 110584, "raw_average_key_size": 24, "raw_value_size": 9701053, "raw_average_value_size": 2189, "num_data_blocks": 841, "num_entries": 4430, "num_filter_entries": 4430, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759406133, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.688803) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 9816538 bytes
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.689904) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 119.3 rd, 98.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 7.6 +0.0 blob) out(9.4 +0.0 blob), read-write-amplify(5.6) write-amplify(2.5) OK, records in: 4949, records dropped: 519 output_compression: NoCompression
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.689923) EVENT_LOG_v1 {"time_micros": 1759406133689914, "job": 20, "event": "compaction_finished", "compaction_time_micros": 99671, "compaction_time_cpu_micros": 46097, "output_level": 6, "num_output_files": 1, "total_output_size": 9816538, "num_input_records": 4949, "num_output_records": 4430, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406133690695, "job": 20, "event": "table_file_deletion", "file_number": 38}
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406133691855, "job": 20, "event": "table_file_deletion", "file_number": 36}
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.588579) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.691980) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.691988) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.691991) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.691993) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:55:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:55:33.691995) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:55:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:34.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:35.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:36.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:55:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:37.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:55:38 np0005465988 podman[237813]: 2025-10-02 11:55:38.555509093 +0000 UTC m=+0.084503796 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Oct  2 07:55:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:38.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:39.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:40.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:41.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:55:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:42.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:43.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:44.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:45.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:55:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:46.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:55:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:47.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:48.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:55:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:49.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:50.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:51.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:51 np0005465988 podman[237890]: 2025-10-02 11:55:51.509996312 +0000 UTC m=+0.052805083 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:55:51 np0005465988 podman[237891]: 2025-10-02 11:55:51.512812313 +0000 UTC m=+0.051125044 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2)
Oct  2 07:55:51 np0005465988 podman[237889]: 2025-10-02 11:55:51.549269023 +0000 UTC m=+0.091239480 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 07:55:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:55:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:52.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:53.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:54.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:55:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:55.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:55:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:56.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:55:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:57.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:55:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:55:58.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:55:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:55:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:55:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:55:59.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:00.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:01.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:56:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:02.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:03.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:04.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:05.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:06.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:56:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:07.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:08.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:56:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:09.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:56:09 np0005465988 podman[238011]: 2025-10-02 11:56:09.539547385 +0000 UTC m=+0.070663587 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible)
Oct  2 07:56:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:10.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:11.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:56:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:12.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:13.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:14.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:15.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:16.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:56:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:17.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:18.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:19.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:20.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:21.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:56:21.441 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 07:56:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:56:21.444 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 07:56:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:56:21.445 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:56:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:56:22 np0005465988 podman[238039]: 2025-10-02 11:56:22.522594199 +0000 UTC m=+0.059035392 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 07:56:22 np0005465988 podman[238037]: 2025-10-02 11:56:22.543851652 +0000 UTC m=+0.085512715 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:56:22 np0005465988 podman[238038]: 2025-10-02 11:56:22.560240694 +0000 UTC m=+0.086388740 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 07:56:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:22.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:23.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:23 np0005465988 nova_compute[236126]: 2025-10-02 11:56:23.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:24 np0005465988 nova_compute[236126]: 2025-10-02 11:56:24.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:24 np0005465988 nova_compute[236126]: 2025-10-02 11:56:24.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:24 np0005465988 nova_compute[236126]: 2025-10-02 11:56:24.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:56:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:24.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:25.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:25 np0005465988 nova_compute[236126]: 2025-10-02 11:56:25.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:25 np0005465988 nova_compute[236126]: 2025-10-02 11:56:25.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:25 np0005465988 nova_compute[236126]: 2025-10-02 11:56:25.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:25 np0005465988 nova_compute[236126]: 2025-10-02 11:56:25.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:26 np0005465988 nova_compute[236126]: 2025-10-02 11:56:26.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:26 np0005465988 nova_compute[236126]: 2025-10-02 11:56:26.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:56:26 np0005465988 nova_compute[236126]: 2025-10-02 11:56:26.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:56:26 np0005465988 nova_compute[236126]: 2025-10-02 11:56:26.499 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:56:26 np0005465988 nova_compute[236126]: 2025-10-02 11:56:26.500 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:26 np0005465988 nova_compute[236126]: 2025-10-02 11:56:26.543 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:56:26 np0005465988 nova_compute[236126]: 2025-10-02 11:56:26.543 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:56:26 np0005465988 nova_compute[236126]: 2025-10-02 11:56:26.543 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:56:26 np0005465988 nova_compute[236126]: 2025-10-02 11:56:26.544 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:56:26 np0005465988 nova_compute[236126]: 2025-10-02 11:56:26.544 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:56:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:26.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:56:26 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1602707873' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:56:27 np0005465988 nova_compute[236126]: 2025-10-02 11:56:27.017 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:56:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:56:27 np0005465988 nova_compute[236126]: 2025-10-02 11:56:27.164 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:56:27 np0005465988 nova_compute[236126]: 2025-10-02 11:56:27.167 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5312MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:56:27 np0005465988 nova_compute[236126]: 2025-10-02 11:56:27.167 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:56:27 np0005465988 nova_compute[236126]: 2025-10-02 11:56:27.167 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:56:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:56:27.323 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:56:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:56:27.324 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:56:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:56:27.324 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:56:27 np0005465988 nova_compute[236126]: 2025-10-02 11:56:27.400 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:56:27 np0005465988 nova_compute[236126]: 2025-10-02 11:56:27.401 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:56:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:27.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:27 np0005465988 nova_compute[236126]: 2025-10-02 11:56:27.430 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:56:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:56:27 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3674817225' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:56:27 np0005465988 nova_compute[236126]: 2025-10-02 11:56:27.898 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:56:27 np0005465988 nova_compute[236126]: 2025-10-02 11:56:27.906 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:56:27 np0005465988 nova_compute[236126]: 2025-10-02 11:56:27.936 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:56:27 np0005465988 nova_compute[236126]: 2025-10-02 11:56:27.938 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:56:27 np0005465988 nova_compute[236126]: 2025-10-02 11:56:27.939 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:56:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:56:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:56:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:56:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:28.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:29.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:30.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:31.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:56:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 07:56:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:32.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 07:56:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:33.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:34 np0005465988 systemd[1]: packagekit.service: Deactivated successfully.
Oct  2 07:56:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:34.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:35 np0005465988 ceph-mds[84851]: mds.beacon.cephfs.compute-2.gpiyct missed beacon ack from the monitors
Oct  2 07:56:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:35.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:36.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:56:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:37.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).paxos(paxos updating c 1507..2065) lease_timeout -- calling new election
Oct  2 07:56:38 np0005465988 ceph-mon[76355]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Oct  2 07:56:38 np0005465988 ceph-mon[76355]: paxos.1).electionLogic(14) init, last seen epoch 14
Oct  2 07:56:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct  2 07:56:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:38.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:39 np0005465988 ceph-mds[84851]: mds.beacon.cephfs.compute-2.gpiyct missed beacon ack from the monitors
Oct  2 07:56:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:39.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:40 np0005465988 podman[238338]: 2025-10-02 11:56:40.536885535 +0000 UTC m=+0.070922525 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 07:56:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:40.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:41.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct  2 07:56:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:42.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct  2 07:56:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:56:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:43.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:56:44 np0005465988 ceph-mon[76355]: mon.compute-2 calling monitor election
Oct  2 07:56:44 np0005465988 ceph-mon[76355]: mon.compute-1 calling monitor election
Oct  2 07:56:44 np0005465988 ceph-mon[76355]: mon.compute-0 calling monitor election
Oct  2 07:56:44 np0005465988 ceph-mon[76355]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Oct  2 07:56:44 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 07:56:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:44.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:45.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:46.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:56:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:47.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:48.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:49.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:50.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:51.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:56:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:52.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:53.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:53 np0005465988 podman[238417]: 2025-10-02 11:56:53.534993225 +0000 UTC m=+0.062082830 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:56:53 np0005465988 podman[238416]: 2025-10-02 11:56:53.544453628 +0000 UTC m=+0.076261039 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 07:56:53 np0005465988 podman[238415]: 2025-10-02 11:56:53.599613737 +0000 UTC m=+0.130385528 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:56:54 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:56:54 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:56:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:54.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:56:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:55.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:56:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:56.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:56:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:57.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:56:58.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:56:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:56:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:56:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:56:59.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 07:57:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:00.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 07:57:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:01.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:57:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:02.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:03.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:04.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:05.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:06.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:57:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:07.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:08.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:09.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:10.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:11.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:11 np0005465988 podman[238585]: 2025-10-02 11:57:11.535387237 +0000 UTC m=+0.069446402 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  2 07:57:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:57:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:12.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:13.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:14.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:15.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:16.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:57:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:17.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:18.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:19.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:20.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:21.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:57:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:22.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:23.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:24 np0005465988 podman[238609]: 2025-10-02 11:57:24.546347643 +0000 UTC m=+0.075112805 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 07:57:24 np0005465988 podman[238610]: 2025-10-02 11:57:24.575660428 +0000 UTC m=+0.093781744 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 07:57:24 np0005465988 podman[238608]: 2025-10-02 11:57:24.576200403 +0000 UTC m=+0.109138686 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:57:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:24.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:25.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:25 np0005465988 nova_compute[236126]: 2025-10-02 11:57:25.913 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:25 np0005465988 nova_compute[236126]: 2025-10-02 11:57:25.913 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:25 np0005465988 nova_compute[236126]: 2025-10-02 11:57:25.914 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:25 np0005465988 nova_compute[236126]: 2025-10-02 11:57:25.914 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:26 np0005465988 nova_compute[236126]: 2025-10-02 11:57:26.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:26 np0005465988 nova_compute[236126]: 2025-10-02 11:57:26.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:26 np0005465988 nova_compute[236126]: 2025-10-02 11:57:26.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:26 np0005465988 nova_compute[236126]: 2025-10-02 11:57:26.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:57:26 np0005465988 nova_compute[236126]: 2025-10-02 11:57:26.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:26 np0005465988 nova_compute[236126]: 2025-10-02 11:57:26.586 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:57:26 np0005465988 nova_compute[236126]: 2025-10-02 11:57:26.587 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:57:26 np0005465988 nova_compute[236126]: 2025-10-02 11:57:26.587 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:57:26 np0005465988 nova_compute[236126]: 2025-10-02 11:57:26.587 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:57:26 np0005465988 nova_compute[236126]: 2025-10-02 11:57:26.587 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:57:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:26.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:57:27 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1526375032' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:57:27 np0005465988 nova_compute[236126]: 2025-10-02 11:57:27.039 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:57:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:57:27 np0005465988 nova_compute[236126]: 2025-10-02 11:57:27.163 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:57:27 np0005465988 nova_compute[236126]: 2025-10-02 11:57:27.164 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5323MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:57:27 np0005465988 nova_compute[236126]: 2025-10-02 11:57:27.164 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:57:27 np0005465988 nova_compute[236126]: 2025-10-02 11:57:27.165 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:57:27 np0005465988 nova_compute[236126]: 2025-10-02 11:57:27.253 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:57:27 np0005465988 nova_compute[236126]: 2025-10-02 11:57:27.253 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:57:27 np0005465988 nova_compute[236126]: 2025-10-02 11:57:27.276 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:57:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:57:27.324 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:57:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:57:27.325 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:57:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:57:27.326 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:57:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:27.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:57:27 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2855471575' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:57:27 np0005465988 nova_compute[236126]: 2025-10-02 11:57:27.756 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:57:27 np0005465988 nova_compute[236126]: 2025-10-02 11:57:27.764 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:57:27 np0005465988 nova_compute[236126]: 2025-10-02 11:57:27.804 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:57:27 np0005465988 nova_compute[236126]: 2025-10-02 11:57:27.806 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:57:27 np0005465988 nova_compute[236126]: 2025-10-02 11:57:27.806 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.641s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:57:28 np0005465988 nova_compute[236126]: 2025-10-02 11:57:28.801 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:28 np0005465988 nova_compute[236126]: 2025-10-02 11:57:28.817 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:28 np0005465988 nova_compute[236126]: 2025-10-02 11:57:28.818 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:57:28 np0005465988 nova_compute[236126]: 2025-10-02 11:57:28.818 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:57:28 np0005465988 nova_compute[236126]: 2025-10-02 11:57:28.832 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:57:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:28.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:29.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:30.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:31.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:57:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:32.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:33.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:34.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:35.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:36.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:57:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:37.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=404 latency=0.002000058s ======
Oct  2 07:57:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:38.080 +0000] "GET /healthcheck HTTP/1.1" 404 240 - "python-urllib3/1.26.5" - latency=0.002000058s
Oct  2 07:57:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:38.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:39.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:40.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:41.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:57:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Oct  2 07:57:42 np0005465988 podman[238775]: 2025-10-02 11:57:42.550255738 +0000 UTC m=+0.074542870 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:57:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:42.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:43.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Oct  2 07:57:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:44.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:45.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Oct  2 07:57:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:46.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:57:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:47.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.627073) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406267627108, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 1396, "num_deletes": 252, "total_data_size": 3167396, "memory_usage": 3205352, "flush_reason": "Manual Compaction"}
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406267640822, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 1249446, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20139, "largest_seqno": 21529, "table_properties": {"data_size": 1244836, "index_size": 2006, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12121, "raw_average_key_size": 20, "raw_value_size": 1234610, "raw_average_value_size": 2092, "num_data_blocks": 91, "num_entries": 590, "num_filter_entries": 590, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759406133, "oldest_key_time": 1759406133, "file_creation_time": 1759406267, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 13834 microseconds, and 5340 cpu microseconds.
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.640900) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 1249446 bytes OK
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.640924) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.642890) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.642902) EVENT_LOG_v1 {"time_micros": 1759406267642898, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.642922) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 3160924, prev total WAL file size 3160924, number of live WAL files 2.
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.643828) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353032' seq:72057594037927935, type:22 .. '6D67727374617400373535' seq:0, type:0; will stop at (end)
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(1220KB)], [39(9586KB)]
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406267643873, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 11065984, "oldest_snapshot_seqno": -1}
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 4551 keys, 8028971 bytes, temperature: kUnknown
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406267704824, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 8028971, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7998812, "index_size": 17697, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11397, "raw_key_size": 113548, "raw_average_key_size": 24, "raw_value_size": 7916539, "raw_average_value_size": 1739, "num_data_blocks": 732, "num_entries": 4551, "num_filter_entries": 4551, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759406267, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.705124) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 8028971 bytes
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.707172) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 181.3 rd, 131.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 9.4 +0.0 blob) out(7.7 +0.0 blob), read-write-amplify(15.3) write-amplify(6.4) OK, records in: 5020, records dropped: 469 output_compression: NoCompression
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.707191) EVENT_LOG_v1 {"time_micros": 1759406267707181, "job": 22, "event": "compaction_finished", "compaction_time_micros": 61043, "compaction_time_cpu_micros": 29218, "output_level": 6, "num_output_files": 1, "total_output_size": 8028971, "num_input_records": 5020, "num_output_records": 4551, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406267707590, "job": 22, "event": "table_file_deletion", "file_number": 41}
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406267709928, "job": 22, "event": "table_file_deletion", "file_number": 39}
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.643747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.710059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.710069) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.710076) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.710077) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:57:47 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-11:57:47.710079) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 07:57:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:48.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Oct  2 07:57:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:49.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:50.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:51.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:57:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Oct  2 07:57:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:52.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:53.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:54.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:55.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:55 np0005465988 podman[238984]: 2025-10-02 11:57:55.53840812 +0000 UTC m=+0.073100588 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 07:57:55 np0005465988 podman[238983]: 2025-10-02 11:57:55.577420945 +0000 UTC m=+0.109741584 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct  2 07:57:55 np0005465988 podman[238985]: 2025-10-02 11:57:55.583955405 +0000 UTC m=+0.099840466 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 07:57:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:56.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:57:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:57:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:57:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:57.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:58 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:57:58 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:57:58 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:57:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:57:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:57:58.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:57:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:57:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:57:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:57:59.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:57:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:57:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:58:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:01.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:01.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:58:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:03.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:03.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:05.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:05.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:07.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:58:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:07.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:07 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:58:07 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:58:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:09.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:09.055 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 07:58:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:09.056 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 07:58:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:09.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:58:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:11.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:58:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:58:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:11.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:58:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:58:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:13.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:13 np0005465988 podman[239159]: 2025-10-02 11:58:13.557729001 +0000 UTC m=+0.088171846 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 07:58:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:13.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:15.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:15.058 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:58:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:15.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:17.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:58:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:17.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:19.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:19.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Oct  2 07:58:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Oct  2 07:58:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:58:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:21.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:58:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:21.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:58:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:23.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:23 np0005465988 nova_compute[236126]: 2025-10-02 11:58:23.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:23 np0005465988 nova_compute[236126]: 2025-10-02 11:58:23.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 07:58:23 np0005465988 nova_compute[236126]: 2025-10-02 11:58:23.535 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 07:58:23 np0005465988 nova_compute[236126]: 2025-10-02 11:58:23.536 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:23 np0005465988 nova_compute[236126]: 2025-10-02 11:58:23.536 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 07:58:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:23.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:23 np0005465988 nova_compute[236126]: 2025-10-02 11:58:23.605 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:25.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:25.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:25 np0005465988 nova_compute[236126]: 2025-10-02 11:58:25.617 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:25 np0005465988 nova_compute[236126]: 2025-10-02 11:58:25.617 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:25 np0005465988 nova_compute[236126]: 2025-10-02 11:58:25.618 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:26 np0005465988 nova_compute[236126]: 2025-10-02 11:58:26.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:26 np0005465988 nova_compute[236126]: 2025-10-02 11:58:26.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:26 np0005465988 nova_compute[236126]: 2025-10-02 11:58:26.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:26 np0005465988 nova_compute[236126]: 2025-10-02 11:58:26.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:58:26 np0005465988 nova_compute[236126]: 2025-10-02 11:58:26.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:26 np0005465988 podman[239187]: 2025-10-02 11:58:26.537283594 +0000 UTC m=+0.063559750 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:58:26 np0005465988 nova_compute[236126]: 2025-10-02 11:58:26.554 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:26 np0005465988 nova_compute[236126]: 2025-10-02 11:58:26.554 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:26 np0005465988 nova_compute[236126]: 2025-10-02 11:58:26.555 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:26 np0005465988 nova_compute[236126]: 2025-10-02 11:58:26.555 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:58:26 np0005465988 nova_compute[236126]: 2025-10-02 11:58:26.555 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:58:26 np0005465988 podman[239188]: 2025-10-02 11:58:26.557836012 +0000 UTC m=+0.079327449 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:58:26 np0005465988 podman[239186]: 2025-10-02 11:58:26.58663 +0000 UTC m=+0.112213896 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2)
Oct  2 07:58:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:58:27 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1899045596' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:58:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:27.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:27 np0005465988 nova_compute[236126]: 2025-10-02 11:58:27.040 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:58:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct  2 07:58:27 np0005465988 nova_compute[236126]: 2025-10-02 11:58:27.237 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:58:27 np0005465988 nova_compute[236126]: 2025-10-02 11:58:27.238 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5302MB free_disk=20.967517852783203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:58:27 np0005465988 nova_compute[236126]: 2025-10-02 11:58:27.238 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:27 np0005465988 nova_compute[236126]: 2025-10-02 11:58:27.238 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:27.325 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:27.326 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:27.326 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:27.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:27 np0005465988 nova_compute[236126]: 2025-10-02 11:58:27.621 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:58:27 np0005465988 nova_compute[236126]: 2025-10-02 11:58:27.621 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:58:27 np0005465988 nova_compute[236126]: 2025-10-02 11:58:27.697 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 07:58:27 np0005465988 nova_compute[236126]: 2025-10-02 11:58:27.772 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 07:58:27 np0005465988 nova_compute[236126]: 2025-10-02 11:58:27.772 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 07:58:27 np0005465988 nova_compute[236126]: 2025-10-02 11:58:27.790 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 07:58:27 np0005465988 nova_compute[236126]: 2025-10-02 11:58:27.820 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 07:58:27 np0005465988 nova_compute[236126]: 2025-10-02 11:58:27.838 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:58:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Oct  2 07:58:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:58:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3537283019' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:58:28 np0005465988 nova_compute[236126]: 2025-10-02 11:58:28.319 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:58:28 np0005465988 nova_compute[236126]: 2025-10-02 11:58:28.326 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:58:28 np0005465988 nova_compute[236126]: 2025-10-02 11:58:28.398 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:58:28 np0005465988 nova_compute[236126]: 2025-10-02 11:58:28.400 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:58:28 np0005465988 nova_compute[236126]: 2025-10-02 11:58:28.401 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:29.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:29 np0005465988 nova_compute[236126]: 2025-10-02 11:58:29.401 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:29.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:30 np0005465988 nova_compute[236126]: 2025-10-02 11:58:30.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:30 np0005465988 nova_compute[236126]: 2025-10-02 11:58:30.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:58:30 np0005465988 nova_compute[236126]: 2025-10-02 11:58:30.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:58:30 np0005465988 nova_compute[236126]: 2025-10-02 11:58:30.497 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:58:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:31.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:31.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 07:58:32 np0005465988 nova_compute[236126]: 2025-10-02 11:58:32.138 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Acquiring lock "5902a7e7-5ff2-4bb5-a497-e584b578908e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:32 np0005465988 nova_compute[236126]: 2025-10-02 11:58:32.139 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lock "5902a7e7-5ff2-4bb5-a497-e584b578908e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:32 np0005465988 nova_compute[236126]: 2025-10-02 11:58:32.161 2 DEBUG nova.compute.manager [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 07:58:32 np0005465988 nova_compute[236126]: 2025-10-02 11:58:32.363 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:32 np0005465988 nova_compute[236126]: 2025-10-02 11:58:32.364 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:32 np0005465988 nova_compute[236126]: 2025-10-02 11:58:32.378 2 DEBUG nova.virt.hardware [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 07:58:32 np0005465988 nova_compute[236126]: 2025-10-02 11:58:32.378 2 INFO nova.compute.claims [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 07:58:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:33.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:33 np0005465988 nova_compute[236126]: 2025-10-02 11:58:33.125 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:58:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:58:33 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4139404090' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:58:33 np0005465988 nova_compute[236126]: 2025-10-02 11:58:33.575 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:58:33 np0005465988 nova_compute[236126]: 2025-10-02 11:58:33.581 2 DEBUG nova.compute.provider_tree [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:58:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:33.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:33 np0005465988 nova_compute[236126]: 2025-10-02 11:58:33.660 2 DEBUG nova.scheduler.client.report [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:58:33 np0005465988 nova_compute[236126]: 2025-10-02 11:58:33.748 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:33 np0005465988 nova_compute[236126]: 2025-10-02 11:58:33.749 2 DEBUG nova.compute.manager [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 07:58:33 np0005465988 nova_compute[236126]: 2025-10-02 11:58:33.829 2 DEBUG nova.compute.manager [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 07:58:33 np0005465988 nova_compute[236126]: 2025-10-02 11:58:33.830 2 DEBUG nova.network.neutron [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 07:58:34 np0005465988 nova_compute[236126]: 2025-10-02 11:58:34.108 2 INFO nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 07:58:34 np0005465988 nova_compute[236126]: 2025-10-02 11:58:34.168 2 DEBUG nova.compute.manager [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 07:58:34 np0005465988 nova_compute[236126]: 2025-10-02 11:58:34.747 2 DEBUG nova.compute.manager [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 07:58:34 np0005465988 nova_compute[236126]: 2025-10-02 11:58:34.749 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 07:58:34 np0005465988 nova_compute[236126]: 2025-10-02 11:58:34.749 2 INFO nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Creating image(s)#033[00m
Oct  2 07:58:34 np0005465988 nova_compute[236126]: 2025-10-02 11:58:34.786 2 DEBUG nova.storage.rbd_utils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] rbd image 5902a7e7-5ff2-4bb5-a497-e584b578908e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 07:58:34 np0005465988 nova_compute[236126]: 2025-10-02 11:58:34.817 2 DEBUG nova.storage.rbd_utils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] rbd image 5902a7e7-5ff2-4bb5-a497-e584b578908e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 07:58:34 np0005465988 nova_compute[236126]: 2025-10-02 11:58:34.844 2 DEBUG nova.storage.rbd_utils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] rbd image 5902a7e7-5ff2-4bb5-a497-e584b578908e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 07:58:34 np0005465988 nova_compute[236126]: 2025-10-02 11:58:34.847 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:34 np0005465988 nova_compute[236126]: 2025-10-02 11:58:34.848 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:35.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:35 np0005465988 nova_compute[236126]: 2025-10-02 11:58:35.140 2 DEBUG nova.network.neutron [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Automatically allocating a network for project 8972026d0f3a4bf4b6debd9555f9225c. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460#033[00m
Oct  2 07:58:35 np0005465988 nova_compute[236126]: 2025-10-02 11:58:35.352 2 DEBUG nova.virt.libvirt.imagebackend [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Image locations are: [{'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/c2d0c2bc-fe21-4689-86ae-d6728c15874c/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/c2d0c2bc-fe21-4689-86ae-d6728c15874c/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Oct  2 07:58:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:35.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:36 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Oct  2 07:58:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:58:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:37.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:58:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 07:58:37 np0005465988 nova_compute[236126]: 2025-10-02 11:58:37.179 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:58:37 np0005465988 nova_compute[236126]: 2025-10-02 11:58:37.257 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968.part --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:58:37 np0005465988 nova_compute[236126]: 2025-10-02 11:58:37.259 2 DEBUG nova.virt.images [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] c2d0c2bc-fe21-4689-86ae-d6728c15874c was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  2 07:58:37 np0005465988 nova_compute[236126]: 2025-10-02 11:58:37.260 2 DEBUG nova.privsep.utils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 07:58:37 np0005465988 nova_compute[236126]: 2025-10-02 11:58:37.260 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968.part /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:58:37 np0005465988 nova_compute[236126]: 2025-10-02 11:58:37.454 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968.part /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968.converted" returned: 0 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:58:37 np0005465988 nova_compute[236126]: 2025-10-02 11:58:37.462 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:58:37 np0005465988 nova_compute[236126]: 2025-10-02 11:58:37.538 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968.converted --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:58:37 np0005465988 nova_compute[236126]: 2025-10-02 11:58:37.540 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:37 np0005465988 nova_compute[236126]: 2025-10-02 11:58:37.571 2 DEBUG nova.storage.rbd_utils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] rbd image 5902a7e7-5ff2-4bb5-a497-e584b578908e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 07:58:37 np0005465988 nova_compute[236126]: 2025-10-02 11:58:37.576 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 5902a7e7-5ff2-4bb5-a497-e584b578908e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:58:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:37.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:37 np0005465988 nova_compute[236126]: 2025-10-02 11:58:37.942 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 5902a7e7-5ff2-4bb5-a497-e584b578908e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.367s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:58:38 np0005465988 nova_compute[236126]: 2025-10-02 11:58:38.024 2 DEBUG nova.storage.rbd_utils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] resizing rbd image 5902a7e7-5ff2-4bb5-a497-e584b578908e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 07:58:38 np0005465988 nova_compute[236126]: 2025-10-02 11:58:38.161 2 DEBUG nova.objects.instance [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lazy-loading 'migration_context' on Instance uuid 5902a7e7-5ff2-4bb5-a497-e584b578908e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 07:58:38 np0005465988 nova_compute[236126]: 2025-10-02 11:58:38.192 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 07:58:38 np0005465988 nova_compute[236126]: 2025-10-02 11:58:38.193 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Ensure instance console log exists: /var/lib/nova/instances/5902a7e7-5ff2-4bb5-a497-e584b578908e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 07:58:38 np0005465988 nova_compute[236126]: 2025-10-02 11:58:38.193 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:38 np0005465988 nova_compute[236126]: 2025-10-02 11:58:38.194 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:38 np0005465988 nova_compute[236126]: 2025-10-02 11:58:38.194 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:39.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:39.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:41.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:41.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 07:58:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Oct  2 07:58:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:43.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:43.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Oct  2 07:58:44 np0005465988 podman[239554]: 2025-10-02 11:58:44.531813306 +0000 UTC m=+0.072008375 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 07:58:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:45.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:45.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:47.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 07:58:47 np0005465988 nova_compute[236126]: 2025-10-02 11:58:47.494 2 DEBUG nova.network.neutron [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Automatically allocated network: {'id': '48e5c857-28d2-421a-9519-d32a13037daa', 'name': 'auto_allocated_network', 'tenant_id': '8972026d0f3a4bf4b6debd9555f9225c', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['16e0798e-6426-4773-8f30-55d7d7bbe4dc', 'ad194778-c6b1-4bb8-a50b-11049b061235'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-10-02T11:58:36Z', 'updated_at': '2025-10-02T11:58:47Z', 'revision_number': 4, 'project_id': '8972026d0f3a4bf4b6debd9555f9225c'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478#033[00m
Oct  2 07:58:47 np0005465988 nova_compute[236126]: 2025-10-02 11:58:47.510 2 WARNING oslo_policy.policy [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Oct  2 07:58:47 np0005465988 nova_compute[236126]: 2025-10-02 11:58:47.511 2 WARNING oslo_policy.policy [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Oct  2 07:58:47 np0005465988 nova_compute[236126]: 2025-10-02 11:58:47.515 2 DEBUG nova.policy [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17903cd0333c407b96f0aede6dd3b16c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8972026d0f3a4bf4b6debd9555f9225c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 07:58:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:47.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:49.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:49 np0005465988 nova_compute[236126]: 2025-10-02 11:58:49.193 2 DEBUG nova.network.neutron [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Successfully created port: 8f48e31c-b017-4024-9753-0ae3fd1e22c4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 07:58:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:49.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:51.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:51 np0005465988 nova_compute[236126]: 2025-10-02 11:58:51.401 2 DEBUG nova.network.neutron [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Successfully updated port: 8f48e31c-b017-4024-9753-0ae3fd1e22c4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 07:58:51 np0005465988 nova_compute[236126]: 2025-10-02 11:58:51.427 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Acquiring lock "refresh_cache-5902a7e7-5ff2-4bb5-a497-e584b578908e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:58:51 np0005465988 nova_compute[236126]: 2025-10-02 11:58:51.427 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Acquired lock "refresh_cache-5902a7e7-5ff2-4bb5-a497-e584b578908e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:58:51 np0005465988 nova_compute[236126]: 2025-10-02 11:58:51.428 2 DEBUG nova.network.neutron [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 07:58:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:51.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:51 np0005465988 nova_compute[236126]: 2025-10-02 11:58:51.787 2 DEBUG nova.network.neutron [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 07:58:51 np0005465988 nova_compute[236126]: 2025-10-02 11:58:51.980 2 DEBUG nova.compute.manager [req-67980eda-291c-41e6-9ca6-57afc5f93332 req-2917f8d1-0ab9-47be-b9c9-fd73ca6b4d3b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Received event network-changed-8f48e31c-b017-4024-9753-0ae3fd1e22c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 07:58:51 np0005465988 nova_compute[236126]: 2025-10-02 11:58:51.980 2 DEBUG nova.compute.manager [req-67980eda-291c-41e6-9ca6-57afc5f93332 req-2917f8d1-0ab9-47be-b9c9-fd73ca6b4d3b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Refreshing instance network info cache due to event network-changed-8f48e31c-b017-4024-9753-0ae3fd1e22c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 07:58:51 np0005465988 nova_compute[236126]: 2025-10-02 11:58:51.981 2 DEBUG oslo_concurrency.lockutils [req-67980eda-291c-41e6-9ca6-57afc5f93332 req-2917f8d1-0ab9-47be-b9c9-fd73ca6b4d3b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-5902a7e7-5ff2-4bb5-a497-e584b578908e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:58:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 07:58:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e137 e137: 3 total, 3 up, 3 in
Oct  2 07:58:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:53.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:58:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:53.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:53 np0005465988 nova_compute[236126]: 2025-10-02 11:58:53.899 2 DEBUG nova.network.neutron [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Updating instance_info_cache with network_info: [{"id": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "address": "fa:16:3e:bb:1d:4d", "network": {"id": "48e5c857-28d2-421a-9519-d32a13037daa", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::2ab", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8972026d0f3a4bf4b6debd9555f9225c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f48e31c-b0", "ovs_interfaceid": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 07:58:53 np0005465988 nova_compute[236126]: 2025-10-02 11:58:53.994 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Releasing lock "refresh_cache-5902a7e7-5ff2-4bb5-a497-e584b578908e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:58:53 np0005465988 nova_compute[236126]: 2025-10-02 11:58:53.995 2 DEBUG nova.compute.manager [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Instance network_info: |[{"id": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "address": "fa:16:3e:bb:1d:4d", "network": {"id": "48e5c857-28d2-421a-9519-d32a13037daa", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::2ab", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8972026d0f3a4bf4b6debd9555f9225c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f48e31c-b0", "ovs_interfaceid": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 07:58:53 np0005465988 nova_compute[236126]: 2025-10-02 11:58:53.995 2 DEBUG oslo_concurrency.lockutils [req-67980eda-291c-41e6-9ca6-57afc5f93332 req-2917f8d1-0ab9-47be-b9c9-fd73ca6b4d3b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-5902a7e7-5ff2-4bb5-a497-e584b578908e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:58:53 np0005465988 nova_compute[236126]: 2025-10-02 11:58:53.996 2 DEBUG nova.network.neutron [req-67980eda-291c-41e6-9ca6-57afc5f93332 req-2917f8d1-0ab9-47be-b9c9-fd73ca6b4d3b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Refreshing network info cache for port 8f48e31c-b017-4024-9753-0ae3fd1e22c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.003 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Start _get_guest_xml network_info=[{"id": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "address": "fa:16:3e:bb:1d:4d", "network": {"id": "48e5c857-28d2-421a-9519-d32a13037daa", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::2ab", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8972026d0f3a4bf4b6debd9555f9225c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f48e31c-b0", "ovs_interfaceid": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.009 2 WARNING nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.016 2 DEBUG nova.virt.libvirt.host [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.017 2 DEBUG nova.virt.libvirt.host [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.023 2 DEBUG nova.virt.libvirt.host [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.024 2 DEBUG nova.virt.libvirt.host [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.025 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.026 2 DEBUG nova.virt.hardware [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.027 2 DEBUG nova.virt.hardware [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.027 2 DEBUG nova.virt.hardware [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.028 2 DEBUG nova.virt.hardware [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.028 2 DEBUG nova.virt.hardware [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.029 2 DEBUG nova.virt.hardware [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.029 2 DEBUG nova.virt.hardware [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.030 2 DEBUG nova.virt.hardware [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.030 2 DEBUG nova.virt.hardware [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.031 2 DEBUG nova.virt.hardware [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.031 2 DEBUG nova.virt.hardware [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.038 2 DEBUG nova.privsep.utils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.039 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:58:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 07:58:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3825480791' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.524 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.562 2 DEBUG nova.storage.rbd_utils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] rbd image 5902a7e7-5ff2-4bb5-a497-e584b578908e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.568 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:58:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 07:58:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/59783456' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.987 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.990 2 DEBUG nova.virt.libvirt.vif [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T11:58:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-524116669-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-524116669-2',id=3,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8972026d0f3a4bf4b6debd9555f9225c',ramdisk_id='',reservation_id='r-5dqt1vqy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-379631237',owner_user_name='tempest-AutoAllocateNetworkTest-379631237-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T11:58:34Z,user_data=None,user_id='17903cd0333c407b96f0aede6dd3b16c',uuid=5902a7e7-5ff2-4bb5-a497-e584b578908e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "address": "fa:16:3e:bb:1d:4d", "network": {"id": "48e5c857-28d2-421a-9519-d32a13037daa", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::2ab", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8972026d0f3a4bf4b6debd9555f9225c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f48e31c-b0", "ovs_interfaceid": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.990 2 DEBUG nova.network.os_vif_util [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Converting VIF {"id": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "address": "fa:16:3e:bb:1d:4d", "network": {"id": "48e5c857-28d2-421a-9519-d32a13037daa", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::2ab", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8972026d0f3a4bf4b6debd9555f9225c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f48e31c-b0", "ovs_interfaceid": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.992 2 DEBUG nova.network.os_vif_util [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:1d:4d,bridge_name='br-int',has_traffic_filtering=True,id=8f48e31c-b017-4024-9753-0ae3fd1e22c4,network=Network(48e5c857-28d2-421a-9519-d32a13037daa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f48e31c-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 07:58:54 np0005465988 nova_compute[236126]: 2025-10-02 11:58:54.994 2 DEBUG nova.objects.instance [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lazy-loading 'pci_devices' on Instance uuid 5902a7e7-5ff2-4bb5-a497-e584b578908e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 07:58:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:55.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.143 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] End _get_guest_xml xml=<domain type="kvm">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  <uuid>5902a7e7-5ff2-4bb5-a497-e584b578908e</uuid>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  <name>instance-00000003</name>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <nova:name>tempest-tempest.common.compute-instance-524116669-2</nova:name>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 11:58:54</nova:creationTime>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <nova:user uuid="17903cd0333c407b96f0aede6dd3b16c">tempest-AutoAllocateNetworkTest-379631237-project-member</nova:user>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <nova:project uuid="8972026d0f3a4bf4b6debd9555f9225c">tempest-AutoAllocateNetworkTest-379631237</nova:project>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <nova:port uuid="8f48e31c-b017-4024-9753-0ae3fd1e22c4">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.1.0.68" ipVersion="4"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="fdfe:381f:8400:1::2ab" ipVersion="6"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <system>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <entry name="serial">5902a7e7-5ff2-4bb5-a497-e584b578908e</entry>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <entry name="uuid">5902a7e7-5ff2-4bb5-a497-e584b578908e</entry>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    </system>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  <os>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  </os>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  <features>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  </features>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  </clock>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  <devices>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/5902a7e7-5ff2-4bb5-a497-e584b578908e_disk">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      </source>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      </auth>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    </disk>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/5902a7e7-5ff2-4bb5-a497-e584b578908e_disk.config">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      </source>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      </auth>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    </disk>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:bb:1d:4d"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <target dev="tap8f48e31c-b0"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    </interface>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/5902a7e7-5ff2-4bb5-a497-e584b578908e/console.log" append="off"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    </serial>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <video>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    </video>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    </rng>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 07:58:55 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 07:58:55 np0005465988 nova_compute[236126]:  </devices>
Oct  2 07:58:55 np0005465988 nova_compute[236126]: </domain>
Oct  2 07:58:55 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.143 2 DEBUG nova.compute.manager [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Preparing to wait for external event network-vif-plugged-8f48e31c-b017-4024-9753-0ae3fd1e22c4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.144 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Acquiring lock "5902a7e7-5ff2-4bb5-a497-e584b578908e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.144 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lock "5902a7e7-5ff2-4bb5-a497-e584b578908e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.144 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lock "5902a7e7-5ff2-4bb5-a497-e584b578908e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.145 2 DEBUG nova.virt.libvirt.vif [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T11:58:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-524116669-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-524116669-2',id=3,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8972026d0f3a4bf4b6debd9555f9225c',ramdisk_id='',reservation_id='r-5dqt1vqy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-379631237',owner_user_name='tempest-AutoAllocateNetworkTest-379631237-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T11:58:34Z,user_data=None,user_id='17903cd0333c407b96f0aede6dd3b16c',uuid=5902a7e7-5ff2-4bb5-a497-e584b578908e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "address": "fa:16:3e:bb:1d:4d", "network": {"id": "48e5c857-28d2-421a-9519-d32a13037daa", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::2ab", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8972026d0f3a4bf4b6debd9555f9225c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f48e31c-b0", "ovs_interfaceid": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.145 2 DEBUG nova.network.os_vif_util [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Converting VIF {"id": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "address": "fa:16:3e:bb:1d:4d", "network": {"id": "48e5c857-28d2-421a-9519-d32a13037daa", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::2ab", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8972026d0f3a4bf4b6debd9555f9225c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f48e31c-b0", "ovs_interfaceid": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.146 2 DEBUG nova.network.os_vif_util [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:1d:4d,bridge_name='br-int',has_traffic_filtering=True,id=8f48e31c-b017-4024-9753-0ae3fd1e22c4,network=Network(48e5c857-28d2-421a-9519-d32a13037daa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f48e31c-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.147 2 DEBUG os_vif [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:1d:4d,bridge_name='br-int',has_traffic_filtering=True,id=8f48e31c-b017-4024-9753-0ae3fd1e22c4,network=Network(48e5c857-28d2-421a-9519-d32a13037daa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f48e31c-b0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.181 2 DEBUG ovsdbapp.backend.ovs_idl [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.182 2 DEBUG ovsdbapp.backend.ovs_idl [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.182 2 DEBUG ovsdbapp.backend.ovs_idl [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [POLLOUT] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.199 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.200 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.201 2 INFO oslo.privsep.daemon [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp7r87pbqf/privsep.sock']#033[00m
Oct  2 07:58:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 07:58:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1787676681' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 07:58:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 07:58:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1787676681' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 07:58:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:55.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.975 2 INFO oslo.privsep.daemon [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.834 566 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.841 566 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.844 566 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Oct  2 07:58:55 np0005465988 nova_compute[236126]: 2025-10-02 11:58:55.844 566 INFO oslo.privsep.daemon [-] privsep daemon running as pid 566#033[00m
Oct  2 07:58:56 np0005465988 nova_compute[236126]: 2025-10-02 11:58:56.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:58:56 np0005465988 nova_compute[236126]: 2025-10-02 11:58:56.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:58:56 np0005465988 nova_compute[236126]: 2025-10-02 11:58:56.306 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f48e31c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:58:56 np0005465988 nova_compute[236126]: 2025-10-02 11:58:56.307 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8f48e31c-b0, col_values=(('external_ids', {'iface-id': '8f48e31c-b017-4024-9753-0ae3fd1e22c4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:1d:4d', 'vm-uuid': '5902a7e7-5ff2-4bb5-a497-e584b578908e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:58:56 np0005465988 nova_compute[236126]: 2025-10-02 11:58:56.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:58:56 np0005465988 NetworkManager[45041]: <info>  [1759406336.3098] manager: (tap8f48e31c-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Oct  2 07:58:56 np0005465988 nova_compute[236126]: 2025-10-02 11:58:56.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 07:58:56 np0005465988 nova_compute[236126]: 2025-10-02 11:58:56.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:58:56 np0005465988 nova_compute[236126]: 2025-10-02 11:58:56.318 2 INFO os_vif [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:1d:4d,bridge_name='br-int',has_traffic_filtering=True,id=8f48e31c-b017-4024-9753-0ae3fd1e22c4,network=Network(48e5c857-28d2-421a-9519-d32a13037daa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f48e31c-b0')#033[00m
Oct  2 07:58:56 np0005465988 nova_compute[236126]: 2025-10-02 11:58:56.388 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 07:58:56 np0005465988 nova_compute[236126]: 2025-10-02 11:58:56.388 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 07:58:56 np0005465988 nova_compute[236126]: 2025-10-02 11:58:56.389 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] No VIF found with MAC fa:16:3e:bb:1d:4d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 07:58:56 np0005465988 nova_compute[236126]: 2025-10-02 11:58:56.389 2 INFO nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Using config drive#033[00m
Oct  2 07:58:56 np0005465988 nova_compute[236126]: 2025-10-02 11:58:56.418 2 DEBUG nova.storage.rbd_utils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] rbd image 5902a7e7-5ff2-4bb5-a497-e584b578908e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 07:58:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:57.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 07:58:57 np0005465988 nova_compute[236126]: 2025-10-02 11:58:57.405 2 INFO nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Creating config drive at /var/lib/nova/instances/5902a7e7-5ff2-4bb5-a497-e584b578908e/disk.config#033[00m
Oct  2 07:58:57 np0005465988 nova_compute[236126]: 2025-10-02 11:58:57.410 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5902a7e7-5ff2-4bb5-a497-e584b578908e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpckjedinh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:58:57 np0005465988 podman[239726]: 2025-10-02 11:58:57.528513315 +0000 UTC m=+0.055776424 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, container_name=iscsid)
Oct  2 07:58:57 np0005465988 podman[239727]: 2025-10-02 11:58:57.534141019 +0000 UTC m=+0.059635176 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 07:58:57 np0005465988 nova_compute[236126]: 2025-10-02 11:58:57.551 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5902a7e7-5ff2-4bb5-a497-e584b578908e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpckjedinh" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:58:57 np0005465988 podman[239723]: 2025-10-02 11:58:57.553911654 +0000 UTC m=+0.084852090 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 07:58:57 np0005465988 nova_compute[236126]: 2025-10-02 11:58:57.582 2 DEBUG nova.storage.rbd_utils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] rbd image 5902a7e7-5ff2-4bb5-a497-e584b578908e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 07:58:57 np0005465988 nova_compute[236126]: 2025-10-02 11:58:57.587 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5902a7e7-5ff2-4bb5-a497-e584b578908e/disk.config 5902a7e7-5ff2-4bb5-a497-e584b578908e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:58:57 np0005465988 nova_compute[236126]: 2025-10-02 11:58:57.611 2 DEBUG nova.network.neutron [req-67980eda-291c-41e6-9ca6-57afc5f93332 req-2917f8d1-0ab9-47be-b9c9-fd73ca6b4d3b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Updated VIF entry in instance network info cache for port 8f48e31c-b017-4024-9753-0ae3fd1e22c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 07:58:57 np0005465988 nova_compute[236126]: 2025-10-02 11:58:57.611 2 DEBUG nova.network.neutron [req-67980eda-291c-41e6-9ca6-57afc5f93332 req-2917f8d1-0ab9-47be-b9c9-fd73ca6b4d3b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Updating instance_info_cache with network_info: [{"id": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "address": "fa:16:3e:bb:1d:4d", "network": {"id": "48e5c857-28d2-421a-9519-d32a13037daa", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::2ab", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8972026d0f3a4bf4b6debd9555f9225c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f48e31c-b0", "ovs_interfaceid": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 07:58:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:57.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:57 np0005465988 nova_compute[236126]: 2025-10-02 11:58:57.630 2 DEBUG oslo_concurrency.lockutils [req-67980eda-291c-41e6-9ca6-57afc5f93332 req-2917f8d1-0ab9-47be-b9c9-fd73ca6b4d3b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-5902a7e7-5ff2-4bb5-a497-e584b578908e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:58:57 np0005465988 nova_compute[236126]: 2025-10-02 11:58:57.752 2 DEBUG oslo_concurrency.processutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5902a7e7-5ff2-4bb5-a497-e584b578908e/disk.config 5902a7e7-5ff2-4bb5-a497-e584b578908e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.165s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:58:57 np0005465988 nova_compute[236126]: 2025-10-02 11:58:57.753 2 INFO nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Deleting local config drive /var/lib/nova/instances/5902a7e7-5ff2-4bb5-a497-e584b578908e/disk.config because it was imported into RBD.#033[00m
Oct  2 07:58:57 np0005465988 systemd[1]: Starting libvirt secret daemon...
Oct  2 07:58:57 np0005465988 systemd[1]: Started libvirt secret daemon.
Oct  2 07:58:57 np0005465988 kernel: tun: Universal TUN/TAP device driver, 1.6
Oct  2 07:58:57 np0005465988 kernel: tap8f48e31c-b0: entered promiscuous mode
Oct  2 07:58:57 np0005465988 NetworkManager[45041]: <info>  [1759406337.9101] manager: (tap8f48e31c-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Oct  2 07:58:57 np0005465988 ovn_controller[132601]: 2025-10-02T11:58:57Z|00027|binding|INFO|Claiming lport 8f48e31c-b017-4024-9753-0ae3fd1e22c4 for this chassis.
Oct  2 07:58:57 np0005465988 ovn_controller[132601]: 2025-10-02T11:58:57Z|00028|binding|INFO|8f48e31c-b017-4024-9753-0ae3fd1e22c4: Claiming fa:16:3e:bb:1d:4d 10.1.0.68 fdfe:381f:8400:1::2ab
Oct  2 07:58:57 np0005465988 nova_compute[236126]: 2025-10-02 11:58:57.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:58:57 np0005465988 nova_compute[236126]: 2025-10-02 11:58:57.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:58:57 np0005465988 systemd-udevd[239855]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:58:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:57.964 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:1d:4d 10.1.0.68 fdfe:381f:8400:1::2ab'], port_security=['fa:16:3e:bb:1d:4d 10.1.0.68 fdfe:381f:8400:1::2ab'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.68/26 fdfe:381f:8400:1::2ab/64', 'neutron:device_id': '5902a7e7-5ff2-4bb5-a497-e584b578908e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e5c857-28d2-421a-9519-d32a13037daa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8972026d0f3a4bf4b6debd9555f9225c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c1955bc9-f08c-4e28-af03-54d4a3949aee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0c8e46c-a173-4ff5-bd2b-7026f16b2de8, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=8f48e31c-b017-4024-9753-0ae3fd1e22c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 07:58:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:57.966 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 8f48e31c-b017-4024-9753-0ae3fd1e22c4 in datapath 48e5c857-28d2-421a-9519-d32a13037daa bound to our chassis#033[00m
Oct  2 07:58:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:57.969 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e5c857-28d2-421a-9519-d32a13037daa#033[00m
Oct  2 07:58:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:57.971 142124 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpbh_82c6h/privsep.sock']#033[00m
Oct  2 07:58:57 np0005465988 NetworkManager[45041]: <info>  [1759406337.9755] device (tap8f48e31c-b0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:58:57 np0005465988 NetworkManager[45041]: <info>  [1759406337.9765] device (tap8f48e31c-b0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 07:58:57 np0005465988 systemd-machined[192594]: New machine qemu-1-instance-00000003.
Oct  2 07:58:57 np0005465988 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Oct  2 07:58:58 np0005465988 nova_compute[236126]: 2025-10-02 11:58:58.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:58:58 np0005465988 ovn_controller[132601]: 2025-10-02T11:58:58Z|00029|binding|INFO|Setting lport 8f48e31c-b017-4024-9753-0ae3fd1e22c4 ovn-installed in OVS
Oct  2 07:58:58 np0005465988 ovn_controller[132601]: 2025-10-02T11:58:58Z|00030|binding|INFO|Setting lport 8f48e31c-b017-4024-9753-0ae3fd1e22c4 up in Southbound
Oct  2 07:58:58 np0005465988 nova_compute[236126]: 2025-10-02 11:58:58.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:58:58 np0005465988 nova_compute[236126]: 2025-10-02 11:58:58.507 2 DEBUG nova.compute.manager [req-b7f76c43-93ff-4899-94f5-e8ff0dfe3183 req-a0fb1830-f970-4ff7-9f95-30b1c25f7565 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Received event network-vif-plugged-8f48e31c-b017-4024-9753-0ae3fd1e22c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 07:58:58 np0005465988 nova_compute[236126]: 2025-10-02 11:58:58.508 2 DEBUG oslo_concurrency.lockutils [req-b7f76c43-93ff-4899-94f5-e8ff0dfe3183 req-a0fb1830-f970-4ff7-9f95-30b1c25f7565 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "5902a7e7-5ff2-4bb5-a497-e584b578908e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:58 np0005465988 nova_compute[236126]: 2025-10-02 11:58:58.508 2 DEBUG oslo_concurrency.lockutils [req-b7f76c43-93ff-4899-94f5-e8ff0dfe3183 req-a0fb1830-f970-4ff7-9f95-30b1c25f7565 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5902a7e7-5ff2-4bb5-a497-e584b578908e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:58 np0005465988 nova_compute[236126]: 2025-10-02 11:58:58.508 2 DEBUG oslo_concurrency.lockutils [req-b7f76c43-93ff-4899-94f5-e8ff0dfe3183 req-a0fb1830-f970-4ff7-9f95-30b1c25f7565 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5902a7e7-5ff2-4bb5-a497-e584b578908e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:58 np0005465988 nova_compute[236126]: 2025-10-02 11:58:58.509 2 DEBUG nova.compute.manager [req-b7f76c43-93ff-4899-94f5-e8ff0dfe3183 req-a0fb1830-f970-4ff7-9f95-30b1c25f7565 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Processing event network-vif-plugged-8f48e31c-b017-4024-9753-0ae3fd1e22c4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 07:58:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:58.776 142124 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 07:58:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:58.777 142124 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpbh_82c6h/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  2 07:58:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:58.643 239912 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 07:58:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:58.647 239912 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 07:58:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:58.649 239912 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Oct  2 07:58:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:58.649 239912 INFO oslo.privsep.daemon [-] privsep daemon running as pid 239912#033[00m
Oct  2 07:58:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:58:58.780 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9ebd64b4-4444-4d07-81e0-b51df09d95d8]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.078 2 DEBUG nova.compute.manager [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.080 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406339.0777705, 5902a7e7-5ff2-4bb5-a497-e584b578908e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.081 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] VM Started (Lifecycle Event)#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.084 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 07:58:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:58:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:58:59.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.089 2 INFO nova.virt.libvirt.driver [-] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Instance spawned successfully.#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.089 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.154 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.158 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.167 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.168 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.168 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.168 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.168 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.169 2 DEBUG nova.virt.libvirt.driver [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.184 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.184 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406339.0780332, 5902a7e7-5ff2-4bb5-a497-e584b578908e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.185 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] VM Paused (Lifecycle Event)#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.222 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.226 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406339.083714, 5902a7e7-5ff2-4bb5-a497-e584b578908e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.226 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] VM Resumed (Lifecycle Event)#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.258 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.262 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.276 2 INFO nova.compute.manager [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Took 24.53 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.278 2 DEBUG nova.compute.manager [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.286 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.356 2 INFO nova.compute.manager [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Took 27.03 seconds to build instance.#033[00m
Oct  2 07:58:59 np0005465988 nova_compute[236126]: 2025-10-02 11:58:59.373 2 DEBUG oslo_concurrency.lockutils [None req-3aa091b9-f09d-4ab0-88c9-b4bd51339a3c 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lock "5902a7e7-5ff2-4bb5-a497-e584b578908e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 27.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:58:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:58:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:58:59.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:59:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:00.097 239912 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:00.098 239912 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:00.098 239912 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:00 np0005465988 nova_compute[236126]: 2025-10-02 11:59:00.814 2 DEBUG nova.compute.manager [req-fd2ebf6b-bc27-4051-a67d-2ef6658e80e5 req-f42190ec-cb8e-4c05-8f02-7604cce2f0eb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Received event network-vif-plugged-8f48e31c-b017-4024-9753-0ae3fd1e22c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 07:59:00 np0005465988 nova_compute[236126]: 2025-10-02 11:59:00.814 2 DEBUG oslo_concurrency.lockutils [req-fd2ebf6b-bc27-4051-a67d-2ef6658e80e5 req-f42190ec-cb8e-4c05-8f02-7604cce2f0eb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "5902a7e7-5ff2-4bb5-a497-e584b578908e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:00 np0005465988 nova_compute[236126]: 2025-10-02 11:59:00.815 2 DEBUG oslo_concurrency.lockutils [req-fd2ebf6b-bc27-4051-a67d-2ef6658e80e5 req-f42190ec-cb8e-4c05-8f02-7604cce2f0eb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5902a7e7-5ff2-4bb5-a497-e584b578908e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:00 np0005465988 nova_compute[236126]: 2025-10-02 11:59:00.815 2 DEBUG oslo_concurrency.lockutils [req-fd2ebf6b-bc27-4051-a67d-2ef6658e80e5 req-f42190ec-cb8e-4c05-8f02-7604cce2f0eb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5902a7e7-5ff2-4bb5-a497-e584b578908e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:00 np0005465988 nova_compute[236126]: 2025-10-02 11:59:00.815 2 DEBUG nova.compute.manager [req-fd2ebf6b-bc27-4051-a67d-2ef6658e80e5 req-f42190ec-cb8e-4c05-8f02-7604cce2f0eb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] No waiting events found dispatching network-vif-plugged-8f48e31c-b017-4024-9753-0ae3fd1e22c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 07:59:00 np0005465988 nova_compute[236126]: 2025-10-02 11:59:00.815 2 WARNING nova.compute.manager [req-fd2ebf6b-bc27-4051-a67d-2ef6658e80e5 req-f42190ec-cb8e-4c05-8f02-7604cce2f0eb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Received unexpected event network-vif-plugged-8f48e31c-b017-4024-9753-0ae3fd1e22c4 for instance with vm_state active and task_state None.#033[00m
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.074 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7e8d7907-d073-4c5f-8947-fa1f6b5af662]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.075 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap48e5c857-21 in ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.078 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap48e5c857-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.078 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[92a69ac6-cd87-4322-a317-7472c6abe97c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.082 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7d6fdc87-b52c-46e8-8c10-e22e9a07b000]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:01.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.116 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[cdff9d4c-c06a-4fdd-82ab-3a8b4dd1d23e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.144 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0cdbebf0-f4ff-47d3-a96b-ac9713989127]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.146 142124 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpbbl2llq3/privsep.sock']#033[00m
Oct  2 07:59:01 np0005465988 nova_compute[236126]: 2025-10-02 11:59:01.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:01 np0005465988 nova_compute[236126]: 2025-10-02 11:59:01.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:01.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.933 142124 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.935 142124 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpbbl2llq3/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.738 239928 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.746 239928 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.750 239928 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.750 239928 INFO oslo.privsep.daemon [-] privsep daemon running as pid 239928#033[00m
Oct  2 07:59:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:01.938 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4a49abef-5317-4d55-97b5-749b073fa09d]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 07:59:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:02.542 239928 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:02.542 239928 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:02.542 239928 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:03.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.147 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5e575016-5163-46c2-ba2b-570dcd8eddc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.153 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[13d0752d-5f3d-4648-9117-67b4e9dcfd94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:03 np0005465988 NetworkManager[45041]: <info>  [1759406343.1621] manager: (tap48e5c857-20): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Oct  2 07:59:03 np0005465988 systemd-udevd[239941]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.195 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[498ef604-333e-4cb9-ace1-7570239c9e95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.199 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ed5250-3ca5-49f6-9fdf-086dbb8ef431]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:03 np0005465988 NetworkManager[45041]: <info>  [1759406343.2331] device (tap48e5c857-20): carrier: link connected
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.240 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[63ad85c4-5866-41db-ae50-33217f89a271]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.268 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[44c6b790-849d-4c76-8caf-e86dbecb5581]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e5c857-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:19:83:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440356, 'reachable_time': 34510, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239960, 'error': None, 'target': 'ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.293 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e44e4a62-6c49-42b7-95c4-e1f52d848223]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe19:83c0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 440356, 'tstamp': 440356}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239961, 'error': None, 'target': 'ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.318 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5cd4b6-c28e-402e-88f6-d0127da512e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e5c857-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:19:83:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440356, 'reachable_time': 34510, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239962, 'error': None, 'target': 'ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.361 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[536d796a-4d30-4483-9725-22f2fddb7ebc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.436 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dca89b6c-2ecb-4b36-8a48-94ea92f7f7bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.438 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e5c857-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.438 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.439 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e5c857-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:59:03 np0005465988 kernel: tap48e5c857-20: entered promiscuous mode
Oct  2 07:59:03 np0005465988 NetworkManager[45041]: <info>  [1759406343.4424] manager: (tap48e5c857-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Oct  2 07:59:03 np0005465988 nova_compute[236126]: 2025-10-02 11:59:03.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.446 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e5c857-20, col_values=(('external_ids', {'iface-id': '108050f3-e876-480b-8cdd-c1255d33ae84'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:59:03 np0005465988 ovn_controller[132601]: 2025-10-02T11:59:03Z|00031|binding|INFO|Releasing lport 108050f3-e876-480b-8cdd-c1255d33ae84 from this chassis (sb_readonly=0)
Oct  2 07:59:03 np0005465988 nova_compute[236126]: 2025-10-02 11:59:03.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.451 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48e5c857-28d2-421a-9519-d32a13037daa.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48e5c857-28d2-421a-9519-d32a13037daa.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.451 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1919b335-e1bf-4565-a5f0-0419f4f2dd9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.452 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-48e5c857-28d2-421a-9519-d32a13037daa
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/48e5c857-28d2-421a-9519-d32a13037daa.pid.haproxy
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 48e5c857-28d2-421a-9519-d32a13037daa
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 07:59:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:03.453 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa', 'env', 'PROCESS_TAG=haproxy-48e5c857-28d2-421a-9519-d32a13037daa', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/48e5c857-28d2-421a-9519-d32a13037daa.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 07:59:03 np0005465988 nova_compute[236126]: 2025-10-02 11:59:03.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:59:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:03.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:59:03 np0005465988 podman[239994]: 2025-10-02 11:59:03.900045345 +0000 UTC m=+0.067533086 container create 15f9f731149d75c25783fb6db1f11a31118c7e0ad4f38bc564982c53d111e70d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 07:59:03 np0005465988 systemd[1]: Started libpod-conmon-15f9f731149d75c25783fb6db1f11a31118c7e0ad4f38bc564982c53d111e70d.scope.
Oct  2 07:59:03 np0005465988 podman[239994]: 2025-10-02 11:59:03.862681227 +0000 UTC m=+0.030168988 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 07:59:03 np0005465988 systemd[1]: Started libcrun container.
Oct  2 07:59:03 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc239bb9581b084256ef71853fe0334a8a230cdc037db0bba76b01784e7593a7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 07:59:04 np0005465988 podman[239994]: 2025-10-02 11:59:04.009910191 +0000 UTC m=+0.177397952 container init 15f9f731149d75c25783fb6db1f11a31118c7e0ad4f38bc564982c53d111e70d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:59:04 np0005465988 podman[239994]: 2025-10-02 11:59:04.020128849 +0000 UTC m=+0.187616600 container start 15f9f731149d75c25783fb6db1f11a31118c7e0ad4f38bc564982c53d111e70d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 07:59:04 np0005465988 neutron-haproxy-ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa[240011]: [NOTICE]   (240015) : New worker (240017) forked
Oct  2 07:59:04 np0005465988 neutron-haproxy-ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa[240011]: [NOTICE]   (240015) : Loading success.
Oct  2 07:59:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:05.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:05.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:06 np0005465988 nova_compute[236126]: 2025-10-02 11:59:06.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:06 np0005465988 nova_compute[236126]: 2025-10-02 11:59:06.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:59:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:07.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:59:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 07:59:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:59:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:07.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:59:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 07:59:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:09.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 07:59:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 07:59:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:09.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:10 np0005465988 nova_compute[236126]: 2025-10-02 11:59:10.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:10.264 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 07:59:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:10.266 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 07:59:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:11.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:11 np0005465988 nova_compute[236126]: 2025-10-02 11:59:11.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:11 np0005465988 nova_compute[236126]: 2025-10-02 11:59:11.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:59:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:11.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:59:11 np0005465988 nova_compute[236126]: 2025-10-02 11:59:11.963 2 DEBUG oslo_concurrency.lockutils [None req-088014da-075c-4e15-97f6-8b925ad079f1 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Acquiring lock "5902a7e7-5ff2-4bb5-a497-e584b578908e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:11 np0005465988 nova_compute[236126]: 2025-10-02 11:59:11.964 2 DEBUG oslo_concurrency.lockutils [None req-088014da-075c-4e15-97f6-8b925ad079f1 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lock "5902a7e7-5ff2-4bb5-a497-e584b578908e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:11 np0005465988 nova_compute[236126]: 2025-10-02 11:59:11.964 2 DEBUG oslo_concurrency.lockutils [None req-088014da-075c-4e15-97f6-8b925ad079f1 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Acquiring lock "5902a7e7-5ff2-4bb5-a497-e584b578908e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:11 np0005465988 nova_compute[236126]: 2025-10-02 11:59:11.965 2 DEBUG oslo_concurrency.lockutils [None req-088014da-075c-4e15-97f6-8b925ad079f1 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lock "5902a7e7-5ff2-4bb5-a497-e584b578908e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:11 np0005465988 nova_compute[236126]: 2025-10-02 11:59:11.965 2 DEBUG oslo_concurrency.lockutils [None req-088014da-075c-4e15-97f6-8b925ad079f1 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lock "5902a7e7-5ff2-4bb5-a497-e584b578908e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:11 np0005465988 nova_compute[236126]: 2025-10-02 11:59:11.967 2 INFO nova.compute.manager [None req-088014da-075c-4e15-97f6-8b925ad079f1 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Terminating instance#033[00m
Oct  2 07:59:11 np0005465988 nova_compute[236126]: 2025-10-02 11:59:11.969 2 DEBUG nova.compute.manager [None req-088014da-075c-4e15-97f6-8b925ad079f1 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 07:59:12 np0005465988 kernel: tap8f48e31c-b0 (unregistering): left promiscuous mode
Oct  2 07:59:12 np0005465988 NetworkManager[45041]: <info>  [1759406352.0287] device (tap8f48e31c-b0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 07:59:12 np0005465988 ovn_controller[132601]: 2025-10-02T11:59:12Z|00032|binding|INFO|Releasing lport 8f48e31c-b017-4024-9753-0ae3fd1e22c4 from this chassis (sb_readonly=0)
Oct  2 07:59:12 np0005465988 ovn_controller[132601]: 2025-10-02T11:59:12Z|00033|binding|INFO|Setting lport 8f48e31c-b017-4024-9753-0ae3fd1e22c4 down in Southbound
Oct  2 07:59:12 np0005465988 ovn_controller[132601]: 2025-10-02T11:59:12Z|00034|binding|INFO|Removing iface tap8f48e31c-b0 ovn-installed in OVS
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:12.097 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:1d:4d 10.1.0.68 fdfe:381f:8400:1::2ab'], port_security=['fa:16:3e:bb:1d:4d 10.1.0.68 fdfe:381f:8400:1::2ab'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.68/26 fdfe:381f:8400:1::2ab/64', 'neutron:device_id': '5902a7e7-5ff2-4bb5-a497-e584b578908e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e5c857-28d2-421a-9519-d32a13037daa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8972026d0f3a4bf4b6debd9555f9225c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c1955bc9-f08c-4e28-af03-54d4a3949aee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0c8e46c-a173-4ff5-bd2b-7026f16b2de8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=8f48e31c-b017-4024-9753-0ae3fd1e22c4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 07:59:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:12.098 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 8f48e31c-b017-4024-9753-0ae3fd1e22c4 in datapath 48e5c857-28d2-421a-9519-d32a13037daa unbound from our chassis#033[00m
Oct  2 07:59:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:12.100 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48e5c857-28d2-421a-9519-d32a13037daa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 07:59:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:12.101 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d476af83-3d5a-49ef-afab-c8c85be8628d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:59:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 11:59:12.101 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa namespace which is not needed anymore#033[00m
Oct  2 07:59:12 np0005465988 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Oct  2 07:59:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 07:59:12 np0005465988 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 13.405s CPU time.
Oct  2 07:59:12 np0005465988 systemd-machined[192594]: Machine qemu-1-instance-00000003 terminated.
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.209 2 INFO nova.virt.libvirt.driver [-] [instance: 5902a7e7-5ff2-4bb5-a497-e584b578908e] Instance destroyed successfully.#033[00m
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.209 2 DEBUG nova.objects.instance [None req-088014da-075c-4e15-97f6-8b925ad079f1 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Lazy-loading 'resources' on Instance uuid 5902a7e7-5ff2-4bb5-a497-e584b578908e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.239 2 DEBUG nova.virt.libvirt.vif [None req-088014da-075c-4e15-97f6-8b925ad079f1 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T11:58:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-524116669-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-524116669-2',id=3,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-10-02T11:58:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8972026d0f3a4bf4b6debd9555f9225c',ramdisk_id='',reservation_id='r-5dqt1vqy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AutoAllocateNetworkTest-379631237',owner_user_name='tempest-AutoAllocateNetworkTest-379631237-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T11:58:59Z,user_data=None,user_id='17903cd0333c407b96f0aede6dd3b16c',uuid=5902a7e7-5ff2-4bb5-a497-e584b578908e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "address": "fa:16:3e:bb:1d:4d", "network": {"id": "48e5c857-28d2-421a-9519-d32a13037daa", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::2ab", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8972026d0f3a4bf4b6debd9555f9225c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f48e31c-b0", "ovs_interfaceid": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.240 2 DEBUG nova.network.os_vif_util [None req-088014da-075c-4e15-97f6-8b925ad079f1 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Converting VIF {"id": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "address": "fa:16:3e:bb:1d:4d", "network": {"id": "48e5c857-28d2-421a-9519-d32a13037daa", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::2ab", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8972026d0f3a4bf4b6debd9555f9225c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f48e31c-b0", "ovs_interfaceid": "8f48e31c-b017-4024-9753-0ae3fd1e22c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.241 2 DEBUG nova.network.os_vif_util [None req-088014da-075c-4e15-97f6-8b925ad079f1 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:1d:4d,bridge_name='br-int',has_traffic_filtering=True,id=8f48e31c-b017-4024-9753-0ae3fd1e22c4,network=Network(48e5c857-28d2-421a-9519-d32a13037daa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f48e31c-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.242 2 DEBUG os_vif [None req-088014da-075c-4e15-97f6-8b925ad079f1 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:1d:4d,bridge_name='br-int',has_traffic_filtering=True,id=8f48e31c-b017-4024-9753-0ae3fd1e22c4,network=Network(48e5c857-28d2-421a-9519-d32a13037daa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f48e31c-b0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.245 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f48e31c-b0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:12 np0005465988 nova_compute[236126]: 2025-10-02 11:59:12.251 2 INFO os_vif [None req-088014da-075c-4e15-97f6-8b925ad079f1 17903cd0333c407b96f0aede6dd3b16c 8972026d0f3a4bf4b6debd9555f9225c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:1d:4d,bridge_name='br-int',has_traffic_filtering=True,id=8f48e31c-b017-4024-9753-0ae3fd1e22c4,network=Network(48e5c857-28d2-421a-9519-d32a13037daa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f48e31c-b0')#033[00m
Oct  2 07:59:12 np0005465988 neutron-haproxy-ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa[240011]: [NOTICE]   (240015) : haproxy version is 2.8.14-c23fe91
Oct  2 07:59:12 np0005465988 neutron-haproxy-ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa[240011]: [NOTICE]   (240015) : path to executable is /usr/sbin/haproxy
Oct  2 07:59:12 np0005465988 neutron-haproxy-ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa[240011]: [WARNING]  (240015) : Exiting Master process...
Oct  2 07:59:12 np0005465988 neutron-haproxy-ovnmeta-48e5c857-28d2-421a-9519-d32a13037daa[240011]: [ALERT]    (240015) : Current worker (240017) exited with code 143 (Terminated)
Oct  2 07:59:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 07:59:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:37.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:37 np0005465988 nova_compute[236126]: 2025-10-02 11:59:37.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:37 np0005465988 rsyslogd[1008]: imjournal: 556 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Oct  2 07:59:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:37.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:39.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:59:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:39.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:59:40 np0005465988 nova_compute[236126]: 2025-10-02 11:59:40.137 2 DEBUG nova.virt.libvirt.driver [None req-87ff865a-dd4c-4595-8a37-ef2cf43e8099 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] [instance: b4e4932c-8129-4ceb-95ef-3a612ef502f9] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 07:59:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:41.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:41 np0005465988 nova_compute[236126]: 2025-10-02 11:59:41.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:41.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 07:59:42 np0005465988 nova_compute[236126]: 2025-10-02 11:59:42.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:43.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:43 np0005465988 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Deactivated successfully.
Oct  2 07:59:43 np0005465988 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Consumed 13.843s CPU time.
Oct  2 07:59:43 np0005465988 systemd-machined[192594]: Machine qemu-2-instance-00000006 terminated.
Oct  2 07:59:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:43.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:44 np0005465988 nova_compute[236126]: 2025-10-02 11:59:44.157 2 INFO nova.virt.libvirt.driver [None req-87ff865a-dd4c-4595-8a37-ef2cf43e8099 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] [instance: b4e4932c-8129-4ceb-95ef-3a612ef502f9] Instance shutdown successfully after 14 seconds.#033[00m
Oct  2 07:59:44 np0005465988 nova_compute[236126]: 2025-10-02 11:59:44.164 2 INFO nova.virt.libvirt.driver [-] [instance: b4e4932c-8129-4ceb-95ef-3a612ef502f9] Instance destroyed successfully.#033[00m
Oct  2 07:59:44 np0005465988 nova_compute[236126]: 2025-10-02 11:59:44.168 2 DEBUG nova.virt.libvirt.driver [None req-87ff865a-dd4c-4595-8a37-ef2cf43e8099 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 07:59:44 np0005465988 nova_compute[236126]: 2025-10-02 11:59:44.168 2 DEBUG nova.virt.libvirt.driver [None req-87ff865a-dd4c-4595-8a37-ef2cf43e8099 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 07:59:44 np0005465988 nova_compute[236126]: 2025-10-02 11:59:44.265 2 DEBUG oslo_concurrency.lockutils [None req-87ff865a-dd4c-4595-8a37-ef2cf43e8099 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:59:44 np0005465988 nova_compute[236126]: 2025-10-02 11:59:44.265 2 DEBUG oslo_concurrency.lockutils [None req-87ff865a-dd4c-4595-8a37-ef2cf43e8099 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:59:44 np0005465988 nova_compute[236126]: 2025-10-02 11:59:44.279 2 INFO nova.compute.rpcapi [None req-87ff865a-dd4c-4595-8a37-ef2cf43e8099 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Oct  2 07:59:44 np0005465988 nova_compute[236126]: 2025-10-02 11:59:44.280 2 DEBUG oslo_concurrency.lockutils [None req-87ff865a-dd4c-4595-8a37-ef2cf43e8099 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:59:44 np0005465988 nova_compute[236126]: 2025-10-02 11:59:44.299 2 DEBUG oslo_concurrency.lockutils [None req-87ff865a-dd4c-4595-8a37-ef2cf43e8099 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Acquiring lock "b4e4932c-8129-4ceb-95ef-3a612ef502f9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:44 np0005465988 nova_compute[236126]: 2025-10-02 11:59:44.300 2 DEBUG oslo_concurrency.lockutils [None req-87ff865a-dd4c-4595-8a37-ef2cf43e8099 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Lock "b4e4932c-8129-4ceb-95ef-3a612ef502f9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:44 np0005465988 nova_compute[236126]: 2025-10-02 11:59:44.300 2 DEBUG oslo_concurrency.lockutils [None req-87ff865a-dd4c-4595-8a37-ef2cf43e8099 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Lock "b4e4932c-8129-4ceb-95ef-3a612ef502f9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:45.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:45 np0005465988 podman[240971]: 2025-10-02 11:59:45.539530313 +0000 UTC m=+0.064739355 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Oct  2 07:59:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:45.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e138 e138: 3 total, 3 up, 3 in
Oct  2 07:59:46 np0005465988 nova_compute[236126]: 2025-10-02 11:59:46.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 07:59:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:59:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:47.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:59:47 np0005465988 nova_compute[236126]: 2025-10-02 11:59:47.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:59:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:47.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:59:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:49.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:49.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:50 np0005465988 nova_compute[236126]: 2025-10-02 11:59:50.718 2 DEBUG oslo_concurrency.lockutils [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "b4e4932c-8129-4ceb-95ef-3a612ef502f9" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:50 np0005465988 nova_compute[236126]: 2025-10-02 11:59:50.719 2 DEBUG oslo_concurrency.lockutils [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b4e4932c-8129-4ceb-95ef-3a612ef502f9" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:50 np0005465988 nova_compute[236126]: 2025-10-02 11:59:50.720 2 DEBUG nova.compute.manager [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b4e4932c-8129-4ceb-95ef-3a612ef502f9] Going to confirm migration 1 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Oct  2 07:59:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 07:59:50 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/897661849' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 07:59:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 07:59:50 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/897661849' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 07:59:50 np0005465988 nova_compute[236126]: 2025-10-02 11:59:50.939 2 DEBUG oslo_concurrency.lockutils [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "refresh_cache-b4e4932c-8129-4ceb-95ef-3a612ef502f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:59:50 np0005465988 nova_compute[236126]: 2025-10-02 11:59:50.940 2 DEBUG oslo_concurrency.lockutils [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquired lock "refresh_cache-b4e4932c-8129-4ceb-95ef-3a612ef502f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:59:50 np0005465988 nova_compute[236126]: 2025-10-02 11:59:50.940 2 DEBUG nova.network.neutron [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b4e4932c-8129-4ceb-95ef-3a612ef502f9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 07:59:50 np0005465988 nova_compute[236126]: 2025-10-02 11:59:50.940 2 DEBUG nova.objects.instance [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lazy-loading 'info_cache' on Instance uuid b4e4932c-8129-4ceb-95ef-3a612ef502f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 07:59:51 np0005465988 nova_compute[236126]: 2025-10-02 11:59:51.112 2 DEBUG nova.network.neutron [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b4e4932c-8129-4ceb-95ef-3a612ef502f9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 07:59:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:59:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:51.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:59:51 np0005465988 nova_compute[236126]: 2025-10-02 11:59:51.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:51 np0005465988 nova_compute[236126]: 2025-10-02 11:59:51.471 2 DEBUG nova.network.neutron [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b4e4932c-8129-4ceb-95ef-3a612ef502f9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 07:59:51 np0005465988 nova_compute[236126]: 2025-10-02 11:59:51.486 2 DEBUG oslo_concurrency.lockutils [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Releasing lock "refresh_cache-b4e4932c-8129-4ceb-95ef-3a612ef502f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:59:51 np0005465988 nova_compute[236126]: 2025-10-02 11:59:51.486 2 DEBUG nova.objects.instance [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lazy-loading 'migration_context' on Instance uuid b4e4932c-8129-4ceb-95ef-3a612ef502f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 07:59:51 np0005465988 nova_compute[236126]: 2025-10-02 11:59:51.588 2 DEBUG nova.storage.rbd_utils [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] removing snapshot(nova-resize) on rbd image(b4e4932c-8129-4ceb-95ef-3a612ef502f9_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 07:59:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:59:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:51.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:59:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e139 e139: 3 total, 3 up, 3 in
Oct  2 07:59:52 np0005465988 nova_compute[236126]: 2025-10-02 11:59:52.013 2 DEBUG oslo_concurrency.lockutils [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:52 np0005465988 nova_compute[236126]: 2025-10-02 11:59:52.013 2 DEBUG oslo_concurrency.lockutils [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:52 np0005465988 nova_compute[236126]: 2025-10-02 11:59:52.096 2 DEBUG oslo_concurrency.processutils [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:59:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 07:59:52 np0005465988 nova_compute[236126]: 2025-10-02 11:59:52.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 07:59:52 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/766376539' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 07:59:52 np0005465988 nova_compute[236126]: 2025-10-02 11:59:52.530 2 DEBUG oslo_concurrency.processutils [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:59:52 np0005465988 nova_compute[236126]: 2025-10-02 11:59:52.537 2 DEBUG nova.compute.provider_tree [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:59:52 np0005465988 nova_compute[236126]: 2025-10-02 11:59:52.563 2 DEBUG nova.scheduler.client.report [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:59:52 np0005465988 nova_compute[236126]: 2025-10-02 11:59:52.705 2 DEBUG oslo_concurrency.lockutils [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:52 np0005465988 nova_compute[236126]: 2025-10-02 11:59:52.829 2 INFO nova.scheduler.client.report [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Deleted allocation for migration 72a7a986-629c-41a7-83d5-4be5e86579ab#033[00m
Oct  2 07:59:52 np0005465988 nova_compute[236126]: 2025-10-02 11:59:52.892 2 DEBUG oslo_concurrency.lockutils [None req-8f88111e-d622-4000-b02c-7b35388d0616 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b4e4932c-8129-4ceb-95ef-3a612ef502f9" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:59:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:53.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:59:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:53.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 07:59:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:55.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 07:59:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:55.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:56 np0005465988 nova_compute[236126]: 2025-10-02 11:59:56.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 07:59:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 07:59:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:57.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 07:59:57 np0005465988 nova_compute[236126]: 2025-10-02 11:59:57.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 07:59:57 np0005465988 ovn_controller[132601]: 2025-10-02T11:59:57Z|00035|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  2 07:59:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:57.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e140 e140: 3 total, 3 up, 3 in
Oct  2 07:59:58 np0005465988 nova_compute[236126]: 2025-10-02 11:59:58.701 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406383.6986737, b4e4932c-8129-4ceb-95ef-3a612ef502f9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 07:59:58 np0005465988 nova_compute[236126]: 2025-10-02 11:59:58.702 2 INFO nova.compute.manager [-] [instance: b4e4932c-8129-4ceb-95ef-3a612ef502f9] VM Stopped (Lifecycle Event)#033[00m
Oct  2 07:59:58 np0005465988 nova_compute[236126]: 2025-10-02 11:59:58.751 2 DEBUG nova.compute.manager [None req-e7350dd1-0df0-4b6b-aec5-e9ea827103e4 - - - - - -] [instance: b4e4932c-8129-4ceb-95ef-3a612ef502f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 07:59:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:11:59:59.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 07:59:59 np0005465988 podman[241107]: 2025-10-02 11:59:59.521107489 +0000 UTC m=+0.052275942 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:59:59 np0005465988 podman[241106]: 2025-10-02 11:59:59.563478422 +0000 UTC m=+0.095405347 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller)
Oct  2 07:59:59 np0005465988 podman[241108]: 2025-10-02 11:59:59.590116677 +0000 UTC m=+0.109651332 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:59:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 07:59:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 07:59:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:11:59:59.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:00 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 08:00:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:01.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:01 np0005465988 nova_compute[236126]: 2025-10-02 12:00:01.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:01.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:00:02 np0005465988 nova_compute[236126]: 2025-10-02 12:00:02.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:00:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:03.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:00:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:03.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:05.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:05.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:06 np0005465988 nova_compute[236126]: 2025-10-02 12:00:06.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:00:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:00:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:07.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:00:07 np0005465988 nova_compute[236126]: 2025-10-02 12:00:07.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:07.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:09.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:09.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:11.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:11 np0005465988 nova_compute[236126]: 2025-10-02 12:00:11.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:11.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:00:12 np0005465988 nova_compute[236126]: 2025-10-02 12:00:12.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:13.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:13.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:00:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:15.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:00:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:00:15.209 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:00:15 np0005465988 nova_compute[236126]: 2025-10-02 12:00:15.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:00:15.211 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:00:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:15.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:00:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:00:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:00:16 np0005465988 nova_compute[236126]: 2025-10-02 12:00:16.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:16 np0005465988 podman[241364]: 2025-10-02 12:00:16.531276098 +0000 UTC m=+0.072860131 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:00:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:00:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:17.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:17 np0005465988 nova_compute[236126]: 2025-10-02 12:00:17.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:00:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:17.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:00:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:19.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:19.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:00:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:21.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:00:21 np0005465988 nova_compute[236126]: 2025-10-02 12:00:21.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:00:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:21.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:00:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:00:22 np0005465988 nova_compute[236126]: 2025-10-02 12:00:22.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:23.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:23.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e141 e141: 3 total, 3 up, 3 in
Oct  2 08:00:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:25.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:00:25.213 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:25 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:00:25 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:00:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:00:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:25.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:00:26 np0005465988 nova_compute[236126]: 2025-10-02 12:00:26.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:00:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:27.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:00:27.327 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:00:27.327 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:00:27.327 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:27 np0005465988 nova_compute[236126]: 2025-10-02 12:00:27.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:27 np0005465988 nova_compute[236126]: 2025-10-02 12:00:27.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:00:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:27.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:00:28 np0005465988 nova_compute[236126]: 2025-10-02 12:00:28.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:28 np0005465988 nova_compute[236126]: 2025-10-02 12:00:28.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:28 np0005465988 nova_compute[236126]: 2025-10-02 12:00:28.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:28 np0005465988 nova_compute[236126]: 2025-10-02 12:00:28.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:28 np0005465988 nova_compute[236126]: 2025-10-02 12:00:28.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:00:28 np0005465988 nova_compute[236126]: 2025-10-02 12:00:28.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:28 np0005465988 nova_compute[236126]: 2025-10-02 12:00:28.517 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:28 np0005465988 nova_compute[236126]: 2025-10-02 12:00:28.518 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:28 np0005465988 nova_compute[236126]: 2025-10-02 12:00:28.518 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:28 np0005465988 nova_compute[236126]: 2025-10-02 12:00:28.518 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:00:28 np0005465988 nova_compute[236126]: 2025-10-02 12:00:28.518 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:00:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2811680153' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:00:28 np0005465988 nova_compute[236126]: 2025-10-02 12:00:28.963 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:29 np0005465988 nova_compute[236126]: 2025-10-02 12:00:29.178 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:00:29 np0005465988 nova_compute[236126]: 2025-10-02 12:00:29.179 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4969MB free_disk=20.851749420166016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:00:29 np0005465988 nova_compute[236126]: 2025-10-02 12:00:29.180 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:29 np0005465988 nova_compute[236126]: 2025-10-02 12:00:29.180 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:29.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:29 np0005465988 nova_compute[236126]: 2025-10-02 12:00:29.291 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:00:29 np0005465988 nova_compute[236126]: 2025-10-02 12:00:29.291 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:00:29 np0005465988 nova_compute[236126]: 2025-10-02 12:00:29.321 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:00:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/656861087' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:00:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:29.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:29 np0005465988 nova_compute[236126]: 2025-10-02 12:00:29.754 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:29 np0005465988 nova_compute[236126]: 2025-10-02 12:00:29.760 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:00:29 np0005465988 nova_compute[236126]: 2025-10-02 12:00:29.863 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:00:29 np0005465988 nova_compute[236126]: 2025-10-02 12:00:29.949 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:00:29 np0005465988 nova_compute[236126]: 2025-10-02 12:00:29.949 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:30 np0005465988 podman[241536]: 2025-10-02 12:00:30.531340732 +0000 UTC m=+0.062342285 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:00:30 np0005465988 podman[241537]: 2025-10-02 12:00:30.540597581 +0000 UTC m=+0.068129103 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Oct  2 08:00:30 np0005465988 podman[241535]: 2025-10-02 12:00:30.570513691 +0000 UTC m=+0.108357263 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 08:00:30 np0005465988 nova_compute[236126]: 2025-10-02 12:00:30.949 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:30 np0005465988 nova_compute[236126]: 2025-10-02 12:00:30.950 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:31.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:31 np0005465988 nova_compute[236126]: 2025-10-02 12:00:31.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:31.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:00:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e142 e142: 3 total, 3 up, 3 in
Oct  2 08:00:32 np0005465988 nova_compute[236126]: 2025-10-02 12:00:32.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:33.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:33.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:34 np0005465988 nova_compute[236126]: 2025-10-02 12:00:34.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:34 np0005465988 nova_compute[236126]: 2025-10-02 12:00:34.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:00:34 np0005465988 nova_compute[236126]: 2025-10-02 12:00:34.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:00:34 np0005465988 nova_compute[236126]: 2025-10-02 12:00:34.501 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:00:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:00:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:35.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:00:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:35.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:36 np0005465988 nova_compute[236126]: 2025-10-02 12:00:36.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:00:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:37.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:37 np0005465988 nova_compute[236126]: 2025-10-02 12:00:37.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:37.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e143 e143: 3 total, 3 up, 3 in
Oct  2 08:00:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:39.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:39.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:00:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:41.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:00:41 np0005465988 nova_compute[236126]: 2025-10-02 12:00:41.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:41.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:00:42 np0005465988 nova_compute[236126]: 2025-10-02 12:00:42.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:00:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:43.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:00:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:43.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:45.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:00:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:45.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:00:46 np0005465988 nova_compute[236126]: 2025-10-02 12:00:46.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:46 np0005465988 nova_compute[236126]: 2025-10-02 12:00:46.974 2 DEBUG oslo_concurrency.lockutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "b8d4207f-7e3b-4a3c-ad76-60d87d695918" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:46 np0005465988 nova_compute[236126]: 2025-10-02 12:00:46.975 2 DEBUG oslo_concurrency.lockutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b8d4207f-7e3b-4a3c-ad76-60d87d695918" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:46 np0005465988 nova_compute[236126]: 2025-10-02 12:00:46.997 2 DEBUG nova.compute.manager [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:00:47 np0005465988 podman[241607]: 2025-10-02 12:00:47.103177526 +0000 UTC m=+0.086851858 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:00:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.150 2 DEBUG oslo_concurrency.lockutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.150 2 DEBUG oslo_concurrency.lockutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.157 2 DEBUG nova.virt.hardware [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.157 2 INFO nova.compute.claims [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:00:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:47.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.296 2 DEBUG oslo_concurrency.processutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:00:47Z|00036|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  2 08:00:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:00:47 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2598809351' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.740 2 DEBUG oslo_concurrency.processutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.749 2 DEBUG nova.compute.provider_tree [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:00:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:47.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.770 2 DEBUG nova.scheduler.client.report [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.799 2 DEBUG oslo_concurrency.lockutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.801 2 DEBUG nova.compute.manager [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.895 2 DEBUG nova.compute.manager [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.896 2 DEBUG nova.network.neutron [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.938 2 INFO nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:00:47 np0005465988 nova_compute[236126]: 2025-10-02 12:00:47.983 2 DEBUG nova.compute.manager [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.137 2 DEBUG nova.compute.manager [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.139 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.139 2 INFO nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Creating image(s)#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.173 2 DEBUG nova.storage.rbd_utils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rbd image b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.206 2 DEBUG nova.storage.rbd_utils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rbd image b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.243 2 DEBUG nova.storage.rbd_utils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rbd image b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.247 2 DEBUG oslo_concurrency.processutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.321 2 DEBUG oslo_concurrency.processutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.322 2 DEBUG oslo_concurrency.lockutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.323 2 DEBUG oslo_concurrency.lockutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.323 2 DEBUG oslo_concurrency.lockutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.345 2 DEBUG nova.storage.rbd_utils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rbd image b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.349 2 DEBUG oslo_concurrency.processutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.711 2 DEBUG nova.network.neutron [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Oct  2 08:00:48 np0005465988 nova_compute[236126]: 2025-10-02 12:00:48.712 2 DEBUG nova.compute.manager [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:00:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:49.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:49.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.050 2 DEBUG oslo_concurrency.processutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.702s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.130 2 DEBUG nova.storage.rbd_utils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] resizing rbd image b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.572 2 DEBUG nova.objects.instance [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lazy-loading 'migration_context' on Instance uuid b8d4207f-7e3b-4a3c-ad76-60d87d695918 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.718 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.719 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Ensure instance console log exists: /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.719 2 DEBUG oslo_concurrency.lockutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.720 2 DEBUG oslo_concurrency.lockutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.720 2 DEBUG oslo_concurrency.lockutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.723 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.729 2 WARNING nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.735 2 DEBUG nova.virt.libvirt.host [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.735 2 DEBUG nova.virt.libvirt.host [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.739 2 DEBUG nova.virt.libvirt.host [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.740 2 DEBUG nova.virt.libvirt.host [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.742 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.742 2 DEBUG nova.virt.hardware [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T12:00:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='f03fea4d-cf0e-438f-9bbd-227b9e45ee4f',id=25,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-752494272',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.743 2 DEBUG nova.virt.hardware [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.744 2 DEBUG nova.virt.hardware [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.744 2 DEBUG nova.virt.hardware [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.744 2 DEBUG nova.virt.hardware [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.745 2 DEBUG nova.virt.hardware [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.745 2 DEBUG nova.virt.hardware [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.746 2 DEBUG nova.virt.hardware [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.746 2 DEBUG nova.virt.hardware [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.746 2 DEBUG nova.virt.hardware [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.747 2 DEBUG nova.virt.hardware [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:00:50 np0005465988 nova_compute[236126]: 2025-10-02 12:00:50.751 2 DEBUG oslo_concurrency.processutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:00:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:51.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:00:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:00:51 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1971463427' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:00:51 np0005465988 nova_compute[236126]: 2025-10-02 12:00:51.264 2 DEBUG oslo_concurrency.processutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:51 np0005465988 nova_compute[236126]: 2025-10-02 12:00:51.287 2 DEBUG nova.storage.rbd_utils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rbd image b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:00:51 np0005465988 nova_compute[236126]: 2025-10-02 12:00:51.290 2 DEBUG oslo_concurrency.processutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:51 np0005465988 nova_compute[236126]: 2025-10-02 12:00:51.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:00:51 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/127885509' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:00:51 np0005465988 nova_compute[236126]: 2025-10-02 12:00:51.717 2 DEBUG oslo_concurrency.processutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:51 np0005465988 nova_compute[236126]: 2025-10-02 12:00:51.720 2 DEBUG nova.objects.instance [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lazy-loading 'pci_devices' on Instance uuid b8d4207f-7e3b-4a3c-ad76-60d87d695918 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:51 np0005465988 nova_compute[236126]: 2025-10-02 12:00:51.743 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  <uuid>b8d4207f-7e3b-4a3c-ad76-60d87d695918</uuid>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  <name>instance-0000000a</name>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <nova:name>tempest-MigrationsAdminTest-server-1449740565</nova:name>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:00:50</nova:creationTime>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <nova:flavor name="tempest-test_resize_flavor_-752494272">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <nova:user uuid="1a06819bf8cc4ff7bccbbb2616ff2d21">tempest-MigrationsAdminTest-819597356-project-member</nova:user>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <nova:project uuid="f1ce36070fb047479c3a083f36733f63">tempest-MigrationsAdminTest-819597356</nova:project>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <nova:ports/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <entry name="serial">b8d4207f-7e3b-4a3c-ad76-60d87d695918</entry>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <entry name="uuid">b8d4207f-7e3b-4a3c-ad76-60d87d695918</entry>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk.config">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918/console.log" append="off"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:00:51 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:00:51 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:00:51 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:00:51 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:00:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:51.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:51 np0005465988 nova_compute[236126]: 2025-10-02 12:00:51.829 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:00:51 np0005465988 nova_compute[236126]: 2025-10-02 12:00:51.829 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:00:51 np0005465988 nova_compute[236126]: 2025-10-02 12:00:51.830 2 INFO nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Using config drive#033[00m
Oct  2 08:00:51 np0005465988 nova_compute[236126]: 2025-10-02 12:00:51.859 2 DEBUG nova.storage.rbd_utils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rbd image b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:00:52 np0005465988 nova_compute[236126]: 2025-10-02 12:00:52.094 2 INFO nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Creating config drive at /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918/disk.config#033[00m
Oct  2 08:00:52 np0005465988 nova_compute[236126]: 2025-10-02 12:00:52.099 2 DEBUG oslo_concurrency.processutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpovwglxtr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:00:52 np0005465988 nova_compute[236126]: 2025-10-02 12:00:52.222 2 DEBUG oslo_concurrency.processutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpovwglxtr" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:52 np0005465988 nova_compute[236126]: 2025-10-02 12:00:52.256 2 DEBUG nova.storage.rbd_utils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rbd image b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:00:52 np0005465988 nova_compute[236126]: 2025-10-02 12:00:52.263 2 DEBUG oslo_concurrency.processutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918/disk.config b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:52 np0005465988 nova_compute[236126]: 2025-10-02 12:00:52.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:52 np0005465988 nova_compute[236126]: 2025-10-02 12:00:52.530 2 DEBUG oslo_concurrency.processutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918/disk.config b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.267s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:52 np0005465988 nova_compute[236126]: 2025-10-02 12:00:52.531 2 INFO nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Deleting local config drive /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918/disk.config because it was imported into RBD.#033[00m
Oct  2 08:00:52 np0005465988 systemd-machined[192594]: New machine qemu-3-instance-0000000a.
Oct  2 08:00:52 np0005465988 systemd[1]: Started Virtual Machine qemu-3-instance-0000000a.
Oct  2 08:00:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:00:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2867616679' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:00:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:53.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.722 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406453.721962, b8d4207f-7e3b-4a3c-ad76-60d87d695918 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.723 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.727 2 DEBUG nova.compute.manager [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.728 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.733 2 INFO nova.virt.libvirt.driver [-] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Instance spawned successfully.#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.733 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:00:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:00:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:53.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.770 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.777 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.781 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.781 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.782 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.782 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.783 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.783 2 DEBUG nova.virt.libvirt.driver [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.810 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.811 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406453.72541, b8d4207f-7e3b-4a3c-ad76-60d87d695918 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.811 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] VM Started (Lifecycle Event)#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.866 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.871 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.900 2 INFO nova.compute.manager [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Took 5.76 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.901 2 DEBUG nova.compute.manager [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:53 np0005465988 nova_compute[236126]: 2025-10-02 12:00:53.933 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:00:54 np0005465988 nova_compute[236126]: 2025-10-02 12:00:54.083 2 INFO nova.compute.manager [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Took 6.96 seconds to build instance.#033[00m
Oct  2 08:00:54 np0005465988 nova_compute[236126]: 2025-10-02 12:00:54.158 2 DEBUG oslo_concurrency.lockutils [None req-6c435372-9ec4-435c-8748-e1530d2fd737 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b8d4207f-7e3b-4a3c-ad76-60d87d695918" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:00:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/141035279' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:00:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:00:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/141035279' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:00:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:55.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:55.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:56 np0005465988 nova_compute[236126]: 2025-10-02 12:00:56.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:00:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:00:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:57.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:00:57 np0005465988 nova_compute[236126]: 2025-10-02 12:00:57.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:00:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:57.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:00:58 np0005465988 nova_compute[236126]: 2025-10-02 12:00:58.767 2 DEBUG oslo_concurrency.lockutils [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "refresh_cache-b8d4207f-7e3b-4a3c-ad76-60d87d695918" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:00:58 np0005465988 nova_compute[236126]: 2025-10-02 12:00:58.768 2 DEBUG oslo_concurrency.lockutils [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquired lock "refresh_cache-b8d4207f-7e3b-4a3c-ad76-60d87d695918" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:00:58 np0005465988 nova_compute[236126]: 2025-10-02 12:00:58.769 2 DEBUG nova.network.neutron [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:00:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:00:59.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:59 np0005465988 nova_compute[236126]: 2025-10-02 12:00:59.258 2 DEBUG nova.network.neutron [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:00:59 np0005465988 nova_compute[236126]: 2025-10-02 12:00:59.613 2 DEBUG nova.network.neutron [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:00:59 np0005465988 nova_compute[236126]: 2025-10-02 12:00:59.729 2 DEBUG oslo_concurrency.lockutils [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Releasing lock "refresh_cache-b8d4207f-7e3b-4a3c-ad76-60d87d695918" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:00:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:00:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:00:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:00:59.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:00:59 np0005465988 nova_compute[236126]: 2025-10-02 12:00:59.916 2 DEBUG nova.virt.libvirt.driver [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  2 08:00:59 np0005465988 nova_compute[236126]: 2025-10-02 12:00:59.916 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Creating file /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918/5df0f16ff96e42a89ec74e543eddcba5.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Oct  2 08:00:59 np0005465988 nova_compute[236126]: 2025-10-02 12:00:59.917 2 DEBUG oslo_concurrency.processutils [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918/5df0f16ff96e42a89ec74e543eddcba5.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:00 np0005465988 nova_compute[236126]: 2025-10-02 12:01:00.361 2 DEBUG oslo_concurrency.processutils [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918/5df0f16ff96e42a89ec74e543eddcba5.tmp" returned: 1 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:00 np0005465988 nova_compute[236126]: 2025-10-02 12:01:00.362 2 DEBUG oslo_concurrency.processutils [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918/5df0f16ff96e42a89ec74e543eddcba5.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Oct  2 08:01:00 np0005465988 nova_compute[236126]: 2025-10-02 12:01:00.362 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Creating directory /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Oct  2 08:01:00 np0005465988 nova_compute[236126]: 2025-10-02 12:01:00.363 2 DEBUG oslo_concurrency.processutils [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:00 np0005465988 nova_compute[236126]: 2025-10-02 12:01:00.587 2 DEBUG oslo_concurrency.processutils [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918" returned: 0 in 0.224s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:00 np0005465988 nova_compute[236126]: 2025-10-02 12:01:00.592 2 DEBUG nova.virt.libvirt.driver [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:01:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:01.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:01 np0005465988 nova_compute[236126]: 2025-10-02 12:01:01.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:01 np0005465988 podman[242065]: 2025-10-02 12:01:01.543552311 +0000 UTC m=+0.070239983 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 08:01:01 np0005465988 podman[242063]: 2025-10-02 12:01:01.562273846 +0000 UTC m=+0.092262824 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:01:01 np0005465988 podman[242064]: 2025-10-02 12:01:01.573241155 +0000 UTC m=+0.103501491 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:01:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:01.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:01:02 np0005465988 nova_compute[236126]: 2025-10-02 12:01:02.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:03.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:03.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:05.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:05.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:06 np0005465988 nova_compute[236126]: 2025-10-02 12:01:06.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:01:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:07.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:07 np0005465988 nova_compute[236126]: 2025-10-02 12:01:07.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:07.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:09.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:09.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:10 np0005465988 nova_compute[236126]: 2025-10-02 12:01:10.644 2 DEBUG nova.virt.libvirt.driver [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:01:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:11.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:11 np0005465988 nova_compute[236126]: 2025-10-02 12:01:11.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:11.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:01:12 np0005465988 nova_compute[236126]: 2025-10-02 12:01:12.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:13.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:13 np0005465988 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Oct  2 08:01:13 np0005465988 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000a.scope: Consumed 13.148s CPU time.
Oct  2 08:01:13 np0005465988 systemd-machined[192594]: Machine qemu-3-instance-0000000a terminated.
Oct  2 08:01:13 np0005465988 nova_compute[236126]: 2025-10-02 12:01:13.671 2 INFO nova.virt.libvirt.driver [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:01:13 np0005465988 nova_compute[236126]: 2025-10-02 12:01:13.679 2 INFO nova.virt.libvirt.driver [-] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Instance destroyed successfully.#033[00m
Oct  2 08:01:13 np0005465988 nova_compute[236126]: 2025-10-02 12:01:13.683 2 DEBUG nova.virt.libvirt.driver [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:01:13 np0005465988 nova_compute[236126]: 2025-10-02 12:01:13.684 2 DEBUG nova.virt.libvirt.driver [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:01:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:13.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:15 np0005465988 nova_compute[236126]: 2025-10-02 12:01:15.055 2 DEBUG oslo_concurrency.lockutils [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "b8d4207f-7e3b-4a3c-ad76-60d87d695918-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:15 np0005465988 nova_compute[236126]: 2025-10-02 12:01:15.056 2 DEBUG oslo_concurrency.lockutils [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b8d4207f-7e3b-4a3c-ad76-60d87d695918-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:15 np0005465988 nova_compute[236126]: 2025-10-02 12:01:15.057 2 DEBUG oslo_concurrency.lockutils [None req-8cc971a5-0d8f-469a-b725-6f34bcb5f275 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b8d4207f-7e3b-4a3c-ad76-60d87d695918-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:15.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:15.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:16 np0005465988 nova_compute[236126]: 2025-10-02 12:01:16.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:01:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:17.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:17 np0005465988 nova_compute[236126]: 2025-10-02 12:01:17.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:17 np0005465988 podman[242192]: 2025-10-02 12:01:17.524545107 +0000 UTC m=+0.065328132 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:01:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:17.644 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:01:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:17.645 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:01:17 np0005465988 nova_compute[236126]: 2025-10-02 12:01:17.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:17.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:19.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:19.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e144 e144: 3 total, 3 up, 3 in
Oct  2 08:01:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:21.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:21 np0005465988 nova_compute[236126]: 2025-10-02 12:01:21.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:01:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:21.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:01:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:01:22 np0005465988 nova_compute[236126]: 2025-10-02 12:01:22.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:23.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:23.647 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:23.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:25.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:01:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:25.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:01:25 np0005465988 podman[242389]: 2025-10-02 12:01:25.835025164 +0000 UTC m=+0.111032382 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Oct  2 08:01:25 np0005465988 podman[242389]: 2025-10-02 12:01:25.947779985 +0000 UTC m=+0.223787183 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 08:01:26 np0005465988 nova_compute[236126]: 2025-10-02 12:01:26.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:26 np0005465988 podman[242523]: 2025-10-02 12:01:26.560620035 +0000 UTC m=+0.067531505 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 08:01:26 np0005465988 podman[242523]: 2025-10-02 12:01:26.570655267 +0000 UTC m=+0.077566767 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 08:01:26 np0005465988 nova_compute[236126]: 2025-10-02 12:01:26.660 2 INFO nova.compute.manager [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Swapping old allocation on dict_keys(['5abd2871-a992-42ab-bb6a-594a92f77d4d']) held by migration 9100aa3b-5271-4018-89db-db3228fa0fa2 for instance#033[00m
Oct  2 08:01:26 np0005465988 nova_compute[236126]: 2025-10-02 12:01:26.702 2 DEBUG nova.scheduler.client.report [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Overwriting current allocation {'allocations': {'a293a24c-b5ed-43d1-8783-f02da4f75ad4': {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}, 'generation': 18}}, 'project_id': 'f1ce36070fb047479c3a083f36733f63', 'user_id': '1a06819bf8cc4ff7bccbbb2616ff2d21', 'consumer_generation': 1} on consumer b8d4207f-7e3b-4a3c-ad76-60d87d695918 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Oct  2 08:01:26 np0005465988 podman[242589]: 2025-10-02 12:01:26.803355248 +0000 UTC m=+0.065331602 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, name=keepalived, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, release=1793, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, distribution-scope=public, version=2.2.4)
Oct  2 08:01:26 np0005465988 podman[242589]: 2025-10-02 12:01:26.823870265 +0000 UTC m=+0.085846609 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, vendor=Red Hat, Inc., description=keepalived for Ceph, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, name=keepalived, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container)
Oct  2 08:01:26 np0005465988 nova_compute[236126]: 2025-10-02 12:01:26.884 2 DEBUG oslo_concurrency.lockutils [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "refresh_cache-b8d4207f-7e3b-4a3c-ad76-60d87d695918" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:01:26 np0005465988 nova_compute[236126]: 2025-10-02 12:01:26.884 2 DEBUG oslo_concurrency.lockutils [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquired lock "refresh_cache-b8d4207f-7e3b-4a3c-ad76-60d87d695918" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:01:26 np0005465988 nova_compute[236126]: 2025-10-02 12:01:26.884 2 DEBUG nova.network.neutron [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:01:27 np0005465988 nova_compute[236126]: 2025-10-02 12:01:27.032 2 DEBUG nova.network.neutron [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:01:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:01:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:01:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:01:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:27.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:27.328 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:27.329 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:27.329 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:27 np0005465988 nova_compute[236126]: 2025-10-02 12:01:27.338 2 DEBUG nova.network.neutron [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:01:27 np0005465988 nova_compute[236126]: 2025-10-02 12:01:27.386 2 DEBUG oslo_concurrency.lockutils [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Releasing lock "refresh_cache-b8d4207f-7e3b-4a3c-ad76-60d87d695918" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:01:27 np0005465988 nova_compute[236126]: 2025-10-02 12:01:27.387 2 DEBUG nova.virt.libvirt.driver [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Oct  2 08:01:27 np0005465988 nova_compute[236126]: 2025-10-02 12:01:27.495 2 DEBUG nova.storage.rbd_utils [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rolling back rbd image(b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505#033[00m
Oct  2 08:01:27 np0005465988 nova_compute[236126]: 2025-10-02 12:01:27.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:27.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:27 np0005465988 nova_compute[236126]: 2025-10-02 12:01:27.869 2 DEBUG nova.storage.rbd_utils [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] removing snapshot(nova-resize) on rbd image(b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:01:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:01:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:01:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.518 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.519 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.520 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.520 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.521 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.671 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406473.6697123, b8d4207f-7e3b-4a3c-ad76-60d87d695918 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.672 2 INFO nova.compute.manager [-] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.695 2 DEBUG nova.compute.manager [None req-fb49c837-b660-4247-b43c-c799762b63eb - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.699 2 DEBUG nova.compute.manager [None req-fb49c837-b660-4247-b43c-c799762b63eb - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.725 2 INFO nova.compute.manager [None req-fb49c837-b660-4247-b43c-c799762b63eb - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Oct  2 08:01:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:01:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4074648720' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:01:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e145 e145: 3 total, 3 up, 3 in
Oct  2 08:01:28 np0005465988 nova_compute[236126]: 2025-10-02 12:01:28.979 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.076 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.076 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.215 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.216 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4931MB free_disk=20.785140991210938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.216 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.216 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.254 2 DEBUG nova.virt.libvirt.driver [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.257 2 WARNING nova.virt.libvirt.driver [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.260 2 DEBUG nova.virt.libvirt.host [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.261 2 DEBUG nova.virt.libvirt.host [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.265 2 DEBUG nova.virt.libvirt.host [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.265 2 DEBUG nova.virt.libvirt.host [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.266 2 DEBUG nova.virt.libvirt.driver [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.266 2 DEBUG nova.virt.hardware [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T12:00:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='f03fea4d-cf0e-438f-9bbd-227b9e45ee4f',id=25,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-752494272',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.266 2 DEBUG nova.virt.hardware [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.267 2 DEBUG nova.virt.hardware [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.267 2 DEBUG nova.virt.hardware [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.267 2 DEBUG nova.virt.hardware [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.267 2 DEBUG nova.virt.hardware [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.267 2 DEBUG nova.virt.hardware [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.268 2 DEBUG nova.virt.hardware [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.268 2 DEBUG nova.virt.hardware [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.268 2 DEBUG nova.virt.hardware [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.269 2 DEBUG nova.virt.hardware [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.270 2 DEBUG nova.objects.instance [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b8d4207f-7e3b-4a3c-ad76-60d87d695918 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:01:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:29.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.396 2 DEBUG oslo_concurrency.processutils [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.445 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance b8d4207f-7e3b-4a3c-ad76-60d87d695918 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.446 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.447 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.494 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:29.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:01:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4050351588' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.836 2 DEBUG oslo_concurrency.processutils [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.872 2 DEBUG oslo_concurrency.processutils [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:01:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/402158354' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.973 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:29 np0005465988 nova_compute[236126]: 2025-10-02 12:01:29.979 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:01:30 np0005465988 nova_compute[236126]: 2025-10-02 12:01:30.006 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:01:30 np0005465988 nova_compute[236126]: 2025-10-02 12:01:30.057 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:01:30 np0005465988 nova_compute[236126]: 2025-10-02 12:01:30.057 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.841s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:01:30 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2928742114' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:01:30 np0005465988 nova_compute[236126]: 2025-10-02 12:01:30.343 2 DEBUG oslo_concurrency.processutils [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:30 np0005465988 nova_compute[236126]: 2025-10-02 12:01:30.349 2 DEBUG nova.virt.libvirt.driver [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  <uuid>b8d4207f-7e3b-4a3c-ad76-60d87d695918</uuid>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  <name>instance-0000000a</name>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <nova:name>tempest-MigrationsAdminTest-server-1449740565</nova:name>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:01:29</nova:creationTime>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <nova:flavor name="tempest-test_resize_flavor_-752494272">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <nova:user uuid="1a06819bf8cc4ff7bccbbb2616ff2d21">tempest-MigrationsAdminTest-819597356-project-member</nova:user>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <nova:project uuid="f1ce36070fb047479c3a083f36733f63">tempest-MigrationsAdminTest-819597356</nova:project>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <nova:ports/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <entry name="serial">b8d4207f-7e3b-4a3c-ad76-60d87d695918</entry>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <entry name="uuid">b8d4207f-7e3b-4a3c-ad76-60d87d695918</entry>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/b8d4207f-7e3b-4a3c-ad76-60d87d695918_disk.config">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918/console.log" append="off"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <input type="keyboard" bus="usb"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:01:30 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:01:30 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:01:30 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:01:30 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:01:30 np0005465988 systemd-machined[192594]: New machine qemu-4-instance-0000000a.
Oct  2 08:01:30 np0005465988 systemd[1]: Started Virtual Machine qemu-4-instance-0000000a.
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.057 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.057 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.057 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.057 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:31.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.804 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406491.804494, b8d4207f-7e3b-4a3c-ad76-60d87d695918 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.805 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.808 2 DEBUG nova.compute.manager [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.811 2 INFO nova.virt.libvirt.driver [-] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Instance running successfully.#033[00m
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.811 2 DEBUG nova.virt.libvirt.driver [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Oct  2 08:01:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:31.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.870 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.874 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.989 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.990 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406491.808415, b8d4207f-7e3b-4a3c-ad76-60d87d695918 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:31 np0005465988 nova_compute[236126]: 2025-10-02 12:01:31.990 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] VM Started (Lifecycle Event)#033[00m
Oct  2 08:01:32 np0005465988 nova_compute[236126]: 2025-10-02 12:01:32.081 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:32 np0005465988 nova_compute[236126]: 2025-10-02 12:01:32.085 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:01:32 np0005465988 nova_compute[236126]: 2025-10-02 12:01:32.114 2 INFO nova.compute.manager [None req-ba9a7a52-bb05-4fce-b0c3-4d25cc6f9b8d 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Updating instance to original state: 'active'#033[00m
Oct  2 08:01:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:01:32 np0005465988 nova_compute[236126]: 2025-10-02 12:01:32.145 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Oct  2 08:01:32 np0005465988 nova_compute[236126]: 2025-10-02 12:01:32.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:32 np0005465988 podman[243045]: 2025-10-02 12:01:32.53086502 +0000 UTC m=+0.058509204 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:01:32 np0005465988 podman[243046]: 2025-10-02 12:01:32.538405459 +0000 UTC m=+0.064640042 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:01:32 np0005465988 podman[243044]: 2025-10-02 12:01:32.563660694 +0000 UTC m=+0.090902456 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:01:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:33.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:33 np0005465988 nova_compute[236126]: 2025-10-02 12:01:33.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:33.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:34 np0005465988 nova_compute[236126]: 2025-10-02 12:01:34.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:34 np0005465988 nova_compute[236126]: 2025-10-02 12:01:34.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:01:34 np0005465988 nova_compute[236126]: 2025-10-02 12:01:34.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:01:34 np0005465988 nova_compute[236126]: 2025-10-02 12:01:34.842 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-b8d4207f-7e3b-4a3c-ad76-60d87d695918" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:01:34 np0005465988 nova_compute[236126]: 2025-10-02 12:01:34.843 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-b8d4207f-7e3b-4a3c-ad76-60d87d695918" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:01:34 np0005465988 nova_compute[236126]: 2025-10-02 12:01:34.844 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:01:34 np0005465988 nova_compute[236126]: 2025-10-02 12:01:34.844 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b8d4207f-7e3b-4a3c-ad76-60d87d695918 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:01:35 np0005465988 nova_compute[236126]: 2025-10-02 12:01:35.058 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:01:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:35.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:35 np0005465988 nova_compute[236126]: 2025-10-02 12:01:35.448 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:01:35 np0005465988 nova_compute[236126]: 2025-10-02 12:01:35.471 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-b8d4207f-7e3b-4a3c-ad76-60d87d695918" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:01:35 np0005465988 nova_compute[236126]: 2025-10-02 12:01:35.472 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:01:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:35.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:01:35 np0005465988 nova_compute[236126]: 2025-10-02 12:01:35.969 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Acquiring lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:35 np0005465988 nova_compute[236126]: 2025-10-02 12:01:35.970 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:35 np0005465988 nova_compute[236126]: 2025-10-02 12:01:35.990 2 DEBUG oslo_concurrency.lockutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "b6a5f3ca-c662-41a0-ac02-78f9fba82bba" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:35 np0005465988 nova_compute[236126]: 2025-10-02 12:01:35.991 2 DEBUG oslo_concurrency.lockutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b6a5f3ca-c662-41a0-ac02-78f9fba82bba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:35 np0005465988 nova_compute[236126]: 2025-10-02 12:01:35.992 2 DEBUG nova.compute.manager [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.024 2 DEBUG nova.compute.manager [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.071 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.072 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.083 2 DEBUG nova.virt.hardware [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.083 2 INFO nova.compute.claims [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.092 2 DEBUG oslo_concurrency.lockutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.239 2 DEBUG oslo_concurrency.processutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:01:36 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/715644299' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.685 2 DEBUG oslo_concurrency.processutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.693 2 DEBUG nova.compute.provider_tree [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.708 2 DEBUG nova.scheduler.client.report [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.730 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.731 2 DEBUG nova.compute.manager [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.735 2 DEBUG oslo_concurrency.lockutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.744 2 DEBUG nova.virt.hardware [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:01:36 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.756 2 INFO nova.compute.claims [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.798 2 DEBUG nova.compute.manager [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.799 2 DEBUG nova.network.neutron [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.827 2 INFO nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.851 2 DEBUG nova.compute.manager [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.933 2 DEBUG oslo_concurrency.processutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.994 2 DEBUG nova.compute.manager [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.996 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:01:36 np0005465988 nova_compute[236126]: 2025-10-02 12:01:36.996 2 INFO nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Creating image(s)#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.028 2 DEBUG nova.storage.rbd_utils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] rbd image e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.049 2 DEBUG nova.storage.rbd_utils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] rbd image e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.074 2 DEBUG nova.storage.rbd_utils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] rbd image e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.077 2 DEBUG oslo_concurrency.processutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.133 2 DEBUG oslo_concurrency.processutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.133 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.134 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.134 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.156 2 DEBUG nova.storage.rbd_utils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] rbd image e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.160 2 DEBUG oslo_concurrency.processutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:37.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:01:37 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/506775395' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.397 2 DEBUG oslo_concurrency.processutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.403 2 DEBUG nova.compute.provider_tree [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.421 2 DEBUG nova.scheduler.client.report [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.441 2 DEBUG oslo_concurrency.lockutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.442 2 DEBUG nova.compute.manager [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.489 2 DEBUG nova.compute.manager [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.489 2 DEBUG nova.network.neutron [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.510 2 INFO nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.529 2 DEBUG nova.compute.manager [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.601 2 DEBUG nova.compute.manager [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.602 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.602 2 INFO nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Creating image(s)#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.638 2 DEBUG nova.storage.rbd_utils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rbd image b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.669 2 DEBUG nova.storage.rbd_utils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rbd image b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.698 2 DEBUG nova.storage.rbd_utils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rbd image b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.703 2 DEBUG oslo_concurrency.processutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.778 2 DEBUG oslo_concurrency.processutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.780 2 DEBUG oslo_concurrency.lockutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.781 2 DEBUG oslo_concurrency.lockutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.781 2 DEBUG oslo_concurrency.lockutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:37.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.825 2 DEBUG nova.storage.rbd_utils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rbd image b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.831 2 DEBUG oslo_concurrency.processutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.899 2 DEBUG oslo_concurrency.processutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.739s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:37 np0005465988 nova_compute[236126]: 2025-10-02 12:01:37.983 2 DEBUG nova.storage.rbd_utils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] resizing rbd image e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:01:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e146 e146: 3 total, 3 up, 3 in
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.293 2 DEBUG nova.objects.instance [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Lazy-loading 'migration_context' on Instance uuid e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.326 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.327 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Ensure instance console log exists: /var/lib/nova/instances/e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.327 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.328 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.329 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.402 2 DEBUG nova.policy [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efb31eeadee34403b1ab7a584f3616f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3f2b3ac7d7504c9c96f0d4a67e0243c9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.514 2 DEBUG oslo_concurrency.processutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.684s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.589 2 DEBUG nova.storage.rbd_utils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] resizing rbd image b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.740 2 DEBUG nova.objects.instance [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lazy-loading 'migration_context' on Instance uuid b6a5f3ca-c662-41a0-ac02-78f9fba82bba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.755 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.756 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Ensure instance console log exists: /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.756 2 DEBUG oslo_concurrency.lockutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.756 2 DEBUG oslo_concurrency.lockutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:38 np0005465988 nova_compute[236126]: 2025-10-02 12:01:38.757 2 DEBUG oslo_concurrency.lockutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.260 2 DEBUG nova.network.neutron [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.260 2 DEBUG nova.compute.manager [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.263 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.269 2 WARNING nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.275 2 DEBUG nova.virt.libvirt.host [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.276 2 DEBUG nova.virt.libvirt.host [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.282 2 DEBUG nova.virt.libvirt.host [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.283 2 DEBUG nova.virt.libvirt.host [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.285 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.285 2 DEBUG nova.virt.hardware [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.285 2 DEBUG nova.virt.hardware [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.286 2 DEBUG nova.virt.hardware [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.286 2 DEBUG nova.virt.hardware [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.286 2 DEBUG nova.virt.hardware [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.286 2 DEBUG nova.virt.hardware [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.287 2 DEBUG nova.virt.hardware [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.287 2 DEBUG nova.virt.hardware [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.287 2 DEBUG nova.virt.hardware [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.287 2 DEBUG nova.virt.hardware [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.287 2 DEBUG nova.virt.hardware [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.290 2 DEBUG oslo_concurrency.processutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:39.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:01:39 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3501241138' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.789 2 DEBUG oslo_concurrency.processutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:39.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.821 2 DEBUG nova.storage.rbd_utils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rbd image b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:01:39 np0005465988 nova_compute[236126]: 2025-10-02 12:01:39.827 2 DEBUG oslo_concurrency.processutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:01:40 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/897392049' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:01:40 np0005465988 nova_compute[236126]: 2025-10-02 12:01:40.290 2 DEBUG oslo_concurrency.processutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:40 np0005465988 nova_compute[236126]: 2025-10-02 12:01:40.292 2 DEBUG nova.objects.instance [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lazy-loading 'pci_devices' on Instance uuid b6a5f3ca-c662-41a0-ac02-78f9fba82bba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:01:40 np0005465988 nova_compute[236126]: 2025-10-02 12:01:40.309 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  <uuid>b6a5f3ca-c662-41a0-ac02-78f9fba82bba</uuid>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  <name>instance-0000000d</name>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <nova:name>tempest-MigrationsAdminTest-server-1708097480</nova:name>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:01:39</nova:creationTime>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <nova:user uuid="1a06819bf8cc4ff7bccbbb2616ff2d21">tempest-MigrationsAdminTest-819597356-project-member</nova:user>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <nova:project uuid="f1ce36070fb047479c3a083f36733f63">tempest-MigrationsAdminTest-819597356</nova:project>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <nova:ports/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <entry name="serial">b6a5f3ca-c662-41a0-ac02-78f9fba82bba</entry>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <entry name="uuid">b6a5f3ca-c662-41a0-ac02-78f9fba82bba</entry>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk.config">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba/console.log" append="off"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:01:40 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:01:40 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:01:40 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:01:40 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:01:40 np0005465988 nova_compute[236126]: 2025-10-02 12:01:40.395 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:01:40 np0005465988 nova_compute[236126]: 2025-10-02 12:01:40.396 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:01:40 np0005465988 nova_compute[236126]: 2025-10-02 12:01:40.396 2 INFO nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Using config drive#033[00m
Oct  2 08:01:40 np0005465988 nova_compute[236126]: 2025-10-02 12:01:40.423 2 DEBUG nova.storage.rbd_utils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rbd image b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:01:41 np0005465988 nova_compute[236126]: 2025-10-02 12:01:41.048 2 INFO nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Creating config drive at /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba/disk.config#033[00m
Oct  2 08:01:41 np0005465988 nova_compute[236126]: 2025-10-02 12:01:41.057 2 DEBUG oslo_concurrency.processutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq2c22eh1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:41 np0005465988 nova_compute[236126]: 2025-10-02 12:01:41.191 2 DEBUG oslo_concurrency.processutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq2c22eh1" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:41 np0005465988 nova_compute[236126]: 2025-10-02 12:01:41.238 2 DEBUG nova.storage.rbd_utils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rbd image b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:01:41 np0005465988 nova_compute[236126]: 2025-10-02 12:01:41.242 2 DEBUG oslo_concurrency.processutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba/disk.config b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:41.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:41 np0005465988 nova_compute[236126]: 2025-10-02 12:01:41.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:41.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:41 np0005465988 nova_compute[236126]: 2025-10-02 12:01:41.994 2 DEBUG oslo_concurrency.processutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba/disk.config b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.752s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:41 np0005465988 nova_compute[236126]: 2025-10-02 12:01:41.995 2 INFO nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Deleting local config drive /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba/disk.config because it was imported into RBD.#033[00m
Oct  2 08:01:42 np0005465988 systemd-machined[192594]: New machine qemu-5-instance-0000000d.
Oct  2 08:01:42 np0005465988 systemd[1]: Started Virtual Machine qemu-5-instance-0000000d.
Oct  2 08:01:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:01:42 np0005465988 nova_compute[236126]: 2025-10-02 12:01:42.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:42 np0005465988 nova_compute[236126]: 2025-10-02 12:01:42.859 2 DEBUG nova.network.neutron [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Successfully updated port: c3f370c5-770e-48df-b015-b39eb427f259 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:01:42 np0005465988 nova_compute[236126]: 2025-10-02 12:01:42.889 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Acquiring lock "refresh_cache-e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:01:42 np0005465988 nova_compute[236126]: 2025-10-02 12:01:42.889 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Acquired lock "refresh_cache-e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:01:42 np0005465988 nova_compute[236126]: 2025-10-02 12:01:42.889 2 DEBUG nova.network.neutron [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:01:42 np0005465988 nova_compute[236126]: 2025-10-02 12:01:42.975 2 DEBUG nova.compute.manager [req-03d4cd17-f6fb-4a1a-b9a4-b2a595b5bdb4 req-70029bac-724b-44d1-9b14-f40c6ade73f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received event network-changed-c3f370c5-770e-48df-b015-b39eb427f259 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:01:42 np0005465988 nova_compute[236126]: 2025-10-02 12:01:42.976 2 DEBUG nova.compute.manager [req-03d4cd17-f6fb-4a1a-b9a4-b2a595b5bdb4 req-70029bac-724b-44d1-9b14-f40c6ade73f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Refreshing instance network info cache due to event network-changed-c3f370c5-770e-48df-b015-b39eb427f259. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:01:42 np0005465988 nova_compute[236126]: 2025-10-02 12:01:42.976 2 DEBUG oslo_concurrency.lockutils [req-03d4cd17-f6fb-4a1a-b9a4-b2a595b5bdb4 req-70029bac-724b-44d1-9b14-f40c6ade73f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.045 2 DEBUG nova.network.neutron [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:01:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:43.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.621 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406503.621147, b6a5f3ca-c662-41a0-ac02-78f9fba82bba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.622 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.626 2 DEBUG nova.compute.manager [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.627 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.631 2 INFO nova.virt.libvirt.driver [-] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Instance spawned successfully.#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.632 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.647 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.651 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.665 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.665 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.666 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.667 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.668 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.669 2 DEBUG nova.virt.libvirt.driver [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.691 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.691 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406503.6212566, b6a5f3ca-c662-41a0-ac02-78f9fba82bba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.692 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] VM Started (Lifecycle Event)#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.753 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.756 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.769 2 INFO nova.compute.manager [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Took 6.17 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.769 2 DEBUG nova.compute.manager [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.781 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.793 2 DEBUG nova.network.neutron [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Updating instance_info_cache with network_info: [{"id": "c3f370c5-770e-48df-b015-b39eb427f259", "address": "fa:16:3e:02:ff:42", "network": {"id": "5bd66e63-9399-4ab1-bcda-a761f2c44b1d", "bridge": "br-int", "label": "tempest-LiveMigrationTest-205544999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f2b3ac7d7504c9c96f0d4a67e0243c9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3f370c5-77", "ovs_interfaceid": "c3f370c5-770e-48df-b015-b39eb427f259", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:01:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:43.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.826 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Releasing lock "refresh_cache-e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.827 2 DEBUG nova.compute.manager [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Instance network_info: |[{"id": "c3f370c5-770e-48df-b015-b39eb427f259", "address": "fa:16:3e:02:ff:42", "network": {"id": "5bd66e63-9399-4ab1-bcda-a761f2c44b1d", "bridge": "br-int", "label": "tempest-LiveMigrationTest-205544999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f2b3ac7d7504c9c96f0d4a67e0243c9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3f370c5-77", "ovs_interfaceid": "c3f370c5-770e-48df-b015-b39eb427f259", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.827 2 DEBUG oslo_concurrency.lockutils [req-03d4cd17-f6fb-4a1a-b9a4-b2a595b5bdb4 req-70029bac-724b-44d1-9b14-f40c6ade73f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.827 2 DEBUG nova.network.neutron [req-03d4cd17-f6fb-4a1a-b9a4-b2a595b5bdb4 req-70029bac-724b-44d1-9b14-f40c6ade73f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Refreshing network info cache for port c3f370c5-770e-48df-b015-b39eb427f259 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.829 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Start _get_guest_xml network_info=[{"id": "c3f370c5-770e-48df-b015-b39eb427f259", "address": "fa:16:3e:02:ff:42", "network": {"id": "5bd66e63-9399-4ab1-bcda-a761f2c44b1d", "bridge": "br-int", "label": "tempest-LiveMigrationTest-205544999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f2b3ac7d7504c9c96f0d4a67e0243c9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3f370c5-77", "ovs_interfaceid": "c3f370c5-770e-48df-b015-b39eb427f259", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.833 2 WARNING nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.837 2 DEBUG nova.virt.libvirt.host [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.838 2 DEBUG nova.virt.libvirt.host [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.844 2 DEBUG nova.virt.libvirt.host [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.844 2 DEBUG nova.virt.libvirt.host [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.845 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.845 2 DEBUG nova.virt.hardware [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.846 2 DEBUG nova.virt.hardware [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.846 2 DEBUG nova.virt.hardware [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.846 2 DEBUG nova.virt.hardware [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.846 2 DEBUG nova.virt.hardware [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.847 2 DEBUG nova.virt.hardware [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.847 2 DEBUG nova.virt.hardware [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.847 2 DEBUG nova.virt.hardware [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.847 2 DEBUG nova.virt.hardware [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.847 2 DEBUG nova.virt.hardware [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.848 2 DEBUG nova.virt.hardware [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.850 2 DEBUG oslo_concurrency.processutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.867 2 INFO nova.compute.manager [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Took 7.79 seconds to build instance.#033[00m
Oct  2 08:01:43 np0005465988 nova_compute[236126]: 2025-10-02 12:01:43.900 2 DEBUG oslo_concurrency.lockutils [None req-7705865c-04c9-477d-9a97-53bab307d28e 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b6a5f3ca-c662-41a0-ac02-78f9fba82bba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.909s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:01:44 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1391525199' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.270 2 DEBUG oslo_concurrency.processutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.302 2 DEBUG nova.storage.rbd_utils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] rbd image e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.307 2 DEBUG oslo_concurrency.processutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:01:44 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/509611605' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.816 2 DEBUG oslo_concurrency.processutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.819 2 DEBUG nova.virt.libvirt.vif [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:01:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1571369270',display_name='tempest-LiveMigrationTest-server-1571369270',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1571369270',id=12,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f2b3ac7d7504c9c96f0d4a67e0243c9',ramdisk_id='',reservation_id='r-ton1smkr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-1876533760',owner_user_name='tempest-LiveMigrationTest-1876533760-project-mem
ber'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:01:36Z,user_data=None,user_id='efb31eeadee34403b1ab7a584f3616f7',uuid=e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3f370c5-770e-48df-b015-b39eb427f259", "address": "fa:16:3e:02:ff:42", "network": {"id": "5bd66e63-9399-4ab1-bcda-a761f2c44b1d", "bridge": "br-int", "label": "tempest-LiveMigrationTest-205544999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f2b3ac7d7504c9c96f0d4a67e0243c9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3f370c5-77", "ovs_interfaceid": "c3f370c5-770e-48df-b015-b39eb427f259", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.819 2 DEBUG nova.network.os_vif_util [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Converting VIF {"id": "c3f370c5-770e-48df-b015-b39eb427f259", "address": "fa:16:3e:02:ff:42", "network": {"id": "5bd66e63-9399-4ab1-bcda-a761f2c44b1d", "bridge": "br-int", "label": "tempest-LiveMigrationTest-205544999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f2b3ac7d7504c9c96f0d4a67e0243c9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3f370c5-77", "ovs_interfaceid": "c3f370c5-770e-48df-b015-b39eb427f259", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.821 2 DEBUG nova.network.os_vif_util [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:ff:42,bridge_name='br-int',has_traffic_filtering=True,id=c3f370c5-770e-48df-b015-b39eb427f259,network=Network(5bd66e63-9399-4ab1-bcda-a761f2c44b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc3f370c5-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.824 2 DEBUG nova.objects.instance [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Lazy-loading 'pci_devices' on Instance uuid e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.848 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  <uuid>e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7</uuid>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  <name>instance-0000000c</name>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <nova:name>tempest-LiveMigrationTest-server-1571369270</nova:name>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:01:43</nova:creationTime>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <nova:user uuid="efb31eeadee34403b1ab7a584f3616f7">tempest-LiveMigrationTest-1876533760-project-member</nova:user>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <nova:project uuid="3f2b3ac7d7504c9c96f0d4a67e0243c9">tempest-LiveMigrationTest-1876533760</nova:project>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <nova:port uuid="c3f370c5-770e-48df-b015-b39eb427f259">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <entry name="serial">e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7</entry>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <entry name="uuid">e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7</entry>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk.config">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:02:ff:42"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <target dev="tapc3f370c5-77"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7/console.log" append="off"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:01:44 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:01:44 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:01:44 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:01:44 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.849 2 DEBUG nova.compute.manager [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Preparing to wait for external event network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.849 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Acquiring lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.850 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.850 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.851 2 DEBUG nova.virt.libvirt.vif [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:01:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1571369270',display_name='tempest-LiveMigrationTest-server-1571369270',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1571369270',id=12,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f2b3ac7d7504c9c96f0d4a67e0243c9',ramdisk_id='',reservation_id='r-ton1smkr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-1876533760',owner_user_name='tempest-LiveMigrationTest-1876533760-p
roject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:01:36Z,user_data=None,user_id='efb31eeadee34403b1ab7a584f3616f7',uuid=e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3f370c5-770e-48df-b015-b39eb427f259", "address": "fa:16:3e:02:ff:42", "network": {"id": "5bd66e63-9399-4ab1-bcda-a761f2c44b1d", "bridge": "br-int", "label": "tempest-LiveMigrationTest-205544999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f2b3ac7d7504c9c96f0d4a67e0243c9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3f370c5-77", "ovs_interfaceid": "c3f370c5-770e-48df-b015-b39eb427f259", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.851 2 DEBUG nova.network.os_vif_util [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Converting VIF {"id": "c3f370c5-770e-48df-b015-b39eb427f259", "address": "fa:16:3e:02:ff:42", "network": {"id": "5bd66e63-9399-4ab1-bcda-a761f2c44b1d", "bridge": "br-int", "label": "tempest-LiveMigrationTest-205544999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f2b3ac7d7504c9c96f0d4a67e0243c9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3f370c5-77", "ovs_interfaceid": "c3f370c5-770e-48df-b015-b39eb427f259", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.852 2 DEBUG nova.network.os_vif_util [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:ff:42,bridge_name='br-int',has_traffic_filtering=True,id=c3f370c5-770e-48df-b015-b39eb427f259,network=Network(5bd66e63-9399-4ab1-bcda-a761f2c44b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc3f370c5-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.852 2 DEBUG os_vif [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:ff:42,bridge_name='br-int',has_traffic_filtering=True,id=c3f370c5-770e-48df-b015-b39eb427f259,network=Network(5bd66e63-9399-4ab1-bcda-a761f2c44b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc3f370c5-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.853 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.854 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.859 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3f370c5-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.860 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc3f370c5-77, col_values=(('external_ids', {'iface-id': 'c3f370c5-770e-48df-b015-b39eb427f259', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:ff:42', 'vm-uuid': 'e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:44 np0005465988 NetworkManager[45041]: <info>  [1759406504.8632] manager: (tapc3f370c5-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.871 2 INFO os_vif [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:ff:42,bridge_name='br-int',has_traffic_filtering=True,id=c3f370c5-770e-48df-b015-b39eb427f259,network=Network(5bd66e63-9399-4ab1-bcda-a761f2c44b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc3f370c5-77')#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.925 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.925 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.925 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] No VIF found with MAC fa:16:3e:02:ff:42, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.926 2 INFO nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Using config drive#033[00m
Oct  2 08:01:44 np0005465988 nova_compute[236126]: 2025-10-02 12:01:44.949 2 DEBUG nova.storage.rbd_utils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] rbd image e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.265 2 DEBUG nova.network.neutron [req-03d4cd17-f6fb-4a1a-b9a4-b2a595b5bdb4 req-70029bac-724b-44d1-9b14-f40c6ade73f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Updated VIF entry in instance network info cache for port c3f370c5-770e-48df-b015-b39eb427f259. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.266 2 DEBUG nova.network.neutron [req-03d4cd17-f6fb-4a1a-b9a4-b2a595b5bdb4 req-70029bac-724b-44d1-9b14-f40c6ade73f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Updating instance_info_cache with network_info: [{"id": "c3f370c5-770e-48df-b015-b39eb427f259", "address": "fa:16:3e:02:ff:42", "network": {"id": "5bd66e63-9399-4ab1-bcda-a761f2c44b1d", "bridge": "br-int", "label": "tempest-LiveMigrationTest-205544999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f2b3ac7d7504c9c96f0d4a67e0243c9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3f370c5-77", "ovs_interfaceid": "c3f370c5-770e-48df-b015-b39eb427f259", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.286 2 DEBUG oslo_concurrency.lockutils [req-03d4cd17-f6fb-4a1a-b9a4-b2a595b5bdb4 req-70029bac-724b-44d1-9b14-f40c6ade73f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:01:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:45.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.344 2 INFO nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Creating config drive at /var/lib/nova/instances/e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7/disk.config#033[00m
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.349 2 DEBUG oslo_concurrency.processutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0vyodill execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.473 2 DEBUG oslo_concurrency.processutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0vyodill" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.520 2 DEBUG nova.storage.rbd_utils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] rbd image e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.527 2 DEBUG oslo_concurrency.processutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7/disk.config e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.750 2 DEBUG oslo_concurrency.processutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7/disk.config e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.224s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.752 2 INFO nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Deleting local config drive /var/lib/nova/instances/e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7/disk.config because it was imported into RBD.#033[00m
Oct  2 08:01:45 np0005465988 kernel: tapc3f370c5-77: entered promiscuous mode
Oct  2 08:01:45 np0005465988 NetworkManager[45041]: <info>  [1759406505.8210] manager: (tapc3f370c5-77): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Oct  2 08:01:45 np0005465988 systemd-udevd[243718]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:01:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:45.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:45 np0005465988 NetworkManager[45041]: <info>  [1759406505.8594] device (tapc3f370c5-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:01:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:45Z|00037|binding|INFO|Claiming lport c3f370c5-770e-48df-b015-b39eb427f259 for this chassis.
Oct  2 08:01:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:45Z|00038|binding|INFO|c3f370c5-770e-48df-b015-b39eb427f259: Claiming fa:16:3e:02:ff:42 10.100.0.6
Oct  2 08:01:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:45Z|00039|binding|INFO|Claiming lport 0fe5d3df-efa4-47f1-a32b-45858e68a8b9 for this chassis.
Oct  2 08:01:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:45Z|00040|binding|INFO|0fe5d3df-efa4-47f1-a32b-45858e68a8b9: Claiming fa:16:3e:88:98:01 19.80.0.134
Oct  2 08:01:45 np0005465988 NetworkManager[45041]: <info>  [1759406505.8607] device (tapc3f370c5-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:45.891 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:ff:42 10.100.0.6'], port_security=['fa:16:3e:02:ff:42 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-96076607', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bd66e63-9399-4ab1-bcda-a761f2c44b1d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-96076607', 'neutron:project_id': '3f2b3ac7d7504c9c96f0d4a67e0243c9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c39f970f-1ead-4030-a775-b7ca9942094a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58eebdcc-6c12-4ff3-b6bc-0fe1fb3af6b6, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=c3f370c5-770e-48df-b015-b39eb427f259) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:01:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:45.895 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:98:01 19.80.0.134'], port_security=['fa:16:3e:88:98:01 19.80.0.134'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['c3f370c5-770e-48df-b015-b39eb427f259'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-2131619737', 'neutron:cidrs': '19.80.0.134/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89bcb068-8337-43f0-9d5d-f27225e9a30d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-2131619737', 'neutron:project_id': '3f2b3ac7d7504c9c96f0d4a67e0243c9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c39f970f-1ead-4030-a775-b7ca9942094a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=b81908e7-ad92-4b22-83eb-83c6667bf780, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0fe5d3df-efa4-47f1-a32b-45858e68a8b9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:01:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:45.897 142124 INFO neutron.agent.ovn.metadata.agent [-] Port c3f370c5-770e-48df-b015-b39eb427f259 in datapath 5bd66e63-9399-4ab1-bcda-a761f2c44b1d bound to our chassis#033[00m
Oct  2 08:01:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:45.901 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5bd66e63-9399-4ab1-bcda-a761f2c44b1d#033[00m
Oct  2 08:01:45 np0005465988 systemd-machined[192594]: New machine qemu-6-instance-0000000c.
Oct  2 08:01:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:45.921 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c7611114-4134-4acd-af9a-7b53b212d6a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:45.922 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5bd66e63-91 in ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:01:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:45.924 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5bd66e63-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:01:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:45.924 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8dba1ab4-eb98-485e-ad79-e582d68ef780]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:45.926 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9f8f9654-c1db-4ab5-8c4c-609f9a318569]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:45.939 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[10ea50e6-65a1-4395-b5a4-17cfd5a6a48e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:45.964 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5085613a-640f-4f7f-af52-0a220d58f836]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:45 np0005465988 systemd[1]: Started Virtual Machine qemu-6-instance-0000000c.
Oct  2 08:01:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:45.993 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2c49b83b-cfce-470b-9959-a1e1981a3a40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:45Z|00041|binding|INFO|Setting lport c3f370c5-770e-48df-b015-b39eb427f259 ovn-installed in OVS
Oct  2 08:01:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:45Z|00042|binding|INFO|Setting lport c3f370c5-770e-48df-b015-b39eb427f259 up in Southbound
Oct  2 08:01:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:45Z|00043|binding|INFO|Setting lport 0fe5d3df-efa4-47f1-a32b-45858e68a8b9 up in Southbound
Oct  2 08:01:45 np0005465988 nova_compute[236126]: 2025-10-02 12:01:45.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:46 np0005465988 NetworkManager[45041]: <info>  [1759406506.0006] manager: (tap5bd66e63-90): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:45.999 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8192c288-fd15-45af-9657-4316a2a1d4d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.040 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[61f15ff4-bb45-4288-8326-25573174b43d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.043 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[0f66c72f-7477-430e-a2f8-58b425cce649]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 NetworkManager[45041]: <info>  [1759406506.0700] device (tap5bd66e63-90): carrier: link connected
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.075 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec6b86b-b452-4917-aead-52f92fc4dad8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.096 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[33c1bcc1-462a-4884-8319-e503d3314069]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bd66e63-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:10:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456640, 'reachable_time': 15164, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243888, 'error': None, 'target': 'ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.113 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0a54852f-7ee4-4f31-b0ff-e563fc2546fa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6f:100f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 456640, 'tstamp': 456640}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243889, 'error': None, 'target': 'ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.131 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ece54b9a-81d2-436b-b721-c0102a125935]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bd66e63-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:10:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456640, 'reachable_time': 15164, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243890, 'error': None, 'target': 'ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.163 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7ec4ac01-cbbe-476f-af69-a67fb2121501]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.226 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[655f4be1-5957-4211-841b-2bfcb8dab5f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.228 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bd66e63-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.228 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.228 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5bd66e63-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:46 np0005465988 nova_compute[236126]: 2025-10-02 12:01:46.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:46 np0005465988 NetworkManager[45041]: <info>  [1759406506.2306] manager: (tap5bd66e63-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Oct  2 08:01:46 np0005465988 kernel: tap5bd66e63-90: entered promiscuous mode
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.236 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5bd66e63-90, col_values=(('external_ids', {'iface-id': '5a25c40a-77b7-400c-afc3-f6cb920420cb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:46 np0005465988 nova_compute[236126]: 2025-10-02 12:01:46.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:46Z|00044|binding|INFO|Releasing lport 5a25c40a-77b7-400c-afc3-f6cb920420cb from this chassis (sb_readonly=0)
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.240 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5bd66e63-9399-4ab1-bcda-a761f2c44b1d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5bd66e63-9399-4ab1-bcda-a761f2c44b1d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.241 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3fd5a5-9163-402a-8d8e-94b21eb47475]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.242 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-5bd66e63-9399-4ab1-bcda-a761f2c44b1d
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/5bd66e63-9399-4ab1-bcda-a761f2c44b1d.pid.haproxy
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 5bd66e63-9399-4ab1-bcda-a761f2c44b1d
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.243 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d', 'env', 'PROCESS_TAG=haproxy-5bd66e63-9399-4ab1-bcda-a761f2c44b1d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5bd66e63-9399-4ab1-bcda-a761f2c44b1d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:01:46 np0005465988 nova_compute[236126]: 2025-10-02 12:01:46.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:46 np0005465988 nova_compute[236126]: 2025-10-02 12:01:46.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:46 np0005465988 nova_compute[236126]: 2025-10-02 12:01:46.551 2 DEBUG nova.compute.manager [req-8ff668e4-356a-4dea-9655-6146772db748 req-f70f16fa-5d5e-46bc-82a3-81e74cf45cf7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received event network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:01:46 np0005465988 nova_compute[236126]: 2025-10-02 12:01:46.551 2 DEBUG oslo_concurrency.lockutils [req-8ff668e4-356a-4dea-9655-6146772db748 req-f70f16fa-5d5e-46bc-82a3-81e74cf45cf7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:46 np0005465988 nova_compute[236126]: 2025-10-02 12:01:46.552 2 DEBUG oslo_concurrency.lockutils [req-8ff668e4-356a-4dea-9655-6146772db748 req-f70f16fa-5d5e-46bc-82a3-81e74cf45cf7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:46 np0005465988 nova_compute[236126]: 2025-10-02 12:01:46.552 2 DEBUG oslo_concurrency.lockutils [req-8ff668e4-356a-4dea-9655-6146772db748 req-f70f16fa-5d5e-46bc-82a3-81e74cf45cf7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:46 np0005465988 nova_compute[236126]: 2025-10-02 12:01:46.552 2 DEBUG nova.compute.manager [req-8ff668e4-356a-4dea-9655-6146772db748 req-f70f16fa-5d5e-46bc-82a3-81e74cf45cf7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Processing event network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:01:46 np0005465988 podman[243965]: 2025-10-02 12:01:46.664416568 +0000 UTC m=+0.076648751 container create ebf59ab21e734e661f9efa86196d05ad1ddcb3ea59bd3769905f006c5523dc2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:01:46 np0005465988 systemd[1]: Started libpod-conmon-ebf59ab21e734e661f9efa86196d05ad1ddcb3ea59bd3769905f006c5523dc2b.scope.
Oct  2 08:01:46 np0005465988 podman[243965]: 2025-10-02 12:01:46.630237214 +0000 UTC m=+0.042469417 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:01:46 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:01:46 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5334b83451a49136a8ca81a01e89cab5d997344b9092959d5f4b7c2d4173919f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:01:46 np0005465988 podman[243965]: 2025-10-02 12:01:46.784334117 +0000 UTC m=+0.196566320 container init ebf59ab21e734e661f9efa86196d05ad1ddcb3ea59bd3769905f006c5523dc2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 08:01:46 np0005465988 podman[243965]: 2025-10-02 12:01:46.792847275 +0000 UTC m=+0.205079448 container start ebf59ab21e734e661f9efa86196d05ad1ddcb3ea59bd3769905f006c5523dc2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:01:46 np0005465988 neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d[243980]: [NOTICE]   (243985) : New worker (243987) forked
Oct  2 08:01:46 np0005465988 neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d[243980]: [NOTICE]   (243985) : Loading success.
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.875 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 0fe5d3df-efa4-47f1-a32b-45858e68a8b9 in datapath 89bcb068-8337-43f0-9d5d-f27225e9a30d unbound from our chassis#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.878 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89bcb068-8337-43f0-9d5d-f27225e9a30d#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.890 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fe8c41aa-c088-4662-9d70-599e25cd6581]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.891 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap89bcb068-81 in ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.892 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap89bcb068-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.893 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[db0a72bc-2459-49a7-82df-9d778644e6ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.893 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b9db0238-d1b8-462c-aa6b-7aff3cbacb67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.904 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9c4d0f-4f55-467d-a904-ea57cbabda77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.926 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[febc9b21-cf8d-4baa-b99d-0516e706b369]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.964 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d01c480c-0728-4db6-b3cb-8e278e7c038c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:46.971 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d296ee57-efa8-49e6-af37-643ce7ab708f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:46 np0005465988 NetworkManager[45041]: <info>  [1759406506.9737] manager: (tap89bcb068-80): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Oct  2 08:01:46 np0005465988 systemd-udevd[243942]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.010 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[a34e7add-0b48-4925-a006-bf542b3de106]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.014 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2da8dd5a-064b-4db6-80db-4791de7341b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:47 np0005465988 NetworkManager[45041]: <info>  [1759406507.0377] device (tap89bcb068-80): carrier: link connected
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.045 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d342ea9c-0df1-4fe8-9179-5089d6c0b751]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.064 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[70451372-e069-43be-bc0f-90f57d26fd9d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89bcb068-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:40:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456736, 'reachable_time': 18554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244023, 'error': None, 'target': 'ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.075 2 DEBUG nova.compute.manager [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.075 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406507.0738945, e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.076 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] VM Started (Lifecycle Event)#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.082 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.082 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c34da39c-3d2e-4669-a9d7-d6206b421189]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8c:409e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 456736, 'tstamp': 456736}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244024, 'error': None, 'target': 'ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.087 2 INFO nova.virt.libvirt.driver [-] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Instance spawned successfully.#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.087 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.102 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4947a645-2b94-4435-b9e0-5a9f9e40b1c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89bcb068-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:40:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456736, 'reachable_time': 18554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244025, 'error': None, 'target': 'ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.125 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.131 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.134 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.135 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.135 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.136 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.136 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.136 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f8ddf597-df1c-48c7-9064-ebaf38037b39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.137 2 DEBUG nova.virt.libvirt.driver [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.197 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.198 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406507.0741584, e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.198 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.216 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[44c6f6e3-edbc-4260-979f-336ca5695131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.218 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89bcb068-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.218 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.219 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89bcb068-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:47 np0005465988 NetworkManager[45041]: <info>  [1759406507.2230] manager: (tap89bcb068-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Oct  2 08:01:47 np0005465988 kernel: tap89bcb068-80: entered promiscuous mode
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.226 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89bcb068-80, col_values=(('external_ids', {'iface-id': '2d320c71-91bf-44ba-b753-c9807cdc7fdb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:47Z|00045|binding|INFO|Releasing lport 2d320c71-91bf-44ba-b753-c9807cdc7fdb from this chassis (sb_readonly=0)
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.258 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/89bcb068-8337-43f0-9d5d-f27225e9a30d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/89bcb068-8337-43f0-9d5d-f27225e9a30d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.259 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[585ac2a7-0586-43f0-a1ad-651209b41053]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.260 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-89bcb068-8337-43f0-9d5d-f27225e9a30d
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/89bcb068-8337-43f0-9d5d-f27225e9a30d.pid.haproxy
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 89bcb068-8337-43f0-9d5d-f27225e9a30d
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:01:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:47.261 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d', 'env', 'PROCESS_TAG=haproxy-89bcb068-8337-43f0-9d5d-f27225e9a30d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/89bcb068-8337-43f0-9d5d-f27225e9a30d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.292 2 INFO nova.compute.manager [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Took 10.30 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.293 2 DEBUG nova.compute.manager [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:47.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.376 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.381 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406507.081003, e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.381 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.494 2 INFO nova.compute.manager [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Took 11.44 seconds to build instance.#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.511 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.513 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:01:47 np0005465988 nova_compute[236126]: 2025-10-02 12:01:47.666 2 DEBUG oslo_concurrency.lockutils [None req-7e75a7aa-1db6-417b-8d7c-5a87f586b9b2 efb31eeadee34403b1ab7a584f3616f7 3f2b3ac7d7504c9c96f0d4a67e0243c9 - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:47 np0005465988 podman[244055]: 2025-10-02 12:01:47.694945903 +0000 UTC m=+0.057985458 container create a6b77617f44461e4fcd2587f199d5076c37aa550825fc7ed8717040e6b52a91a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:01:47 np0005465988 systemd[1]: Started libpod-conmon-a6b77617f44461e4fcd2587f199d5076c37aa550825fc7ed8717040e6b52a91a.scope.
Oct  2 08:01:47 np0005465988 podman[244055]: 2025-10-02 12:01:47.664842707 +0000 UTC m=+0.027882272 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:01:47 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:01:47 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/caafc3904607897f16face52aa7170039a46d470677cac73f8f83b22989f095e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:01:47 np0005465988 podman[244055]: 2025-10-02 12:01:47.782173391 +0000 UTC m=+0.145212946 container init a6b77617f44461e4fcd2587f199d5076c37aa550825fc7ed8717040e6b52a91a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:01:47 np0005465988 podman[244068]: 2025-10-02 12:01:47.786977361 +0000 UTC m=+0.050961464 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:01:47 np0005465988 podman[244055]: 2025-10-02 12:01:47.789190976 +0000 UTC m=+0.152230521 container start a6b77617f44461e4fcd2587f199d5076c37aa550825fc7ed8717040e6b52a91a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:01:47 np0005465988 neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d[244076]: [NOTICE]   (244093) : New worker (244095) forked
Oct  2 08:01:47 np0005465988 neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d[244076]: [NOTICE]   (244093) : Loading success.
Oct  2 08:01:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:47.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.319 2 DEBUG oslo_concurrency.lockutils [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Acquiring lock "refresh_cache-b6a5f3ca-c662-41a0-ac02-78f9fba82bba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.320 2 DEBUG oslo_concurrency.lockutils [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Acquired lock "refresh_cache-b6a5f3ca-c662-41a0-ac02-78f9fba82bba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.321 2 DEBUG nova.network.neutron [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.534 2 DEBUG nova.network.neutron [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.698 2 DEBUG nova.compute.manager [req-57bc9af0-6b17-4699-8f66-64e90e0af507 req-089c6162-228a-49ec-8fae-03cd0fc2000b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received event network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.699 2 DEBUG oslo_concurrency.lockutils [req-57bc9af0-6b17-4699-8f66-64e90e0af507 req-089c6162-228a-49ec-8fae-03cd0fc2000b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.699 2 DEBUG oslo_concurrency.lockutils [req-57bc9af0-6b17-4699-8f66-64e90e0af507 req-089c6162-228a-49ec-8fae-03cd0fc2000b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.699 2 DEBUG oslo_concurrency.lockutils [req-57bc9af0-6b17-4699-8f66-64e90e0af507 req-089c6162-228a-49ec-8fae-03cd0fc2000b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.700 2 DEBUG nova.compute.manager [req-57bc9af0-6b17-4699-8f66-64e90e0af507 req-089c6162-228a-49ec-8fae-03cd0fc2000b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] No waiting events found dispatching network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.700 2 WARNING nova.compute.manager [req-57bc9af0-6b17-4699-8f66-64e90e0af507 req-089c6162-228a-49ec-8fae-03cd0fc2000b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received unexpected event network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.797 2 DEBUG nova.network.neutron [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.813 2 DEBUG oslo_concurrency.lockutils [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Releasing lock "refresh_cache-b6a5f3ca-c662-41a0-ac02-78f9fba82bba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.921 2 DEBUG nova.virt.libvirt.driver [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.922 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Creating file /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba/b33f39798f854b7291720db43abe484a.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Oct  2 08:01:48 np0005465988 nova_compute[236126]: 2025-10-02 12:01:48.922 2 DEBUG oslo_concurrency.processutils [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba/b33f39798f854b7291720db43abe484a.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:49.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:49 np0005465988 nova_compute[236126]: 2025-10-02 12:01:49.417 2 DEBUG oslo_concurrency.processutils [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba/b33f39798f854b7291720db43abe484a.tmp" returned: 1 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:49 np0005465988 nova_compute[236126]: 2025-10-02 12:01:49.417 2 DEBUG oslo_concurrency.processutils [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba/b33f39798f854b7291720db43abe484a.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Oct  2 08:01:49 np0005465988 nova_compute[236126]: 2025-10-02 12:01:49.418 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Creating directory /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Oct  2 08:01:49 np0005465988 nova_compute[236126]: 2025-10-02 12:01:49.418 2 DEBUG oslo_concurrency.processutils [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:49 np0005465988 nova_compute[236126]: 2025-10-02 12:01:49.639 2 DEBUG oslo_concurrency.processutils [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba" returned: 0 in 0.221s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:49 np0005465988 nova_compute[236126]: 2025-10-02 12:01:49.652 2 DEBUG nova.virt.libvirt.driver [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:01:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:49.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:49 np0005465988 nova_compute[236126]: 2025-10-02 12:01:49.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:50 np0005465988 nova_compute[236126]: 2025-10-02 12:01:50.965 2 DEBUG nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Check if temp file /var/lib/nova/instances/tmp8l46yc1t exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Oct  2 08:01:50 np0005465988 nova_compute[236126]: 2025-10-02 12:01:50.967 2 DEBUG nova.compute.manager [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8l46yc1t',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Oct  2 08:01:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:51.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:51 np0005465988 nova_compute[236126]: 2025-10-02 12:01:51.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:51.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:01:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:53.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:53.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.744 2 INFO nova.compute.manager [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Took 3.04 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.745 2 DEBUG nova.compute.manager [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.794 2 DEBUG nova.compute.manager [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8l46yc1t',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(33813b6a-c36c-41a1-8dae-64be3d05308f),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.797 2 DEBUG nova.objects.instance [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Lazy-loading 'migration_context' on Instance uuid e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.798 2 DEBUG nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.799 2 DEBUG nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.800 2 DEBUG nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.820 2 DEBUG nova.compute.manager [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received event network-vif-unplugged-c3f370c5-770e-48df-b015-b39eb427f259 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.820 2 DEBUG oslo_concurrency.lockutils [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.820 2 DEBUG oslo_concurrency.lockutils [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.820 2 DEBUG oslo_concurrency.lockutils [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.821 2 DEBUG nova.compute.manager [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] No waiting events found dispatching network-vif-unplugged-c3f370c5-770e-48df-b015-b39eb427f259 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.821 2 DEBUG nova.compute.manager [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received event network-vif-unplugged-c3f370c5-770e-48df-b015-b39eb427f259 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.821 2 DEBUG nova.compute.manager [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received event network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.821 2 DEBUG oslo_concurrency.lockutils [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.821 2 DEBUG oslo_concurrency.lockutils [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.821 2 DEBUG oslo_concurrency.lockutils [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.822 2 DEBUG nova.compute.manager [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] No waiting events found dispatching network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.822 2 WARNING nova.compute.manager [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received unexpected event network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.823 2 DEBUG nova.compute.manager [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received event network-changed-c3f370c5-770e-48df-b015-b39eb427f259 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.823 2 DEBUG nova.compute.manager [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Refreshing instance network info cache due to event network-changed-c3f370c5-770e-48df-b015-b39eb427f259. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.824 2 DEBUG oslo_concurrency.lockutils [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.824 2 DEBUG oslo_concurrency.lockutils [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.824 2 DEBUG nova.network.neutron [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Refreshing network info cache for port c3f370c5-770e-48df-b015-b39eb427f259 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.854 2 DEBUG nova.virt.libvirt.vif [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:01:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1571369270',display_name='tempest-LiveMigrationTest-server-1571369270',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1571369270',id=12,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:01:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3f2b3ac7d7504c9c96f0d4a67e0243c9',ramdisk_id='',reservation_id='r-ton1smkr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1876533760',owner_user_name='tempest-LiveMigrationTest-1876533760-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:01:47Z,user_data=None,user_id='efb31eeadee34403b1ab7a584f3616f7',uuid=e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c3f370c5-770e-48df-b015-b39eb427f259", "address": "fa:16:3e:02:ff:42", "network": {"id": "5bd66e63-9399-4ab1-bcda-a761f2c44b1d", "bridge": "br-int", "label": "tempest-LiveMigrationTest-205544999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f2b3ac7d7504c9c96f0d4a67e0243c9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc3f370c5-77", "ovs_interfaceid": "c3f370c5-770e-48df-b015-b39eb427f259", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.855 2 DEBUG nova.network.os_vif_util [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Converting VIF {"id": "c3f370c5-770e-48df-b015-b39eb427f259", "address": "fa:16:3e:02:ff:42", "network": {"id": "5bd66e63-9399-4ab1-bcda-a761f2c44b1d", "bridge": "br-int", "label": "tempest-LiveMigrationTest-205544999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f2b3ac7d7504c9c96f0d4a67e0243c9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc3f370c5-77", "ovs_interfaceid": "c3f370c5-770e-48df-b015-b39eb427f259", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.856 2 DEBUG nova.network.os_vif_util [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:ff:42,bridge_name='br-int',has_traffic_filtering=True,id=c3f370c5-770e-48df-b015-b39eb427f259,network=Network(5bd66e63-9399-4ab1-bcda-a761f2c44b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc3f370c5-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.857 2 DEBUG nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Updating guest XML with vif config: <interface type="ethernet">
Oct  2 08:01:54 np0005465988 nova_compute[236126]:  <mac address="fa:16:3e:02:ff:42"/>
Oct  2 08:01:54 np0005465988 nova_compute[236126]:  <model type="virtio"/>
Oct  2 08:01:54 np0005465988 nova_compute[236126]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:01:54 np0005465988 nova_compute[236126]:  <mtu size="1442"/>
Oct  2 08:01:54 np0005465988 nova_compute[236126]:  <target dev="tapc3f370c5-77"/>
Oct  2 08:01:54 np0005465988 nova_compute[236126]: </interface>
Oct  2 08:01:54 np0005465988 nova_compute[236126]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.857 2 DEBUG nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Oct  2 08:01:54 np0005465988 nova_compute[236126]: 2025-10-02 12:01:54.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:01:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1035003158' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:01:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:01:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1035003158' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:01:55 np0005465988 nova_compute[236126]: 2025-10-02 12:01:55.304 2 DEBUG nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:01:55 np0005465988 nova_compute[236126]: 2025-10-02 12:01:55.305 2 INFO nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Oct  2 08:01:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:55.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:55 np0005465988 nova_compute[236126]: 2025-10-02 12:01:55.396 2 INFO nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Oct  2 08:01:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:01:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:55.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:01:55 np0005465988 nova_compute[236126]: 2025-10-02 12:01:55.899 2 DEBUG nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:01:55 np0005465988 nova_compute[236126]: 2025-10-02 12:01:55.900 2 DEBUG nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:01:55 np0005465988 nova_compute[236126]: 2025-10-02 12:01:55.934 2 DEBUG nova.network.neutron [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Updated VIF entry in instance network info cache for port c3f370c5-770e-48df-b015-b39eb427f259. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:01:55 np0005465988 nova_compute[236126]: 2025-10-02 12:01:55.935 2 DEBUG nova.network.neutron [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Updating instance_info_cache with network_info: [{"id": "c3f370c5-770e-48df-b015-b39eb427f259", "address": "fa:16:3e:02:ff:42", "network": {"id": "5bd66e63-9399-4ab1-bcda-a761f2c44b1d", "bridge": "br-int", "label": "tempest-LiveMigrationTest-205544999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f2b3ac7d7504c9c96f0d4a67e0243c9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3f370c5-77", "ovs_interfaceid": "c3f370c5-770e-48df-b015-b39eb427f259", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:01:56 np0005465988 nova_compute[236126]: 2025-10-02 12:01:56.021 2 DEBUG oslo_concurrency.lockutils [req-e980b5ff-00fb-4ac9-a072-e11608f30dc8 req-2b4c5b8e-f3c0-490a-a83f-7176b634a9a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:01:56 np0005465988 nova_compute[236126]: 2025-10-02 12:01:56.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:56 np0005465988 nova_compute[236126]: 2025-10-02 12:01:56.402 2 DEBUG nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:01:56 np0005465988 nova_compute[236126]: 2025-10-02 12:01:56.402 2 DEBUG nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:01:56 np0005465988 nova_compute[236126]: 2025-10-02 12:01:56.905 2 DEBUG nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Current 50 elapsed 2 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:01:56 np0005465988 nova_compute[236126]: 2025-10-02 12:01:56.907 2 DEBUG nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:01:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:01:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:57.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:57 np0005465988 nova_compute[236126]: 2025-10-02 12:01:57.411 2 DEBUG nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Current 50 elapsed 2 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:01:57 np0005465988 nova_compute[236126]: 2025-10-02 12:01:57.411 2 DEBUG nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:01:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:57.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:57 np0005465988 nova_compute[236126]: 2025-10-02 12:01:57.914 2 DEBUG nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Current 50 elapsed 3 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct  2 08:01:57 np0005465988 nova_compute[236126]: 2025-10-02 12:01:57.915 2 DEBUG nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct  2 08:01:58 np0005465988 nova_compute[236126]: 2025-10-02 12:01:58.419 2 DEBUG nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Current 50 elapsed 3 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct  2 08:01:58 np0005465988 nova_compute[236126]: 2025-10-02 12:01:58.420 2 DEBUG nova.virt.libvirt.migration [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct  2 08:01:58 np0005465988 nova_compute[236126]: 2025-10-02 12:01:58.790 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406518.7895098, e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:01:58 np0005465988 nova_compute[236126]: 2025-10-02 12:01:58.791 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] VM Paused (Lifecycle Event)
Oct  2 08:01:58 np0005465988 nova_compute[236126]: 2025-10-02 12:01:58.824 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:01:58 np0005465988 nova_compute[236126]: 2025-10-02 12:01:58.828 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:01:58 np0005465988 nova_compute[236126]: 2025-10-02 12:01:58.854 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] During sync_power_state the instance has a pending task (migrating). Skip.
Oct  2 08:01:58 np0005465988 kernel: tapc3f370c5-77 (unregistering): left promiscuous mode
Oct  2 08:01:58 np0005465988 NetworkManager[45041]: <info>  [1759406518.9695] device (tapc3f370c5-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:01:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:58Z|00046|binding|INFO|Releasing lport c3f370c5-770e-48df-b015-b39eb427f259 from this chassis (sb_readonly=0)
Oct  2 08:01:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:58Z|00047|binding|INFO|Setting lport c3f370c5-770e-48df-b015-b39eb427f259 down in Southbound
Oct  2 08:01:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:58Z|00048|binding|INFO|Releasing lport 0fe5d3df-efa4-47f1-a32b-45858e68a8b9 from this chassis (sb_readonly=0)
Oct  2 08:01:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:58Z|00049|binding|INFO|Setting lport 0fe5d3df-efa4-47f1-a32b-45858e68a8b9 down in Southbound
Oct  2 08:01:58 np0005465988 nova_compute[236126]: 2025-10-02 12:01:58.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:01:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:58Z|00050|binding|INFO|Removing iface tapc3f370c5-77 ovn-installed in OVS
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.008 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:ff:42 10.100.0.6'], port_security=['fa:16:3e:02:ff:42 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '6718a9ec-e13c-42f0-978a-6c44c48d0d54'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-96076607', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bd66e63-9399-4ab1-bcda-a761f2c44b1d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-96076607', 'neutron:project_id': '3f2b3ac7d7504c9c96f0d4a67e0243c9', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c39f970f-1ead-4030-a775-b7ca9942094a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58eebdcc-6c12-4ff3-b6bc-0fe1fb3af6b6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=c3f370c5-770e-48df-b015-b39eb427f259) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:01:59 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:59Z|00051|binding|INFO|Releasing lport 5a25c40a-77b7-400c-afc3-f6cb920420cb from this chassis (sb_readonly=0)
Oct  2 08:01:59 np0005465988 ovn_controller[132601]: 2025-10-02T12:01:59Z|00052|binding|INFO|Releasing lport 2d320c71-91bf-44ba-b753-c9807cdc7fdb from this chassis (sb_readonly=0)
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.010 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:98:01 19.80.0.134'], port_security=['fa:16:3e:88:98:01 19.80.0.134'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['c3f370c5-770e-48df-b015-b39eb427f259'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-2131619737', 'neutron:cidrs': '19.80.0.134/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89bcb068-8337-43f0-9d5d-f27225e9a30d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-2131619737', 'neutron:project_id': '3f2b3ac7d7504c9c96f0d4a67e0243c9', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'c39f970f-1ead-4030-a775-b7ca9942094a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=b81908e7-ad92-4b22-83eb-83c6667bf780, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0fe5d3df-efa4-47f1-a32b-45858e68a8b9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.011 142124 INFO neutron.agent.ovn.metadata.agent [-] Port c3f370c5-770e-48df-b015-b39eb427f259 in datapath 5bd66e63-9399-4ab1-bcda-a761f2c44b1d unbound from our chassis
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.013 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5bd66e63-9399-4ab1-bcda-a761f2c44b1d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.014 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[149ffb85-67f3-4b79-a080-a93a49dd3710]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.015 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d namespace which is not needed anymore
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:01:59 np0005465988 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Oct  2 08:01:59 np0005465988 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000c.scope: Consumed 12.348s CPU time.
Oct  2 08:01:59 np0005465988 systemd-machined[192594]: Machine qemu-6-instance-0000000c terminated.
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:01:59 np0005465988 virtqemud[235689]: Unable to get XATTR trusted.libvirt.security.ref_selinux on e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk: No such file or directory
Oct  2 08:01:59 np0005465988 virtqemud[235689]: Unable to get XATTR trusted.libvirt.security.ref_dac on e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_disk: No such file or directory
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.147 2 DEBUG nova.virt.libvirt.guest [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.148 2 INFO nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Migration operation has completed
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.149 2 INFO nova.compute.manager [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] _post_live_migration() is started..
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.154 2 DEBUG nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.154 2 DEBUG nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.154 2 DEBUG nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Oct  2 08:01:59 np0005465988 neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d[243980]: [NOTICE]   (243985) : haproxy version is 2.8.14-c23fe91
Oct  2 08:01:59 np0005465988 neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d[243980]: [NOTICE]   (243985) : path to executable is /usr/sbin/haproxy
Oct  2 08:01:59 np0005465988 neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d[243980]: [WARNING]  (243985) : Exiting Master process...
Oct  2 08:01:59 np0005465988 neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d[243980]: [ALERT]    (243985) : Current worker (243987) exited with code 143 (Terminated)
Oct  2 08:01:59 np0005465988 neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d[243980]: [WARNING]  (243985) : All workers exited. Exiting... (0)
Oct  2 08:01:59 np0005465988 systemd[1]: libpod-ebf59ab21e734e661f9efa86196d05ad1ddcb3ea59bd3769905f006c5523dc2b.scope: Deactivated successfully.
Oct  2 08:01:59 np0005465988 podman[244191]: 2025-10-02 12:01:59.190420355 +0000 UTC m=+0.051775109 container died ebf59ab21e734e661f9efa86196d05ad1ddcb3ea59bd3769905f006c5523dc2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:01:59 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ebf59ab21e734e661f9efa86196d05ad1ddcb3ea59bd3769905f006c5523dc2b-userdata-shm.mount: Deactivated successfully.
Oct  2 08:01:59 np0005465988 systemd[1]: var-lib-containers-storage-overlay-5334b83451a49136a8ca81a01e89cab5d997344b9092959d5f4b7c2d4173919f-merged.mount: Deactivated successfully.
Oct  2 08:01:59 np0005465988 podman[244191]: 2025-10-02 12:01:59.231319637 +0000 UTC m=+0.092674381 container cleanup ebf59ab21e734e661f9efa86196d05ad1ddcb3ea59bd3769905f006c5523dc2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:01:59 np0005465988 systemd[1]: libpod-conmon-ebf59ab21e734e661f9efa86196d05ad1ddcb3ea59bd3769905f006c5523dc2b.scope: Deactivated successfully.
Oct  2 08:01:59 np0005465988 podman[244228]: 2025-10-02 12:01:59.306701838 +0000 UTC m=+0.051113810 container remove ebf59ab21e734e661f9efa86196d05ad1ddcb3ea59bd3769905f006c5523dc2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.313 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9f1e3df9-bf76-40a4-8374-b733ef781bab]: (4, ('Thu Oct  2 12:01:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d (ebf59ab21e734e661f9efa86196d05ad1ddcb3ea59bd3769905f006c5523dc2b)\nebf59ab21e734e661f9efa86196d05ad1ddcb3ea59bd3769905f006c5523dc2b\nThu Oct  2 12:01:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d (ebf59ab21e734e661f9efa86196d05ad1ddcb3ea59bd3769905f006c5523dc2b)\nebf59ab21e734e661f9efa86196d05ad1ddcb3ea59bd3769905f006c5523dc2b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.316 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6a47881e-50e6-4e8e-8957-de30ec23a031]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.318 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bd66e63-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:01:59 np0005465988 kernel: tap5bd66e63-90: left promiscuous mode
Oct  2 08:01:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:01:59.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.343 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[da814b38-4fa1-414c-bbb9-fb8a5b733306]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.373 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[79e4b842-250b-429f-817c-1b97212ceebd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.374 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fa793ac9-4f06-48f0-95a5-1f28992e3b9a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.392 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[10e2658e-aa79-486c-88a3-8c3f1a75d7f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456631, 'reachable_time': 39981, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244246, 'error': None, 'target': 'ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:59 np0005465988 systemd[1]: run-netns-ovnmeta\x2d5bd66e63\x2d9399\x2d4ab1\x2dbcda\x2da761f2c44b1d.mount: Deactivated successfully.
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.395 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5bd66e63-9399-4ab1-bcda-a761f2c44b1d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.395 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e52244-c82f-48dd-869e-d911635ddcf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.396 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 0fe5d3df-efa4-47f1-a32b-45858e68a8b9 in datapath 89bcb068-8337-43f0-9d5d-f27225e9a30d unbound from our chassis
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.398 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89bcb068-8337-43f0-9d5d-f27225e9a30d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.399 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1d18a153-1b87-40fb-9bf2-c21d2d220f9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.399 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d namespace which is not needed anymore
Oct  2 08:01:59 np0005465988 neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d[244076]: [NOTICE]   (244093) : haproxy version is 2.8.14-c23fe91
Oct  2 08:01:59 np0005465988 neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d[244076]: [NOTICE]   (244093) : path to executable is /usr/sbin/haproxy
Oct  2 08:01:59 np0005465988 neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d[244076]: [WARNING]  (244093) : Exiting Master process...
Oct  2 08:01:59 np0005465988 neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d[244076]: [ALERT]    (244093) : Current worker (244095) exited with code 143 (Terminated)
Oct  2 08:01:59 np0005465988 neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d[244076]: [WARNING]  (244093) : All workers exited. Exiting... (0)
Oct  2 08:01:59 np0005465988 systemd[1]: libpod-a6b77617f44461e4fcd2587f199d5076c37aa550825fc7ed8717040e6b52a91a.scope: Deactivated successfully.
Oct  2 08:01:59 np0005465988 podman[244263]: 2025-10-02 12:01:59.532756895 +0000 UTC m=+0.042623923 container died a6b77617f44461e4fcd2587f199d5076c37aa550825fc7ed8717040e6b52a91a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:01:59 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a6b77617f44461e4fcd2587f199d5076c37aa550825fc7ed8717040e6b52a91a-userdata-shm.mount: Deactivated successfully.
Oct  2 08:01:59 np0005465988 systemd[1]: var-lib-containers-storage-overlay-caafc3904607897f16face52aa7170039a46d470677cac73f8f83b22989f095e-merged.mount: Deactivated successfully.
Oct  2 08:01:59 np0005465988 podman[244263]: 2025-10-02 12:01:59.583781921 +0000 UTC m=+0.093648979 container cleanup a6b77617f44461e4fcd2587f199d5076c37aa550825fc7ed8717040e6b52a91a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.590 2 DEBUG nova.compute.manager [req-89f4b426-c4e9-48fc-add8-33fa0e2f7097 req-e8a33c91-84b7-4807-95aa-5bdffcd70cc9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received event network-vif-unplugged-c3f370c5-770e-48df-b015-b39eb427f259 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.590 2 DEBUG oslo_concurrency.lockutils [req-89f4b426-c4e9-48fc-add8-33fa0e2f7097 req-e8a33c91-84b7-4807-95aa-5bdffcd70cc9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.591 2 DEBUG oslo_concurrency.lockutils [req-89f4b426-c4e9-48fc-add8-33fa0e2f7097 req-e8a33c91-84b7-4807-95aa-5bdffcd70cc9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.591 2 DEBUG oslo_concurrency.lockutils [req-89f4b426-c4e9-48fc-add8-33fa0e2f7097 req-e8a33c91-84b7-4807-95aa-5bdffcd70cc9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.592 2 DEBUG nova.compute.manager [req-89f4b426-c4e9-48fc-add8-33fa0e2f7097 req-e8a33c91-84b7-4807-95aa-5bdffcd70cc9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] No waiting events found dispatching network-vif-unplugged-c3f370c5-770e-48df-b015-b39eb427f259 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.592 2 DEBUG nova.compute.manager [req-89f4b426-c4e9-48fc-add8-33fa0e2f7097 req-e8a33c91-84b7-4807-95aa-5bdffcd70cc9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received event network-vif-unplugged-c3f370c5-770e-48df-b015-b39eb427f259 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct  2 08:01:59 np0005465988 systemd[1]: libpod-conmon-a6b77617f44461e4fcd2587f199d5076c37aa550825fc7ed8717040e6b52a91a.scope: Deactivated successfully.
Oct  2 08:01:59 np0005465988 podman[244294]: 2025-10-02 12:01:59.680668443 +0000 UTC m=+0.058359179 container remove a6b77617f44461e4fcd2587f199d5076c37aa550825fc7ed8717040e6b52a91a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.687 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5a4fbe52-6849-442c-b4f1-8c44930bb0e2]: (4, ('Thu Oct  2 12:01:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d (a6b77617f44461e4fcd2587f199d5076c37aa550825fc7ed8717040e6b52a91a)\na6b77617f44461e4fcd2587f199d5076c37aa550825fc7ed8717040e6b52a91a\nThu Oct  2 12:01:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d (a6b77617f44461e4fcd2587f199d5076c37aa550825fc7ed8717040e6b52a91a)\na6b77617f44461e4fcd2587f199d5076c37aa550825fc7ed8717040e6b52a91a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.689 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ced52899-2ecb-4f31-b7dd-3c6522096414]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.690 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89bcb068-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:59 np0005465988 kernel: tap89bcb068-80: left promiscuous mode
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.707 2 DEBUG nova.virt.libvirt.driver [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.717 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4e760e2b-a6d2-4c5c-8a5f-9e77a6f63dc7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.748 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d801c5e7-cadb-4228-a91f-4ca908ffa5a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.749 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[74544150-743b-4fd5-8663-6abce89d642c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.774 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9057f41e-1c26-4d32-a38b-64e086d378ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456728, 'reachable_time': 38417, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244311, 'error': None, 'target': 'ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.777 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-89bcb068-8337-43f0-9d5d-f27225e9a30d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:01:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:01:59.777 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[80382246-87b9-4fc7-b90d-c827a1422569]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:01:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:01:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:01:59.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:01:59 np0005465988 nova_compute[236126]: 2025-10-02 12:01:59.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:00 np0005465988 systemd[1]: run-netns-ovnmeta\x2d89bcb068\x2d8337\x2d43f0\x2d9d5d\x2df27225e9a30d.mount: Deactivated successfully.
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.241 2 DEBUG nova.network.neutron [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Activated binding for port c3f370c5-770e-48df-b015-b39eb427f259 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.241 2 DEBUG nova.compute.manager [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "c3f370c5-770e-48df-b015-b39eb427f259", "address": "fa:16:3e:02:ff:42", "network": {"id": "5bd66e63-9399-4ab1-bcda-a761f2c44b1d", "bridge": "br-int", "label": "tempest-LiveMigrationTest-205544999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f2b3ac7d7504c9c96f0d4a67e0243c9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3f370c5-77", "ovs_interfaceid": "c3f370c5-770e-48df-b015-b39eb427f259", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.243 2 DEBUG nova.virt.libvirt.vif [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:01:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1571369270',display_name='tempest-LiveMigrationTest-server-1571369270',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1571369270',id=12,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:01:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3f2b3ac7d7504c9c96f0d4a67e0243c9',ramdisk_id='',reservation_id='r-ton1smkr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1876533760',owner_user_name='tempest-LiveMigrationTest-1876533760-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:01:50Z,user_data=None,user_id='efb31eeadee34403b1ab7a584f3616f7',uuid=e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c3f370c5-770e-48df-b015-b39eb427f259", "address": "fa:16:3e:02:ff:42", "network": {"id": "5bd66e63-9399-4ab1-bcda-a761f2c44b1d", "bridge": "br-int", "label": "tempest-LiveMigrationTest-205544999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f2b3ac7d7504c9c96f0d4a67e0243c9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3f370c5-77", "ovs_interfaceid": "c3f370c5-770e-48df-b015-b39eb427f259", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.244 2 DEBUG nova.network.os_vif_util [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Converting VIF {"id": "c3f370c5-770e-48df-b015-b39eb427f259", "address": "fa:16:3e:02:ff:42", "network": {"id": "5bd66e63-9399-4ab1-bcda-a761f2c44b1d", "bridge": "br-int", "label": "tempest-LiveMigrationTest-205544999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f2b3ac7d7504c9c96f0d4a67e0243c9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3f370c5-77", "ovs_interfaceid": "c3f370c5-770e-48df-b015-b39eb427f259", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.245 2 DEBUG nova.network.os_vif_util [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:ff:42,bridge_name='br-int',has_traffic_filtering=True,id=c3f370c5-770e-48df-b015-b39eb427f259,network=Network(5bd66e63-9399-4ab1-bcda-a761f2c44b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc3f370c5-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.246 2 DEBUG os_vif [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:ff:42,bridge_name='br-int',has_traffic_filtering=True,id=c3f370c5-770e-48df-b015-b39eb427f259,network=Network(5bd66e63-9399-4ab1-bcda-a761f2c44b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc3f370c5-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.249 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3f370c5-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.257 2 INFO os_vif [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:ff:42,bridge_name='br-int',has_traffic_filtering=True,id=c3f370c5-770e-48df-b015-b39eb427f259,network=Network(5bd66e63-9399-4ab1-bcda-a761f2c44b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc3f370c5-77')#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.258 2 DEBUG oslo_concurrency.lockutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.258 2 DEBUG oslo_concurrency.lockutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.259 2 DEBUG oslo_concurrency.lockutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.259 2 DEBUG nova.compute.manager [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.260 2 INFO nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Deleting instance files /var/lib/nova/instances/e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_del#033[00m
Oct  2 08:02:00 np0005465988 nova_compute[236126]: 2025-10-02 12:02:00.261 2 INFO nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Deletion of /var/lib/nova/instances/e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7_del complete#033[00m
Oct  2 08:02:00 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Oct  2 08:02:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:01.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:01 np0005465988 nova_compute[236126]: 2025-10-02 12:02:01.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:01.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:02 np0005465988 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Oct  2 08:02:02 np0005465988 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000d.scope: Consumed 14.389s CPU time.
Oct  2 08:02:02 np0005465988 systemd-machined[192594]: Machine qemu-5-instance-0000000d terminated.
Oct  2 08:02:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.521 2 DEBUG nova.compute.manager [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received event network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.522 2 DEBUG oslo_concurrency.lockutils [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.522 2 DEBUG oslo_concurrency.lockutils [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.522 2 DEBUG oslo_concurrency.lockutils [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.523 2 DEBUG nova.compute.manager [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] No waiting events found dispatching network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.523 2 WARNING nova.compute.manager [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received unexpected event network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.523 2 DEBUG nova.compute.manager [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received event network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.523 2 DEBUG oslo_concurrency.lockutils [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.523 2 DEBUG oslo_concurrency.lockutils [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.524 2 DEBUG oslo_concurrency.lockutils [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.524 2 DEBUG nova.compute.manager [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] No waiting events found dispatching network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.524 2 WARNING nova.compute.manager [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received unexpected event network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.524 2 DEBUG nova.compute.manager [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received event network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.524 2 DEBUG oslo_concurrency.lockutils [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.524 2 DEBUG oslo_concurrency.lockutils [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.525 2 DEBUG oslo_concurrency.lockutils [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.525 2 DEBUG nova.compute.manager [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] No waiting events found dispatching network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.525 2 WARNING nova.compute.manager [req-9aeec47f-236d-49bf-8a1e-8bc827affe85 req-ebcefce1-fcef-43ac-9ee1-ac05c0663169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Received unexpected event network-vif-plugged-c3f370c5-770e-48df-b015-b39eb427f259 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.723 2 INFO nova.virt.libvirt.driver [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.730 2 INFO nova.virt.libvirt.driver [-] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Instance destroyed successfully.#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.734 2 DEBUG nova.virt.libvirt.driver [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] skipping disk for instance-0000000d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.735 2 DEBUG nova.virt.libvirt.driver [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] skipping disk for instance-0000000d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.844 2 DEBUG oslo_concurrency.lockutils [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Acquiring lock "b6a5f3ca-c662-41a0-ac02-78f9fba82bba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.845 2 DEBUG oslo_concurrency.lockutils [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Lock "b6a5f3ca-c662-41a0-ac02-78f9fba82bba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:02 np0005465988 nova_compute[236126]: 2025-10-02 12:02:02.845 2 DEBUG oslo_concurrency.lockutils [None req-efbbd1d4-f09b-489a-b37f-706f6984cc69 4b9a52c4eb834c90b054f480d8de237c 453cdd07db3b4f158d23674cf6281d92 - - default default] Lock "b6a5f3ca-c662-41a0-ac02-78f9fba82bba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:03.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:03 np0005465988 podman[244319]: 2025-10-02 12:02:03.531070188 +0000 UTC m=+0.056779623 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:02:03 np0005465988 podman[244318]: 2025-10-02 12:02:03.541608223 +0000 UTC m=+0.067861374 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:02:03 np0005465988 podman[244317]: 2025-10-02 12:02:03.562111346 +0000 UTC m=+0.094597607 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:02:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:03.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e147 e147: 3 total, 3 up, 3 in
Oct  2 08:02:05 np0005465988 nova_compute[236126]: 2025-10-02 12:02:05.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:05.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:05.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:06 np0005465988 nova_compute[236126]: 2025-10-02 12:02:06.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:02:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:07.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:07.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:08 np0005465988 nova_compute[236126]: 2025-10-02 12:02:08.726 2 DEBUG oslo_concurrency.lockutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Acquiring lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:08 np0005465988 nova_compute[236126]: 2025-10-02 12:02:08.726 2 DEBUG oslo_concurrency.lockutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:08 np0005465988 nova_compute[236126]: 2025-10-02 12:02:08.727 2 DEBUG oslo_concurrency.lockutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Lock "e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:08 np0005465988 nova_compute[236126]: 2025-10-02 12:02:08.750 2 DEBUG oslo_concurrency.lockutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:08 np0005465988 nova_compute[236126]: 2025-10-02 12:02:08.750 2 DEBUG oslo_concurrency.lockutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:08 np0005465988 nova_compute[236126]: 2025-10-02 12:02:08.751 2 DEBUG oslo_concurrency.lockutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:08 np0005465988 nova_compute[236126]: 2025-10-02 12:02:08.751 2 DEBUG nova.compute.resource_tracker [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:02:08 np0005465988 nova_compute[236126]: 2025-10-02 12:02:08.751 2 DEBUG oslo_concurrency.processutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:02:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1466240660' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.174 2 DEBUG oslo_concurrency.processutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.272 2 DEBUG nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.273 2 DEBUG nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.278 2 DEBUG nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] skipping disk for instance-0000000d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.279 2 DEBUG nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] skipping disk for instance-0000000d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:02:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:09.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.450 2 WARNING nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.451 2 DEBUG nova.compute.resource_tracker [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4734MB free_disk=20.760841369628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.451 2 DEBUG oslo_concurrency.lockutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.452 2 DEBUG oslo_concurrency.lockutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.537 2 DEBUG nova.compute.resource_tracker [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Migration for instance b6a5f3ca-c662-41a0-ac02-78f9fba82bba refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.538 2 DEBUG nova.compute.resource_tracker [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Migration for instance e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.597 2 DEBUG nova.compute.resource_tracker [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.598 2 INFO nova.compute.resource_tracker [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Updating resource usage from migration b6b47536-f923-491d-aa49-8278d9a20c83#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.598 2 DEBUG nova.compute.resource_tracker [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Starting to track outgoing migration b6b47536-f923-491d-aa49-8278d9a20c83 with flavor cef129e5-cce4-4465-9674-03d3559e8a14 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.643 2 DEBUG nova.compute.resource_tracker [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Instance b8d4207f-7e3b-4a3c-ad76-60d87d695918 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.644 2 DEBUG nova.compute.resource_tracker [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Migration b6b47536-f923-491d-aa49-8278d9a20c83 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.645 2 DEBUG nova.compute.resource_tracker [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Migration 33813b6a-c36c-41a1-8dae-64be3d05308f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.645 2 DEBUG nova.compute.resource_tracker [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.645 2 DEBUG nova.compute.resource_tracker [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:02:09 np0005465988 nova_compute[236126]: 2025-10-02 12:02:09.722 2 DEBUG oslo_concurrency.processutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:02:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:09.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:02:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:02:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1107844131' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:02:10 np0005465988 nova_compute[236126]: 2025-10-02 12:02:10.179 2 DEBUG oslo_concurrency.processutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:10 np0005465988 nova_compute[236126]: 2025-10-02 12:02:10.193 2 DEBUG nova.compute.provider_tree [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:02:10 np0005465988 nova_compute[236126]: 2025-10-02 12:02:10.208 2 DEBUG nova.scheduler.client.report [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:02:10 np0005465988 nova_compute[236126]: 2025-10-02 12:02:10.246 2 DEBUG nova.compute.resource_tracker [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:02:10 np0005465988 nova_compute[236126]: 2025-10-02 12:02:10.247 2 DEBUG oslo_concurrency.lockutils [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:10 np0005465988 nova_compute[236126]: 2025-10-02 12:02:10.255 2 INFO nova.compute.manager [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Oct  2 08:02:10 np0005465988 nova_compute[236126]: 2025-10-02 12:02:10.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:10 np0005465988 nova_compute[236126]: 2025-10-02 12:02:10.338 2 INFO nova.compute.manager [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Swapping old allocation on dict_keys(['5abd2871-a992-42ab-bb6a-594a92f77d4d']) held by migration b6b47536-f923-491d-aa49-8278d9a20c83 for instance#033[00m
Oct  2 08:02:10 np0005465988 nova_compute[236126]: 2025-10-02 12:02:10.395 2 DEBUG nova.scheduler.client.report [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Overwriting current allocation {'allocations': {'abc4c1e4-e97d-4065-b850-e0065c8f9ab7': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 14}}, 'project_id': 'f1ce36070fb047479c3a083f36733f63', 'user_id': '1a06819bf8cc4ff7bccbbb2616ff2d21', 'consumer_generation': 1} on consumer b6a5f3ca-c662-41a0-ac02-78f9fba82bba move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Oct  2 08:02:10 np0005465988 nova_compute[236126]: 2025-10-02 12:02:10.407 2 INFO nova.scheduler.client.report [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] Deleted allocation for migration 33813b6a-c36c-41a1-8dae-64be3d05308f#033[00m
Oct  2 08:02:10 np0005465988 nova_compute[236126]: 2025-10-02 12:02:10.407 2 DEBUG nova.virt.libvirt.driver [None req-24f8d1fd-a587-4259-a5b4-1cfd1c3a09fd c087f94b944f499086fc90cd7a8c6ed2 a45f15d82e6e4e95a422d53281506176 - - default default] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Oct  2 08:02:10 np0005465988 nova_compute[236126]: 2025-10-02 12:02:10.590 2 DEBUG oslo_concurrency.lockutils [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "refresh_cache-b6a5f3ca-c662-41a0-ac02-78f9fba82bba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:02:10 np0005465988 nova_compute[236126]: 2025-10-02 12:02:10.590 2 DEBUG oslo_concurrency.lockutils [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquired lock "refresh_cache-b6a5f3ca-c662-41a0-ac02-78f9fba82bba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:02:10 np0005465988 nova_compute[236126]: 2025-10-02 12:02:10.591 2 DEBUG nova.network.neutron [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:02:11 np0005465988 nova_compute[236126]: 2025-10-02 12:02:11.297 2 DEBUG nova.network.neutron [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:02:11 np0005465988 nova_compute[236126]: 2025-10-02 12:02:11.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:11.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:11 np0005465988 nova_compute[236126]: 2025-10-02 12:02:11.621 2 DEBUG nova.network.neutron [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:02:11 np0005465988 nova_compute[236126]: 2025-10-02 12:02:11.643 2 DEBUG oslo_concurrency.lockutils [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Releasing lock "refresh_cache-b6a5f3ca-c662-41a0-ac02-78f9fba82bba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:02:11 np0005465988 nova_compute[236126]: 2025-10-02 12:02:11.644 2 DEBUG nova.virt.libvirt.driver [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Oct  2 08:02:11 np0005465988 nova_compute[236126]: 2025-10-02 12:02:11.728 2 DEBUG nova.storage.rbd_utils [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] rolling back rbd image(b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505#033[00m
Oct  2 08:02:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:11.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:11 np0005465988 nova_compute[236126]: 2025-10-02 12:02:11.971 2 DEBUG nova.storage.rbd_utils [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] removing snapshot(nova-resize) on rbd image(b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:02:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:02:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e148 e148: 3 total, 3 up, 3 in
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.594 2 DEBUG nova.virt.libvirt.driver [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.598 2 WARNING nova.virt.libvirt.driver [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.602 2 DEBUG nova.virt.libvirt.host [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.603 2 DEBUG nova.virt.libvirt.host [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.606 2 DEBUG nova.virt.libvirt.host [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.607 2 DEBUG nova.virt.libvirt.host [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.608 2 DEBUG nova.virt.libvirt.driver [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.608 2 DEBUG nova.virt.hardware [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.609 2 DEBUG nova.virt.hardware [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.609 2 DEBUG nova.virt.hardware [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.609 2 DEBUG nova.virt.hardware [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.609 2 DEBUG nova.virt.hardware [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.609 2 DEBUG nova.virt.hardware [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.610 2 DEBUG nova.virt.hardware [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.610 2 DEBUG nova.virt.hardware [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.610 2 DEBUG nova.virt.hardware [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.610 2 DEBUG nova.virt.hardware [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.610 2 DEBUG nova.virt.hardware [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.611 2 DEBUG nova.objects.instance [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b6a5f3ca-c662-41a0-ac02-78f9fba82bba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:02:12 np0005465988 nova_compute[236126]: 2025-10-02 12:02:12.627 2 DEBUG oslo_concurrency.processutils [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:02:13 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3648740956' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:02:13 np0005465988 nova_compute[236126]: 2025-10-02 12:02:13.067 2 DEBUG oslo_concurrency.processutils [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:13 np0005465988 nova_compute[236126]: 2025-10-02 12:02:13.141 2 DEBUG oslo_concurrency.processutils [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:13.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:02:13 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3056966603' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:02:13 np0005465988 nova_compute[236126]: 2025-10-02 12:02:13.677 2 DEBUG oslo_concurrency.processutils [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.536s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:13 np0005465988 nova_compute[236126]: 2025-10-02 12:02:13.683 2 DEBUG nova.virt.libvirt.driver [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  <uuid>b6a5f3ca-c662-41a0-ac02-78f9fba82bba</uuid>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  <name>instance-0000000d</name>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <nova:name>tempest-MigrationsAdminTest-server-1708097480</nova:name>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:02:12</nova:creationTime>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <nova:user uuid="1a06819bf8cc4ff7bccbbb2616ff2d21">tempest-MigrationsAdminTest-819597356-project-member</nova:user>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <nova:project uuid="f1ce36070fb047479c3a083f36733f63">tempest-MigrationsAdminTest-819597356</nova:project>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <nova:ports/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <entry name="serial">b6a5f3ca-c662-41a0-ac02-78f9fba82bba</entry>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <entry name="uuid">b6a5f3ca-c662-41a0-ac02-78f9fba82bba</entry>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/b6a5f3ca-c662-41a0-ac02-78f9fba82bba_disk.config">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba/console.log" append="off"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <input type="keyboard" bus="usb"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:02:13 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:02:13 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:02:13 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:02:13 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:02:13 np0005465988 systemd-machined[192594]: New machine qemu-7-instance-0000000d.
Oct  2 08:02:13 np0005465988 systemd[1]: Started Virtual Machine qemu-7-instance-0000000d.
Oct  2 08:02:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:13.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:14 np0005465988 nova_compute[236126]: 2025-10-02 12:02:14.148 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406519.1466498, e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:02:14 np0005465988 nova_compute[236126]: 2025-10-02 12:02:14.149 2 INFO nova.compute.manager [-] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:02:14 np0005465988 nova_compute[236126]: 2025-10-02 12:02:14.178 2 DEBUG nova.compute.manager [None req-836d1ee8-0ad4-4bb0-ad73-664277cb5ace - - - - - -] [instance: e2d0d6b1-09f1-478b-8bd6-4159ee5c2bf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:02:14 np0005465988 nova_compute[236126]: 2025-10-02 12:02:14.898 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for b6a5f3ca-c662-41a0-ac02-78f9fba82bba due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:02:14 np0005465988 nova_compute[236126]: 2025-10-02 12:02:14.899 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406534.8981304, b6a5f3ca-c662-41a0-ac02-78f9fba82bba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:02:14 np0005465988 nova_compute[236126]: 2025-10-02 12:02:14.900 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:02:14 np0005465988 nova_compute[236126]: 2025-10-02 12:02:14.901 2 DEBUG nova.compute.manager [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:02:14 np0005465988 nova_compute[236126]: 2025-10-02 12:02:14.904 2 INFO nova.virt.libvirt.driver [-] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Instance running successfully.#033[00m
Oct  2 08:02:14 np0005465988 nova_compute[236126]: 2025-10-02 12:02:14.904 2 DEBUG nova.virt.libvirt.driver [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Oct  2 08:02:14 np0005465988 nova_compute[236126]: 2025-10-02 12:02:14.934 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:02:14 np0005465988 nova_compute[236126]: 2025-10-02 12:02:14.938 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:02:14 np0005465988 nova_compute[236126]: 2025-10-02 12:02:14.968 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Oct  2 08:02:14 np0005465988 nova_compute[236126]: 2025-10-02 12:02:14.968 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406534.8993943, b6a5f3ca-c662-41a0-ac02-78f9fba82bba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:02:14 np0005465988 nova_compute[236126]: 2025-10-02 12:02:14.969 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] VM Started (Lifecycle Event)#033[00m
Oct  2 08:02:15 np0005465988 nova_compute[236126]: 2025-10-02 12:02:15.007 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:02:15 np0005465988 nova_compute[236126]: 2025-10-02 12:02:15.013 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:02:15 np0005465988 nova_compute[236126]: 2025-10-02 12:02:15.030 2 INFO nova.compute.manager [None req-a91bc5ff-6405-448e-86b6-50c6130b3dbf 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Updating instance to original state: 'active'#033[00m
Oct  2 08:02:15 np0005465988 nova_compute[236126]: 2025-10-02 12:02:15.046 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Oct  2 08:02:15 np0005465988 nova_compute[236126]: 2025-10-02 12:02:15.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:15.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:15.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:16 np0005465988 nova_compute[236126]: 2025-10-02 12:02:16.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:16 np0005465988 nova_compute[236126]: 2025-10-02 12:02:16.812 2 DEBUG oslo_concurrency.lockutils [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "b6a5f3ca-c662-41a0-ac02-78f9fba82bba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:16 np0005465988 nova_compute[236126]: 2025-10-02 12:02:16.812 2 DEBUG oslo_concurrency.lockutils [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b6a5f3ca-c662-41a0-ac02-78f9fba82bba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:16 np0005465988 nova_compute[236126]: 2025-10-02 12:02:16.813 2 DEBUG oslo_concurrency.lockutils [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "b6a5f3ca-c662-41a0-ac02-78f9fba82bba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:16 np0005465988 nova_compute[236126]: 2025-10-02 12:02:16.813 2 DEBUG oslo_concurrency.lockutils [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b6a5f3ca-c662-41a0-ac02-78f9fba82bba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:16 np0005465988 nova_compute[236126]: 2025-10-02 12:02:16.813 2 DEBUG oslo_concurrency.lockutils [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b6a5f3ca-c662-41a0-ac02-78f9fba82bba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:16 np0005465988 nova_compute[236126]: 2025-10-02 12:02:16.815 2 INFO nova.compute.manager [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Terminating instance#033[00m
Oct  2 08:02:16 np0005465988 nova_compute[236126]: 2025-10-02 12:02:16.816 2 DEBUG oslo_concurrency.lockutils [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "refresh_cache-b6a5f3ca-c662-41a0-ac02-78f9fba82bba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:02:16 np0005465988 nova_compute[236126]: 2025-10-02 12:02:16.816 2 DEBUG oslo_concurrency.lockutils [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquired lock "refresh_cache-b6a5f3ca-c662-41a0-ac02-78f9fba82bba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:02:16 np0005465988 nova_compute[236126]: 2025-10-02 12:02:16.817 2 DEBUG nova.network.neutron [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:02:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:02:17 np0005465988 nova_compute[236126]: 2025-10-02 12:02:17.303 2 DEBUG nova.network.neutron [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:02:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:17.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:17 np0005465988 nova_compute[236126]: 2025-10-02 12:02:17.692 2 DEBUG nova.network.neutron [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:02:17 np0005465988 nova_compute[236126]: 2025-10-02 12:02:17.711 2 DEBUG oslo_concurrency.lockutils [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Releasing lock "refresh_cache-b6a5f3ca-c662-41a0-ac02-78f9fba82bba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:02:17 np0005465988 nova_compute[236126]: 2025-10-02 12:02:17.711 2 DEBUG nova.compute.manager [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:02:17 np0005465988 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Oct  2 08:02:17 np0005465988 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000d.scope: Consumed 4.062s CPU time.
Oct  2 08:02:17 np0005465988 systemd-machined[192594]: Machine qemu-7-instance-0000000d terminated.
Oct  2 08:02:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:17.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:17 np0005465988 podman[244660]: 2025-10-02 12:02:17.919145989 +0000 UTC m=+0.088001906 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:02:17 np0005465988 nova_compute[236126]: 2025-10-02 12:02:17.931 2 INFO nova.virt.libvirt.driver [-] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Instance destroyed successfully.#033[00m
Oct  2 08:02:17 np0005465988 nova_compute[236126]: 2025-10-02 12:02:17.932 2 DEBUG nova.objects.instance [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lazy-loading 'resources' on Instance uuid b6a5f3ca-c662-41a0-ac02-78f9fba82bba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:02:18 np0005465988 nova_compute[236126]: 2025-10-02 12:02:18.631 2 INFO nova.virt.libvirt.driver [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Deleting instance files /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba_del#033[00m
Oct  2 08:02:18 np0005465988 nova_compute[236126]: 2025-10-02 12:02:18.632 2 INFO nova.virt.libvirt.driver [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Deletion of /var/lib/nova/instances/b6a5f3ca-c662-41a0-ac02-78f9fba82bba_del complete#033[00m
Oct  2 08:02:18 np0005465988 nova_compute[236126]: 2025-10-02 12:02:18.706 2 INFO nova.compute.manager [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Took 0.99 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:02:18 np0005465988 nova_compute[236126]: 2025-10-02 12:02:18.707 2 DEBUG oslo.service.loopingcall [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:02:18 np0005465988 nova_compute[236126]: 2025-10-02 12:02:18.707 2 DEBUG nova.compute.manager [-] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:02:18 np0005465988 nova_compute[236126]: 2025-10-02 12:02:18.708 2 DEBUG nova.network.neutron [-] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:02:18 np0005465988 nova_compute[236126]: 2025-10-02 12:02:18.873 2 DEBUG nova.network.neutron [-] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:02:18 np0005465988 nova_compute[236126]: 2025-10-02 12:02:18.898 2 DEBUG nova.network.neutron [-] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:02:18 np0005465988 nova_compute[236126]: 2025-10-02 12:02:18.960 2 INFO nova.compute.manager [-] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Took 0.25 seconds to deallocate network for instance.#033[00m
Oct  2 08:02:19 np0005465988 nova_compute[236126]: 2025-10-02 12:02:19.019 2 DEBUG oslo_concurrency.lockutils [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:19 np0005465988 nova_compute[236126]: 2025-10-02 12:02:19.019 2 DEBUG oslo_concurrency.lockutils [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:19 np0005465988 nova_compute[236126]: 2025-10-02 12:02:19.025 2 DEBUG oslo_concurrency.lockutils [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:19 np0005465988 nova_compute[236126]: 2025-10-02 12:02:19.071 2 INFO nova.scheduler.client.report [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Deleted allocations for instance b6a5f3ca-c662-41a0-ac02-78f9fba82bba#033[00m
Oct  2 08:02:19 np0005465988 nova_compute[236126]: 2025-10-02 12:02:19.153 2 DEBUG oslo_concurrency.lockutils [None req-ce5016a9-3c56-40a9-ae51-ef2fb3962e38 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b6a5f3ca-c662-41a0-ac02-78f9fba82bba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.340s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:19.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:19.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:20 np0005465988 nova_compute[236126]: 2025-10-02 12:02:20.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:20 np0005465988 nova_compute[236126]: 2025-10-02 12:02:20.319 2 DEBUG oslo_concurrency.lockutils [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "b8d4207f-7e3b-4a3c-ad76-60d87d695918" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:20 np0005465988 nova_compute[236126]: 2025-10-02 12:02:20.320 2 DEBUG oslo_concurrency.lockutils [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b8d4207f-7e3b-4a3c-ad76-60d87d695918" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:20 np0005465988 nova_compute[236126]: 2025-10-02 12:02:20.320 2 DEBUG oslo_concurrency.lockutils [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "b8d4207f-7e3b-4a3c-ad76-60d87d695918-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:20 np0005465988 nova_compute[236126]: 2025-10-02 12:02:20.321 2 DEBUG oslo_concurrency.lockutils [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b8d4207f-7e3b-4a3c-ad76-60d87d695918-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:20 np0005465988 nova_compute[236126]: 2025-10-02 12:02:20.321 2 DEBUG oslo_concurrency.lockutils [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b8d4207f-7e3b-4a3c-ad76-60d87d695918-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:20 np0005465988 nova_compute[236126]: 2025-10-02 12:02:20.323 2 INFO nova.compute.manager [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Terminating instance#033[00m
Oct  2 08:02:20 np0005465988 nova_compute[236126]: 2025-10-02 12:02:20.324 2 DEBUG oslo_concurrency.lockutils [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "refresh_cache-b8d4207f-7e3b-4a3c-ad76-60d87d695918" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:02:20 np0005465988 nova_compute[236126]: 2025-10-02 12:02:20.325 2 DEBUG oslo_concurrency.lockutils [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquired lock "refresh_cache-b8d4207f-7e3b-4a3c-ad76-60d87d695918" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:02:20 np0005465988 nova_compute[236126]: 2025-10-02 12:02:20.325 2 DEBUG nova.network.neutron [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:02:20 np0005465988 nova_compute[236126]: 2025-10-02 12:02:20.526 2 DEBUG nova.network.neutron [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:02:20 np0005465988 nova_compute[236126]: 2025-10-02 12:02:20.833 2 DEBUG nova.network.neutron [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:02:20 np0005465988 nova_compute[236126]: 2025-10-02 12:02:20.851 2 DEBUG oslo_concurrency.lockutils [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Releasing lock "refresh_cache-b8d4207f-7e3b-4a3c-ad76-60d87d695918" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:02:20 np0005465988 nova_compute[236126]: 2025-10-02 12:02:20.851 2 DEBUG nova.compute.manager [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:02:20 np0005465988 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Oct  2 08:02:20 np0005465988 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Consumed 14.843s CPU time.
Oct  2 08:02:20 np0005465988 systemd-machined[192594]: Machine qemu-4-instance-0000000a terminated.
Oct  2 08:02:21 np0005465988 nova_compute[236126]: 2025-10-02 12:02:21.080 2 INFO nova.virt.libvirt.driver [-] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Instance destroyed successfully.#033[00m
Oct  2 08:02:21 np0005465988 nova_compute[236126]: 2025-10-02 12:02:21.080 2 DEBUG nova.objects.instance [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lazy-loading 'resources' on Instance uuid b8d4207f-7e3b-4a3c-ad76-60d87d695918 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:02:21 np0005465988 nova_compute[236126]: 2025-10-02 12:02:21.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:21.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:21 np0005465988 nova_compute[236126]: 2025-10-02 12:02:21.841 2 INFO nova.virt.libvirt.driver [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Deleting instance files /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918_del#033[00m
Oct  2 08:02:21 np0005465988 nova_compute[236126]: 2025-10-02 12:02:21.842 2 INFO nova.virt.libvirt.driver [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Deletion of /var/lib/nova/instances/b8d4207f-7e3b-4a3c-ad76-60d87d695918_del complete#033[00m
Oct  2 08:02:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:21.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:21 np0005465988 nova_compute[236126]: 2025-10-02 12:02:21.901 2 INFO nova.compute.manager [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Took 1.05 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:02:21 np0005465988 nova_compute[236126]: 2025-10-02 12:02:21.901 2 DEBUG oslo.service.loopingcall [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:02:21 np0005465988 nova_compute[236126]: 2025-10-02 12:02:21.902 2 DEBUG nova.compute.manager [-] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:02:21 np0005465988 nova_compute[236126]: 2025-10-02 12:02:21.903 2 DEBUG nova.network.neutron [-] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:02:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:02:22 np0005465988 nova_compute[236126]: 2025-10-02 12:02:22.220 2 DEBUG nova.network.neutron [-] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:02:22 np0005465988 nova_compute[236126]: 2025-10-02 12:02:22.260 2 DEBUG nova.network.neutron [-] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:02:22 np0005465988 nova_compute[236126]: 2025-10-02 12:02:22.299 2 INFO nova.compute.manager [-] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Took 0.40 seconds to deallocate network for instance.#033[00m
Oct  2 08:02:22 np0005465988 nova_compute[236126]: 2025-10-02 12:02:22.380 2 DEBUG oslo_concurrency.lockutils [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:22 np0005465988 nova_compute[236126]: 2025-10-02 12:02:22.381 2 DEBUG oslo_concurrency.lockutils [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:22 np0005465988 nova_compute[236126]: 2025-10-02 12:02:22.435 2 DEBUG oslo_concurrency.processutils [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:02:22 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2809225506' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:02:22 np0005465988 nova_compute[236126]: 2025-10-02 12:02:22.901 2 DEBUG oslo_concurrency.processutils [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:22 np0005465988 nova_compute[236126]: 2025-10-02 12:02:22.908 2 DEBUG nova.compute.provider_tree [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:02:22 np0005465988 nova_compute[236126]: 2025-10-02 12:02:22.936 2 DEBUG nova.scheduler.client.report [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:02:22 np0005465988 nova_compute[236126]: 2025-10-02 12:02:22.958 2 DEBUG oslo_concurrency.lockutils [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:22 np0005465988 nova_compute[236126]: 2025-10-02 12:02:22.984 2 INFO nova.scheduler.client.report [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Deleted allocations for instance b8d4207f-7e3b-4a3c-ad76-60d87d695918#033[00m
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.031247) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406543031293, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2289, "num_deletes": 254, "total_data_size": 4948547, "memory_usage": 5026672, "flush_reason": "Manual Compaction"}
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406543045054, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 3248815, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23220, "largest_seqno": 25504, "table_properties": {"data_size": 3239813, "index_size": 5496, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19953, "raw_average_key_size": 20, "raw_value_size": 3221253, "raw_average_value_size": 3334, "num_data_blocks": 242, "num_entries": 966, "num_filter_entries": 966, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759406368, "oldest_key_time": 1759406368, "file_creation_time": 1759406543, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 13902 microseconds, and 7572 cpu microseconds.
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.045102) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 3248815 bytes OK
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.045167) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.046627) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.046658) EVENT_LOG_v1 {"time_micros": 1759406543046636, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.046674) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 4938438, prev total WAL file size 4938438, number of live WAL files 2.
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.048097) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(3172KB)], [48(7639KB)]
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406543048169, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 11071642, "oldest_snapshot_seqno": -1}
Oct  2 08:02:23 np0005465988 nova_compute[236126]: 2025-10-02 12:02:23.069 2 DEBUG oslo_concurrency.lockutils [None req-318c254f-3bac-4489-8d1e-6605b06d30fc 1a06819bf8cc4ff7bccbbb2616ff2d21 f1ce36070fb047479c3a083f36733f63 - - default default] Lock "b8d4207f-7e3b-4a3c-ad76-60d87d695918" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.750s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 4879 keys, 9040510 bytes, temperature: kUnknown
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406543102718, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 9040510, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9006872, "index_size": 20344, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 123236, "raw_average_key_size": 25, "raw_value_size": 8917636, "raw_average_value_size": 1827, "num_data_blocks": 833, "num_entries": 4879, "num_filter_entries": 4879, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759406543, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.102995) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 9040510 bytes
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.104538) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 202.7 rd, 165.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 7.5 +0.0 blob) out(8.6 +0.0 blob), read-write-amplify(6.2) write-amplify(2.8) OK, records in: 5404, records dropped: 525 output_compression: NoCompression
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.104570) EVENT_LOG_v1 {"time_micros": 1759406543104557, "job": 28, "event": "compaction_finished", "compaction_time_micros": 54627, "compaction_time_cpu_micros": 36865, "output_level": 6, "num_output_files": 1, "total_output_size": 9040510, "num_input_records": 5404, "num_output_records": 4879, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406543105831, "job": 28, "event": "table_file_deletion", "file_number": 50}
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406543108666, "job": 28, "event": "table_file_deletion", "file_number": 48}
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.047918) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.108749) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.108756) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.108759) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.108762) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:02:23.108765) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:02:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 e149: 3 total, 3 up, 3 in
Oct  2 08:02:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:23.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:23.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:25 np0005465988 nova_compute[236126]: 2025-10-02 12:02:25.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:25.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:02:25 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/247468690' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:02:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:25.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:26 np0005465988 nova_compute[236126]: 2025-10-02 12:02:26.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:02:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:02:27.329 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:02:27.329 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:02:27.330 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:27.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:27.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:28 np0005465988 nova_compute[236126]: 2025-10-02 12:02:28.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:29.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:29 np0005465988 nova_compute[236126]: 2025-10-02 12:02:29.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:29 np0005465988 nova_compute[236126]: 2025-10-02 12:02:29.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:02:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:02:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:29.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:02:30 np0005465988 nova_compute[236126]: 2025-10-02 12:02:30.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:30 np0005465988 nova_compute[236126]: 2025-10-02 12:02:30.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:30 np0005465988 nova_compute[236126]: 2025-10-02 12:02:30.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:30 np0005465988 nova_compute[236126]: 2025-10-02 12:02:30.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:30 np0005465988 nova_compute[236126]: 2025-10-02 12:02:30.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:30 np0005465988 nova_compute[236126]: 2025-10-02 12:02:30.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:30 np0005465988 nova_compute[236126]: 2025-10-02 12:02:30.546 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:30 np0005465988 nova_compute[236126]: 2025-10-02 12:02:30.547 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:30 np0005465988 nova_compute[236126]: 2025-10-02 12:02:30.547 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:30 np0005465988 nova_compute[236126]: 2025-10-02 12:02:30.547 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:02:30 np0005465988 nova_compute[236126]: 2025-10-02 12:02:30.548 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:02:30.716 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:02:30 np0005465988 nova_compute[236126]: 2025-10-02 12:02:30.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:02:30.719 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:02:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:02:30 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2871346515' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:02:30 np0005465988 nova_compute[236126]: 2025-10-02 12:02:30.978 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.128 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.129 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4899MB free_disk=20.82571029663086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.130 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.130 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.275 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.276 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.297 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:02:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:31.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:02:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:02:31 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2568248138' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.707 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.713 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.733 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.758 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.759 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:31.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.993 2 DEBUG oslo_concurrency.lockutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Acquiring lock "e486df99-68ea-4bb4-b55c-d0a8b4f1bce7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:31 np0005465988 nova_compute[236126]: 2025-10-02 12:02:31.994 2 DEBUG oslo_concurrency.lockutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lock "e486df99-68ea-4bb4-b55c-d0a8b4f1bce7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.048 2 DEBUG nova.compute.manager [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:02:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.173 2 DEBUG oslo_concurrency.lockutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.173 2 DEBUG oslo_concurrency.lockutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.181 2 DEBUG nova.virt.hardware [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.182 2 INFO nova.compute.claims [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.370 2 DEBUG oslo_concurrency.processutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:02:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3660646601' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.821 2 DEBUG oslo_concurrency.processutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.828 2 DEBUG nova.compute.provider_tree [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.844 2 DEBUG nova.scheduler.client.report [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.896 2 DEBUG oslo_concurrency.lockutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.722s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.896 2 DEBUG nova.compute.manager [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.930 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406537.9299884, b6a5f3ca-c662-41a0-ac02-78f9fba82bba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.931 2 INFO nova.compute.manager [-] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.960 2 DEBUG nova.compute.manager [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.961 2 DEBUG nova.network.neutron [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.965 2 DEBUG nova.compute.manager [None req-945d9b7a-b4ed-40b3-aec4-86c629e8f81b - - - - - -] [instance: b6a5f3ca-c662-41a0-ac02-78f9fba82bba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:02:32 np0005465988 nova_compute[236126]: 2025-10-02 12:02:32.980 2 INFO nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.000 2 DEBUG nova.compute.manager [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.125 2 DEBUG nova.compute.manager [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.126 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.127 2 INFO nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Creating image(s)#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.150 2 DEBUG nova.storage.rbd_utils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] rbd image e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.179 2 DEBUG nova.storage.rbd_utils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] rbd image e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.203 2 DEBUG nova.storage.rbd_utils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] rbd image e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.206 2 DEBUG oslo_concurrency.processutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.292 2 DEBUG oslo_concurrency.processutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.295 2 DEBUG oslo_concurrency.lockutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.296 2 DEBUG oslo_concurrency.lockutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.297 2 DEBUG oslo_concurrency.lockutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.336 2 DEBUG nova.storage.rbd_utils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] rbd image e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.341 2 DEBUG oslo_concurrency.processutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:33.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.564 2 DEBUG nova.network.neutron [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.564 2 DEBUG nova.compute.manager [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.755 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.880 2 DEBUG oslo_concurrency.processutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.540s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:33.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:33 np0005465988 nova_compute[236126]: 2025-10-02 12:02:33.955 2 DEBUG nova.storage.rbd_utils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] resizing rbd image e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.205 2 DEBUG nova.objects.instance [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lazy-loading 'migration_context' on Instance uuid e486df99-68ea-4bb4-b55c-d0a8b4f1bce7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.219 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.220 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Ensure instance console log exists: /var/lib/nova/instances/e486df99-68ea-4bb4-b55c-d0a8b4f1bce7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.220 2 DEBUG oslo_concurrency.lockutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.221 2 DEBUG oslo_concurrency.lockutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.221 2 DEBUG oslo_concurrency.lockutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.224 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.231 2 WARNING nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.236 2 DEBUG nova.virt.libvirt.host [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.237 2 DEBUG nova.virt.libvirt.host [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.240 2 DEBUG nova.virt.libvirt.host [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.241 2 DEBUG nova.virt.libvirt.host [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.243 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.244 2 DEBUG nova.virt.hardware [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.245 2 DEBUG nova.virt.hardware [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.245 2 DEBUG nova.virt.hardware [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.246 2 DEBUG nova.virt.hardware [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.246 2 DEBUG nova.virt.hardware [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.246 2 DEBUG nova.virt.hardware [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.247 2 DEBUG nova.virt.hardware [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.247 2 DEBUG nova.virt.hardware [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.248 2 DEBUG nova.virt.hardware [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.248 2 DEBUG nova.virt.hardware [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.249 2 DEBUG nova.virt.hardware [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.253 2 DEBUG oslo_concurrency.processutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.506 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.506 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  2 08:02:34 np0005465988 podman[245064]: 2025-10-02 12:02:34.55718555 +0000 UTC m=+0.058206795 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:02:34 np0005465988 podman[245057]: 2025-10-02 12:02:34.580920445 +0000 UTC m=+0.102918346 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001)
Oct  2 08:02:34 np0005465988 podman[245058]: 2025-10-02 12:02:34.604228409 +0000 UTC m=+0.104369308 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:02:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:02:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3481444994' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.744 2 DEBUG oslo_concurrency.processutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.780 2 DEBUG nova.storage.rbd_utils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] rbd image e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:02:34 np0005465988 nova_compute[236126]: 2025-10-02 12:02:34.784 2 DEBUG oslo_concurrency.processutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:02:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:02:35 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/906744250' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:02:35 np0005465988 nova_compute[236126]: 2025-10-02 12:02:35.246 2 DEBUG oslo_concurrency.processutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:02:35 np0005465988 nova_compute[236126]: 2025-10-02 12:02:35.248 2 DEBUG nova.objects.instance [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lazy-loading 'pci_devices' on Instance uuid e486df99-68ea-4bb4-b55c-d0a8b4f1bce7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:02:35 np0005465988 nova_compute[236126]: 2025-10-02 12:02:35.281 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  <uuid>e486df99-68ea-4bb4-b55c-d0a8b4f1bce7</uuid>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  <name>instance-00000011</name>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <nova:name>tempest-LiveMigrationNegativeTest-server-2109861523</nova:name>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:02:34</nova:creationTime>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <nova:user uuid="bea607da06554a67af45c6df851f7c86">tempest-LiveMigrationNegativeTest-766829789-project-member</nova:user>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <nova:project uuid="1b8418ff78264e3292f5cd5b736866f0">tempest-LiveMigrationNegativeTest-766829789</nova:project>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <nova:ports/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <entry name="serial">e486df99-68ea-4bb4-b55c-d0a8b4f1bce7</entry>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <entry name="uuid">e486df99-68ea-4bb4-b55c-d0a8b4f1bce7</entry>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_disk">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_disk.config">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/e486df99-68ea-4bb4-b55c-d0a8b4f1bce7/console.log" append="off"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:02:35 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:02:35 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:02:35 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:02:35 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  2 08:02:35 np0005465988 nova_compute[236126]: 2025-10-02 12:02:35.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:02:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:35.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:35 np0005465988 nova_compute[236126]: 2025-10-02 12:02:35.401 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:02:35 np0005465988 nova_compute[236126]: 2025-10-02 12:02:35.402 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:02:35 np0005465988 nova_compute[236126]: 2025-10-02 12:02:35.403 2 INFO nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Using config drive
Oct  2 08:02:35 np0005465988 nova_compute[236126]: 2025-10-02 12:02:35.438 2 DEBUG nova.storage.rbd_utils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] rbd image e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:02:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:02:35.721 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:02:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:35.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:36 np0005465988 nova_compute[236126]: 2025-10-02 12:02:36.078 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406541.076589, b8d4207f-7e3b-4a3c-ad76-60d87d695918 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:02:36 np0005465988 nova_compute[236126]: 2025-10-02 12:02:36.079 2 INFO nova.compute.manager [-] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] VM Stopped (Lifecycle Event)
Oct  2 08:02:36 np0005465988 nova_compute[236126]: 2025-10-02 12:02:36.109 2 DEBUG nova.compute.manager [None req-a4858bf9-20d9-4837-80d3-bf7e97285c7c - - - - - -] [instance: b8d4207f-7e3b-4a3c-ad76-60d87d695918] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:02:36 np0005465988 nova_compute[236126]: 2025-10-02 12:02:36.131 2 INFO nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Creating config drive at /var/lib/nova/instances/e486df99-68ea-4bb4-b55c-d0a8b4f1bce7/disk.config
Oct  2 08:02:36 np0005465988 nova_compute[236126]: 2025-10-02 12:02:36.136 2 DEBUG oslo_concurrency.processutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e486df99-68ea-4bb4-b55c-d0a8b4f1bce7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp54aoflhl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:02:36 np0005465988 nova_compute[236126]: 2025-10-02 12:02:36.269 2 DEBUG oslo_concurrency.processutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e486df99-68ea-4bb4-b55c-d0a8b4f1bce7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp54aoflhl" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:02:36 np0005465988 nova_compute[236126]: 2025-10-02 12:02:36.316 2 DEBUG nova.storage.rbd_utils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] rbd image e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:02:36 np0005465988 nova_compute[236126]: 2025-10-02 12:02:36.321 2 DEBUG oslo_concurrency.processutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e486df99-68ea-4bb4-b55c-d0a8b4f1bce7/disk.config e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:02:36 np0005465988 nova_compute[236126]: 2025-10-02 12:02:36.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:02:36 np0005465988 nova_compute[236126]: 2025-10-02 12:02:36.601 2 DEBUG oslo_concurrency.processutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e486df99-68ea-4bb4-b55c-d0a8b4f1bce7/disk.config e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.279s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:02:36 np0005465988 nova_compute[236126]: 2025-10-02 12:02:36.602 2 INFO nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Deleting local config drive /var/lib/nova/instances/e486df99-68ea-4bb4-b55c-d0a8b4f1bce7/disk.config because it was imported into RBD.
Oct  2 08:02:36 np0005465988 systemd-machined[192594]: New machine qemu-8-instance-00000011.
Oct  2 08:02:36 np0005465988 systemd[1]: Started Virtual Machine qemu-8-instance-00000011.
Oct  2 08:02:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:02:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:37.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.554 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406557.5536506, e486df99-68ea-4bb4-b55c-d0a8b4f1bce7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.556 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] VM Resumed (Lifecycle Event)
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.560 2 DEBUG nova.compute.manager [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.560 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.564 2 INFO nova.virt.libvirt.driver [-] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Instance spawned successfully.
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.566 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.595 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.599 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.634 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.635 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.635 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.636 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.636 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.637 2 DEBUG nova.virt.libvirt.driver [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.652 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.652 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406557.5554318, e486df99-68ea-4bb4-b55c-d0a8b4f1bce7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.652 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] VM Started (Lifecycle Event)#033[00m
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.824 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:02:37 np0005465988 nova_compute[236126]: 2025-10-02 12:02:37.830 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:02:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:37.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:38 np0005465988 nova_compute[236126]: 2025-10-02 12:02:38.174 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:02:38 np0005465988 nova_compute[236126]: 2025-10-02 12:02:38.185 2 INFO nova.compute.manager [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Took 5.06 seconds to spawn the instance on the hypervisor.
Oct  2 08:02:38 np0005465988 nova_compute[236126]: 2025-10-02 12:02:38.186 2 DEBUG nova.compute.manager [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:02:38 np0005465988 nova_compute[236126]: 2025-10-02 12:02:38.258 2 INFO nova.compute.manager [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Took 6.10 seconds to build instance.
Oct  2 08:02:38 np0005465988 nova_compute[236126]: 2025-10-02 12:02:38.285 2 DEBUG oslo_concurrency.lockutils [None req-0a62b369-8217-4182-92d4-19f974dbb3d5 bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lock "e486df99-68ea-4bb4-b55c-d0a8b4f1bce7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:02:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:02:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:02:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:02:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:02:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:02:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:02:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:02:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:39.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:39 np0005465988 nova_compute[236126]: 2025-10-02 12:02:39.852 2 DEBUG nova.objects.instance [None req-1f31f2dc-9bac-4815-825f-03ac5b522074 0fd4663173a44abeb138d3d6a20f8d17 1040113416964f64b0f1153ff23f45fd - - default default] Lazy-loading 'pci_devices' on Instance uuid e486df99-68ea-4bb4-b55c-d0a8b4f1bce7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:02:39 np0005465988 nova_compute[236126]: 2025-10-02 12:02:39.882 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406559.882557, e486df99-68ea-4bb4-b55c-d0a8b4f1bce7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:02:39 np0005465988 nova_compute[236126]: 2025-10-02 12:02:39.883 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] VM Paused (Lifecycle Event)
Oct  2 08:02:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:39.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:39 np0005465988 nova_compute[236126]: 2025-10-02 12:02:39.914 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:02:39 np0005465988 nova_compute[236126]: 2025-10-02 12:02:39.917 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:02:39 np0005465988 nova_compute[236126]: 2025-10-02 12:02:39.952 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] During sync_power_state the instance has a pending task (suspending). Skip.
Oct  2 08:02:40 np0005465988 nova_compute[236126]: 2025-10-02 12:02:40.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:02:40 np0005465988 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000011.scope: Deactivated successfully.
Oct  2 08:02:40 np0005465988 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000011.scope: Consumed 3.386s CPU time.
Oct  2 08:02:40 np0005465988 systemd-machined[192594]: Machine qemu-8-instance-00000011 terminated.
Oct  2 08:02:40 np0005465988 nova_compute[236126]: 2025-10-02 12:02:40.481 2 DEBUG nova.compute.manager [None req-1f31f2dc-9bac-4815-825f-03ac5b522074 0fd4663173a44abeb138d3d6a20f8d17 1040113416964f64b0f1153ff23f45fd - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:02:41 np0005465988 nova_compute[236126]: 2025-10-02 12:02:41.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:02:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:41.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:02:41 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1912580596' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:02:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:41.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:02:43 np0005465988 nova_compute[236126]: 2025-10-02 12:02:43.233 2 DEBUG oslo_concurrency.lockutils [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Acquiring lock "e486df99-68ea-4bb4-b55c-d0a8b4f1bce7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:02:43 np0005465988 nova_compute[236126]: 2025-10-02 12:02:43.234 2 DEBUG oslo_concurrency.lockutils [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lock "e486df99-68ea-4bb4-b55c-d0a8b4f1bce7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:02:43 np0005465988 nova_compute[236126]: 2025-10-02 12:02:43.234 2 DEBUG oslo_concurrency.lockutils [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Acquiring lock "e486df99-68ea-4bb4-b55c-d0a8b4f1bce7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:02:43 np0005465988 nova_compute[236126]: 2025-10-02 12:02:43.235 2 DEBUG oslo_concurrency.lockutils [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lock "e486df99-68ea-4bb4-b55c-d0a8b4f1bce7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:02:43 np0005465988 nova_compute[236126]: 2025-10-02 12:02:43.235 2 DEBUG oslo_concurrency.lockutils [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lock "e486df99-68ea-4bb4-b55c-d0a8b4f1bce7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:02:43 np0005465988 nova_compute[236126]: 2025-10-02 12:02:43.236 2 INFO nova.compute.manager [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Terminating instance
Oct  2 08:02:43 np0005465988 nova_compute[236126]: 2025-10-02 12:02:43.237 2 DEBUG oslo_concurrency.lockutils [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Acquiring lock "refresh_cache-e486df99-68ea-4bb4-b55c-d0a8b4f1bce7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:02:43 np0005465988 nova_compute[236126]: 2025-10-02 12:02:43.237 2 DEBUG oslo_concurrency.lockutils [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Acquired lock "refresh_cache-e486df99-68ea-4bb4-b55c-d0a8b4f1bce7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:02:43 np0005465988 nova_compute[236126]: 2025-10-02 12:02:43.237 2 DEBUG nova.network.neutron [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:02:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:43.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:43 np0005465988 nova_compute[236126]: 2025-10-02 12:02:43.736 2 DEBUG nova.network.neutron [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:02:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:43.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:44 np0005465988 nova_compute[236126]: 2025-10-02 12:02:44.326 2 DEBUG nova.network.neutron [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:02:44 np0005465988 nova_compute[236126]: 2025-10-02 12:02:44.462 2 DEBUG oslo_concurrency.lockutils [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Releasing lock "refresh_cache-e486df99-68ea-4bb4-b55c-d0a8b4f1bce7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:02:44 np0005465988 nova_compute[236126]: 2025-10-02 12:02:44.463 2 DEBUG nova.compute.manager [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:02:44 np0005465988 nova_compute[236126]: 2025-10-02 12:02:44.473 2 INFO nova.virt.libvirt.driver [-] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Instance destroyed successfully.
Oct  2 08:02:44 np0005465988 nova_compute[236126]: 2025-10-02 12:02:44.474 2 DEBUG nova.objects.instance [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lazy-loading 'resources' on Instance uuid e486df99-68ea-4bb4-b55c-d0a8b4f1bce7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:02:44 np0005465988 ovn_controller[132601]: 2025-10-02T12:02:44Z|00053|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  2 08:02:44 np0005465988 nova_compute[236126]: 2025-10-02 12:02:44.957 2 INFO nova.virt.libvirt.driver [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Deleting instance files /var/lib/nova/instances/e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_del
Oct  2 08:02:44 np0005465988 nova_compute[236126]: 2025-10-02 12:02:44.958 2 INFO nova.virt.libvirt.driver [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Deletion of /var/lib/nova/instances/e486df99-68ea-4bb4-b55c-d0a8b4f1bce7_del complete
Oct  2 08:02:45 np0005465988 nova_compute[236126]: 2025-10-02 12:02:45.083 2 INFO nova.compute.manager [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Took 0.62 seconds to destroy the instance on the hypervisor.
Oct  2 08:02:45 np0005465988 nova_compute[236126]: 2025-10-02 12:02:45.083 2 DEBUG oslo.service.loopingcall [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct  2 08:02:45 np0005465988 nova_compute[236126]: 2025-10-02 12:02:45.084 2 DEBUG nova.compute.manager [-] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct  2 08:02:45 np0005465988 nova_compute[236126]: 2025-10-02 12:02:45.084 2 DEBUG nova.network.neutron [-] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct  2 08:02:45 np0005465988 nova_compute[236126]: 2025-10-02 12:02:45.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:02:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:45.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:45 np0005465988 nova_compute[236126]: 2025-10-02 12:02:45.457 2 DEBUG nova.network.neutron [-] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:02:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:02:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:02:45 np0005465988 nova_compute[236126]: 2025-10-02 12:02:45.504 2 DEBUG nova.network.neutron [-] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:02:45 np0005465988 nova_compute[236126]: 2025-10-02 12:02:45.538 2 INFO nova.compute.manager [-] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Took 0.45 seconds to deallocate network for instance.
Oct  2 08:02:45 np0005465988 nova_compute[236126]: 2025-10-02 12:02:45.663 2 DEBUG oslo_concurrency.lockutils [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:02:45 np0005465988 nova_compute[236126]: 2025-10-02 12:02:45.664 2 DEBUG oslo_concurrency.lockutils [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:02:45 np0005465988 nova_compute[236126]: 2025-10-02 12:02:45.747 2 DEBUG oslo_concurrency.processutils [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:02:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:45.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:02:46 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3014154418' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:02:46 np0005465988 nova_compute[236126]: 2025-10-02 12:02:46.222 2 DEBUG oslo_concurrency.processutils [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:02:46 np0005465988 nova_compute[236126]: 2025-10-02 12:02:46.229 2 DEBUG nova.compute.provider_tree [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:02:46 np0005465988 nova_compute[236126]: 2025-10-02 12:02:46.288 2 DEBUG nova.scheduler.client.report [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:02:46 np0005465988 nova_compute[236126]: 2025-10-02 12:02:46.352 2 DEBUG oslo_concurrency.lockutils [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:02:46 np0005465988 nova_compute[236126]: 2025-10-02 12:02:46.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:02:46 np0005465988 nova_compute[236126]: 2025-10-02 12:02:46.486 2 INFO nova.scheduler.client.report [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Deleted allocations for instance e486df99-68ea-4bb4-b55c-d0a8b4f1bce7
Oct  2 08:02:46 np0005465988 nova_compute[236126]: 2025-10-02 12:02:46.773 2 DEBUG oslo_concurrency.lockutils [None req-fd0f676b-3d51-4204-adb6-af47d457f7cc bea607da06554a67af45c6df851f7c86 1b8418ff78264e3292f5cd5b736866f0 - - default default] Lock "e486df99-68ea-4bb4-b55c-d0a8b4f1bce7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.539s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:02:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:02:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:47.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:47.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:48 np0005465988 podman[245515]: 2025-10-02 12:02:48.573770108 +0000 UTC m=+0.099953462 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  2 08:02:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:49.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:49.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:50 np0005465988 nova_compute[236126]: 2025-10-02 12:02:50.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:02:51 np0005465988 nova_compute[236126]: 2025-10-02 12:02:51.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:02:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:51.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:02:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:51.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:02:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:02:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:53.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:53.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:55 np0005465988 nova_compute[236126]: 2025-10-02 12:02:55.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:02:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:55.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:55 np0005465988 nova_compute[236126]: 2025-10-02 12:02:55.483 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406560.4810975, e486df99-68ea-4bb4-b55c-d0a8b4f1bce7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:02:55 np0005465988 nova_compute[236126]: 2025-10-02 12:02:55.484 2 INFO nova.compute.manager [-] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] VM Stopped (Lifecycle Event)
Oct  2 08:02:55 np0005465988 nova_compute[236126]: 2025-10-02 12:02:55.510 2 DEBUG nova.compute.manager [None req-d472f709-19d2-406d-afa5-b79b335f76e0 - - - - - -] [instance: e486df99-68ea-4bb4-b55c-d0a8b4f1bce7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:02:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:55.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:56 np0005465988 nova_compute[236126]: 2025-10-02 12:02:56.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:02:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:02:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:57.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:57.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:02:59.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:02:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:02:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:02:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:02:59.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:00 np0005465988 nova_compute[236126]: 2025-10-02 12:03:00.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:03:01 np0005465988 nova_compute[236126]: 2025-10-02 12:03:01.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:03:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:01.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:01.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:03:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:03.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:03.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:05 np0005465988 nova_compute[236126]: 2025-10-02 12:03:05.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:03:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:05.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:05 np0005465988 podman[245598]: 2025-10-02 12:03:05.574964804 +0000 UTC m=+0.096996566 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:03:05 np0005465988 podman[245596]: 2025-10-02 12:03:05.579225557 +0000 UTC m=+0.107816289 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:03:05 np0005465988 podman[245597]: 2025-10-02 12:03:05.581711469 +0000 UTC m=+0.104163783 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:03:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:05.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:06 np0005465988 nova_compute[236126]: 2025-10-02 12:03:06.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:03:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:03:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:07.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:07.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:08 np0005465988 nova_compute[236126]: 2025-10-02 12:03:08.688 2 DEBUG oslo_concurrency.lockutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Acquiring lock "7b29078f-72c6-44c3-81ab-81a5aa09e9e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:03:08 np0005465988 nova_compute[236126]: 2025-10-02 12:03:08.689 2 DEBUG oslo_concurrency.lockutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lock "7b29078f-72c6-44c3-81ab-81a5aa09e9e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:03:08 np0005465988 nova_compute[236126]: 2025-10-02 12:03:08.708 2 DEBUG nova.compute.manager [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:03:08 np0005465988 nova_compute[236126]: 2025-10-02 12:03:08.786 2 DEBUG oslo_concurrency.lockutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:03:08 np0005465988 nova_compute[236126]: 2025-10-02 12:03:08.787 2 DEBUG oslo_concurrency.lockutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:03:08 np0005465988 nova_compute[236126]: 2025-10-02 12:03:08.793 2 DEBUG nova.virt.hardware [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:03:08 np0005465988 nova_compute[236126]: 2025-10-02 12:03:08.793 2 INFO nova.compute.claims [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:03:08 np0005465988 nova_compute[236126]: 2025-10-02 12:03:08.936 2 DEBUG oslo_concurrency.processutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:03:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:03:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1088330370' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.401 2 DEBUG oslo_concurrency.processutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.406 2 DEBUG nova.compute.provider_tree [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.430 2 DEBUG nova.scheduler.client.report [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:03:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:09.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.454 2 DEBUG oslo_concurrency.lockutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.455 2 DEBUG nova.compute.manager [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.546 2 DEBUG nova.compute.manager [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.565 2 INFO nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.593 2 DEBUG nova.compute.manager [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.677 2 DEBUG nova.compute.manager [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.679 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.679 2 INFO nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Creating image(s)
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.700 2 DEBUG nova.storage.rbd_utils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] rbd image 7b29078f-72c6-44c3-81ab-81a5aa09e9e8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.726 2 DEBUG nova.storage.rbd_utils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] rbd image 7b29078f-72c6-44c3-81ab-81a5aa09e9e8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.755 2 DEBUG nova.storage.rbd_utils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] rbd image 7b29078f-72c6-44c3-81ab-81a5aa09e9e8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.760 2 DEBUG oslo_concurrency.processutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.819 2 DEBUG oslo_concurrency.processutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.820 2 DEBUG oslo_concurrency.lockutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.821 2 DEBUG oslo_concurrency.lockutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.821 2 DEBUG oslo_concurrency.lockutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.849 2 DEBUG nova.storage.rbd_utils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] rbd image 7b29078f-72c6-44c3-81ab-81a5aa09e9e8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:03:09 np0005465988 nova_compute[236126]: 2025-10-02 12:03:09.855 2 DEBUG oslo_concurrency.processutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 7b29078f-72c6-44c3-81ab-81a5aa09e9e8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:03:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:09.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.144 2 DEBUG oslo_concurrency.processutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 7b29078f-72c6-44c3-81ab-81a5aa09e9e8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.288s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.223 2 DEBUG nova.storage.rbd_utils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] resizing rbd image 7b29078f-72c6-44c3-81ab-81a5aa09e9e8_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.343 2 DEBUG nova.objects.instance [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lazy-loading 'migration_context' on Instance uuid 7b29078f-72c6-44c3-81ab-81a5aa09e9e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.357 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.358 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Ensure instance console log exists: /var/lib/nova/instances/7b29078f-72c6-44c3-81ab-81a5aa09e9e8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.358 2 DEBUG oslo_concurrency.lockutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.359 2 DEBUG oslo_concurrency.lockutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.359 2 DEBUG oslo_concurrency.lockutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.360 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.364 2 WARNING nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.369 2 DEBUG nova.virt.libvirt.host [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.370 2 DEBUG nova.virt.libvirt.host [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.374 2 DEBUG nova.virt.libvirt.host [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.374 2 DEBUG nova.virt.libvirt.host [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.377 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.379 2 DEBUG nova.virt.hardware [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.380 2 DEBUG nova.virt.hardware [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.380 2 DEBUG nova.virt.hardware [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.380 2 DEBUG nova.virt.hardware [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.381 2 DEBUG nova.virt.hardware [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.381 2 DEBUG nova.virt.hardware [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.381 2 DEBUG nova.virt.hardware [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.382 2 DEBUG nova.virt.hardware [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.382 2 DEBUG nova.virt.hardware [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.382 2 DEBUG nova.virt.hardware [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.383 2 DEBUG nova.virt.hardware [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.385 2 DEBUG oslo_concurrency.processutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:03:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1729442588' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.860 2 DEBUG oslo_concurrency.processutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.893 2 DEBUG nova.storage.rbd_utils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] rbd image 7b29078f-72c6-44c3-81ab-81a5aa09e9e8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:03:10 np0005465988 nova_compute[236126]: 2025-10-02 12:03:10.898 2 DEBUG oslo_concurrency.processutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:03:11 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/872445967' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:03:11 np0005465988 nova_compute[236126]: 2025-10-02 12:03:11.390 2 DEBUG oslo_concurrency.processutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:11 np0005465988 nova_compute[236126]: 2025-10-02 12:03:11.394 2 DEBUG nova.objects.instance [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lazy-loading 'pci_devices' on Instance uuid 7b29078f-72c6-44c3-81ab-81a5aa09e9e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:03:11 np0005465988 nova_compute[236126]: 2025-10-02 12:03:11.411 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  <uuid>7b29078f-72c6-44c3-81ab-81a5aa09e9e8</uuid>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  <name>instance-00000013</name>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerDiagnosticsV248Test-server-358669194</nova:name>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:03:10</nova:creationTime>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <nova:user uuid="9088d723f63c46be96487ca378dd484a">tempest-ServerDiagnosticsV248Test-1329521098-project-member</nova:user>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <nova:project uuid="ffda556e614448dfbe0b67bebc19394b">tempest-ServerDiagnosticsV248Test-1329521098</nova:project>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <nova:ports/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <entry name="serial">7b29078f-72c6-44c3-81ab-81a5aa09e9e8</entry>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <entry name="uuid">7b29078f-72c6-44c3-81ab-81a5aa09e9e8</entry>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/7b29078f-72c6-44c3-81ab-81a5aa09e9e8_disk">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/7b29078f-72c6-44c3-81ab-81a5aa09e9e8_disk.config">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/7b29078f-72c6-44c3-81ab-81a5aa09e9e8/console.log" append="off"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:03:11 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:03:11 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:03:11 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:03:11 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:03:11 np0005465988 nova_compute[236126]: 2025-10-02 12:03:11.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:11.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:11 np0005465988 nova_compute[236126]: 2025-10-02 12:03:11.463 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:03:11 np0005465988 nova_compute[236126]: 2025-10-02 12:03:11.463 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:03:11 np0005465988 nova_compute[236126]: 2025-10-02 12:03:11.463 2 INFO nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Using config drive#033[00m
Oct  2 08:03:11 np0005465988 nova_compute[236126]: 2025-10-02 12:03:11.489 2 DEBUG nova.storage.rbd_utils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] rbd image 7b29078f-72c6-44c3-81ab-81a5aa09e9e8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:03:11 np0005465988 nova_compute[236126]: 2025-10-02 12:03:11.730 2 INFO nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Creating config drive at /var/lib/nova/instances/7b29078f-72c6-44c3-81ab-81a5aa09e9e8/disk.config#033[00m
Oct  2 08:03:11 np0005465988 nova_compute[236126]: 2025-10-02 12:03:11.735 2 DEBUG oslo_concurrency.processutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7b29078f-72c6-44c3-81ab-81a5aa09e9e8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp72rfmdc3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:11 np0005465988 nova_compute[236126]: 2025-10-02 12:03:11.860 2 DEBUG oslo_concurrency.processutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7b29078f-72c6-44c3-81ab-81a5aa09e9e8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp72rfmdc3" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:11 np0005465988 nova_compute[236126]: 2025-10-02 12:03:11.899 2 DEBUG nova.storage.rbd_utils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] rbd image 7b29078f-72c6-44c3-81ab-81a5aa09e9e8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:03:11 np0005465988 nova_compute[236126]: 2025-10-02 12:03:11.907 2 DEBUG oslo_concurrency.processutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7b29078f-72c6-44c3-81ab-81a5aa09e9e8/disk.config 7b29078f-72c6-44c3-81ab-81a5aa09e9e8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:11.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:03:12 np0005465988 nova_compute[236126]: 2025-10-02 12:03:12.259 2 DEBUG oslo_concurrency.processutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7b29078f-72c6-44c3-81ab-81a5aa09e9e8/disk.config 7b29078f-72c6-44c3-81ab-81a5aa09e9e8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.352s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:12 np0005465988 nova_compute[236126]: 2025-10-02 12:03:12.261 2 INFO nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Deleting local config drive /var/lib/nova/instances/7b29078f-72c6-44c3-81ab-81a5aa09e9e8/disk.config because it was imported into RBD.#033[00m
Oct  2 08:03:12 np0005465988 systemd-machined[192594]: New machine qemu-9-instance-00000013.
Oct  2 08:03:12 np0005465988 systemd[1]: Started Virtual Machine qemu-9-instance-00000013.
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.335 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406593.3349857, 7b29078f-72c6-44c3-81ab-81a5aa09e9e8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.336 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] VM Resumed (Lifecycle Event)
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.339 2 DEBUG nova.compute.manager [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.340 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.345 2 INFO nova.virt.libvirt.driver [-] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Instance spawned successfully.
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.346 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.374 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.384 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.390 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.390 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.392 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.393 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.394 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.395 2 DEBUG nova.virt.libvirt.driver [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.407 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.407 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406593.3366883, 7b29078f-72c6-44c3-81ab-81a5aa09e9e8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.408 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] VM Started (Lifecycle Event)
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.427 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.431 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:03:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:13.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.457 2 INFO nova.compute.manager [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Took 3.78 seconds to spawn the instance on the hypervisor.
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.457 2 DEBUG nova.compute.manager [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.459 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.539 2 INFO nova.compute.manager [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Took 4.78 seconds to build instance.
Oct  2 08:03:13 np0005465988 nova_compute[236126]: 2025-10-02 12:03:13.597 2 DEBUG oslo_concurrency.lockutils [None req-5f1e11c4-8551-41b7-9d57-9acba10480e1 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lock "7b29078f-72c6-44c3-81ab-81a5aa09e9e8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.908s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:03:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:13.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:14 np0005465988 nova_compute[236126]: 2025-10-02 12:03:14.222 2 DEBUG nova.compute.manager [None req-0764fa9d-0675-4f77-8477-1bb9650c467e 133ac2ebfb6c41698ae9871583d7de77 922148cc0a4a4295814d17e4b0e41991 - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:03:14 np0005465988 nova_compute[236126]: 2025-10-02 12:03:14.227 2 INFO nova.compute.manager [None req-0764fa9d-0675-4f77-8477-1bb9650c467e 133ac2ebfb6c41698ae9871583d7de77 922148cc0a4a4295814d17e4b0e41991 - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Retrieving diagnostics
Oct  2 08:03:15 np0005465988 nova_compute[236126]: 2025-10-02 12:03:15.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:03:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:15.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:15.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:16 np0005465988 nova_compute[236126]: 2025-10-02 12:03:16.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:03:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:03:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:17.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:17.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:19.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:19 np0005465988 podman[246079]: 2025-10-02 12:03:19.532785993 +0000 UTC m=+0.066853584 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:03:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:19.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:20 np0005465988 nova_compute[236126]: 2025-10-02 12:03:20.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:03:21 np0005465988 nova_compute[236126]: 2025-10-02 12:03:21.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:03:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:21.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:21.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:03:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:23.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:23.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:24 np0005465988 nova_compute[236126]: 2025-10-02 12:03:24.547 2 DEBUG nova.compute.manager [None req-cad4ae0d-525f-4893-aef1-3eb998cbfdc6 133ac2ebfb6c41698ae9871583d7de77 922148cc0a4a4295814d17e4b0e41991 - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:03:24 np0005465988 nova_compute[236126]: 2025-10-02 12:03:24.555 2 INFO nova.compute.manager [None req-cad4ae0d-525f-4893-aef1-3eb998cbfdc6 133ac2ebfb6c41698ae9871583d7de77 922148cc0a4a4295814d17e4b0e41991 - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Retrieving diagnostics
Oct  2 08:03:25 np0005465988 nova_compute[236126]: 2025-10-02 12:03:25.149 2 DEBUG oslo_concurrency.lockutils [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Acquiring lock "7b29078f-72c6-44c3-81ab-81a5aa09e9e8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:03:25 np0005465988 nova_compute[236126]: 2025-10-02 12:03:25.150 2 DEBUG oslo_concurrency.lockutils [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lock "7b29078f-72c6-44c3-81ab-81a5aa09e9e8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:03:25 np0005465988 nova_compute[236126]: 2025-10-02 12:03:25.150 2 DEBUG oslo_concurrency.lockutils [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Acquiring lock "7b29078f-72c6-44c3-81ab-81a5aa09e9e8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:03:25 np0005465988 nova_compute[236126]: 2025-10-02 12:03:25.151 2 DEBUG oslo_concurrency.lockutils [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lock "7b29078f-72c6-44c3-81ab-81a5aa09e9e8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:03:25 np0005465988 nova_compute[236126]: 2025-10-02 12:03:25.151 2 DEBUG oslo_concurrency.lockutils [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lock "7b29078f-72c6-44c3-81ab-81a5aa09e9e8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:03:25 np0005465988 nova_compute[236126]: 2025-10-02 12:03:25.152 2 INFO nova.compute.manager [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Terminating instance
Oct  2 08:03:25 np0005465988 nova_compute[236126]: 2025-10-02 12:03:25.153 2 DEBUG oslo_concurrency.lockutils [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Acquiring lock "refresh_cache-7b29078f-72c6-44c3-81ab-81a5aa09e9e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:03:25 np0005465988 nova_compute[236126]: 2025-10-02 12:03:25.154 2 DEBUG oslo_concurrency.lockutils [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Acquired lock "refresh_cache-7b29078f-72c6-44c3-81ab-81a5aa09e9e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:03:25 np0005465988 nova_compute[236126]: 2025-10-02 12:03:25.154 2 DEBUG nova.network.neutron [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:03:25 np0005465988 nova_compute[236126]: 2025-10-02 12:03:25.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:03:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:25.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:25 np0005465988 nova_compute[236126]: 2025-10-02 12:03:25.480 2 DEBUG nova.network.neutron [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:03:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:25.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:26 np0005465988 nova_compute[236126]: 2025-10-02 12:03:26.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:03:26 np0005465988 nova_compute[236126]: 2025-10-02 12:03:26.504 2 DEBUG nova.network.neutron [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:03:26 np0005465988 nova_compute[236126]: 2025-10-02 12:03:26.525 2 DEBUG oslo_concurrency.lockutils [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Releasing lock "refresh_cache-7b29078f-72c6-44c3-81ab-81a5aa09e9e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:03:26 np0005465988 nova_compute[236126]: 2025-10-02 12:03:26.526 2 DEBUG nova.compute.manager [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:03:26 np0005465988 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000013.scope: Deactivated successfully.
Oct  2 08:03:26 np0005465988 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000013.scope: Consumed 12.728s CPU time.
Oct  2 08:03:26 np0005465988 systemd-machined[192594]: Machine qemu-9-instance-00000013 terminated.
Oct  2 08:03:26 np0005465988 nova_compute[236126]: 2025-10-02 12:03:26.949 2 INFO nova.virt.libvirt.driver [-] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Instance destroyed successfully.
Oct  2 08:03:26 np0005465988 nova_compute[236126]: 2025-10-02 12:03:26.949 2 DEBUG nova.objects.instance [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lazy-loading 'resources' on Instance uuid 7b29078f-72c6-44c3-81ab-81a5aa09e9e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:03:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:03:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:03:27.331 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:03:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:03:27.331 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:03:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:03:27.331 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:03:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:27.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:27.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:28 np0005465988 nova_compute[236126]: 2025-10-02 12:03:28.094 2 INFO nova.virt.libvirt.driver [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Deleting instance files /var/lib/nova/instances/7b29078f-72c6-44c3-81ab-81a5aa09e9e8_del
Oct  2 08:03:28 np0005465988 nova_compute[236126]: 2025-10-02 12:03:28.095 2 INFO nova.virt.libvirt.driver [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Deletion of /var/lib/nova/instances/7b29078f-72c6-44c3-81ab-81a5aa09e9e8_del complete
Oct  2 08:03:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 08:03:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 5082 writes, 26K keys, 5082 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s#012Cumulative WAL: 5082 writes, 5082 syncs, 1.00 writes per sync, written: 0.05 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1592 writes, 7547 keys, 1592 commit groups, 1.0 writes per commit group, ingest: 15.47 MB, 0.03 MB/s#012Interval WAL: 1592 writes, 1592 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     68.5      0.46              0.09        14    0.033       0      0       0.0       0.0#012  L6      1/0    8.62 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.4    114.9     94.6      1.13              0.40        13    0.087     60K   6875       0.0       0.0#012 Sum      1/0    8.62 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.4     81.7     87.1      1.58              0.49        27    0.059     60K   6875       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   5.1    105.5    107.6      0.47              0.21        10    0.047     25K   2555       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    114.9     94.6      1.13              0.40        13    0.087     60K   6875       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     68.8      0.46              0.09        13    0.035       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.031, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.13 GB write, 0.08 MB/s write, 0.13 GB read, 0.07 MB/s read, 1.6 seconds#012Interval compaction: 0.05 GB write, 0.08 MB/s write, 0.05 GB read, 0.08 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b3e97ad1f0#2 capacity: 304.00 MB usage: 12.40 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000139 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(717,11.90 MB,3.91545%) FilterBlock(27,173.55 KB,0.0557498%) IndexBlock(27,331.31 KB,0.10643%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct  2 08:03:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:29.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:29 np0005465988 nova_compute[236126]: 2025-10-02 12:03:29.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:29 np0005465988 nova_compute[236126]: 2025-10-02 12:03:29.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:29 np0005465988 nova_compute[236126]: 2025-10-02 12:03:29.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:03:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:29.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:30 np0005465988 nova_compute[236126]: 2025-10-02 12:03:30.121 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:03:30 np0005465988 nova_compute[236126]: 2025-10-02 12:03:30.186 2 INFO nova.compute.manager [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Took 3.66 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:03:30 np0005465988 nova_compute[236126]: 2025-10-02 12:03:30.187 2 DEBUG oslo.service.loopingcall [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:03:30 np0005465988 nova_compute[236126]: 2025-10-02 12:03:30.188 2 DEBUG nova.compute.manager [-] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:03:30 np0005465988 nova_compute[236126]: 2025-10-02 12:03:30.188 2 DEBUG nova.network.neutron [-] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:03:30 np0005465988 nova_compute[236126]: 2025-10-02 12:03:30.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:30 np0005465988 nova_compute[236126]: 2025-10-02 12:03:30.366 2 DEBUG nova.network.neutron [-] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:03:30 np0005465988 nova_compute[236126]: 2025-10-02 12:03:30.418 2 DEBUG nova.network.neutron [-] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:03:30 np0005465988 nova_compute[236126]: 2025-10-02 12:03:30.443 2 INFO nova.compute.manager [-] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Took 0.26 seconds to deallocate network for instance.#033[00m
Oct  2 08:03:30 np0005465988 nova_compute[236126]: 2025-10-02 12:03:30.573 2 DEBUG oslo_concurrency.lockutils [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:30 np0005465988 nova_compute[236126]: 2025-10-02 12:03:30.574 2 DEBUG oslo_concurrency.lockutils [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:30 np0005465988 nova_compute[236126]: 2025-10-02 12:03:30.821 2 DEBUG oslo_concurrency.processutils [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.122 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.123 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.123 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.124 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.149 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:03:31 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/250854786' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.226 2 DEBUG oslo_concurrency.processutils [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.237 2 DEBUG nova.compute.provider_tree [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.289 2 DEBUG nova.scheduler.client.report [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:31.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.630 2 DEBUG oslo_concurrency.lockutils [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.635 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.485s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.636 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.636 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.637 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.725 2 INFO nova.scheduler.client.report [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Deleted allocations for instance 7b29078f-72c6-44c3-81ab-81a5aa09e9e8#033[00m
Oct  2 08:03:31 np0005465988 nova_compute[236126]: 2025-10-02 12:03:31.868 2 DEBUG oslo_concurrency.lockutils [None req-9917e8b4-04b3-41ba-90f2-9b8553c6aad4 9088d723f63c46be96487ca378dd484a ffda556e614448dfbe0b67bebc19394b - - default default] Lock "7b29078f-72c6-44c3-81ab-81a5aa09e9e8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:31.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:03:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1247553154' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:03:32 np0005465988 nova_compute[236126]: 2025-10-02 12:03:32.049 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:03:32 np0005465988 nova_compute[236126]: 2025-10-02 12:03:32.241 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:03:32 np0005465988 nova_compute[236126]: 2025-10-02 12:03:32.242 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4883MB free_disk=20.946483612060547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:03:32 np0005465988 nova_compute[236126]: 2025-10-02 12:03:32.242 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:32 np0005465988 nova_compute[236126]: 2025-10-02 12:03:32.243 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:32 np0005465988 nova_compute[236126]: 2025-10-02 12:03:32.508 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:03:32 np0005465988 nova_compute[236126]: 2025-10-02 12:03:32.508 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:03:32 np0005465988 nova_compute[236126]: 2025-10-02 12:03:32.533 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:03:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3069889945' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:03:33 np0005465988 nova_compute[236126]: 2025-10-02 12:03:33.004 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:33 np0005465988 nova_compute[236126]: 2025-10-02 12:03:33.013 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:03:33 np0005465988 nova_compute[236126]: 2025-10-02 12:03:33.093 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:03:33 np0005465988 nova_compute[236126]: 2025-10-02 12:03:33.164 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:03:33 np0005465988 nova_compute[236126]: 2025-10-02 12:03:33.165 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.922s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:33.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:33 np0005465988 nova_compute[236126]: 2025-10-02 12:03:33.516 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:33 np0005465988 nova_compute[236126]: 2025-10-02 12:03:33.517 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:33 np0005465988 nova_compute[236126]: 2025-10-02 12:03:33.518 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:33.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:34 np0005465988 nova_compute[236126]: 2025-10-02 12:03:34.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:03:34.733 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:03:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:03:34.734 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:03:34 np0005465988 nova_compute[236126]: 2025-10-02 12:03:34.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:03:34.767 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:03:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:03:34.768 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:03:34 np0005465988 nova_compute[236126]: 2025-10-02 12:03:34.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:35 np0005465988 nova_compute[236126]: 2025-10-02 12:03:35.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:35 np0005465988 nova_compute[236126]: 2025-10-02 12:03:35.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:35 np0005465988 nova_compute[236126]: 2025-10-02 12:03:35.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:03:35 np0005465988 nova_compute[236126]: 2025-10-02 12:03:35.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:03:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:35.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:35 np0005465988 nova_compute[236126]: 2025-10-02 12:03:35.500 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:03:35 np0005465988 nova_compute[236126]: 2025-10-02 12:03:35.501 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:35.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:36 np0005465988 nova_compute[236126]: 2025-10-02 12:03:36.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:36 np0005465988 podman[246249]: 2025-10-02 12:03:36.528352492 +0000 UTC m=+0.058930456 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct  2 08:03:36 np0005465988 podman[246248]: 2025-10-02 12:03:36.529528176 +0000 UTC m=+0.063327833 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:03:36 np0005465988 podman[246247]: 2025-10-02 12:03:36.554024384 +0000 UTC m=+0.090238871 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Oct  2 08:03:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:03:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:37.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:37 np0005465988 nova_compute[236126]: 2025-10-02 12:03:37.487 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:37 np0005465988 nova_compute[236126]: 2025-10-02 12:03:37.488 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:03:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:37.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:38 np0005465988 nova_compute[236126]: 2025-10-02 12:03:38.490 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:39.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:39.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:40 np0005465988 nova_compute[236126]: 2025-10-02 12:03:40.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:03:40.736 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:03:41 np0005465988 nova_compute[236126]: 2025-10-02 12:03:41.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:41.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:03:41.770 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:03:41 np0005465988 nova_compute[236126]: 2025-10-02 12:03:41.948 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406606.9468074, 7b29078f-72c6-44c3-81ab-81a5aa09e9e8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:03:41 np0005465988 nova_compute[236126]: 2025-10-02 12:03:41.948 2 INFO nova.compute.manager [-] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:03:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:41.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:03:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:43.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:43.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:45 np0005465988 nova_compute[236126]: 2025-10-02 12:03:45.093 2 DEBUG nova.compute.manager [None req-69e16786-1f70-494c-9c40-52bb549bed5a - - - - - -] [instance: 7b29078f-72c6-44c3-81ab-81a5aa09e9e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:03:45 np0005465988 nova_compute[236126]: 2025-10-02 12:03:45.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:45.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:45.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:46 np0005465988 nova_compute[236126]: 2025-10-02 12:03:46.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:03:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:03:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:03:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:03:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:03:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:03:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:47.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:47.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:49.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:49.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:50 np0005465988 nova_compute[236126]: 2025-10-02 12:03:50.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:50 np0005465988 podman[246570]: 2025-10-02 12:03:50.53728734 +0000 UTC m=+0.072024934 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  2 08:03:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:03:51 np0005465988 nova_compute[236126]: 2025-10-02 12:03:51.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:51.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:51.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:03:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:53.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:53.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:55 np0005465988 nova_compute[236126]: 2025-10-02 12:03:55.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:55.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:55.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:56 np0005465988 nova_compute[236126]: 2025-10-02 12:03:56.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:03:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:57.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:03:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:57.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:03:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:03:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:03:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:03:59.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:03:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:03:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:03:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:03:59.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:04:00 np0005465988 nova_compute[236126]: 2025-10-02 12:04:00.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:04:01 np0005465988 nova_compute[236126]: 2025-10-02 12:04:01.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:01.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:01.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:04:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:04:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:04:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:04:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:04:02 np0005465988 ceph-mgr[76715]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3158772141
Oct  2 08:04:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:03.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:03.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:05 np0005465988 nova_compute[236126]: 2025-10-02 12:04:05.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:05.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:04:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:05.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:04:06 np0005465988 nova_compute[236126]: 2025-10-02 12:04:06.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 08:04:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1801.0 total, 600.0 interval#012Cumulative writes: 12K writes, 51K keys, 12K commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s#012Cumulative WAL: 12K writes, 3458 syncs, 3.49 writes per sync, written: 0.05 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6548 writes, 27K keys, 6548 commit groups, 1.0 writes per commit group, ingest: 29.48 MB, 0.05 MB/s#012Interval WAL: 6548 writes, 2553 syncs, 2.56 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct  2 08:04:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:04:07 np0005465988 podman[246651]: 2025-10-02 12:04:07.539221958 +0000 UTC m=+0.073101895 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:04:07 np0005465988 podman[246652]: 2025-10-02 12:04:07.56625183 +0000 UTC m=+0.090258301 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Oct  2 08:04:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:07.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:07 np0005465988 podman[246650]: 2025-10-02 12:04:07.587286178 +0000 UTC m=+0.123809091 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:04:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:07.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:04:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:09.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:04:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:10.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:10 np0005465988 nova_compute[236126]: 2025-10-02 12:04:10.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:10 np0005465988 nova_compute[236126]: 2025-10-02 12:04:10.857 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:11 np0005465988 nova_compute[236126]: 2025-10-02 12:04:11.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:11.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:12.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:04:12 np0005465988 ovn_controller[132601]: 2025-10-02T12:04:12Z|00054|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  2 08:04:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:04:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:13.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:04:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:04:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:14.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:04:15 np0005465988 nova_compute[236126]: 2025-10-02 12:04:15.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:15.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:16.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:16 np0005465988 nova_compute[236126]: 2025-10-02 12:04:16.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:04:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:17.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:04:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:18.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:04:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:04:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:04:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:04:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:19.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:04:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:04:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:20.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:04:20 np0005465988 nova_compute[236126]: 2025-10-02 12:04:20.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:21 np0005465988 nova_compute[236126]: 2025-10-02 12:04:21.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:21 np0005465988 podman[246820]: 2025-10-02 12:04:21.578299415 +0000 UTC m=+0.099941932 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:04:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:21.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:22.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:04:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:23.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:04:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:24.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:04:25 np0005465988 nova_compute[236126]: 2025-10-02 12:04:25.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:25.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:04:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:26.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:04:26 np0005465988 nova_compute[236126]: 2025-10-02 12:04:26.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:04:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:27.332 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:27.333 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:27.333 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:27.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:28.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:28 np0005465988 nova_compute[236126]: 2025-10-02 12:04:28.126 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "17426d60-57ac-41a5-9ae2-688821fe7f56" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:28 np0005465988 nova_compute[236126]: 2025-10-02 12:04:28.126 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:28 np0005465988 nova_compute[236126]: 2025-10-02 12:04:28.155 2 DEBUG nova.compute.manager [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:04:28 np0005465988 nova_compute[236126]: 2025-10-02 12:04:28.312 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:28 np0005465988 nova_compute[236126]: 2025-10-02 12:04:28.313 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:28 np0005465988 nova_compute[236126]: 2025-10-02 12:04:28.322 2 DEBUG nova.virt.hardware [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:04:28 np0005465988 nova_compute[236126]: 2025-10-02 12:04:28.322 2 INFO nova.compute.claims [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:04:28 np0005465988 nova_compute[236126]: 2025-10-02 12:04:28.427 2 DEBUG nova.scheduler.client.report [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:04:28 np0005465988 nova_compute[236126]: 2025-10-02 12:04:28.450 2 DEBUG nova.scheduler.client.report [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:04:28 np0005465988 nova_compute[236126]: 2025-10-02 12:04:28.451 2 DEBUG nova.compute.provider_tree [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:04:28 np0005465988 nova_compute[236126]: 2025-10-02 12:04:28.466 2 DEBUG nova.scheduler.client.report [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:04:28 np0005465988 nova_compute[236126]: 2025-10-02 12:04:28.500 2 DEBUG nova.scheduler.client.report [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:04:28 np0005465988 nova_compute[236126]: 2025-10-02 12:04:28.568 2 DEBUG oslo_concurrency.processutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:04:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1311168690' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:28.999 2 DEBUG oslo_concurrency.processutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.005 2 DEBUG nova.compute.provider_tree [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.029 2 DEBUG nova.scheduler.client.report [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.058 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.059 2 DEBUG nova.compute.manager [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.128 2 DEBUG nova.compute.manager [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.128 2 DEBUG nova.network.neutron [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.149 2 INFO nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.178 2 DEBUG nova.compute.manager [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.264 2 DEBUG nova.compute.manager [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.265 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.266 2 INFO nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Creating image(s)#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.292 2 DEBUG nova.storage.rbd_utils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] rbd image 17426d60-57ac-41a5-9ae2-688821fe7f56_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.327 2 DEBUG nova.storage.rbd_utils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] rbd image 17426d60-57ac-41a5-9ae2-688821fe7f56_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.354 2 DEBUG nova.storage.rbd_utils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] rbd image 17426d60-57ac-41a5-9ae2-688821fe7f56_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.357 2 DEBUG oslo_concurrency.processutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.417 2 DEBUG oslo_concurrency.processutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.418 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.419 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.419 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.444 2 DEBUG nova.storage.rbd_utils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] rbd image 17426d60-57ac-41a5-9ae2-688821fe7f56_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.448 2 DEBUG oslo_concurrency.processutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 17426d60-57ac-41a5-9ae2-688821fe7f56_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.519 2 DEBUG nova.policy [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8850add40b254d198f270d9e64c777d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9afa78cc4dec419babdf61fd31f46e28', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:04:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:29.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.763 2 DEBUG oslo_concurrency.processutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 17426d60-57ac-41a5-9ae2-688821fe7f56_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.316s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.831 2 DEBUG nova.storage.rbd_utils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] resizing rbd image 17426d60-57ac-41a5-9ae2-688821fe7f56_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.970 2 DEBUG nova.objects.instance [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lazy-loading 'migration_context' on Instance uuid 17426d60-57ac-41a5-9ae2-688821fe7f56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.995 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.995 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Ensure instance console log exists: /var/lib/nova/instances/17426d60-57ac-41a5-9ae2-688821fe7f56/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.996 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.996 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:29 np0005465988 nova_compute[236126]: 2025-10-02 12:04:29.996 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:30.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:30 np0005465988 nova_compute[236126]: 2025-10-02 12:04:30.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:30 np0005465988 nova_compute[236126]: 2025-10-02 12:04:30.876 2 DEBUG nova.network.neutron [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Successfully created port: f98d3352-aeeb-4929-9920-2a306cb9558d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:04:31 np0005465988 nova_compute[236126]: 2025-10-02 12:04:31.523 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:31 np0005465988 nova_compute[236126]: 2025-10-02 12:04:31.523 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:31 np0005465988 nova_compute[236126]: 2025-10-02 12:04:31.524 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:04:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:31.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:04:31 np0005465988 nova_compute[236126]: 2025-10-02 12:04:31.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:31 np0005465988 nova_compute[236126]: 2025-10-02 12:04:31.863 2 DEBUG nova.network.neutron [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Successfully updated port: f98d3352-aeeb-4929-9920-2a306cb9558d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:04:31 np0005465988 nova_compute[236126]: 2025-10-02 12:04:31.888 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "refresh_cache-17426d60-57ac-41a5-9ae2-688821fe7f56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:04:31 np0005465988 nova_compute[236126]: 2025-10-02 12:04:31.888 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquired lock "refresh_cache-17426d60-57ac-41a5-9ae2-688821fe7f56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:04:31 np0005465988 nova_compute[236126]: 2025-10-02 12:04:31.888 2 DEBUG nova.network.neutron [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:04:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:32.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:32 np0005465988 nova_compute[236126]: 2025-10-02 12:04:32.143 2 DEBUG nova.compute.manager [req-95cbc89e-2e63-4f78-ad55-212d514b0855 req-a0a811e2-81c9-4940-8f4f-0eed682829f0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Received event network-changed-f98d3352-aeeb-4929-9920-2a306cb9558d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:32 np0005465988 nova_compute[236126]: 2025-10-02 12:04:32.144 2 DEBUG nova.compute.manager [req-95cbc89e-2e63-4f78-ad55-212d514b0855 req-a0a811e2-81c9-4940-8f4f-0eed682829f0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Refreshing instance network info cache due to event network-changed-f98d3352-aeeb-4929-9920-2a306cb9558d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:04:32 np0005465988 nova_compute[236126]: 2025-10-02 12:04:32.145 2 DEBUG oslo_concurrency.lockutils [req-95cbc89e-2e63-4f78-ad55-212d514b0855 req-a0a811e2-81c9-4940-8f4f-0eed682829f0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-17426d60-57ac-41a5-9ae2-688821fe7f56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:04:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:04:32 np0005465988 nova_compute[236126]: 2025-10-02 12:04:32.462 2 DEBUG nova.network.neutron [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:04:32 np0005465988 nova_compute[236126]: 2025-10-02 12:04:32.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:32 np0005465988 nova_compute[236126]: 2025-10-02 12:04:32.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:32 np0005465988 nova_compute[236126]: 2025-10-02 12:04:32.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:04:32 np0005465988 nova_compute[236126]: 2025-10-02 12:04:32.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:32 np0005465988 nova_compute[236126]: 2025-10-02 12:04:32.533 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:32 np0005465988 nova_compute[236126]: 2025-10-02 12:04:32.534 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:32 np0005465988 nova_compute[236126]: 2025-10-02 12:04:32.534 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:32 np0005465988 nova_compute[236126]: 2025-10-02 12:04:32.534 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:04:32 np0005465988 nova_compute[236126]: 2025-10-02 12:04:32.534 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:04:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4190072840' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.010 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.180 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.181 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4873MB free_disk=20.96361541748047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.181 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.182 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.281 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 17426d60-57ac-41a5-9ae2-688821fe7f56 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.282 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.282 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.354 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:33.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:04:33 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3944645154' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.754 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.761 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.778 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.805 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.805 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.878 2 DEBUG nova.network.neutron [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Updating instance_info_cache with network_info: [{"id": "f98d3352-aeeb-4929-9920-2a306cb9558d", "address": "fa:16:3e:69:cf:4c", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf98d3352-ae", "ovs_interfaceid": "f98d3352-aeeb-4929-9920-2a306cb9558d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.930 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Releasing lock "refresh_cache-17426d60-57ac-41a5-9ae2-688821fe7f56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.931 2 DEBUG nova.compute.manager [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Instance network_info: |[{"id": "f98d3352-aeeb-4929-9920-2a306cb9558d", "address": "fa:16:3e:69:cf:4c", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf98d3352-ae", "ovs_interfaceid": "f98d3352-aeeb-4929-9920-2a306cb9558d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.932 2 DEBUG oslo_concurrency.lockutils [req-95cbc89e-2e63-4f78-ad55-212d514b0855 req-a0a811e2-81c9-4940-8f4f-0eed682829f0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-17426d60-57ac-41a5-9ae2-688821fe7f56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.932 2 DEBUG nova.network.neutron [req-95cbc89e-2e63-4f78-ad55-212d514b0855 req-a0a811e2-81c9-4940-8f4f-0eed682829f0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Refreshing network info cache for port f98d3352-aeeb-4929-9920-2a306cb9558d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.937 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Start _get_guest_xml network_info=[{"id": "f98d3352-aeeb-4929-9920-2a306cb9558d", "address": "fa:16:3e:69:cf:4c", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf98d3352-ae", "ovs_interfaceid": "f98d3352-aeeb-4929-9920-2a306cb9558d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.943 2 WARNING nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.959 2 DEBUG nova.virt.libvirt.host [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.960 2 DEBUG nova.virt.libvirt.host [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.966 2 DEBUG nova.virt.libvirt.host [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.967 2 DEBUG nova.virt.libvirt.host [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.969 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.969 2 DEBUG nova.virt.hardware [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.970 2 DEBUG nova.virt.hardware [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.971 2 DEBUG nova.virt.hardware [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.971 2 DEBUG nova.virt.hardware [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.972 2 DEBUG nova.virt.hardware [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.972 2 DEBUG nova.virt.hardware [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.973 2 DEBUG nova.virt.hardware [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.973 2 DEBUG nova.virt.hardware [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.974 2 DEBUG nova.virt.hardware [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.974 2 DEBUG nova.virt.hardware [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.975 2 DEBUG nova.virt.hardware [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:04:33 np0005465988 nova_compute[236126]: 2025-10-02 12:04:33.980 2 DEBUG oslo_concurrency.processutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:04:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:34.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:04:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:04:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2967803754' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:04:34 np0005465988 nova_compute[236126]: 2025-10-02 12:04:34.489 2 DEBUG oslo_concurrency.processutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:34 np0005465988 nova_compute[236126]: 2025-10-02 12:04:34.518 2 DEBUG nova.storage.rbd_utils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] rbd image 17426d60-57ac-41a5-9ae2-688821fe7f56_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:04:34 np0005465988 nova_compute[236126]: 2025-10-02 12:04:34.521 2 DEBUG oslo_concurrency.processutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:34 np0005465988 nova_compute[236126]: 2025-10-02 12:04:34.807 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:04:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/260913443' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:04:34 np0005465988 nova_compute[236126]: 2025-10-02 12:04:34.943 2 DEBUG oslo_concurrency.processutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:34 np0005465988 nova_compute[236126]: 2025-10-02 12:04:34.945 2 DEBUG nova.virt.libvirt.vif [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:04:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-966324223',display_name='tempest-ServersAdminTestJSON-server-966324223',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-966324223',id=24,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9afa78cc4dec419babdf61fd31f46e28',ramdisk_id='',reservation_id='r-ps9o5e69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-518249049',owner_user_name='tempest-ServersAdminTestJSON-518249049-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:04:29Z,user_data=None,user_id='8850add40b254d198f270d9e64c777d5',uuid=17426d60-57ac-41a5-9ae2-688821fe7f56,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f98d3352-aeeb-4929-9920-2a306cb9558d", "address": "fa:16:3e:69:cf:4c", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf98d3352-ae", "ovs_interfaceid": "f98d3352-aeeb-4929-9920-2a306cb9558d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:04:34 np0005465988 nova_compute[236126]: 2025-10-02 12:04:34.946 2 DEBUG nova.network.os_vif_util [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Converting VIF {"id": "f98d3352-aeeb-4929-9920-2a306cb9558d", "address": "fa:16:3e:69:cf:4c", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf98d3352-ae", "ovs_interfaceid": "f98d3352-aeeb-4929-9920-2a306cb9558d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:04:34 np0005465988 nova_compute[236126]: 2025-10-02 12:04:34.947 2 DEBUG nova.network.os_vif_util [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:cf:4c,bridge_name='br-int',has_traffic_filtering=True,id=f98d3352-aeeb-4929-9920-2a306cb9558d,network=Network(80106802-d877-42c6-b2a9-50b050f6b08f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf98d3352-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:04:34 np0005465988 nova_compute[236126]: 2025-10-02 12:04:34.948 2 DEBUG nova.objects.instance [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lazy-loading 'pci_devices' on Instance uuid 17426d60-57ac-41a5-9ae2-688821fe7f56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.011 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  <uuid>17426d60-57ac-41a5-9ae2-688821fe7f56</uuid>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  <name>instance-00000018</name>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServersAdminTestJSON-server-966324223</nova:name>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:04:33</nova:creationTime>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <nova:user uuid="8850add40b254d198f270d9e64c777d5">tempest-ServersAdminTestJSON-518249049-project-member</nova:user>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <nova:project uuid="9afa78cc4dec419babdf61fd31f46e28">tempest-ServersAdminTestJSON-518249049</nova:project>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <nova:port uuid="f98d3352-aeeb-4929-9920-2a306cb9558d">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <entry name="serial">17426d60-57ac-41a5-9ae2-688821fe7f56</entry>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <entry name="uuid">17426d60-57ac-41a5-9ae2-688821fe7f56</entry>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/17426d60-57ac-41a5-9ae2-688821fe7f56_disk">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/17426d60-57ac-41a5-9ae2-688821fe7f56_disk.config">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:69:cf:4c"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <target dev="tapf98d3352-ae"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/17426d60-57ac-41a5-9ae2-688821fe7f56/console.log" append="off"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:04:35 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:04:35 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:04:35 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:04:35 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.014 2 DEBUG nova.compute.manager [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Preparing to wait for external event network-vif-plugged-f98d3352-aeeb-4929-9920-2a306cb9558d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.014 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.015 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.015 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.016 2 DEBUG nova.virt.libvirt.vif [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:04:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-966324223',display_name='tempest-ServersAdminTestJSON-server-966324223',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-966324223',id=24,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9afa78cc4dec419babdf61fd31f46e28',ramdisk_id='',reservation_id='r-ps9o5e69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-518249049',owner_user_name='tempest-ServersAdminTestJSON-518249049-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:04:29Z,user_data=None,user_id='8850add40b254d198f270d9e64c777d5',uuid=17426d60-57ac-41a5-9ae2-688821fe7f56,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f98d3352-aeeb-4929-9920-2a306cb9558d", "address": "fa:16:3e:69:cf:4c", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf98d3352-ae", "ovs_interfaceid": "f98d3352-aeeb-4929-9920-2a306cb9558d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.017 2 DEBUG nova.network.os_vif_util [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Converting VIF {"id": "f98d3352-aeeb-4929-9920-2a306cb9558d", "address": "fa:16:3e:69:cf:4c", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf98d3352-ae", "ovs_interfaceid": "f98d3352-aeeb-4929-9920-2a306cb9558d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.018 2 DEBUG nova.network.os_vif_util [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:cf:4c,bridge_name='br-int',has_traffic_filtering=True,id=f98d3352-aeeb-4929-9920-2a306cb9558d,network=Network(80106802-d877-42c6-b2a9-50b050f6b08f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf98d3352-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.019 2 DEBUG os_vif [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:cf:4c,bridge_name='br-int',has_traffic_filtering=True,id=f98d3352-aeeb-4929-9920-2a306cb9558d,network=Network(80106802-d877-42c6-b2a9-50b050f6b08f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf98d3352-ae') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.021 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.022 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.027 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf98d3352-ae, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.027 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf98d3352-ae, col_values=(('external_ids', {'iface-id': 'f98d3352-aeeb-4929-9920-2a306cb9558d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:69:cf:4c', 'vm-uuid': '17426d60-57ac-41a5-9ae2-688821fe7f56'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:35 np0005465988 NetworkManager[45041]: <info>  [1759406675.0313] manager: (tapf98d3352-ae): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.040 2 INFO os_vif [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:cf:4c,bridge_name='br-int',has_traffic_filtering=True,id=f98d3352-aeeb-4929-9920-2a306cb9558d,network=Network(80106802-d877-42c6-b2a9-50b050f6b08f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf98d3352-ae')#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.236 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.237 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.238 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] No VIF found with MAC fa:16:3e:69:cf:4c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.239 2 INFO nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Using config drive#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.278 2 DEBUG nova.storage.rbd_utils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] rbd image 17426d60-57ac-41a5-9ae2-688821fe7f56_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.496 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:04:35 np0005465988 nova_compute[236126]: 2025-10-02 12:04:35.497 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:04:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:35.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:36.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:36 np0005465988 nova_compute[236126]: 2025-10-02 12:04:36.493 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:36 np0005465988 nova_compute[236126]: 2025-10-02 12:04:36.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:04:37 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Oct  2 08:04:37 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Oct  2 08:04:37 np0005465988 nova_compute[236126]: 2025-10-02 12:04:37.436 2 INFO nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Creating config drive at /var/lib/nova/instances/17426d60-57ac-41a5-9ae2-688821fe7f56/disk.config#033[00m
Oct  2 08:04:37 np0005465988 nova_compute[236126]: 2025-10-02 12:04:37.441 2 DEBUG oslo_concurrency.processutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/17426d60-57ac-41a5-9ae2-688821fe7f56/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp06uusm_1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:37 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Oct  2 08:04:37 np0005465988 nova_compute[236126]: 2025-10-02 12:04:37.586 2 DEBUG oslo_concurrency.processutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/17426d60-57ac-41a5-9ae2-688821fe7f56/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp06uusm_1" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:04:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:37.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:04:37 np0005465988 nova_compute[236126]: 2025-10-02 12:04:37.625 2 DEBUG nova.storage.rbd_utils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] rbd image 17426d60-57ac-41a5-9ae2-688821fe7f56_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:04:37 np0005465988 nova_compute[236126]: 2025-10-02 12:04:37.631 2 DEBUG oslo_concurrency.processutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/17426d60-57ac-41a5-9ae2-688821fe7f56/disk.config 17426d60-57ac-41a5-9ae2-688821fe7f56_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:37 np0005465988 nova_compute[236126]: 2025-10-02 12:04:37.797 2 DEBUG oslo_concurrency.processutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/17426d60-57ac-41a5-9ae2-688821fe7f56/disk.config 17426d60-57ac-41a5-9ae2-688821fe7f56_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:37 np0005465988 nova_compute[236126]: 2025-10-02 12:04:37.798 2 INFO nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Deleting local config drive /var/lib/nova/instances/17426d60-57ac-41a5-9ae2-688821fe7f56/disk.config because it was imported into RBD.#033[00m
Oct  2 08:04:37 np0005465988 kernel: tapf98d3352-ae: entered promiscuous mode
Oct  2 08:04:37 np0005465988 NetworkManager[45041]: <info>  [1759406677.8729] manager: (tapf98d3352-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Oct  2 08:04:37 np0005465988 nova_compute[236126]: 2025-10-02 12:04:37.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:04:37Z|00055|binding|INFO|Claiming lport f98d3352-aeeb-4929-9920-2a306cb9558d for this chassis.
Oct  2 08:04:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:04:37Z|00056|binding|INFO|f98d3352-aeeb-4929-9920-2a306cb9558d: Claiming fa:16:3e:69:cf:4c 10.100.0.3
Oct  2 08:04:37 np0005465988 nova_compute[236126]: 2025-10-02 12:04:37.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:37.912 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:cf:4c 10.100.0.3'], port_security=['fa:16:3e:69:cf:4c 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '17426d60-57ac-41a5-9ae2-688821fe7f56', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80106802-d877-42c6-b2a9-50b050f6b08f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9afa78cc4dec419babdf61fd31f46e28', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8fbb5420-10f4-405b-bd01-713020f7e518', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03aa6f10-2374-4fa3-bc90-1fcb8815afb8, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f98d3352-aeeb-4929-9920-2a306cb9558d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:04:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:37.914 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f98d3352-aeeb-4929-9920-2a306cb9558d in datapath 80106802-d877-42c6-b2a9-50b050f6b08f bound to our chassis#033[00m
Oct  2 08:04:37 np0005465988 systemd-udevd[247292]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:04:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:37.915 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 80106802-d877-42c6-b2a9-50b050f6b08f#033[00m
Oct  2 08:04:37 np0005465988 systemd-machined[192594]: New machine qemu-10-instance-00000018.
Oct  2 08:04:37 np0005465988 NetworkManager[45041]: <info>  [1759406677.9279] device (tapf98d3352-ae): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:04:37 np0005465988 NetworkManager[45041]: <info>  [1759406677.9291] device (tapf98d3352-ae): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:04:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:37.929 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a632219a-2e4f-41b0-9772-3487807c9ab9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:37.930 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap80106802-d1 in ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:04:37 np0005465988 systemd[1]: Started Virtual Machine qemu-10-instance-00000018.
Oct  2 08:04:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:37.933 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap80106802-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:04:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:37.934 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dc144eef-ec9e-4ed3-9f54-5631184102fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:37.935 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a77c9e84-f544-4620-bdd7-971e5e285b28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:37.950 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[cc403a3f-8b1e-4d33-af74-ac79bd04fefb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:37.965 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[aec9daca-d8fc-48ae-a16b-26ab2d5f23db]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:04:37Z|00057|binding|INFO|Setting lport f98d3352-aeeb-4929-9920-2a306cb9558d ovn-installed in OVS
Oct  2 08:04:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:04:37Z|00058|binding|INFO|Setting lport f98d3352-aeeb-4929-9920-2a306cb9558d up in Southbound
Oct  2 08:04:37 np0005465988 nova_compute[236126]: 2025-10-02 12:04:37.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:37 np0005465988 podman[247253]: 2025-10-02 12:04:37.979499813 +0000 UTC m=+0.130543797 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  2 08:04:37 np0005465988 podman[247255]: 2025-10-02 12:04:37.988074141 +0000 UTC m=+0.125037277 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:04:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:37.994 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f75e474f-93b8-44dd-8bad-02836f579c00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:37.999 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dc78514b-5453-4e08-a63b-cccacda84adb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:38 np0005465988 NetworkManager[45041]: <info>  [1759406678.0002] manager: (tap80106802-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Oct  2 08:04:38 np0005465988 podman[247249]: 2025-10-02 12:04:38.00880199 +0000 UTC m=+0.156059564 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.029 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[43ec9696-2709-43e4-98e8-457ce7338a72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.031 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9c00a59b-0006-41ca-9b58-8248f7436f0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:38.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:38 np0005465988 NetworkManager[45041]: <info>  [1759406678.0508] device (tap80106802-d0): carrier: link connected
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.058 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[94a294d6-47f9-4e65-929d-ad4539e42935]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.073 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5204bc8e-c95e-4e8e-a468-afd4e3ca2be1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap80106802-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:27:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473838, 'reachable_time': 20756, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247357, 'error': None, 'target': 'ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.091 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a3802ce7-6003-428d-ae07-9a269eca08d3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:27b6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473838, 'tstamp': 473838}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247359, 'error': None, 'target': 'ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.103 2 DEBUG nova.network.neutron [req-95cbc89e-2e63-4f78-ad55-212d514b0855 req-a0a811e2-81c9-4940-8f4f-0eed682829f0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Updated VIF entry in instance network info cache for port f98d3352-aeeb-4929-9920-2a306cb9558d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.104 2 DEBUG nova.network.neutron [req-95cbc89e-2e63-4f78-ad55-212d514b0855 req-a0a811e2-81c9-4940-8f4f-0eed682829f0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Updating instance_info_cache with network_info: [{"id": "f98d3352-aeeb-4929-9920-2a306cb9558d", "address": "fa:16:3e:69:cf:4c", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf98d3352-ae", "ovs_interfaceid": "f98d3352-aeeb-4929-9920-2a306cb9558d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.108 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[eec02f30-6190-4129-ba00-8913f0d868e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap80106802-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:27:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473838, 'reachable_time': 20756, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247360, 'error': None, 'target': 'ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.126 2 DEBUG oslo_concurrency.lockutils [req-95cbc89e-2e63-4f78-ad55-212d514b0855 req-a0a811e2-81c9-4940-8f4f-0eed682829f0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-17426d60-57ac-41a5-9ae2-688821fe7f56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.144 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[42d9b5ee-2225-4101-954d-8c54fe8f3cce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.208 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f963f2cb-d9a4-4da0-8ec1-1bb7896dbc32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.210 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80106802-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.210 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.211 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap80106802-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:38 np0005465988 NetworkManager[45041]: <info>  [1759406678.2141] manager: (tap80106802-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Oct  2 08:04:38 np0005465988 kernel: tap80106802-d0: entered promiscuous mode
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.219 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap80106802-d0, col_values=(('external_ids', {'iface-id': '3e3f512e-f85f-4c9c-b91d-072c570470c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:04:38Z|00059|binding|INFO|Releasing lport 3e3f512e-f85f-4c9c-b91d-072c570470c1 from this chassis (sb_readonly=0)
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.223 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/80106802-d877-42c6-b2a9-50b050f6b08f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/80106802-d877-42c6-b2a9-50b050f6b08f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.226 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[eafc9bfa-2620-4ce8-b1a1-283493f4be8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.227 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-80106802-d877-42c6-b2a9-50b050f6b08f
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/80106802-d877-42c6-b2a9-50b050f6b08f.pid.haproxy
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 80106802-d877-42c6-b2a9-50b050f6b08f
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:04:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:38.229 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f', 'env', 'PROCESS_TAG=haproxy-80106802-d877-42c6-b2a9-50b050f6b08f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/80106802-d877-42c6-b2a9-50b050f6b08f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.389 2 DEBUG nova.compute.manager [req-fca29f52-939a-4b06-af39-938a2b16b06c req-f07508c0-e6b7-4f34-83f4-dabac8a7d326 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Received event network-vif-plugged-f98d3352-aeeb-4929-9920-2a306cb9558d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.393 2 DEBUG oslo_concurrency.lockutils [req-fca29f52-939a-4b06-af39-938a2b16b06c req-f07508c0-e6b7-4f34-83f4-dabac8a7d326 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.395 2 DEBUG oslo_concurrency.lockutils [req-fca29f52-939a-4b06-af39-938a2b16b06c req-f07508c0-e6b7-4f34-83f4-dabac8a7d326 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.396 2 DEBUG oslo_concurrency.lockutils [req-fca29f52-939a-4b06-af39-938a2b16b06c req-f07508c0-e6b7-4f34-83f4-dabac8a7d326 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.397 2 DEBUG nova.compute.manager [req-fca29f52-939a-4b06-af39-938a2b16b06c req-f07508c0-e6b7-4f34-83f4-dabac8a7d326 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Processing event network-vif-plugged-f98d3352-aeeb-4929-9920-2a306cb9558d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:04:38 np0005465988 podman[247434]: 2025-10-02 12:04:38.626864045 +0000 UTC m=+0.057792692 container create ce507b3622be571e98e0ab5f292c35eac26b4de958d8f083a4f04b076136ff50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:04:38 np0005465988 systemd[1]: Started libpod-conmon-ce507b3622be571e98e0ab5f292c35eac26b4de958d8f083a4f04b076136ff50.scope.
Oct  2 08:04:38 np0005465988 podman[247434]: 2025-10-02 12:04:38.59454109 +0000 UTC m=+0.025469747 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:04:38 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:04:38 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec11a9301a1e6ea967e1856d0df2d55f5e225f94538f1ecee803b3b315a7b3dd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:04:38 np0005465988 podman[247434]: 2025-10-02 12:04:38.739149243 +0000 UTC m=+0.170077940 container init ce507b3622be571e98e0ab5f292c35eac26b4de958d8f083a4f04b076136ff50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:04:38 np0005465988 podman[247434]: 2025-10-02 12:04:38.746072933 +0000 UTC m=+0.177001580 container start ce507b3622be571e98e0ab5f292c35eac26b4de958d8f083a4f04b076136ff50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:04:38 np0005465988 neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f[247449]: [NOTICE]   (247453) : New worker (247455) forked
Oct  2 08:04:38 np0005465988 neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f[247449]: [NOTICE]   (247453) : Loading success.
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.868 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406678.867532, 17426d60-57ac-41a5-9ae2-688821fe7f56 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.869 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] VM Started (Lifecycle Event)#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.873 2 DEBUG nova.compute.manager [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.876 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.880 2 INFO nova.virt.libvirt.driver [-] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Instance spawned successfully.#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.880 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.924 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.927 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.934 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.934 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.935 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.935 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.935 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.936 2 DEBUG nova.virt.libvirt.driver [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.984 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.985 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406678.8686655, 17426d60-57ac-41a5-9ae2-688821fe7f56 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:38 np0005465988 nova_compute[236126]: 2025-10-02 12:04:38.985 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:04:39 np0005465988 nova_compute[236126]: 2025-10-02 12:04:39.046 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:39 np0005465988 nova_compute[236126]: 2025-10-02 12:04:39.052 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406678.8759904, 17426d60-57ac-41a5-9ae2-688821fe7f56 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:39 np0005465988 nova_compute[236126]: 2025-10-02 12:04:39.052 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:04:39 np0005465988 nova_compute[236126]: 2025-10-02 12:04:39.075 2 INFO nova.compute.manager [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Took 9.81 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:04:39 np0005465988 nova_compute[236126]: 2025-10-02 12:04:39.075 2 DEBUG nova.compute.manager [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:39 np0005465988 nova_compute[236126]: 2025-10-02 12:04:39.099 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:39 np0005465988 nova_compute[236126]: 2025-10-02 12:04:39.104 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:04:39 np0005465988 nova_compute[236126]: 2025-10-02 12:04:39.145 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:04:39 np0005465988 nova_compute[236126]: 2025-10-02 12:04:39.229 2 INFO nova.compute.manager [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Took 11.02 seconds to build instance.#033[00m
Oct  2 08:04:39 np0005465988 nova_compute[236126]: 2025-10-02 12:04:39.283 2 DEBUG oslo_concurrency.lockutils [None req-f46aca6c-32f6-4740-a914-18f70f7fe317 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:39.546 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:04:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:39.548 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:04:39 np0005465988 nova_compute[236126]: 2025-10-02 12:04:39.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000057s ======
Oct  2 08:04:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:39.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Oct  2 08:04:40 np0005465988 nova_compute[236126]: 2025-10-02 12:04:40.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:40.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:40 np0005465988 nova_compute[236126]: 2025-10-02 12:04:40.490 2 DEBUG nova.compute.manager [req-5de7ba45-fe38-4e15-b11f-22582207c423 req-353e335d-57f8-4e97-9325-822bdb16f5de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Received event network-vif-plugged-f98d3352-aeeb-4929-9920-2a306cb9558d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:40 np0005465988 nova_compute[236126]: 2025-10-02 12:04:40.491 2 DEBUG oslo_concurrency.lockutils [req-5de7ba45-fe38-4e15-b11f-22582207c423 req-353e335d-57f8-4e97-9325-822bdb16f5de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:40 np0005465988 nova_compute[236126]: 2025-10-02 12:04:40.491 2 DEBUG oslo_concurrency.lockutils [req-5de7ba45-fe38-4e15-b11f-22582207c423 req-353e335d-57f8-4e97-9325-822bdb16f5de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:40 np0005465988 nova_compute[236126]: 2025-10-02 12:04:40.491 2 DEBUG oslo_concurrency.lockutils [req-5de7ba45-fe38-4e15-b11f-22582207c423 req-353e335d-57f8-4e97-9325-822bdb16f5de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:40 np0005465988 nova_compute[236126]: 2025-10-02 12:04:40.492 2 DEBUG nova.compute.manager [req-5de7ba45-fe38-4e15-b11f-22582207c423 req-353e335d-57f8-4e97-9325-822bdb16f5de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] No waiting events found dispatching network-vif-plugged-f98d3352-aeeb-4929-9920-2a306cb9558d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:04:40 np0005465988 nova_compute[236126]: 2025-10-02 12:04:40.492 2 WARNING nova.compute.manager [req-5de7ba45-fe38-4e15-b11f-22582207c423 req-353e335d-57f8-4e97-9325-822bdb16f5de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Received unexpected event network-vif-plugged-f98d3352-aeeb-4929-9920-2a306cb9558d for instance with vm_state active and task_state None.#033[00m
Oct  2 08:04:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:04:40.550 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:41.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:41 np0005465988 nova_compute[236126]: 2025-10-02 12:04:41.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:04:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:42.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:04:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:04:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:43.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:44.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:45 np0005465988 nova_compute[236126]: 2025-10-02 12:04:45.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:45.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:46.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:46 np0005465988 nova_compute[236126]: 2025-10-02 12:04:46.311 2 DEBUG oslo_concurrency.lockutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Acquiring lock "89fb72b4-0730-4453-8995-330bfa32454d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:46 np0005465988 nova_compute[236126]: 2025-10-02 12:04:46.312 2 DEBUG oslo_concurrency.lockutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lock "89fb72b4-0730-4453-8995-330bfa32454d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:46 np0005465988 nova_compute[236126]: 2025-10-02 12:04:46.341 2 DEBUG nova.compute.manager [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:04:46 np0005465988 nova_compute[236126]: 2025-10-02 12:04:46.448 2 DEBUG oslo_concurrency.lockutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:46 np0005465988 nova_compute[236126]: 2025-10-02 12:04:46.449 2 DEBUG oslo_concurrency.lockutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:46 np0005465988 nova_compute[236126]: 2025-10-02 12:04:46.458 2 DEBUG nova.virt.hardware [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:04:46 np0005465988 nova_compute[236126]: 2025-10-02 12:04:46.460 2 INFO nova.compute.claims [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:04:46 np0005465988 nova_compute[236126]: 2025-10-02 12:04:46.610 2 DEBUG oslo_concurrency.processutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:46 np0005465988 nova_compute[236126]: 2025-10-02 12:04:46.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:04:47 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/203995729' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.088 2 DEBUG oslo_concurrency.processutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.096 2 DEBUG nova.compute.provider_tree [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.114 2 DEBUG nova.scheduler.client.report [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.144 2 DEBUG oslo_concurrency.lockutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.145 2 DEBUG nova.compute.manager [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:04:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.203 2 DEBUG nova.compute.manager [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.203 2 DEBUG nova.network.neutron [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.237 2 INFO nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.260 2 DEBUG nova.compute.manager [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.399 2 DEBUG nova.compute.manager [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.401 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.401 2 INFO nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Creating image(s)#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.422 2 DEBUG nova.storage.rbd_utils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] rbd image 89fb72b4-0730-4453-8995-330bfa32454d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.453 2 DEBUG nova.storage.rbd_utils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] rbd image 89fb72b4-0730-4453-8995-330bfa32454d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.486 2 DEBUG nova.storage.rbd_utils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] rbd image 89fb72b4-0730-4453-8995-330bfa32454d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.490 2 DEBUG oslo_concurrency.processutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.577 2 DEBUG oslo_concurrency.processutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.578 2 DEBUG oslo_concurrency.lockutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.578 2 DEBUG oslo_concurrency.lockutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.579 2 DEBUG oslo_concurrency.lockutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.600 2 DEBUG nova.storage.rbd_utils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] rbd image 89fb72b4-0730-4453-8995-330bfa32454d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.604 2 DEBUG oslo_concurrency.processutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 89fb72b4-0730-4453-8995-330bfa32454d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:47.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.733 2 DEBUG nova.network.neutron [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.734 2 DEBUG nova.compute.manager [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:04:47 np0005465988 nova_compute[236126]: 2025-10-02 12:04:47.978 2 DEBUG oslo_concurrency.processutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 89fb72b4-0730-4453-8995-330bfa32454d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.374s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:48.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.064 2 DEBUG nova.storage.rbd_utils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] resizing rbd image 89fb72b4-0730-4453-8995-330bfa32454d_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.175 2 DEBUG nova.objects.instance [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lazy-loading 'migration_context' on Instance uuid 89fb72b4-0730-4453-8995-330bfa32454d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.192 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.193 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Ensure instance console log exists: /var/lib/nova/instances/89fb72b4-0730-4453-8995-330bfa32454d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.193 2 DEBUG oslo_concurrency.lockutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.194 2 DEBUG oslo_concurrency.lockutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.194 2 DEBUG oslo_concurrency.lockutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.196 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.199 2 WARNING nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.204 2 DEBUG nova.virt.libvirt.host [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.204 2 DEBUG nova.virt.libvirt.host [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.209 2 DEBUG nova.virt.libvirt.host [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.209 2 DEBUG nova.virt.libvirt.host [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.210 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.210 2 DEBUG nova.virt.hardware [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.211 2 DEBUG nova.virt.hardware [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.211 2 DEBUG nova.virt.hardware [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.211 2 DEBUG nova.virt.hardware [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.211 2 DEBUG nova.virt.hardware [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.211 2 DEBUG nova.virt.hardware [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.212 2 DEBUG nova.virt.hardware [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.212 2 DEBUG nova.virt.hardware [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.212 2 DEBUG nova.virt.hardware [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.212 2 DEBUG nova.virt.hardware [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.212 2 DEBUG nova.virt.hardware [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.215 2 DEBUG oslo_concurrency.processutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:04:48 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3393721122' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.628 2 DEBUG oslo_concurrency.processutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.658 2 DEBUG nova.storage.rbd_utils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] rbd image 89fb72b4-0730-4453-8995-330bfa32454d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:04:48 np0005465988 nova_compute[236126]: 2025-10-02 12:04:48.663 2 DEBUG oslo_concurrency.processutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:04:49 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1900020686' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:04:49 np0005465988 nova_compute[236126]: 2025-10-02 12:04:49.171 2 DEBUG oslo_concurrency.processutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:49 np0005465988 nova_compute[236126]: 2025-10-02 12:04:49.173 2 DEBUG nova.objects.instance [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lazy-loading 'pci_devices' on Instance uuid 89fb72b4-0730-4453-8995-330bfa32454d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:04:49 np0005465988 nova_compute[236126]: 2025-10-02 12:04:49.207 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  <uuid>89fb72b4-0730-4453-8995-330bfa32454d</uuid>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  <name>instance-0000001c</name>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServersAdminNegativeTestJSON-server-1626545353</nova:name>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:04:48</nova:creationTime>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <nova:user uuid="c5ab011ce9f04adbb19dab5fa5ed1714">tempest-ServersAdminNegativeTestJSON-1328318995-project-member</nova:user>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <nova:project uuid="6b8ddbfa33c348beb1c883371b5c6909">tempest-ServersAdminNegativeTestJSON-1328318995</nova:project>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <nova:ports/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <entry name="serial">89fb72b4-0730-4453-8995-330bfa32454d</entry>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <entry name="uuid">89fb72b4-0730-4453-8995-330bfa32454d</entry>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/89fb72b4-0730-4453-8995-330bfa32454d_disk">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/89fb72b4-0730-4453-8995-330bfa32454d_disk.config">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/89fb72b4-0730-4453-8995-330bfa32454d/console.log" append="off"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:04:49 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:04:49 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:04:49 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:04:49 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  2 08:04:49 np0005465988 nova_compute[236126]: 2025-10-02 12:04:49.361 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:04:49 np0005465988 nova_compute[236126]: 2025-10-02 12:04:49.361 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:04:49 np0005465988 nova_compute[236126]: 2025-10-02 12:04:49.362 2 INFO nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Using config drive
Oct  2 08:04:49 np0005465988 nova_compute[236126]: 2025-10-02 12:04:49.400 2 DEBUG nova.storage.rbd_utils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] rbd image 89fb72b4-0730-4453-8995-330bfa32454d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:04:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:49.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:50 np0005465988 nova_compute[236126]: 2025-10-02 12:04:50.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:04:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:50.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:50 np0005465988 nova_compute[236126]: 2025-10-02 12:04:50.439 2 INFO nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Creating config drive at /var/lib/nova/instances/89fb72b4-0730-4453-8995-330bfa32454d/disk.config
Oct  2 08:04:50 np0005465988 nova_compute[236126]: 2025-10-02 12:04:50.449 2 DEBUG oslo_concurrency.processutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/89fb72b4-0730-4453-8995-330bfa32454d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxvbuab5l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:50 np0005465988 nova_compute[236126]: 2025-10-02 12:04:50.599 2 DEBUG oslo_concurrency.processutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/89fb72b4-0730-4453-8995-330bfa32454d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxvbuab5l" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:50 np0005465988 nova_compute[236126]: 2025-10-02 12:04:50.636 2 DEBUG nova.storage.rbd_utils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] rbd image 89fb72b4-0730-4453-8995-330bfa32454d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:04:50 np0005465988 nova_compute[236126]: 2025-10-02 12:04:50.640 2 DEBUG oslo_concurrency.processutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/89fb72b4-0730-4453-8995-330bfa32454d/disk.config 89fb72b4-0730-4453-8995-330bfa32454d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:50 np0005465988 nova_compute[236126]: 2025-10-02 12:04:50.916 2 DEBUG oslo_concurrency.processutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/89fb72b4-0730-4453-8995-330bfa32454d/disk.config 89fb72b4-0730-4453-8995-330bfa32454d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.275s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:50 np0005465988 nova_compute[236126]: 2025-10-02 12:04:50.917 2 INFO nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Deleting local config drive /var/lib/nova/instances/89fb72b4-0730-4453-8995-330bfa32454d/disk.config because it was imported into RBD.
Oct  2 08:04:51 np0005465988 systemd-machined[192594]: New machine qemu-11-instance-0000001c.
Oct  2 08:04:51 np0005465988 systemd[1]: Started Virtual Machine qemu-11-instance-0000001c.
Oct  2 08:04:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:04:51Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:69:cf:4c 10.100.0.3
Oct  2 08:04:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:04:51Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:69:cf:4c 10.100.0.3
Oct  2 08:04:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:04:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:51.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:04:51 np0005465988 nova_compute[236126]: 2025-10-02 12:04:51.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:04:51 np0005465988 nova_compute[236126]: 2025-10-02 12:04:51.873 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406691.8732903, 89fb72b4-0730-4453-8995-330bfa32454d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:04:51 np0005465988 nova_compute[236126]: 2025-10-02 12:04:51.874 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] VM Resumed (Lifecycle Event)
Oct  2 08:04:51 np0005465988 nova_compute[236126]: 2025-10-02 12:04:51.876 2 DEBUG nova.compute.manager [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:04:51 np0005465988 nova_compute[236126]: 2025-10-02 12:04:51.877 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:04:51 np0005465988 nova_compute[236126]: 2025-10-02 12:04:51.880 2 INFO nova.virt.libvirt.driver [-] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Instance spawned successfully.
Oct  2 08:04:51 np0005465988 nova_compute[236126]: 2025-10-02 12:04:51.880 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:04:51 np0005465988 nova_compute[236126]: 2025-10-02 12:04:51.982 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:04:51 np0005465988 nova_compute[236126]: 2025-10-02 12:04:51.985 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.037 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.038 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.038 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.039 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.039 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.039 2 DEBUG nova.virt.libvirt.driver [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:04:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:52.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.084 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.084 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406691.8754537, 89fb72b4-0730-4453-8995-330bfa32454d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.084 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] VM Started (Lifecycle Event)
Oct  2 08:04:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.178 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.181 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.313 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.424 2 INFO nova.compute.manager [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Took 5.02 seconds to spawn the instance on the hypervisor.
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.425 2 DEBUG nova.compute.manager [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:04:52 np0005465988 podman[247888]: 2025-10-02 12:04:52.522155976 +0000 UTC m=+0.058338428 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.585 2 INFO nova.compute.manager [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Took 6.17 seconds to build instance.
Oct  2 08:04:52 np0005465988 nova_compute[236126]: 2025-10-02 12:04:52.707 2 DEBUG oslo_concurrency.lockutils [None req-47b80631-d2ba-4d43-b69c-5bb8fbafede3 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lock "89fb72b4-0730-4453-8995-330bfa32454d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.396s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:04:53 np0005465988 nova_compute[236126]: 2025-10-02 12:04:53.464 2 DEBUG nova.objects.instance [None req-c88b758e-1b1a-44cc-adef-ff7eb423172e 5816edde6cf64ef191e2c3856d279084 90da982012114f5c8ba8b23f4714095b - - default default] Lazy-loading 'pci_devices' on Instance uuid 89fb72b4-0730-4453-8995-330bfa32454d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:04:53 np0005465988 nova_compute[236126]: 2025-10-02 12:04:53.499 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406693.4986691, 89fb72b4-0730-4453-8995-330bfa32454d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:04:53 np0005465988 nova_compute[236126]: 2025-10-02 12:04:53.499 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] VM Paused (Lifecycle Event)
Oct  2 08:04:53 np0005465988 nova_compute[236126]: 2025-10-02 12:04:53.526 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:04:53 np0005465988 nova_compute[236126]: 2025-10-02 12:04:53.530 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:04:53 np0005465988 nova_compute[236126]: 2025-10-02 12:04:53.598 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] During sync_power_state the instance has a pending task (suspending). Skip.
Oct  2 08:04:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:53.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:53 np0005465988 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Oct  2 08:04:53 np0005465988 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000001c.scope: Consumed 2.547s CPU time.
Oct  2 08:04:53 np0005465988 systemd-machined[192594]: Machine qemu-11-instance-0000001c terminated.
Oct  2 08:04:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:54.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:54 np0005465988 nova_compute[236126]: 2025-10-02 12:04:54.059 2 DEBUG nova.compute.manager [None req-c88b758e-1b1a-44cc-adef-ff7eb423172e 5816edde6cf64ef191e2c3856d279084 90da982012114f5c8ba8b23f4714095b - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:04:55 np0005465988 nova_compute[236126]: 2025-10-02 12:04:55.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:04:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:55.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:56.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:04:56 np0005465988 nova_compute[236126]: 2025-10-02 12:04:56.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:04:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:04:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:04:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:57.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:04:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:04:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:04:58.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:04:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:04:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:04:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:04:59.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:00 np0005465988 nova_compute[236126]: 2025-10-02 12:05:00.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:05:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:05:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:00.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:05:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e150 e150: 3 total, 3 up, 3 in
Oct  2 08:05:00 np0005465988 nova_compute[236126]: 2025-10-02 12:05:00.854 2 DEBUG oslo_concurrency.lockutils [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Acquiring lock "89fb72b4-0730-4453-8995-330bfa32454d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:05:00 np0005465988 nova_compute[236126]: 2025-10-02 12:05:00.855 2 DEBUG oslo_concurrency.lockutils [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lock "89fb72b4-0730-4453-8995-330bfa32454d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:05:00 np0005465988 nova_compute[236126]: 2025-10-02 12:05:00.855 2 DEBUG oslo_concurrency.lockutils [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Acquiring lock "89fb72b4-0730-4453-8995-330bfa32454d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:05:00 np0005465988 nova_compute[236126]: 2025-10-02 12:05:00.855 2 DEBUG oslo_concurrency.lockutils [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lock "89fb72b4-0730-4453-8995-330bfa32454d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:05:00 np0005465988 nova_compute[236126]: 2025-10-02 12:05:00.855 2 DEBUG oslo_concurrency.lockutils [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lock "89fb72b4-0730-4453-8995-330bfa32454d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:05:00 np0005465988 nova_compute[236126]: 2025-10-02 12:05:00.856 2 INFO nova.compute.manager [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Terminating instance
Oct  2 08:05:00 np0005465988 nova_compute[236126]: 2025-10-02 12:05:00.857 2 DEBUG oslo_concurrency.lockutils [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Acquiring lock "refresh_cache-89fb72b4-0730-4453-8995-330bfa32454d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:05:00 np0005465988 nova_compute[236126]: 2025-10-02 12:05:00.858 2 DEBUG oslo_concurrency.lockutils [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Acquired lock "refresh_cache-89fb72b4-0730-4453-8995-330bfa32454d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:05:00 np0005465988 nova_compute[236126]: 2025-10-02 12:05:00.858 2 DEBUG nova.network.neutron [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:05:01 np0005465988 nova_compute[236126]: 2025-10-02 12:05:01.452 2 DEBUG nova.network.neutron [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:05:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:01.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:01 np0005465988 nova_compute[236126]: 2025-10-02 12:05:01.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:05:01 np0005465988 nova_compute[236126]: 2025-10-02 12:05:01.820 2 DEBUG nova.network.neutron [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:05:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e151 e151: 3 total, 3 up, 3 in
Oct  2 08:05:01 np0005465988 nova_compute[236126]: 2025-10-02 12:05:01.897 2 DEBUG oslo_concurrency.lockutils [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Releasing lock "refresh_cache-89fb72b4-0730-4453-8995-330bfa32454d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:05:01 np0005465988 nova_compute[236126]: 2025-10-02 12:05:01.897 2 DEBUG nova.compute.manager [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:05:01 np0005465988 nova_compute[236126]: 2025-10-02 12:05:01.906 2 INFO nova.virt.libvirt.driver [-] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Instance destroyed successfully.
Oct  2 08:05:01 np0005465988 nova_compute[236126]: 2025-10-02 12:05:01.906 2 DEBUG nova.objects.instance [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lazy-loading 'resources' on Instance uuid 89fb72b4-0730-4453-8995-330bfa32454d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:05:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:02.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:05:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e152 e152: 3 total, 3 up, 3 in
Oct  2 08:05:03 np0005465988 nova_compute[236126]: 2025-10-02 12:05:03.163 2 INFO nova.virt.libvirt.driver [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Deleting instance files /var/lib/nova/instances/89fb72b4-0730-4453-8995-330bfa32454d_del
Oct  2 08:05:03 np0005465988 nova_compute[236126]: 2025-10-02 12:05:03.164 2 INFO nova.virt.libvirt.driver [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Deletion of /var/lib/nova/instances/89fb72b4-0730-4453-8995-330bfa32454d_del complete
Oct  2 08:05:03 np0005465988 nova_compute[236126]: 2025-10-02 12:05:03.369 2 INFO nova.compute.manager [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Took 1.47 seconds to destroy the instance on the hypervisor.
Oct  2 08:05:03 np0005465988 nova_compute[236126]: 2025-10-02 12:05:03.370 2 DEBUG oslo.service.loopingcall [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct  2 08:05:03 np0005465988 nova_compute[236126]: 2025-10-02 12:05:03.371 2 DEBUG nova.compute.manager [-] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct  2 08:05:03 np0005465988 nova_compute[236126]: 2025-10-02 12:05:03.371 2 DEBUG nova.network.neutron [-] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct  2 08:05:03 np0005465988 nova_compute[236126]: 2025-10-02 12:05:03.616 2 DEBUG nova.network.neutron [-] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:05:03 np0005465988 nova_compute[236126]: 2025-10-02 12:05:03.657 2 DEBUG nova.network.neutron [-] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:05:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:03.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:03 np0005465988 nova_compute[236126]: 2025-10-02 12:05:03.778 2 INFO nova.compute.manager [-] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Took 0.41 seconds to deallocate network for instance.
Oct  2 08:05:03 np0005465988 nova_compute[236126]: 2025-10-02 12:05:03.934 2 DEBUG oslo_concurrency.lockutils [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:05:03 np0005465988 nova_compute[236126]: 2025-10-02 12:05:03.934 2 DEBUG oslo_concurrency.lockutils [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:05:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:05:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:04.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:05:04 np0005465988 nova_compute[236126]: 2025-10-02 12:05:04.094 2 DEBUG oslo_concurrency.processutils [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:05:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:05:04 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/307807258' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:05:04 np0005465988 nova_compute[236126]: 2025-10-02 12:05:04.555 2 DEBUG oslo_concurrency.processutils [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:05:04 np0005465988 nova_compute[236126]: 2025-10-02 12:05:04.563 2 DEBUG nova.compute.provider_tree [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:05:04 np0005465988 nova_compute[236126]: 2025-10-02 12:05:04.695 2 DEBUG nova.scheduler.client.report [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:05:05 np0005465988 nova_compute[236126]: 2025-10-02 12:05:05.007 2 DEBUG oslo_concurrency.lockutils [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:05:05 np0005465988 nova_compute[236126]: 2025-10-02 12:05:05.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:05:05 np0005465988 nova_compute[236126]: 2025-10-02 12:05:05.188 2 INFO nova.scheduler.client.report [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Deleted allocations for instance 89fb72b4-0730-4453-8995-330bfa32454d
Oct  2 08:05:05 np0005465988 nova_compute[236126]: 2025-10-02 12:05:05.612 2 DEBUG oslo_concurrency.lockutils [None req-c03f48ad-fad5-4acf-b94d-15137c1c4005 c5ab011ce9f04adbb19dab5fa5ed1714 6b8ddbfa33c348beb1c883371b5c6909 - - default default] Lock "89fb72b4-0730-4453-8995-330bfa32454d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.757s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:05:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:05.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:06.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:06 np0005465988 nova_compute[236126]: 2025-10-02 12:05:06.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:05:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:05:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e153 e153: 3 total, 3 up, 3 in
Oct  2 08:05:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:07.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:08.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:08 np0005465988 nova_compute[236126]: 2025-10-02 12:05:08.360 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:05:08 np0005465988 nova_compute[236126]: 2025-10-02 12:05:08.360 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:05:08 np0005465988 nova_compute[236126]: 2025-10-02 12:05:08.410 2 DEBUG nova.compute.manager [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:05:08 np0005465988 podman[247960]: 2025-10-02 12:05:08.556217251 +0000 UTC m=+0.071918181 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:05:08 np0005465988 nova_compute[236126]: 2025-10-02 12:05:08.561 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:08 np0005465988 nova_compute[236126]: 2025-10-02 12:05:08.562 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:08 np0005465988 nova_compute[236126]: 2025-10-02 12:05:08.572 2 DEBUG nova.virt.hardware [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:05:08 np0005465988 nova_compute[236126]: 2025-10-02 12:05:08.572 2 INFO nova.compute.claims [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:05:08 np0005465988 podman[247959]: 2025-10-02 12:05:08.585635442 +0000 UTC m=+0.103979058 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:05:08 np0005465988 podman[247958]: 2025-10-02 12:05:08.611057937 +0000 UTC m=+0.137179348 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Oct  2 08:05:08 np0005465988 nova_compute[236126]: 2025-10-02 12:05:08.752 2 DEBUG oslo_concurrency.processutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.060 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406694.0592678, 89fb72b4-0730-4453-8995-330bfa32454d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.061 2 INFO nova.compute.manager [-] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.090 2 DEBUG nova.compute.manager [None req-e5bba833-2289-4e6d-aca5-850116b376ab - - - - - -] [instance: 89fb72b4-0730-4453-8995-330bfa32454d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:05:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:05:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1575453669' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.239 2 DEBUG oslo_concurrency.processutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.245 2 DEBUG nova.compute.provider_tree [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.350 2 DEBUG nova.scheduler.client.report [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.449 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.887s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.450 2 DEBUG nova.compute.manager [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.541 2 DEBUG nova.compute.manager [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.542 2 DEBUG nova.network.neutron [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.593 2 INFO nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.631 2 DEBUG nova.compute.manager [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:05:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:09.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.795 2 DEBUG nova.compute.manager [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.797 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.797 2 INFO nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Creating image(s)#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.827 2 DEBUG nova.storage.rbd_utils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] rbd image e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.859 2 DEBUG nova.storage.rbd_utils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] rbd image e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.889 2 DEBUG nova.storage.rbd_utils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] rbd image e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.894 2 DEBUG oslo_concurrency.processutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.922 2 DEBUG nova.policy [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8850add40b254d198f270d9e64c777d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9afa78cc4dec419babdf61fd31f46e28', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.964 2 DEBUG oslo_concurrency.processutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.965 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.965 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.966 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.991 2 DEBUG nova.storage.rbd_utils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] rbd image e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:05:09 np0005465988 nova_compute[236126]: 2025-10-02 12:05:09.994 2 DEBUG oslo_concurrency.processutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:10 np0005465988 nova_compute[236126]: 2025-10-02 12:05:10.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:05:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:10.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:05:10 np0005465988 nova_compute[236126]: 2025-10-02 12:05:10.626 2 DEBUG oslo_concurrency.processutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.631s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:10 np0005465988 nova_compute[236126]: 2025-10-02 12:05:10.711 2 DEBUG nova.storage.rbd_utils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] resizing rbd image e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:05:11 np0005465988 nova_compute[236126]: 2025-10-02 12:05:11.208 2 DEBUG nova.objects.instance [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lazy-loading 'migration_context' on Instance uuid e25deec6-82e8-43a2-b508-c3b3fa2d4f4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:05:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:05:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:11.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:05:11 np0005465988 nova_compute[236126]: 2025-10-02 12:05:11.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:12.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:05:13 np0005465988 nova_compute[236126]: 2025-10-02 12:05:13.207 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:05:13 np0005465988 nova_compute[236126]: 2025-10-02 12:05:13.208 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Ensure instance console log exists: /var/lib/nova/instances/e25deec6-82e8-43a2-b508-c3b3fa2d4f4e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:05:13 np0005465988 nova_compute[236126]: 2025-10-02 12:05:13.209 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:13 np0005465988 nova_compute[236126]: 2025-10-02 12:05:13.209 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:13 np0005465988 nova_compute[236126]: 2025-10-02 12:05:13.209 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:05:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:13.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:05:13 np0005465988 nova_compute[236126]: 2025-10-02 12:05:13.857 2 DEBUG nova.network.neutron [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Successfully created port: 669a4eb1-5ddd-4f87-9d86-105e09015429 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:05:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:14.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:15 np0005465988 nova_compute[236126]: 2025-10-02 12:05:15.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:15 np0005465988 nova_compute[236126]: 2025-10-02 12:05:15.145 2 DEBUG nova.network.neutron [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Successfully updated port: 669a4eb1-5ddd-4f87-9d86-105e09015429 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:05:15 np0005465988 nova_compute[236126]: 2025-10-02 12:05:15.167 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "refresh_cache-e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:05:15 np0005465988 nova_compute[236126]: 2025-10-02 12:05:15.168 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquired lock "refresh_cache-e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:05:15 np0005465988 nova_compute[236126]: 2025-10-02 12:05:15.168 2 DEBUG nova.network.neutron [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:05:15 np0005465988 nova_compute[236126]: 2025-10-02 12:05:15.444 2 DEBUG nova.compute.manager [req-6cd36866-f452-41cc-ae9f-027a865a8750 req-591acecb-5a5e-4aa7-b157-54d3da523155 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Received event network-changed-669a4eb1-5ddd-4f87-9d86-105e09015429 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:15 np0005465988 nova_compute[236126]: 2025-10-02 12:05:15.444 2 DEBUG nova.compute.manager [req-6cd36866-f452-41cc-ae9f-027a865a8750 req-591acecb-5a5e-4aa7-b157-54d3da523155 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Refreshing instance network info cache due to event network-changed-669a4eb1-5ddd-4f87-9d86-105e09015429. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:05:15 np0005465988 nova_compute[236126]: 2025-10-02 12:05:15.444 2 DEBUG oslo_concurrency.lockutils [req-6cd36866-f452-41cc-ae9f-027a865a8750 req-591acecb-5a5e-4aa7-b157-54d3da523155 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:05:15 np0005465988 nova_compute[236126]: 2025-10-02 12:05:15.611 2 DEBUG nova.network.neutron [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:05:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:15.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:16.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:16 np0005465988 nova_compute[236126]: 2025-10-02 12:05:16.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:05:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:17.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:18.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:05:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:05:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct  2 08:05:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct  2 08:05:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:05:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:19.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:05:19 np0005465988 nova_compute[236126]: 2025-10-02 12:05:19.974 2 DEBUG nova.network.neutron [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Updating instance_info_cache with network_info: [{"id": "669a4eb1-5ddd-4f87-9d86-105e09015429", "address": "fa:16:3e:bc:fb:8e", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669a4eb1-5d", "ovs_interfaceid": "669a4eb1-5ddd-4f87-9d86-105e09015429", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.020 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Releasing lock "refresh_cache-e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.021 2 DEBUG nova.compute.manager [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Instance network_info: |[{"id": "669a4eb1-5ddd-4f87-9d86-105e09015429", "address": "fa:16:3e:bc:fb:8e", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669a4eb1-5d", "ovs_interfaceid": "669a4eb1-5ddd-4f87-9d86-105e09015429", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.021 2 DEBUG oslo_concurrency.lockutils [req-6cd36866-f452-41cc-ae9f-027a865a8750 req-591acecb-5a5e-4aa7-b157-54d3da523155 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.021 2 DEBUG nova.network.neutron [req-6cd36866-f452-41cc-ae9f-027a865a8750 req-591acecb-5a5e-4aa7-b157-54d3da523155 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Refreshing network info cache for port 669a4eb1-5ddd-4f87-9d86-105e09015429 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.023 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Start _get_guest_xml network_info=[{"id": "669a4eb1-5ddd-4f87-9d86-105e09015429", "address": "fa:16:3e:bc:fb:8e", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669a4eb1-5d", "ovs_interfaceid": "669a4eb1-5ddd-4f87-9d86-105e09015429", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.028 2 WARNING nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.032 2 DEBUG nova.virt.libvirt.host [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.033 2 DEBUG nova.virt.libvirt.host [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.035 2 DEBUG nova.virt.libvirt.host [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.036 2 DEBUG nova.virt.libvirt.host [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.037 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.037 2 DEBUG nova.virt.hardware [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.038 2 DEBUG nova.virt.hardware [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.038 2 DEBUG nova.virt.hardware [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.038 2 DEBUG nova.virt.hardware [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.038 2 DEBUG nova.virt.hardware [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.038 2 DEBUG nova.virt.hardware [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.039 2 DEBUG nova.virt.hardware [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.039 2 DEBUG nova.virt.hardware [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.039 2 DEBUG nova.virt.hardware [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.039 2 DEBUG nova.virt.hardware [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.040 2 DEBUG nova.virt.hardware [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.042 2 DEBUG oslo_concurrency.processutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:05:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:20.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:05:20 np0005465988 podman[248666]: 2025-10-02 12:05:20.123576076 +0000 UTC m=+0.040124972 container create edb66682d096a8dc782cc08e21900833747131051cd5dd3fb8ab1f452f9d8f9b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_cannon, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:20 np0005465988 systemd[1]: Started libpod-conmon-edb66682d096a8dc782cc08e21900833747131051cd5dd3fb8ab1f452f9d8f9b.scope.
Oct  2 08:05:20 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:05:20 np0005465988 podman[248666]: 2025-10-02 12:05:20.200119559 +0000 UTC m=+0.116668465 container init edb66682d096a8dc782cc08e21900833747131051cd5dd3fb8ab1f452f9d8f9b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_cannon, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct  2 08:05:20 np0005465988 podman[248666]: 2025-10-02 12:05:20.105304447 +0000 UTC m=+0.021853353 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 08:05:20 np0005465988 podman[248666]: 2025-10-02 12:05:20.209295115 +0000 UTC m=+0.125844031 container start edb66682d096a8dc782cc08e21900833747131051cd5dd3fb8ab1f452f9d8f9b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_cannon, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Oct  2 08:05:20 np0005465988 podman[248666]: 2025-10-02 12:05:20.214869526 +0000 UTC m=+0.131418462 container attach edb66682d096a8dc782cc08e21900833747131051cd5dd3fb8ab1f452f9d8f9b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_cannon, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Oct  2 08:05:20 np0005465988 loving_cannon[248683]: 167 167
Oct  2 08:05:20 np0005465988 systemd[1]: libpod-edb66682d096a8dc782cc08e21900833747131051cd5dd3fb8ab1f452f9d8f9b.scope: Deactivated successfully.
Oct  2 08:05:20 np0005465988 conmon[248683]: conmon edb66682d096a8dc782c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-edb66682d096a8dc782cc08e21900833747131051cd5dd3fb8ab1f452f9d8f9b.scope/container/memory.events
Oct  2 08:05:20 np0005465988 podman[248666]: 2025-10-02 12:05:20.218093579 +0000 UTC m=+0.134642485 container died edb66682d096a8dc782cc08e21900833747131051cd5dd3fb8ab1f452f9d8f9b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_cannon, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Oct  2 08:05:20 np0005465988 systemd[1]: var-lib-containers-storage-overlay-087e16d9ba93eaf9954e1273357c2aae113756c4a1ec94d7664bb7d97e3313e9-merged.mount: Deactivated successfully.
Oct  2 08:05:20 np0005465988 podman[248666]: 2025-10-02 12:05:20.278170966 +0000 UTC m=+0.194719852 container remove edb66682d096a8dc782cc08e21900833747131051cd5dd3fb8ab1f452f9d8f9b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_cannon, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Oct  2 08:05:20 np0005465988 systemd[1]: libpod-conmon-edb66682d096a8dc782cc08e21900833747131051cd5dd3fb8ab1f452f9d8f9b.scope: Deactivated successfully.
Oct  2 08:05:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:05:20 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1224773953' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.493 2 DEBUG oslo_concurrency.processutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.525 2 DEBUG nova.storage.rbd_utils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] rbd image e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.529 2 DEBUG oslo_concurrency.processutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:20 np0005465988 podman[248724]: 2025-10-02 12:05:20.531089091 +0000 UTC m=+0.082132266 container create 415c3400f03439157da4804c7f67afc9cac1ef42da2512a4b98661e6a1bcf448 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_vaughan, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Oct  2 08:05:20 np0005465988 systemd[1]: Started libpod-conmon-415c3400f03439157da4804c7f67afc9cac1ef42da2512a4b98661e6a1bcf448.scope.
Oct  2 08:05:20 np0005465988 podman[248724]: 2025-10-02 12:05:20.48575016 +0000 UTC m=+0.036793395 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 08:05:20 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:05:20 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d979b999f9c8f041ab6c22158ff5edd85155e9725c4e921f68746ce010e2566/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct  2 08:05:20 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d979b999f9c8f041ab6c22158ff5edd85155e9725c4e921f68746ce010e2566/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 08:05:20 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d979b999f9c8f041ab6c22158ff5edd85155e9725c4e921f68746ce010e2566/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 08:05:20 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d979b999f9c8f041ab6c22158ff5edd85155e9725c4e921f68746ce010e2566/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct  2 08:05:20 np0005465988 podman[248724]: 2025-10-02 12:05:20.627841069 +0000 UTC m=+0.178884264 container init 415c3400f03439157da4804c7f67afc9cac1ef42da2512a4b98661e6a1bcf448 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_vaughan, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef)
Oct  2 08:05:20 np0005465988 podman[248724]: 2025-10-02 12:05:20.634000757 +0000 UTC m=+0.185043932 container start 415c3400f03439157da4804c7f67afc9cac1ef42da2512a4b98661e6a1bcf448 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_vaughan, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct  2 08:05:20 np0005465988 podman[248724]: 2025-10-02 12:05:20.651833053 +0000 UTC m=+0.202876248 container attach 415c3400f03439157da4804c7f67afc9cac1ef42da2512a4b98661e6a1bcf448 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_vaughan, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct  2 08:05:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:05:20 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3699658884' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.949 2 DEBUG oslo_concurrency.processutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.952 2 DEBUG nova.virt.libvirt.vif [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:05:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-237972914',display_name='tempest-ServersAdminTestJSON-server-237972914',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-237972914',id=29,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9afa78cc4dec419babdf61fd31f46e28',ramdisk_id='',reservation_id='r-0l53rwnu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-518249049',owner_user_name='tempest-ServersAdminTestJSON-518249049-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:05:09Z,user_data=None,user_id='8850add40b254d198f270d9e64c777d5',uuid=e25deec6-82e8-43a2-b508-c3b3fa2d4f4e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "669a4eb1-5ddd-4f87-9d86-105e09015429", "address": "fa:16:3e:bc:fb:8e", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669a4eb1-5d", "ovs_interfaceid": "669a4eb1-5ddd-4f87-9d86-105e09015429", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.952 2 DEBUG nova.network.os_vif_util [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Converting VIF {"id": "669a4eb1-5ddd-4f87-9d86-105e09015429", "address": "fa:16:3e:bc:fb:8e", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669a4eb1-5d", "ovs_interfaceid": "669a4eb1-5ddd-4f87-9d86-105e09015429", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.954 2 DEBUG nova.network.os_vif_util [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:fb:8e,bridge_name='br-int',has_traffic_filtering=True,id=669a4eb1-5ddd-4f87-9d86-105e09015429,network=Network(80106802-d877-42c6-b2a9-50b050f6b08f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap669a4eb1-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.955 2 DEBUG nova.objects.instance [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lazy-loading 'pci_devices' on Instance uuid e25deec6-82e8-43a2-b508-c3b3fa2d4f4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.982 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  <uuid>e25deec6-82e8-43a2-b508-c3b3fa2d4f4e</uuid>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  <name>instance-0000001d</name>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServersAdminTestJSON-server-237972914</nova:name>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:05:20</nova:creationTime>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <nova:user uuid="8850add40b254d198f270d9e64c777d5">tempest-ServersAdminTestJSON-518249049-project-member</nova:user>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <nova:project uuid="9afa78cc4dec419babdf61fd31f46e28">tempest-ServersAdminTestJSON-518249049</nova:project>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <nova:port uuid="669a4eb1-5ddd-4f87-9d86-105e09015429">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <entry name="serial">e25deec6-82e8-43a2-b508-c3b3fa2d4f4e</entry>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <entry name="uuid">e25deec6-82e8-43a2-b508-c3b3fa2d4f4e</entry>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_disk">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_disk.config">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:bc:fb:8e"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <target dev="tap669a4eb1-5d"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/e25deec6-82e8-43a2-b508-c3b3fa2d4f4e/console.log" append="off"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:05:20 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:05:20 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:05:20 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:05:20 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.982 2 DEBUG nova.compute.manager [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Preparing to wait for external event network-vif-plugged-669a4eb1-5ddd-4f87-9d86-105e09015429 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.983 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.983 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.984 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.985 2 DEBUG nova.virt.libvirt.vif [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:05:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-237972914',display_name='tempest-ServersAdminTestJSON-server-237972914',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-237972914',id=29,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9afa78cc4dec419babdf61fd31f46e28',ramdisk_id='',reservation_id='r-0l53rwnu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-518249049',owner_user_name='tempest-ServersAdminTestJSON-5
18249049-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:05:09Z,user_data=None,user_id='8850add40b254d198f270d9e64c777d5',uuid=e25deec6-82e8-43a2-b508-c3b3fa2d4f4e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "669a4eb1-5ddd-4f87-9d86-105e09015429", "address": "fa:16:3e:bc:fb:8e", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669a4eb1-5d", "ovs_interfaceid": "669a4eb1-5ddd-4f87-9d86-105e09015429", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.985 2 DEBUG nova.network.os_vif_util [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Converting VIF {"id": "669a4eb1-5ddd-4f87-9d86-105e09015429", "address": "fa:16:3e:bc:fb:8e", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669a4eb1-5d", "ovs_interfaceid": "669a4eb1-5ddd-4f87-9d86-105e09015429", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.985 2 DEBUG nova.network.os_vif_util [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:fb:8e,bridge_name='br-int',has_traffic_filtering=True,id=669a4eb1-5ddd-4f87-9d86-105e09015429,network=Network(80106802-d877-42c6-b2a9-50b050f6b08f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap669a4eb1-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.986 2 DEBUG os_vif [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:fb:8e,bridge_name='br-int',has_traffic_filtering=True,id=669a4eb1-5ddd-4f87-9d86-105e09015429,network=Network(80106802-d877-42c6-b2a9-50b050f6b08f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap669a4eb1-5d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.987 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.987 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.990 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap669a4eb1-5d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.991 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap669a4eb1-5d, col_values=(('external_ids', {'iface-id': '669a4eb1-5ddd-4f87-9d86-105e09015429', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bc:fb:8e', 'vm-uuid': 'e25deec6-82e8-43a2-b508-c3b3fa2d4f4e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:20 np0005465988 NetworkManager[45041]: <info>  [1759406720.9935] manager: (tap669a4eb1-5d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:20 np0005465988 nova_compute[236126]: 2025-10-02 12:05:20.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:05:21 np0005465988 nova_compute[236126]: 2025-10-02 12:05:21.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:21 np0005465988 nova_compute[236126]: 2025-10-02 12:05:21.003 2 INFO os_vif [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:fb:8e,bridge_name='br-int',has_traffic_filtering=True,id=669a4eb1-5ddd-4f87-9d86-105e09015429,network=Network(80106802-d877-42c6-b2a9-50b050f6b08f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap669a4eb1-5d')#033[00m
Oct  2 08:05:21 np0005465988 nova_compute[236126]: 2025-10-02 12:05:21.136 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:05:21 np0005465988 nova_compute[236126]: 2025-10-02 12:05:21.137 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:05:21 np0005465988 nova_compute[236126]: 2025-10-02 12:05:21.137 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] No VIF found with MAC fa:16:3e:bc:fb:8e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:05:21 np0005465988 nova_compute[236126]: 2025-10-02 12:05:21.137 2 INFO nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Using config drive#033[00m
Oct  2 08:05:21 np0005465988 nova_compute[236126]: 2025-10-02 12:05:21.167 2 DEBUG nova.storage.rbd_utils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] rbd image e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:05:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:05:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:21.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:05:21 np0005465988 nova_compute[236126]: 2025-10-02 12:05:21.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]: [
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:    {
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:        "available": false,
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:        "ceph_device": false,
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:        "device_id": "QEMU_DVD-ROM_QM00001",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:        "lsm_data": {},
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:        "lvs": [],
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:        "path": "/dev/sr0",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:        "rejected_reasons": [
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "Insufficient space (<5GB)",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "Has a FileSystem"
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:        ],
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:        "sys_api": {
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "actuators": null,
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "device_nodes": "sr0",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "devname": "sr0",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "human_readable_size": "482.00 KB",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "id_bus": "ata",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "model": "QEMU DVD-ROM",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "nr_requests": "2",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "parent": "/dev/sr0",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "partitions": {},
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "path": "/dev/sr0",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "removable": "1",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "rev": "2.5+",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "ro": "0",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "rotational": "0",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "sas_address": "",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "sas_device_handle": "",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "scheduler_mode": "mq-deadline",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "sectors": 0,
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "sectorsize": "2048",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "size": 493568.0,
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "support_discard": "2048",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "type": "disk",
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:            "vendor": "QEMU"
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:        }
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]:    }
Oct  2 08:05:22 np0005465988 objective_vaughan[248761]: ]
Oct  2 08:05:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:05:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:22.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:05:22 np0005465988 systemd[1]: libpod-415c3400f03439157da4804c7f67afc9cac1ef42da2512a4b98661e6a1bcf448.scope: Deactivated successfully.
Oct  2 08:05:22 np0005465988 podman[248724]: 2025-10-02 12:05:22.109891312 +0000 UTC m=+1.660934487 container died 415c3400f03439157da4804c7f67afc9cac1ef42da2512a4b98661e6a1bcf448 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_vaughan, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:05:22 np0005465988 systemd[1]: libpod-415c3400f03439157da4804c7f67afc9cac1ef42da2512a4b98661e6a1bcf448.scope: Consumed 1.365s CPU time.
Oct  2 08:05:22 np0005465988 systemd[1]: var-lib-containers-storage-overlay-4d979b999f9c8f041ab6c22158ff5edd85155e9725c4e921f68746ce010e2566-merged.mount: Deactivated successfully.
Oct  2 08:05:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:05:22 np0005465988 podman[248724]: 2025-10-02 12:05:22.207782863 +0000 UTC m=+1.758826048 container remove 415c3400f03439157da4804c7f67afc9cac1ef42da2512a4b98661e6a1bcf448 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_vaughan, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Oct  2 08:05:22 np0005465988 systemd[1]: libpod-conmon-415c3400f03439157da4804c7f67afc9cac1ef42da2512a4b98661e6a1bcf448.scope: Deactivated successfully.
Oct  2 08:05:22 np0005465988 nova_compute[236126]: 2025-10-02 12:05:22.479 2 INFO nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Creating config drive at /var/lib/nova/instances/e25deec6-82e8-43a2-b508-c3b3fa2d4f4e/disk.config#033[00m
Oct  2 08:05:22 np0005465988 nova_compute[236126]: 2025-10-02 12:05:22.489 2 DEBUG oslo_concurrency.processutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e25deec6-82e8-43a2-b508-c3b3fa2d4f4e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_cgllc3_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:22 np0005465988 nova_compute[236126]: 2025-10-02 12:05:22.630 2 DEBUG oslo_concurrency.processutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e25deec6-82e8-43a2-b508-c3b3fa2d4f4e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_cgllc3_" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:22 np0005465988 nova_compute[236126]: 2025-10-02 12:05:22.671 2 DEBUG nova.storage.rbd_utils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] rbd image e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:05:22 np0005465988 nova_compute[236126]: 2025-10-02 12:05:22.675 2 DEBUG oslo_concurrency.processutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e25deec6-82e8-43a2-b508-c3b3fa2d4f4e/disk.config e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:22.980 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:05:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:22.981 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:05:22 np0005465988 nova_compute[236126]: 2025-10-02 12:05:22.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:05:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:05:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct  2 08:05:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:05:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:05:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:05:23 np0005465988 nova_compute[236126]: 2025-10-02 12:05:23.262 2 DEBUG oslo_concurrency.processutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e25deec6-82e8-43a2-b508-c3b3fa2d4f4e/disk.config e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.588s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:23 np0005465988 nova_compute[236126]: 2025-10-02 12:05:23.263 2 INFO nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Deleting local config drive /var/lib/nova/instances/e25deec6-82e8-43a2-b508-c3b3fa2d4f4e/disk.config because it was imported into RBD.#033[00m
Oct  2 08:05:23 np0005465988 kernel: tap669a4eb1-5d: entered promiscuous mode
Oct  2 08:05:23 np0005465988 NetworkManager[45041]: <info>  [1759406723.3212] manager: (tap669a4eb1-5d): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Oct  2 08:05:23 np0005465988 ovn_controller[132601]: 2025-10-02T12:05:23Z|00060|binding|INFO|Claiming lport 669a4eb1-5ddd-4f87-9d86-105e09015429 for this chassis.
Oct  2 08:05:23 np0005465988 ovn_controller[132601]: 2025-10-02T12:05:23Z|00061|binding|INFO|669a4eb1-5ddd-4f87-9d86-105e09015429: Claiming fa:16:3e:bc:fb:8e 10.100.0.14
Oct  2 08:05:23 np0005465988 nova_compute[236126]: 2025-10-02 12:05:23.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.350 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:fb:8e 10.100.0.14'], port_security=['fa:16:3e:bc:fb:8e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e25deec6-82e8-43a2-b508-c3b3fa2d4f4e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80106802-d877-42c6-b2a9-50b050f6b08f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9afa78cc4dec419babdf61fd31f46e28', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8fbb5420-10f4-405b-bd01-713020f7e518', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03aa6f10-2374-4fa3-bc90-1fcb8815afb8, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=669a4eb1-5ddd-4f87-9d86-105e09015429) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.353 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 669a4eb1-5ddd-4f87-9d86-105e09015429 in datapath 80106802-d877-42c6-b2a9-50b050f6b08f bound to our chassis#033[00m
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.356 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 80106802-d877-42c6-b2a9-50b050f6b08f#033[00m
Oct  2 08:05:23 np0005465988 ovn_controller[132601]: 2025-10-02T12:05:23Z|00062|binding|INFO|Setting lport 669a4eb1-5ddd-4f87-9d86-105e09015429 ovn-installed in OVS
Oct  2 08:05:23 np0005465988 ovn_controller[132601]: 2025-10-02T12:05:23Z|00063|binding|INFO|Setting lport 669a4eb1-5ddd-4f87-9d86-105e09015429 up in Southbound
Oct  2 08:05:23 np0005465988 systemd-machined[192594]: New machine qemu-12-instance-0000001d.
Oct  2 08:05:23 np0005465988 nova_compute[236126]: 2025-10-02 12:05:23.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:23 np0005465988 systemd[1]: Started Virtual Machine qemu-12-instance-0000001d.
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.375 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[400ccda2-46de-498a-9504-979300b513ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:23 np0005465988 systemd-udevd[250143]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.404 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[49cb2fd0-62f1-4674-886b-284ec4b496be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:23 np0005465988 NetworkManager[45041]: <info>  [1759406723.4078] device (tap669a4eb1-5d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:05:23 np0005465988 NetworkManager[45041]: <info>  [1759406723.4091] device (tap669a4eb1-5d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.408 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[b6c21d96-df90-47b1-a27e-a94bb6baa1ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:23 np0005465988 podman[250128]: 2025-10-02 12:05:23.436696432 +0000 UTC m=+0.083714582 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.449 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[525f6520-8a90-42c1-a14e-2b6a7c7c3646]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.470 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7e42b6e2-b26e-4dd3-a7ea-b26f8ec946e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap80106802-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:27:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473838, 'reachable_time': 20756, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250164, 'error': None, 'target': 'ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.487 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[be5b20b2-86e5-46bf-a5c0-f045ebf00ca1]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap80106802-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473850, 'tstamp': 473850}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250165, 'error': None, 'target': 'ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap80106802-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473853, 'tstamp': 473853}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250165, 'error': None, 'target': 'ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.490 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80106802-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:23 np0005465988 nova_compute[236126]: 2025-10-02 12:05:23.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:23 np0005465988 nova_compute[236126]: 2025-10-02 12:05:23.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.494 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap80106802-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.495 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.495 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap80106802-d0, col_values=(('external_ids', {'iface-id': '3e3f512e-f85f-4c9c-b91d-072c570470c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.496 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:05:23 np0005465988 nova_compute[236126]: 2025-10-02 12:05:23.543 2 DEBUG nova.network.neutron [req-6cd36866-f452-41cc-ae9f-027a865a8750 req-591acecb-5a5e-4aa7-b157-54d3da523155 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Updated VIF entry in instance network info cache for port 669a4eb1-5ddd-4f87-9d86-105e09015429. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:05:23 np0005465988 nova_compute[236126]: 2025-10-02 12:05:23.544 2 DEBUG nova.network.neutron [req-6cd36866-f452-41cc-ae9f-027a865a8750 req-591acecb-5a5e-4aa7-b157-54d3da523155 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Updating instance_info_cache with network_info: [{"id": "669a4eb1-5ddd-4f87-9d86-105e09015429", "address": "fa:16:3e:bc:fb:8e", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669a4eb1-5d", "ovs_interfaceid": "669a4eb1-5ddd-4f87-9d86-105e09015429", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:05:23 np0005465988 nova_compute[236126]: 2025-10-02 12:05:23.603 2 DEBUG oslo_concurrency.lockutils [req-6cd36866-f452-41cc-ae9f-027a865a8750 req-591acecb-5a5e-4aa7-b157-54d3da523155 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:05:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:23.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:23.984 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:24 np0005465988 nova_compute[236126]: 2025-10-02 12:05:24.070 2 DEBUG nova.compute.manager [req-1e04db1e-582d-40fa-9f2a-ebbf94f32cd1 req-8c451923-2a0f-4423-8863-dea420843aa2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Received event network-vif-plugged-669a4eb1-5ddd-4f87-9d86-105e09015429 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:24 np0005465988 nova_compute[236126]: 2025-10-02 12:05:24.071 2 DEBUG oslo_concurrency.lockutils [req-1e04db1e-582d-40fa-9f2a-ebbf94f32cd1 req-8c451923-2a0f-4423-8863-dea420843aa2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:24 np0005465988 nova_compute[236126]: 2025-10-02 12:05:24.071 2 DEBUG oslo_concurrency.lockutils [req-1e04db1e-582d-40fa-9f2a-ebbf94f32cd1 req-8c451923-2a0f-4423-8863-dea420843aa2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:24 np0005465988 nova_compute[236126]: 2025-10-02 12:05:24.072 2 DEBUG oslo_concurrency.lockutils [req-1e04db1e-582d-40fa-9f2a-ebbf94f32cd1 req-8c451923-2a0f-4423-8863-dea420843aa2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:24 np0005465988 nova_compute[236126]: 2025-10-02 12:05:24.073 2 DEBUG nova.compute.manager [req-1e04db1e-582d-40fa-9f2a-ebbf94f32cd1 req-8c451923-2a0f-4423-8863-dea420843aa2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Processing event network-vif-plugged-669a4eb1-5ddd-4f87-9d86-105e09015429 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:05:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:24.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.148 2 DEBUG nova.compute.manager [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.149 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406725.14739, e25deec6-82e8-43a2-b508-c3b3fa2d4f4e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.150 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] VM Started (Lifecycle Event)
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.153 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.156 2 INFO nova.virt.libvirt.driver [-] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Instance spawned successfully.
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.157 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.181 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.189 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.195 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.195 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.196 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.197 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.198 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.199 2 DEBUG nova.virt.libvirt.driver [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.236 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.237 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406725.1485512, e25deec6-82e8-43a2-b508-c3b3fa2d4f4e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.238 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] VM Paused (Lifecycle Event)
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.264 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.268 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406725.1522582, e25deec6-82e8-43a2-b508-c3b3fa2d4f4e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.268 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] VM Resumed (Lifecycle Event)
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.308 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.311 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.314 2 INFO nova.compute.manager [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Took 15.52 seconds to spawn the instance on the hypervisor.
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.314 2 DEBUG nova.compute.manager [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.383 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.443 2 INFO nova.compute.manager [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Took 16.92 seconds to build instance.
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.473 2 DEBUG oslo_concurrency.lockutils [None req-b087f92b-516a-42ed-b8fc-aeda77733921 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:05:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:25.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:25 np0005465988 nova_compute[236126]: 2025-10-02 12:05:25.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:05:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:26.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:26 np0005465988 nova_compute[236126]: 2025-10-02 12:05:26.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:05:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:05:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:27.333 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:05:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:27.334 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:05:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:05:27.334 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:05:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:27.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:27 np0005465988 nova_compute[236126]: 2025-10-02 12:05:27.707 2 DEBUG nova.compute.manager [req-404c2807-3942-44c7-bbca-46f24f6ebb2c req-b81ed32e-c949-478e-ba7c-e68e2d1ce8f8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Received event network-vif-plugged-669a4eb1-5ddd-4f87-9d86-105e09015429 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:05:27 np0005465988 nova_compute[236126]: 2025-10-02 12:05:27.707 2 DEBUG oslo_concurrency.lockutils [req-404c2807-3942-44c7-bbca-46f24f6ebb2c req-b81ed32e-c949-478e-ba7c-e68e2d1ce8f8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:05:27 np0005465988 nova_compute[236126]: 2025-10-02 12:05:27.707 2 DEBUG oslo_concurrency.lockutils [req-404c2807-3942-44c7-bbca-46f24f6ebb2c req-b81ed32e-c949-478e-ba7c-e68e2d1ce8f8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:05:27 np0005465988 nova_compute[236126]: 2025-10-02 12:05:27.707 2 DEBUG oslo_concurrency.lockutils [req-404c2807-3942-44c7-bbca-46f24f6ebb2c req-b81ed32e-c949-478e-ba7c-e68e2d1ce8f8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:05:27 np0005465988 nova_compute[236126]: 2025-10-02 12:05:27.708 2 DEBUG nova.compute.manager [req-404c2807-3942-44c7-bbca-46f24f6ebb2c req-b81ed32e-c949-478e-ba7c-e68e2d1ce8f8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] No waiting events found dispatching network-vif-plugged-669a4eb1-5ddd-4f87-9d86-105e09015429 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:05:27 np0005465988 nova_compute[236126]: 2025-10-02 12:05:27.708 2 WARNING nova.compute.manager [req-404c2807-3942-44c7-bbca-46f24f6ebb2c req-b81ed32e-c949-478e-ba7c-e68e2d1ce8f8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Received unexpected event network-vif-plugged-669a4eb1-5ddd-4f87-9d86-105e09015429 for instance with vm_state active and task_state None.
Oct  2 08:05:28 np0005465988 nova_compute[236126]: 2025-10-02 12:05:28.088 2 DEBUG oslo_concurrency.lockutils [None req-abad7447-9e12-452e-a2f4-7e13cb965c7c 6d62db4a2ce1452594a6f86f4302e10a 9da12fce614f48e180f0bab1515f9dc7 - - default default] Acquiring lock "refresh_cache-e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:05:28 np0005465988 nova_compute[236126]: 2025-10-02 12:05:28.089 2 DEBUG oslo_concurrency.lockutils [None req-abad7447-9e12-452e-a2f4-7e13cb965c7c 6d62db4a2ce1452594a6f86f4302e10a 9da12fce614f48e180f0bab1515f9dc7 - - default default] Acquired lock "refresh_cache-e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:05:28 np0005465988 nova_compute[236126]: 2025-10-02 12:05:28.089 2 DEBUG nova.network.neutron [None req-abad7447-9e12-452e-a2f4-7e13cb965c7c 6d62db4a2ce1452594a6f86f4302e10a 9da12fce614f48e180f0bab1515f9dc7 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:05:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:28.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:29.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:30 np0005465988 nova_compute[236126]: 2025-10-02 12:05:30.038 2 DEBUG nova.network.neutron [None req-abad7447-9e12-452e-a2f4-7e13cb965c7c 6d62db4a2ce1452594a6f86f4302e10a 9da12fce614f48e180f0bab1515f9dc7 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Updating instance_info_cache with network_info: [{"id": "669a4eb1-5ddd-4f87-9d86-105e09015429", "address": "fa:16:3e:bc:fb:8e", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669a4eb1-5d", "ovs_interfaceid": "669a4eb1-5ddd-4f87-9d86-105e09015429", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:05:30 np0005465988 nova_compute[236126]: 2025-10-02 12:05:30.055 2 DEBUG oslo_concurrency.lockutils [None req-abad7447-9e12-452e-a2f4-7e13cb965c7c 6d62db4a2ce1452594a6f86f4302e10a 9da12fce614f48e180f0bab1515f9dc7 - - default default] Releasing lock "refresh_cache-e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:05:30 np0005465988 nova_compute[236126]: 2025-10-02 12:05:30.055 2 DEBUG nova.compute.manager [None req-abad7447-9e12-452e-a2f4-7e13cb965c7c 6d62db4a2ce1452594a6f86f4302e10a 9da12fce614f48e180f0bab1515f9dc7 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144
Oct  2 08:05:30 np0005465988 nova_compute[236126]: 2025-10-02 12:05:30.056 2 DEBUG nova.compute.manager [None req-abad7447-9e12-452e-a2f4-7e13cb965c7c 6d62db4a2ce1452594a6f86f4302e10a 9da12fce614f48e180f0bab1515f9dc7 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] network_info to inject: |[{"id": "669a4eb1-5ddd-4f87-9d86-105e09015429", "address": "fa:16:3e:bc:fb:8e", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669a4eb1-5d", "ovs_interfaceid": "669a4eb1-5ddd-4f87-9d86-105e09015429", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145
Oct  2 08:05:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:30.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:30 np0005465988 nova_compute[236126]: 2025-10-02 12:05:30.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:05:31 np0005465988 nova_compute[236126]: 2025-10-02 12:05:31.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:05:31 np0005465988 nova_compute[236126]: 2025-10-02 12:05:31.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:05:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:31.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:31 np0005465988 nova_compute[236126]: 2025-10-02 12:05:31.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:05:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:05:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:32.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.426915) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406732427009, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2415, "num_deletes": 258, "total_data_size": 5616106, "memory_usage": 5708576, "flush_reason": "Manual Compaction"}
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406732446446, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 3607201, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25509, "largest_seqno": 27919, "table_properties": {"data_size": 3597676, "index_size": 5891, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 20655, "raw_average_key_size": 20, "raw_value_size": 3577976, "raw_average_value_size": 3497, "num_data_blocks": 260, "num_entries": 1023, "num_filter_entries": 1023, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759406543, "oldest_key_time": 1759406543, "file_creation_time": 1759406732, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 19700 microseconds, and 12871 cpu microseconds.
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.446620) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 3607201 bytes OK
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.446699) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.448655) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.448679) EVENT_LOG_v1 {"time_micros": 1759406732448672, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.448702) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 5605387, prev total WAL file size 5605387, number of live WAL files 2.
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.451453) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353031' seq:72057594037927935, type:22 .. '6C6F676D00373533' seq:0, type:0; will stop at (end)
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3522KB)], [51(8828KB)]
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406732451520, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 12647711, "oldest_snapshot_seqno": -1}
Oct  2 08:05:32 np0005465988 nova_compute[236126]: 2025-10-02 12:05:32.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5365 keys, 12528517 bytes, temperature: kUnknown
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406732505522, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 12528517, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12488195, "index_size": 25812, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13445, "raw_key_size": 134712, "raw_average_key_size": 25, "raw_value_size": 12387222, "raw_average_value_size": 2308, "num_data_blocks": 1069, "num_entries": 5365, "num_filter_entries": 5365, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759406732, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.505747) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 12528517 bytes
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.506752) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 233.9 rd, 231.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 8.6 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(7.0) write-amplify(3.5) OK, records in: 5902, records dropped: 537 output_compression: NoCompression
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.506767) EVENT_LOG_v1 {"time_micros": 1759406732506760, "job": 30, "event": "compaction_finished", "compaction_time_micros": 54079, "compaction_time_cpu_micros": 26971, "output_level": 6, "num_output_files": 1, "total_output_size": 12528517, "num_input_records": 5902, "num_output_records": 5365, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406732507401, "job": 30, "event": "table_file_deletion", "file_number": 53}
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406732508917, "job": 30, "event": "table_file_deletion", "file_number": 51}
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.451266) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.509003) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.509009) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.509012) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.509016) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:05:32 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:32.509019) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:05:32 np0005465988 nova_compute[236126]: 2025-10-02 12:05:32.663 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:32 np0005465988 nova_compute[236126]: 2025-10-02 12:05:32.664 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:32 np0005465988 nova_compute[236126]: 2025-10-02 12:05:32.664 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:32 np0005465988 nova_compute[236126]: 2025-10-02 12:05:32.664 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:05:32 np0005465988 nova_compute[236126]: 2025-10-02 12:05:32.665 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:05:33 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/374108653' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:05:33 np0005465988 nova_compute[236126]: 2025-10-02 12:05:33.136 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:33 np0005465988 nova_compute[236126]: 2025-10-02 12:05:33.233 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000001d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:05:33 np0005465988 nova_compute[236126]: 2025-10-02 12:05:33.234 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000001d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:05:33 np0005465988 nova_compute[236126]: 2025-10-02 12:05:33.237 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000018 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:05:33 np0005465988 nova_compute[236126]: 2025-10-02 12:05:33.237 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000018 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:05:33 np0005465988 nova_compute[236126]: 2025-10-02 12:05:33.402 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:05:33 np0005465988 nova_compute[236126]: 2025-10-02 12:05:33.404 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4437MB free_disk=20.78548812866211GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:05:33 np0005465988 nova_compute[236126]: 2025-10-02 12:05:33.404 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:33 np0005465988 nova_compute[236126]: 2025-10-02 12:05:33.404 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:05:33 np0005465988 nova_compute[236126]: 2025-10-02 12:05:33.518 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 17426d60-57ac-41a5-9ae2-688821fe7f56 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:05:33 np0005465988 nova_compute[236126]: 2025-10-02 12:05:33.518 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance e25deec6-82e8-43a2-b508-c3b3fa2d4f4e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:05:33 np0005465988 nova_compute[236126]: 2025-10-02 12:05:33.519 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:05:33 np0005465988 nova_compute[236126]: 2025-10-02 12:05:33.519 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:05:33 np0005465988 nova_compute[236126]: 2025-10-02 12:05:33.608 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:05:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:33.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:05:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:05:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2636885787' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:05:34 np0005465988 nova_compute[236126]: 2025-10-02 12:05:34.072 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:34 np0005465988 nova_compute[236126]: 2025-10-02 12:05:34.078 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:05:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:34.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:34 np0005465988 nova_compute[236126]: 2025-10-02 12:05:34.272 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:05:34 np0005465988 nova_compute[236126]: 2025-10-02 12:05:34.964 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:05:34 np0005465988 nova_compute[236126]: 2025-10-02 12:05:34.965 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.233865) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406735234313, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 291, "num_deletes": 251, "total_data_size": 101757, "memory_usage": 108520, "flush_reason": "Manual Compaction"}
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e154 e154: 3 total, 3 up, 3 in
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406735236535, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 66542, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27924, "largest_seqno": 28210, "table_properties": {"data_size": 64637, "index_size": 133, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5160, "raw_average_key_size": 18, "raw_value_size": 60742, "raw_average_value_size": 219, "num_data_blocks": 6, "num_entries": 277, "num_filter_entries": 277, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759406733, "oldest_key_time": 1759406733, "file_creation_time": 1759406735, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 2816 microseconds, and 763 cpu microseconds.
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.236692) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 66542 bytes OK
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.236768) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.237874) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.237886) EVENT_LOG_v1 {"time_micros": 1759406735237883, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.237899) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 99565, prev total WAL file size 99606, number of live WAL files 2.
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.238731) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(64KB)], [54(11MB)]
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406735238820, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 12595059, "oldest_snapshot_seqno": -1}
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5130 keys, 10659408 bytes, temperature: kUnknown
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406735291706, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 10659408, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10622284, "index_size": 23190, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12869, "raw_key_size": 130571, "raw_average_key_size": 25, "raw_value_size": 10526908, "raw_average_value_size": 2052, "num_data_blocks": 951, "num_entries": 5130, "num_filter_entries": 5130, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759406735, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.292064) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 10659408 bytes
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.298342) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 237.6 rd, 201.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 11.9 +0.0 blob) out(10.2 +0.0 blob), read-write-amplify(349.5) write-amplify(160.2) OK, records in: 5642, records dropped: 512 output_compression: NoCompression
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.298404) EVENT_LOG_v1 {"time_micros": 1759406735298384, "job": 32, "event": "compaction_finished", "compaction_time_micros": 53012, "compaction_time_cpu_micros": 25356, "output_level": 6, "num_output_files": 1, "total_output_size": 10659408, "num_input_records": 5642, "num_output_records": 5130, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406735298635, "job": 32, "event": "table_file_deletion", "file_number": 56}
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406735300804, "job": 32, "event": "table_file_deletion", "file_number": 54}
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.238619) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.300880) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.300886) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.300888) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.300890) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:05:35.300891) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:05:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:35.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:35 np0005465988 nova_compute[236126]: 2025-10-02 12:05:35.965 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:35 np0005465988 nova_compute[236126]: 2025-10-02 12:05:35.966 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:35 np0005465988 nova_compute[236126]: 2025-10-02 12:05:35.966 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:35 np0005465988 nova_compute[236126]: 2025-10-02 12:05:35.966 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:35 np0005465988 nova_compute[236126]: 2025-10-02 12:05:35.966 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:05:35 np0005465988 nova_compute[236126]: 2025-10-02 12:05:35.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:36.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:36 np0005465988 nova_compute[236126]: 2025-10-02 12:05:36.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:36 np0005465988 nova_compute[236126]: 2025-10-02 12:05:36.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:05:36 np0005465988 nova_compute[236126]: 2025-10-02 12:05:36.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:05:36 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Oct  2 08:05:36 np0005465988 nova_compute[236126]: 2025-10-02 12:05:36.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:05:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:37.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:05:38Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bc:fb:8e 10.100.0.14
Oct  2 08:05:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:05:38Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bc:fb:8e 10.100.0.14
Oct  2 08:05:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:38.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:38 np0005465988 nova_compute[236126]: 2025-10-02 12:05:38.726 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-17426d60-57ac-41a5-9ae2-688821fe7f56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:05:38 np0005465988 nova_compute[236126]: 2025-10-02 12:05:38.728 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-17426d60-57ac-41a5-9ae2-688821fe7f56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:05:38 np0005465988 nova_compute[236126]: 2025-10-02 12:05:38.728 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:05:38 np0005465988 nova_compute[236126]: 2025-10-02 12:05:38.728 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 17426d60-57ac-41a5-9ae2-688821fe7f56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:05:39 np0005465988 podman[250365]: 2025-10-02 12:05:39.569992728 +0000 UTC m=+0.076105822 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:05:39 np0005465988 podman[250364]: 2025-10-02 12:05:39.613413684 +0000 UTC m=+0.127576891 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:05:39 np0005465988 podman[250363]: 2025-10-02 12:05:39.625664578 +0000 UTC m=+0.145264112 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:05:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:39.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:40.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:40 np0005465988 nova_compute[236126]: 2025-10-02 12:05:40.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:05:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:41.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:05:41 np0005465988 nova_compute[236126]: 2025-10-02 12:05:41.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:41 np0005465988 nova_compute[236126]: 2025-10-02 12:05:41.821 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Updating instance_info_cache with network_info: [{"id": "f98d3352-aeeb-4929-9920-2a306cb9558d", "address": "fa:16:3e:69:cf:4c", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf98d3352-ae", "ovs_interfaceid": "f98d3352-aeeb-4929-9920-2a306cb9558d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:05:41 np0005465988 nova_compute[236126]: 2025-10-02 12:05:41.846 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-17426d60-57ac-41a5-9ae2-688821fe7f56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:05:41 np0005465988 nova_compute[236126]: 2025-10-02 12:05:41.847 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:05:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:05:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:42.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:05:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:05:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e155 e155: 3 total, 3 up, 3 in
Oct  2 08:05:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:43.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:43 np0005465988 nova_compute[236126]: 2025-10-02 12:05:43.841 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:43 np0005465988 nova_compute[236126]: 2025-10-02 12:05:43.841 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:44.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:45.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:46 np0005465988 nova_compute[236126]: 2025-10-02 12:05:46.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:46.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:46 np0005465988 nova_compute[236126]: 2025-10-02 12:05:46.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:05:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:47.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:05:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:48.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:05:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:49.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:50.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:51 np0005465988 nova_compute[236126]: 2025-10-02 12:05:51.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e156 e156: 3 total, 3 up, 3 in
Oct  2 08:05:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:05:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:51.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:05:51 np0005465988 nova_compute[236126]: 2025-10-02 12:05:51.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:05:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:52.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:05:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:05:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e157 e157: 3 total, 3 up, 3 in
Oct  2 08:05:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:05:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:53.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:05:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:54.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e158 e158: 3 total, 3 up, 3 in
Oct  2 08:05:54 np0005465988 podman[250483]: 2025-10-02 12:05:54.529519568 +0000 UTC m=+0.062849928 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:05:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:55.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:56 np0005465988 nova_compute[236126]: 2025-10-02 12:05:56.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:56.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:56 np0005465988 nova_compute[236126]: 2025-10-02 12:05:56.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:05:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:57.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:05:58.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:05:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:05:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:05:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:05:59.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:00.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:01 np0005465988 nova_compute[236126]: 2025-10-02 12:06:01.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:01.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:01 np0005465988 nova_compute[236126]: 2025-10-02 12:06:01.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:02.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:06:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e159 e159: 3 total, 3 up, 3 in
Oct  2 08:06:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:03.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:04.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:05.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:06 np0005465988 nova_compute[236126]: 2025-10-02 12:06:06.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:06.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:06 np0005465988 nova_compute[236126]: 2025-10-02 12:06:06.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:06:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:07.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:08.106 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:08 np0005465988 nova_compute[236126]: 2025-10-02 12:06:08.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:08.108 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:06:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:08.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:09.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:10.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:10 np0005465988 podman[250512]: 2025-10-02 12:06:10.555484628 +0000 UTC m=+0.077979310 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:06:10 np0005465988 podman[250511]: 2025-10-02 12:06:10.567188099 +0000 UTC m=+0.099731564 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:06:10 np0005465988 podman[250510]: 2025-10-02 12:06:10.56791982 +0000 UTC m=+0.104402480 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:06:11 np0005465988 nova_compute[236126]: 2025-10-02 12:06:11.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:06:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:11.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:06:11 np0005465988 nova_compute[236126]: 2025-10-02 12:06:11.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:12.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:06:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:13.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:14.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e160 e160: 3 total, 3 up, 3 in
Oct  2 08:06:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:15.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:16 np0005465988 nova_compute[236126]: 2025-10-02 12:06:16.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:16.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:16 np0005465988 nova_compute[236126]: 2025-10-02 12:06:16.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:17.111 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:06:17 np0005465988 ceph-osd[79039]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Oct  2 08:06:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:17.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:18.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:19 np0005465988 ceph-osd[79039]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Oct  2 08:06:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:19.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:20.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:21 np0005465988 nova_compute[236126]: 2025-10-02 12:06:21.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:21.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:21 np0005465988 nova_compute[236126]: 2025-10-02 12:06:21.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:22.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:06:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e161 e161: 3 total, 3 up, 3 in
Oct  2 08:06:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:23.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:24.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:25 np0005465988 podman[250632]: 2025-10-02 12:06:25.533609568 +0000 UTC m=+0.071332188 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct  2 08:06:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:25.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:26 np0005465988 nova_compute[236126]: 2025-10-02 12:06:26.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:26.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:26 np0005465988 nova_compute[236126]: 2025-10-02 12:06:26.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:06:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:27.334 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:27.335 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:27.335 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:27.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:28.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:29 np0005465988 nova_compute[236126]: 2025-10-02 12:06:29.565 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Acquiring lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:29 np0005465988 nova_compute[236126]: 2025-10-02 12:06:29.566 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:29 np0005465988 nova_compute[236126]: 2025-10-02 12:06:29.596 2 DEBUG nova.compute.manager [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:06:29 np0005465988 nova_compute[236126]: 2025-10-02 12:06:29.688 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:29 np0005465988 nova_compute[236126]: 2025-10-02 12:06:29.689 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:29 np0005465988 nova_compute[236126]: 2025-10-02 12:06:29.699 2 DEBUG nova.virt.hardware [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:06:29 np0005465988 nova_compute[236126]: 2025-10-02 12:06:29.700 2 INFO nova.compute.claims [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:06:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:29.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:29 np0005465988 nova_compute[236126]: 2025-10-02 12:06:29.854 2 DEBUG oslo_concurrency.processutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:30.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.329 2 DEBUG oslo_concurrency.processutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.339 2 DEBUG nova.compute.provider_tree [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.393 2 DEBUG nova.scheduler.client.report [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.434 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.435 2 DEBUG nova.compute.manager [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.511 2 DEBUG nova.compute.manager [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.512 2 DEBUG nova.network.neutron [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.543 2 INFO nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.572 2 DEBUG nova.compute.manager [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.674 2 DEBUG nova.compute.manager [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.675 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.676 2 INFO nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Creating image(s)#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.705 2 DEBUG nova.storage.rbd_utils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] rbd image 8d7306df-bd40-48a7-99a7-36da8b9a67f3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.739 2 DEBUG nova.storage.rbd_utils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] rbd image 8d7306df-bd40-48a7-99a7-36da8b9a67f3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.767 2 DEBUG nova.storage.rbd_utils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] rbd image 8d7306df-bd40-48a7-99a7-36da8b9a67f3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.770 2 DEBUG oslo_concurrency.processutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.839 2 DEBUG oslo_concurrency.processutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.840 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.841 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.841 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.866 2 DEBUG nova.storage.rbd_utils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] rbd image 8d7306df-bd40-48a7-99a7-36da8b9a67f3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:06:30 np0005465988 nova_compute[236126]: 2025-10-02 12:06:30.871 2 DEBUG oslo_concurrency.processutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 8d7306df-bd40-48a7-99a7-36da8b9a67f3_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.494 2 DEBUG oslo_concurrency.processutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 8d7306df-bd40-48a7-99a7-36da8b9a67f3_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.623s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.598 2 DEBUG nova.storage.rbd_utils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] resizing rbd image 8d7306df-bd40-48a7-99a7-36da8b9a67f3_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.646 2 DEBUG nova.policy [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ec17c54e24584f11a5348b68d6e7ca85', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7359a7dad3b849bfbf075b88f2a261b4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.750 2 DEBUG nova.objects.instance [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lazy-loading 'migration_context' on Instance uuid 8d7306df-bd40-48a7-99a7-36da8b9a67f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.765 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.766 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Ensure instance console log exists: /var/lib/nova/instances/8d7306df-bd40-48a7-99a7-36da8b9a67f3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.766 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.766 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.767 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:31.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.813 2 DEBUG oslo_concurrency.lockutils [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.813 2 DEBUG oslo_concurrency.lockutils [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.814 2 DEBUG oslo_concurrency.lockutils [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.814 2 DEBUG oslo_concurrency.lockutils [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.814 2 DEBUG oslo_concurrency.lockutils [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.816 2 INFO nova.compute.manager [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Terminating instance#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.817 2 DEBUG nova.compute.manager [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:06:31 np0005465988 nova_compute[236126]: 2025-10-02 12:06:31.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:32 np0005465988 kernel: tap669a4eb1-5d (unregistering): left promiscuous mode
Oct  2 08:06:32 np0005465988 NetworkManager[45041]: <info>  [1759406792.1194] device (tap669a4eb1-5d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:06:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:32Z|00064|binding|INFO|Releasing lport 669a4eb1-5ddd-4f87-9d86-105e09015429 from this chassis (sb_readonly=0)
Oct  2 08:06:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:32Z|00065|binding|INFO|Setting lport 669a4eb1-5ddd-4f87-9d86-105e09015429 down in Southbound
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:32Z|00066|binding|INFO|Removing iface tap669a4eb1-5d ovn-installed in OVS
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:32.148 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:fb:8e 10.100.0.14'], port_security=['fa:16:3e:bc:fb:8e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e25deec6-82e8-43a2-b508-c3b3fa2d4f4e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80106802-d877-42c6-b2a9-50b050f6b08f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9afa78cc4dec419babdf61fd31f46e28', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8fbb5420-10f4-405b-bd01-713020f7e518', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03aa6f10-2374-4fa3-bc90-1fcb8815afb8, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=669a4eb1-5ddd-4f87-9d86-105e09015429) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:32.150 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 669a4eb1-5ddd-4f87-9d86-105e09015429 in datapath 80106802-d877-42c6-b2a9-50b050f6b08f unbound from our chassis#033[00m
Oct  2 08:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:32.151 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 80106802-d877-42c6-b2a9-50b050f6b08f#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:32.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:32.178 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[209266aa-5255-4c11-a275-c09713dc4d58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:32 np0005465988 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Oct  2 08:06:32 np0005465988 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000001d.scope: Consumed 16.449s CPU time.
Oct  2 08:06:32 np0005465988 systemd-machined[192594]: Machine qemu-12-instance-0000001d terminated.
Oct  2 08:06:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:32.222 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[899fd232-0deb-4abe-aebd-3f0ef6300d6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:32.229 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[875499bb-ac4b-4ed5-9a49-53d775dd7cbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.260 2 INFO nova.virt.libvirt.driver [-] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Instance destroyed successfully.#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.261 2 DEBUG nova.objects.instance [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lazy-loading 'resources' on Instance uuid e25deec6-82e8-43a2-b508-c3b3fa2d4f4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:32.270 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[da855ee0-3e49-4aca-a572-4fc83b060316]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.288 2 DEBUG nova.virt.libvirt.vif [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:05:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-237972914',display_name='tempest-ServersAdminTestJSON-server-237972914',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-237972914',id=29,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:05:25Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9afa78cc4dec419babdf61fd31f46e28',ramdisk_id='',reservation_id='r-0l53rwnu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-518249049',owner_user_name='tempest-ServersAdminTestJSON-518249049-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:05:25Z,user_data=None,user_id='8850add40b254d198f270d9e64c777d5',uuid=e25deec6-82e8-43a2-b508-c3b3fa2d4f4e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "669a4eb1-5ddd-4f87-9d86-105e09015429", "address": "fa:16:3e:bc:fb:8e", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669a4eb1-5d", "ovs_interfaceid": "669a4eb1-5ddd-4f87-9d86-105e09015429", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.288 2 DEBUG nova.network.os_vif_util [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Converting VIF {"id": "669a4eb1-5ddd-4f87-9d86-105e09015429", "address": "fa:16:3e:bc:fb:8e", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669a4eb1-5d", "ovs_interfaceid": "669a4eb1-5ddd-4f87-9d86-105e09015429", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.290 2 DEBUG nova.network.os_vif_util [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bc:fb:8e,bridge_name='br-int',has_traffic_filtering=True,id=669a4eb1-5ddd-4f87-9d86-105e09015429,network=Network(80106802-d877-42c6-b2a9-50b050f6b08f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap669a4eb1-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.290 2 DEBUG os_vif [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bc:fb:8e,bridge_name='br-int',has_traffic_filtering=True,id=669a4eb1-5ddd-4f87-9d86-105e09015429,network=Network(80106802-d877-42c6-b2a9-50b050f6b08f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap669a4eb1-5d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.294 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap669a4eb1-5d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:32.294 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7e5c8616-18f1-47c4-9611-1cbaf146ea4a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap80106802-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:27:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473838, 'reachable_time': 20756, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250915, 'error': None, 'target': 'ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.304 2 INFO os_vif [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bc:fb:8e,bridge_name='br-int',has_traffic_filtering=True,id=669a4eb1-5ddd-4f87-9d86-105e09015429,network=Network(80106802-d877-42c6-b2a9-50b050f6b08f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap669a4eb1-5d')#033[00m
Oct  2 08:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:32.320 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[24acfa19-d02f-44e5-ac66-e94b9b407ec8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap80106802-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473850, 'tstamp': 473850}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250916, 'error': None, 'target': 'ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap80106802-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473853, 'tstamp': 473853}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250916, 'error': None, 'target': 'ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:32.322 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80106802-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:32.325 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap80106802-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:32.326 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:32.326 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap80106802-d0, col_values=(('external_ids', {'iface-id': '3e3f512e-f85f-4c9c-b91d-072c570470c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:32.327 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.512 2 DEBUG nova.compute.manager [req-6e691cb4-19c6-4033-a602-ba5b94002bfe req-49cd83c6-3941-4bb8-b7e5-649e0c2b28b1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Received event network-vif-unplugged-669a4eb1-5ddd-4f87-9d86-105e09015429 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.512 2 DEBUG oslo_concurrency.lockutils [req-6e691cb4-19c6-4033-a602-ba5b94002bfe req-49cd83c6-3941-4bb8-b7e5-649e0c2b28b1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.513 2 DEBUG oslo_concurrency.lockutils [req-6e691cb4-19c6-4033-a602-ba5b94002bfe req-49cd83c6-3941-4bb8-b7e5-649e0c2b28b1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.513 2 DEBUG oslo_concurrency.lockutils [req-6e691cb4-19c6-4033-a602-ba5b94002bfe req-49cd83c6-3941-4bb8-b7e5-649e0c2b28b1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.513 2 DEBUG nova.compute.manager [req-6e691cb4-19c6-4033-a602-ba5b94002bfe req-49cd83c6-3941-4bb8-b7e5-649e0c2b28b1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] No waiting events found dispatching network-vif-unplugged-669a4eb1-5ddd-4f87-9d86-105e09015429 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.514 2 DEBUG nova.compute.manager [req-6e691cb4-19c6-4033-a602-ba5b94002bfe req-49cd83c6-3941-4bb8-b7e5-649e0c2b28b1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Received event network-vif-unplugged-669a4eb1-5ddd-4f87-9d86-105e09015429 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:06:32 np0005465988 nova_compute[236126]: 2025-10-02 12:06:32.775 2 DEBUG nova.network.neutron [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Successfully created port: a8ce8a67-762d-41a0-8c12-778f66e87f3c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:06:33 np0005465988 nova_compute[236126]: 2025-10-02 12:06:33.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:33 np0005465988 nova_compute[236126]: 2025-10-02 12:06:33.790 2 INFO nova.virt.libvirt.driver [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Deleting instance files /var/lib/nova/instances/e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_del#033[00m
Oct  2 08:06:33 np0005465988 nova_compute[236126]: 2025-10-02 12:06:33.791 2 INFO nova.virt.libvirt.driver [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Deletion of /var/lib/nova/instances/e25deec6-82e8-43a2-b508-c3b3fa2d4f4e_del complete#033[00m
Oct  2 08:06:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:33.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:33 np0005465988 nova_compute[236126]: 2025-10-02 12:06:33.847 2 INFO nova.compute.manager [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Took 2.03 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:06:33 np0005465988 nova_compute[236126]: 2025-10-02 12:06:33.847 2 DEBUG oslo.service.loopingcall [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:06:33 np0005465988 nova_compute[236126]: 2025-10-02 12:06:33.848 2 DEBUG nova.compute.manager [-] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:06:33 np0005465988 nova_compute[236126]: 2025-10-02 12:06:33.848 2 DEBUG nova.network.neutron [-] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:06:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:34.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.234 2 DEBUG nova.network.neutron [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Successfully updated port: a8ce8a67-762d-41a0-8c12-778f66e87f3c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.253 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Acquiring lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.254 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Acquired lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.254 2 DEBUG nova.network.neutron [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.482 2 DEBUG nova.network.neutron [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.497 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.497 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.498 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.499 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.499 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:34 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:06:34 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:06:34 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.646 2 DEBUG nova.compute.manager [req-54b8ee11-32d6-44e9-883e-4014a5188245 req-04be913f-3ea3-42b8-8e8b-7293567aab81 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Received event network-vif-plugged-669a4eb1-5ddd-4f87-9d86-105e09015429 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.647 2 DEBUG oslo_concurrency.lockutils [req-54b8ee11-32d6-44e9-883e-4014a5188245 req-04be913f-3ea3-42b8-8e8b-7293567aab81 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.647 2 DEBUG oslo_concurrency.lockutils [req-54b8ee11-32d6-44e9-883e-4014a5188245 req-04be913f-3ea3-42b8-8e8b-7293567aab81 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.648 2 DEBUG oslo_concurrency.lockutils [req-54b8ee11-32d6-44e9-883e-4014a5188245 req-04be913f-3ea3-42b8-8e8b-7293567aab81 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.648 2 DEBUG nova.compute.manager [req-54b8ee11-32d6-44e9-883e-4014a5188245 req-04be913f-3ea3-42b8-8e8b-7293567aab81 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] No waiting events found dispatching network-vif-plugged-669a4eb1-5ddd-4f87-9d86-105e09015429 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.648 2 WARNING nova.compute.manager [req-54b8ee11-32d6-44e9-883e-4014a5188245 req-04be913f-3ea3-42b8-8e8b-7293567aab81 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Received unexpected event network-vif-plugged-669a4eb1-5ddd-4f87-9d86-105e09015429 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.648 2 DEBUG nova.compute.manager [req-54b8ee11-32d6-44e9-883e-4014a5188245 req-04be913f-3ea3-42b8-8e8b-7293567aab81 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Received event network-changed-a8ce8a67-762d-41a0-8c12-778f66e87f3c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.649 2 DEBUG nova.compute.manager [req-54b8ee11-32d6-44e9-883e-4014a5188245 req-04be913f-3ea3-42b8-8e8b-7293567aab81 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Refreshing instance network info cache due to event network-changed-a8ce8a67-762d-41a0-8c12-778f66e87f3c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.649 2 DEBUG oslo_concurrency.lockutils [req-54b8ee11-32d6-44e9-883e-4014a5188245 req-04be913f-3ea3-42b8-8e8b-7293567aab81 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:06:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3093107111' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:06:34 np0005465988 nova_compute[236126]: 2025-10-02 12:06:34.955 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.004 2 DEBUG nova.network.neutron [-] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.021 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000018 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.021 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000018 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.029 2 INFO nova.compute.manager [-] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Took 1.18 seconds to deallocate network for instance.#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.084 2 DEBUG oslo_concurrency.lockutils [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.084 2 DEBUG oslo_concurrency.lockutils [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.105 2 DEBUG nova.compute.manager [req-344f8fcd-019e-4345-bf77-36ae411e2405 req-8f103d83-6bc7-4e90-9281-ff5d4d204b78 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Received event network-vif-deleted-669a4eb1-5ddd-4f87-9d86-105e09015429 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.169 2 DEBUG oslo_concurrency.processutils [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.231 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.232 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4627MB free_disk=20.783302307128906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.232 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:06:35 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/511653184' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.623 2 DEBUG oslo_concurrency.processutils [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.629 2 DEBUG nova.compute.provider_tree [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.643 2 DEBUG nova.network.neutron [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Updating instance_info_cache with network_info: [{"id": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "address": "fa:16:3e:dd:37:f7", "network": {"id": "0392b00d-9a0f-4fdc-878a-61235e8b04c7", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-321386985-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7359a7dad3b849bfbf075b88f2a261b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ce8a67-76", "ovs_interfaceid": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:35.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.815 2 DEBUG nova.scheduler.client.report [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.841 2 DEBUG oslo_concurrency.lockutils [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.757s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.843 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.844 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Releasing lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.844 2 DEBUG nova.compute.manager [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Instance network_info: |[{"id": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "address": "fa:16:3e:dd:37:f7", "network": {"id": "0392b00d-9a0f-4fdc-878a-61235e8b04c7", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-321386985-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7359a7dad3b849bfbf075b88f2a261b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ce8a67-76", "ovs_interfaceid": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.845 2 DEBUG oslo_concurrency.lockutils [req-54b8ee11-32d6-44e9-883e-4014a5188245 req-04be913f-3ea3-42b8-8e8b-7293567aab81 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.846 2 DEBUG nova.network.neutron [req-54b8ee11-32d6-44e9-883e-4014a5188245 req-04be913f-3ea3-42b8-8e8b-7293567aab81 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Refreshing network info cache for port a8ce8a67-762d-41a0-8c12-778f66e87f3c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.849 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Start _get_guest_xml network_info=[{"id": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "address": "fa:16:3e:dd:37:f7", "network": {"id": "0392b00d-9a0f-4fdc-878a-61235e8b04c7", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-321386985-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7359a7dad3b849bfbf075b88f2a261b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ce8a67-76", "ovs_interfaceid": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.855 2 WARNING nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.865 2 DEBUG nova.virt.libvirt.host [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.866 2 DEBUG nova.virt.libvirt.host [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.870 2 DEBUG nova.virt.libvirt.host [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.871 2 DEBUG nova.virt.libvirt.host [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.872 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.873 2 DEBUG nova.virt.hardware [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.874 2 DEBUG nova.virt.hardware [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.874 2 DEBUG nova.virt.hardware [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.875 2 DEBUG nova.virt.hardware [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.875 2 DEBUG nova.virt.hardware [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.875 2 DEBUG nova.virt.hardware [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.875 2 DEBUG nova.virt.hardware [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.875 2 DEBUG nova.virt.hardware [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.876 2 DEBUG nova.virt.hardware [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.876 2 DEBUG nova.virt.hardware [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.876 2 DEBUG nova.virt.hardware [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.879 2 DEBUG oslo_concurrency.processutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.901 2 INFO nova.scheduler.client.report [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Deleted allocations for instance e25deec6-82e8-43a2-b508-c3b3fa2d4f4e#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.981 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 17426d60-57ac-41a5-9ae2-688821fe7f56 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.982 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 8d7306df-bd40-48a7-99a7-36da8b9a67f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.982 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:06:35 np0005465988 nova_compute[236126]: 2025-10-02 12:06:35.983 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.034 2 DEBUG oslo_concurrency.lockutils [None req-85ce3760-2bdb-4e37-8af8-dc3c57835831 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "e25deec6-82e8-43a2-b508-c3b3fa2d4f4e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.056 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:36.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:06:36 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1736012696' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.333 2 DEBUG oslo_concurrency.processutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.378 2 DEBUG nova.storage.rbd_utils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] rbd image 8d7306df-bd40-48a7-99a7-36da8b9a67f3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.382 2 DEBUG oslo_concurrency.processutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:06:36 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4273803553' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.523 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.529 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.561 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.603 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.604 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:06:36 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2033195360' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.790 2 DEBUG oslo_concurrency.processutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.791 2 DEBUG nova.virt.libvirt.vif [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-UpdateMultiattachVolumeNegativeTest-server-1739227330',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-updatemultiattachvolumenegativetest-server-1739227330',id=32,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLWLHYAzCSRCkStdbU+GdVhWIXiwiTci8xggQ9ThyRlprkD/MENcP1zXCe9JELWxtblFvNPabWQ+ZgjaGJX29tNuXgS46PKPgWmCmmQjfV3eqKUfK1wEy2Lz1kDGxf6LzA==',key_name='tempest-keypair-377984943',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7359a7dad3b849bfbf075b88f2a261b4',ramdisk_id='',reservation_id='r-pu3d1045',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-UpdateMultiattachVolumeNegativeTest-1815230933',owner_user_name='tempest-UpdateMultiattachVolumeNegativeTest-1815230933-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:06:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ec17c54e24584f11a5348b68d6e7ca85',uuid=8d7306df-bd40-48a7-99a7-36da8b9a67f3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "address": "fa:16:3e:dd:37:f7", "network": {"id": "0392b00d-9a0f-4fdc-878a-61235e8b04c7", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-321386985-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7359a7dad3b849bfbf075b88f2a261b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ce8a67-76", "ovs_interfaceid": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.792 2 DEBUG nova.network.os_vif_util [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Converting VIF {"id": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "address": "fa:16:3e:dd:37:f7", "network": {"id": "0392b00d-9a0f-4fdc-878a-61235e8b04c7", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-321386985-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7359a7dad3b849bfbf075b88f2a261b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ce8a67-76", "ovs_interfaceid": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.793 2 DEBUG nova.network.os_vif_util [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:37:f7,bridge_name='br-int',has_traffic_filtering=True,id=a8ce8a67-762d-41a0-8c12-778f66e87f3c,network=Network(0392b00d-9a0f-4fdc-878a-61235e8b04c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ce8a67-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.796 2 DEBUG nova.objects.instance [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8d7306df-bd40-48a7-99a7-36da8b9a67f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.814 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  <uuid>8d7306df-bd40-48a7-99a7-36da8b9a67f3</uuid>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  <name>instance-00000020</name>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <nova:name>tempest-UpdateMultiattachVolumeNegativeTest-server-1739227330</nova:name>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:06:35</nova:creationTime>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <nova:user uuid="ec17c54e24584f11a5348b68d6e7ca85">tempest-UpdateMultiattachVolumeNegativeTest-1815230933-project-member</nova:user>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <nova:project uuid="7359a7dad3b849bfbf075b88f2a261b4">tempest-UpdateMultiattachVolumeNegativeTest-1815230933</nova:project>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <nova:port uuid="a8ce8a67-762d-41a0-8c12-778f66e87f3c">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <entry name="serial">8d7306df-bd40-48a7-99a7-36da8b9a67f3</entry>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <entry name="uuid">8d7306df-bd40-48a7-99a7-36da8b9a67f3</entry>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/8d7306df-bd40-48a7-99a7-36da8b9a67f3_disk">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/8d7306df-bd40-48a7-99a7-36da8b9a67f3_disk.config">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:dd:37:f7"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <target dev="tapa8ce8a67-76"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/8d7306df-bd40-48a7-99a7-36da8b9a67f3/console.log" append="off"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:06:36 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:06:36 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:06:36 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:06:36 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.815 2 DEBUG nova.compute.manager [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Preparing to wait for external event network-vif-plugged-a8ce8a67-762d-41a0-8c12-778f66e87f3c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.815 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Acquiring lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.815 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.815 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.816 2 DEBUG nova.virt.libvirt.vif [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-UpdateMultiattachVolumeNegativeTest-server-1739227330',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-updatemultiattachvolumenegativetest-server-1739227330',id=32,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLWLHYAzCSRCkStdbU+GdVhWIXiwiTci8xggQ9ThyRlprkD/MENcP1zXCe9JELWxtblFvNPabWQ+ZgjaGJX29tNuXgS46PKPgWmCmmQjfV3eqKUfK1wEy2Lz1kDGxf6LzA==',key_name='tempest-keypair-377984943',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7359a7dad3b849bfbf075b88f2a261b4',ramdisk_id='',reservation_id='r-pu3d1045',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-UpdateMultiattachVolumeNegativeTest-1815230933',owner_user_name='tempest-UpdateMultiattachVolumeNegativeTest-1815230933-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:06:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ec17c54e24584f11a5348b68d6e7ca85',uuid=8d7306df-bd40-48a7-99a7-36da8b9a67f3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "address": "fa:16:3e:dd:37:f7", "network": {"id": "0392b00d-9a0f-4fdc-878a-61235e8b04c7", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-321386985-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7359a7dad3b849bfbf075b88f2a261b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ce8a67-76", "ovs_interfaceid": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.816 2 DEBUG nova.network.os_vif_util [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Converting VIF {"id": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "address": "fa:16:3e:dd:37:f7", "network": {"id": "0392b00d-9a0f-4fdc-878a-61235e8b04c7", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-321386985-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7359a7dad3b849bfbf075b88f2a261b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ce8a67-76", "ovs_interfaceid": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.817 2 DEBUG nova.network.os_vif_util [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:37:f7,bridge_name='br-int',has_traffic_filtering=True,id=a8ce8a67-762d-41a0-8c12-778f66e87f3c,network=Network(0392b00d-9a0f-4fdc-878a-61235e8b04c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ce8a67-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.817 2 DEBUG os_vif [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:37:f7,bridge_name='br-int',has_traffic_filtering=True,id=a8ce8a67-762d-41a0-8c12-778f66e87f3c,network=Network(0392b00d-9a0f-4fdc-878a-61235e8b04c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ce8a67-76') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.818 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.819 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.822 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa8ce8a67-76, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.822 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa8ce8a67-76, col_values=(('external_ids', {'iface-id': 'a8ce8a67-762d-41a0-8c12-778f66e87f3c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:37:f7', 'vm-uuid': '8d7306df-bd40-48a7-99a7-36da8b9a67f3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:36 np0005465988 NetworkManager[45041]: <info>  [1759406796.8250] manager: (tapa8ce8a67-76): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.834 2 INFO os_vif [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:37:f7,bridge_name='br-int',has_traffic_filtering=True,id=a8ce8a67-762d-41a0-8c12-778f66e87f3c,network=Network(0392b00d-9a0f-4fdc-878a-61235e8b04c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ce8a67-76')#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.897 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.898 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.898 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] No VIF found with MAC fa:16:3e:dd:37:f7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.898 2 INFO nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Using config drive#033[00m
Oct  2 08:06:36 np0005465988 nova_compute[236126]: 2025-10-02 12:06:36.921 2 DEBUG nova.storage.rbd_utils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] rbd image 8d7306df-bd40-48a7-99a7-36da8b9a67f3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:06:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.244 2 INFO nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Creating config drive at /var/lib/nova/instances/8d7306df-bd40-48a7-99a7-36da8b9a67f3/disk.config#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.249 2 DEBUG oslo_concurrency.processutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8d7306df-bd40-48a7-99a7-36da8b9a67f3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2f86_ew9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.378 2 DEBUG oslo_concurrency.processutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8d7306df-bd40-48a7-99a7-36da8b9a67f3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2f86_ew9" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.410 2 DEBUG nova.storage.rbd_utils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] rbd image 8d7306df-bd40-48a7-99a7-36da8b9a67f3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.414 2 DEBUG oslo_concurrency.processutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8d7306df-bd40-48a7-99a7-36da8b9a67f3/disk.config 8d7306df-bd40-48a7-99a7-36da8b9a67f3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.526 2 DEBUG nova.network.neutron [req-54b8ee11-32d6-44e9-883e-4014a5188245 req-04be913f-3ea3-42b8-8e8b-7293567aab81 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Updated VIF entry in instance network info cache for port a8ce8a67-762d-41a0-8c12-778f66e87f3c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.527 2 DEBUG nova.network.neutron [req-54b8ee11-32d6-44e9-883e-4014a5188245 req-04be913f-3ea3-42b8-8e8b-7293567aab81 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Updating instance_info_cache with network_info: [{"id": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "address": "fa:16:3e:dd:37:f7", "network": {"id": "0392b00d-9a0f-4fdc-878a-61235e8b04c7", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-321386985-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7359a7dad3b849bfbf075b88f2a261b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ce8a67-76", "ovs_interfaceid": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.540 2 DEBUG oslo_concurrency.lockutils [req-54b8ee11-32d6-44e9-883e-4014a5188245 req-04be913f-3ea3-42b8-8e8b-7293567aab81 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.604 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.605 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.635 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.635 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.636 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.636 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.636 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.703 2 DEBUG oslo_concurrency.processutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8d7306df-bd40-48a7-99a7-36da8b9a67f3/disk.config 8d7306df-bd40-48a7-99a7-36da8b9a67f3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.290s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.704 2 INFO nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Deleting local config drive /var/lib/nova/instances/8d7306df-bd40-48a7-99a7-36da8b9a67f3/disk.config because it was imported into RBD.#033[00m
Oct  2 08:06:37 np0005465988 kernel: tapa8ce8a67-76: entered promiscuous mode
Oct  2 08:06:37 np0005465988 NetworkManager[45041]: <info>  [1759406797.7731] manager: (tapa8ce8a67-76): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Oct  2 08:06:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:37Z|00067|binding|INFO|Claiming lport a8ce8a67-762d-41a0-8c12-778f66e87f3c for this chassis.
Oct  2 08:06:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:37Z|00068|binding|INFO|a8ce8a67-762d-41a0-8c12-778f66e87f3c: Claiming fa:16:3e:dd:37:f7 10.100.0.4
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.790 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:37:f7 10.100.0.4'], port_security=['fa:16:3e:dd:37:f7 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8d7306df-bd40-48a7-99a7-36da8b9a67f3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0392b00d-9a0f-4fdc-878a-61235e8b04c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7359a7dad3b849bfbf075b88f2a261b4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '89efda0a-e365-4ab4-b56f-2cbf8e88c8e8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=00463c6f-e0da-4800-9774-7f10cd7297fc, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=a8ce8a67-762d-41a0-8c12-778f66e87f3c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.791 142124 INFO neutron.agent.ovn.metadata.agent [-] Port a8ce8a67-762d-41a0-8c12-778f66e87f3c in datapath 0392b00d-9a0f-4fdc-878a-61235e8b04c7 bound to our chassis#033[00m
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.794 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0392b00d-9a0f-4fdc-878a-61235e8b04c7#033[00m
Oct  2 08:06:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:37.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.807 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[07b6b232-8a71-4b48-abaf-33643fa044c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.808 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0392b00d-91 in ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.811 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0392b00d-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.812 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f0e8786c-6faa-41a6-bc2f-5feccb142a65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.813 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a7824799-25d7-4da5-b6d0-084f9d27a337]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.829 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa6980f-aa98-460b-964e-64a9477e7399]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:37 np0005465988 systemd-machined[192594]: New machine qemu-13-instance-00000020.
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.846 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d2cffd66-b6aa-4a9f-8251-d28b1f52bb91]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:37Z|00069|binding|INFO|Setting lport a8ce8a67-762d-41a0-8c12-778f66e87f3c ovn-installed in OVS
Oct  2 08:06:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:37Z|00070|binding|INFO|Setting lport a8ce8a67-762d-41a0-8c12-778f66e87f3c up in Southbound
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:37 np0005465988 nova_compute[236126]: 2025-10-02 12:06:37.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:37 np0005465988 systemd[1]: Started Virtual Machine qemu-13-instance-00000020.
Oct  2 08:06:37 np0005465988 systemd-udevd[251276]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.884 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[3e39d5cc-c177-4655-9f59-b5f3ae2bff81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.890 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e9130ae7-20ec-453a-86bf-8ebea798c778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:37 np0005465988 NetworkManager[45041]: <info>  [1759406797.8922] manager: (tap0392b00d-90): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Oct  2 08:06:37 np0005465988 systemd-udevd[251278]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:06:37 np0005465988 NetworkManager[45041]: <info>  [1759406797.8935] device (tapa8ce8a67-76): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:06:37 np0005465988 NetworkManager[45041]: <info>  [1759406797.8945] device (tapa8ce8a67-76): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.920 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[b30977f0-7649-464d-8fff-e8a45ca89e0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.922 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[dbf39517-e014-4794-9fa9-3a804361b242]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:37 np0005465988 NetworkManager[45041]: <info>  [1759406797.9415] device (tap0392b00d-90): carrier: link connected
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.943 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[95b10ee2-bab8-4551-8479-d65277c0e44e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.957 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5a7ebfce-50ff-45b5-b836-47ed58fb0e1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0392b00d-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:2c:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485827, 'reachable_time': 34905, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251304, 'error': None, 'target': 'ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.973 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5a3a7836-f0b3-4acf-8c05-48d9fb186249]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe08:2c11'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485827, 'tstamp': 485827}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251305, 'error': None, 'target': 'ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:37.988 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b495d7-0c93-4bd4-ba28-7869bd9160e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0392b00d-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:2c:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485827, 'reachable_time': 34905, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251306, 'error': None, 'target': 'ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:38.021 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3fe59bfa-1e90-4566-b7d0-594bbeff52e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:38.080 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[67144394-5090-4ff5-9898-53be08510193]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:38.083 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0392b00d-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:38.083 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:38.083 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0392b00d-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:38 np0005465988 nova_compute[236126]: 2025-10-02 12:06:38.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:38 np0005465988 kernel: tap0392b00d-90: entered promiscuous mode
Oct  2 08:06:38 np0005465988 nova_compute[236126]: 2025-10-02 12:06:38.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:38 np0005465988 NetworkManager[45041]: <info>  [1759406798.0897] manager: (tap0392b00d-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:38.095 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0392b00d-90, col_values=(('external_ids', {'iface-id': 'a266984f-a69e-4d11-8c6e-e21eb33eff29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:38 np0005465988 nova_compute[236126]: 2025-10-02 12:06:38.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:38Z|00071|binding|INFO|Releasing lport a266984f-a69e-4d11-8c6e-e21eb33eff29 from this chassis (sb_readonly=0)
Oct  2 08:06:38 np0005465988 nova_compute[236126]: 2025-10-02 12:06:38.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:38 np0005465988 nova_compute[236126]: 2025-10-02 12:06:38.109 2 DEBUG nova.compute.manager [req-e987f7f9-b3ca-4f9f-8c3d-de9a67643200 req-2bada02b-fb97-4a55-b823-8c117b50b71e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Received event network-vif-plugged-a8ce8a67-762d-41a0-8c12-778f66e87f3c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:38 np0005465988 nova_compute[236126]: 2025-10-02 12:06:38.110 2 DEBUG oslo_concurrency.lockutils [req-e987f7f9-b3ca-4f9f-8c3d-de9a67643200 req-2bada02b-fb97-4a55-b823-8c117b50b71e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:38 np0005465988 nova_compute[236126]: 2025-10-02 12:06:38.110 2 DEBUG oslo_concurrency.lockutils [req-e987f7f9-b3ca-4f9f-8c3d-de9a67643200 req-2bada02b-fb97-4a55-b823-8c117b50b71e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:38 np0005465988 nova_compute[236126]: 2025-10-02 12:06:38.110 2 DEBUG oslo_concurrency.lockutils [req-e987f7f9-b3ca-4f9f-8c3d-de9a67643200 req-2bada02b-fb97-4a55-b823-8c117b50b71e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:38 np0005465988 nova_compute[236126]: 2025-10-02 12:06:38.110 2 DEBUG nova.compute.manager [req-e987f7f9-b3ca-4f9f-8c3d-de9a67643200 req-2bada02b-fb97-4a55-b823-8c117b50b71e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Processing event network-vif-plugged-a8ce8a67-762d-41a0-8c12-778f66e87f3c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:06:38 np0005465988 nova_compute[236126]: 2025-10-02 12:06:38.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:38.114 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0392b00d-9a0f-4fdc-878a-61235e8b04c7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0392b00d-9a0f-4fdc-878a-61235e8b04c7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:38.115 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cb7f487d-4438-4a8f-a470-279ca39a854a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:38.115 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-0392b00d-9a0f-4fdc-878a-61235e8b04c7
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/0392b00d-9a0f-4fdc-878a-61235e8b04c7.pid.haproxy
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 0392b00d-9a0f-4fdc-878a-61235e8b04c7
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:06:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:38.116 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7', 'env', 'PROCESS_TAG=haproxy-0392b00d-9a0f-4fdc-878a-61235e8b04c7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0392b00d-9a0f-4fdc-878a-61235e8b04c7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:06:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:38.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:38 np0005465988 podman[251338]: 2025-10-02 12:06:38.558034526 +0000 UTC m=+0.059240945 container create 08c658b116138c8e14d05d34d6aa659580f39964363161283a7bc150da6c70af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:06:38 np0005465988 systemd[1]: Started libpod-conmon-08c658b116138c8e14d05d34d6aa659580f39964363161283a7bc150da6c70af.scope.
Oct  2 08:06:38 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:06:38 np0005465988 podman[251338]: 2025-10-02 12:06:38.526474578 +0000 UTC m=+0.027681017 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:06:38 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ec215ae210229590b3e630485aa9907e988503a121a123c0585aeec602ac157/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:06:38 np0005465988 podman[251338]: 2025-10-02 12:06:38.648297884 +0000 UTC m=+0.149504393 container init 08c658b116138c8e14d05d34d6aa659580f39964363161283a7bc150da6c70af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:06:38 np0005465988 podman[251338]: 2025-10-02 12:06:38.653912227 +0000 UTC m=+0.155118676 container start 08c658b116138c8e14d05d34d6aa659580f39964363161283a7bc150da6c70af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  2 08:06:38 np0005465988 neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7[251389]: [NOTICE]   (251398) : New worker (251401) forked
Oct  2 08:06:38 np0005465988 neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7[251389]: [NOTICE]   (251398) : Loading success.
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.121 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406799.1203413, 8d7306df-bd40-48a7-99a7-36da8b9a67f3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.122 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] VM Started (Lifecycle Event)#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.124 2 DEBUG nova.compute.manager [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.129 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.136 2 INFO nova.virt.libvirt.driver [-] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Instance spawned successfully.#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.137 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.142 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.147 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.161 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.162 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.162 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.163 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.164 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.164 2 DEBUG nova.virt.libvirt.driver [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.200 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.200 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406799.1219075, 8d7306df-bd40-48a7-99a7-36da8b9a67f3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.201 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.234 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.239 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406799.127772, 8d7306df-bd40-48a7-99a7-36da8b9a67f3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.239 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.280 2 INFO nova.compute.manager [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Took 8.61 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.280 2 DEBUG nova.compute.manager [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.290 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.294 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.357 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.375 2 INFO nova.compute.manager [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Took 9.72 seconds to build instance.#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.393 2 DEBUG oslo_concurrency.lockutils [None req-5837fa5c-9193-4a05-88a1-330d629af01a ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:39 np0005465988 nova_compute[236126]: 2025-10-02 12:06:39.500 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:39.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:40.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:40 np0005465988 nova_compute[236126]: 2025-10-02 12:06:40.188 2 DEBUG nova.compute.manager [req-c6ee6730-95a3-46a7-ad4a-3a37724ce142 req-5e926c8f-9ca3-4a70-836e-c9998ecba789 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Received event network-vif-plugged-a8ce8a67-762d-41a0-8c12-778f66e87f3c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:40 np0005465988 nova_compute[236126]: 2025-10-02 12:06:40.188 2 DEBUG oslo_concurrency.lockutils [req-c6ee6730-95a3-46a7-ad4a-3a37724ce142 req-5e926c8f-9ca3-4a70-836e-c9998ecba789 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:40 np0005465988 nova_compute[236126]: 2025-10-02 12:06:40.189 2 DEBUG oslo_concurrency.lockutils [req-c6ee6730-95a3-46a7-ad4a-3a37724ce142 req-5e926c8f-9ca3-4a70-836e-c9998ecba789 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:40 np0005465988 nova_compute[236126]: 2025-10-02 12:06:40.189 2 DEBUG oslo_concurrency.lockutils [req-c6ee6730-95a3-46a7-ad4a-3a37724ce142 req-5e926c8f-9ca3-4a70-836e-c9998ecba789 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:40 np0005465988 nova_compute[236126]: 2025-10-02 12:06:40.190 2 DEBUG nova.compute.manager [req-c6ee6730-95a3-46a7-ad4a-3a37724ce142 req-5e926c8f-9ca3-4a70-836e-c9998ecba789 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] No waiting events found dispatching network-vif-plugged-a8ce8a67-762d-41a0-8c12-778f66e87f3c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:06:40 np0005465988 nova_compute[236126]: 2025-10-02 12:06:40.190 2 WARNING nova.compute.manager [req-c6ee6730-95a3-46a7-ad4a-3a37724ce142 req-5e926c8f-9ca3-4a70-836e-c9998ecba789 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Received unexpected event network-vif-plugged-a8ce8a67-762d-41a0-8c12-778f66e87f3c for instance with vm_state active and task_state None.#033[00m
Oct  2 08:06:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:06:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:06:41 np0005465988 podman[251438]: 2025-10-02 12:06:41.113764678 +0000 UTC m=+0.060519033 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  2 08:06:41 np0005465988 podman[251437]: 2025-10-02 12:06:41.122219154 +0000 UTC m=+0.071922885 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=iscsid, container_name=iscsid)
Oct  2 08:06:41 np0005465988 podman[251436]: 2025-10-02 12:06:41.148319823 +0000 UTC m=+0.101170995 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:41 np0005465988 NetworkManager[45041]: <info>  [1759406801.2318] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/45)
Oct  2 08:06:41 np0005465988 NetworkManager[45041]: <info>  [1759406801.2324] device (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 08:06:41 np0005465988 NetworkManager[45041]: <info>  [1759406801.2332] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/46)
Oct  2 08:06:41 np0005465988 NetworkManager[45041]: <info>  [1759406801.2334] device (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 08:06:41 np0005465988 NetworkManager[45041]: <info>  [1759406801.2342] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Oct  2 08:06:41 np0005465988 NetworkManager[45041]: <info>  [1759406801.2347] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Oct  2 08:06:41 np0005465988 NetworkManager[45041]: <info>  [1759406801.2351] device (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  2 08:06:41 np0005465988 NetworkManager[45041]: <info>  [1759406801.2354] device (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:41 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:41Z|00072|binding|INFO|Releasing lport 3e3f512e-f85f-4c9c-b91d-072c570470c1 from this chassis (sb_readonly=0)
Oct  2 08:06:41 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:41Z|00073|binding|INFO|Releasing lport a266984f-a69e-4d11-8c6e-e21eb33eff29 from this chassis (sb_readonly=0)
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.374 2 DEBUG oslo_concurrency.lockutils [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "17426d60-57ac-41a5-9ae2-688821fe7f56" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.374 2 DEBUG oslo_concurrency.lockutils [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.375 2 DEBUG oslo_concurrency.lockutils [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.375 2 DEBUG oslo_concurrency.lockutils [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.375 2 DEBUG oslo_concurrency.lockutils [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.376 2 INFO nova.compute.manager [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Terminating instance#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.377 2 DEBUG nova.compute.manager [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:41 np0005465988 kernel: tapf98d3352-ae (unregistering): left promiscuous mode
Oct  2 08:06:41 np0005465988 NetworkManager[45041]: <info>  [1759406801.4276] device (tapf98d3352-ae): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:41 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:41Z|00074|binding|INFO|Releasing lport f98d3352-aeeb-4929-9920-2a306cb9558d from this chassis (sb_readonly=0)
Oct  2 08:06:41 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:41Z|00075|binding|INFO|Setting lport f98d3352-aeeb-4929-9920-2a306cb9558d down in Southbound
Oct  2 08:06:41 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:41Z|00076|binding|INFO|Removing iface tapf98d3352-ae ovn-installed in OVS
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:41.449 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:cf:4c 10.100.0.3'], port_security=['fa:16:3e:69:cf:4c 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '17426d60-57ac-41a5-9ae2-688821fe7f56', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80106802-d877-42c6-b2a9-50b050f6b08f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9afa78cc4dec419babdf61fd31f46e28', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8fbb5420-10f4-405b-bd01-713020f7e518', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03aa6f10-2374-4fa3-bc90-1fcb8815afb8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f98d3352-aeeb-4929-9920-2a306cb9558d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:41.451 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f98d3352-aeeb-4929-9920-2a306cb9558d in datapath 80106802-d877-42c6-b2a9-50b050f6b08f unbound from our chassis#033[00m
Oct  2 08:06:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:41.454 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 80106802-d877-42c6-b2a9-50b050f6b08f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:06:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:41.455 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[aed4d118-104e-4a39-aa9c-b07140cf7255]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:41.456 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f namespace which is not needed anymore#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:41 np0005465988 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000018.scope: Deactivated successfully.
Oct  2 08:06:41 np0005465988 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000018.scope: Consumed 19.122s CPU time.
Oct  2 08:06:41 np0005465988 systemd-machined[192594]: Machine qemu-10-instance-00000018 terminated.
Oct  2 08:06:41 np0005465988 neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f[247449]: [NOTICE]   (247453) : haproxy version is 2.8.14-c23fe91
Oct  2 08:06:41 np0005465988 neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f[247449]: [NOTICE]   (247453) : path to executable is /usr/sbin/haproxy
Oct  2 08:06:41 np0005465988 neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f[247449]: [WARNING]  (247453) : Exiting Master process...
Oct  2 08:06:41 np0005465988 neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f[247449]: [ALERT]    (247453) : Current worker (247455) exited with code 143 (Terminated)
Oct  2 08:06:41 np0005465988 neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f[247449]: [WARNING]  (247453) : All workers exited. Exiting... (0)
Oct  2 08:06:41 np0005465988 systemd[1]: libpod-ce507b3622be571e98e0ab5f292c35eac26b4de958d8f083a4f04b076136ff50.scope: Deactivated successfully.
Oct  2 08:06:41 np0005465988 podman[251550]: 2025-10-02 12:06:41.595516889 +0000 UTC m=+0.045278439 container died ce507b3622be571e98e0ab5f292c35eac26b4de958d8f083a4f04b076136ff50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.614 2 INFO nova.virt.libvirt.driver [-] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Instance destroyed successfully.#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.614 2 DEBUG nova.objects.instance [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lazy-loading 'resources' on Instance uuid 17426d60-57ac-41a5-9ae2-688821fe7f56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:41 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ce507b3622be571e98e0ab5f292c35eac26b4de958d8f083a4f04b076136ff50-userdata-shm.mount: Deactivated successfully.
Oct  2 08:06:41 np0005465988 systemd[1]: var-lib-containers-storage-overlay-ec11a9301a1e6ea967e1856d0df2d55f5e225f94538f1ecee803b3b315a7b3dd-merged.mount: Deactivated successfully.
Oct  2 08:06:41 np0005465988 podman[251550]: 2025-10-02 12:06:41.640161799 +0000 UTC m=+0.089923329 container cleanup ce507b3622be571e98e0ab5f292c35eac26b4de958d8f083a4f04b076136ff50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:06:41 np0005465988 systemd[1]: libpod-conmon-ce507b3622be571e98e0ab5f292c35eac26b4de958d8f083a4f04b076136ff50.scope: Deactivated successfully.
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.671 2 DEBUG nova.virt.libvirt.vif [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:04:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-966324223',display_name='tempest-ServersAdminTestJSON-server-966324223',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-966324223',id=24,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:04:39Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9afa78cc4dec419babdf61fd31f46e28',ramdisk_id='',reservation_id='r-ps9o5e69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-518249049',owner_user_name='tempest-ServersAdminTestJSON-518249049-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:04:39Z,user_data=None,user_id='8850add40b254d198f270d9e64c777d5',uuid=17426d60-57ac-41a5-9ae2-688821fe7f56,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f98d3352-aeeb-4929-9920-2a306cb9558d", "address": "fa:16:3e:69:cf:4c", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf98d3352-ae", "ovs_interfaceid": "f98d3352-aeeb-4929-9920-2a306cb9558d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.671 2 DEBUG nova.network.os_vif_util [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Converting VIF {"id": "f98d3352-aeeb-4929-9920-2a306cb9558d", "address": "fa:16:3e:69:cf:4c", "network": {"id": "80106802-d877-42c6-b2a9-50b050f6b08f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-79358917-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9afa78cc4dec419babdf61fd31f46e28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf98d3352-ae", "ovs_interfaceid": "f98d3352-aeeb-4929-9920-2a306cb9558d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.672 2 DEBUG nova.network.os_vif_util [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:69:cf:4c,bridge_name='br-int',has_traffic_filtering=True,id=f98d3352-aeeb-4929-9920-2a306cb9558d,network=Network(80106802-d877-42c6-b2a9-50b050f6b08f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf98d3352-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.672 2 DEBUG os_vif [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:cf:4c,bridge_name='br-int',has_traffic_filtering=True,id=f98d3352-aeeb-4929-9920-2a306cb9558d,network=Network(80106802-d877-42c6-b2a9-50b050f6b08f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf98d3352-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.675 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf98d3352-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.680 2 INFO os_vif [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:cf:4c,bridge_name='br-int',has_traffic_filtering=True,id=f98d3352-aeeb-4929-9920-2a306cb9558d,network=Network(80106802-d877-42c6-b2a9-50b050f6b08f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf98d3352-ae')#033[00m
Oct  2 08:06:41 np0005465988 podman[251591]: 2025-10-02 12:06:41.722786554 +0000 UTC m=+0.057730242 container remove ce507b3622be571e98e0ab5f292c35eac26b4de958d8f083a4f04b076136ff50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:06:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:41.728 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[02e11f06-82f4-4587-82b9-e3057ea81246]: (4, ('Thu Oct  2 12:06:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f (ce507b3622be571e98e0ab5f292c35eac26b4de958d8f083a4f04b076136ff50)\nce507b3622be571e98e0ab5f292c35eac26b4de958d8f083a4f04b076136ff50\nThu Oct  2 12:06:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f (ce507b3622be571e98e0ab5f292c35eac26b4de958d8f083a4f04b076136ff50)\nce507b3622be571e98e0ab5f292c35eac26b4de958d8f083a4f04b076136ff50\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:41.729 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[99d2d916-c109-46e3-9c3e-5475ee16f0e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:41.730 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80106802-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:41 np0005465988 kernel: tap80106802-d0: left promiscuous mode
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:41.755 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e78eb24a-7ad8-4df1-90c1-1fa6682defbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:41.781 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c37933-5992-43c6-bc79-492521512469]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:41.782 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[99abcfb7-9e45-4f69-b74b-630fbe4aad8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:41.797 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2851b70d-1805-4e93-be77-544cc0507008]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473832, 'reachable_time': 35566, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251624, 'error': None, 'target': 'ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:41 np0005465988 systemd[1]: run-netns-ovnmeta\x2d80106802\x2dd877\x2d42c6\x2db2a9\x2d50b050f6b08f.mount: Deactivated successfully.
Oct  2 08:06:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:41.802 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-80106802-d877-42c6-b2a9-50b050f6b08f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:06:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:41.802 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[c41097af-3785-491c-a732-7a0dc885b43e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:41.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:41 np0005465988 nova_compute[236126]: 2025-10-02 12:06:41.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:42.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:06:42 np0005465988 nova_compute[236126]: 2025-10-02 12:06:42.764 2 DEBUG nova.compute.manager [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Received event network-vif-unplugged-f98d3352-aeeb-4929-9920-2a306cb9558d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:42 np0005465988 nova_compute[236126]: 2025-10-02 12:06:42.764 2 DEBUG oslo_concurrency.lockutils [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:42 np0005465988 nova_compute[236126]: 2025-10-02 12:06:42.764 2 DEBUG oslo_concurrency.lockutils [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:42 np0005465988 nova_compute[236126]: 2025-10-02 12:06:42.765 2 DEBUG oslo_concurrency.lockutils [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:42 np0005465988 nova_compute[236126]: 2025-10-02 12:06:42.765 2 DEBUG nova.compute.manager [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] No waiting events found dispatching network-vif-unplugged-f98d3352-aeeb-4929-9920-2a306cb9558d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:06:42 np0005465988 nova_compute[236126]: 2025-10-02 12:06:42.765 2 DEBUG nova.compute.manager [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Received event network-vif-unplugged-f98d3352-aeeb-4929-9920-2a306cb9558d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:06:42 np0005465988 nova_compute[236126]: 2025-10-02 12:06:42.765 2 DEBUG nova.compute.manager [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Received event network-changed-a8ce8a67-762d-41a0-8c12-778f66e87f3c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:42 np0005465988 nova_compute[236126]: 2025-10-02 12:06:42.766 2 DEBUG nova.compute.manager [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Refreshing instance network info cache due to event network-changed-a8ce8a67-762d-41a0-8c12-778f66e87f3c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:06:42 np0005465988 nova_compute[236126]: 2025-10-02 12:06:42.766 2 DEBUG oslo_concurrency.lockutils [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:42 np0005465988 nova_compute[236126]: 2025-10-02 12:06:42.766 2 DEBUG oslo_concurrency.lockutils [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:42 np0005465988 nova_compute[236126]: 2025-10-02 12:06:42.766 2 DEBUG nova.network.neutron [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Refreshing network info cache for port a8ce8a67-762d-41a0-8c12-778f66e87f3c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:06:43 np0005465988 nova_compute[236126]: 2025-10-02 12:06:43.400 2 INFO nova.virt.libvirt.driver [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Deleting instance files /var/lib/nova/instances/17426d60-57ac-41a5-9ae2-688821fe7f56_del#033[00m
Oct  2 08:06:43 np0005465988 nova_compute[236126]: 2025-10-02 12:06:43.400 2 INFO nova.virt.libvirt.driver [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Deletion of /var/lib/nova/instances/17426d60-57ac-41a5-9ae2-688821fe7f56_del complete#033[00m
Oct  2 08:06:43 np0005465988 nova_compute[236126]: 2025-10-02 12:06:43.525 2 INFO nova.compute.manager [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Took 2.15 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:06:43 np0005465988 nova_compute[236126]: 2025-10-02 12:06:43.526 2 DEBUG oslo.service.loopingcall [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:06:43 np0005465988 nova_compute[236126]: 2025-10-02 12:06:43.526 2 DEBUG nova.compute.manager [-] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:06:43 np0005465988 nova_compute[236126]: 2025-10-02 12:06:43.527 2 DEBUG nova.network.neutron [-] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:06:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:43.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:44.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.200 2 DEBUG nova.network.neutron [-] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.282 2 DEBUG nova.compute.manager [req-3f8dc6b4-0b1f-411f-8403-42c4da07301d req-cb0c874e-0e3b-4942-a794-c57662f4d784 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Received event network-vif-deleted-f98d3352-aeeb-4929-9920-2a306cb9558d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.283 2 INFO nova.compute.manager [req-3f8dc6b4-0b1f-411f-8403-42c4da07301d req-cb0c874e-0e3b-4942-a794-c57662f4d784 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Neutron deleted interface f98d3352-aeeb-4929-9920-2a306cb9558d; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.283 2 DEBUG nova.network.neutron [req-3f8dc6b4-0b1f-411f-8403-42c4da07301d req-cb0c874e-0e3b-4942-a794-c57662f4d784 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.285 2 INFO nova.compute.manager [-] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Took 0.76 seconds to deallocate network for instance.#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.320 2 DEBUG nova.compute.manager [req-3f8dc6b4-0b1f-411f-8403-42c4da07301d req-cb0c874e-0e3b-4942-a794-c57662f4d784 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Detach interface failed, port_id=f98d3352-aeeb-4929-9920-2a306cb9558d, reason: Instance 17426d60-57ac-41a5-9ae2-688821fe7f56 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.361 2 DEBUG oslo_concurrency.lockutils [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.362 2 DEBUG oslo_concurrency.lockutils [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.439 2 DEBUG oslo_concurrency.processutils [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.574 2 DEBUG nova.network.neutron [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Updated VIF entry in instance network info cache for port a8ce8a67-762d-41a0-8c12-778f66e87f3c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.588 2 DEBUG nova.network.neutron [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Updating instance_info_cache with network_info: [{"id": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "address": "fa:16:3e:dd:37:f7", "network": {"id": "0392b00d-9a0f-4fdc-878a-61235e8b04c7", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-321386985-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7359a7dad3b849bfbf075b88f2a261b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ce8a67-76", "ovs_interfaceid": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.712 2 DEBUG oslo_concurrency.lockutils [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.713 2 DEBUG nova.compute.manager [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Received event network-vif-plugged-f98d3352-aeeb-4929-9920-2a306cb9558d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.713 2 DEBUG oslo_concurrency.lockutils [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.714 2 DEBUG oslo_concurrency.lockutils [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.714 2 DEBUG oslo_concurrency.lockutils [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.714 2 DEBUG nova.compute.manager [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] No waiting events found dispatching network-vif-plugged-f98d3352-aeeb-4929-9920-2a306cb9558d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.715 2 WARNING nova.compute.manager [req-91aca6d6-86dd-4e06-959d-9bc0e2bf51bf req-04c18ba8-984b-4fbc-b572-aae4771cac09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Received unexpected event network-vif-plugged-f98d3352-aeeb-4929-9920-2a306cb9558d for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:06:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:06:44 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2415899086' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.906 2 DEBUG oslo_concurrency.processutils [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.912 2 DEBUG nova.compute.provider_tree [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:44 np0005465988 nova_compute[236126]: 2025-10-02 12:06:44.974 2 DEBUG nova.scheduler.client.report [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:45 np0005465988 nova_compute[236126]: 2025-10-02 12:06:45.245 2 DEBUG oslo_concurrency.lockutils [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.883s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:45 np0005465988 nova_compute[236126]: 2025-10-02 12:06:45.379 2 INFO nova.scheduler.client.report [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Deleted allocations for instance 17426d60-57ac-41a5-9ae2-688821fe7f56#033[00m
Oct  2 08:06:45 np0005465988 nova_compute[236126]: 2025-10-02 12:06:45.552 2 DEBUG oslo_concurrency.lockutils [None req-334953c1-0fd5-46d8-91bb-005d3e5b2ae4 8850add40b254d198f270d9e64c777d5 9afa78cc4dec419babdf61fd31f46e28 - - default default] Lock "17426d60-57ac-41a5-9ae2-688821fe7f56" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:06:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:45.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:06:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:46.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:46 np0005465988 nova_compute[236126]: 2025-10-02 12:06:46.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:46 np0005465988 nova_compute[236126]: 2025-10-02 12:06:46.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:06:47 np0005465988 nova_compute[236126]: 2025-10-02 12:06:47.258 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406792.2573912, e25deec6-82e8-43a2-b508-c3b3fa2d4f4e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:47 np0005465988 nova_compute[236126]: 2025-10-02 12:06:47.259 2 INFO nova.compute.manager [-] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:06:47 np0005465988 nova_compute[236126]: 2025-10-02 12:06:47.311 2 DEBUG nova.compute.manager [None req-13d2b1a1-b1a9-4828-a3ff-cf8ae97a2563 - - - - - -] [instance: e25deec6-82e8-43a2-b508-c3b3fa2d4f4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:47.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:48.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:48.989 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:48 np0005465988 nova_compute[236126]: 2025-10-02 12:06:48.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:48.989 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:06:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:49.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:50.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:51 np0005465988 nova_compute[236126]: 2025-10-02 12:06:51.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:51.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:51 np0005465988 nova_compute[236126]: 2025-10-02 12:06:51.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:52.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:06:53 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:53Z|00077|binding|INFO|Releasing lport a266984f-a69e-4d11-8c6e-e21eb33eff29 from this chassis (sb_readonly=0)
Oct  2 08:06:53 np0005465988 nova_compute[236126]: 2025-10-02 12:06:53.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:53.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:06:53.992 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:54.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:54Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:dd:37:f7 10.100.0.4
Oct  2 08:06:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:54Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:37:f7 10.100.0.4
Oct  2 08:06:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:55.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:56.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:56 np0005465988 podman[251706]: 2025-10-02 12:06:56.531803527 +0000 UTC m=+0.059818753 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent)
Oct  2 08:06:56 np0005465988 nova_compute[236126]: 2025-10-02 12:06:56.607 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406801.605941, 17426d60-57ac-41a5-9ae2-688821fe7f56 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:56 np0005465988 nova_compute[236126]: 2025-10-02 12:06:56.607 2 INFO nova.compute.manager [-] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:06:56 np0005465988 nova_compute[236126]: 2025-10-02 12:06:56.632 2 DEBUG nova.compute.manager [None req-2cd669c3-9a20-4e0a-b269-aa4a141c2f53 - - - - - -] [instance: 17426d60-57ac-41a5-9ae2-688821fe7f56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:56 np0005465988 nova_compute[236126]: 2025-10-02 12:06:56.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:56 np0005465988 nova_compute[236126]: 2025-10-02 12:06:56.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:06:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:06:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:57.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:06:57 np0005465988 ovn_controller[132601]: 2025-10-02T12:06:57Z|00078|binding|INFO|Releasing lport a266984f-a69e-4d11-8c6e-e21eb33eff29 from this chassis (sb_readonly=0)
Oct  2 08:06:57 np0005465988 nova_compute[236126]: 2025-10-02 12:06:57.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:06:58.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:06:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:06:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:06:59.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:06:59 np0005465988 nova_compute[236126]: 2025-10-02 12:06:59.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:00.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:01 np0005465988 nova_compute[236126]: 2025-10-02 12:07:01.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:01.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:01 np0005465988 nova_compute[236126]: 2025-10-02 12:07:01.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:07:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:02.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:03.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:04 np0005465988 nova_compute[236126]: 2025-10-02 12:07:04.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:04.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:05.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:06.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:06 np0005465988 nova_compute[236126]: 2025-10-02 12:07:06.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:06 np0005465988 nova_compute[236126]: 2025-10-02 12:07:06.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:07:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:07.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:08.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:09 np0005465988 nova_compute[236126]: 2025-10-02 12:07:09.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:09.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:10.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:11 np0005465988 podman[251736]: 2025-10-02 12:07:11.554714505 +0000 UTC m=+0.072458361 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:07:11 np0005465988 podman[251735]: 2025-10-02 12:07:11.555884819 +0000 UTC m=+0.079743953 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:07:11 np0005465988 podman[251734]: 2025-10-02 12:07:11.594908805 +0000 UTC m=+0.117986616 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, container_name=ovn_controller)
Oct  2 08:07:11 np0005465988 nova_compute[236126]: 2025-10-02 12:07:11.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:11.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:11 np0005465988 nova_compute[236126]: 2025-10-02 12:07:11.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:07:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:12.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:13 np0005465988 nova_compute[236126]: 2025-10-02 12:07:13.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:13.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:14.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:15 np0005465988 nova_compute[236126]: 2025-10-02 12:07:15.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:15.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:16.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:16 np0005465988 nova_compute[236126]: 2025-10-02 12:07:16.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 e162: 3 total, 3 up, 3 in
Oct  2 08:07:16 np0005465988 nova_compute[236126]: 2025-10-02 12:07:16.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:07:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:17.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:18 np0005465988 nova_compute[236126]: 2025-10-02 12:07:18.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:18.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:19.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:20.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:21 np0005465988 nova_compute[236126]: 2025-10-02 12:07:21.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:21.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:21 np0005465988 nova_compute[236126]: 2025-10-02 12:07:21.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:07:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:22.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:23.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.079 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Acquiring lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.079 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.119 2 DEBUG nova.compute.manager [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.202 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.203 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.212 2 DEBUG nova.virt.hardware [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.212 2 INFO nova.compute.claims [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:07:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:24.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.394 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:07:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2637674219' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.841 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.848 2 DEBUG nova.compute.provider_tree [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.871 2 DEBUG nova.scheduler.client.report [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.902 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.904 2 DEBUG nova.compute.manager [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.971 2 DEBUG nova.compute.manager [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.971 2 DEBUG nova.network.neutron [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:07:24 np0005465988 nova_compute[236126]: 2025-10-02 12:07:24.997 2 INFO nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Ignoring supplied device name: /dev/sda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:07:25 np0005465988 nova_compute[236126]: 2025-10-02 12:07:25.028 2 DEBUG nova.compute.manager [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:07:25 np0005465988 nova_compute[236126]: 2025-10-02 12:07:25.140 2 DEBUG nova.compute.manager [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:07:25 np0005465988 nova_compute[236126]: 2025-10-02 12:07:25.142 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:07:25 np0005465988 nova_compute[236126]: 2025-10-02 12:07:25.143 2 INFO nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Creating image(s)#033[00m
Oct  2 08:07:25 np0005465988 nova_compute[236126]: 2025-10-02 12:07:25.172 2 DEBUG nova.storage.rbd_utils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] rbd image 1442e48b-6b8f-4c96-b9b7-909071c8ebf2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:25 np0005465988 nova_compute[236126]: 2025-10-02 12:07:25.197 2 DEBUG nova.storage.rbd_utils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] rbd image 1442e48b-6b8f-4c96-b9b7-909071c8ebf2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:25 np0005465988 nova_compute[236126]: 2025-10-02 12:07:25.223 2 DEBUG nova.storage.rbd_utils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] rbd image 1442e48b-6b8f-4c96-b9b7-909071c8ebf2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:25 np0005465988 nova_compute[236126]: 2025-10-02 12:07:25.227 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Acquiring lock "0b6beffbf0661cbda2e4327409b903fc2160c26f" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:25 np0005465988 nova_compute[236126]: 2025-10-02 12:07:25.228 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "0b6beffbf0661cbda2e4327409b903fc2160c26f" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:25 np0005465988 nova_compute[236126]: 2025-10-02 12:07:25.262 2 DEBUG nova.policy [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4666fdebee9947109da966b5c870b34e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '861ae6a71574411fbcdab09902e6bcc4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:07:25 np0005465988 nova_compute[236126]: 2025-10-02 12:07:25.660 2 DEBUG nova.virt.libvirt.imagebackend [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Image locations are: [{'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/9ba2a4cf-08bb-442a-b063-4fb551df3759/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/9ba2a4cf-08bb-442a-b063-4fb551df3759/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Oct  2 08:07:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:25.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:26.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:26.355 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:26.356 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:07:26 np0005465988 nova_compute[236126]: 2025-10-02 12:07:26.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:26 np0005465988 nova_compute[236126]: 2025-10-02 12:07:26.488 2 DEBUG nova.network.neutron [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Successfully created port: 54af1da4-2337-4e36-8e6e-2c36ccf43309 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:07:26 np0005465988 nova_compute[236126]: 2025-10-02 12:07:26.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:26 np0005465988 nova_compute[236126]: 2025-10-02 12:07:26.828 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:26 np0005465988 nova_compute[236126]: 2025-10-02 12:07:26.895 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f.part --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:26 np0005465988 nova_compute[236126]: 2025-10-02 12:07:26.896 2 DEBUG nova.virt.images [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] 9ba2a4cf-08bb-442a-b063-4fb551df3759 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  2 08:07:26 np0005465988 nova_compute[236126]: 2025-10-02 12:07:26.898 2 DEBUG nova.privsep.utils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:07:26 np0005465988 nova_compute[236126]: 2025-10-02 12:07:26.898 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f.part /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:26 np0005465988 nova_compute[236126]: 2025-10-02 12:07:26.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.068 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f.part /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f.converted" returned: 0 in 0.170s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.072 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.137 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f.converted --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.138 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "0b6beffbf0661cbda2e4327409b903fc2160c26f" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.910s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.164 2 DEBUG nova.storage.rbd_utils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] rbd image 1442e48b-6b8f-4c96-b9b7-909071c8ebf2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.169 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f 1442e48b-6b8f-4c96-b9b7-909071c8ebf2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:07:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:27.335 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:27.336 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:27.336 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:27 np0005465988 podman[251983]: 2025-10-02 12:07:27.528339365 +0000 UTC m=+0.066743244 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent)
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.599 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f 1442e48b-6b8f-4c96-b9b7-909071c8ebf2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.674 2 DEBUG nova.storage.rbd_utils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] resizing rbd image 1442e48b-6b8f-4c96-b9b7-909071c8ebf2_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.783 2 DEBUG nova.network.neutron [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Successfully updated port: 54af1da4-2337-4e36-8e6e-2c36ccf43309 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.798 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Acquiring lock "refresh_cache-1442e48b-6b8f-4c96-b9b7-909071c8ebf2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.799 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Acquired lock "refresh_cache-1442e48b-6b8f-4c96-b9b7-909071c8ebf2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.799 2 DEBUG nova.network.neutron [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:07:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:27.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.935 2 DEBUG nova.objects.instance [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lazy-loading 'migration_context' on Instance uuid 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.953 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.954 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Ensure instance console log exists: /var/lib/nova/instances/1442e48b-6b8f-4c96-b9b7-909071c8ebf2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.954 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.954 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.954 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.961 2 DEBUG nova.compute.manager [req-a29f4a36-8169-4d6e-a178-e9b12790506a req-525d258b-5c28-41f2-9042-73caf50d19ce d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Received event network-changed-54af1da4-2337-4e36-8e6e-2c36ccf43309 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.961 2 DEBUG nova.compute.manager [req-a29f4a36-8169-4d6e-a178-e9b12790506a req-525d258b-5c28-41f2-9042-73caf50d19ce d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Refreshing instance network info cache due to event network-changed-54af1da4-2337-4e36-8e6e-2c36ccf43309. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:07:27 np0005465988 nova_compute[236126]: 2025-10-02 12:07:27.961 2 DEBUG oslo_concurrency.lockutils [req-a29f4a36-8169-4d6e-a178-e9b12790506a req-525d258b-5c28-41f2-9042-73caf50d19ce d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-1442e48b-6b8f-4c96-b9b7-909071c8ebf2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:07:28 np0005465988 nova_compute[236126]: 2025-10-02 12:07:28.043 2 DEBUG nova.network.neutron [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:07:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:28.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.086 2 DEBUG nova.network.neutron [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Updating instance_info_cache with network_info: [{"id": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "address": "fa:16:3e:e2:7c:9f", "network": {"id": "92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1460458376-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "861ae6a71574411fbcdab09902e6bcc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54af1da4-23", "ovs_interfaceid": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.134 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Releasing lock "refresh_cache-1442e48b-6b8f-4c96-b9b7-909071c8ebf2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.135 2 DEBUG nova.compute.manager [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Instance network_info: |[{"id": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "address": "fa:16:3e:e2:7c:9f", "network": {"id": "92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1460458376-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "861ae6a71574411fbcdab09902e6bcc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54af1da4-23", "ovs_interfaceid": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.135 2 DEBUG oslo_concurrency.lockutils [req-a29f4a36-8169-4d6e-a178-e9b12790506a req-525d258b-5c28-41f2-9042-73caf50d19ce d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-1442e48b-6b8f-4c96-b9b7-909071c8ebf2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.136 2 DEBUG nova.network.neutron [req-a29f4a36-8169-4d6e-a178-e9b12790506a req-525d258b-5c28-41f2-9042-73caf50d19ce d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Refreshing network info cache for port 54af1da4-2337-4e36-8e6e-2c36ccf43309 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.139 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Start _get_guest_xml network_info=[{"id": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "address": "fa:16:3e:e2:7c:9f", "network": {"id": "92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1460458376-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "861ae6a71574411fbcdab09902e6bcc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54af1da4-23", "ovs_interfaceid": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'scsi', 'cdrom_bus': 'scsi', 'mapping': {'root': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'scsi', 'dev': 'sdb', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T12:07:14Z,direct_url=<?>,disk_format='qcow2',id=9ba2a4cf-08bb-442a-b063-4fb551df3759,min_disk=0,min_ram=0,name='',owner='d650efcc52d9448da536370064e1794d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T12:07:16Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/sda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'scsi', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/sda', 'size': 0, 'image_id': '9ba2a4cf-08bb-442a-b063-4fb551df3759'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.146 2 WARNING nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.151 2 DEBUG nova.virt.libvirt.host [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.152 2 DEBUG nova.virt.libvirt.host [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.159 2 DEBUG nova.virt.libvirt.host [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.159 2 DEBUG nova.virt.libvirt.host [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.160 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.161 2 DEBUG nova.virt.hardware [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T12:07:14Z,direct_url=<?>,disk_format='qcow2',id=9ba2a4cf-08bb-442a-b063-4fb551df3759,min_disk=0,min_ram=0,name='',owner='d650efcc52d9448da536370064e1794d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T12:07:16Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.161 2 DEBUG nova.virt.hardware [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.162 2 DEBUG nova.virt.hardware [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.162 2 DEBUG nova.virt.hardware [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.162 2 DEBUG nova.virt.hardware [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.162 2 DEBUG nova.virt.hardware [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.162 2 DEBUG nova.virt.hardware [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.163 2 DEBUG nova.virt.hardware [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.163 2 DEBUG nova.virt.hardware [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.163 2 DEBUG nova.virt.hardware [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.163 2 DEBUG nova.virt.hardware [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.167 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:29.359 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:07:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/800830383' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.637 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.678 2 DEBUG nova.storage.rbd_utils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] rbd image 1442e48b-6b8f-4c96-b9b7-909071c8ebf2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:29 np0005465988 nova_compute[236126]: 2025-10-02 12:07:29.684 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:29.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:07:30 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3062239647' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.112 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.114 2 DEBUG nova.virt.libvirt.vif [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-365234862',display_name='tempest-AttachSCSIVolumeTestJSON-server-365234862',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachscsivolumetestjson-server-365234862',id=36,image_ref='9ba2a4cf-08bb-442a-b063-4fb551df3759',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJOJlCS90Za5LbWSHMPgP+Ek8S8WBhE0f2SR8Hk2ruxpy1jfgWDX1N1DF8IoYVNVmIZKbxHjppgg2BMM4LAC0bu2V87e4yQfdKSbWwxyQEnyLjV7j1Lk4l2C7rrj3wvBCw==',key_name='tempest-keypair-519727741',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='861ae6a71574411fbcdab09902e6bcc4',ramdisk_id='',reservation_id='r-v8x0qy0m',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9ba2a4cf-08bb-442a-b063-4fb551df3759',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='q35',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-280436056',owner_user_name='tempest-AttachSCSIVolumeTestJSON-280436056-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:07:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4666fdebee9947109da966b5c870b34e',uuid=1442e48b-6b8f-4c96-b9b7-909071c8ebf2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "address": "fa:16:3e:e2:7c:9f", "network": {"id": "92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1460458376-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "861ae6a71574411fbcdab09902e6bcc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54af1da4-23", "ovs_interfaceid": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.115 2 DEBUG nova.network.os_vif_util [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Converting VIF {"id": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "address": "fa:16:3e:e2:7c:9f", "network": {"id": "92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1460458376-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "861ae6a71574411fbcdab09902e6bcc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54af1da4-23", "ovs_interfaceid": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.116 2 DEBUG nova.network.os_vif_util [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:7c:9f,bridge_name='br-int',has_traffic_filtering=True,id=54af1da4-2337-4e36-8e6e-2c36ccf43309,network=Network(92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54af1da4-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.117 2 DEBUG nova.objects.instance [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.138 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  <uuid>1442e48b-6b8f-4c96-b9b7-909071c8ebf2</uuid>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  <name>instance-00000024</name>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <nova:name>tempest-AttachSCSIVolumeTestJSON-server-365234862</nova:name>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:07:29</nova:creationTime>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <nova:user uuid="4666fdebee9947109da966b5c870b34e">tempest-AttachSCSIVolumeTestJSON-280436056-project-member</nova:user>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <nova:project uuid="861ae6a71574411fbcdab09902e6bcc4">tempest-AttachSCSIVolumeTestJSON-280436056</nova:project>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="9ba2a4cf-08bb-442a-b063-4fb551df3759"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <nova:port uuid="54af1da4-2337-4e36-8e6e-2c36ccf43309">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <entry name="serial">1442e48b-6b8f-4c96-b9b7-909071c8ebf2</entry>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <entry name="uuid">1442e48b-6b8f-4c96-b9b7-909071c8ebf2</entry>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/1442e48b-6b8f-4c96-b9b7-909071c8ebf2_disk">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <target dev="sda" bus="scsi"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <address type="drive" controller="0" unit="0"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/1442e48b-6b8f-4c96-b9b7-909071c8ebf2_disk.config">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <target dev="sdb" bus="scsi"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <address type="drive" controller="0" unit="1"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="scsi" index="0" model="virtio-scsi"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:e2:7c:9f"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <target dev="tap54af1da4-23"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/1442e48b-6b8f-4c96-b9b7-909071c8ebf2/console.log" append="off"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:07:30 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:07:30 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:07:30 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:07:30 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.140 2 DEBUG nova.compute.manager [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Preparing to wait for external event network-vif-plugged-54af1da4-2337-4e36-8e6e-2c36ccf43309 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.141 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Acquiring lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.142 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.142 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.143 2 DEBUG nova.virt.libvirt.vif [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-365234862',display_name='tempest-AttachSCSIVolumeTestJSON-server-365234862',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachscsivolumetestjson-server-365234862',id=36,image_ref='9ba2a4cf-08bb-442a-b063-4fb551df3759',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJOJlCS90Za5LbWSHMPgP+Ek8S8WBhE0f2SR8Hk2ruxpy1jfgWDX1N1DF8IoYVNVmIZKbxHjppgg2BMM4LAC0bu2V87e4yQfdKSbWwxyQEnyLjV7j1Lk4l2C7rrj3wvBCw==',key_name='tempest-keypair-519727741',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='861ae6a71574411fbcdab09902e6bcc4',ramdisk_id='',reservation_id='r-v8x0qy0m',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9ba2a4cf-08bb-442a-b063-4fb551df3759',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='q35',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-280436056',owner_user_name='tempest-AttachSCSIVolumeTestJSON-280436056-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:07:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4666fdebee9947109da966b5c870b34e',uuid=1442e48b-6b8f-4c96-b9b7-909071c8ebf2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "address": "fa:16:3e:e2:7c:9f", "network": {"id": "92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1460458376-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "861ae6a71574411fbcdab09902e6bcc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54af1da4-23", "ovs_interfaceid": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.144 2 DEBUG nova.network.os_vif_util [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Converting VIF {"id": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "address": "fa:16:3e:e2:7c:9f", "network": {"id": "92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1460458376-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "861ae6a71574411fbcdab09902e6bcc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54af1da4-23", "ovs_interfaceid": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.145 2 DEBUG nova.network.os_vif_util [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:7c:9f,bridge_name='br-int',has_traffic_filtering=True,id=54af1da4-2337-4e36-8e6e-2c36ccf43309,network=Network(92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54af1da4-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.145 2 DEBUG os_vif [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:7c:9f,bridge_name='br-int',has_traffic_filtering=True,id=54af1da4-2337-4e36-8e6e-2c36ccf43309,network=Network(92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54af1da4-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.147 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.147 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.152 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap54af1da4-23, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.153 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap54af1da4-23, col_values=(('external_ids', {'iface-id': '54af1da4-2337-4e36-8e6e-2c36ccf43309', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:7c:9f', 'vm-uuid': '1442e48b-6b8f-4c96-b9b7-909071c8ebf2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:30 np0005465988 NetworkManager[45041]: <info>  [1759406850.1560] manager: (tap54af1da4-23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.166 2 INFO os_vif [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:7c:9f,bridge_name='br-int',has_traffic_filtering=True,id=54af1da4-2337-4e36-8e6e-2c36ccf43309,network=Network(92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54af1da4-23')#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.228 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.230 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.230 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] No VIF found with MAC fa:16:3e:e2:7c:9f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.231 2 INFO nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Using config drive#033[00m
Oct  2 08:07:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:30.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.257 2 DEBUG nova.storage.rbd_utils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] rbd image 1442e48b-6b8f-4c96-b9b7-909071c8ebf2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.642 2 DEBUG nova.network.neutron [req-a29f4a36-8169-4d6e-a178-e9b12790506a req-525d258b-5c28-41f2-9042-73caf50d19ce d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Updated VIF entry in instance network info cache for port 54af1da4-2337-4e36-8e6e-2c36ccf43309. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.643 2 DEBUG nova.network.neutron [req-a29f4a36-8169-4d6e-a178-e9b12790506a req-525d258b-5c28-41f2-9042-73caf50d19ce d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Updating instance_info_cache with network_info: [{"id": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "address": "fa:16:3e:e2:7c:9f", "network": {"id": "92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1460458376-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "861ae6a71574411fbcdab09902e6bcc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54af1da4-23", "ovs_interfaceid": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.666 2 DEBUG oslo_concurrency.lockutils [req-a29f4a36-8169-4d6e-a178-e9b12790506a req-525d258b-5c28-41f2-9042-73caf50d19ce d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-1442e48b-6b8f-4c96-b9b7-909071c8ebf2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.682 2 INFO nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Creating config drive at /var/lib/nova/instances/1442e48b-6b8f-4c96-b9b7-909071c8ebf2/disk.config#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.688 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1442e48b-6b8f-4c96-b9b7-909071c8ebf2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppgjqjfsy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.819 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1442e48b-6b8f-4c96-b9b7-909071c8ebf2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppgjqjfsy" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.851 2 DEBUG nova.storage.rbd_utils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] rbd image 1442e48b-6b8f-4c96-b9b7-909071c8ebf2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:30 np0005465988 nova_compute[236126]: 2025-10-02 12:07:30.856 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1442e48b-6b8f-4c96-b9b7-909071c8ebf2/disk.config 1442e48b-6b8f-4c96-b9b7-909071c8ebf2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.022 2 DEBUG oslo_concurrency.processutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1442e48b-6b8f-4c96-b9b7-909071c8ebf2/disk.config 1442e48b-6b8f-4c96-b9b7-909071c8ebf2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.167s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.024 2 INFO nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Deleting local config drive /var/lib/nova/instances/1442e48b-6b8f-4c96-b9b7-909071c8ebf2/disk.config because it was imported into RBD.#033[00m
Oct  2 08:07:31 np0005465988 kernel: tap54af1da4-23: entered promiscuous mode
Oct  2 08:07:31 np0005465988 NetworkManager[45041]: <info>  [1759406851.0785] manager: (tap54af1da4-23): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Oct  2 08:07:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:07:31Z|00079|binding|INFO|Claiming lport 54af1da4-2337-4e36-8e6e-2c36ccf43309 for this chassis.
Oct  2 08:07:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:07:31Z|00080|binding|INFO|54af1da4-2337-4e36-8e6e-2c36ccf43309: Claiming fa:16:3e:e2:7c:9f 10.100.0.12
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.088 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:7c:9f 10.100.0.12'], port_security=['fa:16:3e:e2:7c:9f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1442e48b-6b8f-4c96-b9b7-909071c8ebf2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '861ae6a71574411fbcdab09902e6bcc4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '08fdd263-3375-490b-b014-c8665c6b0045', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=17f08ce4-ac84-49af-9b71-8a31a5b87a27, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=54af1da4-2337-4e36-8e6e-2c36ccf43309) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.090 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 54af1da4-2337-4e36-8e6e-2c36ccf43309 in datapath 92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3 bound to our chassis#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.094 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3#033[00m
Oct  2 08:07:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:07:31Z|00081|binding|INFO|Setting lport 54af1da4-2337-4e36-8e6e-2c36ccf43309 ovn-installed in OVS
Oct  2 08:07:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:07:31Z|00082|binding|INFO|Setting lport 54af1da4-2337-4e36-8e6e-2c36ccf43309 up in Southbound
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.114 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[87c23f0a-dc36-4c7b-b25c-efe238c581b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.115 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap92e5c6c5-01 in ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.118 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap92e5c6c5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.118 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1cf5f086-6606-4971-9be7-ac11960fe3e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.120 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5edfecfd-32ba-42a7-89af-5598dd88da15]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 systemd-machined[192594]: New machine qemu-14-instance-00000024.
Oct  2 08:07:31 np0005465988 systemd[1]: Started Virtual Machine qemu-14-instance-00000024.
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.138 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[2371d62d-c193-42f0-9636-b7445a4a00ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.163 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8599f9c5-dcd4-43e9-93ed-afe2bd2755a5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 systemd-udevd[252214]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:07:31 np0005465988 NetworkManager[45041]: <info>  [1759406851.1956] device (tap54af1da4-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:07:31 np0005465988 NetworkManager[45041]: <info>  [1759406851.1983] device (tap54af1da4-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.199 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ef34011f-345a-4179-bf91-122bbe8b2160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.206 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[04331f9d-fe8f-4341-a53f-c01a53d015d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 NetworkManager[45041]: <info>  [1759406851.2096] manager: (tap92e5c6c5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.246 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2dcd9f7f-46c6-4666-b5ab-c65079b60114]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.250 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f99c3237-8197-4fc0-afd2-c60bf82a6338]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 NetworkManager[45041]: <info>  [1759406851.2749] device (tap92e5c6c5-00): carrier: link connected
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.282 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[efd4bf57-f498-4159-b9cb-8bf9b606370c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.301 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[87e07a20-c720-477b-a204-c90241cba7c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap92e5c6c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:20:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491160, 'reachable_time': 22217, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252244, 'error': None, 'target': 'ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.316 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5699feda-b53b-4f8b-a608-f66a7d877e1a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7e:2045'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 491160, 'tstamp': 491160}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252245, 'error': None, 'target': 'ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.332 2 DEBUG nova.compute.manager [req-ca1885c7-7704-4ada-8425-5162e3996e6d req-054af9d1-4901-48a2-af1b-eb418750575a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Received event network-changed-a8ce8a67-762d-41a0-8c12-778f66e87f3c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.332 2 DEBUG nova.compute.manager [req-ca1885c7-7704-4ada-8425-5162e3996e6d req-054af9d1-4901-48a2-af1b-eb418750575a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Refreshing instance network info cache due to event network-changed-a8ce8a67-762d-41a0-8c12-778f66e87f3c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.333 2 DEBUG oslo_concurrency.lockutils [req-ca1885c7-7704-4ada-8425-5162e3996e6d req-054af9d1-4901-48a2-af1b-eb418750575a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.333 2 DEBUG oslo_concurrency.lockutils [req-ca1885c7-7704-4ada-8425-5162e3996e6d req-054af9d1-4901-48a2-af1b-eb418750575a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.333 2 DEBUG nova.network.neutron [req-ca1885c7-7704-4ada-8425-5162e3996e6d req-054af9d1-4901-48a2-af1b-eb418750575a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Refreshing network info cache for port a8ce8a67-762d-41a0-8c12-778f66e87f3c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.336 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c916ea93-7398-4b0f-afd9-b40fd0ec40c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap92e5c6c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:20:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491160, 'reachable_time': 22217, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252246, 'error': None, 'target': 'ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.370 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3a8cb728-05fa-4346-b4d5-c17257a5493f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.424 2 DEBUG nova.compute.manager [req-c851bff5-f026-451f-92d7-64ba5151e820 req-d81cf4fd-3d85-4e87-b68e-8329d16e0ec0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Received event network-vif-plugged-54af1da4-2337-4e36-8e6e-2c36ccf43309 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.425 2 DEBUG oslo_concurrency.lockutils [req-c851bff5-f026-451f-92d7-64ba5151e820 req-d81cf4fd-3d85-4e87-b68e-8329d16e0ec0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.426 2 DEBUG oslo_concurrency.lockutils [req-c851bff5-f026-451f-92d7-64ba5151e820 req-d81cf4fd-3d85-4e87-b68e-8329d16e0ec0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.427 2 DEBUG oslo_concurrency.lockutils [req-c851bff5-f026-451f-92d7-64ba5151e820 req-d81cf4fd-3d85-4e87-b68e-8329d16e0ec0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.428 2 DEBUG nova.compute.manager [req-c851bff5-f026-451f-92d7-64ba5151e820 req-d81cf4fd-3d85-4e87-b68e-8329d16e0ec0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Processing event network-vif-plugged-54af1da4-2337-4e36-8e6e-2c36ccf43309 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.446 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1b5e6377-1f2f-46e4-8eac-e0744606275c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.448 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92e5c6c5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.448 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.449 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap92e5c6c5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:31 np0005465988 kernel: tap92e5c6c5-00: entered promiscuous mode
Oct  2 08:07:31 np0005465988 NetworkManager[45041]: <info>  [1759406851.4520] manager: (tap92e5c6c5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.455 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap92e5c6c5-00, col_values=(('external_ids', {'iface-id': '79718911-96f0-42c0-89ad-889cbbd1ab74'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:07:31Z|00083|binding|INFO|Releasing lport 79718911-96f0-42c0-89ad-889cbbd1ab74 from this chassis (sb_readonly=0)
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.458 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.459 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bd7d9512-37ff-47a8-b94f-250cd347ad90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.460 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3.pid.haproxy
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:07:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:07:31.461 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3', 'env', 'PROCESS_TAG=haproxy-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005465988 podman[252320]: 2025-10-02 12:07:31.867997411 +0000 UTC m=+0.052102148 container create d1ebd3da1ea5b8872ff5a44cb1b085725a06e2e9488b28c468404444b48bbd56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:07:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:31.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:31 np0005465988 nova_compute[236126]: 2025-10-02 12:07:31.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005465988 systemd[1]: Started libpod-conmon-d1ebd3da1ea5b8872ff5a44cb1b085725a06e2e9488b28c468404444b48bbd56.scope.
Oct  2 08:07:31 np0005465988 podman[252320]: 2025-10-02 12:07:31.839242484 +0000 UTC m=+0.023347251 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:07:31 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:07:31 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/baf10260e13cc43c175ab574a220f02c7fdc5be7c1fee181daaff5e97699b622/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:07:31 np0005465988 podman[252320]: 2025-10-02 12:07:31.966950851 +0000 UTC m=+0.151055608 container init d1ebd3da1ea5b8872ff5a44cb1b085725a06e2e9488b28c468404444b48bbd56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:07:31 np0005465988 podman[252320]: 2025-10-02 12:07:31.973720258 +0000 UTC m=+0.157824995 container start d1ebd3da1ea5b8872ff5a44cb1b085725a06e2e9488b28c468404444b48bbd56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:07:31 np0005465988 neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3[252335]: [NOTICE]   (252339) : New worker (252341) forked
Oct  2 08:07:31 np0005465988 neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3[252335]: [NOTICE]   (252339) : Loading success.
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.123 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406852.1227603, 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.124 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] VM Started (Lifecycle Event)#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.127 2 DEBUG nova.compute.manager [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.132 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.137 2 INFO nova.virt.libvirt.driver [-] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Instance spawned successfully.#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.138 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Attempting to register defaults for the following image properties: ['hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.144 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.144 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.145 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.146 2 DEBUG nova.virt.libvirt.driver [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.152 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.157 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.192 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.192 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406852.1247413, 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.193 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:07:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:07:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:32.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.263 2 INFO nova.compute.manager [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Took 7.12 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.264 2 DEBUG nova.compute.manager [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.307 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.311 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406852.1307547, 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.312 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.353 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.360 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.389 2 INFO nova.compute.manager [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Took 8.21 seconds to build instance.#033[00m
Oct  2 08:07:32 np0005465988 nova_compute[236126]: 2025-10-02 12:07:32.415 2 DEBUG oslo_concurrency.lockutils [None req-a2b6b8f6-8bbf-47cc-8195-c4247717dfd9 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:33 np0005465988 nova_compute[236126]: 2025-10-02 12:07:33.013 2 DEBUG nova.network.neutron [req-ca1885c7-7704-4ada-8425-5162e3996e6d req-054af9d1-4901-48a2-af1b-eb418750575a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Updated VIF entry in instance network info cache for port a8ce8a67-762d-41a0-8c12-778f66e87f3c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:07:33 np0005465988 nova_compute[236126]: 2025-10-02 12:07:33.013 2 DEBUG nova.network.neutron [req-ca1885c7-7704-4ada-8425-5162e3996e6d req-054af9d1-4901-48a2-af1b-eb418750575a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Updating instance_info_cache with network_info: [{"id": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "address": "fa:16:3e:dd:37:f7", "network": {"id": "0392b00d-9a0f-4fdc-878a-61235e8b04c7", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-321386985-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7359a7dad3b849bfbf075b88f2a261b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ce8a67-76", "ovs_interfaceid": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:07:33 np0005465988 nova_compute[236126]: 2025-10-02 12:07:33.036 2 DEBUG oslo_concurrency.lockutils [req-ca1885c7-7704-4ada-8425-5162e3996e6d req-054af9d1-4901-48a2-af1b-eb418750575a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:07:33 np0005465988 nova_compute[236126]: 2025-10-02 12:07:33.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:33 np0005465988 nova_compute[236126]: 2025-10-02 12:07:33.569 2 DEBUG nova.compute.manager [req-67761d72-6d91-4621-a89e-dae3bc1855f3 req-743b7f84-63a3-40c3-a47c-ca505a629192 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Received event network-vif-plugged-54af1da4-2337-4e36-8e6e-2c36ccf43309 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:33 np0005465988 nova_compute[236126]: 2025-10-02 12:07:33.570 2 DEBUG oslo_concurrency.lockutils [req-67761d72-6d91-4621-a89e-dae3bc1855f3 req-743b7f84-63a3-40c3-a47c-ca505a629192 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:33 np0005465988 nova_compute[236126]: 2025-10-02 12:07:33.571 2 DEBUG oslo_concurrency.lockutils [req-67761d72-6d91-4621-a89e-dae3bc1855f3 req-743b7f84-63a3-40c3-a47c-ca505a629192 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:33 np0005465988 nova_compute[236126]: 2025-10-02 12:07:33.572 2 DEBUG oslo_concurrency.lockutils [req-67761d72-6d91-4621-a89e-dae3bc1855f3 req-743b7f84-63a3-40c3-a47c-ca505a629192 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:33 np0005465988 nova_compute[236126]: 2025-10-02 12:07:33.572 2 DEBUG nova.compute.manager [req-67761d72-6d91-4621-a89e-dae3bc1855f3 req-743b7f84-63a3-40c3-a47c-ca505a629192 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] No waiting events found dispatching network-vif-plugged-54af1da4-2337-4e36-8e6e-2c36ccf43309 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:07:33 np0005465988 nova_compute[236126]: 2025-10-02 12:07:33.573 2 WARNING nova.compute.manager [req-67761d72-6d91-4621-a89e-dae3bc1855f3 req-743b7f84-63a3-40c3-a47c-ca505a629192 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Received unexpected event network-vif-plugged-54af1da4-2337-4e36-8e6e-2c36ccf43309 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:07:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:33.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:34.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:34 np0005465988 nova_compute[236126]: 2025-10-02 12:07:34.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:34 np0005465988 nova_compute[236126]: 2025-10-02 12:07:34.623 2 DEBUG nova.compute.manager [req-e1adcf63-3dfe-4d8d-bccd-4cfc5cf8a75b req-d4277c4d-527b-46b9-b789-fab8152d132a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Received event network-changed-54af1da4-2337-4e36-8e6e-2c36ccf43309 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:34 np0005465988 nova_compute[236126]: 2025-10-02 12:07:34.623 2 DEBUG nova.compute.manager [req-e1adcf63-3dfe-4d8d-bccd-4cfc5cf8a75b req-d4277c4d-527b-46b9-b789-fab8152d132a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Refreshing instance network info cache due to event network-changed-54af1da4-2337-4e36-8e6e-2c36ccf43309. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:07:34 np0005465988 nova_compute[236126]: 2025-10-02 12:07:34.624 2 DEBUG oslo_concurrency.lockutils [req-e1adcf63-3dfe-4d8d-bccd-4cfc5cf8a75b req-d4277c4d-527b-46b9-b789-fab8152d132a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-1442e48b-6b8f-4c96-b9b7-909071c8ebf2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:07:34 np0005465988 nova_compute[236126]: 2025-10-02 12:07:34.624 2 DEBUG oslo_concurrency.lockutils [req-e1adcf63-3dfe-4d8d-bccd-4cfc5cf8a75b req-d4277c4d-527b-46b9-b789-fab8152d132a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-1442e48b-6b8f-4c96-b9b7-909071c8ebf2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:07:34 np0005465988 nova_compute[236126]: 2025-10-02 12:07:34.624 2 DEBUG nova.network.neutron [req-e1adcf63-3dfe-4d8d-bccd-4cfc5cf8a75b req-d4277c4d-527b-46b9-b789-fab8152d132a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Refreshing network info cache for port 54af1da4-2337-4e36-8e6e-2c36ccf43309 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.494 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.494 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.494 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.495 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.495 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.825 2 DEBUG nova.network.neutron [req-e1adcf63-3dfe-4d8d-bccd-4cfc5cf8a75b req-d4277c4d-527b-46b9-b789-fab8152d132a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Updated VIF entry in instance network info cache for port 54af1da4-2337-4e36-8e6e-2c36ccf43309. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.826 2 DEBUG nova.network.neutron [req-e1adcf63-3dfe-4d8d-bccd-4cfc5cf8a75b req-d4277c4d-527b-46b9-b789-fab8152d132a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Updating instance_info_cache with network_info: [{"id": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "address": "fa:16:3e:e2:7c:9f", "network": {"id": "92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1460458376-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "861ae6a71574411fbcdab09902e6bcc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54af1da4-23", "ovs_interfaceid": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.850 2 DEBUG oslo_concurrency.lockutils [req-e1adcf63-3dfe-4d8d-bccd-4cfc5cf8a75b req-d4277c4d-527b-46b9-b789-fab8152d132a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-1442e48b-6b8f-4c96-b9b7-909071c8ebf2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:07:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:07:35 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1234167239' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:07:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:35.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.899 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.992 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000020 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.993 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000020 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.995 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000024 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:07:35 np0005465988 nova_compute[236126]: 2025-10-02 12:07:35.995 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000024 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.144 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.145 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4499MB free_disk=20.863128662109375GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.145 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.145 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.210 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 8d7306df-bd40-48a7-99a7-36da8b9a67f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.210 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.210 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.210 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:07:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.249 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:36.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:07:36 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/639241301' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.663 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.668 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.687 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.720 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.720 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:36 np0005465988 nova_compute[236126]: 2025-10-02 12:07:36.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:07:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:37.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:38.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:38 np0005465988 nova_compute[236126]: 2025-10-02 12:07:38.720 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:39 np0005465988 nova_compute[236126]: 2025-10-02 12:07:39.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:39 np0005465988 nova_compute[236126]: 2025-10-02 12:07:39.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:07:39 np0005465988 nova_compute[236126]: 2025-10-02 12:07:39.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:07:39 np0005465988 nova_compute[236126]: 2025-10-02 12:07:39.683 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:07:39 np0005465988 nova_compute[236126]: 2025-10-02 12:07:39.683 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:07:39 np0005465988 nova_compute[236126]: 2025-10-02 12:07:39.684 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:07:39 np0005465988 nova_compute[236126]: 2025-10-02 12:07:39.684 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8d7306df-bd40-48a7-99a7-36da8b9a67f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:07:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:39.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:07:40 np0005465988 nova_compute[236126]: 2025-10-02 12:07:40.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:07:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:40.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.648 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Updating instance_info_cache with network_info: [{"id": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "address": "fa:16:3e:dd:37:f7", "network": {"id": "0392b00d-9a0f-4fdc-878a-61235e8b04c7", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-321386985-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7359a7dad3b849bfbf075b88f2a261b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ce8a67-76", "ovs_interfaceid": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.664 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-8d7306df-bd40-48a7-99a7-36da8b9a67f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.665 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.785 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "6536d247-f08a-47a5-8be9-cfbf5481312c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.785 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "6536d247-f08a-47a5-8be9-cfbf5481312c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.802 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.834 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "1fd5329d-bab2-4f79-85be-dc67dc7c8df8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.835 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "1fd5329d-bab2-4f79-85be-dc67dc7c8df8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.879 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:07:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:41.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.903 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.904 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.921 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.922 2 INFO nova.compute.claims [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:07:41 np0005465988 nova_compute[236126]: 2025-10-02 12:07:41.983 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.078 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:07:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:07:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:42.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:07:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:07:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:07:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:07:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/583747885' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:07:42 np0005465988 podman[252604]: 2025-10-02 12:07:42.55013534 +0000 UTC m=+0.072015358 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.554 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:07:42 np0005465988 podman[252603]: 2025-10-02 12:07:42.556282039 +0000 UTC m=+0.078085104 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, container_name=iscsid)
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.564 2 DEBUG nova.compute.provider_tree [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.586 2 DEBUG nova.scheduler.client.report [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:07:42 np0005465988 podman[252602]: 2025-10-02 12:07:42.589079583 +0000 UTC m=+0.108808618 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.609 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.705s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.611 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.617 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.618 2 INFO nova.compute.claims [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.641 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "1ca861ae-d9c8-4ed8-b10f-82853008d06c" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.641 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "1ca861ae-d9c8-4ed8-b10f-82853008d06c" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.677 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "1ca861ae-d9c8-4ed8-b10f-82853008d06c" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.678 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.727 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.728 2 DEBUG nova.network.neutron [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.820 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.853 2 INFO nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:07:42 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.893 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:42.999 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.001 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.001 2 INFO nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Creating image(s)
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.030 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] rbd image 6536d247-f08a-47a5-8be9-cfbf5481312c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.057 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] rbd image 6536d247-f08a-47a5-8be9-cfbf5481312c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.083 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] rbd image 6536d247-f08a-47a5-8be9-cfbf5481312c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.089 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.152 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.153 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.154 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.155 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.186 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] rbd image 6536d247-f08a-47a5-8be9-cfbf5481312c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.190 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 6536d247-f08a-47a5-8be9-cfbf5481312c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:07:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:07:43 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2783669186' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.290 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.296 2 DEBUG nova.compute.provider_tree [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.313 2 DEBUG nova.scheduler.client.report [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.334 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.723s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.350 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "1ca861ae-d9c8-4ed8-b10f-82853008d06c" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.351 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "1ca861ae-d9c8-4ed8-b10f-82853008d06c" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.366 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "1ca861ae-d9c8-4ed8-b10f-82853008d06c" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.015s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.367 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.407 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.408 2 DEBUG nova.network.neutron [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.434 2 INFO nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.451 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.560 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 6536d247-f08a-47a5-8be9-cfbf5481312c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.370s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.597 2 DEBUG nova.network.neutron [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.597 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.602 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.603 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.604 2 INFO nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Creating image(s)#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.642 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] rbd image 1fd5329d-bab2-4f79-85be-dc67dc7c8df8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.677 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] rbd image 1fd5329d-bab2-4f79-85be-dc67dc7c8df8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.707 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] rbd image 1fd5329d-bab2-4f79-85be-dc67dc7c8df8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.711 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.746 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.789 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] resizing rbd image 6536d247-f08a-47a5-8be9-cfbf5481312c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.821 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.110s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.822 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.823 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.823 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.850 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] rbd image 1fd5329d-bab2-4f79-85be-dc67dc7c8df8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.855 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 1fd5329d-bab2-4f79-85be-dc67dc7c8df8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.879 2 DEBUG nova.network.neutron [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.881 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:07:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:43.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.938 2 DEBUG nova.objects.instance [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lazy-loading 'migration_context' on Instance uuid 6536d247-f08a-47a5-8be9-cfbf5481312c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.953 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.954 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Ensure instance console log exists: /var/lib/nova/instances/6536d247-f08a-47a5-8be9-cfbf5481312c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.954 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.954 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.955 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.957 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.964 2 WARNING nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.971 2 DEBUG nova.virt.libvirt.host [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.971 2 DEBUG nova.virt.libvirt.host [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.976 2 DEBUG nova.virt.libvirt.host [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.977 2 DEBUG nova.virt.libvirt.host [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.979 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.979 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.979 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.980 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.980 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.980 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.981 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.981 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.981 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.982 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.982 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.982 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:07:43 np0005465988 nova_compute[236126]: 2025-10-02 12:07:43.986 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.181 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 1fd5329d-bab2-4f79-85be-dc67dc7c8df8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.326s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.256 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] resizing rbd image 1fd5329d-bab2-4f79-85be-dc67dc7c8df8_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:07:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:44.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.396 2 DEBUG nova.objects.instance [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lazy-loading 'migration_context' on Instance uuid 1fd5329d-bab2-4f79-85be-dc67dc7c8df8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.411 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.412 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Ensure instance console log exists: /var/lib/nova/instances/1fd5329d-bab2-4f79-85be-dc67dc7c8df8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.413 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.414 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.414 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.418 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.424 2 WARNING nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.430 2 DEBUG nova.virt.libvirt.host [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.430 2 DEBUG nova.virt.libvirt.host [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.435 2 DEBUG nova.virt.libvirt.host [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.436 2 DEBUG nova.virt.libvirt.host [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.437 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.438 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.438 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.439 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.439 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.440 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.440 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.441 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.441 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.442 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.443 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.444 2 DEBUG nova.virt.hardware [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.449 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:07:44 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3743711792' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.489 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.523 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] rbd image 6536d247-f08a-47a5-8be9-cfbf5481312c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.529 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:07:44 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2199587781' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.930 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.977 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] rbd image 1fd5329d-bab2-4f79-85be-dc67dc7c8df8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:44 np0005465988 nova_compute[236126]: 2025-10-02 12:07:44.983 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:07:44 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2012125399' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.008 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.012 2 DEBUG nova.objects.instance [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6536d247-f08a-47a5-8be9-cfbf5481312c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.031 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <uuid>6536d247-f08a-47a5-8be9-cfbf5481312c</uuid>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <name>instance-00000026</name>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServersOnMultiNodesTest-server-1682143048-1</nova:name>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:07:43</nova:creationTime>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <nova:user uuid="199c0d9541a04c4db07e50bfba9fddb1">tempest-ServersOnMultiNodesTest-10966744-project-member</nova:user>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <nova:project uuid="62aa9c47ee2841139cd7066168f59650">tempest-ServersOnMultiNodesTest-10966744</nova:project>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <nova:ports/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <entry name="serial">6536d247-f08a-47a5-8be9-cfbf5481312c</entry>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <entry name="uuid">6536d247-f08a-47a5-8be9-cfbf5481312c</entry>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/6536d247-f08a-47a5-8be9-cfbf5481312c_disk">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/6536d247-f08a-47a5-8be9-cfbf5481312c_disk.config">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/6536d247-f08a-47a5-8be9-cfbf5481312c/console.log" append="off"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:07:45 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:07:45 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.097 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.098 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.098 2 INFO nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Using config drive#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.134 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] rbd image 6536d247-f08a-47a5-8be9-cfbf5481312c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.303 2 INFO nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Creating config drive at /var/lib/nova/instances/6536d247-f08a-47a5-8be9-cfbf5481312c/disk.config#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.308 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6536d247-f08a-47a5-8be9-cfbf5481312c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj55b09e9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.433 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6536d247-f08a-47a5-8be9-cfbf5481312c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj55b09e9" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:07:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/701690202' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.475 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] rbd image 6536d247-f08a-47a5-8be9-cfbf5481312c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.480 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6536d247-f08a-47a5-8be9-cfbf5481312c/disk.config 6536d247-f08a-47a5-8be9-cfbf5481312c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.505 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.509 2 DEBUG nova.objects.instance [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1fd5329d-bab2-4f79-85be-dc67dc7c8df8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.526 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <uuid>1fd5329d-bab2-4f79-85be-dc67dc7c8df8</uuid>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <name>instance-00000027</name>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServersOnMultiNodesTest-server-1682143048-2</nova:name>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:07:44</nova:creationTime>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <nova:user uuid="199c0d9541a04c4db07e50bfba9fddb1">tempest-ServersOnMultiNodesTest-10966744-project-member</nova:user>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <nova:project uuid="62aa9c47ee2841139cd7066168f59650">tempest-ServersOnMultiNodesTest-10966744</nova:project>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <nova:ports/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <entry name="serial">1fd5329d-bab2-4f79-85be-dc67dc7c8df8</entry>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <entry name="uuid">1fd5329d-bab2-4f79-85be-dc67dc7c8df8</entry>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/1fd5329d-bab2-4f79-85be-dc67dc7c8df8_disk">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/1fd5329d-bab2-4f79-85be-dc67dc7c8df8_disk.config">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/1fd5329d-bab2-4f79-85be-dc67dc7c8df8/console.log" append="off"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:07:45 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:07:45 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:07:45 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:07:45 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.584 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.584 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.585 2 INFO nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Using config drive#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.616 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] rbd image 1fd5329d-bab2-4f79-85be-dc67dc7c8df8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:07:45Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e2:7c:9f 10.100.0.12
Oct  2 08:07:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:07:45Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e2:7c:9f 10.100.0.12
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.735 2 INFO nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Creating config drive at /var/lib/nova/instances/1fd5329d-bab2-4f79-85be-dc67dc7c8df8/disk.config#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.740 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1fd5329d-bab2-4f79-85be-dc67dc7c8df8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3sprjf1r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.881 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1fd5329d-bab2-4f79-85be-dc67dc7c8df8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3sprjf1r" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:07:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:45.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.931 2 DEBUG nova.storage.rbd_utils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] rbd image 1fd5329d-bab2-4f79-85be-dc67dc7c8df8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:07:45 np0005465988 nova_compute[236126]: 2025-10-02 12:07:45.937 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1fd5329d-bab2-4f79-85be-dc67dc7c8df8/disk.config 1fd5329d-bab2-4f79-85be-dc67dc7c8df8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:46.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:46 np0005465988 nova_compute[236126]: 2025-10-02 12:07:46.391 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1fd5329d-bab2-4f79-85be-dc67dc7c8df8/disk.config 1fd5329d-bab2-4f79-85be-dc67dc7c8df8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:46 np0005465988 nova_compute[236126]: 2025-10-02 12:07:46.392 2 INFO nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Deleting local config drive /var/lib/nova/instances/1fd5329d-bab2-4f79-85be-dc67dc7c8df8/disk.config because it was imported into RBD.#033[00m
Oct  2 08:07:46 np0005465988 nova_compute[236126]: 2025-10-02 12:07:46.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:46 np0005465988 systemd-machined[192594]: New machine qemu-15-instance-00000027.
Oct  2 08:07:46 np0005465988 systemd[1]: Started Virtual Machine qemu-15-instance-00000027.
Oct  2 08:07:46 np0005465988 nova_compute[236126]: 2025-10-02 12:07:46.528 2 DEBUG oslo_concurrency.processutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6536d247-f08a-47a5-8be9-cfbf5481312c/disk.config 6536d247-f08a-47a5-8be9-cfbf5481312c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:46 np0005465988 nova_compute[236126]: 2025-10-02 12:07:46.529 2 INFO nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Deleting local config drive /var/lib/nova/instances/6536d247-f08a-47a5-8be9-cfbf5481312c/disk.config because it was imported into RBD.#033[00m
Oct  2 08:07:46 np0005465988 systemd-machined[192594]: New machine qemu-16-instance-00000026.
Oct  2 08:07:46 np0005465988 systemd[1]: Started Virtual Machine qemu-16-instance-00000026.
Oct  2 08:07:46 np0005465988 nova_compute[236126]: 2025-10-02 12:07:46.796 2 DEBUG oslo_concurrency.lockutils [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Acquiring lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:46 np0005465988 nova_compute[236126]: 2025-10-02 12:07:46.799 2 DEBUG oslo_concurrency.lockutils [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:46 np0005465988 nova_compute[236126]: 2025-10-02 12:07:46.820 2 DEBUG nova.objects.instance [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lazy-loading 'flavor' on Instance uuid 8d7306df-bd40-48a7-99a7-36da8b9a67f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:46 np0005465988 nova_compute[236126]: 2025-10-02 12:07:46.868 2 DEBUG oslo_concurrency.lockutils [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:46 np0005465988 nova_compute[236126]: 2025-10-02 12:07:46.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.072 2 DEBUG oslo_concurrency.lockutils [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Acquiring lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.072 2 DEBUG oslo_concurrency.lockutils [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.072 2 INFO nova.compute.manager [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Attaching volume 6ee55f1c-32cc-4a1e-b7d7-c50f2e3e5291 to /dev/vdb#033[00m
Oct  2 08:07:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.262 2 DEBUG os_brick.utils [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.265 2 INFO oslo.privsep.daemon [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmpl8trvyid/privsep.sock']#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.392 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406867.3918154, 6536d247-f08a-47a5-8be9-cfbf5481312c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.393 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.398 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.399 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.404 2 INFO nova.virt.libvirt.driver [-] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Instance spawned successfully.#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.404 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.419 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.426 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.430 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.430 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.431 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.431 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.432 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.432 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.456 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.457 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406867.3921444, 6536d247-f08a-47a5-8be9-cfbf5481312c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.457 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] VM Started (Lifecycle Event)#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.488 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.492 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.500 2 INFO nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Took 4.50 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.501 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.512 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.565 2 INFO nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Took 5.69 seconds to build instance.#033[00m
Oct  2 08:07:47 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.584 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "6536d247-f08a-47a5-8be9-cfbf5481312c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.799s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:47.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.092 2 INFO oslo.privsep.daemon [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.964 5711 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.968 5711 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.970 5711 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:47.970 5711 INFO oslo.privsep.daemon [-] privsep daemon running as pid 5711#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.095 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[b0f2efe2-3150-4388-ada2-d53c54bb8185]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.198 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.214 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.214 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[4fed3ae0-e274-4f40-ac4a-e5a2b69bbfb0]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.215 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.224 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.224 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[11c6beef-77a8-45b9-b4cd-d2172fb9212d]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.226 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.238 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.239 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[eda0e38d-19b1-44a5-ae67-b45500e3bb39]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.242 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406868.242507, 1fd5329d-bab2-4f79-85be-dc67dc7c8df8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.243 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.243 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[7a2e37e0-17b1-4698-9575-ec543c4f806c]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.245 2 DEBUG oslo_concurrency.processutils [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:48.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.267 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.268 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.271 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.272 2 DEBUG oslo_concurrency.processutils [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] CMD "nvme version" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.275 2 DEBUG os_brick.initiator.connectors.lightos [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.275 2 DEBUG os_brick.initiator.connectors.lightos [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.276 2 DEBUG os_brick.initiator.connectors.lightos [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.276 2 DEBUG os_brick.utils [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] <== get_connector_properties: return (1012ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.276 2 DEBUG nova.virt.block_device [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Updating existing volume attachment record: 2cfa8394-65cd-46dd-8912-0c86f0aac0a9 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.281 2 INFO nova.virt.libvirt.driver [-] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Instance spawned successfully.#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.282 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.283 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.313 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.313 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406868.2455065, 1fd5329d-bab2-4f79-85be-dc67dc7c8df8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.313 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] VM Started (Lifecycle Event)#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.318 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.319 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.319 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.320 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.321 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.321 2 DEBUG nova.virt.libvirt.driver [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.344 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.348 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.402 2 INFO nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Took 4.80 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.402 2 DEBUG nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.486 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.631 2 INFO nova.compute.manager [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Took 6.67 seconds to build instance.#033[00m
Oct  2 08:07:48 np0005465988 nova_compute[236126]: 2025-10-02 12:07:48.703 2 DEBUG oslo_concurrency.lockutils [None req-86764400-4ccf-4864-b8b1-51b0b4725b89 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "1fd5329d-bab2-4f79-85be-dc67dc7c8df8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.868s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:49 np0005465988 nova_compute[236126]: 2025-10-02 12:07:49.774 2 DEBUG oslo_concurrency.lockutils [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:49 np0005465988 nova_compute[236126]: 2025-10-02 12:07:49.774 2 DEBUG oslo_concurrency.lockutils [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:49 np0005465988 nova_compute[236126]: 2025-10-02 12:07:49.776 2 DEBUG oslo_concurrency.lockutils [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:49 np0005465988 nova_compute[236126]: 2025-10-02 12:07:49.782 2 DEBUG nova.objects.instance [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lazy-loading 'flavor' on Instance uuid 8d7306df-bd40-48a7-99a7-36da8b9a67f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:49 np0005465988 nova_compute[236126]: 2025-10-02 12:07:49.809 2 DEBUG nova.virt.libvirt.driver [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Attempting to attach volume 6ee55f1c-32cc-4a1e-b7d7-c50f2e3e5291 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Oct  2 08:07:49 np0005465988 nova_compute[236126]: 2025-10-02 12:07:49.812 2 DEBUG nova.virt.libvirt.guest [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] attach device xml: <disk type="network" device="disk">
Oct  2 08:07:49 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:07:49 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-6ee55f1c-32cc-4a1e-b7d7-c50f2e3e5291">
Oct  2 08:07:49 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:07:49 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:07:49 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:07:49 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:07:49 np0005465988 nova_compute[236126]:  <auth username="openstack">
Oct  2 08:07:49 np0005465988 nova_compute[236126]:    <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:07:49 np0005465988 nova_compute[236126]:  </auth>
Oct  2 08:07:49 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:07:49 np0005465988 nova_compute[236126]:  <serial>6ee55f1c-32cc-4a1e-b7d7-c50f2e3e5291</serial>
Oct  2 08:07:49 np0005465988 nova_compute[236126]:  <shareable/>
Oct  2 08:07:49 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:07:49 np0005465988 nova_compute[236126]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:07:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:49.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:49 np0005465988 nova_compute[236126]: 2025-10-02 12:07:49.953 2 DEBUG nova.virt.libvirt.driver [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:49 np0005465988 nova_compute[236126]: 2025-10-02 12:07:49.953 2 DEBUG nova.virt.libvirt.driver [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:49 np0005465988 nova_compute[236126]: 2025-10-02 12:07:49.953 2 DEBUG nova.virt.libvirt.driver [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:49 np0005465988 nova_compute[236126]: 2025-10-02 12:07:49.954 2 DEBUG nova.virt.libvirt.driver [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] No VIF found with MAC fa:16:3e:dd:37:f7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:07:50 np0005465988 nova_compute[236126]: 2025-10-02 12:07:50.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:50.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:50 np0005465988 nova_compute[236126]: 2025-10-02 12:07:50.353 2 DEBUG oslo_concurrency.lockutils [None req-d220f710-59eb-46a0-a8e3-01fc2768c6a9 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 3.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:07:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:07:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:07:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:51.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:07:51 np0005465988 nova_compute[236126]: 2025-10-02 12:07:51.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:07:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:52.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:53.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:54.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:07:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4138778764' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:07:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:07:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4138778764' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:07:55 np0005465988 nova_compute[236126]: 2025-10-02 12:07:55.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:55.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:56.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:56 np0005465988 nova_compute[236126]: 2025-10-02 12:07:56.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:07:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:57.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:07:58.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:07:58 np0005465988 nova_compute[236126]: 2025-10-02 12:07:58.543 2 DEBUG oslo_concurrency.lockutils [None req-d13dab26-6a4d-4d8c-a9fa-d84656a84afc ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Acquiring lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:58 np0005465988 nova_compute[236126]: 2025-10-02 12:07:58.543 2 DEBUG oslo_concurrency.lockutils [None req-d13dab26-6a4d-4d8c-a9fa-d84656a84afc ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:58 np0005465988 podman[253514]: 2025-10-02 12:07:58.557509152 +0000 UTC m=+0.081236005 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:07:58 np0005465988 nova_compute[236126]: 2025-10-02 12:07:58.564 2 INFO nova.compute.manager [None req-d13dab26-6a4d-4d8c-a9fa-d84656a84afc ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Detaching volume 6ee55f1c-32cc-4a1e-b7d7-c50f2e3e5291#033[00m
Oct  2 08:07:58 np0005465988 nova_compute[236126]: 2025-10-02 12:07:58.699 2 INFO nova.virt.block_device [None req-d13dab26-6a4d-4d8c-a9fa-d84656a84afc ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Attempting to driver detach volume 6ee55f1c-32cc-4a1e-b7d7-c50f2e3e5291 from mountpoint /dev/vdb#033[00m
Oct  2 08:07:58 np0005465988 nova_compute[236126]: 2025-10-02 12:07:58.710 2 DEBUG nova.virt.libvirt.driver [None req-d13dab26-6a4d-4d8c-a9fa-d84656a84afc ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Attempting to detach device vdb from instance 8d7306df-bd40-48a7-99a7-36da8b9a67f3 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:07:58 np0005465988 nova_compute[236126]: 2025-10-02 12:07:58.711 2 DEBUG nova.virt.libvirt.guest [None req-d13dab26-6a4d-4d8c-a9fa-d84656a84afc ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:07:58 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-6ee55f1c-32cc-4a1e-b7d7-c50f2e3e5291">
Oct  2 08:07:58 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:  <serial>6ee55f1c-32cc-4a1e-b7d7-c50f2e3e5291</serial>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:  <shareable/>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Oct  2 08:07:58 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:07:58 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:07:58 np0005465988 nova_compute[236126]: 2025-10-02 12:07:58.719 2 INFO nova.virt.libvirt.driver [None req-d13dab26-6a4d-4d8c-a9fa-d84656a84afc ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Successfully detached device vdb from instance 8d7306df-bd40-48a7-99a7-36da8b9a67f3 from the persistent domain config.#033[00m
Oct  2 08:07:58 np0005465988 nova_compute[236126]: 2025-10-02 12:07:58.720 2 DEBUG nova.virt.libvirt.driver [None req-d13dab26-6a4d-4d8c-a9fa-d84656a84afc ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 8d7306df-bd40-48a7-99a7-36da8b9a67f3 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Oct  2 08:07:58 np0005465988 nova_compute[236126]: 2025-10-02 12:07:58.721 2 DEBUG nova.virt.libvirt.guest [None req-d13dab26-6a4d-4d8c-a9fa-d84656a84afc ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:07:58 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-6ee55f1c-32cc-4a1e-b7d7-c50f2e3e5291">
Oct  2 08:07:58 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:  <serial>6ee55f1c-32cc-4a1e-b7d7-c50f2e3e5291</serial>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:  <shareable/>
Oct  2 08:07:58 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Oct  2 08:07:58 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:07:58 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:07:58 np0005465988 nova_compute[236126]: 2025-10-02 12:07:58.871 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Received event <DeviceRemovedEvent: 1759406878.8713992, 8d7306df-bd40-48a7-99a7-36da8b9a67f3 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Oct  2 08:07:58 np0005465988 nova_compute[236126]: 2025-10-02 12:07:58.874 2 DEBUG nova.virt.libvirt.driver [None req-d13dab26-6a4d-4d8c-a9fa-d84656a84afc ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 8d7306df-bd40-48a7-99a7-36da8b9a67f3 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Oct  2 08:07:58 np0005465988 nova_compute[236126]: 2025-10-02 12:07:58.877 2 INFO nova.virt.libvirt.driver [None req-d13dab26-6a4d-4d8c-a9fa-d84656a84afc ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Successfully detached device vdb from instance 8d7306df-bd40-48a7-99a7-36da8b9a67f3 from the live domain config.#033[00m
Oct  2 08:07:59 np0005465988 ovn_controller[132601]: 2025-10-02T12:07:59Z|00084|binding|INFO|Releasing lport 79718911-96f0-42c0-89ad-889cbbd1ab74 from this chassis (sb_readonly=0)
Oct  2 08:07:59 np0005465988 ovn_controller[132601]: 2025-10-02T12:07:59Z|00085|binding|INFO|Releasing lport a266984f-a69e-4d11-8c6e-e21eb33eff29 from this chassis (sb_readonly=0)
Oct  2 08:07:59 np0005465988 nova_compute[236126]: 2025-10-02 12:07:59.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:59 np0005465988 nova_compute[236126]: 2025-10-02 12:07:59.294 2 DEBUG nova.objects.instance [None req-d13dab26-6a4d-4d8c-a9fa-d84656a84afc ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lazy-loading 'flavor' on Instance uuid 8d7306df-bd40-48a7-99a7-36da8b9a67f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:59 np0005465988 nova_compute[236126]: 2025-10-02 12:07:59.633 2 DEBUG oslo_concurrency.lockutils [None req-d13dab26-6a4d-4d8c-a9fa-d84656a84afc ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:07:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:07:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:07:59.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:00 np0005465988 nova_compute[236126]: 2025-10-02 12:08:00.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:00.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:01 np0005465988 nova_compute[236126]: 2025-10-02 12:08:01.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:08:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:01.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:08:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:08:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:02.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:08:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:03.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:08:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:04.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.259 2 DEBUG oslo_concurrency.lockutils [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Acquiring lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.261 2 DEBUG oslo_concurrency.lockutils [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.261 2 DEBUG oslo_concurrency.lockutils [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Acquiring lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.262 2 DEBUG oslo_concurrency.lockutils [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.262 2 DEBUG oslo_concurrency.lockutils [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.264 2 INFO nova.compute.manager [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Terminating instance#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.266 2 DEBUG nova.compute.manager [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:05 np0005465988 kernel: tapa8ce8a67-76 (unregistering): left promiscuous mode
Oct  2 08:08:05 np0005465988 NetworkManager[45041]: <info>  [1759406885.3947] device (tapa8ce8a67-76): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:08:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:08:05Z|00086|binding|INFO|Releasing lport a8ce8a67-762d-41a0-8c12-778f66e87f3c from this chassis (sb_readonly=0)
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:08:05Z|00087|binding|INFO|Setting lport a8ce8a67-762d-41a0-8c12-778f66e87f3c down in Southbound
Oct  2 08:08:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:08:05Z|00088|binding|INFO|Removing iface tapa8ce8a67-76 ovn-installed in OVS
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:05.417 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:37:f7 10.100.0.4'], port_security=['fa:16:3e:dd:37:f7 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8d7306df-bd40-48a7-99a7-36da8b9a67f3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0392b00d-9a0f-4fdc-878a-61235e8b04c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7359a7dad3b849bfbf075b88f2a261b4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '89efda0a-e365-4ab4-b56f-2cbf8e88c8e8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=00463c6f-e0da-4800-9774-7f10cd7297fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=a8ce8a67-762d-41a0-8c12-778f66e87f3c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:08:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:05.419 142124 INFO neutron.agent.ovn.metadata.agent [-] Port a8ce8a67-762d-41a0-8c12-778f66e87f3c in datapath 0392b00d-9a0f-4fdc-878a-61235e8b04c7 unbound from our chassis#033[00m
Oct  2 08:08:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:05.421 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0392b00d-9a0f-4fdc-878a-61235e8b04c7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:08:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:05.423 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1508e74d-69ad-48da-aa80-972c3fc936db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:05.423 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7 namespace which is not needed anymore#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:05 np0005465988 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000020.scope: Deactivated successfully.
Oct  2 08:08:05 np0005465988 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000020.scope: Consumed 17.237s CPU time.
Oct  2 08:08:05 np0005465988 systemd-machined[192594]: Machine qemu-13-instance-00000020 terminated.
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.519 2 INFO nova.virt.libvirt.driver [-] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Instance destroyed successfully.#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.520 2 DEBUG nova.objects.instance [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lazy-loading 'resources' on Instance uuid 8d7306df-bd40-48a7-99a7-36da8b9a67f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.544 2 DEBUG nova.virt.libvirt.vif [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-UpdateMultiattachVolumeNegativeTest-server-1739227330',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-updatemultiattachvolumenegativetest-server-1739227330',id=32,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLWLHYAzCSRCkStdbU+GdVhWIXiwiTci8xggQ9ThyRlprkD/MENcP1zXCe9JELWxtblFvNPabWQ+ZgjaGJX29tNuXgS46PKPgWmCmmQjfV3eqKUfK1wEy2Lz1kDGxf6LzA==',key_name='tempest-keypair-377984943',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:06:39Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7359a7dad3b849bfbf075b88f2a261b4',ramdisk_id='',reservation_id='r-pu3d1045',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-UpdateMultiattachVolumeNegativeTest-1815230933',owner_user_name='tempest-UpdateMultiattachVolumeNegativeTest-1815230933-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:06:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ec17c54e24584f11a5348b68d6e7ca85',uuid=8d7306df-bd40-48a7-99a7-36da8b9a67f3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "address": "fa:16:3e:dd:37:f7", "network": {"id": "0392b00d-9a0f-4fdc-878a-61235e8b04c7", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-321386985-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7359a7dad3b849bfbf075b88f2a261b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ce8a67-76", "ovs_interfaceid": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.544 2 DEBUG nova.network.os_vif_util [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Converting VIF {"id": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "address": "fa:16:3e:dd:37:f7", "network": {"id": "0392b00d-9a0f-4fdc-878a-61235e8b04c7", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-321386985-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7359a7dad3b849bfbf075b88f2a261b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ce8a67-76", "ovs_interfaceid": "a8ce8a67-762d-41a0-8c12-778f66e87f3c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.545 2 DEBUG nova.network.os_vif_util [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:37:f7,bridge_name='br-int',has_traffic_filtering=True,id=a8ce8a67-762d-41a0-8c12-778f66e87f3c,network=Network(0392b00d-9a0f-4fdc-878a-61235e8b04c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ce8a67-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.545 2 DEBUG os_vif [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:37:f7,bridge_name='br-int',has_traffic_filtering=True,id=a8ce8a67-762d-41a0-8c12-778f66e87f3c,network=Network(0392b00d-9a0f-4fdc-878a-61235e8b04c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ce8a67-76') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa8ce8a67-76, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.554 2 INFO os_vif [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:37:f7,bridge_name='br-int',has_traffic_filtering=True,id=a8ce8a67-762d-41a0-8c12-778f66e87f3c,network=Network(0392b00d-9a0f-4fdc-878a-61235e8b04c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ce8a67-76')#033[00m
Oct  2 08:08:05 np0005465988 neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7[251389]: [NOTICE]   (251398) : haproxy version is 2.8.14-c23fe91
Oct  2 08:08:05 np0005465988 neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7[251389]: [NOTICE]   (251398) : path to executable is /usr/sbin/haproxy
Oct  2 08:08:05 np0005465988 neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7[251389]: [WARNING]  (251398) : Exiting Master process...
Oct  2 08:08:05 np0005465988 neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7[251389]: [ALERT]    (251398) : Current worker (251401) exited with code 143 (Terminated)
Oct  2 08:08:05 np0005465988 neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7[251389]: [WARNING]  (251398) : All workers exited. Exiting... (0)
Oct  2 08:08:05 np0005465988 systemd[1]: libpod-08c658b116138c8e14d05d34d6aa659580f39964363161283a7bc150da6c70af.scope: Deactivated successfully.
Oct  2 08:08:05 np0005465988 podman[253571]: 2025-10-02 12:08:05.605280607 +0000 UTC m=+0.057996890 container died 08c658b116138c8e14d05d34d6aa659580f39964363161283a7bc150da6c70af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:08:05 np0005465988 systemd[1]: var-lib-containers-storage-overlay-2ec215ae210229590b3e630485aa9907e988503a121a123c0585aeec602ac157-merged.mount: Deactivated successfully.
Oct  2 08:08:05 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-08c658b116138c8e14d05d34d6aa659580f39964363161283a7bc150da6c70af-userdata-shm.mount: Deactivated successfully.
Oct  2 08:08:05 np0005465988 podman[253571]: 2025-10-02 12:08:05.68268611 +0000 UTC m=+0.135402403 container cleanup 08c658b116138c8e14d05d34d6aa659580f39964363161283a7bc150da6c70af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.692 2 DEBUG nova.compute.manager [req-b2c806ab-1ebe-4815-a727-3da9c8f359f7 req-11b58943-3d00-4588-ba7a-e81ab0a3391b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Received event network-vif-unplugged-a8ce8a67-762d-41a0-8c12-778f66e87f3c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.692 2 DEBUG oslo_concurrency.lockutils [req-b2c806ab-1ebe-4815-a727-3da9c8f359f7 req-11b58943-3d00-4588-ba7a-e81ab0a3391b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.693 2 DEBUG oslo_concurrency.lockutils [req-b2c806ab-1ebe-4815-a727-3da9c8f359f7 req-11b58943-3d00-4588-ba7a-e81ab0a3391b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.693 2 DEBUG oslo_concurrency.lockutils [req-b2c806ab-1ebe-4815-a727-3da9c8f359f7 req-11b58943-3d00-4588-ba7a-e81ab0a3391b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.693 2 DEBUG nova.compute.manager [req-b2c806ab-1ebe-4815-a727-3da9c8f359f7 req-11b58943-3d00-4588-ba7a-e81ab0a3391b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] No waiting events found dispatching network-vif-unplugged-a8ce8a67-762d-41a0-8c12-778f66e87f3c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.693 2 DEBUG nova.compute.manager [req-b2c806ab-1ebe-4815-a727-3da9c8f359f7 req-11b58943-3d00-4588-ba7a-e81ab0a3391b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Received event network-vif-unplugged-a8ce8a67-762d-41a0-8c12-778f66e87f3c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:08:05 np0005465988 systemd[1]: libpod-conmon-08c658b116138c8e14d05d34d6aa659580f39964363161283a7bc150da6c70af.scope: Deactivated successfully.
Oct  2 08:08:05 np0005465988 podman[253620]: 2025-10-02 12:08:05.757000743 +0000 UTC m=+0.054833007 container remove 08c658b116138c8e14d05d34d6aa659580f39964363161283a7bc150da6c70af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:08:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:05.763 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8a6bdf50-d1ed-4683-b0b6-03eb274d6c3f]: (4, ('Thu Oct  2 12:08:05 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7 (08c658b116138c8e14d05d34d6aa659580f39964363161283a7bc150da6c70af)\n08c658b116138c8e14d05d34d6aa659580f39964363161283a7bc150da6c70af\nThu Oct  2 12:08:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7 (08c658b116138c8e14d05d34d6aa659580f39964363161283a7bc150da6c70af)\n08c658b116138c8e14d05d34d6aa659580f39964363161283a7bc150da6c70af\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:05.765 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[989c3451-c0a2-4aa7-b2b8-d9ee038ac0c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:05.767 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0392b00d-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:05 np0005465988 kernel: tap0392b00d-90: left promiscuous mode
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:05.777 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5f068b0d-ffda-42cd-84d3-6abd1c82a3e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:05 np0005465988 nova_compute[236126]: 2025-10-02 12:08:05.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:05.806 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[07903a69-586b-4f08-8075-12f85785d703]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:05.808 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1c0c10c6-4d2c-4a8a-9d1a-47fe7b004b1d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:05.833 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d92243cd-f9c6-46eb-9a7a-adad2878abb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485820, 'reachable_time': 25210, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253635, 'error': None, 'target': 'ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:05 np0005465988 systemd[1]: run-netns-ovnmeta\x2d0392b00d\x2d9a0f\x2d4fdc\x2d878a\x2d61235e8b04c7.mount: Deactivated successfully.
Oct  2 08:08:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:05.836 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0392b00d-9a0f-4fdc-878a-61235e8b04c7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:08:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:05.836 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[53188d39-45d6-43cd-ad89-670ec76ed109]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:05.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:06.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.620 2 DEBUG oslo_concurrency.lockutils [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "6536d247-f08a-47a5-8be9-cfbf5481312c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.621 2 DEBUG oslo_concurrency.lockutils [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "6536d247-f08a-47a5-8be9-cfbf5481312c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.622 2 DEBUG oslo_concurrency.lockutils [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "6536d247-f08a-47a5-8be9-cfbf5481312c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.622 2 DEBUG oslo_concurrency.lockutils [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "6536d247-f08a-47a5-8be9-cfbf5481312c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.622 2 DEBUG oslo_concurrency.lockutils [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "6536d247-f08a-47a5-8be9-cfbf5481312c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.624 2 INFO nova.compute.manager [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Terminating instance#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.626 2 DEBUG oslo_concurrency.lockutils [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "refresh_cache-6536d247-f08a-47a5-8be9-cfbf5481312c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.626 2 DEBUG oslo_concurrency.lockutils [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquired lock "refresh_cache-6536d247-f08a-47a5-8be9-cfbf5481312c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.626 2 DEBUG nova.network.neutron [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.879 2 DEBUG nova.network.neutron [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.881 2 DEBUG oslo_concurrency.lockutils [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "1fd5329d-bab2-4f79-85be-dc67dc7c8df8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.881 2 DEBUG oslo_concurrency.lockutils [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "1fd5329d-bab2-4f79-85be-dc67dc7c8df8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.881 2 DEBUG oslo_concurrency.lockutils [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "1fd5329d-bab2-4f79-85be-dc67dc7c8df8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.881 2 DEBUG oslo_concurrency.lockutils [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "1fd5329d-bab2-4f79-85be-dc67dc7c8df8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.882 2 DEBUG oslo_concurrency.lockutils [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "1fd5329d-bab2-4f79-85be-dc67dc7c8df8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.882 2 INFO nova.compute.manager [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Terminating instance#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.883 2 DEBUG oslo_concurrency.lockutils [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "refresh_cache-1fd5329d-bab2-4f79-85be-dc67dc7c8df8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.883 2 DEBUG oslo_concurrency.lockutils [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquired lock "refresh_cache-1fd5329d-bab2-4f79-85be-dc67dc7c8df8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.883 2 DEBUG nova.network.neutron [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:08:06 np0005465988 nova_compute[236126]: 2025-10-02 12:08:06.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.126 2 DEBUG nova.network.neutron [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.186 2 DEBUG nova.network.neutron [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.205 2 DEBUG oslo_concurrency.lockutils [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Releasing lock "refresh_cache-6536d247-f08a-47a5-8be9-cfbf5481312c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.206 2 DEBUG nova.compute.manager [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:08:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:08:07 np0005465988 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000026.scope: Deactivated successfully.
Oct  2 08:08:07 np0005465988 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000026.scope: Consumed 12.798s CPU time.
Oct  2 08:08:07 np0005465988 systemd-machined[192594]: Machine qemu-16-instance-00000026 terminated.
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.560 2 DEBUG nova.network.neutron [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.579 2 DEBUG oslo_concurrency.lockutils [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Releasing lock "refresh_cache-1fd5329d-bab2-4f79-85be-dc67dc7c8df8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.581 2 DEBUG nova.compute.manager [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.636 2 INFO nova.virt.libvirt.driver [-] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Instance destroyed successfully.#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.638 2 DEBUG nova.objects.instance [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lazy-loading 'resources' on Instance uuid 6536d247-f08a-47a5-8be9-cfbf5481312c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.773 2 DEBUG nova.compute.manager [req-8e880341-48b1-44a0-91f8-d0a5e29da92d req-a8e78e2b-36f1-4b12-868d-152f6971b4b5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Received event network-vif-plugged-a8ce8a67-762d-41a0-8c12-778f66e87f3c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.774 2 DEBUG oslo_concurrency.lockutils [req-8e880341-48b1-44a0-91f8-d0a5e29da92d req-a8e78e2b-36f1-4b12-868d-152f6971b4b5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.774 2 DEBUG oslo_concurrency.lockutils [req-8e880341-48b1-44a0-91f8-d0a5e29da92d req-a8e78e2b-36f1-4b12-868d-152f6971b4b5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.775 2 DEBUG oslo_concurrency.lockutils [req-8e880341-48b1-44a0-91f8-d0a5e29da92d req-a8e78e2b-36f1-4b12-868d-152f6971b4b5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.775 2 DEBUG nova.compute.manager [req-8e880341-48b1-44a0-91f8-d0a5e29da92d req-a8e78e2b-36f1-4b12-868d-152f6971b4b5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] No waiting events found dispatching network-vif-plugged-a8ce8a67-762d-41a0-8c12-778f66e87f3c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.775 2 WARNING nova.compute.manager [req-8e880341-48b1-44a0-91f8-d0a5e29da92d req-a8e78e2b-36f1-4b12-868d-152f6971b4b5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Received unexpected event network-vif-plugged-a8ce8a67-762d-41a0-8c12-778f66e87f3c for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:08:07 np0005465988 nova_compute[236126]: 2025-10-02 12:08:07.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:07.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:08 np0005465988 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000027.scope: Deactivated successfully.
Oct  2 08:08:08 np0005465988 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000027.scope: Consumed 13.568s CPU time.
Oct  2 08:08:08 np0005465988 systemd-machined[192594]: Machine qemu-15-instance-00000027 terminated.
Oct  2 08:08:08 np0005465988 nova_compute[236126]: 2025-10-02 12:08:08.214 2 INFO nova.virt.libvirt.driver [-] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Instance destroyed successfully.#033[00m
Oct  2 08:08:08 np0005465988 nova_compute[236126]: 2025-10-02 12:08:08.215 2 DEBUG nova.objects.instance [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lazy-loading 'resources' on Instance uuid 1fd5329d-bab2-4f79-85be-dc67dc7c8df8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:08.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:08 np0005465988 nova_compute[236126]: 2025-10-02 12:08:08.628 2 DEBUG oslo_concurrency.lockutils [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Acquiring lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:08 np0005465988 nova_compute[236126]: 2025-10-02 12:08:08.629 2 DEBUG oslo_concurrency.lockutils [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:08 np0005465988 nova_compute[236126]: 2025-10-02 12:08:08.651 2 DEBUG nova.objects.instance [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lazy-loading 'flavor' on Instance uuid 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:08 np0005465988 nova_compute[236126]: 2025-10-02 12:08:08.702 2 DEBUG oslo_concurrency.lockutils [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:08 np0005465988 nova_compute[236126]: 2025-10-02 12:08:08.993 2 DEBUG oslo_concurrency.lockutils [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Acquiring lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:08 np0005465988 nova_compute[236126]: 2025-10-02 12:08:08.994 2 DEBUG oslo_concurrency.lockutils [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:08 np0005465988 nova_compute[236126]: 2025-10-02 12:08:08.995 2 INFO nova.compute.manager [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Attaching volume 0dd62c4f-a912-4779-b0b4-363755e9a330 to /dev/sdc#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.143 2 DEBUG os_brick.utils [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.145 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.160 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.160 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[023cb9a8-fb77-4a3b-b967-eeb9b30edbd7]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.162 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.172 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.173 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[41cbab91-8535-45cf-9e8a-aeca437ffc68]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.175 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.186 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.186 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[c192c939-0a3f-46c1-b0f8-399a6a06082e]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.189 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[8457dc9f-1b1f-4825-804f-215f3161f8a9]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.189 2 DEBUG oslo_concurrency.processutils [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.220 2 DEBUG oslo_concurrency.processutils [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] CMD "nvme version" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.223 2 DEBUG os_brick.initiator.connectors.lightos [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.224 2 DEBUG os_brick.initiator.connectors.lightos [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.226 2 DEBUG os_brick.initiator.connectors.lightos [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.227 2 DEBUG os_brick.utils [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] <== get_connector_properties: return (82ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.228 2 DEBUG nova.virt.block_device [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Updating existing volume attachment record: c96b81a8-7637-49cf-8068-2356c09aa2d5 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:08:09 np0005465988 nova_compute[236126]: 2025-10-02 12:08:09.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:09.814 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:08:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:09.815 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:08:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:09.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:10 np0005465988 nova_compute[236126]: 2025-10-02 12:08:10.102 2 DEBUG nova.objects.instance [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lazy-loading 'flavor' on Instance uuid 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:10 np0005465988 nova_compute[236126]: 2025-10-02 12:08:10.125 2 INFO nova.virt.libvirt.driver [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Deleting instance files /var/lib/nova/instances/8d7306df-bd40-48a7-99a7-36da8b9a67f3_del#033[00m
Oct  2 08:08:10 np0005465988 nova_compute[236126]: 2025-10-02 12:08:10.127 2 INFO nova.virt.libvirt.driver [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Deletion of /var/lib/nova/instances/8d7306df-bd40-48a7-99a7-36da8b9a67f3_del complete#033[00m
Oct  2 08:08:10 np0005465988 nova_compute[236126]: 2025-10-02 12:08:10.243 2 DEBUG nova.virt.libvirt.guest [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] attach device xml: <disk type="network" device="disk">
Oct  2 08:08:10 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:08:10 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-0dd62c4f-a912-4779-b0b4-363755e9a330">
Oct  2 08:08:10 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:08:10 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:08:10 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:08:10 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:08:10 np0005465988 nova_compute[236126]:  <auth username="openstack">
Oct  2 08:08:10 np0005465988 nova_compute[236126]:    <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:08:10 np0005465988 nova_compute[236126]:  </auth>
Oct  2 08:08:10 np0005465988 nova_compute[236126]:  <target dev="sdc" bus="scsi"/>
Oct  2 08:08:10 np0005465988 nova_compute[236126]:  <serial>0dd62c4f-a912-4779-b0b4-363755e9a330</serial>
Oct  2 08:08:10 np0005465988 nova_compute[236126]:  <address type="drive" controller="0" unit="2"/>
Oct  2 08:08:10 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:08:10 np0005465988 nova_compute[236126]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:08:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:10.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:10 np0005465988 nova_compute[236126]: 2025-10-02 12:08:10.386 2 INFO nova.compute.manager [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Took 5.12 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:08:10 np0005465988 nova_compute[236126]: 2025-10-02 12:08:10.387 2 DEBUG oslo.service.loopingcall [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:08:10 np0005465988 nova_compute[236126]: 2025-10-02 12:08:10.387 2 DEBUG nova.compute.manager [-] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:08:10 np0005465988 nova_compute[236126]: 2025-10-02 12:08:10.388 2 DEBUG nova.network.neutron [-] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:08:10 np0005465988 nova_compute[236126]: 2025-10-02 12:08:10.533 2 DEBUG nova.virt.libvirt.driver [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:08:10 np0005465988 nova_compute[236126]: 2025-10-02 12:08:10.534 2 DEBUG nova.virt.libvirt.driver [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:08:10 np0005465988 nova_compute[236126]: 2025-10-02 12:08:10.534 2 DEBUG nova.virt.libvirt.driver [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] No BDM found with device name sdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:08:10 np0005465988 nova_compute[236126]: 2025-10-02 12:08:10.535 2 DEBUG nova.virt.libvirt.driver [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] No VIF found with MAC fa:16:3e:e2:7c:9f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:08:10 np0005465988 nova_compute[236126]: 2025-10-02 12:08:10.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:11 np0005465988 nova_compute[236126]: 2025-10-02 12:08:11.357 2 INFO nova.virt.libvirt.driver [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Deleting instance files /var/lib/nova/instances/6536d247-f08a-47a5-8be9-cfbf5481312c_del#033[00m
Oct  2 08:08:11 np0005465988 nova_compute[236126]: 2025-10-02 12:08:11.359 2 INFO nova.virt.libvirt.driver [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Deletion of /var/lib/nova/instances/6536d247-f08a-47a5-8be9-cfbf5481312c_del complete#033[00m
Oct  2 08:08:11 np0005465988 nova_compute[236126]: 2025-10-02 12:08:11.638 2 INFO nova.compute.manager [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Took 4.43 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:08:11 np0005465988 nova_compute[236126]: 2025-10-02 12:08:11.639 2 DEBUG oslo.service.loopingcall [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:08:11 np0005465988 nova_compute[236126]: 2025-10-02 12:08:11.640 2 DEBUG nova.compute.manager [-] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:08:11 np0005465988 nova_compute[236126]: 2025-10-02 12:08:11.640 2 DEBUG nova.network.neutron [-] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:08:11 np0005465988 nova_compute[236126]: 2025-10-02 12:08:11.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:11.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:08:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:12.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:12 np0005465988 nova_compute[236126]: 2025-10-02 12:08:12.504 2 DEBUG nova.network.neutron [-] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:08:12 np0005465988 nova_compute[236126]: 2025-10-02 12:08:12.596 2 DEBUG nova.network.neutron [-] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:08:12 np0005465988 nova_compute[236126]: 2025-10-02 12:08:12.607 2 DEBUG oslo_concurrency.lockutils [None req-cdb998a6-b1ec-4b96-b640-d2a53053c65f 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 3.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:12 np0005465988 nova_compute[236126]: 2025-10-02 12:08:12.665 2 INFO nova.compute.manager [-] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Took 1.02 seconds to deallocate network for instance.#033[00m
Oct  2 08:08:12 np0005465988 nova_compute[236126]: 2025-10-02 12:08:12.718 2 INFO nova.virt.libvirt.driver [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Deleting instance files /var/lib/nova/instances/1fd5329d-bab2-4f79-85be-dc67dc7c8df8_del#033[00m
Oct  2 08:08:12 np0005465988 nova_compute[236126]: 2025-10-02 12:08:12.719 2 INFO nova.virt.libvirt.driver [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Deletion of /var/lib/nova/instances/1fd5329d-bab2-4f79-85be-dc67dc7c8df8_del complete#033[00m
Oct  2 08:08:12 np0005465988 nova_compute[236126]: 2025-10-02 12:08:12.817 2 DEBUG oslo_concurrency.lockutils [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:12 np0005465988 nova_compute[236126]: 2025-10-02 12:08:12.818 2 DEBUG oslo_concurrency.lockutils [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:12 np0005465988 nova_compute[236126]: 2025-10-02 12:08:12.909 2 INFO nova.compute.manager [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Took 5.33 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:08:12 np0005465988 nova_compute[236126]: 2025-10-02 12:08:12.909 2 DEBUG oslo.service.loopingcall [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:08:12 np0005465988 nova_compute[236126]: 2025-10-02 12:08:12.909 2 DEBUG nova.compute.manager [-] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:08:12 np0005465988 nova_compute[236126]: 2025-10-02 12:08:12.909 2 DEBUG nova.network.neutron [-] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:08:12 np0005465988 podman[253737]: 2025-10-02 12:08:12.926447748 +0000 UTC m=+0.087346843 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd)
Oct  2 08:08:12 np0005465988 podman[253736]: 2025-10-02 12:08:12.940464056 +0000 UTC m=+0.099427905 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:08:12 np0005465988 nova_compute[236126]: 2025-10-02 12:08:12.962 2 DEBUG oslo_concurrency.processutils [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:12 np0005465988 podman[253734]: 2025-10-02 12:08:12.967448092 +0000 UTC m=+0.127103201 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:08:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:08:13 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3746483311' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:08:13 np0005465988 nova_compute[236126]: 2025-10-02 12:08:13.418 2 DEBUG oslo_concurrency.processutils [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:13 np0005465988 nova_compute[236126]: 2025-10-02 12:08:13.426 2 DEBUG nova.compute.provider_tree [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:08:13 np0005465988 nova_compute[236126]: 2025-10-02 12:08:13.475 2 DEBUG nova.scheduler.client.report [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:08:13 np0005465988 nova_compute[236126]: 2025-10-02 12:08:13.603 2 DEBUG nova.network.neutron [-] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:08:13 np0005465988 nova_compute[236126]: 2025-10-02 12:08:13.670 2 DEBUG oslo_concurrency.lockutils [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.852s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:13 np0005465988 nova_compute[236126]: 2025-10-02 12:08:13.684 2 DEBUG nova.network.neutron [-] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:08:13 np0005465988 nova_compute[236126]: 2025-10-02 12:08:13.738 2 INFO nova.scheduler.client.report [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Deleted allocations for instance 6536d247-f08a-47a5-8be9-cfbf5481312c#033[00m
Oct  2 08:08:13 np0005465988 nova_compute[236126]: 2025-10-02 12:08:13.748 2 INFO nova.compute.manager [-] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Took 0.84 seconds to deallocate network for instance.#033[00m
Oct  2 08:08:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:13.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.019 2 DEBUG oslo_concurrency.lockutils [None req-3b39a54e-8c5f-4b69-9dc8-28f34577158c 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "6536d247-f08a-47a5-8be9-cfbf5481312c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.398s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.044 2 DEBUG oslo_concurrency.lockutils [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.045 2 DEBUG oslo_concurrency.lockutils [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.126 2 DEBUG oslo_concurrency.processutils [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:14.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.329 2 DEBUG nova.compute.manager [req-72896cd9-4830-44cb-b8d1-5e361e5bbcb7 req-87d75aa4-6aff-4123-80fc-e7848d8d4e2f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Received event network-vif-deleted-a8ce8a67-762d-41a0-8c12-778f66e87f3c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.330 2 INFO nova.compute.manager [req-72896cd9-4830-44cb-b8d1-5e361e5bbcb7 req-87d75aa4-6aff-4123-80fc-e7848d8d4e2f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Neutron deleted interface a8ce8a67-762d-41a0-8c12-778f66e87f3c; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.331 2 DEBUG nova.network.neutron [req-72896cd9-4830-44cb-b8d1-5e361e5bbcb7 req-87d75aa4-6aff-4123-80fc-e7848d8d4e2f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.335 2 DEBUG nova.network.neutron [-] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.380 2 INFO nova.compute.manager [-] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Took 3.99 seconds to deallocate network for instance.#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.393 2 DEBUG nova.compute.manager [req-72896cd9-4830-44cb-b8d1-5e361e5bbcb7 req-87d75aa4-6aff-4123-80fc-e7848d8d4e2f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Detach interface failed, port_id=a8ce8a67-762d-41a0-8c12-778f66e87f3c, reason: Instance 8d7306df-bd40-48a7-99a7-36da8b9a67f3 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:08:14 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/240215465' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.620 2 DEBUG oslo_concurrency.processutils [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.626 2 DEBUG nova.compute.provider_tree [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.657 2 DEBUG nova.scheduler.client.report [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.666 2 DEBUG oslo_concurrency.lockutils [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.667 2 DEBUG oslo_concurrency.lockutils [None req-d26a7112-90d9-4f41-b3ee-3a919ad46ffe 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Acquiring lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.668 2 DEBUG oslo_concurrency.lockutils [None req-d26a7112-90d9-4f41-b3ee-3a919ad46ffe 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.684 2 DEBUG oslo_concurrency.lockutils [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.689 2 DEBUG oslo_concurrency.lockutils [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.690 2 INFO nova.compute.manager [None req-d26a7112-90d9-4f41-b3ee-3a919ad46ffe 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Detaching volume 0dd62c4f-a912-4779-b0b4-363755e9a330#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.730 2 INFO nova.scheduler.client.report [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Deleted allocations for instance 1fd5329d-bab2-4f79-85be-dc67dc7c8df8#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.778 2 DEBUG oslo_concurrency.processutils [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.802 2 DEBUG oslo_concurrency.lockutils [None req-9aeb438c-d1f2-4070-b593-887916c181d8 199c0d9541a04c4db07e50bfba9fddb1 62aa9c47ee2841139cd7066168f59650 - - default default] Lock "1fd5329d-bab2-4f79-85be-dc67dc7c8df8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.921s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:14.816 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.878 2 INFO nova.virt.block_device [None req-d26a7112-90d9-4f41-b3ee-3a919ad46ffe 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Attempting to driver detach volume 0dd62c4f-a912-4779-b0b4-363755e9a330 from mountpoint /dev/sdc#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.889 2 DEBUG nova.virt.libvirt.driver [None req-d26a7112-90d9-4f41-b3ee-3a919ad46ffe 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Attempting to detach device sdc from instance 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.889 2 DEBUG nova.virt.libvirt.guest [None req-d26a7112-90d9-4f41-b3ee-3a919ad46ffe 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:08:14 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:08:14 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-0dd62c4f-a912-4779-b0b4-363755e9a330">
Oct  2 08:08:14 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:08:14 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:08:14 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:08:14 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:08:14 np0005465988 nova_compute[236126]:  <target dev="sdc" bus="scsi"/>
Oct  2 08:08:14 np0005465988 nova_compute[236126]:  <serial>0dd62c4f-a912-4779-b0b4-363755e9a330</serial>
Oct  2 08:08:14 np0005465988 nova_compute[236126]:  <address type="drive" controller="0" bus="0" target="0" unit="2"/>
Oct  2 08:08:14 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:08:14 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.900 2 INFO nova.virt.libvirt.driver [None req-d26a7112-90d9-4f41-b3ee-3a919ad46ffe 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Successfully detached device sdc from instance 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 from the persistent domain config.#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.900 2 DEBUG nova.virt.libvirt.driver [None req-d26a7112-90d9-4f41-b3ee-3a919ad46ffe 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] (1/8): Attempting to detach device sdc with device alias scsi0-0-0-2 from instance 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.901 2 DEBUG nova.virt.libvirt.guest [None req-d26a7112-90d9-4f41-b3ee-3a919ad46ffe 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:08:14 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:08:14 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-0dd62c4f-a912-4779-b0b4-363755e9a330">
Oct  2 08:08:14 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:08:14 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:08:14 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:08:14 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:08:14 np0005465988 nova_compute[236126]:  <target dev="sdc" bus="scsi"/>
Oct  2 08:08:14 np0005465988 nova_compute[236126]:  <serial>0dd62c4f-a912-4779-b0b4-363755e9a330</serial>
Oct  2 08:08:14 np0005465988 nova_compute[236126]:  <address type="drive" controller="0" bus="0" target="0" unit="2"/>
Oct  2 08:08:14 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:08:14 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.983 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Received event <DeviceRemovedEvent: 1759406894.9823136, 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 => scsi0-0-0-2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.984 2 DEBUG nova.virt.libvirt.driver [None req-d26a7112-90d9-4f41-b3ee-3a919ad46ffe 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Start waiting for the detach event from libvirt for device sdc with device alias scsi0-0-0-2 for instance 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Oct  2 08:08:14 np0005465988 nova_compute[236126]: 2025-10-02 12:08:14.987 2 INFO nova.virt.libvirt.driver [None req-d26a7112-90d9-4f41-b3ee-3a919ad46ffe 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Successfully detached device sdc from instance 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 from the live domain config.#033[00m
Oct  2 08:08:15 np0005465988 nova_compute[236126]: 2025-10-02 12:08:15.137 2 DEBUG nova.objects.instance [None req-d26a7112-90d9-4f41-b3ee-3a919ad46ffe 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lazy-loading 'flavor' on Instance uuid 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:15 np0005465988 nova_compute[236126]: 2025-10-02 12:08:15.198 2 DEBUG oslo_concurrency.lockutils [None req-d26a7112-90d9-4f41-b3ee-3a919ad46ffe 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.530s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:08:15 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/984186566' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:08:15 np0005465988 nova_compute[236126]: 2025-10-02 12:08:15.260 2 DEBUG oslo_concurrency.processutils [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:15 np0005465988 nova_compute[236126]: 2025-10-02 12:08:15.265 2 DEBUG nova.compute.provider_tree [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:08:15 np0005465988 nova_compute[236126]: 2025-10-02 12:08:15.286 2 DEBUG nova.scheduler.client.report [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:08:15 np0005465988 nova_compute[236126]: 2025-10-02 12:08:15.308 2 DEBUG oslo_concurrency.lockutils [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:15 np0005465988 nova_compute[236126]: 2025-10-02 12:08:15.348 2 INFO nova.scheduler.client.report [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Deleted allocations for instance 8d7306df-bd40-48a7-99a7-36da8b9a67f3#033[00m
Oct  2 08:08:15 np0005465988 nova_compute[236126]: 2025-10-02 12:08:15.444 2 DEBUG oslo_concurrency.lockutils [None req-5bff85cd-84e3-488d-a0f0-7c2e45e7d9f7 ec17c54e24584f11a5348b68d6e7ca85 7359a7dad3b849bfbf075b88f2a261b4 - - default default] Lock "8d7306df-bd40-48a7-99a7-36da8b9a67f3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:15 np0005465988 nova_compute[236126]: 2025-10-02 12:08:15.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:15.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:16.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:16 np0005465988 nova_compute[236126]: 2025-10-02 12:08:16.438 2 DEBUG oslo_concurrency.lockutils [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Acquiring lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:16 np0005465988 nova_compute[236126]: 2025-10-02 12:08:16.439 2 DEBUG oslo_concurrency.lockutils [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:16 np0005465988 nova_compute[236126]: 2025-10-02 12:08:16.439 2 DEBUG oslo_concurrency.lockutils [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Acquiring lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:16 np0005465988 nova_compute[236126]: 2025-10-02 12:08:16.440 2 DEBUG oslo_concurrency.lockutils [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:16 np0005465988 nova_compute[236126]: 2025-10-02 12:08:16.440 2 DEBUG oslo_concurrency.lockutils [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:16 np0005465988 nova_compute[236126]: 2025-10-02 12:08:16.442 2 INFO nova.compute.manager [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Terminating instance#033[00m
Oct  2 08:08:16 np0005465988 nova_compute[236126]: 2025-10-02 12:08:16.443 2 DEBUG nova.compute.manager [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:08:16 np0005465988 nova_compute[236126]: 2025-10-02 12:08:16.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:17 np0005465988 kernel: tap54af1da4-23 (unregistering): left promiscuous mode
Oct  2 08:08:17 np0005465988 NetworkManager[45041]: <info>  [1759406897.0889] device (tap54af1da4-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:17 np0005465988 ovn_controller[132601]: 2025-10-02T12:08:17Z|00089|binding|INFO|Releasing lport 54af1da4-2337-4e36-8e6e-2c36ccf43309 from this chassis (sb_readonly=0)
Oct  2 08:08:17 np0005465988 ovn_controller[132601]: 2025-10-02T12:08:17Z|00090|binding|INFO|Setting lport 54af1da4-2337-4e36-8e6e-2c36ccf43309 down in Southbound
Oct  2 08:08:17 np0005465988 ovn_controller[132601]: 2025-10-02T12:08:17Z|00091|binding|INFO|Removing iface tap54af1da4-23 ovn-installed in OVS
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:17.107 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:7c:9f 10.100.0.12'], port_security=['fa:16:3e:e2:7c:9f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1442e48b-6b8f-4c96-b9b7-909071c8ebf2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '861ae6a71574411fbcdab09902e6bcc4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '08fdd263-3375-490b-b014-c8665c6b0045', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=17f08ce4-ac84-49af-9b71-8a31a5b87a27, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=54af1da4-2337-4e36-8e6e-2c36ccf43309) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:08:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:17.109 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 54af1da4-2337-4e36-8e6e-2c36ccf43309 in datapath 92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3 unbound from our chassis#033[00m
Oct  2 08:08:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:17.111 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:08:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:17.112 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0f9412e4-1075-4c23-9a84-3b0b1f25cdb6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:17.112 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3 namespace which is not needed anymore#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:17 np0005465988 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000024.scope: Deactivated successfully.
Oct  2 08:08:17 np0005465988 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000024.scope: Consumed 14.884s CPU time.
Oct  2 08:08:17 np0005465988 systemd-machined[192594]: Machine qemu-14-instance-00000024 terminated.
Oct  2 08:08:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:08:17 np0005465988 neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3[252335]: [NOTICE]   (252339) : haproxy version is 2.8.14-c23fe91
Oct  2 08:08:17 np0005465988 neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3[252335]: [NOTICE]   (252339) : path to executable is /usr/sbin/haproxy
Oct  2 08:08:17 np0005465988 neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3[252335]: [WARNING]  (252339) : Exiting Master process...
Oct  2 08:08:17 np0005465988 neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3[252335]: [ALERT]    (252339) : Current worker (252341) exited with code 143 (Terminated)
Oct  2 08:08:17 np0005465988 neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3[252335]: [WARNING]  (252339) : All workers exited. Exiting... (0)
Oct  2 08:08:17 np0005465988 systemd[1]: libpod-d1ebd3da1ea5b8872ff5a44cb1b085725a06e2e9488b28c468404444b48bbd56.scope: Deactivated successfully.
Oct  2 08:08:17 np0005465988 podman[253920]: 2025-10-02 12:08:17.260807231 +0000 UTC m=+0.043859808 container died d1ebd3da1ea5b8872ff5a44cb1b085725a06e2e9488b28c468404444b48bbd56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.282 2 INFO nova.virt.libvirt.driver [-] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Instance destroyed successfully.#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.283 2 DEBUG nova.objects.instance [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lazy-loading 'resources' on Instance uuid 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:17 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d1ebd3da1ea5b8872ff5a44cb1b085725a06e2e9488b28c468404444b48bbd56-userdata-shm.mount: Deactivated successfully.
Oct  2 08:08:17 np0005465988 systemd[1]: var-lib-containers-storage-overlay-baf10260e13cc43c175ab574a220f02c7fdc5be7c1fee181daaff5e97699b622-merged.mount: Deactivated successfully.
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.299 2 DEBUG nova.virt.libvirt.vif [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-365234862',display_name='tempest-AttachSCSIVolumeTestJSON-server-365234862',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachscsivolumetestjson-server-365234862',id=36,image_ref='9ba2a4cf-08bb-442a-b063-4fb551df3759',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJOJlCS90Za5LbWSHMPgP+Ek8S8WBhE0f2SR8Hk2ruxpy1jfgWDX1N1DF8IoYVNVmIZKbxHjppgg2BMM4LAC0bu2V87e4yQfdKSbWwxyQEnyLjV7j1Lk4l2C7rrj3wvBCw==',key_name='tempest-keypair-519727741',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:07:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='861ae6a71574411fbcdab09902e6bcc4',ramdisk_id='',reservation_id='r-v8x0qy0m',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9ba2a4cf-08bb-442a-b063-4fb551df3759',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_scsi_model='virtio-scsi',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachSCSIVolumeTestJSON-280436056',owner_user_name='tempest-AttachSCSIVolumeTestJSON-280436056-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:07:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4666fdebee9947109da966b5c870b34e',uuid=1442e48b-6b8f-4c96-b9b7-909071c8ebf2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "address": "fa:16:3e:e2:7c:9f", "network": {"id": "92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1460458376-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "861ae6a71574411fbcdab09902e6bcc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54af1da4-23", "ovs_interfaceid": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.300 2 DEBUG nova.network.os_vif_util [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Converting VIF {"id": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "address": "fa:16:3e:e2:7c:9f", "network": {"id": "92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1460458376-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "861ae6a71574411fbcdab09902e6bcc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54af1da4-23", "ovs_interfaceid": "54af1da4-2337-4e36-8e6e-2c36ccf43309", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.301 2 DEBUG nova.network.os_vif_util [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e2:7c:9f,bridge_name='br-int',has_traffic_filtering=True,id=54af1da4-2337-4e36-8e6e-2c36ccf43309,network=Network(92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54af1da4-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:08:17 np0005465988 podman[253920]: 2025-10-02 12:08:17.301553417 +0000 UTC m=+0.084605994 container cleanup d1ebd3da1ea5b8872ff5a44cb1b085725a06e2e9488b28c468404444b48bbd56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.301 2 DEBUG os_vif [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:7c:9f,bridge_name='br-int',has_traffic_filtering=True,id=54af1da4-2337-4e36-8e6e-2c36ccf43309,network=Network(92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54af1da4-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.303 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54af1da4-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.309 2 INFO os_vif [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:7c:9f,bridge_name='br-int',has_traffic_filtering=True,id=54af1da4-2337-4e36-8e6e-2c36ccf43309,network=Network(92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54af1da4-23')#033[00m
Oct  2 08:08:17 np0005465988 systemd[1]: libpod-conmon-d1ebd3da1ea5b8872ff5a44cb1b085725a06e2e9488b28c468404444b48bbd56.scope: Deactivated successfully.
Oct  2 08:08:17 np0005465988 podman[253959]: 2025-10-02 12:08:17.368573408 +0000 UTC m=+0.041435367 container remove d1ebd3da1ea5b8872ff5a44cb1b085725a06e2e9488b28c468404444b48bbd56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:08:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:17.374 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[24ae5498-4ac9-4387-b8fd-3d1e0fef6358]: (4, ('Thu Oct  2 12:08:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3 (d1ebd3da1ea5b8872ff5a44cb1b085725a06e2e9488b28c468404444b48bbd56)\nd1ebd3da1ea5b8872ff5a44cb1b085725a06e2e9488b28c468404444b48bbd56\nThu Oct  2 12:08:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3 (d1ebd3da1ea5b8872ff5a44cb1b085725a06e2e9488b28c468404444b48bbd56)\nd1ebd3da1ea5b8872ff5a44cb1b085725a06e2e9488b28c468404444b48bbd56\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:17.376 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[81f28003-e802-4779-b83b-8940ed27fd63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:17.377 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92e5c6c5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:17 np0005465988 kernel: tap92e5c6c5-00: left promiscuous mode
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:17.396 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b50ac515-8f35-4ffb-954e-a15194bd4160]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:17.420 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b7337765-4262-4c5c-b62a-c01bee8c4e5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:17.421 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e36cbaf9-6ea3-4498-987e-147b9123ff7b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:17.437 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[506564a7-17f8-4037-8f99-1e1561c6bb4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491152, 'reachable_time': 32792, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253989, 'error': None, 'target': 'ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:17.442 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-92e5c6c5-08b8-4c1a-bfbf-8b321e6952a3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:08:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:17.442 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[693408b0-67cb-4ef7-915b-77d78f86f29c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:17 np0005465988 systemd[1]: run-netns-ovnmeta\x2d92e5c6c5\x2d08b8\x2d4c1a\x2dbfbf\x2d8b321e6952a3.mount: Deactivated successfully.
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.824 2 DEBUG nova.compute.manager [req-0def7702-ebcc-4254-8c1f-a47e03fc572d req-1033f4a8-6d02-4b1f-a597-8d54b020905a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Received event network-vif-unplugged-54af1da4-2337-4e36-8e6e-2c36ccf43309 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.824 2 DEBUG oslo_concurrency.lockutils [req-0def7702-ebcc-4254-8c1f-a47e03fc572d req-1033f4a8-6d02-4b1f-a597-8d54b020905a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.825 2 DEBUG oslo_concurrency.lockutils [req-0def7702-ebcc-4254-8c1f-a47e03fc572d req-1033f4a8-6d02-4b1f-a597-8d54b020905a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.825 2 DEBUG oslo_concurrency.lockutils [req-0def7702-ebcc-4254-8c1f-a47e03fc572d req-1033f4a8-6d02-4b1f-a597-8d54b020905a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.825 2 DEBUG nova.compute.manager [req-0def7702-ebcc-4254-8c1f-a47e03fc572d req-1033f4a8-6d02-4b1f-a597-8d54b020905a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] No waiting events found dispatching network-vif-unplugged-54af1da4-2337-4e36-8e6e-2c36ccf43309 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:08:17 np0005465988 nova_compute[236126]: 2025-10-02 12:08:17.825 2 DEBUG nova.compute.manager [req-0def7702-ebcc-4254-8c1f-a47e03fc572d req-1033f4a8-6d02-4b1f-a597-8d54b020905a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Received event network-vif-unplugged-54af1da4-2337-4e36-8e6e-2c36ccf43309 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:08:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:17.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:18.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:19 np0005465988 nova_compute[236126]: 2025-10-02 12:08:19.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:19 np0005465988 nova_compute[236126]: 2025-10-02 12:08:19.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:19.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:19 np0005465988 nova_compute[236126]: 2025-10-02 12:08:19.964 2 DEBUG nova.compute.manager [req-f007348a-6f6a-43e6-af7d-63681985315b req-b277b25e-f543-40e1-8b52-0bf2817c3b49 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Received event network-vif-plugged-54af1da4-2337-4e36-8e6e-2c36ccf43309 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:19 np0005465988 nova_compute[236126]: 2025-10-02 12:08:19.965 2 DEBUG oslo_concurrency.lockutils [req-f007348a-6f6a-43e6-af7d-63681985315b req-b277b25e-f543-40e1-8b52-0bf2817c3b49 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:19 np0005465988 nova_compute[236126]: 2025-10-02 12:08:19.965 2 DEBUG oslo_concurrency.lockutils [req-f007348a-6f6a-43e6-af7d-63681985315b req-b277b25e-f543-40e1-8b52-0bf2817c3b49 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:19 np0005465988 nova_compute[236126]: 2025-10-02 12:08:19.965 2 DEBUG oslo_concurrency.lockutils [req-f007348a-6f6a-43e6-af7d-63681985315b req-b277b25e-f543-40e1-8b52-0bf2817c3b49 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:19 np0005465988 nova_compute[236126]: 2025-10-02 12:08:19.965 2 DEBUG nova.compute.manager [req-f007348a-6f6a-43e6-af7d-63681985315b req-b277b25e-f543-40e1-8b52-0bf2817c3b49 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] No waiting events found dispatching network-vif-plugged-54af1da4-2337-4e36-8e6e-2c36ccf43309 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:08:19 np0005465988 nova_compute[236126]: 2025-10-02 12:08:19.966 2 WARNING nova.compute.manager [req-f007348a-6f6a-43e6-af7d-63681985315b req-b277b25e-f543-40e1-8b52-0bf2817c3b49 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Received unexpected event network-vif-plugged-54af1da4-2337-4e36-8e6e-2c36ccf43309 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:08:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:20.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:20 np0005465988 nova_compute[236126]: 2025-10-02 12:08:20.518 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406885.5169332, 8d7306df-bd40-48a7-99a7-36da8b9a67f3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:08:20 np0005465988 nova_compute[236126]: 2025-10-02 12:08:20.519 2 INFO nova.compute.manager [-] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:08:20 np0005465988 nova_compute[236126]: 2025-10-02 12:08:20.537 2 DEBUG nova.compute.manager [None req-f331354d-0518-49a9-aaee-2699194c4bf0 - - - - - -] [instance: 8d7306df-bd40-48a7-99a7-36da8b9a67f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:08:20 np0005465988 nova_compute[236126]: 2025-10-02 12:08:20.792 2 INFO nova.virt.libvirt.driver [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Deleting instance files /var/lib/nova/instances/1442e48b-6b8f-4c96-b9b7-909071c8ebf2_del#033[00m
Oct  2 08:08:20 np0005465988 nova_compute[236126]: 2025-10-02 12:08:20.793 2 INFO nova.virt.libvirt.driver [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Deletion of /var/lib/nova/instances/1442e48b-6b8f-4c96-b9b7-909071c8ebf2_del complete#033[00m
Oct  2 08:08:20 np0005465988 nova_compute[236126]: 2025-10-02 12:08:20.867 2 INFO nova.compute.manager [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Took 4.42 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:08:20 np0005465988 nova_compute[236126]: 2025-10-02 12:08:20.868 2 DEBUG oslo.service.loopingcall [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:08:20 np0005465988 nova_compute[236126]: 2025-10-02 12:08:20.869 2 DEBUG nova.compute.manager [-] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:08:20 np0005465988 nova_compute[236126]: 2025-10-02 12:08:20.869 2 DEBUG nova.network.neutron [-] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:08:21 np0005465988 nova_compute[236126]: 2025-10-02 12:08:21.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:21.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.063 2 DEBUG nova.network.neutron [-] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.085 2 INFO nova.compute.manager [-] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Took 1.22 seconds to deallocate network for instance.#033[00m
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.152 2 DEBUG oslo_concurrency.lockutils [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.153 2 DEBUG oslo_concurrency.lockutils [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.202 2 DEBUG oslo_concurrency.processutils [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.274 2 DEBUG nova.compute.manager [req-957139e0-455f-4d68-b264-60f878b271bb req-eab7f67d-1dc7-4e01-957d-1bb404c193e6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Received event network-vif-deleted-54af1da4-2337-4e36-8e6e-2c36ccf43309 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:22.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.633 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406887.6317523, 6536d247-f08a-47a5-8be9-cfbf5481312c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.634 2 INFO nova.compute.manager [-] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:08:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:08:22 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2303983804' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.657 2 DEBUG nova.compute.manager [None req-b2820f51-97df-4ba6-8234-19b65f3554f6 - - - - - -] [instance: 6536d247-f08a-47a5-8be9-cfbf5481312c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.671 2 DEBUG oslo_concurrency.processutils [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.676 2 DEBUG nova.compute.provider_tree [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.690 2 DEBUG nova.scheduler.client.report [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.709 2 DEBUG oslo_concurrency.lockutils [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.739 2 INFO nova.scheduler.client.report [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Deleted allocations for instance 1442e48b-6b8f-4c96-b9b7-909071c8ebf2#033[00m
Oct  2 08:08:22 np0005465988 nova_compute[236126]: 2025-10-02 12:08:22.801 2 DEBUG oslo_concurrency.lockutils [None req-7fb0abf9-8d9b-484a-826d-ab8e749189d4 4666fdebee9947109da966b5c870b34e 861ae6a71574411fbcdab09902e6bcc4 - - default default] Lock "1442e48b-6b8f-4c96-b9b7-909071c8ebf2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.361s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:23 np0005465988 nova_compute[236126]: 2025-10-02 12:08:23.211 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406888.209718, 1fd5329d-bab2-4f79-85be-dc67dc7c8df8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:08:23 np0005465988 nova_compute[236126]: 2025-10-02 12:08:23.212 2 INFO nova.compute.manager [-] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:08:23 np0005465988 nova_compute[236126]: 2025-10-02 12:08:23.234 2 DEBUG nova.compute.manager [None req-c43d81bc-52ed-4419-b188-b4c012d90a1c - - - - - -] [instance: 1fd5329d-bab2-4f79-85be-dc67dc7c8df8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:08:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:23.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:24.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e163 e163: 3 total, 3 up, 3 in
Oct  2 08:08:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:25.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:26.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:26 np0005465988 nova_compute[236126]: 2025-10-02 12:08:26.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:08:27 np0005465988 nova_compute[236126]: 2025-10-02 12:08:27.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:27.336 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:27.336 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:27.336 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:27.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:28.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:29 np0005465988 podman[254023]: 2025-10-02 12:08:29.568300279 +0000 UTC m=+0.094226094 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:08:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:29.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:30.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:30 np0005465988 nova_compute[236126]: 2025-10-02 12:08:30.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:30 np0005465988 nova_compute[236126]: 2025-10-02 12:08:30.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:08:30 np0005465988 nova_compute[236126]: 2025-10-02 12:08:30.514 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:08:31 np0005465988 nova_compute[236126]: 2025-10-02 12:08:31.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:31.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:08:32 np0005465988 nova_compute[236126]: 2025-10-02 12:08:32.281 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406897.2809033, 1442e48b-6b8f-4c96-b9b7-909071c8ebf2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:08:32 np0005465988 nova_compute[236126]: 2025-10-02 12:08:32.282 2 INFO nova.compute.manager [-] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:08:32 np0005465988 nova_compute[236126]: 2025-10-02 12:08:32.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:32.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:32 np0005465988 nova_compute[236126]: 2025-10-02 12:08:32.314 2 DEBUG nova.compute.manager [None req-274e0c7d-54a6-489d-96b0-59b115f4c99d - - - - - -] [instance: 1442e48b-6b8f-4c96-b9b7-909071c8ebf2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:08:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e164 e164: 3 total, 3 up, 3 in
Oct  2 08:08:33 np0005465988 nova_compute[236126]: 2025-10-02 12:08:33.514 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:33 np0005465988 nova_compute[236126]: 2025-10-02 12:08:33.515 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:33.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:34.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:34 np0005465988 nova_compute[236126]: 2025-10-02 12:08:34.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:35 np0005465988 nova_compute[236126]: 2025-10-02 12:08:35.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:35 np0005465988 nova_compute[236126]: 2025-10-02 12:08:35.519 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:35 np0005465988 nova_compute[236126]: 2025-10-02 12:08:35.520 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:35 np0005465988 nova_compute[236126]: 2025-10-02 12:08:35.520 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:35 np0005465988 nova_compute[236126]: 2025-10-02 12:08:35.520 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:08:35 np0005465988 nova_compute[236126]: 2025-10-02 12:08:35.520 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:35.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:08:36 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3701618906' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:08:36 np0005465988 nova_compute[236126]: 2025-10-02 12:08:36.079 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.559s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:36 np0005465988 nova_compute[236126]: 2025-10-02 12:08:36.253 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:08:36 np0005465988 nova_compute[236126]: 2025-10-02 12:08:36.254 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4791MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:08:36 np0005465988 nova_compute[236126]: 2025-10-02 12:08:36.254 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:36 np0005465988 nova_compute[236126]: 2025-10-02 12:08:36.254 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:36.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:36 np0005465988 nova_compute[236126]: 2025-10-02 12:08:36.692 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:08:36 np0005465988 nova_compute[236126]: 2025-10-02 12:08:36.692 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:08:36 np0005465988 nova_compute[236126]: 2025-10-02 12:08:36.818 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:36 np0005465988 nova_compute[236126]: 2025-10-02 12:08:36.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:08:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:08:37 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/920380949' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:08:37 np0005465988 nova_compute[236126]: 2025-10-02 12:08:37.280 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:37 np0005465988 nova_compute[236126]: 2025-10-02 12:08:37.286 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:08:37 np0005465988 nova_compute[236126]: 2025-10-02 12:08:37.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:37 np0005465988 nova_compute[236126]: 2025-10-02 12:08:37.337 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:08:37 np0005465988 nova_compute[236126]: 2025-10-02 12:08:37.518 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:08:37 np0005465988 nova_compute[236126]: 2025-10-02 12:08:37.519 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:37.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:38.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:39 np0005465988 nova_compute[236126]: 2025-10-02 12:08:39.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:39 np0005465988 nova_compute[236126]: 2025-10-02 12:08:39.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:39 np0005465988 nova_compute[236126]: 2025-10-02 12:08:39.476 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:39 np0005465988 nova_compute[236126]: 2025-10-02 12:08:39.476 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:08:39 np0005465988 nova_compute[236126]: 2025-10-02 12:08:39.477 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:39 np0005465988 nova_compute[236126]: 2025-10-02 12:08:39.477 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:08:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:39.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:40.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:41 np0005465988 nova_compute[236126]: 2025-10-02 12:08:41.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:41 np0005465988 nova_compute[236126]: 2025-10-02 12:08:41.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:41 np0005465988 nova_compute[236126]: 2025-10-02 12:08:41.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:08:41 np0005465988 nova_compute[236126]: 2025-10-02 12:08:41.497 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:08:41 np0005465988 nova_compute[236126]: 2025-10-02 12:08:41.498 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:41 np0005465988 nova_compute[236126]: 2025-10-02 12:08:41.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:41.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:08:42 np0005465988 nova_compute[236126]: 2025-10-02 12:08:42.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:42.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:43 np0005465988 podman[254146]: 2025-10-02 12:08:43.552757581 +0000 UTC m=+0.078512606 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:08:43 np0005465988 podman[254145]: 2025-10-02 12:08:43.580731875 +0000 UTC m=+0.108041415 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 08:08:43 np0005465988 podman[254144]: 2025-10-02 12:08:43.581634172 +0000 UTC m=+0.111123196 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:08:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:43.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:44.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e165 e165: 3 total, 3 up, 3 in
Oct  2 08:08:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:45.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:46.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:46 np0005465988 nova_compute[236126]: 2025-10-02 12:08:46.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:08:47 np0005465988 nova_compute[236126]: 2025-10-02 12:08:47.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e166 e166: 3 total, 3 up, 3 in
Oct  2 08:08:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:47.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:48.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.553036) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406929553083, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2699, "num_deletes": 505, "total_data_size": 5439719, "memory_usage": 5530528, "flush_reason": "Manual Compaction"}
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406929567910, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 3338723, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28216, "largest_seqno": 30909, "table_properties": {"data_size": 3328772, "index_size": 5677, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3205, "raw_key_size": 25702, "raw_average_key_size": 20, "raw_value_size": 3306237, "raw_average_value_size": 2630, "num_data_blocks": 247, "num_entries": 1257, "num_filter_entries": 1257, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759406735, "oldest_key_time": 1759406735, "file_creation_time": 1759406929, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 14927 microseconds, and 8454 cpu microseconds.
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.567958) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 3338723 bytes OK
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.567981) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.569787) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.569815) EVENT_LOG_v1 {"time_micros": 1759406929569797, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.569836) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 5426969, prev total WAL file size 5426969, number of live WAL files 2.
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.571343) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(3260KB)], [57(10MB)]
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406929571415, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 13998131, "oldest_snapshot_seqno": -1}
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5375 keys, 8582965 bytes, temperature: kUnknown
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406929629476, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 8582965, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8547525, "index_size": 20930, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13445, "raw_key_size": 137178, "raw_average_key_size": 25, "raw_value_size": 8451175, "raw_average_value_size": 1572, "num_data_blocks": 843, "num_entries": 5375, "num_filter_entries": 5375, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759406929, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.629868) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 8582965 bytes
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.632185) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 240.5 rd, 147.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 10.2 +0.0 blob) out(8.2 +0.0 blob), read-write-amplify(6.8) write-amplify(2.6) OK, records in: 6387, records dropped: 1012 output_compression: NoCompression
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.632218) EVENT_LOG_v1 {"time_micros": 1759406929632202, "job": 34, "event": "compaction_finished", "compaction_time_micros": 58196, "compaction_time_cpu_micros": 38854, "output_level": 6, "num_output_files": 1, "total_output_size": 8582965, "num_input_records": 6387, "num_output_records": 5375, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406929633552, "job": 34, "event": "table_file_deletion", "file_number": 59}
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759406929637466, "job": 34, "event": "table_file_deletion", "file_number": 57}
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.571187) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.637551) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.637560) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.637563) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.637567) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:08:49 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:08:49.637570) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:08:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:50.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:50.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:51 np0005465988 nova_compute[236126]: 2025-10-02 12:08:51.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:52.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:08:52 np0005465988 nova_compute[236126]: 2025-10-02 12:08:52.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:52.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:08:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:08:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e167 e167: 3 total, 3 up, 3 in
Oct  2 08:08:53 np0005465988 nova_compute[236126]: 2025-10-02 12:08:53.472 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Acquiring lock "58fc5652-5ae7-4845-af58-9a439200cce0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:53 np0005465988 nova_compute[236126]: 2025-10-02 12:08:53.472 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "58fc5652-5ae7-4845-af58-9a439200cce0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:53 np0005465988 nova_compute[236126]: 2025-10-02 12:08:53.495 2 DEBUG nova.compute.manager [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:08:53 np0005465988 nova_compute[236126]: 2025-10-02 12:08:53.598 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:53 np0005465988 nova_compute[236126]: 2025-10-02 12:08:53.599 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:53 np0005465988 nova_compute[236126]: 2025-10-02 12:08:53.605 2 DEBUG nova.virt.hardware [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:08:53 np0005465988 nova_compute[236126]: 2025-10-02 12:08:53.605 2 INFO nova.compute.claims [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:08:53 np0005465988 nova_compute[236126]: 2025-10-02 12:08:53.764 2 DEBUG oslo_concurrency.processutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:54.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:08:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2570955571' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.257 2 DEBUG oslo_concurrency.processutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.263 2 DEBUG nova.compute.provider_tree [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.277 2 DEBUG nova.scheduler.client.report [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.305 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.306 2 DEBUG nova.compute.manager [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:08:54 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:08:54 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:08:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:54.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.387 2 DEBUG nova.compute.manager [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.388 2 DEBUG nova.network.neutron [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.431 2 INFO nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.453 2 DEBUG nova.compute.manager [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.573 2 DEBUG nova.compute.manager [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.575 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.576 2 INFO nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Creating image(s)#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.617 2 DEBUG nova.storage.rbd_utils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] rbd image 58fc5652-5ae7-4845-af58-9a439200cce0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.669 2 DEBUG nova.storage.rbd_utils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] rbd image 58fc5652-5ae7-4845-af58-9a439200cce0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.704 2 DEBUG nova.storage.rbd_utils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] rbd image 58fc5652-5ae7-4845-af58-9a439200cce0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.708 2 DEBUG oslo_concurrency.processutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.740 2 DEBUG nova.policy [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '245477e4901945099a0da748199456bc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d9c4d04247d43b086698f34cdea3ffb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.802 2 DEBUG oslo_concurrency.processutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.804 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.805 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.805 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.844 2 DEBUG nova.storage.rbd_utils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] rbd image 58fc5652-5ae7-4845-af58-9a439200cce0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:08:54 np0005465988 nova_compute[236126]: 2025-10-02 12:08:54.848 2 DEBUG oslo_concurrency.processutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 58fc5652-5ae7-4845-af58-9a439200cce0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:55 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Oct  2 08:08:55 np0005465988 nova_compute[236126]: 2025-10-02 12:08:55.151 2 DEBUG oslo_concurrency.processutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 58fc5652-5ae7-4845-af58-9a439200cce0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.303s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:55 np0005465988 nova_compute[236126]: 2025-10-02 12:08:55.260 2 DEBUG nova.storage.rbd_utils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] resizing rbd image 58fc5652-5ae7-4845-af58-9a439200cce0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:08:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:08:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:08:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:08:55 np0005465988 nova_compute[236126]: 2025-10-02 12:08:55.373 2 DEBUG nova.objects.instance [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lazy-loading 'migration_context' on Instance uuid 58fc5652-5ae7-4845-af58-9a439200cce0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:55 np0005465988 nova_compute[236126]: 2025-10-02 12:08:55.397 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:08:55 np0005465988 nova_compute[236126]: 2025-10-02 12:08:55.398 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Ensure instance console log exists: /var/lib/nova/instances/58fc5652-5ae7-4845-af58-9a439200cce0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:08:55 np0005465988 nova_compute[236126]: 2025-10-02 12:08:55.398 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:55 np0005465988 nova_compute[236126]: 2025-10-02 12:08:55.399 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:55 np0005465988 nova_compute[236126]: 2025-10-02 12:08:55.399 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:55.794 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:08:55 np0005465988 nova_compute[236126]: 2025-10-02 12:08:55.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:08:55.797 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:08:55 np0005465988 nova_compute[236126]: 2025-10-02 12:08:55.891 2 DEBUG nova.network.neutron [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Successfully created port: 9c5ec381-6cd0-4a47-b169-78f136b0fd4a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:08:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:56.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:56.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:56 np0005465988 nova_compute[236126]: 2025-10-02 12:08:56.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:08:57 np0005465988 nova_compute[236126]: 2025-10-02 12:08:57.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:57 np0005465988 nova_compute[236126]: 2025-10-02 12:08:57.627 2 DEBUG nova.network.neutron [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Successfully updated port: 9c5ec381-6cd0-4a47-b169-78f136b0fd4a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:08:57 np0005465988 nova_compute[236126]: 2025-10-02 12:08:57.649 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Acquiring lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:08:57 np0005465988 nova_compute[236126]: 2025-10-02 12:08:57.649 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Acquired lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:08:57 np0005465988 nova_compute[236126]: 2025-10-02 12:08:57.649 2 DEBUG nova.network.neutron [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:08:57 np0005465988 nova_compute[236126]: 2025-10-02 12:08:57.751 2 DEBUG nova.compute.manager [req-b129a6e0-518b-4429-9c92-b6bcf205881b req-8fa8ad6c-e535-4b7d-bbf0-2a4fb1aa3e99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Received event network-changed-9c5ec381-6cd0-4a47-b169-78f136b0fd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:57 np0005465988 nova_compute[236126]: 2025-10-02 12:08:57.752 2 DEBUG nova.compute.manager [req-b129a6e0-518b-4429-9c92-b6bcf205881b req-8fa8ad6c-e535-4b7d-bbf0-2a4fb1aa3e99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Refreshing instance network info cache due to event network-changed-9c5ec381-6cd0-4a47-b169-78f136b0fd4a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:08:57 np0005465988 nova_compute[236126]: 2025-10-02 12:08:57.753 2 DEBUG oslo_concurrency.lockutils [req-b129a6e0-518b-4429-9c92-b6bcf205881b req-8fa8ad6c-e535-4b7d-bbf0-2a4fb1aa3e99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:08:57 np0005465988 nova_compute[236126]: 2025-10-02 12:08:57.977 2 DEBUG nova.network.neutron [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:08:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:08:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:08:58.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:08:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:08:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:08:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:08:58.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:00.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.044 2 DEBUG nova.network.neutron [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updating instance_info_cache with network_info: [{"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.065 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Releasing lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.066 2 DEBUG nova.compute.manager [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Instance network_info: |[{"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.066 2 DEBUG oslo_concurrency.lockutils [req-b129a6e0-518b-4429-9c92-b6bcf205881b req-8fa8ad6c-e535-4b7d-bbf0-2a4fb1aa3e99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.067 2 DEBUG nova.network.neutron [req-b129a6e0-518b-4429-9c92-b6bcf205881b req-8fa8ad6c-e535-4b7d-bbf0-2a4fb1aa3e99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Refreshing network info cache for port 9c5ec381-6cd0-4a47-b169-78f136b0fd4a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.070 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Start _get_guest_xml network_info=[{"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.076 2 WARNING nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.086 2 DEBUG nova.virt.libvirt.host [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.087 2 DEBUG nova.virt.libvirt.host [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.090 2 DEBUG nova.virt.libvirt.host [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.091 2 DEBUG nova.virt.libvirt.host [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.092 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.092 2 DEBUG nova.virt.hardware [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.092 2 DEBUG nova.virt.hardware [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.093 2 DEBUG nova.virt.hardware [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.093 2 DEBUG nova.virt.hardware [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.093 2 DEBUG nova.virt.hardware [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.093 2 DEBUG nova.virt.hardware [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.093 2 DEBUG nova.virt.hardware [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.094 2 DEBUG nova.virt.hardware [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.094 2 DEBUG nova.virt.hardware [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.094 2 DEBUG nova.virt.hardware [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.094 2 DEBUG nova.virt.hardware [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.097 2 DEBUG oslo_concurrency.processutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:00.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:09:00 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1324999065' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:09:00 np0005465988 podman[254609]: 2025-10-02 12:09:00.548995137 +0000 UTC m=+0.080641708 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.563 2 DEBUG oslo_concurrency.processutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.612 2 DEBUG nova.storage.rbd_utils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] rbd image 58fc5652-5ae7-4845-af58-9a439200cce0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:09:00 np0005465988 nova_compute[236126]: 2025-10-02 12:09:00.619 2 DEBUG oslo_concurrency.processutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:09:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:09:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:09:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1399266365' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.093 2 DEBUG oslo_concurrency.processutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.096 2 DEBUG nova.virt.libvirt.vif [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:08:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1847078363',display_name='tempest-FloatingIPsAssociationTestJSON-server-1847078363',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1847078363',id=42,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1d9c4d04247d43b086698f34cdea3ffb',ramdisk_id='',reservation_id='r-82bw2ac8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1186616354',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1186616354-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:08:54Z,user_data=None,user_id='245477e4901945099a0da748199456bc',uuid=58fc5652-5ae7-4845-af58-9a439200cce0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.097 2 DEBUG nova.network.os_vif_util [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Converting VIF {"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.098 2 DEBUG nova.network.os_vif_util [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d0:06,bridge_name='br-int',has_traffic_filtering=True,id=9c5ec381-6cd0-4a47-b169-78f136b0fd4a,network=Network(0ac7c314-5717-432f-9a9d-1e92ec61cf23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c5ec381-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.100 2 DEBUG nova.objects.instance [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lazy-loading 'pci_devices' on Instance uuid 58fc5652-5ae7-4845-af58-9a439200cce0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.119 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  <uuid>58fc5652-5ae7-4845-af58-9a439200cce0</uuid>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  <name>instance-0000002a</name>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <nova:name>tempest-FloatingIPsAssociationTestJSON-server-1847078363</nova:name>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:09:00</nova:creationTime>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <nova:user uuid="245477e4901945099a0da748199456bc">tempest-FloatingIPsAssociationTestJSON-1186616354-project-member</nova:user>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <nova:project uuid="1d9c4d04247d43b086698f34cdea3ffb">tempest-FloatingIPsAssociationTestJSON-1186616354</nova:project>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <nova:port uuid="9c5ec381-6cd0-4a47-b169-78f136b0fd4a">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <entry name="serial">58fc5652-5ae7-4845-af58-9a439200cce0</entry>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <entry name="uuid">58fc5652-5ae7-4845-af58-9a439200cce0</entry>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/58fc5652-5ae7-4845-af58-9a439200cce0_disk">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/58fc5652-5ae7-4845-af58-9a439200cce0_disk.config">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:e7:d0:06"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <target dev="tap9c5ec381-6c"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/58fc5652-5ae7-4845-af58-9a439200cce0/console.log" append="off"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:09:01 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:09:01 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:09:01 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:09:01 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.121 2 DEBUG nova.compute.manager [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Preparing to wait for external event network-vif-plugged-9c5ec381-6cd0-4a47-b169-78f136b0fd4a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.122 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Acquiring lock "58fc5652-5ae7-4845-af58-9a439200cce0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.122 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "58fc5652-5ae7-4845-af58-9a439200cce0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.123 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "58fc5652-5ae7-4845-af58-9a439200cce0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.124 2 DEBUG nova.virt.libvirt.vif [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:08:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1847078363',display_name='tempest-FloatingIPsAssociationTestJSON-server-1847078363',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1847078363',id=42,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1d9c4d04247d43b086698f34cdea3ffb',ramdisk_id='',reservation_id='r-82bw2ac8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1186616354',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1186616354-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:08:54Z,user_data=None,user_id='245477e4901945099a0da748199456bc',uuid=58fc5652-5ae7-4845-af58-9a439200cce0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.124 2 DEBUG nova.network.os_vif_util [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Converting VIF {"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.126 2 DEBUG nova.network.os_vif_util [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d0:06,bridge_name='br-int',has_traffic_filtering=True,id=9c5ec381-6cd0-4a47-b169-78f136b0fd4a,network=Network(0ac7c314-5717-432f-9a9d-1e92ec61cf23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c5ec381-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.127 2 DEBUG os_vif [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d0:06,bridge_name='br-int',has_traffic_filtering=True,id=9c5ec381-6cd0-4a47-b169-78f136b0fd4a,network=Network(0ac7c314-5717-432f-9a9d-1e92ec61cf23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c5ec381-6c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.129 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.130 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.136 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c5ec381-6c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.136 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9c5ec381-6c, col_values=(('external_ids', {'iface-id': '9c5ec381-6cd0-4a47-b169-78f136b0fd4a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:d0:06', 'vm-uuid': '58fc5652-5ae7-4845-af58-9a439200cce0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:01 np0005465988 NetworkManager[45041]: <info>  [1759406941.1400] manager: (tap9c5ec381-6c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.155 2 INFO os_vif [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d0:06,bridge_name='br-int',has_traffic_filtering=True,id=9c5ec381-6cd0-4a47-b169-78f136b0fd4a,network=Network(0ac7c314-5717-432f-9a9d-1e92ec61cf23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c5ec381-6c')#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.210 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.211 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.211 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] No VIF found with MAC fa:16:3e:e7:d0:06, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.212 2 INFO nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Using config drive#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.251 2 DEBUG nova.storage.rbd_utils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] rbd image 58fc5652-5ae7-4845-af58-9a439200cce0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.679 2 INFO nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Creating config drive at /var/lib/nova/instances/58fc5652-5ae7-4845-af58-9a439200cce0/disk.config#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.688 2 DEBUG oslo_concurrency.processutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/58fc5652-5ae7-4845-af58-9a439200cce0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3ngwjdcf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.826 2 DEBUG oslo_concurrency.processutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/58fc5652-5ae7-4845-af58-9a439200cce0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3ngwjdcf" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.870 2 DEBUG nova.storage.rbd_utils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] rbd image 58fc5652-5ae7-4845-af58-9a439200cce0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.875 2 DEBUG oslo_concurrency.processutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/58fc5652-5ae7-4845-af58-9a439200cce0/disk.config 58fc5652-5ae7-4845-af58-9a439200cce0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.953 2 DEBUG nova.network.neutron [req-b129a6e0-518b-4429-9c92-b6bcf205881b req-8fa8ad6c-e535-4b7d-bbf0-2a4fb1aa3e99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updated VIF entry in instance network info cache for port 9c5ec381-6cd0-4a47-b169-78f136b0fd4a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.954 2 DEBUG nova.network.neutron [req-b129a6e0-518b-4429-9c92-b6bcf205881b req-8fa8ad6c-e535-4b7d-bbf0-2a4fb1aa3e99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updating instance_info_cache with network_info: [{"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.978 2 DEBUG oslo_concurrency.lockutils [req-b129a6e0-518b-4429-9c92-b6bcf205881b req-8fa8ad6c-e535-4b7d-bbf0-2a4fb1aa3e99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:01 np0005465988 nova_compute[236126]: 2025-10-02 12:09:01.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:02.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:02 np0005465988 nova_compute[236126]: 2025-10-02 12:09:02.101 2 DEBUG oslo_concurrency.processutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/58fc5652-5ae7-4845-af58-9a439200cce0/disk.config 58fc5652-5ae7-4845-af58-9a439200cce0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.225s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:02 np0005465988 nova_compute[236126]: 2025-10-02 12:09:02.102 2 INFO nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Deleting local config drive /var/lib/nova/instances/58fc5652-5ae7-4845-af58-9a439200cce0/disk.config because it was imported into RBD.#033[00m
Oct  2 08:09:02 np0005465988 kernel: tap9c5ec381-6c: entered promiscuous mode
Oct  2 08:09:02 np0005465988 NetworkManager[45041]: <info>  [1759406942.1893] manager: (tap9c5ec381-6c): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Oct  2 08:09:02 np0005465988 nova_compute[236126]: 2025-10-02 12:09:02.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:09:02Z|00092|binding|INFO|Claiming lport 9c5ec381-6cd0-4a47-b169-78f136b0fd4a for this chassis.
Oct  2 08:09:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:09:02Z|00093|binding|INFO|9c5ec381-6cd0-4a47-b169-78f136b0fd4a: Claiming fa:16:3e:e7:d0:06 10.100.0.14
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.205 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:d0:06 10.100.0.14'], port_security=['fa:16:3e:e7:d0:06 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '58fc5652-5ae7-4845-af58-9a439200cce0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ac7c314-5717-432f-9a9d-1e92ec61cf23', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d9c4d04247d43b086698f34cdea3ffb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '34d2a56b-4b67-4c37-8020-bb8559e6c196', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5bc1f92c-d069-4844-8f9d-c573877b2411, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=9c5ec381-6cd0-4a47-b169-78f136b0fd4a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.207 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 9c5ec381-6cd0-4a47-b169-78f136b0fd4a in datapath 0ac7c314-5717-432f-9a9d-1e92ec61cf23 bound to our chassis#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.210 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0ac7c314-5717-432f-9a9d-1e92ec61cf23#033[00m
Oct  2 08:09:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:09:02 np0005465988 systemd-machined[192594]: New machine qemu-17-instance-0000002a.
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.231 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[54626858-9842-4c47-8bc8-5577643cba66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.231 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0ac7c314-51 in ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.234 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0ac7c314-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.235 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6069fc96-6a55-45b3-a5fc-9059f7159e4c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.236 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[03dadb3a-f697-45b3-b419-04acaac0feeb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.250 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[f6677560-d0c6-4eae-98fb-8b85ce5a9e69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 systemd[1]: Started Virtual Machine qemu-17-instance-0000002a.
Oct  2 08:09:02 np0005465988 systemd-udevd[254797]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.279 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d4fa9269-3e28-4420-95be-58a5eb0799a8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 NetworkManager[45041]: <info>  [1759406942.2974] device (tap9c5ec381-6c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:09:02 np0005465988 NetworkManager[45041]: <info>  [1759406942.2988] device (tap9c5ec381-6c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.311 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[a972c545-8bec-493f-b241-9fcde6a800cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 NetworkManager[45041]: <info>  [1759406942.3192] manager: (tap0ac7c314-50): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.318 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c74f6467-85f3-4743-811d-b2369471dbc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 nova_compute[236126]: 2025-10-02 12:09:02.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:02 np0005465988 systemd-udevd[254801]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:09:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:09:02Z|00094|binding|INFO|Setting lport 9c5ec381-6cd0-4a47-b169-78f136b0fd4a ovn-installed in OVS
Oct  2 08:09:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:09:02Z|00095|binding|INFO|Setting lport 9c5ec381-6cd0-4a47-b169-78f136b0fd4a up in Southbound
Oct  2 08:09:02 np0005465988 nova_compute[236126]: 2025-10-02 12:09:02.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:02.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.362 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed5c8d4-38bc-4e2e-af03-aa84dc85c3d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.365 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9e24c4ee-f0e8-4545-8445-c2db620f39dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 NetworkManager[45041]: <info>  [1759406942.3968] device (tap0ac7c314-50): carrier: link connected
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.405 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[88cb066a-7af8-4fde-b79b-9c8bc01a5568]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.428 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[893bc056-1490-453e-b32c-89315c8f26ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0ac7c314-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:ce:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500272, 'reachable_time': 27465, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254827, 'error': None, 'target': 'ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.448 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5f289b64-46a3-4106-8b3a-e749c2831008]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedc:cebd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500272, 'tstamp': 500272}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254828, 'error': None, 'target': 'ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.473 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[32622967-18e3-486f-96b5-b7ea117a304d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0ac7c314-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:ce:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500272, 'reachable_time': 27465, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 254829, 'error': None, 'target': 'ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.522 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a81d802c-9a56-4f7f-a991-97d6f08dfd09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.606 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3f664631-b628-40b4-9f05-42a29f865a0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.608 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ac7c314-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.609 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.610 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ac7c314-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:02 np0005465988 nova_compute[236126]: 2025-10-02 12:09:02.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:02 np0005465988 NetworkManager[45041]: <info>  [1759406942.6132] manager: (tap0ac7c314-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Oct  2 08:09:02 np0005465988 kernel: tap0ac7c314-50: entered promiscuous mode
Oct  2 08:09:02 np0005465988 nova_compute[236126]: 2025-10-02 12:09:02.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.618 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0ac7c314-50, col_values=(('external_ids', {'iface-id': 'bbad9949-ff39-435a-9230-c3cf2c6c1571'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:02 np0005465988 nova_compute[236126]: 2025-10-02 12:09:02.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:09:02Z|00096|binding|INFO|Releasing lport bbad9949-ff39-435a-9230-c3cf2c6c1571 from this chassis (sb_readonly=0)
Oct  2 08:09:02 np0005465988 nova_compute[236126]: 2025-10-02 12:09:02.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.647 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0ac7c314-5717-432f-9a9d-1e92ec61cf23.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0ac7c314-5717-432f-9a9d-1e92ec61cf23.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.648 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6097e2fe-9bc4-4540-a282-ec656b11835b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.649 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-0ac7c314-5717-432f-9a9d-1e92ec61cf23
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/0ac7c314-5717-432f-9a9d-1e92ec61cf23.pid.haproxy
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 0ac7c314-5717-432f-9a9d-1e92ec61cf23
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:02.653 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23', 'env', 'PROCESS_TAG=haproxy-0ac7c314-5717-432f-9a9d-1e92ec61cf23', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0ac7c314-5717-432f-9a9d-1e92ec61cf23.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:09:03 np0005465988 podman[254862]: 2025-10-02 12:09:03.048217673 +0000 UTC m=+0.043478927 container create 37b28528fbaba0dca0b32507f085102fc913e2cd5b1fbb13a0ab6ae953a6287e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:09:03 np0005465988 systemd[1]: Started libpod-conmon-37b28528fbaba0dca0b32507f085102fc913e2cd5b1fbb13a0ab6ae953a6287e.scope.
Oct  2 08:09:03 np0005465988 podman[254862]: 2025-10-02 12:09:03.026291655 +0000 UTC m=+0.021552929 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:09:03 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:09:03 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5e3e0cbda7ed6adc97468597fdcb7b6e659c8c09c52117063b6a264ea5fcd54/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:09:03 np0005465988 podman[254862]: 2025-10-02 12:09:03.149065728 +0000 UTC m=+0.144327002 container init 37b28528fbaba0dca0b32507f085102fc913e2cd5b1fbb13a0ab6ae953a6287e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:09:03 np0005465988 podman[254862]: 2025-10-02 12:09:03.159409349 +0000 UTC m=+0.154670593 container start 37b28528fbaba0dca0b32507f085102fc913e2cd5b1fbb13a0ab6ae953a6287e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:09:03 np0005465988 neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23[254878]: [NOTICE]   (254882) : New worker (254884) forked
Oct  2 08:09:03 np0005465988 neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23[254878]: [NOTICE]   (254882) : Loading success.
Oct  2 08:09:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:04.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:04.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.522 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406944.5220425, 58fc5652-5ae7-4845-af58-9a439200cce0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.523 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] VM Started (Lifecycle Event)#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.571 2 DEBUG nova.compute.manager [req-30245578-dd6a-4111-a4ef-9e25050fe88d req-072b96ad-e713-4818-8d81-e766069ee8ce d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Received event network-vif-plugged-9c5ec381-6cd0-4a47-b169-78f136b0fd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.572 2 DEBUG oslo_concurrency.lockutils [req-30245578-dd6a-4111-a4ef-9e25050fe88d req-072b96ad-e713-4818-8d81-e766069ee8ce d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "58fc5652-5ae7-4845-af58-9a439200cce0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.573 2 DEBUG oslo_concurrency.lockutils [req-30245578-dd6a-4111-a4ef-9e25050fe88d req-072b96ad-e713-4818-8d81-e766069ee8ce d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "58fc5652-5ae7-4845-af58-9a439200cce0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.574 2 DEBUG oslo_concurrency.lockutils [req-30245578-dd6a-4111-a4ef-9e25050fe88d req-072b96ad-e713-4818-8d81-e766069ee8ce d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "58fc5652-5ae7-4845-af58-9a439200cce0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.574 2 DEBUG nova.compute.manager [req-30245578-dd6a-4111-a4ef-9e25050fe88d req-072b96ad-e713-4818-8d81-e766069ee8ce d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Processing event network-vif-plugged-9c5ec381-6cd0-4a47-b169-78f136b0fd4a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.575 2 DEBUG nova.compute.manager [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.581 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.585 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.589 2 INFO nova.virt.libvirt.driver [-] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Instance spawned successfully.#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.590 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.593 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.643 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.643 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406944.523343, 58fc5652-5ae7-4845-af58-9a439200cce0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.644 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.651 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.652 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.653 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.654 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.654 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.655 2 DEBUG nova.virt.libvirt.driver [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.669 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.674 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759406944.580075, 58fc5652-5ae7-4845-af58-9a439200cce0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.675 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.751 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.755 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:09:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:04.799 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.833 2 INFO nova.compute.manager [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Took 10.26 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.834 2 DEBUG nova.compute.manager [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.847 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:09:04 np0005465988 nova_compute[236126]: 2025-10-02 12:09:04.949 2 INFO nova.compute.manager [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Took 11.38 seconds to build instance.#033[00m
Oct  2 08:09:05 np0005465988 nova_compute[236126]: 2025-10-02 12:09:05.071 2 DEBUG oslo_concurrency.lockutils [None req-781724d8-f089-40ed-903a-ce151f1475b4 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "58fc5652-5ae7-4845-af58-9a439200cce0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:06.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:06 np0005465988 nova_compute[236126]: 2025-10-02 12:09:06.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:06.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:06 np0005465988 nova_compute[236126]: 2025-10-02 12:09:06.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:09:07 np0005465988 nova_compute[236126]: 2025-10-02 12:09:07.877 2 DEBUG nova.compute.manager [req-0e30c515-31ac-4f5d-b284-2976100d2648 req-12dccb43-0650-4373-9b18-e215522c0efb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Received event network-vif-plugged-9c5ec381-6cd0-4a47-b169-78f136b0fd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:07 np0005465988 nova_compute[236126]: 2025-10-02 12:09:07.877 2 DEBUG oslo_concurrency.lockutils [req-0e30c515-31ac-4f5d-b284-2976100d2648 req-12dccb43-0650-4373-9b18-e215522c0efb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "58fc5652-5ae7-4845-af58-9a439200cce0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:07 np0005465988 nova_compute[236126]: 2025-10-02 12:09:07.878 2 DEBUG oslo_concurrency.lockutils [req-0e30c515-31ac-4f5d-b284-2976100d2648 req-12dccb43-0650-4373-9b18-e215522c0efb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "58fc5652-5ae7-4845-af58-9a439200cce0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:07 np0005465988 nova_compute[236126]: 2025-10-02 12:09:07.878 2 DEBUG oslo_concurrency.lockutils [req-0e30c515-31ac-4f5d-b284-2976100d2648 req-12dccb43-0650-4373-9b18-e215522c0efb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "58fc5652-5ae7-4845-af58-9a439200cce0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:07 np0005465988 nova_compute[236126]: 2025-10-02 12:09:07.878 2 DEBUG nova.compute.manager [req-0e30c515-31ac-4f5d-b284-2976100d2648 req-12dccb43-0650-4373-9b18-e215522c0efb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] No waiting events found dispatching network-vif-plugged-9c5ec381-6cd0-4a47-b169-78f136b0fd4a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:07 np0005465988 nova_compute[236126]: 2025-10-02 12:09:07.879 2 WARNING nova.compute.manager [req-0e30c515-31ac-4f5d-b284-2976100d2648 req-12dccb43-0650-4373-9b18-e215522c0efb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Received unexpected event network-vif-plugged-9c5ec381-6cd0-4a47-b169-78f136b0fd4a for instance with vm_state active and task_state None.#033[00m
Oct  2 08:09:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:08.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:08.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:10.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:10.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:11 np0005465988 nova_compute[236126]: 2025-10-02 12:09:11.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:12 np0005465988 nova_compute[236126]: 2025-10-02 12:09:12.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:12.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:09:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:12.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:14.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:14.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:14 np0005465988 podman[254991]: 2025-10-02 12:09:14.531962805 +0000 UTC m=+0.062237382 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:09:14 np0005465988 podman[254992]: 2025-10-02 12:09:14.532752538 +0000 UTC m=+0.061199912 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:09:14 np0005465988 podman[254990]: 2025-10-02 12:09:14.563123072 +0000 UTC m=+0.093400669 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 08:09:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:16.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:16 np0005465988 nova_compute[236126]: 2025-10-02 12:09:16.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:16.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:17 np0005465988 nova_compute[236126]: 2025-10-02 12:09:17.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:09:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:09:18Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e7:d0:06 10.100.0.14
Oct  2 08:09:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:09:18Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e7:d0:06 10.100.0.14
Oct  2 08:09:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:18.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:18.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:20.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:20.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:21 np0005465988 nova_compute[236126]: 2025-10-02 12:09:21.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:22 np0005465988 nova_compute[236126]: 2025-10-02 12:09:22.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:22.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:09:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:22.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:24.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e168 e168: 3 total, 3 up, 3 in
Oct  2 08:09:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:24.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:25 np0005465988 nova_compute[236126]: 2025-10-02 12:09:25.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:25 np0005465988 NetworkManager[45041]: <info>  [1759406965.4829] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Oct  2 08:09:25 np0005465988 NetworkManager[45041]: <info>  [1759406965.4850] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Oct  2 08:09:25 np0005465988 nova_compute[236126]: 2025-10-02 12:09:25.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:09:25Z|00097|binding|INFO|Releasing lport bbad9949-ff39-435a-9230-c3cf2c6c1571 from this chassis (sb_readonly=0)
Oct  2 08:09:25 np0005465988 nova_compute[236126]: 2025-10-02 12:09:25.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:26.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:26 np0005465988 nova_compute[236126]: 2025-10-02 12:09:26.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:26 np0005465988 nova_compute[236126]: 2025-10-02 12:09:26.318 2 DEBUG nova.compute.manager [req-286e4f72-bb26-4dc8-af9a-915c0a182cc9 req-09eef35f-9671-4007-844e-51824e2e8e76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Received event network-changed-9c5ec381-6cd0-4a47-b169-78f136b0fd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:26 np0005465988 nova_compute[236126]: 2025-10-02 12:09:26.319 2 DEBUG nova.compute.manager [req-286e4f72-bb26-4dc8-af9a-915c0a182cc9 req-09eef35f-9671-4007-844e-51824e2e8e76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Refreshing instance network info cache due to event network-changed-9c5ec381-6cd0-4a47-b169-78f136b0fd4a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:26 np0005465988 nova_compute[236126]: 2025-10-02 12:09:26.319 2 DEBUG oslo_concurrency.lockutils [req-286e4f72-bb26-4dc8-af9a-915c0a182cc9 req-09eef35f-9671-4007-844e-51824e2e8e76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:26 np0005465988 nova_compute[236126]: 2025-10-02 12:09:26.319 2 DEBUG oslo_concurrency.lockutils [req-286e4f72-bb26-4dc8-af9a-915c0a182cc9 req-09eef35f-9671-4007-844e-51824e2e8e76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:26 np0005465988 nova_compute[236126]: 2025-10-02 12:09:26.319 2 DEBUG nova.network.neutron [req-286e4f72-bb26-4dc8-af9a-915c0a182cc9 req-09eef35f-9671-4007-844e-51824e2e8e76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Refreshing network info cache for port 9c5ec381-6cd0-4a47-b169-78f136b0fd4a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:26.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:27 np0005465988 nova_compute[236126]: 2025-10-02 12:09:27.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:09:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:27.337 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:27.337 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:27.338 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:28.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:28 np0005465988 nova_compute[236126]: 2025-10-02 12:09:28.234 2 DEBUG nova.network.neutron [req-286e4f72-bb26-4dc8-af9a-915c0a182cc9 req-09eef35f-9671-4007-844e-51824e2e8e76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updated VIF entry in instance network info cache for port 9c5ec381-6cd0-4a47-b169-78f136b0fd4a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:28 np0005465988 nova_compute[236126]: 2025-10-02 12:09:28.235 2 DEBUG nova.network.neutron [req-286e4f72-bb26-4dc8-af9a-915c0a182cc9 req-09eef35f-9671-4007-844e-51824e2e8e76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updating instance_info_cache with network_info: [{"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:28 np0005465988 nova_compute[236126]: 2025-10-02 12:09:28.254 2 DEBUG oslo_concurrency.lockutils [req-286e4f72-bb26-4dc8-af9a-915c0a182cc9 req-09eef35f-9671-4007-844e-51824e2e8e76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:28.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:29 np0005465988 nova_compute[236126]: 2025-10-02 12:09:29.937 2 DEBUG nova.compute.manager [req-76bcebfb-bda7-42d3-9ff7-049e6e3fb7c9 req-009a56a2-339c-448f-a744-1970fb9fd464 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Received event network-changed-9c5ec381-6cd0-4a47-b169-78f136b0fd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:29 np0005465988 nova_compute[236126]: 2025-10-02 12:09:29.938 2 DEBUG nova.compute.manager [req-76bcebfb-bda7-42d3-9ff7-049e6e3fb7c9 req-009a56a2-339c-448f-a744-1970fb9fd464 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Refreshing instance network info cache due to event network-changed-9c5ec381-6cd0-4a47-b169-78f136b0fd4a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:29 np0005465988 nova_compute[236126]: 2025-10-02 12:09:29.938 2 DEBUG oslo_concurrency.lockutils [req-76bcebfb-bda7-42d3-9ff7-049e6e3fb7c9 req-009a56a2-339c-448f-a744-1970fb9fd464 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:29 np0005465988 nova_compute[236126]: 2025-10-02 12:09:29.939 2 DEBUG oslo_concurrency.lockutils [req-76bcebfb-bda7-42d3-9ff7-049e6e3fb7c9 req-009a56a2-339c-448f-a744-1970fb9fd464 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:29 np0005465988 nova_compute[236126]: 2025-10-02 12:09:29.939 2 DEBUG nova.network.neutron [req-76bcebfb-bda7-42d3-9ff7-049e6e3fb7c9 req-009a56a2-339c-448f-a744-1970fb9fd464 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Refreshing network info cache for port 9c5ec381-6cd0-4a47-b169-78f136b0fd4a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:30.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:30.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:31 np0005465988 nova_compute[236126]: 2025-10-02 12:09:31.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:31 np0005465988 podman[255062]: 2025-10-02 12:09:31.523739215 +0000 UTC m=+0.056814335 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:09:32 np0005465988 nova_compute[236126]: 2025-10-02 12:09:32.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:32.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:09:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:32.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:34.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:34 np0005465988 nova_compute[236126]: 2025-10-02 12:09:34.239 2 DEBUG nova.network.neutron [req-76bcebfb-bda7-42d3-9ff7-049e6e3fb7c9 req-009a56a2-339c-448f-a744-1970fb9fd464 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updated VIF entry in instance network info cache for port 9c5ec381-6cd0-4a47-b169-78f136b0fd4a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:34 np0005465988 nova_compute[236126]: 2025-10-02 12:09:34.240 2 DEBUG nova.network.neutron [req-76bcebfb-bda7-42d3-9ff7-049e6e3fb7c9 req-009a56a2-339c-448f-a744-1970fb9fd464 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updating instance_info_cache with network_info: [{"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:34 np0005465988 nova_compute[236126]: 2025-10-02 12:09:34.275 2 DEBUG oslo_concurrency.lockutils [req-76bcebfb-bda7-42d3-9ff7-049e6e3fb7c9 req-009a56a2-339c-448f-a744-1970fb9fd464 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:34.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:34 np0005465988 nova_compute[236126]: 2025-10-02 12:09:34.484 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:34 np0005465988 nova_compute[236126]: 2025-10-02 12:09:34.485 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:35 np0005465988 nova_compute[236126]: 2025-10-02 12:09:35.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:35 np0005465988 nova_compute[236126]: 2025-10-02 12:09:35.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:35 np0005465988 nova_compute[236126]: 2025-10-02 12:09:35.547 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:35 np0005465988 nova_compute[236126]: 2025-10-02 12:09:35.548 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:35 np0005465988 nova_compute[236126]: 2025-10-02 12:09:35.548 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:35 np0005465988 nova_compute[236126]: 2025-10-02 12:09:35.548 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:09:35 np0005465988 nova_compute[236126]: 2025-10-02 12:09:35.548 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:09:36 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1242565607' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:09:36 np0005465988 nova_compute[236126]: 2025-10-02 12:09:36.042 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:36.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:36 np0005465988 nova_compute[236126]: 2025-10-02 12:09:36.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:36.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:36 np0005465988 nova_compute[236126]: 2025-10-02 12:09:36.493 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000002a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:09:36 np0005465988 nova_compute[236126]: 2025-10-02 12:09:36.494 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000002a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:09:36 np0005465988 nova_compute[236126]: 2025-10-02 12:09:36.636 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:09:36 np0005465988 nova_compute[236126]: 2025-10-02 12:09:36.637 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4561MB free_disk=20.880279541015625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:09:36 np0005465988 nova_compute[236126]: 2025-10-02 12:09:36.637 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:36 np0005465988 nova_compute[236126]: 2025-10-02 12:09:36.638 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:36 np0005465988 nova_compute[236126]: 2025-10-02 12:09:36.911 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 58fc5652-5ae7-4845-af58-9a439200cce0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:09:36 np0005465988 nova_compute[236126]: 2025-10-02 12:09:36.911 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:09:36 np0005465988 nova_compute[236126]: 2025-10-02 12:09:36.912 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:09:36 np0005465988 nova_compute[236126]: 2025-10-02 12:09:36.956 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:09:37 np0005465988 nova_compute[236126]: 2025-10-02 12:09:37.020 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:09:37 np0005465988 nova_compute[236126]: 2025-10-02 12:09:37.021 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:09:37 np0005465988 nova_compute[236126]: 2025-10-02 12:09:37.040 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:09:37 np0005465988 nova_compute[236126]: 2025-10-02 12:09:37.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:37 np0005465988 nova_compute[236126]: 2025-10-02 12:09:37.077 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:09:37 np0005465988 nova_compute[236126]: 2025-10-02 12:09:37.121 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:09:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:09:37 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3815105497' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:09:37 np0005465988 nova_compute[236126]: 2025-10-02 12:09:37.594 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:37 np0005465988 nova_compute[236126]: 2025-10-02 12:09:37.604 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:09:37 np0005465988 nova_compute[236126]: 2025-10-02 12:09:37.645 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:09:37 np0005465988 nova_compute[236126]: 2025-10-02 12:09:37.709 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:09:37 np0005465988 nova_compute[236126]: 2025-10-02 12:09:37.709 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:38.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:38.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:39 np0005465988 nova_compute[236126]: 2025-10-02 12:09:39.710 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:39 np0005465988 nova_compute[236126]: 2025-10-02 12:09:39.711 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:39 np0005465988 nova_compute[236126]: 2025-10-02 12:09:39.712 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:39 np0005465988 nova_compute[236126]: 2025-10-02 12:09:39.712 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:09:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:40.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:40 np0005465988 ovn_controller[132601]: 2025-10-02T12:09:40Z|00098|binding|INFO|Releasing lport bbad9949-ff39-435a-9230-c3cf2c6c1571 from this chassis (sb_readonly=0)
Oct  2 08:09:40 np0005465988 nova_compute[236126]: 2025-10-02 12:09:40.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:09:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:40.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:09:40 np0005465988 ovn_controller[132601]: 2025-10-02T12:09:40Z|00099|binding|INFO|Releasing lport bbad9949-ff39-435a-9230-c3cf2c6c1571 from this chassis (sb_readonly=0)
Oct  2 08:09:40 np0005465988 nova_compute[236126]: 2025-10-02 12:09:40.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:41 np0005465988 nova_compute[236126]: 2025-10-02 12:09:41.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:41 np0005465988 nova_compute[236126]: 2025-10-02 12:09:41.470 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:42 np0005465988 nova_compute[236126]: 2025-10-02 12:09:42.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:42.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:09:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:42.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:43 np0005465988 nova_compute[236126]: 2025-10-02 12:09:43.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:43 np0005465988 nova_compute[236126]: 2025-10-02 12:09:43.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:09:43 np0005465988 nova_compute[236126]: 2025-10-02 12:09:43.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:09:43 np0005465988 nova_compute[236126]: 2025-10-02 12:09:43.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:43 np0005465988 NetworkManager[45041]: <info>  [1759406983.8174] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Oct  2 08:09:43 np0005465988 NetworkManager[45041]: <info>  [1759406983.8183] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Oct  2 08:09:43 np0005465988 nova_compute[236126]: 2025-10-02 12:09:43.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:43 np0005465988 ovn_controller[132601]: 2025-10-02T12:09:43Z|00100|binding|INFO|Releasing lport bbad9949-ff39-435a-9230-c3cf2c6c1571 from this chassis (sb_readonly=0)
Oct  2 08:09:43 np0005465988 nova_compute[236126]: 2025-10-02 12:09:43.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:44 np0005465988 nova_compute[236126]: 2025-10-02 12:09:44.048 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:44 np0005465988 nova_compute[236126]: 2025-10-02 12:09:44.048 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:44 np0005465988 nova_compute[236126]: 2025-10-02 12:09:44.049 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:09:44 np0005465988 nova_compute[236126]: 2025-10-02 12:09:44.049 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 58fc5652-5ae7-4845-af58-9a439200cce0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:44.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:44.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:44 np0005465988 nova_compute[236126]: 2025-10-02 12:09:44.547 2 DEBUG nova.compute.manager [req-3d6f5abf-9b19-4954-967a-17aa63b65b82 req-5f9c99e6-1316-4afb-b173-f8d2d0b485a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Received event network-changed-9c5ec381-6cd0-4a47-b169-78f136b0fd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:44 np0005465988 nova_compute[236126]: 2025-10-02 12:09:44.548 2 DEBUG nova.compute.manager [req-3d6f5abf-9b19-4954-967a-17aa63b65b82 req-5f9c99e6-1316-4afb-b173-f8d2d0b485a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Refreshing instance network info cache due to event network-changed-9c5ec381-6cd0-4a47-b169-78f136b0fd4a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:44 np0005465988 nova_compute[236126]: 2025-10-02 12:09:44.548 2 DEBUG oslo_concurrency.lockutils [req-3d6f5abf-9b19-4954-967a-17aa63b65b82 req-5f9c99e6-1316-4afb-b173-f8d2d0b485a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e169 e169: 3 total, 3 up, 3 in
Oct  2 08:09:45 np0005465988 podman[255188]: 2025-10-02 12:09:45.555840222 +0000 UTC m=+0.080904165 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:09:45 np0005465988 podman[255189]: 2025-10-02 12:09:45.569016256 +0000 UTC m=+0.084899162 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:09:45 np0005465988 podman[255187]: 2025-10-02 12:09:45.600609976 +0000 UTC m=+0.130270763 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:09:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:46.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:46 np0005465988 nova_compute[236126]: 2025-10-02 12:09:46.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:46 np0005465988 nova_compute[236126]: 2025-10-02 12:09:46.377 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updating instance_info_cache with network_info: [{"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:46 np0005465988 nova_compute[236126]: 2025-10-02 12:09:46.396 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:46 np0005465988 nova_compute[236126]: 2025-10-02 12:09:46.397 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:09:46 np0005465988 nova_compute[236126]: 2025-10-02 12:09:46.397 2 DEBUG oslo_concurrency.lockutils [req-3d6f5abf-9b19-4954-967a-17aa63b65b82 req-5f9c99e6-1316-4afb-b173-f8d2d0b485a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:46 np0005465988 nova_compute[236126]: 2025-10-02 12:09:46.397 2 DEBUG nova.network.neutron [req-3d6f5abf-9b19-4954-967a-17aa63b65b82 req-5f9c99e6-1316-4afb-b173-f8d2d0b485a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Refreshing network info cache for port 9c5ec381-6cd0-4a47-b169-78f136b0fd4a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:46.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:47 np0005465988 nova_compute[236126]: 2025-10-02 12:09:47.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:09:47 np0005465988 nova_compute[236126]: 2025-10-02 12:09:47.921 2 DEBUG nova.network.neutron [req-3d6f5abf-9b19-4954-967a-17aa63b65b82 req-5f9c99e6-1316-4afb-b173-f8d2d0b485a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updated VIF entry in instance network info cache for port 9c5ec381-6cd0-4a47-b169-78f136b0fd4a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:47 np0005465988 nova_compute[236126]: 2025-10-02 12:09:47.922 2 DEBUG nova.network.neutron [req-3d6f5abf-9b19-4954-967a-17aa63b65b82 req-5f9c99e6-1316-4afb-b173-f8d2d0b485a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updating instance_info_cache with network_info: [{"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:48 np0005465988 nova_compute[236126]: 2025-10-02 12:09:48.013 2 DEBUG oslo_concurrency.lockutils [req-3d6f5abf-9b19-4954-967a-17aa63b65b82 req-5f9c99e6-1316-4afb-b173-f8d2d0b485a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:48.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:48.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:48 np0005465988 nova_compute[236126]: 2025-10-02 12:09:48.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:49 np0005465988 nova_compute[236126]: 2025-10-02 12:09:49.682 2 DEBUG nova.compute.manager [req-5dfeb8f5-3696-4d50-a122-895211d84bc0 req-b8b2682e-485d-48d1-86d2-89d4329b7de3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Received event network-changed-9c5ec381-6cd0-4a47-b169-78f136b0fd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:49 np0005465988 nova_compute[236126]: 2025-10-02 12:09:49.682 2 DEBUG nova.compute.manager [req-5dfeb8f5-3696-4d50-a122-895211d84bc0 req-b8b2682e-485d-48d1-86d2-89d4329b7de3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Refreshing instance network info cache due to event network-changed-9c5ec381-6cd0-4a47-b169-78f136b0fd4a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:49 np0005465988 nova_compute[236126]: 2025-10-02 12:09:49.682 2 DEBUG oslo_concurrency.lockutils [req-5dfeb8f5-3696-4d50-a122-895211d84bc0 req-b8b2682e-485d-48d1-86d2-89d4329b7de3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:49 np0005465988 nova_compute[236126]: 2025-10-02 12:09:49.683 2 DEBUG oslo_concurrency.lockutils [req-5dfeb8f5-3696-4d50-a122-895211d84bc0 req-b8b2682e-485d-48d1-86d2-89d4329b7de3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:49 np0005465988 nova_compute[236126]: 2025-10-02 12:09:49.683 2 DEBUG nova.network.neutron [req-5dfeb8f5-3696-4d50-a122-895211d84bc0 req-b8b2682e-485d-48d1-86d2-89d4329b7de3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Refreshing network info cache for port 9c5ec381-6cd0-4a47-b169-78f136b0fd4a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:50.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:50.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:51 np0005465988 nova_compute[236126]: 2025-10-02 12:09:51.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:51 np0005465988 nova_compute[236126]: 2025-10-02 12:09:51.403 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:52 np0005465988 nova_compute[236126]: 2025-10-02 12:09:52.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:52.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:09:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:52.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:52 np0005465988 nova_compute[236126]: 2025-10-02 12:09:52.710 2 DEBUG nova.network.neutron [req-5dfeb8f5-3696-4d50-a122-895211d84bc0 req-b8b2682e-485d-48d1-86d2-89d4329b7de3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updated VIF entry in instance network info cache for port 9c5ec381-6cd0-4a47-b169-78f136b0fd4a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:52 np0005465988 nova_compute[236126]: 2025-10-02 12:09:52.711 2 DEBUG nova.network.neutron [req-5dfeb8f5-3696-4d50-a122-895211d84bc0 req-b8b2682e-485d-48d1-86d2-89d4329b7de3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updating instance_info_cache with network_info: [{"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:52 np0005465988 nova_compute[236126]: 2025-10-02 12:09:52.734 2 DEBUG oslo_concurrency.lockutils [req-5dfeb8f5-3696-4d50-a122-895211d84bc0 req-b8b2682e-485d-48d1-86d2-89d4329b7de3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-58fc5652-5ae7-4845-af58-9a439200cce0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e170 e170: 3 total, 3 up, 3 in
Oct  2 08:09:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:54.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.186 2 DEBUG oslo_concurrency.lockutils [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Acquiring lock "58fc5652-5ae7-4845-af58-9a439200cce0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.187 2 DEBUG oslo_concurrency.lockutils [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "58fc5652-5ae7-4845-af58-9a439200cce0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.187 2 DEBUG oslo_concurrency.lockutils [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Acquiring lock "58fc5652-5ae7-4845-af58-9a439200cce0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.188 2 DEBUG oslo_concurrency.lockutils [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "58fc5652-5ae7-4845-af58-9a439200cce0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.188 2 DEBUG oslo_concurrency.lockutils [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "58fc5652-5ae7-4845-af58-9a439200cce0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.190 2 INFO nova.compute.manager [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Terminating instance#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.192 2 DEBUG nova.compute.manager [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:09:54 np0005465988 kernel: tap9c5ec381-6c (unregistering): left promiscuous mode
Oct  2 08:09:54 np0005465988 NetworkManager[45041]: <info>  [1759406994.2659] device (tap9c5ec381-6c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:09:54Z|00101|binding|INFO|Releasing lport 9c5ec381-6cd0-4a47-b169-78f136b0fd4a from this chassis (sb_readonly=0)
Oct  2 08:09:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:09:54Z|00102|binding|INFO|Setting lport 9c5ec381-6cd0-4a47-b169-78f136b0fd4a down in Southbound
Oct  2 08:09:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:09:54Z|00103|binding|INFO|Removing iface tap9c5ec381-6c ovn-installed in OVS
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:54 np0005465988 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Oct  2 08:09:54 np0005465988 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002a.scope: Consumed 16.344s CPU time.
Oct  2 08:09:54 np0005465988 systemd-machined[192594]: Machine qemu-17-instance-0000002a terminated.
Oct  2 08:09:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:54.343 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:d0:06 10.100.0.14'], port_security=['fa:16:3e:e7:d0:06 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '58fc5652-5ae7-4845-af58-9a439200cce0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ac7c314-5717-432f-9a9d-1e92ec61cf23', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d9c4d04247d43b086698f34cdea3ffb', 'neutron:revision_number': '4', 'neutron:security_group_ids': '34d2a56b-4b67-4c37-8020-bb8559e6c196', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5bc1f92c-d069-4844-8f9d-c573877b2411, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=9c5ec381-6cd0-4a47-b169-78f136b0fd4a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:54.345 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 9c5ec381-6cd0-4a47-b169-78f136b0fd4a in datapath 0ac7c314-5717-432f-9a9d-1e92ec61cf23 unbound from our chassis#033[00m
Oct  2 08:09:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:54.348 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0ac7c314-5717-432f-9a9d-1e92ec61cf23, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:09:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:54.350 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[935fd3a2-5992-463c-9cf2-695f59cdbdf0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:54.350 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23 namespace which is not needed anymore#033[00m
Oct  2 08:09:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:54.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.437 2 INFO nova.virt.libvirt.driver [-] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Instance destroyed successfully.#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.438 2 DEBUG nova.objects.instance [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lazy-loading 'resources' on Instance uuid 58fc5652-5ae7-4845-af58-9a439200cce0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.485 2 DEBUG nova.virt.libvirt.vif [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1847078363',display_name='tempest-FloatingIPsAssociationTestJSON-server-1847078363',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1847078363',id=42,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1d9c4d04247d43b086698f34cdea3ffb',ramdisk_id='',reservation_id='r-82bw2ac8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1186616354',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1186616354-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:04Z,user_data=None,user_id='245477e4901945099a0da748199456bc',uuid=58fc5652-5ae7-4845-af58-9a439200cce0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.487 2 DEBUG nova.network.os_vif_util [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Converting VIF {"id": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "address": "fa:16:3e:e7:d0:06", "network": {"id": "0ac7c314-5717-432f-9a9d-1e92ec61cf23", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2132560831-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d9c4d04247d43b086698f34cdea3ffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c5ec381-6c", "ovs_interfaceid": "9c5ec381-6cd0-4a47-b169-78f136b0fd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.487 2 DEBUG nova.network.os_vif_util [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e7:d0:06,bridge_name='br-int',has_traffic_filtering=True,id=9c5ec381-6cd0-4a47-b169-78f136b0fd4a,network=Network(0ac7c314-5717-432f-9a9d-1e92ec61cf23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c5ec381-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.488 2 DEBUG os_vif [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:d0:06,bridge_name='br-int',has_traffic_filtering=True,id=9c5ec381-6cd0-4a47-b169-78f136b0fd4a,network=Network(0ac7c314-5717-432f-9a9d-1e92ec61cf23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c5ec381-6c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.489 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c5ec381-6c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.530 2 INFO os_vif [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:d0:06,bridge_name='br-int',has_traffic_filtering=True,id=9c5ec381-6cd0-4a47-b169-78f136b0fd4a,network=Network(0ac7c314-5717-432f-9a9d-1e92ec61cf23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c5ec381-6c')#033[00m
Oct  2 08:09:54 np0005465988 neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23[254878]: [NOTICE]   (254882) : haproxy version is 2.8.14-c23fe91
Oct  2 08:09:54 np0005465988 neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23[254878]: [NOTICE]   (254882) : path to executable is /usr/sbin/haproxy
Oct  2 08:09:54 np0005465988 neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23[254878]: [WARNING]  (254882) : Exiting Master process...
Oct  2 08:09:54 np0005465988 neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23[254878]: [WARNING]  (254882) : Exiting Master process...
Oct  2 08:09:54 np0005465988 neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23[254878]: [ALERT]    (254882) : Current worker (254884) exited with code 143 (Terminated)
Oct  2 08:09:54 np0005465988 neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23[254878]: [WARNING]  (254882) : All workers exited. Exiting... (0)
Oct  2 08:09:54 np0005465988 systemd[1]: libpod-37b28528fbaba0dca0b32507f085102fc913e2cd5b1fbb13a0ab6ae953a6287e.scope: Deactivated successfully.
Oct  2 08:09:54 np0005465988 podman[255340]: 2025-10-02 12:09:54.548845505 +0000 UTC m=+0.052966222 container died 37b28528fbaba0dca0b32507f085102fc913e2cd5b1fbb13a0ab6ae953a6287e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:09:54 np0005465988 systemd[1]: var-lib-containers-storage-overlay-d5e3e0cbda7ed6adc97468597fdcb7b6e659c8c09c52117063b6a264ea5fcd54-merged.mount: Deactivated successfully.
Oct  2 08:09:54 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-37b28528fbaba0dca0b32507f085102fc913e2cd5b1fbb13a0ab6ae953a6287e-userdata-shm.mount: Deactivated successfully.
Oct  2 08:09:54 np0005465988 podman[255340]: 2025-10-02 12:09:54.593839195 +0000 UTC m=+0.097959962 container cleanup 37b28528fbaba0dca0b32507f085102fc913e2cd5b1fbb13a0ab6ae953a6287e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:09:54 np0005465988 systemd[1]: libpod-conmon-37b28528fbaba0dca0b32507f085102fc913e2cd5b1fbb13a0ab6ae953a6287e.scope: Deactivated successfully.
Oct  2 08:09:54 np0005465988 podman[255386]: 2025-10-02 12:09:54.676682796 +0000 UTC m=+0.050759758 container remove 37b28528fbaba0dca0b32507f085102fc913e2cd5b1fbb13a0ab6ae953a6287e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:09:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:54.687 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[340392b5-2217-4761-851f-f444eccc3d0c]: (4, ('Thu Oct  2 12:09:54 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23 (37b28528fbaba0dca0b32507f085102fc913e2cd5b1fbb13a0ab6ae953a6287e)\n37b28528fbaba0dca0b32507f085102fc913e2cd5b1fbb13a0ab6ae953a6287e\nThu Oct  2 12:09:54 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23 (37b28528fbaba0dca0b32507f085102fc913e2cd5b1fbb13a0ab6ae953a6287e)\n37b28528fbaba0dca0b32507f085102fc913e2cd5b1fbb13a0ab6ae953a6287e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:54.689 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7741b2df-796e-486b-8f5f-e7fcda11c596]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:54.690 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ac7c314-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:54 np0005465988 kernel: tap0ac7c314-50: left promiscuous mode
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:54 np0005465988 nova_compute[236126]: 2025-10-02 12:09:54.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:54.717 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[13361f3c-6597-4823-bd67-0b89f108a7aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:54.746 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4be840ae-e498-47aa-8daf-07efe8538492]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:54.748 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6b620dfa-c087-44d9-aa49-3623fae5f0ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:54.768 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1ddd31a8-7128-43dc-86e9-0457c5713ec3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500263, 'reachable_time': 23500, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255401, 'error': None, 'target': 'ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:54 np0005465988 systemd[1]: run-netns-ovnmeta\x2d0ac7c314\x2d5717\x2d432f\x2d9a9d\x2d1e92ec61cf23.mount: Deactivated successfully.
Oct  2 08:09:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:54.774 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0ac7c314-5717-432f-9a9d-1e92ec61cf23 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:09:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:54.774 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc91bc1-d14f-48eb-b62e-b67b298aacae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:55 np0005465988 nova_compute[236126]: 2025-10-02 12:09:55.077 2 INFO nova.virt.libvirt.driver [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Deleting instance files /var/lib/nova/instances/58fc5652-5ae7-4845-af58-9a439200cce0_del#033[00m
Oct  2 08:09:55 np0005465988 nova_compute[236126]: 2025-10-02 12:09:55.078 2 INFO nova.virt.libvirt.driver [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Deletion of /var/lib/nova/instances/58fc5652-5ae7-4845-af58-9a439200cce0_del complete#033[00m
Oct  2 08:09:55 np0005465988 nova_compute[236126]: 2025-10-02 12:09:55.216 2 INFO nova.compute.manager [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Took 1.02 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:09:55 np0005465988 nova_compute[236126]: 2025-10-02 12:09:55.218 2 DEBUG oslo.service.loopingcall [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:09:55 np0005465988 nova_compute[236126]: 2025-10-02 12:09:55.218 2 DEBUG nova.compute.manager [-] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:09:55 np0005465988 nova_compute[236126]: 2025-10-02 12:09:55.218 2 DEBUG nova.network.neutron [-] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:09:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:56.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:56.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:57 np0005465988 nova_compute[236126]: 2025-10-02 12:09:57.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:09:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:09:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:09:58.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:09:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:09:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:09:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:09:58.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:09:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:58.475 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:58 np0005465988 nova_compute[236126]: 2025-10-02 12:09:58.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:09:58.478 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:09:58 np0005465988 nova_compute[236126]: 2025-10-02 12:09:58.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005465988 nova_compute[236126]: 2025-10-02 12:09:59.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005465988 nova_compute[236126]: 2025-10-02 12:09:59.950 2 DEBUG nova.network.neutron [-] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:59 np0005465988 nova_compute[236126]: 2025-10-02 12:09:59.984 2 DEBUG nova.compute.manager [req-ce94b2b6-049a-4e78-a945-f97bf43693f3 req-b44a6a82-5e33-4d08-9bce-4fde876e666f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Received event network-vif-deleted-9c5ec381-6cd0-4a47-b169-78f136b0fd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:59 np0005465988 nova_compute[236126]: 2025-10-02 12:09:59.984 2 INFO nova.compute.manager [req-ce94b2b6-049a-4e78-a945-f97bf43693f3 req-b44a6a82-5e33-4d08-9bce-4fde876e666f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Neutron deleted interface 9c5ec381-6cd0-4a47-b169-78f136b0fd4a; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:09:59 np0005465988 nova_compute[236126]: 2025-10-02 12:09:59.985 2 DEBUG nova.network.neutron [req-ce94b2b6-049a-4e78-a945-f97bf43693f3 req-b44a6a82-5e33-4d08-9bce-4fde876e666f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:00.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:00 np0005465988 nova_compute[236126]: 2025-10-02 12:10:00.122 2 INFO nova.compute.manager [-] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Took 4.90 seconds to deallocate network for instance.#033[00m
Oct  2 08:10:00 np0005465988 nova_compute[236126]: 2025-10-02 12:10:00.132 2 DEBUG nova.compute.manager [req-ce94b2b6-049a-4e78-a945-f97bf43693f3 req-b44a6a82-5e33-4d08-9bce-4fde876e666f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Detach interface failed, port_id=9c5ec381-6cd0-4a47-b169-78f136b0fd4a, reason: Instance 58fc5652-5ae7-4845-af58-9a439200cce0 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:10:00 np0005465988 nova_compute[236126]: 2025-10-02 12:10:00.251 2 DEBUG oslo_concurrency.lockutils [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:00 np0005465988 nova_compute[236126]: 2025-10-02 12:10:00.252 2 DEBUG oslo_concurrency.lockutils [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:00 np0005465988 nova_compute[236126]: 2025-10-02 12:10:00.350 2 DEBUG oslo_concurrency.processutils [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:00.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:00 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 08:10:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:10:00 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/139278732' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:10:00 np0005465988 nova_compute[236126]: 2025-10-02 12:10:00.888 2 DEBUG oslo_concurrency.processutils [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.538s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:00 np0005465988 nova_compute[236126]: 2025-10-02 12:10:00.894 2 DEBUG nova.compute.provider_tree [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:10:00 np0005465988 nova_compute[236126]: 2025-10-02 12:10:00.956 2 DEBUG nova.scheduler.client.report [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:10:01 np0005465988 nova_compute[236126]: 2025-10-02 12:10:01.191 2 DEBUG oslo_concurrency.lockutils [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.939s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:01 np0005465988 nova_compute[236126]: 2025-10-02 12:10:01.282 2 INFO nova.scheduler.client.report [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Deleted allocations for instance 58fc5652-5ae7-4845-af58-9a439200cce0#033[00m
Oct  2 08:10:01 np0005465988 nova_compute[236126]: 2025-10-02 12:10:01.401 2 DEBUG oslo_concurrency.lockutils [None req-06aa3a9e-4249-435c-9a99-91fca7a8d263 245477e4901945099a0da748199456bc 1d9c4d04247d43b086698f34cdea3ffb - - default default] Lock "58fc5652-5ae7-4845-af58-9a439200cce0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:02 np0005465988 nova_compute[236126]: 2025-10-02 12:10:02.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:02.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:10:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:02.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:02 np0005465988 podman[255559]: 2025-10-02 12:10:02.575290173 +0000 UTC m=+0.101659200 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:10:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:10:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:10:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:10:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:04.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:04.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:04.480 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:04 np0005465988 nova_compute[236126]: 2025-10-02 12:10:04.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:06 np0005465988 nova_compute[236126]: 2025-10-02 12:10:06.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:06.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:06.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:07 np0005465988 nova_compute[236126]: 2025-10-02 12:10:07.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:10:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:08.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:10:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:08.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:10:08 np0005465988 nova_compute[236126]: 2025-10-02 12:10:08.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:08 np0005465988 nova_compute[236126]: 2025-10-02 12:10:08.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:09 np0005465988 nova_compute[236126]: 2025-10-02 12:10:09.095 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Acquiring lock "d545e995-ceb2-43df-97c6-0549bc4d6da4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:09 np0005465988 nova_compute[236126]: 2025-10-02 12:10:09.095 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:09 np0005465988 nova_compute[236126]: 2025-10-02 12:10:09.144 2 DEBUG nova.compute.manager [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:10:09 np0005465988 nova_compute[236126]: 2025-10-02 12:10:09.277 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:09 np0005465988 nova_compute[236126]: 2025-10-02 12:10:09.278 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:09 np0005465988 nova_compute[236126]: 2025-10-02 12:10:09.287 2 DEBUG nova.virt.hardware [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:10:09 np0005465988 nova_compute[236126]: 2025-10-02 12:10:09.287 2 INFO nova.compute.claims [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:10:09 np0005465988 nova_compute[236126]: 2025-10-02 12:10:09.435 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406994.4349017, 58fc5652-5ae7-4845-af58-9a439200cce0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:10:09 np0005465988 nova_compute[236126]: 2025-10-02 12:10:09.435 2 INFO nova.compute.manager [-] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:10:09 np0005465988 nova_compute[236126]: 2025-10-02 12:10:09.475 2 DEBUG nova.compute.manager [None req-247b0ef8-2d28-4227-bb20-9db73c6d535f - - - - - -] [instance: 58fc5652-5ae7-4845-af58-9a439200cce0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:10:09 np0005465988 nova_compute[236126]: 2025-10-02 12:10:09.488 2 DEBUG oslo_concurrency.processutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:09 np0005465988 nova_compute[236126]: 2025-10-02 12:10:09.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:10:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3317979224' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:10:09 np0005465988 nova_compute[236126]: 2025-10-02 12:10:09.954 2 DEBUG oslo_concurrency.processutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:10:09 np0005465988 nova_compute[236126]: 2025-10-02 12:10:09.964 2 DEBUG nova.compute.provider_tree [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:10:10 np0005465988 nova_compute[236126]: 2025-10-02 12:10:10.036 2 DEBUG nova.scheduler.client.report [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:10:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:10.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:10:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:10:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:10.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:10 np0005465988 nova_compute[236126]: 2025-10-02 12:10:10.544 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:10:10 np0005465988 nova_compute[236126]: 2025-10-02 12:10:10.545 2 DEBUG nova.compute.manager [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:10:10 np0005465988 nova_compute[236126]: 2025-10-02 12:10:10.782 2 DEBUG nova.compute.manager [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:10:10 np0005465988 nova_compute[236126]: 2025-10-02 12:10:10.783 2 DEBUG nova.network.neutron [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:10:11 np0005465988 nova_compute[236126]: 2025-10-02 12:10:11.271 2 INFO nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:10:11 np0005465988 nova_compute[236126]: 2025-10-02 12:10:11.748 2 DEBUG nova.policy [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '981a135a24e64d0aa07512e23330974a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9e743c722ec0433e854167192b6dd567', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:10:11 np0005465988 nova_compute[236126]: 2025-10-02 12:10:11.825 2 DEBUG nova.compute.manager [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:10:12 np0005465988 nova_compute[236126]: 2025-10-02 12:10:12.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:12.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:10:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:12.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:12 np0005465988 nova_compute[236126]: 2025-10-02 12:10:12.575 2 DEBUG nova.compute.manager [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:10:12 np0005465988 nova_compute[236126]: 2025-10-02 12:10:12.577 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:10:12 np0005465988 nova_compute[236126]: 2025-10-02 12:10:12.578 2 INFO nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Creating image(s)
Oct  2 08:10:12 np0005465988 nova_compute[236126]: 2025-10-02 12:10:12.607 2 DEBUG nova.storage.rbd_utils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] rbd image d545e995-ceb2-43df-97c6-0549bc4d6da4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:10:12 np0005465988 nova_compute[236126]: 2025-10-02 12:10:12.634 2 DEBUG nova.storage.rbd_utils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] rbd image d545e995-ceb2-43df-97c6-0549bc4d6da4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:10:12 np0005465988 nova_compute[236126]: 2025-10-02 12:10:12.662 2 DEBUG nova.storage.rbd_utils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] rbd image d545e995-ceb2-43df-97c6-0549bc4d6da4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:10:12 np0005465988 nova_compute[236126]: 2025-10-02 12:10:12.666 2 DEBUG oslo_concurrency.processutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:10:12 np0005465988 nova_compute[236126]: 2025-10-02 12:10:12.737 2 DEBUG oslo_concurrency.processutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:10:12 np0005465988 nova_compute[236126]: 2025-10-02 12:10:12.738 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:12 np0005465988 nova_compute[236126]: 2025-10-02 12:10:12.740 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:12 np0005465988 nova_compute[236126]: 2025-10-02 12:10:12.740 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:10:12 np0005465988 nova_compute[236126]: 2025-10-02 12:10:12.778 2 DEBUG nova.storage.rbd_utils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] rbd image d545e995-ceb2-43df-97c6-0549bc4d6da4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:10:12 np0005465988 nova_compute[236126]: 2025-10-02 12:10:12.783 2 DEBUG oslo_concurrency.processutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 d545e995-ceb2-43df-97c6-0549bc4d6da4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:10:13 np0005465988 nova_compute[236126]: 2025-10-02 12:10:13.640 2 DEBUG nova.network.neutron [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Successfully created port: e761bc82-c642-42e2-bfe8-600863f22bf5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:10:13 np0005465988 nova_compute[236126]: 2025-10-02 12:10:13.814 2 DEBUG oslo_concurrency.processutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 d545e995-ceb2-43df-97c6-0549bc4d6da4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:10:13 np0005465988 nova_compute[236126]: 2025-10-02 12:10:13.906 2 DEBUG nova.storage.rbd_utils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] resizing rbd image d545e995-ceb2-43df-97c6-0549bc4d6da4_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct  2 08:10:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:14.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:14 np0005465988 nova_compute[236126]: 2025-10-02 12:10:14.177 2 DEBUG nova.objects.instance [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lazy-loading 'migration_context' on Instance uuid d545e995-ceb2-43df-97c6-0549bc4d6da4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:10:14 np0005465988 nova_compute[236126]: 2025-10-02 12:10:14.349 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:10:14 np0005465988 nova_compute[236126]: 2025-10-02 12:10:14.350 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Ensure instance console log exists: /var/lib/nova/instances/d545e995-ceb2-43df-97c6-0549bc4d6da4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:10:14 np0005465988 nova_compute[236126]: 2025-10-02 12:10:14.350 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:14 np0005465988 nova_compute[236126]: 2025-10-02 12:10:14.351 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:14 np0005465988 nova_compute[236126]: 2025-10-02 12:10:14.351 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:10:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:14.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:14 np0005465988 nova_compute[236126]: 2025-10-02 12:10:14.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:15 np0005465988 nova_compute[236126]: 2025-10-02 12:10:15.748 2 DEBUG nova.network.neutron [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Successfully updated port: e761bc82-c642-42e2-bfe8-600863f22bf5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:10:15 np0005465988 nova_compute[236126]: 2025-10-02 12:10:15.763 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Acquiring lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:10:15 np0005465988 nova_compute[236126]: 2025-10-02 12:10:15.764 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Acquired lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:10:15 np0005465988 nova_compute[236126]: 2025-10-02 12:10:15.764 2 DEBUG nova.network.neutron [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:10:16 np0005465988 nova_compute[236126]: 2025-10-02 12:10:16.010 2 DEBUG nova.compute.manager [req-d9cf4a9c-7ea0-4d48-a78c-e829c2756cdb req-30f517ec-cb4f-44dd-9e0d-e08465ba8f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Received event network-changed-e761bc82-c642-42e2-bfe8-600863f22bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:10:16 np0005465988 nova_compute[236126]: 2025-10-02 12:10:16.011 2 DEBUG nova.compute.manager [req-d9cf4a9c-7ea0-4d48-a78c-e829c2756cdb req-30f517ec-cb4f-44dd-9e0d-e08465ba8f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Refreshing instance network info cache due to event network-changed-e761bc82-c642-42e2-bfe8-600863f22bf5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:10:16 np0005465988 nova_compute[236126]: 2025-10-02 12:10:16.011 2 DEBUG oslo_concurrency.lockutils [req-d9cf4a9c-7ea0-4d48-a78c-e829c2756cdb req-30f517ec-cb4f-44dd-9e0d-e08465ba8f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:10:16 np0005465988 nova_compute[236126]: 2025-10-02 12:10:16.041 2 DEBUG nova.network.neutron [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:10:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:16.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:16.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:16 np0005465988 podman[255875]: 2025-10-02 12:10:16.54034195 +0000 UTC m=+0.065837117 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:10:16 np0005465988 podman[255876]: 2025-10-02 12:10:16.540986519 +0000 UTC m=+0.071285486 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:10:16 np0005465988 podman[255874]: 2025-10-02 12:10:16.570529529 +0000 UTC m=+0.108415817 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.178 2 DEBUG nova.network.neutron [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Updating instance_info_cache with network_info: [{"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.198 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Releasing lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.198 2 DEBUG nova.compute.manager [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Instance network_info: |[{"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.199 2 DEBUG oslo_concurrency.lockutils [req-d9cf4a9c-7ea0-4d48-a78c-e829c2756cdb req-30f517ec-cb4f-44dd-9e0d-e08465ba8f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.199 2 DEBUG nova.network.neutron [req-d9cf4a9c-7ea0-4d48-a78c-e829c2756cdb req-30f517ec-cb4f-44dd-9e0d-e08465ba8f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Refreshing network info cache for port e761bc82-c642-42e2-bfe8-600863f22bf5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.203 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Start _get_guest_xml network_info=[{"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.208 2 WARNING nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.216 2 DEBUG nova.virt.libvirt.host [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.217 2 DEBUG nova.virt.libvirt.host [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.224 2 DEBUG nova.virt.libvirt.host [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.225 2 DEBUG nova.virt.libvirt.host [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.228 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.229 2 DEBUG nova.virt.hardware [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.229 2 DEBUG nova.virt.hardware [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.230 2 DEBUG nova.virt.hardware [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.230 2 DEBUG nova.virt.hardware [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:10:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.231 2 DEBUG nova.virt.hardware [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.232 2 DEBUG nova.virt.hardware [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.232 2 DEBUG nova.virt.hardware [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.233 2 DEBUG nova.virt.hardware [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.233 2 DEBUG nova.virt.hardware [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.233 2 DEBUG nova.virt.hardware [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.234 2 DEBUG nova.virt.hardware [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.237 2 DEBUG oslo_concurrency.processutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:10:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2102354119' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.677 2 DEBUG oslo_concurrency.processutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.705 2 DEBUG nova.storage.rbd_utils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] rbd image d545e995-ceb2-43df-97c6-0549bc4d6da4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:10:17 np0005465988 nova_compute[236126]: 2025-10-02 12:10:17.710 2 DEBUG oslo_concurrency.processutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:10:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3144016193' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:10:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:18.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.140 2 DEBUG oslo_concurrency.processutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.141 2 DEBUG nova.virt.libvirt.vif [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:10:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-1984858235',display_name='tempest-AttachInterfacesUnderV243Test-server-1984858235',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-1984858235',id=46,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDE7cODM2NyZ3/mJcptetYP+dNehHeWqNZQsR1fFUE+tNKNOaidrTEHHBwYr+NYaYQQooVYpMlnWV5rx1dZMB4MfIwMiwQz0qls4NMFJSJFy/9UQu+BHQq1qBWeVGW3ilw==',key_name='tempest-keypair-705473978',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9e743c722ec0433e854167192b6dd567',ramdisk_id='',reservation_id='r-rbxslj03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-592969324',owner_user_name='tempest-AttachInterfacesUnderV243Test-592969324-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:10:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='981a135a24e64d0aa07512e23330974a',uuid=d545e995-ceb2-43df-97c6-0549bc4d6da4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.142 2 DEBUG nova.network.os_vif_util [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Converting VIF {"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.143 2 DEBUG nova.network.os_vif_util [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:b4:9d,bridge_name='br-int',has_traffic_filtering=True,id=e761bc82-c642-42e2-bfe8-600863f22bf5,network=Network(90bea0c5-f8b7-47cb-bc1d-0929807ff5f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape761bc82-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.144 2 DEBUG nova.objects.instance [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lazy-loading 'pci_devices' on Instance uuid d545e995-ceb2-43df-97c6-0549bc4d6da4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.175 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  <uuid>d545e995-ceb2-43df-97c6-0549bc4d6da4</uuid>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  <name>instance-0000002e</name>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <nova:name>tempest-AttachInterfacesUnderV243Test-server-1984858235</nova:name>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:10:17</nova:creationTime>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <nova:user uuid="981a135a24e64d0aa07512e23330974a">tempest-AttachInterfacesUnderV243Test-592969324-project-member</nova:user>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <nova:project uuid="9e743c722ec0433e854167192b6dd567">tempest-AttachInterfacesUnderV243Test-592969324</nova:project>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <nova:port uuid="e761bc82-c642-42e2-bfe8-600863f22bf5">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <entry name="serial">d545e995-ceb2-43df-97c6-0549bc4d6da4</entry>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <entry name="uuid">d545e995-ceb2-43df-97c6-0549bc4d6da4</entry>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/d545e995-ceb2-43df-97c6-0549bc4d6da4_disk">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/d545e995-ceb2-43df-97c6-0549bc4d6da4_disk.config">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:8c:b4:9d"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <target dev="tape761bc82-c6"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/d545e995-ceb2-43df-97c6-0549bc4d6da4/console.log" append="off"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:10:18 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:10:18 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:10:18 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:10:18 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.177 2 DEBUG nova.compute.manager [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Preparing to wait for external event network-vif-plugged-e761bc82-c642-42e2-bfe8-600863f22bf5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.177 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Acquiring lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.177 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.178 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.178 2 DEBUG nova.virt.libvirt.vif [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:10:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-1984858235',display_name='tempest-AttachInterfacesUnderV243Test-server-1984858235',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-1984858235',id=46,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDE7cODM2NyZ3/mJcptetYP+dNehHeWqNZQsR1fFUE+tNKNOaidrTEHHBwYr+NYaYQQooVYpMlnWV5rx1dZMB4MfIwMiwQz0qls4NMFJSJFy/9UQu+BHQq1qBWeVGW3ilw==',key_name='tempest-keypair-705473978',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9e743c722ec0433e854167192b6dd567',ramdisk_id='',reservation_id='r-rbxslj03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-592969324',owner_user_name='tempest-AttachInterfacesUnderV243Test-592969324-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:10:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='981a135a24e64d0aa07512e23330974a',uuid=d545e995-ceb2-43df-97c6-0549bc4d6da4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.179 2 DEBUG nova.network.os_vif_util [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Converting VIF {"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.179 2 DEBUG nova.network.os_vif_util [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:b4:9d,bridge_name='br-int',has_traffic_filtering=True,id=e761bc82-c642-42e2-bfe8-600863f22bf5,network=Network(90bea0c5-f8b7-47cb-bc1d-0929807ff5f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape761bc82-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.180 2 DEBUG os_vif [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:b4:9d,bridge_name='br-int',has_traffic_filtering=True,id=e761bc82-c642-42e2-bfe8-600863f22bf5,network=Network(90bea0c5-f8b7-47cb-bc1d-0929807ff5f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape761bc82-c6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.181 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.181 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.185 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape761bc82-c6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.186 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape761bc82-c6, col_values=(('external_ids', {'iface-id': 'e761bc82-c642-42e2-bfe8-600863f22bf5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:b4:9d', 'vm-uuid': 'd545e995-ceb2-43df-97c6-0549bc4d6da4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:18 np0005465988 NetworkManager[45041]: <info>  [1759407018.1891] manager: (tape761bc82-c6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.201 2 INFO os_vif [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:b4:9d,bridge_name='br-int',has_traffic_filtering=True,id=e761bc82-c642-42e2-bfe8-600863f22bf5,network=Network(90bea0c5-f8b7-47cb-bc1d-0929807ff5f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape761bc82-c6')#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.274 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.275 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.275 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] No VIF found with MAC fa:16:3e:8c:b4:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.276 2 INFO nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Using config drive#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.312 2 DEBUG nova.storage.rbd_utils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] rbd image d545e995-ceb2-43df-97c6-0549bc4d6da4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:10:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:18.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.852 2 INFO nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Creating config drive at /var/lib/nova/instances/d545e995-ceb2-43df-97c6-0549bc4d6da4/disk.config#033[00m
Oct  2 08:10:18 np0005465988 nova_compute[236126]: 2025-10-02 12:10:18.866 2 DEBUG oslo_concurrency.processutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d545e995-ceb2-43df-97c6-0549bc4d6da4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbofey88u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.014 2 DEBUG oslo_concurrency.processutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d545e995-ceb2-43df-97c6-0549bc4d6da4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbofey88u" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.058 2 DEBUG nova.storage.rbd_utils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] rbd image d545e995-ceb2-43df-97c6-0549bc4d6da4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.062 2 DEBUG oslo_concurrency.processutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d545e995-ceb2-43df-97c6-0549bc4d6da4/disk.config d545e995-ceb2-43df-97c6-0549bc4d6da4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.277 2 DEBUG oslo_concurrency.processutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d545e995-ceb2-43df-97c6-0549bc4d6da4/disk.config d545e995-ceb2-43df-97c6-0549bc4d6da4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.215s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.279 2 INFO nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Deleting local config drive /var/lib/nova/instances/d545e995-ceb2-43df-97c6-0549bc4d6da4/disk.config because it was imported into RBD.#033[00m
Oct  2 08:10:19 np0005465988 kernel: tape761bc82-c6: entered promiscuous mode
Oct  2 08:10:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:10:19Z|00104|binding|INFO|Claiming lport e761bc82-c642-42e2-bfe8-600863f22bf5 for this chassis.
Oct  2 08:10:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:10:19Z|00105|binding|INFO|e761bc82-c642-42e2-bfe8-600863f22bf5: Claiming fa:16:3e:8c:b4:9d 10.100.0.13
Oct  2 08:10:19 np0005465988 NetworkManager[45041]: <info>  [1759407019.3558] manager: (tape761bc82-c6): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.372 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:b4:9d 10.100.0.13'], port_security=['fa:16:3e:8c:b4:9d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'd545e995-ceb2-43df-97c6-0549bc4d6da4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9e743c722ec0433e854167192b6dd567', 'neutron:revision_number': '2', 'neutron:security_group_ids': '738265f8-af79-4ed9-adba-c93b17bd0264', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d18bbef5-47b9-48d4-bc8e-574c94fc05bc, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=e761bc82-c642-42e2-bfe8-600863f22bf5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.373 142124 INFO neutron.agent.ovn.metadata.agent [-] Port e761bc82-c642-42e2-bfe8-600863f22bf5 in datapath 90bea0c5-f8b7-47cb-bc1d-0929807ff5f6 bound to our chassis#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.374 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90bea0c5-f8b7-47cb-bc1d-0929807ff5f6#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.387 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9936b110-33d6-49f1-b277-a134ff894036]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.387 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap90bea0c5-f1 in ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.389 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap90bea0c5-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.390 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[03199512-5c78-4c18-81de-94fabd637c43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.390 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0f1a5c09-2f96-48cf-9cf1-f344796575f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 systemd-udevd[256078]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:10:19 np0005465988 systemd-machined[192594]: New machine qemu-18-instance-0000002e.
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.406 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[ca40cb8d-ba09-4a0b-9276-2060a5c93dfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 NetworkManager[45041]: <info>  [1759407019.4088] device (tape761bc82-c6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:10:19 np0005465988 NetworkManager[45041]: <info>  [1759407019.4109] device (tape761bc82-c6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.433 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3ebeea8d-223a-46a4-ae7b-42013f06baa7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:19 np0005465988 systemd[1]: Started Virtual Machine qemu-18-instance-0000002e.
Oct  2 08:10:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:10:19Z|00106|binding|INFO|Setting lport e761bc82-c642-42e2-bfe8-600863f22bf5 ovn-installed in OVS
Oct  2 08:10:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:10:19Z|00107|binding|INFO|Setting lport e761bc82-c642-42e2-bfe8-600863f22bf5 up in Southbound
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.463 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[60159e10-f24c-4fe7-ba4d-b831afdba54a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 systemd-udevd[256083]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:10:19 np0005465988 NetworkManager[45041]: <info>  [1759407019.4687] manager: (tap90bea0c5-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.467 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bd97a726-84e6-4ee1-adf1-3b6cbea7ff8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.510 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d99cbd07-642e-48a3-bb00-351ee810c006]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.514 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e253b0f2-d695-4e82-900a-713a405332ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 NetworkManager[45041]: <info>  [1759407019.5397] device (tap90bea0c5-f0): carrier: link connected
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.544 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[73e0d33d-4977-4f79-b14a-2a4007b12740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.565 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[66bb740f-f58c-4f05-92dd-d0bd39bbc546]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90bea0c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:00:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 507987, 'reachable_time': 38968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256111, 'error': None, 'target': 'ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.588 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c0dd04c2-35a4-4943-95dc-952dfc065b72]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:36'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 507987, 'tstamp': 507987}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256112, 'error': None, 'target': 'ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.613 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[80919605-8d19-4f4e-b4d2-c49bec9e45e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90bea0c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:00:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 507987, 'reachable_time': 38968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 256113, 'error': None, 'target': 'ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.658 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9b82fd47-6a5f-4fa6-a99f-e9caedcf3885]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.746 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c5b0ab64-72b3-47db-9ed8-60b22fd47a50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.748 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90bea0c5-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.749 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.750 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90bea0c5-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:19 np0005465988 NetworkManager[45041]: <info>  [1759407019.7531] manager: (tap90bea0c5-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Oct  2 08:10:19 np0005465988 kernel: tap90bea0c5-f0: entered promiscuous mode
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.757 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90bea0c5-f0, col_values=(('external_ids', {'iface-id': '92254f36-d722-4bfe-bed5-5ca6a669ba88'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:10:19Z|00108|binding|INFO|Releasing lport 92254f36-d722-4bfe-bed5-5ca6a669ba88 from this chassis (sb_readonly=0)
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.764 2 DEBUG nova.network.neutron [req-d9cf4a9c-7ea0-4d48-a78c-e829c2756cdb req-30f517ec-cb4f-44dd-9e0d-e08465ba8f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Updated VIF entry in instance network info cache for port e761bc82-c642-42e2-bfe8-600863f22bf5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.764 2 DEBUG nova.network.neutron [req-d9cf4a9c-7ea0-4d48-a78c-e829c2756cdb req-30f517ec-cb4f-44dd-9e0d-e08465ba8f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Updating instance_info_cache with network_info: [{"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.783 2 DEBUG oslo_concurrency.lockutils [req-d9cf4a9c-7ea0-4d48-a78c-e829c2756cdb req-30f517ec-cb4f-44dd-9e0d-e08465ba8f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.786 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/90bea0c5-f8b7-47cb-bc1d-0929807ff5f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/90bea0c5-f8b7-47cb-bc1d-0929807ff5f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.787 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ac8328ea-3a4f-4432-ae18-729bf1a7c735]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.788 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/90bea0c5-f8b7-47cb-bc1d-0929807ff5f6.pid.haproxy
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 90bea0c5-f8b7-47cb-bc1d-0929807ff5f6
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:10:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:19.789 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6', 'env', 'PROCESS_TAG=haproxy-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/90bea0c5-f8b7-47cb-bc1d-0929807ff5f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.875 2 DEBUG nova.compute.manager [req-59df3115-40e4-468e-acde-fe2ea940fd2f req-741489cc-6de5-4d4d-aac6-4e3171006036 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Received event network-vif-plugged-e761bc82-c642-42e2-bfe8-600863f22bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.876 2 DEBUG oslo_concurrency.lockutils [req-59df3115-40e4-468e-acde-fe2ea940fd2f req-741489cc-6de5-4d4d-aac6-4e3171006036 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.877 2 DEBUG oslo_concurrency.lockutils [req-59df3115-40e4-468e-acde-fe2ea940fd2f req-741489cc-6de5-4d4d-aac6-4e3171006036 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.877 2 DEBUG oslo_concurrency.lockutils [req-59df3115-40e4-468e-acde-fe2ea940fd2f req-741489cc-6de5-4d4d-aac6-4e3171006036 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:19 np0005465988 nova_compute[236126]: 2025-10-02 12:10:19.877 2 DEBUG nova.compute.manager [req-59df3115-40e4-468e-acde-fe2ea940fd2f req-741489cc-6de5-4d4d-aac6-4e3171006036 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Processing event network-vif-plugged-e761bc82-c642-42e2-bfe8-600863f22bf5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:10:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:20.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:20 np0005465988 podman[256144]: 2025-10-02 12:10:20.235815877 +0000 UTC m=+0.052584041 container create aa554940599364b97ee93edf168782823e1645cd4d0feaf49c055972543be58d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:10:20 np0005465988 systemd[1]: Started libpod-conmon-aa554940599364b97ee93edf168782823e1645cd4d0feaf49c055972543be58d.scope.
Oct  2 08:10:20 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:10:20 np0005465988 podman[256144]: 2025-10-02 12:10:20.208860623 +0000 UTC m=+0.025628807 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:10:20 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8cebff645f247ec9e015cfc5185c9ca67b28ca2e19229f455efdb3323168155/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:10:20 np0005465988 podman[256144]: 2025-10-02 12:10:20.320203974 +0000 UTC m=+0.136972128 container init aa554940599364b97ee93edf168782823e1645cd4d0feaf49c055972543be58d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:10:20 np0005465988 podman[256144]: 2025-10-02 12:10:20.331239055 +0000 UTC m=+0.148007239 container start aa554940599364b97ee93edf168782823e1645cd4d0feaf49c055972543be58d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:10:20 np0005465988 neutron-haproxy-ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6[256158]: [NOTICE]   (256162) : New worker (256164) forked
Oct  2 08:10:20 np0005465988 neutron-haproxy-ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6[256158]: [NOTICE]   (256162) : Loading success.
Oct  2 08:10:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:20.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.870 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407021.870047, d545e995-ceb2-43df-97c6-0549bc4d6da4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.871 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] VM Started (Lifecycle Event)#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.872 2 DEBUG nova.compute.manager [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.875 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.879 2 INFO nova.virt.libvirt.driver [-] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Instance spawned successfully.#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.879 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.915 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.922 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.923 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.923 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.924 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.925 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.926 2 DEBUG nova.virt.libvirt.driver [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.932 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.977 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.978 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407021.8701982, d545e995-ceb2-43df-97c6-0549bc4d6da4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:10:21 np0005465988 nova_compute[236126]: 2025-10-02 12:10:21.978 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.011 2 DEBUG nova.compute.manager [req-eaa210b5-6511-49f4-a88c-fa3701f2c7e4 req-37b6a6dd-122c-48ae-b70d-36c7b83ff244 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Received event network-vif-plugged-e761bc82-c642-42e2-bfe8-600863f22bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.012 2 DEBUG oslo_concurrency.lockutils [req-eaa210b5-6511-49f4-a88c-fa3701f2c7e4 req-37b6a6dd-122c-48ae-b70d-36c7b83ff244 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.013 2 DEBUG oslo_concurrency.lockutils [req-eaa210b5-6511-49f4-a88c-fa3701f2c7e4 req-37b6a6dd-122c-48ae-b70d-36c7b83ff244 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.013 2 DEBUG oslo_concurrency.lockutils [req-eaa210b5-6511-49f4-a88c-fa3701f2c7e4 req-37b6a6dd-122c-48ae-b70d-36c7b83ff244 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.014 2 DEBUG nova.compute.manager [req-eaa210b5-6511-49f4-a88c-fa3701f2c7e4 req-37b6a6dd-122c-48ae-b70d-36c7b83ff244 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] No waiting events found dispatching network-vif-plugged-e761bc82-c642-42e2-bfe8-600863f22bf5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.014 2 WARNING nova.compute.manager [req-eaa210b5-6511-49f4-a88c-fa3701f2c7e4 req-37b6a6dd-122c-48ae-b70d-36c7b83ff244 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Received unexpected event network-vif-plugged-e761bc82-c642-42e2-bfe8-600863f22bf5 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.031 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.036 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407021.8751354, d545e995-ceb2-43df-97c6-0549bc4d6da4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.036 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.045 2 INFO nova.compute.manager [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Took 9.47 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.046 2 DEBUG nova.compute.manager [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.105 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.109 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.142 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:10:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:22.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.175 2 INFO nova.compute.manager [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Took 12.94 seconds to build instance.#033[00m
Oct  2 08:10:22 np0005465988 nova_compute[236126]: 2025-10-02 12:10:22.211 2 DEBUG oslo_concurrency.lockutils [None req-2457fecb-a126-4748-810b-eb28162da0af 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:10:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:22.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:23 np0005465988 nova_compute[236126]: 2025-10-02 12:10:23.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:24.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:24.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:24 np0005465988 nova_compute[236126]: 2025-10-02 12:10:24.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:24 np0005465988 NetworkManager[45041]: <info>  [1759407024.9635] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Oct  2 08:10:24 np0005465988 NetworkManager[45041]: <info>  [1759407024.9644] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Oct  2 08:10:25 np0005465988 nova_compute[236126]: 2025-10-02 12:10:25.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:10:25Z|00109|binding|INFO|Releasing lport 92254f36-d722-4bfe-bed5-5ca6a669ba88 from this chassis (sb_readonly=0)
Oct  2 08:10:25 np0005465988 nova_compute[236126]: 2025-10-02 12:10:25.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:25 np0005465988 nova_compute[236126]: 2025-10-02 12:10:25.518 2 DEBUG nova.compute.manager [req-af53bd34-7334-4969-879a-c6c5e23d04b1 req-3d175b80-b06d-41f8-a127-94a81c15cb94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Received event network-changed-e761bc82-c642-42e2-bfe8-600863f22bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:25 np0005465988 nova_compute[236126]: 2025-10-02 12:10:25.519 2 DEBUG nova.compute.manager [req-af53bd34-7334-4969-879a-c6c5e23d04b1 req-3d175b80-b06d-41f8-a127-94a81c15cb94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Refreshing instance network info cache due to event network-changed-e761bc82-c642-42e2-bfe8-600863f22bf5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:10:25 np0005465988 nova_compute[236126]: 2025-10-02 12:10:25.520 2 DEBUG oslo_concurrency.lockutils [req-af53bd34-7334-4969-879a-c6c5e23d04b1 req-3d175b80-b06d-41f8-a127-94a81c15cb94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:10:25 np0005465988 nova_compute[236126]: 2025-10-02 12:10:25.520 2 DEBUG oslo_concurrency.lockutils [req-af53bd34-7334-4969-879a-c6c5e23d04b1 req-3d175b80-b06d-41f8-a127-94a81c15cb94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:10:25 np0005465988 nova_compute[236126]: 2025-10-02 12:10:25.521 2 DEBUG nova.network.neutron [req-af53bd34-7334-4969-879a-c6c5e23d04b1 req-3d175b80-b06d-41f8-a127-94a81c15cb94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Refreshing network info cache for port e761bc82-c642-42e2-bfe8-600863f22bf5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:10:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:26.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:26.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:27 np0005465988 nova_compute[236126]: 2025-10-02 12:10:27.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:27 np0005465988 nova_compute[236126]: 2025-10-02 12:10:27.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:10:27 np0005465988 nova_compute[236126]: 2025-10-02 12:10:27.315 2 DEBUG nova.network.neutron [req-af53bd34-7334-4969-879a-c6c5e23d04b1 req-3d175b80-b06d-41f8-a127-94a81c15cb94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Updated VIF entry in instance network info cache for port e761bc82-c642-42e2-bfe8-600863f22bf5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:10:27 np0005465988 nova_compute[236126]: 2025-10-02 12:10:27.316 2 DEBUG nova.network.neutron [req-af53bd34-7334-4969-879a-c6c5e23d04b1 req-3d175b80-b06d-41f8-a127-94a81c15cb94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Updating instance_info_cache with network_info: [{"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:27.337 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:27.338 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:27.339 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:27 np0005465988 nova_compute[236126]: 2025-10-02 12:10:27.348 2 DEBUG oslo_concurrency.lockutils [req-af53bd34-7334-4969-879a-c6c5e23d04b1 req-3d175b80-b06d-41f8-a127-94a81c15cb94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:10:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:28.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:28 np0005465988 nova_compute[236126]: 2025-10-02 12:10:28.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:28.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:30.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:30.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e171 e171: 3 total, 3 up, 3 in
Oct  2 08:10:32 np0005465988 nova_compute[236126]: 2025-10-02 12:10:32.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:32.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e171 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:10:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:32.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e172 e172: 3 total, 3 up, 3 in
Oct  2 08:10:33 np0005465988 nova_compute[236126]: 2025-10-02 12:10:33.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:33 np0005465988 ovn_controller[132601]: 2025-10-02T12:10:33Z|00110|binding|INFO|Releasing lport 92254f36-d722-4bfe-bed5-5ca6a669ba88 from this chassis (sb_readonly=0)
Oct  2 08:10:33 np0005465988 nova_compute[236126]: 2025-10-02 12:10:33.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:33 np0005465988 podman[256223]: 2025-10-02 12:10:33.523586306 +0000 UTC m=+0.052964480 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:10:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e173 e173: 3 total, 3 up, 3 in
Oct  2 08:10:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:34.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:34.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:35 np0005465988 nova_compute[236126]: 2025-10-02 12:10:35.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:35 np0005465988 nova_compute[236126]: 2025-10-02 12:10:35.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:36.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:10:36Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8c:b4:9d 10.100.0.13
Oct  2 08:10:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:10:36Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8c:b4:9d 10.100.0.13
Oct  2 08:10:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:36.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:36 np0005465988 nova_compute[236126]: 2025-10-02 12:10:36.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:36 np0005465988 nova_compute[236126]: 2025-10-02 12:10:36.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:36 np0005465988 nova_compute[236126]: 2025-10-02 12:10:36.592 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:36 np0005465988 nova_compute[236126]: 2025-10-02 12:10:36.593 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:36 np0005465988 nova_compute[236126]: 2025-10-02 12:10:36.593 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:36 np0005465988 nova_compute[236126]: 2025-10-02 12:10:36.593 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:10:36 np0005465988 nova_compute[236126]: 2025-10-02 12:10:36.594 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:10:37 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4088970629' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:10:37 np0005465988 nova_compute[236126]: 2025-10-02 12:10:37.092 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:37 np0005465988 nova_compute[236126]: 2025-10-02 12:10:37.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:10:37 np0005465988 nova_compute[236126]: 2025-10-02 12:10:37.378 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000002e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:10:37 np0005465988 nova_compute[236126]: 2025-10-02 12:10:37.379 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000002e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:10:37 np0005465988 nova_compute[236126]: 2025-10-02 12:10:37.583 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:10:37 np0005465988 nova_compute[236126]: 2025-10-02 12:10:37.584 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4527MB free_disk=20.921127319335938GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:10:37 np0005465988 nova_compute[236126]: 2025-10-02 12:10:37.584 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:37 np0005465988 nova_compute[236126]: 2025-10-02 12:10:37.584 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:37 np0005465988 nova_compute[236126]: 2025-10-02 12:10:37.959 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance d545e995-ceb2-43df-97c6-0549bc4d6da4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:10:37 np0005465988 nova_compute[236126]: 2025-10-02 12:10:37.959 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:10:37 np0005465988 nova_compute[236126]: 2025-10-02 12:10:37.959 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:10:38 np0005465988 nova_compute[236126]: 2025-10-02 12:10:37.999 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:38.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:38 np0005465988 nova_compute[236126]: 2025-10-02 12:10:38.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:10:38 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2592809986' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:10:38 np0005465988 nova_compute[236126]: 2025-10-02 12:10:38.436 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:38 np0005465988 nova_compute[236126]: 2025-10-02 12:10:38.445 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:10:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:38.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:38 np0005465988 nova_compute[236126]: 2025-10-02 12:10:38.584 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:10:39 np0005465988 nova_compute[236126]: 2025-10-02 12:10:39.007 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:10:39 np0005465988 nova_compute[236126]: 2025-10-02 12:10:39.008 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.424s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:40.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:40.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e174 e174: 3 total, 3 up, 3 in
Oct  2 08:10:42 np0005465988 nova_compute[236126]: 2025-10-02 12:10:42.009 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:42 np0005465988 nova_compute[236126]: 2025-10-02 12:10:42.010 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:42 np0005465988 nova_compute[236126]: 2025-10-02 12:10:42.010 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:42 np0005465988 nova_compute[236126]: 2025-10-02 12:10:42.011 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:10:42 np0005465988 nova_compute[236126]: 2025-10-02 12:10:42.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:42.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:10:42 np0005465988 nova_compute[236126]: 2025-10-02 12:10:42.470 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:42.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:42 np0005465988 ovn_controller[132601]: 2025-10-02T12:10:42Z|00111|binding|INFO|Releasing lport 92254f36-d722-4bfe-bed5-5ca6a669ba88 from this chassis (sb_readonly=0)
Oct  2 08:10:42 np0005465988 nova_compute[236126]: 2025-10-02 12:10:42.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e175 e175: 3 total, 3 up, 3 in
Oct  2 08:10:43 np0005465988 nova_compute[236126]: 2025-10-02 12:10:43.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:44.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e176 e176: 3 total, 3 up, 3 in
Oct  2 08:10:44 np0005465988 nova_compute[236126]: 2025-10-02 12:10:44.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:44 np0005465988 nova_compute[236126]: 2025-10-02 12:10:44.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:10:44 np0005465988 nova_compute[236126]: 2025-10-02 12:10:44.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:10:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:44.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:44 np0005465988 nova_compute[236126]: 2025-10-02 12:10:44.720 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:10:44 np0005465988 nova_compute[236126]: 2025-10-02 12:10:44.721 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:10:44 np0005465988 nova_compute[236126]: 2025-10-02 12:10:44.721 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:10:44 np0005465988 nova_compute[236126]: 2025-10-02 12:10:44.721 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d545e995-ceb2-43df-97c6-0549bc4d6da4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:10:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:46.034 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:10:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:46.035 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:10:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:10:46.037 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:46 np0005465988 nova_compute[236126]: 2025-10-02 12:10:46.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:46.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:46.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:46 np0005465988 nova_compute[236126]: 2025-10-02 12:10:46.774 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Updating instance_info_cache with network_info: [{"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:46 np0005465988 nova_compute[236126]: 2025-10-02 12:10:46.795 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:10:46 np0005465988 nova_compute[236126]: 2025-10-02 12:10:46.796 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:10:47 np0005465988 nova_compute[236126]: 2025-10-02 12:10:47.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:10:47 np0005465988 podman[256347]: 2025-10-02 12:10:47.535091477 +0000 UTC m=+0.063678921 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3)
Oct  2 08:10:47 np0005465988 podman[256348]: 2025-10-02 12:10:47.550150045 +0000 UTC m=+0.072883419 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:10:47 np0005465988 podman[256346]: 2025-10-02 12:10:47.585395599 +0000 UTC m=+0.117102233 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:10:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e177 e177: 3 total, 3 up, 3 in
Oct  2 08:10:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:48.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:48 np0005465988 nova_compute[236126]: 2025-10-02 12:10:48.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:48.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e178 e178: 3 total, 3 up, 3 in
Oct  2 08:10:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:50.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:50.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:52 np0005465988 nova_compute[236126]: 2025-10-02 12:10:52.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:52.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:10:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:52.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:53 np0005465988 nova_compute[236126]: 2025-10-02 12:10:53.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:53 np0005465988 nova_compute[236126]: 2025-10-02 12:10:53.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e179 e179: 3 total, 3 up, 3 in
Oct  2 08:10:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:54.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:54.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e180 e180: 3 total, 3 up, 3 in
Oct  2 08:10:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:10:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1703261303' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:10:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:10:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1703261303' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:10:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:56.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:10:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:56.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:10:56 np0005465988 nova_compute[236126]: 2025-10-02 12:10:56.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:57 np0005465988 nova_compute[236126]: 2025-10-02 12:10:57.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:10:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e181 e181: 3 total, 3 up, 3 in
Oct  2 08:10:58 np0005465988 nova_compute[236126]: 2025-10-02 12:10:58.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:10:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:10:58.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:10:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e182 e182: 3 total, 3 up, 3 in
Oct  2 08:10:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:10:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:10:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:10:58.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:10:58 np0005465988 nova_compute[236126]: 2025-10-02 12:10:58.892 2 DEBUG nova.objects.instance [None req-2ed58743-e27b-4e6e-90db-6c880599fbf3 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lazy-loading 'flavor' on Instance uuid d545e995-ceb2-43df-97c6-0549bc4d6da4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:10:58 np0005465988 nova_compute[236126]: 2025-10-02 12:10:58.915 2 DEBUG oslo_concurrency.lockutils [None req-2ed58743-e27b-4e6e-90db-6c880599fbf3 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Acquiring lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:10:58 np0005465988 nova_compute[236126]: 2025-10-02 12:10:58.916 2 DEBUG oslo_concurrency.lockutils [None req-2ed58743-e27b-4e6e-90db-6c880599fbf3 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Acquired lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:00.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:00 np0005465988 nova_compute[236126]: 2025-10-02 12:11:00.389 2 DEBUG nova.network.neutron [None req-2ed58743-e27b-4e6e-90db-6c880599fbf3 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:11:00 np0005465988 nova_compute[236126]: 2025-10-02 12:11:00.493 2 DEBUG nova.compute.manager [req-3a78a19d-80c6-49db-bf9c-ac634c142429 req-4b5b358f-2e1d-4981-ba4c-d8a559466e7f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Received event network-changed-e761bc82-c642-42e2-bfe8-600863f22bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:00 np0005465988 nova_compute[236126]: 2025-10-02 12:11:00.494 2 DEBUG nova.compute.manager [req-3a78a19d-80c6-49db-bf9c-ac634c142429 req-4b5b358f-2e1d-4981-ba4c-d8a559466e7f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Refreshing instance network info cache due to event network-changed-e761bc82-c642-42e2-bfe8-600863f22bf5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:00 np0005465988 nova_compute[236126]: 2025-10-02 12:11:00.494 2 DEBUG oslo_concurrency.lockutils [req-3a78a19d-80c6-49db-bf9c-ac634c142429 req-4b5b358f-2e1d-4981-ba4c-d8a559466e7f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:00.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:00Z|00112|binding|INFO|Releasing lport 92254f36-d722-4bfe-bed5-5ca6a669ba88 from this chassis (sb_readonly=0)
Oct  2 08:11:00 np0005465988 nova_compute[236126]: 2025-10-02 12:11:00.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:02 np0005465988 nova_compute[236126]: 2025-10-02 12:11:02.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:02.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e182 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:11:02 np0005465988 nova_compute[236126]: 2025-10-02 12:11:02.297 2 DEBUG nova.network.neutron [None req-2ed58743-e27b-4e6e-90db-6c880599fbf3 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Updating instance_info_cache with network_info: [{"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:02 np0005465988 nova_compute[236126]: 2025-10-02 12:11:02.334 2 DEBUG oslo_concurrency.lockutils [None req-2ed58743-e27b-4e6e-90db-6c880599fbf3 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Releasing lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:02 np0005465988 nova_compute[236126]: 2025-10-02 12:11:02.335 2 DEBUG nova.compute.manager [None req-2ed58743-e27b-4e6e-90db-6c880599fbf3 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Oct  2 08:11:02 np0005465988 nova_compute[236126]: 2025-10-02 12:11:02.335 2 DEBUG nova.compute.manager [None req-2ed58743-e27b-4e6e-90db-6c880599fbf3 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] network_info to inject: |[{"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Oct  2 08:11:02 np0005465988 nova_compute[236126]: 2025-10-02 12:11:02.340 2 DEBUG oslo_concurrency.lockutils [req-3a78a19d-80c6-49db-bf9c-ac634c142429 req-4b5b358f-2e1d-4981-ba4c-d8a559466e7f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:02 np0005465988 nova_compute[236126]: 2025-10-02 12:11:02.341 2 DEBUG nova.network.neutron [req-3a78a19d-80c6-49db-bf9c-ac634c142429 req-4b5b358f-2e1d-4981-ba4c-d8a559466e7f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Refreshing network info cache for port e761bc82-c642-42e2-bfe8-600863f22bf5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:02.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:03 np0005465988 nova_compute[236126]: 2025-10-02 12:11:03.132 2 DEBUG nova.objects.instance [None req-c703c9f9-628e-44a7-afd4-701aa81d1bf5 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lazy-loading 'flavor' on Instance uuid d545e995-ceb2-43df-97c6-0549bc4d6da4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:03 np0005465988 nova_compute[236126]: 2025-10-02 12:11:03.166 2 DEBUG oslo_concurrency.lockutils [None req-c703c9f9-628e-44a7-afd4-701aa81d1bf5 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Acquiring lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:03 np0005465988 nova_compute[236126]: 2025-10-02 12:11:03.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:03 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:03Z|00113|binding|INFO|Releasing lport 92254f36-d722-4bfe-bed5-5ca6a669ba88 from this chassis (sb_readonly=0)
Oct  2 08:11:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e183 e183: 3 total, 3 up, 3 in
Oct  2 08:11:03 np0005465988 nova_compute[236126]: 2025-10-02 12:11:03.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:04.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:04.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:04 np0005465988 podman[256473]: 2025-10-02 12:11:04.549666494 +0000 UTC m=+0.077337588 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:11:04 np0005465988 nova_compute[236126]: 2025-10-02 12:11:04.591 2 DEBUG nova.network.neutron [req-3a78a19d-80c6-49db-bf9c-ac634c142429 req-4b5b358f-2e1d-4981-ba4c-d8a559466e7f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Updated VIF entry in instance network info cache for port e761bc82-c642-42e2-bfe8-600863f22bf5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:04 np0005465988 nova_compute[236126]: 2025-10-02 12:11:04.592 2 DEBUG nova.network.neutron [req-3a78a19d-80c6-49db-bf9c-ac634c142429 req-4b5b358f-2e1d-4981-ba4c-d8a559466e7f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Updating instance_info_cache with network_info: [{"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:04 np0005465988 nova_compute[236126]: 2025-10-02 12:11:04.609 2 DEBUG oslo_concurrency.lockutils [req-3a78a19d-80c6-49db-bf9c-ac634c142429 req-4b5b358f-2e1d-4981-ba4c-d8a559466e7f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:04 np0005465988 nova_compute[236126]: 2025-10-02 12:11:04.611 2 DEBUG oslo_concurrency.lockutils [None req-c703c9f9-628e-44a7-afd4-701aa81d1bf5 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Acquired lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:05 np0005465988 nova_compute[236126]: 2025-10-02 12:11:05.886 2 DEBUG nova.network.neutron [None req-c703c9f9-628e-44a7-afd4-701aa81d1bf5 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:11:06 np0005465988 nova_compute[236126]: 2025-10-02 12:11:06.068 2 DEBUG nova.compute.manager [req-023095e5-b7ef-4b87-b918-c7635f8bd97b req-2bd6fe0b-078c-4ec7-950a-7d404ddcf80c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Received event network-changed-e761bc82-c642-42e2-bfe8-600863f22bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:06 np0005465988 nova_compute[236126]: 2025-10-02 12:11:06.069 2 DEBUG nova.compute.manager [req-023095e5-b7ef-4b87-b918-c7635f8bd97b req-2bd6fe0b-078c-4ec7-950a-7d404ddcf80c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Refreshing instance network info cache due to event network-changed-e761bc82-c642-42e2-bfe8-600863f22bf5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:06 np0005465988 nova_compute[236126]: 2025-10-02 12:11:06.070 2 DEBUG oslo_concurrency.lockutils [req-023095e5-b7ef-4b87-b918-c7635f8bd97b req-2bd6fe0b-078c-4ec7-950a-7d404ddcf80c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:11:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:06.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:11:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:06.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:07 np0005465988 nova_compute[236126]: 2025-10-02 12:11:07.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e183 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:11:07 np0005465988 nova_compute[236126]: 2025-10-02 12:11:07.346 2 DEBUG nova.network.neutron [None req-c703c9f9-628e-44a7-afd4-701aa81d1bf5 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Updating instance_info_cache with network_info: [{"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:07 np0005465988 nova_compute[236126]: 2025-10-02 12:11:07.379 2 DEBUG oslo_concurrency.lockutils [None req-c703c9f9-628e-44a7-afd4-701aa81d1bf5 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Releasing lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:07 np0005465988 nova_compute[236126]: 2025-10-02 12:11:07.379 2 DEBUG nova.compute.manager [None req-c703c9f9-628e-44a7-afd4-701aa81d1bf5 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Oct  2 08:11:07 np0005465988 nova_compute[236126]: 2025-10-02 12:11:07.380 2 DEBUG nova.compute.manager [None req-c703c9f9-628e-44a7-afd4-701aa81d1bf5 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] network_info to inject: |[{"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Oct  2 08:11:07 np0005465988 nova_compute[236126]: 2025-10-02 12:11:07.382 2 DEBUG oslo_concurrency.lockutils [req-023095e5-b7ef-4b87-b918-c7635f8bd97b req-2bd6fe0b-078c-4ec7-950a-7d404ddcf80c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:07 np0005465988 nova_compute[236126]: 2025-10-02 12:11:07.383 2 DEBUG nova.network.neutron [req-023095e5-b7ef-4b87-b918-c7635f8bd97b req-2bd6fe0b-078c-4ec7-950a-7d404ddcf80c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Refreshing network info cache for port e761bc82-c642-42e2-bfe8-600863f22bf5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.155 2 DEBUG oslo_concurrency.lockutils [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Acquiring lock "d545e995-ceb2-43df-97c6-0549bc4d6da4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.156 2 DEBUG oslo_concurrency.lockutils [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.156 2 DEBUG oslo_concurrency.lockutils [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Acquiring lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.156 2 DEBUG oslo_concurrency.lockutils [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.157 2 DEBUG oslo_concurrency.lockutils [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.158 2 INFO nova.compute.manager [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Terminating instance#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.160 2 DEBUG nova.compute.manager [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:11:08 np0005465988 kernel: tape761bc82-c6 (unregistering): left promiscuous mode
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005465988 NetworkManager[45041]: <info>  [1759407068.2143] device (tape761bc82-c6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:08.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:08 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:08Z|00114|binding|INFO|Releasing lport e761bc82-c642-42e2-bfe8-600863f22bf5 from this chassis (sb_readonly=0)
Oct  2 08:11:08 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:08Z|00115|binding|INFO|Setting lport e761bc82-c642-42e2-bfe8-600863f22bf5 down in Southbound
Oct  2 08:11:08 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:08Z|00116|binding|INFO|Removing iface tape761bc82-c6 ovn-installed in OVS
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:08.238 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:b4:9d 10.100.0.13'], port_security=['fa:16:3e:8c:b4:9d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'd545e995-ceb2-43df-97c6-0549bc4d6da4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9e743c722ec0433e854167192b6dd567', 'neutron:revision_number': '6', 'neutron:security_group_ids': '738265f8-af79-4ed9-adba-c93b17bd0264', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d18bbef5-47b9-48d4-bc8e-574c94fc05bc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=e761bc82-c642-42e2-bfe8-600863f22bf5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:08.240 142124 INFO neutron.agent.ovn.metadata.agent [-] Port e761bc82-c642-42e2-bfe8-600863f22bf5 in datapath 90bea0c5-f8b7-47cb-bc1d-0929807ff5f6 unbound from our chassis#033[00m
Oct  2 08:11:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:08.242 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 90bea0c5-f8b7-47cb-bc1d-0929807ff5f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:11:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:08.244 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f4653ec5-16d6-42d8-8a01-fa549381919f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:08.244 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6 namespace which is not needed anymore#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005465988 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Oct  2 08:11:08 np0005465988 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000002e.scope: Consumed 16.681s CPU time.
Oct  2 08:11:08 np0005465988 systemd-machined[192594]: Machine qemu-18-instance-0000002e terminated.
Oct  2 08:11:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e184 e184: 3 total, 3 up, 3 in
Oct  2 08:11:08 np0005465988 neutron-haproxy-ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6[256158]: [NOTICE]   (256162) : haproxy version is 2.8.14-c23fe91
Oct  2 08:11:08 np0005465988 neutron-haproxy-ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6[256158]: [NOTICE]   (256162) : path to executable is /usr/sbin/haproxy
Oct  2 08:11:08 np0005465988 neutron-haproxy-ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6[256158]: [ALERT]    (256162) : Current worker (256164) exited with code 143 (Terminated)
Oct  2 08:11:08 np0005465988 neutron-haproxy-ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6[256158]: [WARNING]  (256162) : All workers exited. Exiting... (0)
Oct  2 08:11:08 np0005465988 systemd[1]: libpod-aa554940599364b97ee93edf168782823e1645cd4d0feaf49c055972543be58d.scope: Deactivated successfully.
Oct  2 08:11:08 np0005465988 podman[256520]: 2025-10-02 12:11:08.40360768 +0000 UTC m=+0.058856041 container died aa554940599364b97ee93edf168782823e1645cd4d0feaf49c055972543be58d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.401 2 INFO nova.virt.libvirt.driver [-] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Instance destroyed successfully.#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.402 2 DEBUG nova.objects.instance [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lazy-loading 'resources' on Instance uuid d545e995-ceb2-43df-97c6-0549bc4d6da4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.427 2 DEBUG nova.virt.libvirt.vif [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:10:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-1984858235',display_name='tempest-AttachInterfacesUnderV243Test-server-1984858235',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-1984858235',id=46,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDE7cODM2NyZ3/mJcptetYP+dNehHeWqNZQsR1fFUE+tNKNOaidrTEHHBwYr+NYaYQQooVYpMlnWV5rx1dZMB4MfIwMiwQz0qls4NMFJSJFy/9UQu+BHQq1qBWeVGW3ilw==',key_name='tempest-keypair-705473978',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:10:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9e743c722ec0433e854167192b6dd567',ramdisk_id='',reservation_id='r-rbxslj03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesUnderV243Test-592969324',owner_user_name='tempest-AttachInterfacesUnderV243Test-592969324-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:11:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='981a135a24e64d0aa07512e23330974a',uuid=d545e995-ceb2-43df-97c6-0549bc4d6da4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.428 2 DEBUG nova.network.os_vif_util [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Converting VIF {"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.430 2 DEBUG nova.network.os_vif_util [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8c:b4:9d,bridge_name='br-int',has_traffic_filtering=True,id=e761bc82-c642-42e2-bfe8-600863f22bf5,network=Network(90bea0c5-f8b7-47cb-bc1d-0929807ff5f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape761bc82-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.432 2 DEBUG os_vif [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:b4:9d,bridge_name='br-int',has_traffic_filtering=True,id=e761bc82-c642-42e2-bfe8-600863f22bf5,network=Network(90bea0c5-f8b7-47cb-bc1d-0929807ff5f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape761bc82-c6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.436 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape761bc82-c6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa554940599364b97ee93edf168782823e1645cd4d0feaf49c055972543be58d-userdata-shm.mount: Deactivated successfully.
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005465988 systemd[1]: var-lib-containers-storage-overlay-d8cebff645f247ec9e015cfc5185c9ca67b28ca2e19229f455efdb3323168155-merged.mount: Deactivated successfully.
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.448 2 INFO os_vif [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:b4:9d,bridge_name='br-int',has_traffic_filtering=True,id=e761bc82-c642-42e2-bfe8-600863f22bf5,network=Network(90bea0c5-f8b7-47cb-bc1d-0929807ff5f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape761bc82-c6')#033[00m
Oct  2 08:11:08 np0005465988 podman[256520]: 2025-10-02 12:11:08.452871041 +0000 UTC m=+0.108119402 container cleanup aa554940599364b97ee93edf168782823e1645cd4d0feaf49c055972543be58d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:11:08 np0005465988 systemd[1]: libpod-conmon-aa554940599364b97ee93edf168782823e1645cd4d0feaf49c055972543be58d.scope: Deactivated successfully.
Oct  2 08:11:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:08.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:08 np0005465988 podman[256570]: 2025-10-02 12:11:08.513006548 +0000 UTC m=+0.039202940 container remove aa554940599364b97ee93edf168782823e1645cd4d0feaf49c055972543be58d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:11:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:08.522 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5b68cd89-a76d-46c5-b0f1-1e8de85dc365]: (4, ('Thu Oct  2 12:11:08 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6 (aa554940599364b97ee93edf168782823e1645cd4d0feaf49c055972543be58d)\naa554940599364b97ee93edf168782823e1645cd4d0feaf49c055972543be58d\nThu Oct  2 12:11:08 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6 (aa554940599364b97ee93edf168782823e1645cd4d0feaf49c055972543be58d)\naa554940599364b97ee93edf168782823e1645cd4d0feaf49c055972543be58d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:08.524 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0570281c-d931-4a12-93c2-7c501c34b4ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:08.525 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90bea0c5-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:08 np0005465988 kernel: tap90bea0c5-f0: left promiscuous mode
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:08.542 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3e5dfa88-0742-4f7e-aea8-1e046f33c55c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:08.571 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb079f7-7ed1-45eb-a384-63cc8ca52c8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:08.573 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a20f2ed8-26e2-4cbe-b331-1c5a8f876cd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:08.593 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fe4bb136-2717-494b-9892-21efd1b912dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 507979, 'reachable_time': 23304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256587, 'error': None, 'target': 'ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.594 2 DEBUG nova.compute.manager [req-09778ea7-396e-451f-9a1d-d5b599016c7d req-824e5c85-4173-489c-8d8d-79b20fa596e6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Received event network-vif-unplugged-e761bc82-c642-42e2-bfe8-600863f22bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.595 2 DEBUG oslo_concurrency.lockutils [req-09778ea7-396e-451f-9a1d-d5b599016c7d req-824e5c85-4173-489c-8d8d-79b20fa596e6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.595 2 DEBUG oslo_concurrency.lockutils [req-09778ea7-396e-451f-9a1d-d5b599016c7d req-824e5c85-4173-489c-8d8d-79b20fa596e6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.596 2 DEBUG oslo_concurrency.lockutils [req-09778ea7-396e-451f-9a1d-d5b599016c7d req-824e5c85-4173-489c-8d8d-79b20fa596e6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:08 np0005465988 systemd[1]: run-netns-ovnmeta\x2d90bea0c5\x2df8b7\x2d47cb\x2dbc1d\x2d0929807ff5f6.mount: Deactivated successfully.
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.596 2 DEBUG nova.compute.manager [req-09778ea7-396e-451f-9a1d-d5b599016c7d req-824e5c85-4173-489c-8d8d-79b20fa596e6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] No waiting events found dispatching network-vif-unplugged-e761bc82-c642-42e2-bfe8-600863f22bf5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.600 2 DEBUG nova.compute.manager [req-09778ea7-396e-451f-9a1d-d5b599016c7d req-824e5c85-4173-489c-8d8d-79b20fa596e6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Received event network-vif-unplugged-e761bc82-c642-42e2-bfe8-600863f22bf5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:11:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:08.601 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-90bea0c5-f8b7-47cb-bc1d-0929807ff5f6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:11:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:08.601 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[3d13a1e9-d0f0-4a77-a8a6-cdf76bdde287]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:08 np0005465988 nova_compute[236126]: 2025-10-02 12:11:08.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:09 np0005465988 nova_compute[236126]: 2025-10-02 12:11:09.025 2 DEBUG nova.network.neutron [req-023095e5-b7ef-4b87-b918-c7635f8bd97b req-2bd6fe0b-078c-4ec7-950a-7d404ddcf80c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Updated VIF entry in instance network info cache for port e761bc82-c642-42e2-bfe8-600863f22bf5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:09 np0005465988 nova_compute[236126]: 2025-10-02 12:11:09.025 2 DEBUG nova.network.neutron [req-023095e5-b7ef-4b87-b918-c7635f8bd97b req-2bd6fe0b-078c-4ec7-950a-7d404ddcf80c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Updating instance_info_cache with network_info: [{"id": "e761bc82-c642-42e2-bfe8-600863f22bf5", "address": "fa:16:3e:8c:b4:9d", "network": {"id": "90bea0c5-f8b7-47cb-bc1d-0929807ff5f6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2057840858-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e743c722ec0433e854167192b6dd567", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape761bc82-c6", "ovs_interfaceid": "e761bc82-c642-42e2-bfe8-600863f22bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:09 np0005465988 nova_compute[236126]: 2025-10-02 12:11:09.043 2 DEBUG oslo_concurrency.lockutils [req-023095e5-b7ef-4b87-b918-c7635f8bd97b req-2bd6fe0b-078c-4ec7-950a-7d404ddcf80c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-d545e995-ceb2-43df-97c6-0549bc4d6da4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:10 np0005465988 nova_compute[236126]: 2025-10-02 12:11:10.048 2 INFO nova.virt.libvirt.driver [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Deleting instance files /var/lib/nova/instances/d545e995-ceb2-43df-97c6-0549bc4d6da4_del#033[00m
Oct  2 08:11:10 np0005465988 nova_compute[236126]: 2025-10-02 12:11:10.050 2 INFO nova.virt.libvirt.driver [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Deletion of /var/lib/nova/instances/d545e995-ceb2-43df-97c6-0549bc4d6da4_del complete#033[00m
Oct  2 08:11:10 np0005465988 nova_compute[236126]: 2025-10-02 12:11:10.114 2 INFO nova.compute.manager [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Took 1.95 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:11:10 np0005465988 nova_compute[236126]: 2025-10-02 12:11:10.115 2 DEBUG oslo.service.loopingcall [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:11:10 np0005465988 nova_compute[236126]: 2025-10-02 12:11:10.116 2 DEBUG nova.compute.manager [-] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:11:10 np0005465988 nova_compute[236126]: 2025-10-02 12:11:10.116 2 DEBUG nova.network.neutron [-] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:11:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:10.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:10.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:10 np0005465988 nova_compute[236126]: 2025-10-02 12:11:10.666 2 DEBUG nova.compute.manager [req-ff527bd4-8803-4c9d-bb9b-3adfe9c37b3c req-13b88f51-0d45-42ca-bb32-05cfdfb3329c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Received event network-vif-plugged-e761bc82-c642-42e2-bfe8-600863f22bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:10 np0005465988 nova_compute[236126]: 2025-10-02 12:11:10.668 2 DEBUG oslo_concurrency.lockutils [req-ff527bd4-8803-4c9d-bb9b-3adfe9c37b3c req-13b88f51-0d45-42ca-bb32-05cfdfb3329c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:10 np0005465988 nova_compute[236126]: 2025-10-02 12:11:10.669 2 DEBUG oslo_concurrency.lockutils [req-ff527bd4-8803-4c9d-bb9b-3adfe9c37b3c req-13b88f51-0d45-42ca-bb32-05cfdfb3329c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:10 np0005465988 nova_compute[236126]: 2025-10-02 12:11:10.669 2 DEBUG oslo_concurrency.lockutils [req-ff527bd4-8803-4c9d-bb9b-3adfe9c37b3c req-13b88f51-0d45-42ca-bb32-05cfdfb3329c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:10 np0005465988 nova_compute[236126]: 2025-10-02 12:11:10.670 2 DEBUG nova.compute.manager [req-ff527bd4-8803-4c9d-bb9b-3adfe9c37b3c req-13b88f51-0d45-42ca-bb32-05cfdfb3329c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] No waiting events found dispatching network-vif-plugged-e761bc82-c642-42e2-bfe8-600863f22bf5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:10 np0005465988 nova_compute[236126]: 2025-10-02 12:11:10.670 2 WARNING nova.compute.manager [req-ff527bd4-8803-4c9d-bb9b-3adfe9c37b3c req-13b88f51-0d45-42ca-bb32-05cfdfb3329c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Received unexpected event network-vif-plugged-e761bc82-c642-42e2-bfe8-600863f22bf5 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:11:10 np0005465988 nova_compute[236126]: 2025-10-02 12:11:10.959 2 DEBUG nova.network.neutron [-] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:10 np0005465988 nova_compute[236126]: 2025-10-02 12:11:10.985 2 INFO nova.compute.manager [-] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Took 0.87 seconds to deallocate network for instance.#033[00m
Oct  2 08:11:11 np0005465988 nova_compute[236126]: 2025-10-02 12:11:11.061 2 DEBUG oslo_concurrency.lockutils [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:11 np0005465988 nova_compute[236126]: 2025-10-02 12:11:11.062 2 DEBUG oslo_concurrency.lockutils [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:11 np0005465988 nova_compute[236126]: 2025-10-02 12:11:11.138 2 DEBUG oslo_concurrency.processutils [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:11:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:11:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:11:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:11:11 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4274527272' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:11:11 np0005465988 nova_compute[236126]: 2025-10-02 12:11:11.613 2 DEBUG oslo_concurrency.processutils [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:11 np0005465988 nova_compute[236126]: 2025-10-02 12:11:11.621 2 DEBUG nova.compute.provider_tree [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:11:11 np0005465988 nova_compute[236126]: 2025-10-02 12:11:11.659 2 DEBUG nova.scheduler.client.report [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:11:11 np0005465988 nova_compute[236126]: 2025-10-02 12:11:11.690 2 DEBUG oslo_concurrency.lockutils [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:11 np0005465988 nova_compute[236126]: 2025-10-02 12:11:11.716 2 INFO nova.scheduler.client.report [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Deleted allocations for instance d545e995-ceb2-43df-97c6-0549bc4d6da4#033[00m
Oct  2 08:11:11 np0005465988 nova_compute[236126]: 2025-10-02 12:11:11.821 2 DEBUG oslo_concurrency.lockutils [None req-8ab92b44-d58b-4ded-a590-33c5be0e4057 981a135a24e64d0aa07512e23330974a 9e743c722ec0433e854167192b6dd567 - - default default] Lock "d545e995-ceb2-43df-97c6-0549bc4d6da4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:11 np0005465988 nova_compute[236126]: 2025-10-02 12:11:11.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:12 np0005465988 nova_compute[236126]: 2025-10-02 12:11:12.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:12.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:11:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:12.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:12 np0005465988 nova_compute[236126]: 2025-10-02 12:11:12.731 2 DEBUG nova.compute.manager [req-613c2944-5f5b-4d55-97d7-66af817b650e req-22df35b7-bdc7-4ddf-abe0-d98c37c8adc7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Received event network-vif-deleted-e761bc82-c642-42e2-bfe8-600863f22bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:13 np0005465988 nova_compute[236126]: 2025-10-02 12:11:13.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:13 np0005465988 nova_compute[236126]: 2025-10-02 12:11:13.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:14.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:14 np0005465988 nova_compute[236126]: 2025-10-02 12:11:14.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:14.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:16.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:16.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:16 np0005465988 nova_compute[236126]: 2025-10-02 12:11:16.581 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Acquiring lock "598e1759-7095-483d-bfc5-34c50a97b4f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:16 np0005465988 nova_compute[236126]: 2025-10-02 12:11:16.582 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:16 np0005465988 nova_compute[236126]: 2025-10-02 12:11:16.599 2 DEBUG nova.compute.manager [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:11:16 np0005465988 nova_compute[236126]: 2025-10-02 12:11:16.681 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:16 np0005465988 nova_compute[236126]: 2025-10-02 12:11:16.682 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:16 np0005465988 nova_compute[236126]: 2025-10-02 12:11:16.691 2 DEBUG nova.virt.hardware [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:11:16 np0005465988 nova_compute[236126]: 2025-10-02 12:11:16.692 2 INFO nova.compute.claims [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:11:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:11:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:11:16 np0005465988 nova_compute[236126]: 2025-10-02 12:11:16.791 2 DEBUG oslo_concurrency.processutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:11:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2489225117' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.225 2 DEBUG oslo_concurrency.processutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.231 2 DEBUG nova.compute.provider_tree [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:11:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.253 2 DEBUG nova.scheduler.client.report [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.283 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.284 2 DEBUG nova.compute.manager [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.341 2 DEBUG nova.compute.manager [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.341 2 DEBUG nova.network.neutron [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.365 2 INFO nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.386 2 DEBUG nova.compute.manager [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.468 2 DEBUG nova.compute.manager [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.470 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.471 2 INFO nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Creating image(s)#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.513 2 DEBUG nova.storage.rbd_utils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] rbd image 598e1759-7095-483d-bfc5-34c50a97b4f8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.560 2 DEBUG nova.storage.rbd_utils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] rbd image 598e1759-7095-483d-bfc5-34c50a97b4f8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.597 2 DEBUG nova.storage.rbd_utils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] rbd image 598e1759-7095-483d-bfc5-34c50a97b4f8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.602 2 DEBUG oslo_concurrency.processutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.695 2 DEBUG oslo_concurrency.processutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.698 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.699 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.700 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.741 2 DEBUG nova.storage.rbd_utils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] rbd image 598e1759-7095-483d-bfc5-34c50a97b4f8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.747 2 DEBUG oslo_concurrency.processutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 598e1759-7095-483d-bfc5-34c50a97b4f8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.862 2 DEBUG nova.policy [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c2999fde0aa643028301c0aef2c02f66', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7c21a3b8b2ec4521af33df81f746e3ff', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:11:17 np0005465988 nova_compute[236126]: 2025-10-02 12:11:17.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:18 np0005465988 nova_compute[236126]: 2025-10-02 12:11:18.096 2 DEBUG oslo_concurrency.processutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 598e1759-7095-483d-bfc5-34c50a97b4f8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.349s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:18 np0005465988 nova_compute[236126]: 2025-10-02 12:11:18.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:18 np0005465988 nova_compute[236126]: 2025-10-02 12:11:18.207 2 DEBUG nova.storage.rbd_utils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] resizing rbd image 598e1759-7095-483d-bfc5-34c50a97b4f8_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:11:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:18.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:18 np0005465988 nova_compute[236126]: 2025-10-02 12:11:18.317 2 DEBUG nova.objects.instance [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lazy-loading 'migration_context' on Instance uuid 598e1759-7095-483d-bfc5-34c50a97b4f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:18 np0005465988 nova_compute[236126]: 2025-10-02 12:11:18.333 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:11:18 np0005465988 nova_compute[236126]: 2025-10-02 12:11:18.333 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Ensure instance console log exists: /var/lib/nova/instances/598e1759-7095-483d-bfc5-34c50a97b4f8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:11:18 np0005465988 nova_compute[236126]: 2025-10-02 12:11:18.333 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:18 np0005465988 nova_compute[236126]: 2025-10-02 12:11:18.334 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:18 np0005465988 nova_compute[236126]: 2025-10-02 12:11:18.334 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:18 np0005465988 nova_compute[236126]: 2025-10-02 12:11:18.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:18.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:18 np0005465988 podman[257037]: 2025-10-02 12:11:18.566679158 +0000 UTC m=+0.088419580 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct  2 08:11:18 np0005465988 podman[257036]: 2025-10-02 12:11:18.578209343 +0000 UTC m=+0.105360522 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:11:18 np0005465988 podman[257035]: 2025-10-02 12:11:18.623287942 +0000 UTC m=+0.148842195 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 08:11:19 np0005465988 nova_compute[236126]: 2025-10-02 12:11:19.211 2 DEBUG nova.network.neutron [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Successfully created port: f8bab6cf-7dfd-4fa3-8984-36374cf87924 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:11:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:20.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:20 np0005465988 nova_compute[236126]: 2025-10-02 12:11:20.424 2 DEBUG nova.network.neutron [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Successfully updated port: f8bab6cf-7dfd-4fa3-8984-36374cf87924 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:11:20 np0005465988 nova_compute[236126]: 2025-10-02 12:11:20.444 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Acquiring lock "refresh_cache-598e1759-7095-483d-bfc5-34c50a97b4f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:20 np0005465988 nova_compute[236126]: 2025-10-02 12:11:20.444 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Acquired lock "refresh_cache-598e1759-7095-483d-bfc5-34c50a97b4f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:20 np0005465988 nova_compute[236126]: 2025-10-02 12:11:20.445 2 DEBUG nova.network.neutron [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:11:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:11:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:20.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:11:20 np0005465988 nova_compute[236126]: 2025-10-02 12:11:20.552 2 DEBUG nova.compute.manager [req-dbb94495-fc3c-46b6-9783-57d2deda63e1 req-89d18b14-970f-49bd-adb9-acd10f39240b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Received event network-changed-f8bab6cf-7dfd-4fa3-8984-36374cf87924 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:20 np0005465988 nova_compute[236126]: 2025-10-02 12:11:20.553 2 DEBUG nova.compute.manager [req-dbb94495-fc3c-46b6-9783-57d2deda63e1 req-89d18b14-970f-49bd-adb9-acd10f39240b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Refreshing instance network info cache due to event network-changed-f8bab6cf-7dfd-4fa3-8984-36374cf87924. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:20 np0005465988 nova_compute[236126]: 2025-10-02 12:11:20.553 2 DEBUG oslo_concurrency.lockutils [req-dbb94495-fc3c-46b6-9783-57d2deda63e1 req-89d18b14-970f-49bd-adb9-acd10f39240b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-598e1759-7095-483d-bfc5-34c50a97b4f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:20 np0005465988 nova_compute[236126]: 2025-10-02 12:11:20.628 2 DEBUG nova.network.neutron [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.207 2 DEBUG nova.network.neutron [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Updating instance_info_cache with network_info: [{"id": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "address": "fa:16:3e:bc:d8:52", "network": {"id": "67016734-de32-452f-b3cd-af3113cea332", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1036479945-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c21a3b8b2ec4521af33df81f746e3ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8bab6cf-7d", "ovs_interfaceid": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.246 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Releasing lock "refresh_cache-598e1759-7095-483d-bfc5-34c50a97b4f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.246 2 DEBUG nova.compute.manager [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Instance network_info: |[{"id": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "address": "fa:16:3e:bc:d8:52", "network": {"id": "67016734-de32-452f-b3cd-af3113cea332", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1036479945-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c21a3b8b2ec4521af33df81f746e3ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8bab6cf-7d", "ovs_interfaceid": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.247 2 DEBUG oslo_concurrency.lockutils [req-dbb94495-fc3c-46b6-9783-57d2deda63e1 req-89d18b14-970f-49bd-adb9-acd10f39240b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-598e1759-7095-483d-bfc5-34c50a97b4f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.247 2 DEBUG nova.network.neutron [req-dbb94495-fc3c-46b6-9783-57d2deda63e1 req-89d18b14-970f-49bd-adb9-acd10f39240b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Refreshing network info cache for port f8bab6cf-7dfd-4fa3-8984-36374cf87924 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:22.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.252 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Start _get_guest_xml network_info=[{"id": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "address": "fa:16:3e:bc:d8:52", "network": {"id": "67016734-de32-452f-b3cd-af3113cea332", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1036479945-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c21a3b8b2ec4521af33df81f746e3ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8bab6cf-7d", "ovs_interfaceid": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.260 2 WARNING nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.266 2 DEBUG nova.virt.libvirt.host [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.266 2 DEBUG nova.virt.libvirt.host [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.273 2 DEBUG nova.virt.libvirt.host [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.273 2 DEBUG nova.virt.libvirt.host [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.275 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.276 2 DEBUG nova.virt.hardware [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.276 2 DEBUG nova.virt.hardware [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.277 2 DEBUG nova.virt.hardware [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.277 2 DEBUG nova.virt.hardware [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.277 2 DEBUG nova.virt.hardware [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.277 2 DEBUG nova.virt.hardware [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.278 2 DEBUG nova.virt.hardware [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.278 2 DEBUG nova.virt.hardware [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.279 2 DEBUG nova.virt.hardware [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.279 2 DEBUG nova.virt.hardware [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.279 2 DEBUG nova.virt.hardware [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.283 2 DEBUG oslo_concurrency.processutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:22.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:11:22 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2799465910' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.767 2 DEBUG oslo_concurrency.processutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.798 2 DEBUG nova.storage.rbd_utils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] rbd image 598e1759-7095-483d-bfc5-34c50a97b4f8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:11:22 np0005465988 nova_compute[236126]: 2025-10-02 12:11:22.803 2 DEBUG oslo_concurrency.processutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:11:23 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1828815906' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.259 2 DEBUG oslo_concurrency.processutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.261 2 DEBUG nova.virt.libvirt.vif [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:11:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-25337122',display_name='tempest-ImagesNegativeTestJSON-server-25337122',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-25337122',id=50,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7c21a3b8b2ec4521af33df81f746e3ff',ramdisk_id='',reservation_id='r-zu71q878',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-522836631',owner_user_name='tempest-ImagesNegativeTestJSON-522836631-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:11:17Z,user_data=None,user_id='c2999fde0aa643028301c0aef2c02f66',uuid=598e1759-7095-483d-bfc5-34c50a97b4f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "address": "fa:16:3e:bc:d8:52", "network": {"id": "67016734-de32-452f-b3cd-af3113cea332", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1036479945-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c21a3b8b2ec4521af33df81f746e3ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8bab6cf-7d", "ovs_interfaceid": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.261 2 DEBUG nova.network.os_vif_util [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Converting VIF {"id": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "address": "fa:16:3e:bc:d8:52", "network": {"id": "67016734-de32-452f-b3cd-af3113cea332", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1036479945-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c21a3b8b2ec4521af33df81f746e3ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8bab6cf-7d", "ovs_interfaceid": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.262 2 DEBUG nova.network.os_vif_util [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:d8:52,bridge_name='br-int',has_traffic_filtering=True,id=f8bab6cf-7dfd-4fa3-8984-36374cf87924,network=Network(67016734-de32-452f-b3cd-af3113cea332),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8bab6cf-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.263 2 DEBUG nova.objects.instance [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lazy-loading 'pci_devices' on Instance uuid 598e1759-7095-483d-bfc5-34c50a97b4f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.277 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  <uuid>598e1759-7095-483d-bfc5-34c50a97b4f8</uuid>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  <name>instance-00000032</name>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <nova:name>tempest-ImagesNegativeTestJSON-server-25337122</nova:name>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:11:22</nova:creationTime>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <nova:user uuid="c2999fde0aa643028301c0aef2c02f66">tempest-ImagesNegativeTestJSON-522836631-project-member</nova:user>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <nova:project uuid="7c21a3b8b2ec4521af33df81f746e3ff">tempest-ImagesNegativeTestJSON-522836631</nova:project>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <nova:port uuid="f8bab6cf-7dfd-4fa3-8984-36374cf87924">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <entry name="serial">598e1759-7095-483d-bfc5-34c50a97b4f8</entry>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <entry name="uuid">598e1759-7095-483d-bfc5-34c50a97b4f8</entry>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/598e1759-7095-483d-bfc5-34c50a97b4f8_disk">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/598e1759-7095-483d-bfc5-34c50a97b4f8_disk.config">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:bc:d8:52"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <target dev="tapf8bab6cf-7d"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/598e1759-7095-483d-bfc5-34c50a97b4f8/console.log" append="off"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:11:23 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:11:23 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:11:23 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:11:23 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.278 2 DEBUG nova.compute.manager [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Preparing to wait for external event network-vif-plugged-f8bab6cf-7dfd-4fa3-8984-36374cf87924 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.278 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Acquiring lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.278 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.278 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.279 2 DEBUG nova.virt.libvirt.vif [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:11:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-25337122',display_name='tempest-ImagesNegativeTestJSON-server-25337122',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-25337122',id=50,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7c21a3b8b2ec4521af33df81f746e3ff',ramdisk_id='',reservation_id='r-zu71q878',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-522836631',owner_user_name='tempest-ImagesNegativeTes
tJSON-522836631-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:11:17Z,user_data=None,user_id='c2999fde0aa643028301c0aef2c02f66',uuid=598e1759-7095-483d-bfc5-34c50a97b4f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "address": "fa:16:3e:bc:d8:52", "network": {"id": "67016734-de32-452f-b3cd-af3113cea332", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1036479945-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c21a3b8b2ec4521af33df81f746e3ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8bab6cf-7d", "ovs_interfaceid": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.279 2 DEBUG nova.network.os_vif_util [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Converting VIF {"id": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "address": "fa:16:3e:bc:d8:52", "network": {"id": "67016734-de32-452f-b3cd-af3113cea332", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1036479945-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c21a3b8b2ec4521af33df81f746e3ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8bab6cf-7d", "ovs_interfaceid": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.280 2 DEBUG nova.network.os_vif_util [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:d8:52,bridge_name='br-int',has_traffic_filtering=True,id=f8bab6cf-7dfd-4fa3-8984-36374cf87924,network=Network(67016734-de32-452f-b3cd-af3113cea332),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8bab6cf-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.280 2 DEBUG os_vif [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:d8:52,bridge_name='br-int',has_traffic_filtering=True,id=f8bab6cf-7dfd-4fa3-8984-36374cf87924,network=Network(67016734-de32-452f-b3cd-af3113cea332),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8bab6cf-7d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.281 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.281 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.284 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8bab6cf-7d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.285 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf8bab6cf-7d, col_values=(('external_ids', {'iface-id': 'f8bab6cf-7dfd-4fa3-8984-36374cf87924', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bc:d8:52', 'vm-uuid': '598e1759-7095-483d-bfc5-34c50a97b4f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:23 np0005465988 NetworkManager[45041]: <info>  [1759407083.2877] manager: (tapf8bab6cf-7d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.293 2 INFO os_vif [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:d8:52,bridge_name='br-int',has_traffic_filtering=True,id=f8bab6cf-7dfd-4fa3-8984-36374cf87924,network=Network(67016734-de32-452f-b3cd-af3113cea332),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8bab6cf-7d')#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.343 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.343 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.344 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] No VIF found with MAC fa:16:3e:bc:d8:52, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.344 2 INFO nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Using config drive#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.365 2 DEBUG nova.storage.rbd_utils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] rbd image 598e1759-7095-483d-bfc5-34c50a97b4f8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.395 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407068.3943086, d545e995-ceb2-43df-97c6-0549bc4d6da4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.395 2 INFO nova.compute.manager [-] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.417 2 DEBUG nova.compute.manager [None req-047de9bd-f7bb-4124-b2c1-95ee65c5ccf7 - - - - - -] [instance: d545e995-ceb2-43df-97c6-0549bc4d6da4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.975 2 INFO nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Creating config drive at /var/lib/nova/instances/598e1759-7095-483d-bfc5-34c50a97b4f8/disk.config#033[00m
Oct  2 08:11:23 np0005465988 nova_compute[236126]: 2025-10-02 12:11:23.984 2 DEBUG oslo_concurrency.processutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/598e1759-7095-483d-bfc5-34c50a97b4f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcfzz815n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:24 np0005465988 nova_compute[236126]: 2025-10-02 12:11:24.130 2 DEBUG oslo_concurrency.processutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/598e1759-7095-483d-bfc5-34c50a97b4f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcfzz815n" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:24 np0005465988 nova_compute[236126]: 2025-10-02 12:11:24.173 2 DEBUG nova.storage.rbd_utils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] rbd image 598e1759-7095-483d-bfc5-34c50a97b4f8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:11:24 np0005465988 nova_compute[236126]: 2025-10-02 12:11:24.179 2 DEBUG oslo_concurrency.processutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/598e1759-7095-483d-bfc5-34c50a97b4f8/disk.config 598e1759-7095-483d-bfc5-34c50a97b4f8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:24.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:24.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:24 np0005465988 nova_compute[236126]: 2025-10-02 12:11:24.758 2 DEBUG oslo_concurrency.processutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/598e1759-7095-483d-bfc5-34c50a97b4f8/disk.config 598e1759-7095-483d-bfc5-34c50a97b4f8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.579s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:24 np0005465988 nova_compute[236126]: 2025-10-02 12:11:24.760 2 INFO nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Deleting local config drive /var/lib/nova/instances/598e1759-7095-483d-bfc5-34c50a97b4f8/disk.config because it was imported into RBD.#033[00m
Oct  2 08:11:24 np0005465988 kernel: tapf8bab6cf-7d: entered promiscuous mode
Oct  2 08:11:24 np0005465988 NetworkManager[45041]: <info>  [1759407084.8326] manager: (tapf8bab6cf-7d): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Oct  2 08:11:24 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:24Z|00117|binding|INFO|Claiming lport f8bab6cf-7dfd-4fa3-8984-36374cf87924 for this chassis.
Oct  2 08:11:24 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:24Z|00118|binding|INFO|f8bab6cf-7dfd-4fa3-8984-36374cf87924: Claiming fa:16:3e:bc:d8:52 10.100.0.12
Oct  2 08:11:24 np0005465988 nova_compute[236126]: 2025-10-02 12:11:24.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:24 np0005465988 nova_compute[236126]: 2025-10-02 12:11:24.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:24.860 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:d8:52 10.100.0.12'], port_security=['fa:16:3e:bc:d8:52 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '598e1759-7095-483d-bfc5-34c50a97b4f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-67016734-de32-452f-b3cd-af3113cea332', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7c21a3b8b2ec4521af33df81f746e3ff', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9c34afd8-b630-4d78-b08d-ba19afcaceb9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50a352a1-57b3-4502-a354-b0a1acd68cff, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f8bab6cf-7dfd-4fa3-8984-36374cf87924) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:24.861 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f8bab6cf-7dfd-4fa3-8984-36374cf87924 in datapath 67016734-de32-452f-b3cd-af3113cea332 bound to our chassis#033[00m
Oct  2 08:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:24.863 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 67016734-de32-452f-b3cd-af3113cea332#033[00m
Oct  2 08:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:24.880 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[19b0ace9-d190-4126-b5ae-86a769980bf8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:24.881 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap67016734-d1 in ovnmeta-67016734-de32-452f-b3cd-af3113cea332 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:11:24 np0005465988 systemd-udevd[257239]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:11:24 np0005465988 systemd-machined[192594]: New machine qemu-19-instance-00000032.
Oct  2 08:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:24.884 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap67016734-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:24.884 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[561fb901-a004-437b-a872-93b5e3f17f05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:24.885 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cf96b9b1-5f3a-4561-bb78-9ea44a273a80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:24.900 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[0f2bf684-bb5c-4370-bced-3fad433f59f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:24 np0005465988 NetworkManager[45041]: <info>  [1759407084.9048] device (tapf8bab6cf-7d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:11:24 np0005465988 NetworkManager[45041]: <info>  [1759407084.9061] device (tapf8bab6cf-7d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:11:24 np0005465988 nova_compute[236126]: 2025-10-02 12:11:24.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:24.927 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c19e3e38-5b6f-491e-a716-fe2625d55151]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:24 np0005465988 systemd[1]: Started Virtual Machine qemu-19-instance-00000032.
Oct  2 08:11:24 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:24Z|00119|binding|INFO|Setting lport f8bab6cf-7dfd-4fa3-8984-36374cf87924 ovn-installed in OVS
Oct  2 08:11:24 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:24Z|00120|binding|INFO|Setting lport f8bab6cf-7dfd-4fa3-8984-36374cf87924 up in Southbound
Oct  2 08:11:24 np0005465988 nova_compute[236126]: 2025-10-02 12:11:24.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:24.962 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[54f48f5b-71c1-4d53-9767-795ccfb9497b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:24.968 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7bfe30-a30a-459b-ac59-719b2bb7f5c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:24 np0005465988 NetworkManager[45041]: <info>  [1759407084.9693] manager: (tap67016734-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Oct  2 08:11:24 np0005465988 systemd-udevd[257242]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.001 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[6620657c-6ce6-46b4-8813-62140505b856]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.005 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ae138959-176f-49d1-964d-681f6ecf9617]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:25 np0005465988 NetworkManager[45041]: <info>  [1759407085.0291] device (tap67016734-d0): carrier: link connected
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.036 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f540c55c-c9af-41f1-b045-c18d8ccc14a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.053 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[334e664a-a40e-446c-92ca-a6a4411312fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap67016734-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:0f:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514536, 'reachable_time': 30183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257271, 'error': None, 'target': 'ovnmeta-67016734-de32-452f-b3cd-af3113cea332', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:25 np0005465988 nova_compute[236126]: 2025-10-02 12:11:25.062 2 DEBUG nova.network.neutron [req-dbb94495-fc3c-46b6-9783-57d2deda63e1 req-89d18b14-970f-49bd-adb9-acd10f39240b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Updated VIF entry in instance network info cache for port f8bab6cf-7dfd-4fa3-8984-36374cf87924. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:25 np0005465988 nova_compute[236126]: 2025-10-02 12:11:25.062 2 DEBUG nova.network.neutron [req-dbb94495-fc3c-46b6-9783-57d2deda63e1 req-89d18b14-970f-49bd-adb9-acd10f39240b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Updating instance_info_cache with network_info: [{"id": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "address": "fa:16:3e:bc:d8:52", "network": {"id": "67016734-de32-452f-b3cd-af3113cea332", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1036479945-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c21a3b8b2ec4521af33df81f746e3ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8bab6cf-7d", "ovs_interfaceid": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.069 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4003b2dd-832f-4449-aeda-a185810ba5ec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8b:f8e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 514536, 'tstamp': 514536}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257272, 'error': None, 'target': 'ovnmeta-67016734-de32-452f-b3cd-af3113cea332', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.092 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[470ff7a7-a785-495b-9c63-da3b4b285012]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap67016734-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:0f:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514536, 'reachable_time': 30183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 257273, 'error': None, 'target': 'ovnmeta-67016734-de32-452f-b3cd-af3113cea332', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:25 np0005465988 nova_compute[236126]: 2025-10-02 12:11:25.097 2 DEBUG oslo_concurrency.lockutils [req-dbb94495-fc3c-46b6-9783-57d2deda63e1 req-89d18b14-970f-49bd-adb9-acd10f39240b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-598e1759-7095-483d-bfc5-34c50a97b4f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.135 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fda0beb9-c30f-48cc-a4d1-500fc318488c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.199 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dc67f288-a74f-4386-9601-0a1734c83e7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.200 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap67016734-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.200 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.201 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap67016734-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:25 np0005465988 kernel: tap67016734-d0: entered promiscuous mode
Oct  2 08:11:25 np0005465988 NetworkManager[45041]: <info>  [1759407085.2037] manager: (tap67016734-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Oct  2 08:11:25 np0005465988 nova_compute[236126]: 2025-10-02 12:11:25.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:25 np0005465988 nova_compute[236126]: 2025-10-02 12:11:25.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.207 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap67016734-d0, col_values=(('external_ids', {'iface-id': 'a21720a3-e50e-498c-bc80-61d1ff0f95a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:25Z|00121|binding|INFO|Releasing lport a21720a3-e50e-498c-bc80-61d1ff0f95a8 from this chassis (sb_readonly=0)
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.209 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/67016734-de32-452f-b3cd-af3113cea332.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/67016734-de32-452f-b3cd-af3113cea332.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:11:25 np0005465988 nova_compute[236126]: 2025-10-02 12:11:25.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.221 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5b55b2e1-e681-4a8c-9a4b-18a8c13fdb6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.222 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-67016734-de32-452f-b3cd-af3113cea332
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/67016734-de32-452f-b3cd-af3113cea332.pid.haproxy
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 67016734-de32-452f-b3cd-af3113cea332
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:25.222 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-67016734-de32-452f-b3cd-af3113cea332', 'env', 'PROCESS_TAG=haproxy-67016734-de32-452f-b3cd-af3113cea332', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/67016734-de32-452f-b3cd-af3113cea332.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:11:25 np0005465988 nova_compute[236126]: 2025-10-02 12:11:25.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:25 np0005465988 nova_compute[236126]: 2025-10-02 12:11:25.254 2 DEBUG nova.compute.manager [req-39a3a4e3-0623-4205-a652-75ee47f333db req-62028bba-8c93-446b-95ff-8cdaac6013af d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Received event network-vif-plugged-f8bab6cf-7dfd-4fa3-8984-36374cf87924 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:25 np0005465988 nova_compute[236126]: 2025-10-02 12:11:25.255 2 DEBUG oslo_concurrency.lockutils [req-39a3a4e3-0623-4205-a652-75ee47f333db req-62028bba-8c93-446b-95ff-8cdaac6013af d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:25 np0005465988 nova_compute[236126]: 2025-10-02 12:11:25.256 2 DEBUG oslo_concurrency.lockutils [req-39a3a4e3-0623-4205-a652-75ee47f333db req-62028bba-8c93-446b-95ff-8cdaac6013af d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:25 np0005465988 nova_compute[236126]: 2025-10-02 12:11:25.256 2 DEBUG oslo_concurrency.lockutils [req-39a3a4e3-0623-4205-a652-75ee47f333db req-62028bba-8c93-446b-95ff-8cdaac6013af d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:25 np0005465988 nova_compute[236126]: 2025-10-02 12:11:25.256 2 DEBUG nova.compute.manager [req-39a3a4e3-0623-4205-a652-75ee47f333db req-62028bba-8c93-446b-95ff-8cdaac6013af d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Processing event network-vif-plugged-f8bab6cf-7dfd-4fa3-8984-36374cf87924 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:11:25 np0005465988 podman[257305]: 2025-10-02 12:11:25.585418849 +0000 UTC m=+0.043896797 container create c68d8f401080549b6a79c84be359ac9909183cebd44ee0c8fcc57ff628057168 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:11:25 np0005465988 systemd[1]: Started libpod-conmon-c68d8f401080549b6a79c84be359ac9909183cebd44ee0c8fcc57ff628057168.scope.
Oct  2 08:11:25 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:11:25 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b756acba8bc2c88c475c407acd5e8380900e48427226fc7d2d988089684bb710/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:11:25 np0005465988 podman[257305]: 2025-10-02 12:11:25.563667437 +0000 UTC m=+0.022145385 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:11:25 np0005465988 podman[257305]: 2025-10-02 12:11:25.66498416 +0000 UTC m=+0.123462118 container init c68d8f401080549b6a79c84be359ac9909183cebd44ee0c8fcc57ff628057168 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 08:11:25 np0005465988 podman[257305]: 2025-10-02 12:11:25.675431424 +0000 UTC m=+0.133909362 container start c68d8f401080549b6a79c84be359ac9909183cebd44ee0c8fcc57ff628057168 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  2 08:11:25 np0005465988 neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332[257320]: [NOTICE]   (257324) : New worker (257326) forked
Oct  2 08:11:25 np0005465988 neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332[257320]: [NOTICE]   (257324) : Loading success.
Oct  2 08:11:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:26.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:26.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:26 np0005465988 nova_compute[236126]: 2025-10-02 12:11:26.982 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407086.9820085, 598e1759-7095-483d-bfc5-34c50a97b4f8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:26 np0005465988 nova_compute[236126]: 2025-10-02 12:11:26.982 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] VM Started (Lifecycle Event)#033[00m
Oct  2 08:11:26 np0005465988 nova_compute[236126]: 2025-10-02 12:11:26.984 2 DEBUG nova.compute.manager [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:11:26 np0005465988 nova_compute[236126]: 2025-10-02 12:11:26.990 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:11:26 np0005465988 nova_compute[236126]: 2025-10-02 12:11:26.993 2 INFO nova.virt.libvirt.driver [-] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Instance spawned successfully.#033[00m
Oct  2 08:11:26 np0005465988 nova_compute[236126]: 2025-10-02 12:11:26.993 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.021 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.027 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.031 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.031 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.031 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.032 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.032 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.032 2 DEBUG nova.virt.libvirt.driver [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.059 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.059 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407086.984624, 598e1759-7095-483d-bfc5-34c50a97b4f8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.059 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.087 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.091 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407086.9881797, 598e1759-7095-483d-bfc5-34c50a97b4f8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.092 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.101 2 INFO nova.compute.manager [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Took 9.63 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.101 2 DEBUG nova.compute.manager [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.139 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.142 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.171 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.190 2 INFO nova.compute.manager [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Took 10.54 seconds to build instance.#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.219 2 DEBUG oslo_concurrency.lockutils [None req-ec291c5e-12b0-48eb-9584-5590f369f4c7 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:11:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:27.338 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:27.339 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:27.340 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.414 2 DEBUG nova.compute.manager [req-64f112b7-6bcb-4843-94fb-a173500c20a1 req-bbf9e7a9-dfa5-4026-ada7-a01c155f2fda d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Received event network-vif-plugged-f8bab6cf-7dfd-4fa3-8984-36374cf87924 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.415 2 DEBUG oslo_concurrency.lockutils [req-64f112b7-6bcb-4843-94fb-a173500c20a1 req-bbf9e7a9-dfa5-4026-ada7-a01c155f2fda d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.415 2 DEBUG oslo_concurrency.lockutils [req-64f112b7-6bcb-4843-94fb-a173500c20a1 req-bbf9e7a9-dfa5-4026-ada7-a01c155f2fda d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.416 2 DEBUG oslo_concurrency.lockutils [req-64f112b7-6bcb-4843-94fb-a173500c20a1 req-bbf9e7a9-dfa5-4026-ada7-a01c155f2fda d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.416 2 DEBUG nova.compute.manager [req-64f112b7-6bcb-4843-94fb-a173500c20a1 req-bbf9e7a9-dfa5-4026-ada7-a01c155f2fda d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] No waiting events found dispatching network-vif-plugged-f8bab6cf-7dfd-4fa3-8984-36374cf87924 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:27 np0005465988 nova_compute[236126]: 2025-10-02 12:11:27.416 2 WARNING nova.compute.manager [req-64f112b7-6bcb-4843-94fb-a173500c20a1 req-bbf9e7a9-dfa5-4026-ada7-a01c155f2fda d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Received unexpected event network-vif-plugged-f8bab6cf-7dfd-4fa3-8984-36374cf87924 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.092 2 DEBUG oslo_concurrency.lockutils [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Acquiring lock "598e1759-7095-483d-bfc5-34c50a97b4f8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.093 2 DEBUG oslo_concurrency.lockutils [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.093 2 DEBUG oslo_concurrency.lockutils [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Acquiring lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.094 2 DEBUG oslo_concurrency.lockutils [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.094 2 DEBUG oslo_concurrency.lockutils [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.096 2 INFO nova.compute.manager [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Terminating instance#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.097 2 DEBUG nova.compute.manager [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:11:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:28.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:28 np0005465988 kernel: tapf8bab6cf-7d (unregistering): left promiscuous mode
Oct  2 08:11:28 np0005465988 NetworkManager[45041]: <info>  [1759407088.4137] device (tapf8bab6cf-7d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:28Z|00122|binding|INFO|Releasing lport f8bab6cf-7dfd-4fa3-8984-36374cf87924 from this chassis (sb_readonly=0)
Oct  2 08:11:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:28Z|00123|binding|INFO|Setting lport f8bab6cf-7dfd-4fa3-8984-36374cf87924 down in Southbound
Oct  2 08:11:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:28Z|00124|binding|INFO|Removing iface tapf8bab6cf-7d ovn-installed in OVS
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:28 np0005465988 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000032.scope: Deactivated successfully.
Oct  2 08:11:28 np0005465988 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000032.scope: Consumed 2.902s CPU time.
Oct  2 08:11:28 np0005465988 systemd-machined[192594]: Machine qemu-19-instance-00000032 terminated.
Oct  2 08:11:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:28.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.538 2 INFO nova.virt.libvirt.driver [-] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Instance destroyed successfully.#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.538 2 DEBUG nova.objects.instance [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lazy-loading 'resources' on Instance uuid 598e1759-7095-483d-bfc5-34c50a97b4f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:28.576 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:d8:52 10.100.0.12'], port_security=['fa:16:3e:bc:d8:52 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '598e1759-7095-483d-bfc5-34c50a97b4f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-67016734-de32-452f-b3cd-af3113cea332', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7c21a3b8b2ec4521af33df81f746e3ff', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9c34afd8-b630-4d78-b08d-ba19afcaceb9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50a352a1-57b3-4502-a354-b0a1acd68cff, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f8bab6cf-7dfd-4fa3-8984-36374cf87924) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:28.578 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f8bab6cf-7dfd-4fa3-8984-36374cf87924 in datapath 67016734-de32-452f-b3cd-af3113cea332 unbound from our chassis#033[00m
Oct  2 08:11:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:28.581 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 67016734-de32-452f-b3cd-af3113cea332, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:11:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:28.582 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5e1b2fe5-a2ed-4836-b61d-57d9138933d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:28.583 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-67016734-de32-452f-b3cd-af3113cea332 namespace which is not needed anymore#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.588 2 DEBUG nova.virt.libvirt.vif [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:11:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-25337122',display_name='tempest-ImagesNegativeTestJSON-server-25337122',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-25337122',id=50,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:11:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7c21a3b8b2ec4521af33df81f746e3ff',ramdisk_id='',reservation_id='r-zu71q878',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesNegativeTestJSON-522836631',owner_user_name='tempest-ImagesNegativeTestJSON-522836631-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:11:27Z,user_data=None,user_id='c2999fde0aa643028301c0aef2c02f66',uuid=598e1759-7095-483d-bfc5-34c50a97b4f8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "address": "fa:16:3e:bc:d8:52", "network": {"id": "67016734-de32-452f-b3cd-af3113cea332", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1036479945-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c21a3b8b2ec4521af33df81f746e3ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8bab6cf-7d", "ovs_interfaceid": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.588 2 DEBUG nova.network.os_vif_util [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Converting VIF {"id": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "address": "fa:16:3e:bc:d8:52", "network": {"id": "67016734-de32-452f-b3cd-af3113cea332", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1036479945-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c21a3b8b2ec4521af33df81f746e3ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8bab6cf-7d", "ovs_interfaceid": "f8bab6cf-7dfd-4fa3-8984-36374cf87924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.589 2 DEBUG nova.network.os_vif_util [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:d8:52,bridge_name='br-int',has_traffic_filtering=True,id=f8bab6cf-7dfd-4fa3-8984-36374cf87924,network=Network(67016734-de32-452f-b3cd-af3113cea332),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8bab6cf-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.589 2 DEBUG os_vif [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:d8:52,bridge_name='br-int',has_traffic_filtering=True,id=f8bab6cf-7dfd-4fa3-8984-36374cf87924,network=Network(67016734-de32-452f-b3cd-af3113cea332),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8bab6cf-7d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.590 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8bab6cf-7d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.600 2 INFO os_vif [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:d8:52,bridge_name='br-int',has_traffic_filtering=True,id=f8bab6cf-7dfd-4fa3-8984-36374cf87924,network=Network(67016734-de32-452f-b3cd-af3113cea332),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8bab6cf-7d')#033[00m
Oct  2 08:11:28 np0005465988 neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332[257320]: [NOTICE]   (257324) : haproxy version is 2.8.14-c23fe91
Oct  2 08:11:28 np0005465988 neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332[257320]: [NOTICE]   (257324) : path to executable is /usr/sbin/haproxy
Oct  2 08:11:28 np0005465988 neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332[257320]: [WARNING]  (257324) : Exiting Master process...
Oct  2 08:11:28 np0005465988 neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332[257320]: [ALERT]    (257324) : Current worker (257326) exited with code 143 (Terminated)
Oct  2 08:11:28 np0005465988 neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332[257320]: [WARNING]  (257324) : All workers exited. Exiting... (0)
Oct  2 08:11:28 np0005465988 systemd[1]: libpod-c68d8f401080549b6a79c84be359ac9909183cebd44ee0c8fcc57ff628057168.scope: Deactivated successfully.
Oct  2 08:11:28 np0005465988 podman[257431]: 2025-10-02 12:11:28.731975421 +0000 UTC m=+0.054435452 container died c68d8f401080549b6a79c84be359ac9909183cebd44ee0c8fcc57ff628057168 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:11:28 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c68d8f401080549b6a79c84be359ac9909183cebd44ee0c8fcc57ff628057168-userdata-shm.mount: Deactivated successfully.
Oct  2 08:11:28 np0005465988 systemd[1]: var-lib-containers-storage-overlay-b756acba8bc2c88c475c407acd5e8380900e48427226fc7d2d988089684bb710-merged.mount: Deactivated successfully.
Oct  2 08:11:28 np0005465988 podman[257431]: 2025-10-02 12:11:28.776153815 +0000 UTC m=+0.098613856 container cleanup c68d8f401080549b6a79c84be359ac9909183cebd44ee0c8fcc57ff628057168 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:11:28 np0005465988 systemd[1]: libpod-conmon-c68d8f401080549b6a79c84be359ac9909183cebd44ee0c8fcc57ff628057168.scope: Deactivated successfully.
Oct  2 08:11:28 np0005465988 podman[257462]: 2025-10-02 12:11:28.859763894 +0000 UTC m=+0.054211836 container remove c68d8f401080549b6a79c84be359ac9909183cebd44ee0c8fcc57ff628057168 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:11:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:28.866 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[87fde11d-0d0d-4965-a0af-85e68f322043]: (4, ('Thu Oct  2 12:11:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332 (c68d8f401080549b6a79c84be359ac9909183cebd44ee0c8fcc57ff628057168)\nc68d8f401080549b6a79c84be359ac9909183cebd44ee0c8fcc57ff628057168\nThu Oct  2 12:11:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-67016734-de32-452f-b3cd-af3113cea332 (c68d8f401080549b6a79c84be359ac9909183cebd44ee0c8fcc57ff628057168)\nc68d8f401080549b6a79c84be359ac9909183cebd44ee0c8fcc57ff628057168\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:28.869 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[93f611bc-358a-4c06-a9d5-5230a9967330]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:28.870 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap67016734-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:28 np0005465988 kernel: tap67016734-d0: left promiscuous mode
Oct  2 08:11:28 np0005465988 nova_compute[236126]: 2025-10-02 12:11:28.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:28.893 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ecf1ca3a-4c00-43dd-84a9-5fa725d04aa0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:28.926 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[619ea997-c212-4e27-805e-e264f7458faf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:28.928 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2ccb7160-6118-492b-bc45-0322783c4262]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:28.944 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d2c81dc9-2575-45f4-90b4-0b4631183d35]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514528, 'reachable_time': 40262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257478, 'error': None, 'target': 'ovnmeta-67016734-de32-452f-b3cd-af3113cea332', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:28 np0005465988 systemd[1]: run-netns-ovnmeta\x2d67016734\x2dde32\x2d452f\x2db3cd\x2daf3113cea332.mount: Deactivated successfully.
Oct  2 08:11:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:28.948 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-67016734-de32-452f-b3cd-af3113cea332 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:11:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:28.948 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[80475a85-48bf-4ff0-8124-a5eebf156788]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:29.425 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:29 np0005465988 nova_compute[236126]: 2025-10-02 12:11:29.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:29.427 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:11:29 np0005465988 nova_compute[236126]: 2025-10-02 12:11:29.544 2 DEBUG nova.compute.manager [req-2ff76d57-9d32-4e96-9bea-71bf6b44e439 req-ac24831a-485e-408d-8dc5-8b12526dc6b4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Received event network-vif-unplugged-f8bab6cf-7dfd-4fa3-8984-36374cf87924 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:29 np0005465988 nova_compute[236126]: 2025-10-02 12:11:29.545 2 DEBUG oslo_concurrency.lockutils [req-2ff76d57-9d32-4e96-9bea-71bf6b44e439 req-ac24831a-485e-408d-8dc5-8b12526dc6b4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:29 np0005465988 nova_compute[236126]: 2025-10-02 12:11:29.545 2 DEBUG oslo_concurrency.lockutils [req-2ff76d57-9d32-4e96-9bea-71bf6b44e439 req-ac24831a-485e-408d-8dc5-8b12526dc6b4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:29 np0005465988 nova_compute[236126]: 2025-10-02 12:11:29.546 2 DEBUG oslo_concurrency.lockutils [req-2ff76d57-9d32-4e96-9bea-71bf6b44e439 req-ac24831a-485e-408d-8dc5-8b12526dc6b4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:29 np0005465988 nova_compute[236126]: 2025-10-02 12:11:29.546 2 DEBUG nova.compute.manager [req-2ff76d57-9d32-4e96-9bea-71bf6b44e439 req-ac24831a-485e-408d-8dc5-8b12526dc6b4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] No waiting events found dispatching network-vif-unplugged-f8bab6cf-7dfd-4fa3-8984-36374cf87924 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:29 np0005465988 nova_compute[236126]: 2025-10-02 12:11:29.546 2 DEBUG nova.compute.manager [req-2ff76d57-9d32-4e96-9bea-71bf6b44e439 req-ac24831a-485e-408d-8dc5-8b12526dc6b4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Received event network-vif-unplugged-f8bab6cf-7dfd-4fa3-8984-36374cf87924 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:11:29 np0005465988 nova_compute[236126]: 2025-10-02 12:11:29.546 2 DEBUG nova.compute.manager [req-2ff76d57-9d32-4e96-9bea-71bf6b44e439 req-ac24831a-485e-408d-8dc5-8b12526dc6b4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Received event network-vif-plugged-f8bab6cf-7dfd-4fa3-8984-36374cf87924 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:29 np0005465988 nova_compute[236126]: 2025-10-02 12:11:29.546 2 DEBUG oslo_concurrency.lockutils [req-2ff76d57-9d32-4e96-9bea-71bf6b44e439 req-ac24831a-485e-408d-8dc5-8b12526dc6b4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:29 np0005465988 nova_compute[236126]: 2025-10-02 12:11:29.547 2 DEBUG oslo_concurrency.lockutils [req-2ff76d57-9d32-4e96-9bea-71bf6b44e439 req-ac24831a-485e-408d-8dc5-8b12526dc6b4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:29 np0005465988 nova_compute[236126]: 2025-10-02 12:11:29.547 2 DEBUG oslo_concurrency.lockutils [req-2ff76d57-9d32-4e96-9bea-71bf6b44e439 req-ac24831a-485e-408d-8dc5-8b12526dc6b4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:29 np0005465988 nova_compute[236126]: 2025-10-02 12:11:29.547 2 DEBUG nova.compute.manager [req-2ff76d57-9d32-4e96-9bea-71bf6b44e439 req-ac24831a-485e-408d-8dc5-8b12526dc6b4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] No waiting events found dispatching network-vif-plugged-f8bab6cf-7dfd-4fa3-8984-36374cf87924 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:29 np0005465988 nova_compute[236126]: 2025-10-02 12:11:29.547 2 WARNING nova.compute.manager [req-2ff76d57-9d32-4e96-9bea-71bf6b44e439 req-ac24831a-485e-408d-8dc5-8b12526dc6b4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Received unexpected event network-vif-plugged-f8bab6cf-7dfd-4fa3-8984-36374cf87924 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:11:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:30.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:30.429 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:30.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:32 np0005465988 nova_compute[236126]: 2025-10-02 12:11:32.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:11:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:32.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:32.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:33 np0005465988 nova_compute[236126]: 2025-10-02 12:11:33.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:33 np0005465988 nova_compute[236126]: 2025-10-02 12:11:33.911 2 INFO nova.virt.libvirt.driver [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Deleting instance files /var/lib/nova/instances/598e1759-7095-483d-bfc5-34c50a97b4f8_del#033[00m
Oct  2 08:11:33 np0005465988 nova_compute[236126]: 2025-10-02 12:11:33.912 2 INFO nova.virt.libvirt.driver [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Deletion of /var/lib/nova/instances/598e1759-7095-483d-bfc5-34c50a97b4f8_del complete#033[00m
Oct  2 08:11:33 np0005465988 nova_compute[236126]: 2025-10-02 12:11:33.987 2 INFO nova.compute.manager [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Took 5.89 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:11:33 np0005465988 nova_compute[236126]: 2025-10-02 12:11:33.988 2 DEBUG oslo.service.loopingcall [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:11:33 np0005465988 nova_compute[236126]: 2025-10-02 12:11:33.990 2 DEBUG nova.compute.manager [-] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:11:33 np0005465988 nova_compute[236126]: 2025-10-02 12:11:33.990 2 DEBUG nova.network.neutron [-] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:11:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:34.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:11:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:34.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:11:34 np0005465988 podman[257531]: 2025-10-02 12:11:34.685297019 +0000 UTC m=+0.056019788 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Oct  2 08:11:35 np0005465988 nova_compute[236126]: 2025-10-02 12:11:35.054 2 DEBUG nova.network.neutron [-] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:35 np0005465988 nova_compute[236126]: 2025-10-02 12:11:35.076 2 INFO nova.compute.manager [-] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Took 1.09 seconds to deallocate network for instance.#033[00m
Oct  2 08:11:35 np0005465988 nova_compute[236126]: 2025-10-02 12:11:35.145 2 DEBUG nova.compute.manager [req-12b3e958-ed71-407e-9a02-64a1e7973488 req-213871ca-9823-4418-8970-ff8329788d82 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Received event network-vif-deleted-f8bab6cf-7dfd-4fa3-8984-36374cf87924 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:35 np0005465988 nova_compute[236126]: 2025-10-02 12:11:35.148 2 DEBUG oslo_concurrency.lockutils [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:35 np0005465988 nova_compute[236126]: 2025-10-02 12:11:35.149 2 DEBUG oslo_concurrency.lockutils [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:35 np0005465988 nova_compute[236126]: 2025-10-02 12:11:35.213 2 DEBUG oslo_concurrency.processutils [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:11:35 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2443498065' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:11:35 np0005465988 nova_compute[236126]: 2025-10-02 12:11:35.696 2 DEBUG oslo_concurrency.processutils [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:35 np0005465988 nova_compute[236126]: 2025-10-02 12:11:35.707 2 DEBUG nova.compute.provider_tree [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:11:35 np0005465988 nova_compute[236126]: 2025-10-02 12:11:35.738 2 DEBUG nova.scheduler.client.report [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:11:35 np0005465988 nova_compute[236126]: 2025-10-02 12:11:35.770 2 DEBUG oslo_concurrency.lockutils [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:35 np0005465988 nova_compute[236126]: 2025-10-02 12:11:35.804 2 INFO nova.scheduler.client.report [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Deleted allocations for instance 598e1759-7095-483d-bfc5-34c50a97b4f8#033[00m
Oct  2 08:11:35 np0005465988 nova_compute[236126]: 2025-10-02 12:11:35.889 2 DEBUG oslo_concurrency.lockutils [None req-70b595ab-186e-4318-a24f-f1f473162d50 c2999fde0aa643028301c0aef2c02f66 7c21a3b8b2ec4521af33df81f746e3ff - - default default] Lock "598e1759-7095-483d-bfc5-34c50a97b4f8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:36.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:36 np0005465988 nova_compute[236126]: 2025-10-02 12:11:36.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:36.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:37 np0005465988 nova_compute[236126]: 2025-10-02 12:11:37.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:11:37 np0005465988 nova_compute[236126]: 2025-10-02 12:11:37.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:37 np0005465988 nova_compute[236126]: 2025-10-02 12:11:37.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:37 np0005465988 nova_compute[236126]: 2025-10-02 12:11:37.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:37 np0005465988 nova_compute[236126]: 2025-10-02 12:11:37.502 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:37 np0005465988 nova_compute[236126]: 2025-10-02 12:11:37.503 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:37 np0005465988 nova_compute[236126]: 2025-10-02 12:11:37.503 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:37 np0005465988 nova_compute[236126]: 2025-10-02 12:11:37.504 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:11:37 np0005465988 nova_compute[236126]: 2025-10-02 12:11:37.504 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:11:37 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2360845791' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:11:37 np0005465988 nova_compute[236126]: 2025-10-02 12:11:37.990 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:38 np0005465988 nova_compute[236126]: 2025-10-02 12:11:38.226 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:11:38 np0005465988 nova_compute[236126]: 2025-10-02 12:11:38.228 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4738MB free_disk=20.92188262939453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:11:38 np0005465988 nova_compute[236126]: 2025-10-02 12:11:38.228 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:38 np0005465988 nova_compute[236126]: 2025-10-02 12:11:38.228 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:38.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:38 np0005465988 nova_compute[236126]: 2025-10-02 12:11:38.326 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:11:38 np0005465988 nova_compute[236126]: 2025-10-02 12:11:38.326 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:11:38 np0005465988 nova_compute[236126]: 2025-10-02 12:11:38.352 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:38.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:38 np0005465988 nova_compute[236126]: 2025-10-02 12:11:38.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:11:38 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2049217689' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:11:38 np0005465988 nova_compute[236126]: 2025-10-02 12:11:38.813 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:38 np0005465988 nova_compute[236126]: 2025-10-02 12:11:38.821 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:11:38 np0005465988 nova_compute[236126]: 2025-10-02 12:11:38.854 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:11:38 np0005465988 nova_compute[236126]: 2025-10-02 12:11:38.884 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:11:38 np0005465988 nova_compute[236126]: 2025-10-02 12:11:38.885 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:40 np0005465988 nova_compute[236126]: 2025-10-02 12:11:40.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:40.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:40.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e185 e185: 3 total, 3 up, 3 in
Oct  2 08:11:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e186 e186: 3 total, 3 up, 3 in
Oct  2 08:11:41 np0005465988 nova_compute[236126]: 2025-10-02 12:11:41.885 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:41 np0005465988 nova_compute[236126]: 2025-10-02 12:11:41.885 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:41 np0005465988 nova_compute[236126]: 2025-10-02 12:11:41.886 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:41 np0005465988 nova_compute[236126]: 2025-10-02 12:11:41.886 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:11:42 np0005465988 nova_compute[236126]: 2025-10-02 12:11:42.160 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Acquiring lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:42 np0005465988 nova_compute[236126]: 2025-10-02 12:11:42.160 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:42 np0005465988 nova_compute[236126]: 2025-10-02 12:11:42.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:42 np0005465988 nova_compute[236126]: 2025-10-02 12:11:42.204 2 DEBUG nova.compute.manager [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:11:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:11:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:42.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:42 np0005465988 nova_compute[236126]: 2025-10-02 12:11:42.357 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:42 np0005465988 nova_compute[236126]: 2025-10-02 12:11:42.358 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:42 np0005465988 nova_compute[236126]: 2025-10-02 12:11:42.372 2 DEBUG nova.virt.hardware [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:11:42 np0005465988 nova_compute[236126]: 2025-10-02 12:11:42.373 2 INFO nova.compute.claims [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:11:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:42.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:42 np0005465988 nova_compute[236126]: 2025-10-02 12:11:42.590 2 DEBUG oslo_concurrency.processutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e187 e187: 3 total, 3 up, 3 in
Oct  2 08:11:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:11:43 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/20976089' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:11:43 np0005465988 nova_compute[236126]: 2025-10-02 12:11:43.052 2 DEBUG oslo_concurrency.processutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:43 np0005465988 nova_compute[236126]: 2025-10-02 12:11:43.060 2 DEBUG nova.compute.provider_tree [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:11:43 np0005465988 nova_compute[236126]: 2025-10-02 12:11:43.079 2 DEBUG nova.scheduler.client.report [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:11:43 np0005465988 nova_compute[236126]: 2025-10-02 12:11:43.142 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:43 np0005465988 nova_compute[236126]: 2025-10-02 12:11:43.143 2 DEBUG nova.compute.manager [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:11:43 np0005465988 nova_compute[236126]: 2025-10-02 12:11:43.242 2 DEBUG nova.compute.manager [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:11:43 np0005465988 nova_compute[236126]: 2025-10-02 12:11:43.243 2 DEBUG nova.network.neutron [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:11:43 np0005465988 nova_compute[236126]: 2025-10-02 12:11:43.286 2 INFO nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:11:43 np0005465988 nova_compute[236126]: 2025-10-02 12:11:43.536 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407088.53475, 598e1759-7095-483d-bfc5-34c50a97b4f8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:43 np0005465988 nova_compute[236126]: 2025-10-02 12:11:43.537 2 INFO nova.compute.manager [-] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:11:43 np0005465988 nova_compute[236126]: 2025-10-02 12:11:43.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:43 np0005465988 nova_compute[236126]: 2025-10-02 12:11:43.855 2 DEBUG nova.compute.manager [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:11:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:44.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:44 np0005465988 nova_compute[236126]: 2025-10-02 12:11:44.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:44.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:44 np0005465988 nova_compute[236126]: 2025-10-02 12:11:44.930 2 DEBUG nova.compute.manager [None req-ffdd5100-4f26-4523-8cab-9d271853064f - - - - - -] [instance: 598e1759-7095-483d-bfc5-34c50a97b4f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.127 2 DEBUG nova.policy [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '149e3925214640b484809bc9362e31ac', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '515a20f509b440c1bda78c309dea196e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.174 2 DEBUG nova.compute.manager [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.176 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.176 2 INFO nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Creating image(s)#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.216 2 DEBUG nova.storage.rbd_utils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] rbd image 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.260 2 DEBUG nova.storage.rbd_utils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] rbd image 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.295 2 DEBUG nova.storage.rbd_utils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] rbd image 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.299 2 DEBUG oslo_concurrency.processutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.367 2 DEBUG oslo_concurrency.processutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.369 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.369 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.370 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.396 2 DEBUG nova.storage.rbd_utils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] rbd image 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.400 2 DEBUG oslo_concurrency.processutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.535 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:11:45 np0005465988 nova_compute[236126]: 2025-10-02 12:11:45.536 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:11:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:46.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:46.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:47 np0005465988 nova_compute[236126]: 2025-10-02 12:11:47.008 2 DEBUG oslo_concurrency.processutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.608s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:47 np0005465988 nova_compute[236126]: 2025-10-02 12:11:47.103 2 DEBUG nova.storage.rbd_utils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] resizing rbd image 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:11:47 np0005465988 nova_compute[236126]: 2025-10-02 12:11:47.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:11:47 np0005465988 nova_compute[236126]: 2025-10-02 12:11:47.705 2 DEBUG nova.objects.instance [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lazy-loading 'migration_context' on Instance uuid 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:47 np0005465988 nova_compute[236126]: 2025-10-02 12:11:47.911 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:11:47 np0005465988 nova_compute[236126]: 2025-10-02 12:11:47.912 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Ensure instance console log exists: /var/lib/nova/instances/7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:11:47 np0005465988 nova_compute[236126]: 2025-10-02 12:11:47.912 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:47 np0005465988 nova_compute[236126]: 2025-10-02 12:11:47.913 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:47 np0005465988 nova_compute[236126]: 2025-10-02 12:11:47.913 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:48.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:48 np0005465988 nova_compute[236126]: 2025-10-02 12:11:48.509 2 DEBUG nova.network.neutron [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Successfully created port: 19af83a7-4a5a-4802-b112-20ac31fbfae7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:11:48 np0005465988 nova_compute[236126]: 2025-10-02 12:11:48.531 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:48.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e188 e188: 3 total, 3 up, 3 in
Oct  2 08:11:48 np0005465988 nova_compute[236126]: 2025-10-02 12:11:48.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:49 np0005465988 podman[257820]: 2025-10-02 12:11:49.576694044 +0000 UTC m=+0.094778504 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:11:49 np0005465988 podman[257819]: 2025-10-02 12:11:49.594340057 +0000 UTC m=+0.121529872 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:11:49 np0005465988 podman[257821]: 2025-10-02 12:11:49.598013334 +0000 UTC m=+0.108685479 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:11:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:50.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e189 e189: 3 total, 3 up, 3 in
Oct  2 08:11:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:50.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:50 np0005465988 nova_compute[236126]: 2025-10-02 12:11:50.782 2 DEBUG nova.network.neutron [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Successfully updated port: 19af83a7-4a5a-4802-b112-20ac31fbfae7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:11:50 np0005465988 nova_compute[236126]: 2025-10-02 12:11:50.810 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Acquiring lock "refresh_cache-7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:50 np0005465988 nova_compute[236126]: 2025-10-02 12:11:50.810 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Acquired lock "refresh_cache-7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:50 np0005465988 nova_compute[236126]: 2025-10-02 12:11:50.810 2 DEBUG nova.network.neutron [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:11:51 np0005465988 nova_compute[236126]: 2025-10-02 12:11:51.102 2 DEBUG nova.compute.manager [req-2d6561fb-5dad-46d9-9a32-d6f339164430 req-e8b909c9-4f48-41af-b9d3-5744aab847b9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Received event network-changed-19af83a7-4a5a-4802-b112-20ac31fbfae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:51 np0005465988 nova_compute[236126]: 2025-10-02 12:11:51.102 2 DEBUG nova.compute.manager [req-2d6561fb-5dad-46d9-9a32-d6f339164430 req-e8b909c9-4f48-41af-b9d3-5744aab847b9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Refreshing instance network info cache due to event network-changed-19af83a7-4a5a-4802-b112-20ac31fbfae7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:51 np0005465988 nova_compute[236126]: 2025-10-02 12:11:51.102 2 DEBUG oslo_concurrency.lockutils [req-2d6561fb-5dad-46d9-9a32-d6f339164430 req-e8b909c9-4f48-41af-b9d3-5744aab847b9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:51 np0005465988 nova_compute[236126]: 2025-10-02 12:11:51.786 2 DEBUG nova.network.neutron [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:11:52 np0005465988 nova_compute[236126]: 2025-10-02 12:11:52.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:11:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:52.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:52.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.237145) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407113237272, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2759, "num_deletes": 518, "total_data_size": 5516023, "memory_usage": 5600512, "flush_reason": "Manual Compaction"}
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407113254729, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 3596897, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30914, "largest_seqno": 33668, "table_properties": {"data_size": 3586221, "index_size": 6338, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3269, "raw_key_size": 25900, "raw_average_key_size": 20, "raw_value_size": 3562533, "raw_average_value_size": 2755, "num_data_blocks": 274, "num_entries": 1293, "num_filter_entries": 1293, "num_deletions": 518, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759406930, "oldest_key_time": 1759406930, "file_creation_time": 1759407113, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 17669 microseconds, and 11171 cpu microseconds.
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.254819) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 3596897 bytes OK
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.254855) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.257089) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.257119) EVENT_LOG_v1 {"time_micros": 1759407113257108, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.257148) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 5502936, prev total WAL file size 5502936, number of live WAL files 2.
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.259690) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3512KB)], [60(8381KB)]
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407113259755, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 12179862, "oldest_snapshot_seqno": -1}
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 5616 keys, 10139363 bytes, temperature: kUnknown
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407113323430, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 10139363, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10100131, "index_size": 24082, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14085, "raw_key_size": 144404, "raw_average_key_size": 25, "raw_value_size": 9997436, "raw_average_value_size": 1780, "num_data_blocks": 968, "num_entries": 5616, "num_filter_entries": 5616, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759407113, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.323927) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 10139363 bytes
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.325425) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 190.9 rd, 158.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 8.2 +0.0 blob) out(9.7 +0.0 blob), read-write-amplify(6.2) write-amplify(2.8) OK, records in: 6668, records dropped: 1052 output_compression: NoCompression
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.325449) EVENT_LOG_v1 {"time_micros": 1759407113325435, "job": 36, "event": "compaction_finished", "compaction_time_micros": 63790, "compaction_time_cpu_micros": 41504, "output_level": 6, "num_output_files": 1, "total_output_size": 10139363, "num_input_records": 6668, "num_output_records": 5616, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407113326146, "job": 36, "event": "table_file_deletion", "file_number": 62}
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407113328274, "job": 36, "event": "table_file_deletion", "file_number": 60}
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.259430) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.328339) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.328346) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.328347) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.328349) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:11:53.328350) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.777 2 DEBUG nova.network.neutron [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Updating instance_info_cache with network_info: [{"id": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "address": "fa:16:3e:e2:ae:13", "network": {"id": "fd7850a2-e443-4918-8039-c177b9f865e9", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1493494643-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "515a20f509b440c1bda78c309dea196e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19af83a7-4a", "ovs_interfaceid": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.805 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Releasing lock "refresh_cache-7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.806 2 DEBUG nova.compute.manager [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Instance network_info: |[{"id": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "address": "fa:16:3e:e2:ae:13", "network": {"id": "fd7850a2-e443-4918-8039-c177b9f865e9", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1493494643-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "515a20f509b440c1bda78c309dea196e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19af83a7-4a", "ovs_interfaceid": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.806 2 DEBUG oslo_concurrency.lockutils [req-2d6561fb-5dad-46d9-9a32-d6f339164430 req-e8b909c9-4f48-41af-b9d3-5744aab847b9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.806 2 DEBUG nova.network.neutron [req-2d6561fb-5dad-46d9-9a32-d6f339164430 req-e8b909c9-4f48-41af-b9d3-5744aab847b9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Refreshing network info cache for port 19af83a7-4a5a-4802-b112-20ac31fbfae7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.809 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Start _get_guest_xml network_info=[{"id": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "address": "fa:16:3e:e2:ae:13", "network": {"id": "fd7850a2-e443-4918-8039-c177b9f865e9", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1493494643-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "515a20f509b440c1bda78c309dea196e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19af83a7-4a", "ovs_interfaceid": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.814 2 WARNING nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.821 2 DEBUG nova.virt.libvirt.host [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.822 2 DEBUG nova.virt.libvirt.host [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.826 2 DEBUG nova.virt.libvirt.host [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.827 2 DEBUG nova.virt.libvirt.host [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.828 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.828 2 DEBUG nova.virt.hardware [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.829 2 DEBUG nova.virt.hardware [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.829 2 DEBUG nova.virt.hardware [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.829 2 DEBUG nova.virt.hardware [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.830 2 DEBUG nova.virt.hardware [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.830 2 DEBUG nova.virt.hardware [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.830 2 DEBUG nova.virt.hardware [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.830 2 DEBUG nova.virt.hardware [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.830 2 DEBUG nova.virt.hardware [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.831 2 DEBUG nova.virt.hardware [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.831 2 DEBUG nova.virt.hardware [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:11:53 np0005465988 nova_compute[236126]: 2025-10-02 12:11:53.834 2 DEBUG oslo_concurrency.processutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:11:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2172909903' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.295 2 DEBUG oslo_concurrency.processutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:54.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.339 2 DEBUG nova.storage.rbd_utils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] rbd image 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.345 2 DEBUG oslo_concurrency.processutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:54.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:11:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2137773829' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.781 2 DEBUG oslo_concurrency.processutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.783 2 DEBUG nova.virt.libvirt.vif [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:11:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1925963903',display_name='tempest-ServersTestManualDisk-server-1925963903',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1925963903',id=52,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKLsZLJ0JgRu5rKRhdrh1/TyoKw8U2kXcryCIyZkfI9uIl3Zx/ZmKXNHGlC1E2QZKYEU/Q0USHyhPx5HdjE1pHIDHTUPiMDFrmsPvwA9v8UIEum4olA4ZiOqBX1lTtkFxw==',key_name='tempest-keypair-2000206289',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='515a20f509b440c1bda78c309dea196e',ramdisk_id='',reservation_id='r-div4k10m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1757413810',owner_user_name='tempest-ServersTestManualDisk-1757413810-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:11:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='149e3925214640b484809bc9362e31ac',uuid=7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "address": "fa:16:3e:e2:ae:13", "network": {"id": "fd7850a2-e443-4918-8039-c177b9f865e9", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1493494643-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "515a20f509b440c1bda78c309dea196e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19af83a7-4a", "ovs_interfaceid": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.783 2 DEBUG nova.network.os_vif_util [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Converting VIF {"id": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "address": "fa:16:3e:e2:ae:13", "network": {"id": "fd7850a2-e443-4918-8039-c177b9f865e9", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1493494643-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "515a20f509b440c1bda78c309dea196e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19af83a7-4a", "ovs_interfaceid": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.784 2 DEBUG nova.network.os_vif_util [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:ae:13,bridge_name='br-int',has_traffic_filtering=True,id=19af83a7-4a5a-4802-b112-20ac31fbfae7,network=Network(fd7850a2-e443-4918-8039-c177b9f865e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19af83a7-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.785 2 DEBUG nova.objects.instance [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lazy-loading 'pci_devices' on Instance uuid 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.824 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  <uuid>7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e</uuid>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  <name>instance-00000034</name>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServersTestManualDisk-server-1925963903</nova:name>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:11:53</nova:creationTime>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <nova:user uuid="149e3925214640b484809bc9362e31ac">tempest-ServersTestManualDisk-1757413810-project-member</nova:user>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <nova:project uuid="515a20f509b440c1bda78c309dea196e">tempest-ServersTestManualDisk-1757413810</nova:project>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <nova:port uuid="19af83a7-4a5a-4802-b112-20ac31fbfae7">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <entry name="serial">7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e</entry>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <entry name="uuid">7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e</entry>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_disk">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_disk.config">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:e2:ae:13"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <target dev="tap19af83a7-4a"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e/console.log" append="off"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:11:54 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:11:54 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:11:54 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:11:54 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.826 2 DEBUG nova.compute.manager [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Preparing to wait for external event network-vif-plugged-19af83a7-4a5a-4802-b112-20ac31fbfae7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.826 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Acquiring lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.826 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.826 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.827 2 DEBUG nova.virt.libvirt.vif [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:11:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1925963903',display_name='tempest-ServersTestManualDisk-server-1925963903',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1925963903',id=52,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKLsZLJ0JgRu5rKRhdrh1/TyoKw8U2kXcryCIyZkfI9uIl3Zx/ZmKXNHGlC1E2QZKYEU/Q0USHyhPx5HdjE1pHIDHTUPiMDFrmsPvwA9v8UIEum4olA4ZiOqBX1lTtkFxw==',key_name='tempest-keypair-2000206289',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='515a20f509b440c1bda78c309dea196e',ramdisk_id='',reservation_id='r-div4k10m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1757413810',owner_user_name='tempest-ServersTestManualDisk-1757413810-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:11:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='149e3925214640b484809bc9362e31ac',uuid=7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "address": "fa:16:3e:e2:ae:13", "network": {"id": "fd7850a2-e443-4918-8039-c177b9f865e9", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1493494643-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "515a20f509b440c1bda78c309dea196e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19af83a7-4a", "ovs_interfaceid": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.827 2 DEBUG nova.network.os_vif_util [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Converting VIF {"id": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "address": "fa:16:3e:e2:ae:13", "network": {"id": "fd7850a2-e443-4918-8039-c177b9f865e9", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1493494643-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "515a20f509b440c1bda78c309dea196e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19af83a7-4a", "ovs_interfaceid": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.828 2 DEBUG nova.network.os_vif_util [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:ae:13,bridge_name='br-int',has_traffic_filtering=True,id=19af83a7-4a5a-4802-b112-20ac31fbfae7,network=Network(fd7850a2-e443-4918-8039-c177b9f865e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19af83a7-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.828 2 DEBUG os_vif [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:ae:13,bridge_name='br-int',has_traffic_filtering=True,id=19af83a7-4a5a-4802-b112-20ac31fbfae7,network=Network(fd7850a2-e443-4918-8039-c177b9f865e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19af83a7-4a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.832 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19af83a7-4a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.832 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap19af83a7-4a, col_values=(('external_ids', {'iface-id': '19af83a7-4a5a-4802-b112-20ac31fbfae7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:ae:13', 'vm-uuid': '7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:54 np0005465988 NetworkManager[45041]: <info>  [1759407114.8344] manager: (tap19af83a7-4a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.841 2 INFO os_vif [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:ae:13,bridge_name='br-int',has_traffic_filtering=True,id=19af83a7-4a5a-4802-b112-20ac31fbfae7,network=Network(fd7850a2-e443-4918-8039-c177b9f865e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19af83a7-4a')#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.893 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.894 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.894 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] No VIF found with MAC fa:16:3e:e2:ae:13, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.895 2 INFO nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Using config drive#033[00m
Oct  2 08:11:54 np0005465988 nova_compute[236126]: 2025-10-02 12:11:54.935 2 DEBUG nova.storage.rbd_utils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] rbd image 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:11:56 np0005465988 nova_compute[236126]: 2025-10-02 12:11:56.033 2 INFO nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Creating config drive at /var/lib/nova/instances/7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e/disk.config#033[00m
Oct  2 08:11:56 np0005465988 nova_compute[236126]: 2025-10-02 12:11:56.050 2 DEBUG oslo_concurrency.processutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6bhq1ig4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:56 np0005465988 nova_compute[236126]: 2025-10-02 12:11:56.200 2 DEBUG oslo_concurrency.processutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6bhq1ig4" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:56 np0005465988 nova_compute[236126]: 2025-10-02 12:11:56.237 2 DEBUG nova.storage.rbd_utils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] rbd image 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:11:56 np0005465988 nova_compute[236126]: 2025-10-02 12:11:56.242 2 DEBUG oslo_concurrency.processutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e/disk.config 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:56.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:11:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:56.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:11:56 np0005465988 nova_compute[236126]: 2025-10-02 12:11:56.695 2 DEBUG oslo_concurrency.processutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e/disk.config 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:56 np0005465988 nova_compute[236126]: 2025-10-02 12:11:56.696 2 INFO nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Deleting local config drive /var/lib/nova/instances/7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e/disk.config because it was imported into RBD.#033[00m
Oct  2 08:11:56 np0005465988 kernel: tap19af83a7-4a: entered promiscuous mode
Oct  2 08:11:56 np0005465988 NetworkManager[45041]: <info>  [1759407116.7638] manager: (tap19af83a7-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Oct  2 08:11:56 np0005465988 systemd-udevd[258063]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:11:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:56Z|00125|binding|INFO|Claiming lport 19af83a7-4a5a-4802-b112-20ac31fbfae7 for this chassis.
Oct  2 08:11:56 np0005465988 nova_compute[236126]: 2025-10-02 12:11:56.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:56Z|00126|binding|INFO|19af83a7-4a5a-4802-b112-20ac31fbfae7: Claiming fa:16:3e:e2:ae:13 10.100.0.12
Oct  2 08:11:56 np0005465988 nova_compute[236126]: 2025-10-02 12:11:56.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:56 np0005465988 NetworkManager[45041]: <info>  [1759407116.8407] device (tap19af83a7-4a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:11:56 np0005465988 NetworkManager[45041]: <info>  [1759407116.8423] device (tap19af83a7-4a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:11:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:56.840 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:ae:13 10.100.0.12'], port_security=['fa:16:3e:e2:ae:13 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd7850a2-e443-4918-8039-c177b9f865e9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '515a20f509b440c1bda78c309dea196e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fa32e577-50b2-48b4-8e0f-570787035c44', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bc15082-7b92-4e80-902f-3ff67b73a485, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=19af83a7-4a5a-4802-b112-20ac31fbfae7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:56.842 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 19af83a7-4a5a-4802-b112-20ac31fbfae7 in datapath fd7850a2-e443-4918-8039-c177b9f865e9 bound to our chassis#033[00m
Oct  2 08:11:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:56.844 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fd7850a2-e443-4918-8039-c177b9f865e9#033[00m
Oct  2 08:11:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:56.857 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a4fb9296-1dba-4680-85fb-5ad56f860b26]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:56.858 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfd7850a2-e1 in ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:11:56 np0005465988 systemd-machined[192594]: New machine qemu-20-instance-00000034.
Oct  2 08:11:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:56.859 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfd7850a2-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:11:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:56.860 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[df5dc207-8529-499a-aaa5-911d0b453836]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:56.861 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6c5b5b8c-939d-4e24-bd2e-92e6dd60c041]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:56.875 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf02c81-ac2e-49b7-aab8-3ea0c0b5d5f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:56 np0005465988 systemd[1]: Started Virtual Machine qemu-20-instance-00000034.
Oct  2 08:11:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:56Z|00127|binding|INFO|Setting lport 19af83a7-4a5a-4802-b112-20ac31fbfae7 ovn-installed in OVS
Oct  2 08:11:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:56Z|00128|binding|INFO|Setting lport 19af83a7-4a5a-4802-b112-20ac31fbfae7 up in Southbound
Oct  2 08:11:56 np0005465988 nova_compute[236126]: 2025-10-02 12:11:56.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:56.904 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f7d37a80-f7e5-463f-8ed4-0d720d4fffde]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:56.943 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[607834e4-498b-4a57-8bde-7e5e39fa83f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:56 np0005465988 NetworkManager[45041]: <info>  [1759407116.9490] manager: (tapfd7850a2-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Oct  2 08:11:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:56.948 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d53cd2-187e-4007-b59b-b5860b724cb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:56 np0005465988 systemd-udevd[258068]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:11:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:56.981 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[790f8395-2432-4229-96e0-6c268502eeb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:56.985 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[33fe7732-7e48-453a-94be-d7c57dedb603]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:57 np0005465988 NetworkManager[45041]: <info>  [1759407117.0102] device (tapfd7850a2-e0): carrier: link connected
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:57.017 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[24595340-da16-4c1a-a037-c80f2640c00e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:57.037 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0b41c318-b2af-426c-8984-b75891416935]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd7850a2-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:72:cf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517734, 'reachable_time': 23085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258100, 'error': None, 'target': 'ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.042 2 DEBUG nova.network.neutron [req-2d6561fb-5dad-46d9-9a32-d6f339164430 req-e8b909c9-4f48-41af-b9d3-5744aab847b9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Updated VIF entry in instance network info cache for port 19af83a7-4a5a-4802-b112-20ac31fbfae7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.043 2 DEBUG nova.network.neutron [req-2d6561fb-5dad-46d9-9a32-d6f339164430 req-e8b909c9-4f48-41af-b9d3-5744aab847b9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Updating instance_info_cache with network_info: [{"id": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "address": "fa:16:3e:e2:ae:13", "network": {"id": "fd7850a2-e443-4918-8039-c177b9f865e9", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1493494643-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "515a20f509b440c1bda78c309dea196e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19af83a7-4a", "ovs_interfaceid": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:57.055 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[27596a71-6e3e-43e0-99e3-0368313daf29]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:72cf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517734, 'tstamp': 517734}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 258101, 'error': None, 'target': 'ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.062 2 DEBUG oslo_concurrency.lockutils [req-2d6561fb-5dad-46d9-9a32-d6f339164430 req-e8b909c9-4f48-41af-b9d3-5744aab847b9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:57.077 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a4939bbd-b174-458c-802b-3d391d7926e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd7850a2-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:72:cf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517734, 'reachable_time': 23085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 258102, 'error': None, 'target': 'ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:57.108 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ca0ea196-b805-40a5-a994-a82a8b670e81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:57.167 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0e84deae-ec9d-45b0-8ee5-92579d652794]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:57.168 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd7850a2-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:57.168 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:57.168 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd7850a2-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:57 np0005465988 kernel: tapfd7850a2-e0: entered promiscuous mode
Oct  2 08:11:57 np0005465988 NetworkManager[45041]: <info>  [1759407117.1708] manager: (tapfd7850a2-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:57.173 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfd7850a2-e0, col_values=(('external_ids', {'iface-id': '1253fc22-c12c-4ae9-8306-6c7ed9f1908a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:57 np0005465988 ovn_controller[132601]: 2025-10-02T12:11:57Z|00129|binding|INFO|Releasing lport 1253fc22-c12c-4ae9-8306-6c7ed9f1908a from this chassis (sb_readonly=0)
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:57.187 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fd7850a2-e443-4918-8039-c177b9f865e9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fd7850a2-e443-4918-8039-c177b9f865e9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:57.187 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4393475f-acd0-481c-8139-222e752b0a67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:57.188 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-fd7850a2-e443-4918-8039-c177b9f865e9
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/fd7850a2-e443-4918-8039-c177b9f865e9.pid.haproxy
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID fd7850a2-e443-4918-8039-c177b9f865e9
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:11:57.189 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9', 'env', 'PROCESS_TAG=haproxy-fd7850a2-e443-4918-8039-c177b9f865e9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fd7850a2-e443-4918-8039-c177b9f865e9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:11:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:11:57 np0005465988 podman[258174]: 2025-10-02 12:11:57.595958731 +0000 UTC m=+0.073316191 container create 31b759bc9fc7b41b78bc251492dca86564532565196c278d5d4bc643d8694737 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.615 2 DEBUG nova.compute.manager [req-eed6291a-a3dd-446a-b81a-58630ddd1d3d req-c4c5fbfb-daab-442f-8721-571334f5d5db d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Received event network-vif-plugged-19af83a7-4a5a-4802-b112-20ac31fbfae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.616 2 DEBUG oslo_concurrency.lockutils [req-eed6291a-a3dd-446a-b81a-58630ddd1d3d req-c4c5fbfb-daab-442f-8721-571334f5d5db d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.616 2 DEBUG oslo_concurrency.lockutils [req-eed6291a-a3dd-446a-b81a-58630ddd1d3d req-c4c5fbfb-daab-442f-8721-571334f5d5db d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.617 2 DEBUG oslo_concurrency.lockutils [req-eed6291a-a3dd-446a-b81a-58630ddd1d3d req-c4c5fbfb-daab-442f-8721-571334f5d5db d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.617 2 DEBUG nova.compute.manager [req-eed6291a-a3dd-446a-b81a-58630ddd1d3d req-c4c5fbfb-daab-442f-8721-571334f5d5db d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Processing event network-vif-plugged-19af83a7-4a5a-4802-b112-20ac31fbfae7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:11:57 np0005465988 systemd[1]: Started libpod-conmon-31b759bc9fc7b41b78bc251492dca86564532565196c278d5d4bc643d8694737.scope.
Oct  2 08:11:57 np0005465988 podman[258174]: 2025-10-02 12:11:57.562107927 +0000 UTC m=+0.039465397 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:11:57 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:11:57 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8faa3b2a279733222874da63cd66372b83e6a3a8f43299a0a2c1bf61892e9240/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.694 2 DEBUG nova.compute.manager [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.695 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407117.6954706, 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.696 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] VM Started (Lifecycle Event)#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.698 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.702 2 INFO nova.virt.libvirt.driver [-] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Instance spawned successfully.#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.703 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:11:57 np0005465988 podman[258174]: 2025-10-02 12:11:57.704001749 +0000 UTC m=+0.181359260 container init 31b759bc9fc7b41b78bc251492dca86564532565196c278d5d4bc643d8694737 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:11:57 np0005465988 podman[258174]: 2025-10-02 12:11:57.710798747 +0000 UTC m=+0.188156207 container start 31b759bc9fc7b41b78bc251492dca86564532565196c278d5d4bc643d8694737 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.734 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:57 np0005465988 neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9[258189]: [NOTICE]   (258193) : New worker (258195) forked
Oct  2 08:11:57 np0005465988 neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9[258189]: [NOTICE]   (258193) : Loading success.
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.746 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.753 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.754 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.755 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.756 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.757 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.758 2 DEBUG nova.virt.libvirt.driver [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.771 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.772 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407117.696541, 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.772 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.820 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.825 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407117.6981187, 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.826 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.859 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.867 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.882 2 INFO nova.compute.manager [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Took 12.71 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.882 2 DEBUG nova.compute.manager [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.893 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.964 2 INFO nova.compute.manager [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Took 15.64 seconds to build instance.#033[00m
Oct  2 08:11:57 np0005465988 nova_compute[236126]: 2025-10-02 12:11:57.985 2 DEBUG oslo_concurrency.lockutils [None req-30f20cfe-39f3-4be9-a536-ca617d173fa8 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.825s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:11:58.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e190 e190: 3 total, 3 up, 3 in
Oct  2 08:11:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:11:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:11:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:11:58.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:11:59 np0005465988 nova_compute[236126]: 2025-10-02 12:11:59.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:00 np0005465988 nova_compute[236126]: 2025-10-02 12:12:00.308 2 DEBUG nova.compute.manager [req-2cec44c8-0678-40dd-b5ff-83cc5f03d591 req-a2e68cec-8b94-4545-a4e2-652e1ce9c52d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Received event network-vif-plugged-19af83a7-4a5a-4802-b112-20ac31fbfae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:00 np0005465988 nova_compute[236126]: 2025-10-02 12:12:00.308 2 DEBUG oslo_concurrency.lockutils [req-2cec44c8-0678-40dd-b5ff-83cc5f03d591 req-a2e68cec-8b94-4545-a4e2-652e1ce9c52d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:00 np0005465988 nova_compute[236126]: 2025-10-02 12:12:00.309 2 DEBUG oslo_concurrency.lockutils [req-2cec44c8-0678-40dd-b5ff-83cc5f03d591 req-a2e68cec-8b94-4545-a4e2-652e1ce9c52d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:00 np0005465988 nova_compute[236126]: 2025-10-02 12:12:00.309 2 DEBUG oslo_concurrency.lockutils [req-2cec44c8-0678-40dd-b5ff-83cc5f03d591 req-a2e68cec-8b94-4545-a4e2-652e1ce9c52d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:00 np0005465988 nova_compute[236126]: 2025-10-02 12:12:00.309 2 DEBUG nova.compute.manager [req-2cec44c8-0678-40dd-b5ff-83cc5f03d591 req-a2e68cec-8b94-4545-a4e2-652e1ce9c52d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] No waiting events found dispatching network-vif-plugged-19af83a7-4a5a-4802-b112-20ac31fbfae7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:12:00 np0005465988 nova_compute[236126]: 2025-10-02 12:12:00.310 2 WARNING nova.compute.manager [req-2cec44c8-0678-40dd-b5ff-83cc5f03d591 req-a2e68cec-8b94-4545-a4e2-652e1ce9c52d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Received unexpected event network-vif-plugged-19af83a7-4a5a-4802-b112-20ac31fbfae7 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:12:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:00.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:00.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:02 np0005465988 nova_compute[236126]: 2025-10-02 12:12:02.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:02 np0005465988 NetworkManager[45041]: <info>  [1759407122.2231] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Oct  2 08:12:02 np0005465988 NetworkManager[45041]: <info>  [1759407122.2251] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Oct  2 08:12:02 np0005465988 nova_compute[236126]: 2025-10-02 12:12:02.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:12:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:02.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:02 np0005465988 nova_compute[236126]: 2025-10-02 12:12:02.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:12:02Z|00130|binding|INFO|Releasing lport 1253fc22-c12c-4ae9-8306-6c7ed9f1908a from this chassis (sb_readonly=0)
Oct  2 08:12:02 np0005465988 nova_compute[236126]: 2025-10-02 12:12:02.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:02.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:03 np0005465988 nova_compute[236126]: 2025-10-02 12:12:03.342 2 DEBUG nova.compute.manager [req-f352712d-38b0-4b43-95e2-462607c233ba req-9462ca34-4456-40cf-8f72-9ba327745bdd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Received event network-changed-19af83a7-4a5a-4802-b112-20ac31fbfae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:03 np0005465988 nova_compute[236126]: 2025-10-02 12:12:03.342 2 DEBUG nova.compute.manager [req-f352712d-38b0-4b43-95e2-462607c233ba req-9462ca34-4456-40cf-8f72-9ba327745bdd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Refreshing instance network info cache due to event network-changed-19af83a7-4a5a-4802-b112-20ac31fbfae7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:12:03 np0005465988 nova_compute[236126]: 2025-10-02 12:12:03.343 2 DEBUG oslo_concurrency.lockutils [req-f352712d-38b0-4b43-95e2-462607c233ba req-9462ca34-4456-40cf-8f72-9ba327745bdd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:12:03 np0005465988 nova_compute[236126]: 2025-10-02 12:12:03.343 2 DEBUG oslo_concurrency.lockutils [req-f352712d-38b0-4b43-95e2-462607c233ba req-9462ca34-4456-40cf-8f72-9ba327745bdd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:12:03 np0005465988 nova_compute[236126]: 2025-10-02 12:12:03.343 2 DEBUG nova.network.neutron [req-f352712d-38b0-4b43-95e2-462607c233ba req-9462ca34-4456-40cf-8f72-9ba327745bdd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Refreshing network info cache for port 19af83a7-4a5a-4802-b112-20ac31fbfae7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:12:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:04.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:04.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:04 np0005465988 nova_compute[236126]: 2025-10-02 12:12:04.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:05 np0005465988 podman[258209]: 2025-10-02 12:12:05.562613299 +0000 UTC m=+0.089067209 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:12:05 np0005465988 nova_compute[236126]: 2025-10-02 12:12:05.935 2 DEBUG nova.network.neutron [req-f352712d-38b0-4b43-95e2-462607c233ba req-9462ca34-4456-40cf-8f72-9ba327745bdd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Updated VIF entry in instance network info cache for port 19af83a7-4a5a-4802-b112-20ac31fbfae7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:12:05 np0005465988 nova_compute[236126]: 2025-10-02 12:12:05.936 2 DEBUG nova.network.neutron [req-f352712d-38b0-4b43-95e2-462607c233ba req-9462ca34-4456-40cf-8f72-9ba327745bdd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Updating instance_info_cache with network_info: [{"id": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "address": "fa:16:3e:e2:ae:13", "network": {"id": "fd7850a2-e443-4918-8039-c177b9f865e9", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1493494643-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "515a20f509b440c1bda78c309dea196e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19af83a7-4a", "ovs_interfaceid": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:06 np0005465988 nova_compute[236126]: 2025-10-02 12:12:06.019 2 DEBUG oslo_concurrency.lockutils [req-f352712d-38b0-4b43-95e2-462607c233ba req-9462ca34-4456-40cf-8f72-9ba327745bdd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:12:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:06.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:06.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:07 np0005465988 nova_compute[236126]: 2025-10-02 12:12:07.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:12:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:08.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:08.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:09 np0005465988 nova_compute[236126]: 2025-10-02 12:12:09.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:10.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:10.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:12 np0005465988 nova_compute[236126]: 2025-10-02 12:12:12.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:12:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:12.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:12.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:12 np0005465988 ovn_controller[132601]: 2025-10-02T12:12:12Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e2:ae:13 10.100.0.12
Oct  2 08:12:12 np0005465988 ovn_controller[132601]: 2025-10-02T12:12:12Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e2:ae:13 10.100.0.12
Oct  2 08:12:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:14.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:14.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:14 np0005465988 nova_compute[236126]: 2025-10-02 12:12:14.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:16.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:16.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:17.045 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:17.046 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:12:17 np0005465988 nova_compute[236126]: 2025-10-02 12:12:17.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:17 np0005465988 nova_compute[236126]: 2025-10-02 12:12:17.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:12:17 np0005465988 podman[258455]: 2025-10-02 12:12:17.641035674 +0000 UTC m=+0.063044472 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 08:12:17 np0005465988 podman[258455]: 2025-10-02 12:12:17.778820487 +0000 UTC m=+0.200829275 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Oct  2 08:12:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:18.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:18 np0005465988 podman[258590]: 2025-10-02 12:12:18.388112178 +0000 UTC m=+0.045951286 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 08:12:18 np0005465988 podman[258590]: 2025-10-02 12:12:18.398747267 +0000 UTC m=+0.056586375 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 08:12:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:18.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:18 np0005465988 podman[258655]: 2025-10-02 12:12:18.639840391 +0000 UTC m=+0.068025567 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, version=2.2.4, name=keepalived, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2)
Oct  2 08:12:18 np0005465988 podman[258655]: 2025-10-02 12:12:18.660857182 +0000 UTC m=+0.089042358 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, architecture=x86_64, com.redhat.component=keepalived-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4)
Oct  2 08:12:19 np0005465988 nova_compute[236126]: 2025-10-02 12:12:19.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:12:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:12:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:12:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:20.048 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:20.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:20 np0005465988 podman[258839]: 2025-10-02 12:12:20.533374542 +0000 UTC m=+0.065615397 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:12:20 np0005465988 podman[258840]: 2025-10-02 12:12:20.549043638 +0000 UTC m=+0.070698665 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:12:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:20.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:20 np0005465988 podman[258838]: 2025-10-02 12:12:20.619757782 +0000 UTC m=+0.151998607 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:12:21 np0005465988 nova_compute[236126]: 2025-10-02 12:12:21.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:21 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:12:21 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:12:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e191 e191: 3 total, 3 up, 3 in
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e191 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:12:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:22.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.477 2 DEBUG oslo_concurrency.lockutils [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Acquiring lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.478 2 DEBUG oslo_concurrency.lockutils [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.478 2 DEBUG oslo_concurrency.lockutils [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Acquiring lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.478 2 DEBUG oslo_concurrency.lockutils [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.479 2 DEBUG oslo_concurrency.lockutils [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.480 2 INFO nova.compute.manager [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Terminating instance#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.482 2 DEBUG nova.compute.manager [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:12:22 np0005465988 kernel: tap19af83a7-4a (unregistering): left promiscuous mode
Oct  2 08:12:22 np0005465988 NetworkManager[45041]: <info>  [1759407142.5959] device (tap19af83a7-4a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:12:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:22.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:22 np0005465988 ovn_controller[132601]: 2025-10-02T12:12:22Z|00131|binding|INFO|Releasing lport 19af83a7-4a5a-4802-b112-20ac31fbfae7 from this chassis (sb_readonly=0)
Oct  2 08:12:22 np0005465988 ovn_controller[132601]: 2025-10-02T12:12:22Z|00132|binding|INFO|Setting lport 19af83a7-4a5a-4802-b112-20ac31fbfae7 down in Southbound
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005465988 ovn_controller[132601]: 2025-10-02T12:12:22Z|00133|binding|INFO|Removing iface tap19af83a7-4a ovn-installed in OVS
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:22.623 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:ae:13 10.100.0.12'], port_security=['fa:16:3e:e2:ae:13 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd7850a2-e443-4918-8039-c177b9f865e9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '515a20f509b440c1bda78c309dea196e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fa32e577-50b2-48b4-8e0f-570787035c44', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.183'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bc15082-7b92-4e80-902f-3ff67b73a485, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=19af83a7-4a5a-4802-b112-20ac31fbfae7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:22.625 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 19af83a7-4a5a-4802-b112-20ac31fbfae7 in datapath fd7850a2-e443-4918-8039-c177b9f865e9 unbound from our chassis#033[00m
Oct  2 08:12:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:22.627 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd7850a2-e443-4918-8039-c177b9f865e9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:22.630 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb7c57e-bc51-41d8-a132-ca677232f60a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:22.631 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9 namespace which is not needed anymore#033[00m
Oct  2 08:12:22 np0005465988 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000034.scope: Deactivated successfully.
Oct  2 08:12:22 np0005465988 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000034.scope: Consumed 14.208s CPU time.
Oct  2 08:12:22 np0005465988 systemd-machined[192594]: Machine qemu-20-instance-00000034 terminated.
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.726 2 INFO nova.virt.libvirt.driver [-] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Instance destroyed successfully.#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.728 2 DEBUG nova.objects.instance [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lazy-loading 'resources' on Instance uuid 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.749 2 DEBUG nova.virt.libvirt.vif [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:11:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1925963903',display_name='tempest-ServersTestManualDisk-server-1925963903',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1925963903',id=52,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKLsZLJ0JgRu5rKRhdrh1/TyoKw8U2kXcryCIyZkfI9uIl3Zx/ZmKXNHGlC1E2QZKYEU/Q0USHyhPx5HdjE1pHIDHTUPiMDFrmsPvwA9v8UIEum4olA4ZiOqBX1lTtkFxw==',key_name='tempest-keypair-2000206289',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:11:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='515a20f509b440c1bda78c309dea196e',ramdisk_id='',reservation_id='r-div4k10m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestManualDisk-1757413810',owner_user_name='tempest-ServersTestManualDisk-1757413810-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:11:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='149e3925214640b484809bc9362e31ac',uuid=7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "address": "fa:16:3e:e2:ae:13", "network": {"id": "fd7850a2-e443-4918-8039-c177b9f865e9", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1493494643-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "515a20f509b440c1bda78c309dea196e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19af83a7-4a", "ovs_interfaceid": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.750 2 DEBUG nova.network.os_vif_util [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Converting VIF {"id": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "address": "fa:16:3e:e2:ae:13", "network": {"id": "fd7850a2-e443-4918-8039-c177b9f865e9", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1493494643-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "515a20f509b440c1bda78c309dea196e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19af83a7-4a", "ovs_interfaceid": "19af83a7-4a5a-4802-b112-20ac31fbfae7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.754 2 DEBUG nova.network.os_vif_util [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e2:ae:13,bridge_name='br-int',has_traffic_filtering=True,id=19af83a7-4a5a-4802-b112-20ac31fbfae7,network=Network(fd7850a2-e443-4918-8039-c177b9f865e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19af83a7-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.755 2 DEBUG os_vif [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:ae:13,bridge_name='br-int',has_traffic_filtering=True,id=19af83a7-4a5a-4802-b112-20ac31fbfae7,network=Network(fd7850a2-e443-4918-8039-c177b9f865e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19af83a7-4a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.757 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19af83a7-4a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.764 2 INFO os_vif [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:ae:13,bridge_name='br-int',has_traffic_filtering=True,id=19af83a7-4a5a-4802-b112-20ac31fbfae7,network=Network(fd7850a2-e443-4918-8039-c177b9f865e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19af83a7-4a')#033[00m
Oct  2 08:12:22 np0005465988 neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9[258189]: [NOTICE]   (258193) : haproxy version is 2.8.14-c23fe91
Oct  2 08:12:22 np0005465988 neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9[258189]: [NOTICE]   (258193) : path to executable is /usr/sbin/haproxy
Oct  2 08:12:22 np0005465988 neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9[258189]: [WARNING]  (258193) : Exiting Master process...
Oct  2 08:12:22 np0005465988 neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9[258189]: [WARNING]  (258193) : Exiting Master process...
Oct  2 08:12:22 np0005465988 neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9[258189]: [ALERT]    (258193) : Current worker (258195) exited with code 143 (Terminated)
Oct  2 08:12:22 np0005465988 neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9[258189]: [WARNING]  (258193) : All workers exited. Exiting... (0)
Oct  2 08:12:22 np0005465988 systemd[1]: libpod-31b759bc9fc7b41b78bc251492dca86564532565196c278d5d4bc643d8694737.scope: Deactivated successfully.
Oct  2 08:12:22 np0005465988 podman[258930]: 2025-10-02 12:12:22.799759325 +0000 UTC m=+0.060535410 container died 31b759bc9fc7b41b78bc251492dca86564532565196c278d5d4bc643d8694737 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:12:22 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-31b759bc9fc7b41b78bc251492dca86564532565196c278d5d4bc643d8694737-userdata-shm.mount: Deactivated successfully.
Oct  2 08:12:22 np0005465988 systemd[1]: var-lib-containers-storage-overlay-8faa3b2a279733222874da63cd66372b83e6a3a8f43299a0a2c1bf61892e9240-merged.mount: Deactivated successfully.
Oct  2 08:12:22 np0005465988 podman[258930]: 2025-10-02 12:12:22.845935316 +0000 UTC m=+0.106711371 container cleanup 31b759bc9fc7b41b78bc251492dca86564532565196c278d5d4bc643d8694737 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:12:22 np0005465988 systemd[1]: libpod-conmon-31b759bc9fc7b41b78bc251492dca86564532565196c278d5d4bc643d8694737.scope: Deactivated successfully.
Oct  2 08:12:22 np0005465988 podman[258983]: 2025-10-02 12:12:22.919064291 +0000 UTC m=+0.046151112 container remove 31b759bc9fc7b41b78bc251492dca86564532565196c278d5d4bc643d8694737 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:12:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:22.924 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[569c25ff-b0b7-439f-81af-6f988b146ca4]: (4, ('Thu Oct  2 12:12:22 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9 (31b759bc9fc7b41b78bc251492dca86564532565196c278d5d4bc643d8694737)\n31b759bc9fc7b41b78bc251492dca86564532565196c278d5d4bc643d8694737\nThu Oct  2 12:12:22 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9 (31b759bc9fc7b41b78bc251492dca86564532565196c278d5d4bc643d8694737)\n31b759bc9fc7b41b78bc251492dca86564532565196c278d5d4bc643d8694737\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:22.926 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e3372f6f-da52-45a2-beb4-7ad00945d53e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:22.927 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd7850a2-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005465988 kernel: tapfd7850a2-e0: left promiscuous mode
Oct  2 08:12:22 np0005465988 nova_compute[236126]: 2025-10-02 12:12:22.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:22.946 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e01f997f-0dde-4b38-bb41-9228ab901fed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:22.980 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[88066906-8ac5-4621-bad6-00e4c6609f63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:22.982 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb123d4-24ee-49c9-b803-d083cff4a42f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:23.002 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d30caee8-9ae0-4112-9998-d821295587d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517726, 'reachable_time': 33106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258998, 'error': None, 'target': 'ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:23 np0005465988 systemd[1]: run-netns-ovnmeta\x2dfd7850a2\x2de443\x2d4918\x2d8039\x2dc177b9f865e9.mount: Deactivated successfully.
Oct  2 08:12:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:23.006 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fd7850a2-e443-4918-8039-c177b9f865e9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:12:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:23.006 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[ee9deb3b-7872-43d5-8d53-0291dea1b631]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e192 e192: 3 total, 3 up, 3 in
Oct  2 08:12:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:24.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:24.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:24 np0005465988 nova_compute[236126]: 2025-10-02 12:12:24.680 2 INFO nova.virt.libvirt.driver [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Deleting instance files /var/lib/nova/instances/7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_del#033[00m
Oct  2 08:12:24 np0005465988 nova_compute[236126]: 2025-10-02 12:12:24.681 2 INFO nova.virt.libvirt.driver [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Deletion of /var/lib/nova/instances/7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e_del complete#033[00m
Oct  2 08:12:24 np0005465988 nova_compute[236126]: 2025-10-02 12:12:24.769 2 INFO nova.compute.manager [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Took 2.29 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:12:24 np0005465988 nova_compute[236126]: 2025-10-02 12:12:24.770 2 DEBUG oslo.service.loopingcall [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:12:24 np0005465988 nova_compute[236126]: 2025-10-02 12:12:24.770 2 DEBUG nova.compute.manager [-] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:12:24 np0005465988 nova_compute[236126]: 2025-10-02 12:12:24.771 2 DEBUG nova.network.neutron [-] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:12:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e193 e193: 3 total, 3 up, 3 in
Oct  2 08:12:25 np0005465988 nova_compute[236126]: 2025-10-02 12:12:25.544 2 DEBUG nova.compute.manager [req-22d1d439-e43c-4a73-b533-8021ab31d5ce req-5ae5cdc3-4692-4e30-a963-423353fed1ff d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Received event network-vif-unplugged-19af83a7-4a5a-4802-b112-20ac31fbfae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:25 np0005465988 nova_compute[236126]: 2025-10-02 12:12:25.545 2 DEBUG oslo_concurrency.lockutils [req-22d1d439-e43c-4a73-b533-8021ab31d5ce req-5ae5cdc3-4692-4e30-a963-423353fed1ff d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:25 np0005465988 nova_compute[236126]: 2025-10-02 12:12:25.545 2 DEBUG oslo_concurrency.lockutils [req-22d1d439-e43c-4a73-b533-8021ab31d5ce req-5ae5cdc3-4692-4e30-a963-423353fed1ff d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:25 np0005465988 nova_compute[236126]: 2025-10-02 12:12:25.545 2 DEBUG oslo_concurrency.lockutils [req-22d1d439-e43c-4a73-b533-8021ab31d5ce req-5ae5cdc3-4692-4e30-a963-423353fed1ff d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:25 np0005465988 nova_compute[236126]: 2025-10-02 12:12:25.546 2 DEBUG nova.compute.manager [req-22d1d439-e43c-4a73-b533-8021ab31d5ce req-5ae5cdc3-4692-4e30-a963-423353fed1ff d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] No waiting events found dispatching network-vif-unplugged-19af83a7-4a5a-4802-b112-20ac31fbfae7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:12:25 np0005465988 nova_compute[236126]: 2025-10-02 12:12:25.546 2 DEBUG nova.compute.manager [req-22d1d439-e43c-4a73-b533-8021ab31d5ce req-5ae5cdc3-4692-4e30-a963-423353fed1ff d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Received event network-vif-unplugged-19af83a7-4a5a-4802-b112-20ac31fbfae7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:12:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:26.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:26 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:12:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:12:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:26.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:12:26 np0005465988 nova_compute[236126]: 2025-10-02 12:12:26.792 2 DEBUG nova.network.neutron [-] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:26 np0005465988 nova_compute[236126]: 2025-10-02 12:12:26.818 2 INFO nova.compute.manager [-] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Took 2.05 seconds to deallocate network for instance.#033[00m
Oct  2 08:12:26 np0005465988 nova_compute[236126]: 2025-10-02 12:12:26.888 2 DEBUG oslo_concurrency.lockutils [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:26 np0005465988 nova_compute[236126]: 2025-10-02 12:12:26.888 2 DEBUG oslo_concurrency.lockutils [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:26 np0005465988 nova_compute[236126]: 2025-10-02 12:12:26.948 2 DEBUG oslo_concurrency.processutils [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:12:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:27.339 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:27.339 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:27.339 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:12:27 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/991575295' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.362 2 DEBUG oslo_concurrency.processutils [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.368 2 DEBUG nova.compute.provider_tree [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.408 2 DEBUG nova.scheduler.client.report [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.462 2 DEBUG oslo_concurrency.lockutils [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.497 2 INFO nova.scheduler.client.report [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Deleted allocations for instance 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e#033[00m
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.617 2 DEBUG oslo_concurrency.lockutils [None req-51207d54-540e-4fc0-8d94-8fbcf35f66fd 149e3925214640b484809bc9362e31ac 515a20f509b440c1bda78c309dea196e - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.694 2 DEBUG nova.compute.manager [req-dd50edc4-e042-41e7-825e-39109148314f req-75c334c5-d334-46b0-a6ec-68e7cd7f0b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Received event network-vif-plugged-19af83a7-4a5a-4802-b112-20ac31fbfae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.694 2 DEBUG oslo_concurrency.lockutils [req-dd50edc4-e042-41e7-825e-39109148314f req-75c334c5-d334-46b0-a6ec-68e7cd7f0b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.695 2 DEBUG oslo_concurrency.lockutils [req-dd50edc4-e042-41e7-825e-39109148314f req-75c334c5-d334-46b0-a6ec-68e7cd7f0b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.695 2 DEBUG oslo_concurrency.lockutils [req-dd50edc4-e042-41e7-825e-39109148314f req-75c334c5-d334-46b0-a6ec-68e7cd7f0b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.696 2 DEBUG nova.compute.manager [req-dd50edc4-e042-41e7-825e-39109148314f req-75c334c5-d334-46b0-a6ec-68e7cd7f0b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] No waiting events found dispatching network-vif-plugged-19af83a7-4a5a-4802-b112-20ac31fbfae7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.696 2 WARNING nova.compute.manager [req-dd50edc4-e042-41e7-825e-39109148314f req-75c334c5-d334-46b0-a6ec-68e7cd7f0b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Received unexpected event network-vif-plugged-19af83a7-4a5a-4802-b112-20ac31fbfae7 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.697 2 DEBUG nova.compute.manager [req-dd50edc4-e042-41e7-825e-39109148314f req-75c334c5-d334-46b0-a6ec-68e7cd7f0b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Received event network-vif-deleted-19af83a7-4a5a-4802-b112-20ac31fbfae7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:27 np0005465988 nova_compute[236126]: 2025-10-02 12:12:27.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:12:28 np0005465988 nova_compute[236126]: 2025-10-02 12:12:28.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:28.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:28.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:30.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:30.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e194 e194: 3 total, 3 up, 3 in
Oct  2 08:12:32 np0005465988 nova_compute[236126]: 2025-10-02 12:12:32.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:12:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:32.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:32.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:32 np0005465988 nova_compute[236126]: 2025-10-02 12:12:32.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e195 e195: 3 total, 3 up, 3 in
Oct  2 08:12:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:34.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:34.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:35 np0005465988 nova_compute[236126]: 2025-10-02 12:12:35.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:35 np0005465988 nova_compute[236126]: 2025-10-02 12:12:35.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:36.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:36 np0005465988 podman[259129]: 2025-10-02 12:12:36.555623763 +0000 UTC m=+0.083298742 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  2 08:12:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:36.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e195 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.336 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.337 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.372 2 DEBUG nova.compute.manager [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.506 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.506 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.513 2 DEBUG nova.virt.hardware [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.514 2 INFO nova.compute.claims [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.657 2 DEBUG oslo_concurrency.processutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.723 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407142.7214856, 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.724 2 INFO nova.compute.manager [-] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.754 2 DEBUG nova.compute.manager [None req-22426aa2-a0f9-4090-9a30-db7476537b48 - - - - - -] [instance: 7b5ebb31-7a43-4bec-a49f-69a4c47d8d1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:37 np0005465988 nova_compute[236126]: 2025-10-02 12:12:37.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:12:38 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/728985912' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.122 2 DEBUG oslo_concurrency.processutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.126 2 DEBUG nova.compute.provider_tree [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.154 2 DEBUG nova.scheduler.client.report [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.198 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.199 2 DEBUG nova.compute.manager [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.263 2 DEBUG nova.compute.manager [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.264 2 DEBUG nova.network.neutron [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.303 2 INFO nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.347 2 DEBUG nova.compute.manager [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:12:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:38.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.516 2 DEBUG nova.compute.manager [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.518 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.518 2 INFO nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Creating image(s)#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.561 2 DEBUG nova.storage.rbd_utils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] rbd image d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:12:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e196 e196: 3 total, 3 up, 3 in
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.597 2 DEBUG nova.storage.rbd_utils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] rbd image d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:12:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:38.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.631 2 DEBUG nova.storage.rbd_utils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] rbd image d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.636 2 DEBUG oslo_concurrency.processutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.723 2 DEBUG oslo_concurrency.processutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.724 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.725 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.726 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.772 2 DEBUG nova.storage.rbd_utils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] rbd image d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.777 2 DEBUG oslo_concurrency.processutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:38 np0005465988 nova_compute[236126]: 2025-10-02 12:12:38.880 2 DEBUG nova.policy [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0df47040f1ff4ce69a6fbdfd9eba4955', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '55d20ae21b6d4f0abfff3bccc371ee7a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.100 2 DEBUG oslo_concurrency.processutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.323s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.193 2 DEBUG nova.storage.rbd_utils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] resizing rbd image d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.328 2 DEBUG nova.objects.instance [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lazy-loading 'migration_context' on Instance uuid d8adf6f4-e7d3-4a21-87f7-4b2396126258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.356 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.357 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Ensure instance console log exists: /var/lib/nova/instances/d8adf6f4-e7d3-4a21-87f7-4b2396126258/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.357 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.357 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.358 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.507 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.507 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.507 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.507 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.508 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:12:39 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2811533309' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:12:39 np0005465988 nova_compute[236126]: 2025-10-02 12:12:39.913 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:40 np0005465988 nova_compute[236126]: 2025-10-02 12:12:40.117 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:12:40 np0005465988 nova_compute[236126]: 2025-10-02 12:12:40.119 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4740MB free_disk=20.942779541015625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:12:40 np0005465988 nova_compute[236126]: 2025-10-02 12:12:40.119 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:40 np0005465988 nova_compute[236126]: 2025-10-02 12:12:40.119 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:40 np0005465988 nova_compute[236126]: 2025-10-02 12:12:40.200 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance d8adf6f4-e7d3-4a21-87f7-4b2396126258 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:12:40 np0005465988 nova_compute[236126]: 2025-10-02 12:12:40.200 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:12:40 np0005465988 nova_compute[236126]: 2025-10-02 12:12:40.200 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:12:40 np0005465988 nova_compute[236126]: 2025-10-02 12:12:40.262 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:40.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:40.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:12:40 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3874037984' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:12:40 np0005465988 nova_compute[236126]: 2025-10-02 12:12:40.732 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:40 np0005465988 nova_compute[236126]: 2025-10-02 12:12:40.736 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:12:40 np0005465988 nova_compute[236126]: 2025-10-02 12:12:40.755 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:12:40 np0005465988 nova_compute[236126]: 2025-10-02 12:12:40.796 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:12:40 np0005465988 nova_compute[236126]: 2025-10-02 12:12:40.796 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:40 np0005465988 nova_compute[236126]: 2025-10-02 12:12:40.824 2 DEBUG nova.network.neutron [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Successfully created port: 519ef635-69c9-49bb-9158-8e3bee30c427 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:12:42 np0005465988 nova_compute[236126]: 2025-10-02 12:12:42.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:12:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:42.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:42.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:42 np0005465988 nova_compute[236126]: 2025-10-02 12:12:42.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:42 np0005465988 nova_compute[236126]: 2025-10-02 12:12:42.796 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:42 np0005465988 nova_compute[236126]: 2025-10-02 12:12:42.797 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:42 np0005465988 nova_compute[236126]: 2025-10-02 12:12:42.798 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:42 np0005465988 nova_compute[236126]: 2025-10-02 12:12:42.798 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:12:43 np0005465988 nova_compute[236126]: 2025-10-02 12:12:43.051 2 DEBUG nova.network.neutron [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Successfully updated port: 519ef635-69c9-49bb-9158-8e3bee30c427 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:12:43 np0005465988 nova_compute[236126]: 2025-10-02 12:12:43.086 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "refresh_cache-d8adf6f4-e7d3-4a21-87f7-4b2396126258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:12:43 np0005465988 nova_compute[236126]: 2025-10-02 12:12:43.087 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquired lock "refresh_cache-d8adf6f4-e7d3-4a21-87f7-4b2396126258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:12:43 np0005465988 nova_compute[236126]: 2025-10-02 12:12:43.087 2 DEBUG nova.network.neutron [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:12:43 np0005465988 nova_compute[236126]: 2025-10-02 12:12:43.235 2 DEBUG nova.compute.manager [req-c5a42476-6fc3-4182-a2e1-9ec6f2cb366d req-730f4239-5eeb-4662-bc7e-b7ad8d2eb39f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Received event network-changed-519ef635-69c9-49bb-9158-8e3bee30c427 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:43 np0005465988 nova_compute[236126]: 2025-10-02 12:12:43.236 2 DEBUG nova.compute.manager [req-c5a42476-6fc3-4182-a2e1-9ec6f2cb366d req-730f4239-5eeb-4662-bc7e-b7ad8d2eb39f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Refreshing instance network info cache due to event network-changed-519ef635-69c9-49bb-9158-8e3bee30c427. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:12:43 np0005465988 nova_compute[236126]: 2025-10-02 12:12:43.236 2 DEBUG oslo_concurrency.lockutils [req-c5a42476-6fc3-4182-a2e1-9ec6f2cb366d req-730f4239-5eeb-4662-bc7e-b7ad8d2eb39f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-d8adf6f4-e7d3-4a21-87f7-4b2396126258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:12:43 np0005465988 nova_compute[236126]: 2025-10-02 12:12:43.344 2 DEBUG nova.network.neutron [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:12:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:44.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:44.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.072 2 DEBUG nova.network.neutron [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Updating instance_info_cache with network_info: [{"id": "519ef635-69c9-49bb-9158-8e3bee30c427", "address": "fa:16:3e:2d:e0:82", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap519ef635-69", "ovs_interfaceid": "519ef635-69c9-49bb-9158-8e3bee30c427", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.183 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Releasing lock "refresh_cache-d8adf6f4-e7d3-4a21-87f7-4b2396126258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.183 2 DEBUG nova.compute.manager [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Instance network_info: |[{"id": "519ef635-69c9-49bb-9158-8e3bee30c427", "address": "fa:16:3e:2d:e0:82", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap519ef635-69", "ovs_interfaceid": "519ef635-69c9-49bb-9158-8e3bee30c427", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.184 2 DEBUG oslo_concurrency.lockutils [req-c5a42476-6fc3-4182-a2e1-9ec6f2cb366d req-730f4239-5eeb-4662-bc7e-b7ad8d2eb39f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-d8adf6f4-e7d3-4a21-87f7-4b2396126258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.184 2 DEBUG nova.network.neutron [req-c5a42476-6fc3-4182-a2e1-9ec6f2cb366d req-730f4239-5eeb-4662-bc7e-b7ad8d2eb39f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Refreshing network info cache for port 519ef635-69c9-49bb-9158-8e3bee30c427 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.189 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Start _get_guest_xml network_info=[{"id": "519ef635-69c9-49bb-9158-8e3bee30c427", "address": "fa:16:3e:2d:e0:82", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap519ef635-69", "ovs_interfaceid": "519ef635-69c9-49bb-9158-8e3bee30c427", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.194 2 WARNING nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.198 2 DEBUG nova.virt.libvirt.host [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.199 2 DEBUG nova.virt.libvirt.host [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.207 2 DEBUG nova.virt.libvirt.host [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.208 2 DEBUG nova.virt.libvirt.host [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.209 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.209 2 DEBUG nova.virt.hardware [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.209 2 DEBUG nova.virt.hardware [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.210 2 DEBUG nova.virt.hardware [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.210 2 DEBUG nova.virt.hardware [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.210 2 DEBUG nova.virt.hardware [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.210 2 DEBUG nova.virt.hardware [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.210 2 DEBUG nova.virt.hardware [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.211 2 DEBUG nova.virt.hardware [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.211 2 DEBUG nova.virt.hardware [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.211 2 DEBUG nova.virt.hardware [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.211 2 DEBUG nova.virt.hardware [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.214 2 DEBUG oslo_concurrency.processutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.470 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:12:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1524387760' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.650 2 DEBUG oslo_concurrency.processutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.696 2 DEBUG nova.storage.rbd_utils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] rbd image d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:12:45 np0005465988 nova_compute[236126]: 2025-10-02 12:12:45.701 2 DEBUG oslo_concurrency.processutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:12:46 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1341746596' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.192 2 DEBUG oslo_concurrency.processutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.195 2 DEBUG nova.virt.libvirt.vif [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:12:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1007708604',display_name='tempest-ImagesTestJSON-server-1007708604',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1007708604',id=55,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='55d20ae21b6d4f0abfff3bccc371ee7a',ramdisk_id='',reservation_id='r-cyztd4k0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-2116266493',owner_user_name='tempest-ImagesTestJSON-2116266493-project-member'},tags=TagL
ist,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:12:38Z,user_data=None,user_id='0df47040f1ff4ce69a6fbdfd9eba4955',uuid=d8adf6f4-e7d3-4a21-87f7-4b2396126258,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "519ef635-69c9-49bb-9158-8e3bee30c427", "address": "fa:16:3e:2d:e0:82", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap519ef635-69", "ovs_interfaceid": "519ef635-69c9-49bb-9158-8e3bee30c427", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.195 2 DEBUG nova.network.os_vif_util [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Converting VIF {"id": "519ef635-69c9-49bb-9158-8e3bee30c427", "address": "fa:16:3e:2d:e0:82", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap519ef635-69", "ovs_interfaceid": "519ef635-69c9-49bb-9158-8e3bee30c427", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.196 2 DEBUG nova.network.os_vif_util [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:e0:82,bridge_name='br-int',has_traffic_filtering=True,id=519ef635-69c9-49bb-9158-8e3bee30c427,network=Network(6d00de8e-203c-4e94-b60f-36ba9ccef805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap519ef635-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.198 2 DEBUG nova.objects.instance [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lazy-loading 'pci_devices' on Instance uuid d8adf6f4-e7d3-4a21-87f7-4b2396126258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.217 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  <uuid>d8adf6f4-e7d3-4a21-87f7-4b2396126258</uuid>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  <name>instance-00000037</name>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <nova:name>tempest-ImagesTestJSON-server-1007708604</nova:name>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:12:45</nova:creationTime>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <nova:user uuid="0df47040f1ff4ce69a6fbdfd9eba4955">tempest-ImagesTestJSON-2116266493-project-member</nova:user>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <nova:project uuid="55d20ae21b6d4f0abfff3bccc371ee7a">tempest-ImagesTestJSON-2116266493</nova:project>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <nova:port uuid="519ef635-69c9-49bb-9158-8e3bee30c427">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <entry name="serial">d8adf6f4-e7d3-4a21-87f7-4b2396126258</entry>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <entry name="uuid">d8adf6f4-e7d3-4a21-87f7-4b2396126258</entry>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk.config">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:2d:e0:82"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <target dev="tap519ef635-69"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/d8adf6f4-e7d3-4a21-87f7-4b2396126258/console.log" append="off"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:12:46 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:12:46 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:12:46 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:12:46 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.219 2 DEBUG nova.compute.manager [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Preparing to wait for external event network-vif-plugged-519ef635-69c9-49bb-9158-8e3bee30c427 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.219 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.219 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.220 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.220 2 DEBUG nova.virt.libvirt.vif [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:12:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1007708604',display_name='tempest-ImagesTestJSON-server-1007708604',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1007708604',id=55,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='55d20ae21b6d4f0abfff3bccc371ee7a',ramdisk_id='',reservation_id='r-cyztd4k0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-2116266493',owner_user_name='tempest-ImagesTestJSON-2116266493-project-member'}
,tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:12:38Z,user_data=None,user_id='0df47040f1ff4ce69a6fbdfd9eba4955',uuid=d8adf6f4-e7d3-4a21-87f7-4b2396126258,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "519ef635-69c9-49bb-9158-8e3bee30c427", "address": "fa:16:3e:2d:e0:82", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap519ef635-69", "ovs_interfaceid": "519ef635-69c9-49bb-9158-8e3bee30c427", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.221 2 DEBUG nova.network.os_vif_util [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Converting VIF {"id": "519ef635-69c9-49bb-9158-8e3bee30c427", "address": "fa:16:3e:2d:e0:82", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap519ef635-69", "ovs_interfaceid": "519ef635-69c9-49bb-9158-8e3bee30c427", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.221 2 DEBUG nova.network.os_vif_util [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:e0:82,bridge_name='br-int',has_traffic_filtering=True,id=519ef635-69c9-49bb-9158-8e3bee30c427,network=Network(6d00de8e-203c-4e94-b60f-36ba9ccef805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap519ef635-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.222 2 DEBUG os_vif [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:e0:82,bridge_name='br-int',has_traffic_filtering=True,id=519ef635-69c9-49bb-9158-8e3bee30c427,network=Network(6d00de8e-203c-4e94-b60f-36ba9ccef805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap519ef635-69') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.223 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.223 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.226 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap519ef635-69, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.226 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap519ef635-69, col_values=(('external_ids', {'iface-id': '519ef635-69c9-49bb-9158-8e3bee30c427', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:e0:82', 'vm-uuid': 'd8adf6f4-e7d3-4a21-87f7-4b2396126258'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:46 np0005465988 NetworkManager[45041]: <info>  [1759407166.2287] manager: (tap519ef635-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.234 2 INFO os_vif [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:e0:82,bridge_name='br-int',has_traffic_filtering=True,id=519ef635-69c9-49bb-9158-8e3bee30c427,network=Network(6d00de8e-203c-4e94-b60f-36ba9ccef805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap519ef635-69')#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.329 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.329 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.330 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] No VIF found with MAC fa:16:3e:2d:e0:82, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.330 2 INFO nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Using config drive#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.365 2 DEBUG nova.storage.rbd_utils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] rbd image d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:12:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:46.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.505 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.505 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:12:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:46.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.953 2 INFO nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Creating config drive at /var/lib/nova/instances/d8adf6f4-e7d3-4a21-87f7-4b2396126258/disk.config#033[00m
Oct  2 08:12:46 np0005465988 nova_compute[236126]: 2025-10-02 12:12:46.964 2 DEBUG oslo_concurrency.processutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d8adf6f4-e7d3-4a21-87f7-4b2396126258/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8h8yikwe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.110 2 DEBUG oslo_concurrency.processutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d8adf6f4-e7d3-4a21-87f7-4b2396126258/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8h8yikwe" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.152 2 DEBUG nova.storage.rbd_utils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] rbd image d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.156 2 DEBUG oslo_concurrency.processutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d8adf6f4-e7d3-4a21-87f7-4b2396126258/disk.config d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.321 2 DEBUG oslo_concurrency.processutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d8adf6f4-e7d3-4a21-87f7-4b2396126258/disk.config d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.322 2 INFO nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Deleting local config drive /var/lib/nova/instances/d8adf6f4-e7d3-4a21-87f7-4b2396126258/disk.config because it was imported into RBD.#033[00m
Oct  2 08:12:47 np0005465988 kernel: tap519ef635-69: entered promiscuous mode
Oct  2 08:12:47 np0005465988 NetworkManager[45041]: <info>  [1759407167.3956] manager: (tap519ef635-69): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:12:47Z|00134|binding|INFO|Claiming lport 519ef635-69c9-49bb-9158-8e3bee30c427 for this chassis.
Oct  2 08:12:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:12:47Z|00135|binding|INFO|519ef635-69c9-49bb-9158-8e3bee30c427: Claiming fa:16:3e:2d:e0:82 10.100.0.13
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.414 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:e0:82 10.100.0.13'], port_security=['fa:16:3e:2d:e0:82 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'd8adf6f4-e7d3-4a21-87f7-4b2396126258', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55d20ae21b6d4f0abfff3bccc371ee7a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0a015800-2f8b-4fd4-818b-829a4dcb7912', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9678136b-02f9-4c61-b96e-15935f11dca7, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=519ef635-69c9-49bb-9158-8e3bee30c427) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.415 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 519ef635-69c9-49bb-9158-8e3bee30c427 in datapath 6d00de8e-203c-4e94-b60f-36ba9ccef805 bound to our chassis#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.417 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6d00de8e-203c-4e94-b60f-36ba9ccef805#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.432 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[21c729b9-31a3-4ea5-9d58-95617a711912]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.433 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6d00de8e-21 in ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.435 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6d00de8e-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.435 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b130e7ed-fcd7-40fb-b8c8-f15eaf4380ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.436 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[218670f0-dfd1-47fa-a6fb-9977a1f79647]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 systemd-machined[192594]: New machine qemu-21-instance-00000037.
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.448 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[73722bf0-5a5e-4392-a513-b6610751c489]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 systemd-udevd[259525]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:47 np0005465988 systemd[1]: Started Virtual Machine qemu-21-instance-00000037.
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.479 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9fc4b6dd-b2ab-44df-911d-fab71f08b9c4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 NetworkManager[45041]: <info>  [1759407167.4823] device (tap519ef635-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:12:47 np0005465988 NetworkManager[45041]: <info>  [1759407167.4833] device (tap519ef635-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:12:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:12:47Z|00136|binding|INFO|Setting lport 519ef635-69c9-49bb-9158-8e3bee30c427 ovn-installed in OVS
Oct  2 08:12:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:12:47Z|00137|binding|INFO|Setting lport 519ef635-69c9-49bb-9158-8e3bee30c427 up in Southbound
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.516 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[181ee5fa-d077-43b8-ac3a-3b52c9fe5c52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.522 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8855ff7b-40d5-4209-92a2-37e18028c73d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 systemd-udevd[259528]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:12:47 np0005465988 NetworkManager[45041]: <info>  [1759407167.5235] manager: (tap6d00de8e-20): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.560 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4f2780f4-2f2f-4871-a756-ddb274925095]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.564 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4cf0eef5-010c-49fd-8b53-99873ff6490e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 NetworkManager[45041]: <info>  [1759407167.5830] device (tap6d00de8e-20): carrier: link connected
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.589 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[057c1c68-33a8-40a2-8f1b-4730d10abb48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.611 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[268f3e45-4e63-43fb-b217-069fa9688766]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d00de8e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:87:28:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522791, 'reachable_time': 18802, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259556, 'error': None, 'target': 'ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.625 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a62c895e-aea0-4f4a-847b-079c9a4a8768]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe87:28f2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522791, 'tstamp': 522791}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259557, 'error': None, 'target': 'ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.644 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8b480896-c0cb-42ed-83ff-8d01f2eaf4a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d00de8e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:87:28:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522791, 'reachable_time': 18802, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259558, 'error': None, 'target': 'ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.675 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f1df9bb7-9183-4e16-9a54-c0e1e25a23fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.704 2 DEBUG nova.network.neutron [req-c5a42476-6fc3-4182-a2e1-9ec6f2cb366d req-730f4239-5eeb-4662-bc7e-b7ad8d2eb39f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Updated VIF entry in instance network info cache for port 519ef635-69c9-49bb-9158-8e3bee30c427. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.705 2 DEBUG nova.network.neutron [req-c5a42476-6fc3-4182-a2e1-9ec6f2cb366d req-730f4239-5eeb-4662-bc7e-b7ad8d2eb39f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Updating instance_info_cache with network_info: [{"id": "519ef635-69c9-49bb-9158-8e3bee30c427", "address": "fa:16:3e:2d:e0:82", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap519ef635-69", "ovs_interfaceid": "519ef635-69c9-49bb-9158-8e3bee30c427", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.737 2 DEBUG oslo_concurrency.lockutils [req-c5a42476-6fc3-4182-a2e1-9ec6f2cb366d req-730f4239-5eeb-4662-bc7e-b7ad8d2eb39f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-d8adf6f4-e7d3-4a21-87f7-4b2396126258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.745 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e555525a-c4ec-408b-8971-1c8567001fe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.747 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d00de8e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.747 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.747 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d00de8e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:47 np0005465988 NetworkManager[45041]: <info>  [1759407167.7697] manager: (tap6d00de8e-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Oct  2 08:12:47 np0005465988 kernel: tap6d00de8e-20: entered promiscuous mode
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.773 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6d00de8e-20, col_values=(('external_ids', {'iface-id': '4d0b2163-acbb-4b6a-b6d8-84f8212e1e02'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:12:47Z|00138|binding|INFO|Releasing lport 4d0b2163-acbb-4b6a-b6d8-84f8212e1e02 from this chassis (sb_readonly=0)
Oct  2 08:12:47 np0005465988 nova_compute[236126]: 2025-10-02 12:12:47.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.797 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6d00de8e-203c-4e94-b60f-36ba9ccef805.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6d00de8e-203c-4e94-b60f-36ba9ccef805.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.798 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9e9051a9-e07b-4349-80be-2d8e94f7ddf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.800 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-6d00de8e-203c-4e94-b60f-36ba9ccef805
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/6d00de8e-203c-4e94-b60f-36ba9ccef805.pid.haproxy
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 6d00de8e-203c-4e94-b60f-36ba9ccef805
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:12:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:47.803 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'env', 'PROCESS_TAG=haproxy-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6d00de8e-203c-4e94-b60f-36ba9ccef805.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:12:48 np0005465988 podman[259632]: 2025-10-02 12:12:48.178606535 +0000 UTC m=+0.058720757 container create 5899e67536f4e634788e7aeaa1d04df79657e571287201af899b3d66441fd4a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:12:48 np0005465988 systemd[1]: Started libpod-conmon-5899e67536f4e634788e7aeaa1d04df79657e571287201af899b3d66441fd4a4.scope.
Oct  2 08:12:48 np0005465988 podman[259632]: 2025-10-02 12:12:48.145475472 +0000 UTC m=+0.025589714 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:12:48 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:12:48 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f56e4859a8a7e204a7fd32f3d3f2f05fcc71b9dc9200482cd47819355fa5e04/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:12:48 np0005465988 podman[259632]: 2025-10-02 12:12:48.273138331 +0000 UTC m=+0.153252583 container init 5899e67536f4e634788e7aeaa1d04df79657e571287201af899b3d66441fd4a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:12:48 np0005465988 podman[259632]: 2025-10-02 12:12:48.278516358 +0000 UTC m=+0.158630580 container start 5899e67536f4e634788e7aeaa1d04df79657e571287201af899b3d66441fd4a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:12:48 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[259647]: [NOTICE]   (259651) : New worker (259653) forked
Oct  2 08:12:48 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[259647]: [NOTICE]   (259651) : Loading success.
Oct  2 08:12:48 np0005465988 nova_compute[236126]: 2025-10-02 12:12:48.371 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407168.3704703, d8adf6f4-e7d3-4a21-87f7-4b2396126258 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:48 np0005465988 nova_compute[236126]: 2025-10-02 12:12:48.371 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] VM Started (Lifecycle Event)#033[00m
Oct  2 08:12:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:48.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:48 np0005465988 nova_compute[236126]: 2025-10-02 12:12:48.409 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:48 np0005465988 nova_compute[236126]: 2025-10-02 12:12:48.412 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407168.3705838, d8adf6f4-e7d3-4a21-87f7-4b2396126258 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:48 np0005465988 nova_compute[236126]: 2025-10-02 12:12:48.413 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:12:48 np0005465988 nova_compute[236126]: 2025-10-02 12:12:48.443 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:48 np0005465988 nova_compute[236126]: 2025-10-02 12:12:48.448 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:12:48 np0005465988 nova_compute[236126]: 2025-10-02 12:12:48.479 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:12:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:48.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.376 2 DEBUG nova.compute.manager [req-119d1220-571a-493b-920a-7c2832b586f4 req-80b4650b-c799-40dc-8f23-f98ddfbbbfd5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Received event network-vif-plugged-519ef635-69c9-49bb-9158-8e3bee30c427 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.376 2 DEBUG oslo_concurrency.lockutils [req-119d1220-571a-493b-920a-7c2832b586f4 req-80b4650b-c799-40dc-8f23-f98ddfbbbfd5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.377 2 DEBUG oslo_concurrency.lockutils [req-119d1220-571a-493b-920a-7c2832b586f4 req-80b4650b-c799-40dc-8f23-f98ddfbbbfd5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.377 2 DEBUG oslo_concurrency.lockutils [req-119d1220-571a-493b-920a-7c2832b586f4 req-80b4650b-c799-40dc-8f23-f98ddfbbbfd5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.377 2 DEBUG nova.compute.manager [req-119d1220-571a-493b-920a-7c2832b586f4 req-80b4650b-c799-40dc-8f23-f98ddfbbbfd5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Processing event network-vif-plugged-519ef635-69c9-49bb-9158-8e3bee30c427 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.378 2 DEBUG nova.compute.manager [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.383 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407170.3831089, d8adf6f4-e7d3-4a21-87f7-4b2396126258 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.384 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.386 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:12:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:12:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:50.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.393 2 INFO nova.virt.libvirt.driver [-] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Instance spawned successfully.#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.394 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.405 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.411 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.425 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.426 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.426 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.427 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.427 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.428 2 DEBUG nova.virt.libvirt.driver [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.434 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.476 2 INFO nova.compute.manager [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Took 11.96 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.476 2 DEBUG nova.compute.manager [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.550 2 INFO nova.compute.manager [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Took 13.11 seconds to build instance.#033[00m
Oct  2 08:12:50 np0005465988 nova_compute[236126]: 2025-10-02 12:12:50.567 2 DEBUG oslo_concurrency.lockutils [None req-acaf9617-0555-407f-b5f3-95bddc0c63b7 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.005000145s ======
Oct  2 08:12:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:50.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.005000145s
Oct  2 08:12:51 np0005465988 nova_compute[236126]: 2025-10-02 12:12:51.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:51 np0005465988 podman[259665]: 2025-10-02 12:12:51.536944723 +0000 UTC m=+0.068905843 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:12:51 np0005465988 podman[259667]: 2025-10-02 12:12:51.554988097 +0000 UTC m=+0.082493818 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Oct  2 08:12:51 np0005465988 podman[259664]: 2025-10-02 12:12:51.555398779 +0000 UTC m=+0.093383654 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:12:52 np0005465988 nova_compute[236126]: 2025-10-02 12:12:52.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:12:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:52.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:52 np0005465988 nova_compute[236126]: 2025-10-02 12:12:52.519 2 DEBUG nova.compute.manager [req-945620a3-9ec4-404e-ab95-b6b8a1b173cf req-3b27a5f6-ce7e-4ab0-9b8c-de6f52d0931d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Received event network-vif-plugged-519ef635-69c9-49bb-9158-8e3bee30c427 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:52 np0005465988 nova_compute[236126]: 2025-10-02 12:12:52.520 2 DEBUG oslo_concurrency.lockutils [req-945620a3-9ec4-404e-ab95-b6b8a1b173cf req-3b27a5f6-ce7e-4ab0-9b8c-de6f52d0931d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:52 np0005465988 nova_compute[236126]: 2025-10-02 12:12:52.521 2 DEBUG oslo_concurrency.lockutils [req-945620a3-9ec4-404e-ab95-b6b8a1b173cf req-3b27a5f6-ce7e-4ab0-9b8c-de6f52d0931d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:52 np0005465988 nova_compute[236126]: 2025-10-02 12:12:52.521 2 DEBUG oslo_concurrency.lockutils [req-945620a3-9ec4-404e-ab95-b6b8a1b173cf req-3b27a5f6-ce7e-4ab0-9b8c-de6f52d0931d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:52 np0005465988 nova_compute[236126]: 2025-10-02 12:12:52.521 2 DEBUG nova.compute.manager [req-945620a3-9ec4-404e-ab95-b6b8a1b173cf req-3b27a5f6-ce7e-4ab0-9b8c-de6f52d0931d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] No waiting events found dispatching network-vif-plugged-519ef635-69c9-49bb-9158-8e3bee30c427 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:12:52 np0005465988 nova_compute[236126]: 2025-10-02 12:12:52.522 2 WARNING nova.compute.manager [req-945620a3-9ec4-404e-ab95-b6b8a1b173cf req-3b27a5f6-ce7e-4ab0-9b8c-de6f52d0931d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Received unexpected event network-vif-plugged-519ef635-69c9-49bb-9158-8e3bee30c427 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:12:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:52.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:54.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:54.457 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:54.458 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:12:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:12:54.459 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:54 np0005465988 nova_compute[236126]: 2025-10-02 12:12:54.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:54 np0005465988 nova_compute[236126]: 2025-10-02 12:12:54.533 2 DEBUG nova.compute.manager [None req-235acb12-e1eb-4b7f-ade4-efb0fb6d4ecf 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:54 np0005465988 nova_compute[236126]: 2025-10-02 12:12:54.602 2 INFO nova.compute.manager [None req-235acb12-e1eb-4b7f-ade4-efb0fb6d4ecf 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] instance snapshotting#033[00m
Oct  2 08:12:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:54.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:55 np0005465988 nova_compute[236126]: 2025-10-02 12:12:55.005 2 INFO nova.virt.libvirt.driver [None req-235acb12-e1eb-4b7f-ade4-efb0fb6d4ecf 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Beginning live snapshot process#033[00m
Oct  2 08:12:55 np0005465988 nova_compute[236126]: 2025-10-02 12:12:55.224 2 DEBUG nova.virt.libvirt.imagebackend [None req-235acb12-e1eb-4b7f-ade4-efb0fb6d4ecf 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] No parent info for c2d0c2bc-fe21-4689-86ae-d6728c15874c; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Oct  2 08:12:55 np0005465988 nova_compute[236126]: 2025-10-02 12:12:55.596 2 DEBUG nova.storage.rbd_utils [None req-235acb12-e1eb-4b7f-ade4-efb0fb6d4ecf 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] creating snapshot(8942e35cfad54c64aa2cd2cd62c9200b) on rbd image(d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:12:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e197 e197: 3 total, 3 up, 3 in
Oct  2 08:12:55 np0005465988 nova_compute[236126]: 2025-10-02 12:12:55.778 2 DEBUG nova.storage.rbd_utils [None req-235acb12-e1eb-4b7f-ade4-efb0fb6d4ecf 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] cloning vms/d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk@8942e35cfad54c64aa2cd2cd62c9200b to images/11a037e0-306c-4ca3-91ab-08e92bb1fae5 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:12:55 np0005465988 nova_compute[236126]: 2025-10-02 12:12:55.928 2 DEBUG nova.storage.rbd_utils [None req-235acb12-e1eb-4b7f-ade4-efb0fb6d4ecf 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] flattening images/11a037e0-306c-4ca3-91ab-08e92bb1fae5 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:12:56 np0005465988 nova_compute[236126]: 2025-10-02 12:12:56.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:56.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:56 np0005465988 nova_compute[236126]: 2025-10-02 12:12:56.563 2 DEBUG nova.storage.rbd_utils [None req-235acb12-e1eb-4b7f-ade4-efb0fb6d4ecf 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] removing snapshot(8942e35cfad54c64aa2cd2cd62c9200b) on rbd image(d8adf6f4-e7d3-4a21-87f7-4b2396126258_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:12:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:56.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:12:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e198 e198: 3 total, 3 up, 3 in
Oct  2 08:12:57 np0005465988 nova_compute[236126]: 2025-10-02 12:12:57.029 2 DEBUG nova.storage.rbd_utils [None req-235acb12-e1eb-4b7f-ade4-efb0fb6d4ecf 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] creating snapshot(snap) on rbd image(11a037e0-306c-4ca3-91ab-08e92bb1fae5) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:12:57 np0005465988 nova_compute[236126]: 2025-10-02 12:12:57.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:12:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e199 e199: 3 total, 3 up, 3 in
Oct  2 08:12:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:12:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:12:58.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:12:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:12:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:12:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:12:58.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:00.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:00.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:00 np0005465988 nova_compute[236126]: 2025-10-02 12:13:00.743 2 INFO nova.virt.libvirt.driver [None req-235acb12-e1eb-4b7f-ade4-efb0fb6d4ecf 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Snapshot image upload complete#033[00m
Oct  2 08:13:00 np0005465988 nova_compute[236126]: 2025-10-02 12:13:00.744 2 INFO nova.compute.manager [None req-235acb12-e1eb-4b7f-ade4-efb0fb6d4ecf 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Took 6.14 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 08:13:01 np0005465988 nova_compute[236126]: 2025-10-02 12:13:01.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:02 np0005465988 nova_compute[236126]: 2025-10-02 12:13:02.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:13:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:02.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:13:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:02.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:13:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e200 e200: 3 total, 3 up, 3 in
Oct  2 08:13:03 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:03Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2d:e0:82 10.100.0.13
Oct  2 08:13:03 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:03Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:e0:82 10.100.0.13
Oct  2 08:13:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:04.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:04.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:06 np0005465988 nova_compute[236126]: 2025-10-02 12:13:06.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:06.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:06.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:07 np0005465988 nova_compute[236126]: 2025-10-02 12:13:07.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:13:07 np0005465988 podman[259929]: 2025-10-02 12:13:07.556973379 +0000 UTC m=+0.087223225 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  2 08:13:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:08.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:08.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:10.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:10.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:11 np0005465988 nova_compute[236126]: 2025-10-02 12:13:11.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:13:12 np0005465988 nova_compute[236126]: 2025-10-02 12:13:12.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:12.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:12.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:14.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:14.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:16 np0005465988 nova_compute[236126]: 2025-10-02 12:13:16.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:16.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:16.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:13:17 np0005465988 nova_compute[236126]: 2025-10-02 12:13:17.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:18.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:18.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e201 e201: 3 total, 3 up, 3 in
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.127 2 DEBUG oslo_concurrency.lockutils [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.127 2 DEBUG oslo_concurrency.lockutils [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.128 2 DEBUG oslo_concurrency.lockutils [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.128 2 DEBUG oslo_concurrency.lockutils [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.128 2 DEBUG oslo_concurrency.lockutils [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.129 2 INFO nova.compute.manager [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Terminating instance#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.130 2 DEBUG nova.compute.manager [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:13:20 np0005465988 kernel: tap519ef635-69 (unregistering): left promiscuous mode
Oct  2 08:13:20 np0005465988 NetworkManager[45041]: <info>  [1759407200.2160] device (tap519ef635-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:13:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:20Z|00139|binding|INFO|Releasing lport 519ef635-69c9-49bb-9158-8e3bee30c427 from this chassis (sb_readonly=0)
Oct  2 08:13:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:20Z|00140|binding|INFO|Setting lport 519ef635-69c9-49bb-9158-8e3bee30c427 down in Southbound
Oct  2 08:13:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:20Z|00141|binding|INFO|Removing iface tap519ef635-69 ovn-installed in OVS
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:20.240 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:e0:82 10.100.0.13'], port_security=['fa:16:3e:2d:e0:82 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'd8adf6f4-e7d3-4a21-87f7-4b2396126258', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55d20ae21b6d4f0abfff3bccc371ee7a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a015800-2f8b-4fd4-818b-829a4dcb7912', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9678136b-02f9-4c61-b96e-15935f11dca7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=519ef635-69c9-49bb-9158-8e3bee30c427) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:20.242 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 519ef635-69c9-49bb-9158-8e3bee30c427 in datapath 6d00de8e-203c-4e94-b60f-36ba9ccef805 unbound from our chassis#033[00m
Oct  2 08:13:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:20.245 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d00de8e-203c-4e94-b60f-36ba9ccef805, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:13:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:20.248 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ce235c0b-1b80-40df-8cc4-b4655f87209a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:20.250 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805 namespace which is not needed anymore#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:20 np0005465988 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000037.scope: Deactivated successfully.
Oct  2 08:13:20 np0005465988 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000037.scope: Consumed 14.167s CPU time.
Oct  2 08:13:20 np0005465988 systemd-machined[192594]: Machine qemu-21-instance-00000037 terminated.
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.383 2 INFO nova.virt.libvirt.driver [-] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Instance destroyed successfully.#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.384 2 DEBUG nova.objects.instance [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lazy-loading 'resources' on Instance uuid d8adf6f4-e7d3-4a21-87f7-4b2396126258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.400 2 DEBUG nova.virt.libvirt.vif [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:12:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1007708604',display_name='tempest-ImagesTestJSON-server-1007708604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1007708604',id=55,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:12:50Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='55d20ae21b6d4f0abfff3bccc371ee7a',ramdisk_id='',reservation_id='r-cyztd4k0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram
='0',owner_project_name='tempest-ImagesTestJSON-2116266493',owner_user_name='tempest-ImagesTestJSON-2116266493-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:13:00Z,user_data=None,user_id='0df47040f1ff4ce69a6fbdfd9eba4955',uuid=d8adf6f4-e7d3-4a21-87f7-4b2396126258,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "519ef635-69c9-49bb-9158-8e3bee30c427", "address": "fa:16:3e:2d:e0:82", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap519ef635-69", "ovs_interfaceid": "519ef635-69c9-49bb-9158-8e3bee30c427", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.401 2 DEBUG nova.network.os_vif_util [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Converting VIF {"id": "519ef635-69c9-49bb-9158-8e3bee30c427", "address": "fa:16:3e:2d:e0:82", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap519ef635-69", "ovs_interfaceid": "519ef635-69c9-49bb-9158-8e3bee30c427", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.402 2 DEBUG nova.network.os_vif_util [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:e0:82,bridge_name='br-int',has_traffic_filtering=True,id=519ef635-69c9-49bb-9158-8e3bee30c427,network=Network(6d00de8e-203c-4e94-b60f-36ba9ccef805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap519ef635-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.403 2 DEBUG os_vif [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:e0:82,bridge_name='br-int',has_traffic_filtering=True,id=519ef635-69c9-49bb-9158-8e3bee30c427,network=Network(6d00de8e-203c-4e94-b60f-36ba9ccef805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap519ef635-69') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.408 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap519ef635-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.416 2 INFO os_vif [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:e0:82,bridge_name='br-int',has_traffic_filtering=True,id=519ef635-69c9-49bb-9158-8e3bee30c427,network=Network(6d00de8e-203c-4e94-b60f-36ba9ccef805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap519ef635-69')#033[00m
Oct  2 08:13:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:20.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:20 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[259647]: [NOTICE]   (259651) : haproxy version is 2.8.14-c23fe91
Oct  2 08:13:20 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[259647]: [NOTICE]   (259651) : path to executable is /usr/sbin/haproxy
Oct  2 08:13:20 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[259647]: [WARNING]  (259651) : Exiting Master process...
Oct  2 08:13:20 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[259647]: [WARNING]  (259651) : Exiting Master process...
Oct  2 08:13:20 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[259647]: [ALERT]    (259651) : Current worker (259653) exited with code 143 (Terminated)
Oct  2 08:13:20 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[259647]: [WARNING]  (259651) : All workers exited. Exiting... (0)
Oct  2 08:13:20 np0005465988 systemd[1]: libpod-5899e67536f4e634788e7aeaa1d04df79657e571287201af899b3d66441fd4a4.scope: Deactivated successfully.
Oct  2 08:13:20 np0005465988 podman[260040]: 2025-10-02 12:13:20.47114943 +0000 UTC m=+0.056080661 container died 5899e67536f4e634788e7aeaa1d04df79657e571287201af899b3d66441fd4a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 08:13:20 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5899e67536f4e634788e7aeaa1d04df79657e571287201af899b3d66441fd4a4-userdata-shm.mount: Deactivated successfully.
Oct  2 08:13:20 np0005465988 systemd[1]: var-lib-containers-storage-overlay-0f56e4859a8a7e204a7fd32f3d3f2f05fcc71b9dc9200482cd47819355fa5e04-merged.mount: Deactivated successfully.
Oct  2 08:13:20 np0005465988 podman[260040]: 2025-10-02 12:13:20.51729965 +0000 UTC m=+0.102230891 container cleanup 5899e67536f4e634788e7aeaa1d04df79657e571287201af899b3d66441fd4a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:13:20 np0005465988 systemd[1]: libpod-conmon-5899e67536f4e634788e7aeaa1d04df79657e571287201af899b3d66441fd4a4.scope: Deactivated successfully.
Oct  2 08:13:20 np0005465988 podman[260089]: 2025-10-02 12:13:20.586738028 +0000 UTC m=+0.045927306 container remove 5899e67536f4e634788e7aeaa1d04df79657e571287201af899b3d66441fd4a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:13:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:20.595 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[22727c7d-93e2-4a35-9faa-7f79ddad750c]: (4, ('Thu Oct  2 12:13:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805 (5899e67536f4e634788e7aeaa1d04df79657e571287201af899b3d66441fd4a4)\n5899e67536f4e634788e7aeaa1d04df79657e571287201af899b3d66441fd4a4\nThu Oct  2 12:13:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805 (5899e67536f4e634788e7aeaa1d04df79657e571287201af899b3d66441fd4a4)\n5899e67536f4e634788e7aeaa1d04df79657e571287201af899b3d66441fd4a4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:20.596 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1f5ae14c-8f4a-43d7-93a3-5d6c21121c42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:20.597 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d00de8e-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:20 np0005465988 kernel: tap6d00de8e-20: left promiscuous mode
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:20 np0005465988 nova_compute[236126]: 2025-10-02 12:13:20.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:20.621 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[072698cf-25d0-462e-98bf-e4b27762e64f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:20.644 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ea9bd7b1-cddd-4d0e-9f16-8c7cd5c175e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:20.645 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1a450183-5ac8-4902-a187-2396aa70c90a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:20.660 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[00e3774b-0eb0-40a4-a5d8-1f8e701e45e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522784, 'reachable_time': 26178, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260104, 'error': None, 'target': 'ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:20 np0005465988 systemd[1]: run-netns-ovnmeta\x2d6d00de8e\x2d203c\x2d4e94\x2db60f\x2d36ba9ccef805.mount: Deactivated successfully.
Oct  2 08:13:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:20.664 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:13:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:20.664 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[fd4e0220-1b78-42bc-8f45-fdd88d5c4fd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:20.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:21 np0005465988 nova_compute[236126]: 2025-10-02 12:13:21.077 2 INFO nova.virt.libvirt.driver [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Deleting instance files /var/lib/nova/instances/d8adf6f4-e7d3-4a21-87f7-4b2396126258_del
Oct  2 08:13:21 np0005465988 nova_compute[236126]: 2025-10-02 12:13:21.078 2 INFO nova.virt.libvirt.driver [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Deletion of /var/lib/nova/instances/d8adf6f4-e7d3-4a21-87f7-4b2396126258_del complete
Oct  2 08:13:21 np0005465988 nova_compute[236126]: 2025-10-02 12:13:21.142 2 INFO nova.compute.manager [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Took 1.01 seconds to destroy the instance on the hypervisor.
Oct  2 08:13:21 np0005465988 nova_compute[236126]: 2025-10-02 12:13:21.144 2 DEBUG oslo.service.loopingcall [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct  2 08:13:21 np0005465988 nova_compute[236126]: 2025-10-02 12:13:21.145 2 DEBUG nova.compute.manager [-] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct  2 08:13:21 np0005465988 nova_compute[236126]: 2025-10-02 12:13:21.145 2 DEBUG nova.network.neutron [-] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct  2 08:13:21 np0005465988 nova_compute[236126]: 2025-10-02 12:13:21.810 2 DEBUG nova.network.neutron [-] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:13:21 np0005465988 nova_compute[236126]: 2025-10-02 12:13:21.828 2 INFO nova.compute.manager [-] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Took 0.68 seconds to deallocate network for instance.
Oct  2 08:13:21 np0005465988 nova_compute[236126]: 2025-10-02 12:13:21.873 2 DEBUG oslo_concurrency.lockutils [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:21 np0005465988 nova_compute[236126]: 2025-10-02 12:13:21.874 2 DEBUG oslo_concurrency.lockutils [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:21 np0005465988 nova_compute[236126]: 2025-10-02 12:13:21.915 2 DEBUG nova.compute.manager [req-ad0acc93-6c5c-44de-aff9-dfdd4dd1fbb1 req-9e416940-887c-4c5b-9a74-75c26a8c81c9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Received event network-vif-deleted-519ef635-69c9-49bb-9158-8e3bee30c427 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:13:21 np0005465988 nova_compute[236126]: 2025-10-02 12:13:21.944 2 DEBUG oslo_concurrency.processutils [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:13:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:13:22 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/746776297' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:13:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:13:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:22.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.451 2 DEBUG oslo_concurrency.processutils [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.458 2 DEBUG nova.compute.provider_tree [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.472 2 DEBUG nova.scheduler.client.report [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.510 2 DEBUG oslo_concurrency.lockutils [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.543 2 INFO nova.scheduler.client.report [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Deleted allocations for instance d8adf6f4-e7d3-4a21-87f7-4b2396126258
Oct  2 08:13:22 np0005465988 podman[260130]: 2025-10-02 12:13:22.545072061 +0000 UTC m=+0.072679323 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:13:22 np0005465988 podman[260131]: 2025-10-02 12:13:22.58567792 +0000 UTC m=+0.096440512 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:13:22 np0005465988 podman[260129]: 2025-10-02 12:13:22.595754723 +0000 UTC m=+0.118737510 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.618 2 DEBUG oslo_concurrency.lockutils [None req-b5ba2fad-9ab2-4d5c-b5d4-629d7a8d8d47 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "d8adf6f4-e7d3-4a21-87f7-4b2396126258" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.491s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.633 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.634 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.648 2 DEBUG nova.compute.manager [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:13:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:22.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.713 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.714 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.723 2 DEBUG nova.virt.hardware [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.724 2 INFO nova.compute.claims [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:13:22 np0005465988 nova_compute[236126]: 2025-10-02 12:13:22.840 2 DEBUG oslo_concurrency.processutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:13:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:13:23 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/984207681' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.311 2 DEBUG oslo_concurrency.processutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.319 2 DEBUG nova.compute.provider_tree [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.338 2 DEBUG nova.scheduler.client.report [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.390 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.391 2 DEBUG nova.compute.manager [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.454 2 DEBUG nova.compute.manager [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.454 2 DEBUG nova.network.neutron [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.499 2 INFO nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.538 2 DEBUG nova.compute.manager [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.646 2 DEBUG nova.compute.manager [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.650 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.651 2 INFO nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Creating image(s)
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.697 2 DEBUG nova.storage.rbd_utils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] rbd image 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.742 2 DEBUG nova.storage.rbd_utils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] rbd image 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.786 2 DEBUG nova.storage.rbd_utils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] rbd image 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.793 2 DEBUG oslo_concurrency.processutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.837 2 DEBUG nova.policy [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0df47040f1ff4ce69a6fbdfd9eba4955', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '55d20ae21b6d4f0abfff3bccc371ee7a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.886 2 DEBUG oslo_concurrency.processutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.887 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.888 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.888 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.931 2 DEBUG nova.storage.rbd_utils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] rbd image 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:13:23 np0005465988 nova_compute[236126]: 2025-10-02 12:13:23.939 2 DEBUG oslo_concurrency.processutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:13:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:24.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:24.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:24 np0005465988 nova_compute[236126]: 2025-10-02 12:13:24.705 2 DEBUG nova.network.neutron [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Successfully created port: a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:13:25 np0005465988 nova_compute[236126]: 2025-10-02 12:13:25.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:26 np0005465988 nova_compute[236126]: 2025-10-02 12:13:26.032 2 DEBUG nova.network.neutron [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Successfully updated port: a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:13:26 np0005465988 nova_compute[236126]: 2025-10-02 12:13:26.048 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "refresh_cache-8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:13:26 np0005465988 nova_compute[236126]: 2025-10-02 12:13:26.049 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquired lock "refresh_cache-8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:13:26 np0005465988 nova_compute[236126]: 2025-10-02 12:13:26.049 2 DEBUG nova.network.neutron [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:13:26 np0005465988 nova_compute[236126]: 2025-10-02 12:13:26.174 2 DEBUG nova.compute.manager [req-9ffe7a22-68dd-4d20-b424-afc54c17fa9d req-b9ad6814-0d62-47b4-bb3c-c3bee0a1a6a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Received event network-changed-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:13:26 np0005465988 nova_compute[236126]: 2025-10-02 12:13:26.174 2 DEBUG nova.compute.manager [req-9ffe7a22-68dd-4d20-b424-afc54c17fa9d req-b9ad6814-0d62-47b4-bb3c-c3bee0a1a6a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Refreshing instance network info cache due to event network-changed-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:13:26 np0005465988 nova_compute[236126]: 2025-10-02 12:13:26.175 2 DEBUG oslo_concurrency.lockutils [req-9ffe7a22-68dd-4d20-b424-afc54c17fa9d req-b9ad6814-0d62-47b4-bb3c-c3bee0a1a6a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:13:26 np0005465988 nova_compute[236126]: 2025-10-02 12:13:26.235 2 DEBUG nova.network.neutron [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:13:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:26.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:26.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:27 np0005465988 nova_compute[236126]: 2025-10-02 12:13:27.177 2 DEBUG nova.network.neutron [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Updating instance_info_cache with network_info: [{"id": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "address": "fa:16:3e:54:d9:0e", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9dcd32c-a9", "ovs_interfaceid": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:27 np0005465988 nova_compute[236126]: 2025-10-02 12:13:27.207 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Releasing lock "refresh_cache-8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:13:27 np0005465988 nova_compute[236126]: 2025-10-02 12:13:27.207 2 DEBUG nova.compute.manager [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Instance network_info: |[{"id": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "address": "fa:16:3e:54:d9:0e", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9dcd32c-a9", "ovs_interfaceid": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:13:27 np0005465988 nova_compute[236126]: 2025-10-02 12:13:27.208 2 DEBUG oslo_concurrency.lockutils [req-9ffe7a22-68dd-4d20-b424-afc54c17fa9d req-b9ad6814-0d62-47b4-bb3c-c3bee0a1a6a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:13:27 np0005465988 nova_compute[236126]: 2025-10-02 12:13:27.209 2 DEBUG nova.network.neutron [req-9ffe7a22-68dd-4d20-b424-afc54c17fa9d req-b9ad6814-0d62-47b4-bb3c-c3bee0a1a6a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Refreshing network info cache for port a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:13:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:13:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:27.339 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:27.340 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:27.340 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:27 np0005465988 nova_compute[236126]: 2025-10-02 12:13:27.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:27 np0005465988 nova_compute[236126]: 2025-10-02 12:13:27.392 2 DEBUG oslo_concurrency.processutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:27 np0005465988 nova_compute[236126]: 2025-10-02 12:13:27.482 2 DEBUG nova.storage.rbd_utils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] resizing rbd image 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:13:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:13:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:13:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.420 2 DEBUG nova.network.neutron [req-9ffe7a22-68dd-4d20-b424-afc54c17fa9d req-b9ad6814-0d62-47b4-bb3c-c3bee0a1a6a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Updated VIF entry in instance network info cache for port a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.421 2 DEBUG nova.network.neutron [req-9ffe7a22-68dd-4d20-b424-afc54c17fa9d req-b9ad6814-0d62-47b4-bb3c-c3bee0a1a6a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Updating instance_info_cache with network_info: [{"id": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "address": "fa:16:3e:54:d9:0e", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9dcd32c-a9", "ovs_interfaceid": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.438 2 DEBUG oslo_concurrency.lockutils [req-9ffe7a22-68dd-4d20-b424-afc54c17fa9d req-b9ad6814-0d62-47b4-bb3c-c3bee0a1a6a8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:13:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:28.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.669 2 DEBUG nova.objects.instance [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lazy-loading 'migration_context' on Instance uuid 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.684 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.684 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Ensure instance console log exists: /var/lib/nova/instances/8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:13:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.685 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:28.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.686 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.687 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.691 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Start _get_guest_xml network_info=[{"id": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "address": "fa:16:3e:54:d9:0e", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9dcd32c-a9", "ovs_interfaceid": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.699 2 WARNING nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.705 2 DEBUG nova.virt.libvirt.host [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.706 2 DEBUG nova.virt.libvirt.host [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.710 2 DEBUG nova.virt.libvirt.host [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.710 2 DEBUG nova.virt.libvirt.host [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.712 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.713 2 DEBUG nova.virt.hardware [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.713 2 DEBUG nova.virt.hardware [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.714 2 DEBUG nova.virt.hardware [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.714 2 DEBUG nova.virt.hardware [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.715 2 DEBUG nova.virt.hardware [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.715 2 DEBUG nova.virt.hardware [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.715 2 DEBUG nova.virt.hardware [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.716 2 DEBUG nova.virt.hardware [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.716 2 DEBUG nova.virt.hardware [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.717 2 DEBUG nova.virt.hardware [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.717 2 DEBUG nova.virt.hardware [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:13:28 np0005465988 nova_compute[236126]: 2025-10-02 12:13:28.722 2 DEBUG oslo_concurrency.processutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e202 e202: 3 total, 3 up, 3 in
Oct  2 08:13:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 08:13:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 6847 writes, 34K keys, 6847 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.03 MB/s#012Cumulative WAL: 6847 writes, 6847 syncs, 1.00 writes per sync, written: 0.07 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1765 writes, 8585 keys, 1765 commit groups, 1.0 writes per commit group, ingest: 16.83 MB, 0.03 MB/s#012Interval WAL: 1765 writes, 1765 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     80.9      0.51              0.12        18    0.029       0      0       0.0       0.0#012  L6      1/0    9.67 MB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   3.5    131.6    108.1      1.36              0.53        17    0.080     85K   9988       0.0       0.0#012 Sum      1/0    9.67 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   4.5     95.5    100.7      1.87              0.65        35    0.053     85K   9988       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   5.0    172.6    176.2      0.28              0.17         8    0.036     24K   3113       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   0.0    131.6    108.1      1.36              0.53        17    0.080     85K   9988       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     81.2      0.51              0.12        17    0.030       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.041, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.18 GB write, 0.08 MB/s write, 0.17 GB read, 0.07 MB/s read, 1.9 seconds#012Interval compaction: 0.05 GB write, 0.08 MB/s write, 0.05 GB read, 0.08 MB/s read, 0.3 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b3e97ad1f0#2 capacity: 304.00 MB usage: 19.66 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000226 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(1142,18.98 MB,6.24326%) FilterBlock(35,244.80 KB,0.078638%) IndexBlock(35,448.58 KB,0.1441%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct  2 08:13:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:13:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/361221018' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.227 2 DEBUG oslo_concurrency.processutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.274 2 DEBUG nova.storage.rbd_utils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] rbd image 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.280 2 DEBUG oslo_concurrency.processutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:13:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/680505641' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.725 2 DEBUG oslo_concurrency.processutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.728 2 DEBUG nova.virt.libvirt.vif [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:13:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-628122184',display_name='tempest-ImagesTestJSON-server-628122184',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-628122184',id=59,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='55d20ae21b6d4f0abfff3bccc371ee7a',ramdisk_id='',reservation_id='r-euvb0b1z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-2116266493',owner_user_name='tempest-ImagesTestJSON-2116266493-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:13:23Z,user_data=None,user_id='0df47040f1ff4ce69a6fbdfd9eba4955',uuid=8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "address": "fa:16:3e:54:d9:0e", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9dcd32c-a9", "ovs_interfaceid": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.729 2 DEBUG nova.network.os_vif_util [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Converting VIF {"id": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "address": "fa:16:3e:54:d9:0e", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9dcd32c-a9", "ovs_interfaceid": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.731 2 DEBUG nova.network.os_vif_util [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:d9:0e,bridge_name='br-int',has_traffic_filtering=True,id=a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57,network=Network(6d00de8e-203c-4e94-b60f-36ba9ccef805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9dcd32c-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.733 2 DEBUG nova.objects.instance [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lazy-loading 'pci_devices' on Instance uuid 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.756 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  <uuid>8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09</uuid>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  <name>instance-0000003b</name>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <nova:name>tempest-ImagesTestJSON-server-628122184</nova:name>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:13:28</nova:creationTime>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <nova:user uuid="0df47040f1ff4ce69a6fbdfd9eba4955">tempest-ImagesTestJSON-2116266493-project-member</nova:user>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <nova:project uuid="55d20ae21b6d4f0abfff3bccc371ee7a">tempest-ImagesTestJSON-2116266493</nova:project>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <nova:port uuid="a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <entry name="serial">8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09</entry>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <entry name="uuid">8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09</entry>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk.config">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:54:d9:0e"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <target dev="tapa9dcd32c-a9"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09/console.log" append="off"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:13:29 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:13:29 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:13:29 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:13:29 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.757 2 DEBUG nova.compute.manager [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Preparing to wait for external event network-vif-plugged-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.757 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.757 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.758 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.758 2 DEBUG nova.virt.libvirt.vif [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:13:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-628122184',display_name='tempest-ImagesTestJSON-server-628122184',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-628122184',id=59,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='55d20ae21b6d4f0abfff3bccc371ee7a',ramdisk_id='',reservation_id='r-euvb0b1z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-2116266493',owner_user_name='tempest-ImagesTestJSON-2116266493-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:13:23Z,user_data=None,user_id='0df47040f1ff4ce69a6fbdfd9eba4955',uuid=8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "address": "fa:16:3e:54:d9:0e", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9dcd32c-a9", "ovs_interfaceid": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.758 2 DEBUG nova.network.os_vif_util [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Converting VIF {"id": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "address": "fa:16:3e:54:d9:0e", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9dcd32c-a9", "ovs_interfaceid": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.759 2 DEBUG nova.network.os_vif_util [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:d9:0e,bridge_name='br-int',has_traffic_filtering=True,id=a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57,network=Network(6d00de8e-203c-4e94-b60f-36ba9ccef805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9dcd32c-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.759 2 DEBUG os_vif [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:d9:0e,bridge_name='br-int',has_traffic_filtering=True,id=a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57,network=Network(6d00de8e-203c-4e94-b60f-36ba9ccef805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9dcd32c-a9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.760 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.760 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.763 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa9dcd32c-a9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.763 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa9dcd32c-a9, col_values=(('external_ids', {'iface-id': 'a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:d9:0e', 'vm-uuid': '8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:29 np0005465988 NetworkManager[45041]: <info>  [1759407209.7665] manager: (tapa9dcd32c-a9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.774 2 INFO os_vif [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:d9:0e,bridge_name='br-int',has_traffic_filtering=True,id=a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57,network=Network(6d00de8e-203c-4e94-b60f-36ba9ccef805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9dcd32c-a9')#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.852 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.852 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.853 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] No VIF found with MAC fa:16:3e:54:d9:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.853 2 INFO nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Using config drive#033[00m
Oct  2 08:13:29 np0005465988 nova_compute[236126]: 2025-10-02 12:13:29.895 2 DEBUG nova.storage.rbd_utils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] rbd image 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:13:30 np0005465988 nova_compute[236126]: 2025-10-02 12:13:30.312 2 INFO nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Creating config drive at /var/lib/nova/instances/8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09/disk.config#033[00m
Oct  2 08:13:30 np0005465988 nova_compute[236126]: 2025-10-02 12:13:30.319 2 DEBUG oslo_concurrency.processutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8r_liie7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:30 np0005465988 nova_compute[236126]: 2025-10-02 12:13:30.450 2 DEBUG oslo_concurrency.processutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8r_liie7" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:30.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:30 np0005465988 nova_compute[236126]: 2025-10-02 12:13:30.489 2 DEBUG nova.storage.rbd_utils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] rbd image 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:13:30 np0005465988 nova_compute[236126]: 2025-10-02 12:13:30.493 2 DEBUG oslo_concurrency.processutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09/disk.config 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:30.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:30 np0005465988 nova_compute[236126]: 2025-10-02 12:13:30.951 2 DEBUG oslo_concurrency.processutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09/disk.config 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:13:30 np0005465988 nova_compute[236126]: 2025-10-02 12:13:30.952 2 INFO nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Deleting local config drive /var/lib/nova/instances/8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09/disk.config because it was imported into RBD.
Oct  2 08:13:31 np0005465988 kernel: tapa9dcd32c-a9: entered promiscuous mode
Oct  2 08:13:31 np0005465988 NetworkManager[45041]: <info>  [1759407211.0291] manager: (tapa9dcd32c-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Oct  2 08:13:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:31Z|00142|binding|INFO|Claiming lport a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 for this chassis.
Oct  2 08:13:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:31Z|00143|binding|INFO|a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57: Claiming fa:16:3e:54:d9:0e 10.100.0.9
Oct  2 08:13:31 np0005465988 nova_compute[236126]: 2025-10-02 12:13:31.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.037 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:d9:0e 10.100.0.9'], port_security=['fa:16:3e:54:d9:0e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55d20ae21b6d4f0abfff3bccc371ee7a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0a015800-2f8b-4fd4-818b-829a4dcb7912', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9678136b-02f9-4c61-b96e-15935f11dca7, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.039 142124 INFO neutron.agent.ovn.metadata.agent [-] Port a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 in datapath 6d00de8e-203c-4e94-b60f-36ba9ccef805 bound to our chassis
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.042 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6d00de8e-203c-4e94-b60f-36ba9ccef805
Oct  2 08:13:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:31Z|00144|binding|INFO|Setting lport a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 ovn-installed in OVS
Oct  2 08:13:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:31Z|00145|binding|INFO|Setting lport a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 up in Southbound
Oct  2 08:13:31 np0005465988 nova_compute[236126]: 2025-10-02 12:13:31.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.059 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4fed1e82-e6a6-4aa5-b93c-e1a620d19a3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.061 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6d00de8e-21 in ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.065 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6d00de8e-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.065 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d4171d9e-05ee-4596-ad47-07776a4a82d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.066 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5848fea6-4cc0-4392-b37f-275a68e0743c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 nova_compute[236126]: 2025-10-02 12:13:31.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:31 np0005465988 systemd-udevd[260655]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.083 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b7adb5-362b-4fe8-bd1f-057b3f348db0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 systemd-machined[192594]: New machine qemu-22-instance-0000003b.
Oct  2 08:13:31 np0005465988 systemd[1]: Started Virtual Machine qemu-22-instance-0000003b.
Oct  2 08:13:31 np0005465988 NetworkManager[45041]: <info>  [1759407211.1001] device (tapa9dcd32c-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:13:31 np0005465988 NetworkManager[45041]: <info>  [1759407211.1010] device (tapa9dcd32c-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.112 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c69c2656-01a4-4b3a-8f1d-0f053f55efb9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.141 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[96275fd5-8d1d-4df4-9cd0-4d809de479c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 NetworkManager[45041]: <info>  [1759407211.1477] manager: (tap6d00de8e-20): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.146 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[991cfe4f-1a5f-4ec3-bd21-5f7c7569677c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.186 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[7c8dc1da-d5be-4d14-97db-2bc6123c477c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.190 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[0ed79483-34b4-4c17-96c3-438fd73b0285]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 NetworkManager[45041]: <info>  [1759407211.2156] device (tap6d00de8e-20): carrier: link connected
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.221 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd35e39-d1ee-473c-b5c9-50029a34d843]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.242 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dffb4278-491f-45b8-ba20-a9ee59f20f99]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d00de8e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:87:28:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527154, 'reachable_time': 28869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260686, 'error': None, 'target': 'ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.261 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[514bbcd9-e49a-44ad-9bbe-a93f5fd60232]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe87:28f2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527154, 'tstamp': 527154}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 260687, 'error': None, 'target': 'ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.280 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[05fd7d16-26b1-4c1a-b334-3a60b44b37f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d00de8e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:87:28:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527154, 'reachable_time': 28869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 260688, 'error': None, 'target': 'ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 nova_compute[236126]: 2025-10-02 12:13:31.312 2 DEBUG nova.compute.manager [req-0808fb9d-aae9-47b3-ad3e-f87a4194f800 req-703cccd0-46cc-4f54-8fdc-d27085134d52 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Received event network-vif-plugged-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:13:31 np0005465988 nova_compute[236126]: 2025-10-02 12:13:31.312 2 DEBUG oslo_concurrency.lockutils [req-0808fb9d-aae9-47b3-ad3e-f87a4194f800 req-703cccd0-46cc-4f54-8fdc-d27085134d52 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:31 np0005465988 nova_compute[236126]: 2025-10-02 12:13:31.313 2 DEBUG oslo_concurrency.lockutils [req-0808fb9d-aae9-47b3-ad3e-f87a4194f800 req-703cccd0-46cc-4f54-8fdc-d27085134d52 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:31 np0005465988 nova_compute[236126]: 2025-10-02 12:13:31.313 2 DEBUG oslo_concurrency.lockutils [req-0808fb9d-aae9-47b3-ad3e-f87a4194f800 req-703cccd0-46cc-4f54-8fdc-d27085134d52 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:13:31 np0005465988 nova_compute[236126]: 2025-10-02 12:13:31.313 2 DEBUG nova.compute.manager [req-0808fb9d-aae9-47b3-ad3e-f87a4194f800 req-703cccd0-46cc-4f54-8fdc-d27085134d52 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Processing event network-vif-plugged-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.324 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5584ef-2dd2-4941-9687-bc7265fb7430]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.406 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d31e9a61-859a-4728-8b71-211a05563e36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.408 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d00de8e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.408 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.409 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d00de8e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:13:31 np0005465988 NetworkManager[45041]: <info>  [1759407211.4124] manager: (tap6d00de8e-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Oct  2 08:13:31 np0005465988 kernel: tap6d00de8e-20: entered promiscuous mode
Oct  2 08:13:31 np0005465988 nova_compute[236126]: 2025-10-02 12:13:31.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:31 np0005465988 nova_compute[236126]: 2025-10-02 12:13:31.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.417 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6d00de8e-20, col_values=(('external_ids', {'iface-id': '4d0b2163-acbb-4b6a-b6d8-84f8212e1e02'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:13:31 np0005465988 nova_compute[236126]: 2025-10-02 12:13:31.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:31Z|00146|binding|INFO|Releasing lport 4d0b2163-acbb-4b6a-b6d8-84f8212e1e02 from this chassis (sb_readonly=0)
Oct  2 08:13:31 np0005465988 nova_compute[236126]: 2025-10-02 12:13:31.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.449 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6d00de8e-203c-4e94-b60f-36ba9ccef805.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6d00de8e-203c-4e94-b60f-36ba9ccef805.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.450 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[28ae1fe3-778d-4aca-b86a-364b7da085ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.451 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-6d00de8e-203c-4e94-b60f-36ba9ccef805
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/6d00de8e-203c-4e94-b60f-36ba9ccef805.pid.haproxy
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 6d00de8e-203c-4e94-b60f-36ba9ccef805
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  2 08:13:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:31.452 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'env', 'PROCESS_TAG=haproxy-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6d00de8e-203c-4e94-b60f-36ba9ccef805.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  2 08:13:31 np0005465988 podman[260762]: 2025-10-02 12:13:31.898235262 +0000 UTC m=+0.081171899 container create ede192eb82bfc0004a955d57765ab02298e687b613b5a4b2bdca9bef97898869 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:13:31 np0005465988 systemd[1]: Started libpod-conmon-ede192eb82bfc0004a955d57765ab02298e687b613b5a4b2bdca9bef97898869.scope.
Oct  2 08:13:31 np0005465988 podman[260762]: 2025-10-02 12:13:31.859919719 +0000 UTC m=+0.042856386 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:13:31 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:13:31 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9531c9dc07ee1b00d98cfc94d56ed6e29b1bd0d5a9f3a6c2884cc833894367e9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:13:32 np0005465988 podman[260762]: 2025-10-02 12:13:32.010545515 +0000 UTC m=+0.193482102 container init ede192eb82bfc0004a955d57765ab02298e687b613b5a4b2bdca9bef97898869 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 08:13:32 np0005465988 podman[260762]: 2025-10-02 12:13:32.016974602 +0000 UTC m=+0.199911189 container start ede192eb82bfc0004a955d57765ab02298e687b613b5a4b2bdca9bef97898869 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:13:32 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[260777]: [NOTICE]   (260781) : New worker (260783) forked
Oct  2 08:13:32 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[260777]: [NOTICE]   (260781) : Loading success.
Oct  2 08:13:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.298 2 DEBUG nova.compute.manager [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.299 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407212.2989132, 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.299 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] VM Started (Lifecycle Event)#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.305 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.309 2 INFO nova.virt.libvirt.driver [-] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Instance spawned successfully.#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.309 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.372 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.378 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.381 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.382 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.382 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.382 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.384 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.384 2 DEBUG nova.virt.libvirt.driver [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.419 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.420 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407212.2991695, 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.420 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.450 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.455 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407212.3040469, 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.455 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:13:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:32.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.463 2 INFO nova.compute.manager [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Took 8.82 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.464 2 DEBUG nova.compute.manager [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.471 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.475 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.507 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.536 2 INFO nova.compute.manager [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Took 9.84 seconds to build instance.#033[00m
Oct  2 08:13:32 np0005465988 nova_compute[236126]: 2025-10-02 12:13:32.574 2 DEBUG oslo_concurrency.lockutils [None req-5996cbff-f72f-4df0-80e0-80fcf749839b 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.940s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:32.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:33 np0005465988 nova_compute[236126]: 2025-10-02 12:13:33.391 2 DEBUG nova.compute.manager [req-bcc6bc93-e6d5-433a-9671-8d3a7027a07f req-dcc134af-90a1-4ab1-89a8-b2bdd10a236c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Received event network-vif-plugged-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:33 np0005465988 nova_compute[236126]: 2025-10-02 12:13:33.392 2 DEBUG oslo_concurrency.lockutils [req-bcc6bc93-e6d5-433a-9671-8d3a7027a07f req-dcc134af-90a1-4ab1-89a8-b2bdd10a236c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:33 np0005465988 nova_compute[236126]: 2025-10-02 12:13:33.392 2 DEBUG oslo_concurrency.lockutils [req-bcc6bc93-e6d5-433a-9671-8d3a7027a07f req-dcc134af-90a1-4ab1-89a8-b2bdd10a236c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:33 np0005465988 nova_compute[236126]: 2025-10-02 12:13:33.393 2 DEBUG oslo_concurrency.lockutils [req-bcc6bc93-e6d5-433a-9671-8d3a7027a07f req-dcc134af-90a1-4ab1-89a8-b2bdd10a236c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:33 np0005465988 nova_compute[236126]: 2025-10-02 12:13:33.393 2 DEBUG nova.compute.manager [req-bcc6bc93-e6d5-433a-9671-8d3a7027a07f req-dcc134af-90a1-4ab1-89a8-b2bdd10a236c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] No waiting events found dispatching network-vif-plugged-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:33 np0005465988 nova_compute[236126]: 2025-10-02 12:13:33.394 2 WARNING nova.compute.manager [req-bcc6bc93-e6d5-433a-9671-8d3a7027a07f req-dcc134af-90a1-4ab1-89a8-b2bdd10a236c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Received unexpected event network-vif-plugged-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:13:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:34.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:34.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:34 np0005465988 nova_compute[236126]: 2025-10-02 12:13:34.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:35 np0005465988 nova_compute[236126]: 2025-10-02 12:13:35.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:35.133 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:35.134 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:13:35 np0005465988 nova_compute[236126]: 2025-10-02 12:13:35.379 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407200.375569, d8adf6f4-e7d3-4a21-87f7-4b2396126258 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:35 np0005465988 nova_compute[236126]: 2025-10-02 12:13:35.379 2 INFO nova.compute.manager [-] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:13:35 np0005465988 nova_compute[236126]: 2025-10-02 12:13:35.400 2 DEBUG nova.compute.manager [None req-5e93a69b-2c66-45fb-9e32-7fdb445b0388 - - - - - -] [instance: d8adf6f4-e7d3-4a21-87f7-4b2396126258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:35 np0005465988 nova_compute[236126]: 2025-10-02 12:13:35.954 2 DEBUG nova.compute.manager [None req-5b69e2f5-e249-430c-bc5c-15c5b6d10939 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:36 np0005465988 nova_compute[236126]: 2025-10-02 12:13:36.012 2 INFO nova.compute.manager [None req-5b69e2f5-e249-430c-bc5c-15c5b6d10939 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] instance snapshotting#033[00m
Oct  2 08:13:36 np0005465988 nova_compute[236126]: 2025-10-02 12:13:36.386 2 INFO nova.virt.libvirt.driver [None req-5b69e2f5-e249-430c-bc5c-15c5b6d10939 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Beginning live snapshot process#033[00m
Oct  2 08:13:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:36.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:36 np0005465988 nova_compute[236126]: 2025-10-02 12:13:36.539 2 DEBUG nova.virt.libvirt.imagebackend [None req-5b69e2f5-e249-430c-bc5c-15c5b6d10939 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] No parent info for c2d0c2bc-fe21-4689-86ae-d6728c15874c; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Oct  2 08:13:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:36.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:36 np0005465988 nova_compute[236126]: 2025-10-02 12:13:36.781 2 DEBUG nova.storage.rbd_utils [None req-5b69e2f5-e249-430c-bc5c-15c5b6d10939 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] creating snapshot(ff89ad0a01654743a811ffbbda3f2696) on rbd image(8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:13:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:13:37 np0005465988 nova_compute[236126]: 2025-10-02 12:13:37.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:37 np0005465988 nova_compute[236126]: 2025-10-02 12:13:37.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:37 np0005465988 podman[260920]: 2025-10-02 12:13:37.933689015 +0000 UTC m=+0.064213447 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:13:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e203 e203: 3 total, 3 up, 3 in
Oct  2 08:13:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:38.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:38 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #47. Immutable memtables: 4.
Oct  2 08:13:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:38.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:13:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:13:39 np0005465988 nova_compute[236126]: 2025-10-02 12:13:39.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:39 np0005465988 nova_compute[236126]: 2025-10-02 12:13:39.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:39 np0005465988 nova_compute[236126]: 2025-10-02 12:13:39.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:39 np0005465988 nova_compute[236126]: 2025-10-02 12:13:39.504 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:39 np0005465988 nova_compute[236126]: 2025-10-02 12:13:39.505 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:39 np0005465988 nova_compute[236126]: 2025-10-02 12:13:39.505 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:39 np0005465988 nova_compute[236126]: 2025-10-02 12:13:39.505 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:13:39 np0005465988 nova_compute[236126]: 2025-10-02 12:13:39.506 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:39 np0005465988 nova_compute[236126]: 2025-10-02 12:13:39.577 2 DEBUG nova.storage.rbd_utils [None req-5b69e2f5-e249-430c-bc5c-15c5b6d10939 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] cloning vms/8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk@ff89ad0a01654743a811ffbbda3f2696 to images/37de6150-c476-48f0-b3f4-9bad2557dfc2 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:13:39 np0005465988 nova_compute[236126]: 2025-10-02 12:13:39.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:13:39 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1446482444' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:13:39 np0005465988 nova_compute[236126]: 2025-10-02 12:13:39.985 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:40 np0005465988 nova_compute[236126]: 2025-10-02 12:13:40.407 2 DEBUG nova.storage.rbd_utils [None req-5b69e2f5-e249-430c-bc5c-15c5b6d10939 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] flattening images/37de6150-c476-48f0-b3f4-9bad2557dfc2 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:13:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:40.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:40.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:41 np0005465988 nova_compute[236126]: 2025-10-02 12:13:41.057 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000003b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:13:41 np0005465988 nova_compute[236126]: 2025-10-02 12:13:41.057 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000003b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:13:41 np0005465988 nova_compute[236126]: 2025-10-02 12:13:41.245 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:13:41 np0005465988 nova_compute[236126]: 2025-10-02 12:13:41.246 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4536MB free_disk=20.900920867919922GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:13:41 np0005465988 nova_compute[236126]: 2025-10-02 12:13:41.247 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:41 np0005465988 nova_compute[236126]: 2025-10-02 12:13:41.247 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:41 np0005465988 nova_compute[236126]: 2025-10-02 12:13:41.396 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:13:41 np0005465988 nova_compute[236126]: 2025-10-02 12:13:41.396 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:13:41 np0005465988 nova_compute[236126]: 2025-10-02 12:13:41.397 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:13:41 np0005465988 nova_compute[236126]: 2025-10-02 12:13:41.563 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:13:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1195908143' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:13:42 np0005465988 nova_compute[236126]: 2025-10-02 12:13:42.104 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.541s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:42 np0005465988 nova_compute[236126]: 2025-10-02 12:13:42.112 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:13:42 np0005465988 nova_compute[236126]: 2025-10-02 12:13:42.133 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:13:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:42.137 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:42 np0005465988 nova_compute[236126]: 2025-10-02 12:13:42.166 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:13:42 np0005465988 nova_compute[236126]: 2025-10-02 12:13:42.167 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.920s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:13:42 np0005465988 nova_compute[236126]: 2025-10-02 12:13:42.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:42 np0005465988 nova_compute[236126]: 2025-10-02 12:13:42.450 2 DEBUG nova.storage.rbd_utils [None req-5b69e2f5-e249-430c-bc5c-15c5b6d10939 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] removing snapshot(ff89ad0a01654743a811ffbbda3f2696) on rbd image(8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:13:42 np0005465988 nova_compute[236126]: 2025-10-02 12:13:42.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:42 np0005465988 nova_compute[236126]: 2025-10-02 12:13:42.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:42 np0005465988 nova_compute[236126]: 2025-10-02 12:13:42.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:13:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:42.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:42 np0005465988 nova_compute[236126]: 2025-10-02 12:13:42.505 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:13:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:42.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:43 np0005465988 nova_compute[236126]: 2025-10-02 12:13:43.504 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:43 np0005465988 nova_compute[236126]: 2025-10-02 12:13:43.505 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:13:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e204 e204: 3 total, 3 up, 3 in
Oct  2 08:13:44 np0005465988 nova_compute[236126]: 2025-10-02 12:13:44.175 2 DEBUG nova.storage.rbd_utils [None req-5b69e2f5-e249-430c-bc5c-15c5b6d10939 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] creating snapshot(snap) on rbd image(37de6150-c476-48f0-b3f4-9bad2557dfc2) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:13:44 np0005465988 nova_compute[236126]: 2025-10-02 12:13:44.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:44.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:44.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:44 np0005465988 nova_compute[236126]: 2025-10-02 12:13:44.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e205 e205: 3 total, 3 up, 3 in
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver [None req-5b69e2f5-e249-430c-bc5c-15c5b6d10939 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Failed to snapshot image: nova.exception.ImageNotFound: Image 37de6150-c476-48f0-b3f4-9bad2557dfc2 could not be found.
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 691, in update
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     image = self._update_v2(context, sent_service_image_meta, data)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 700, in _update_v2
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     image = self._client.call(
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 191, in call
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     result = getattr(controller, method)(*args, **kwargs)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 440, in update
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     unvalidated_image = self.get(image_id)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 197, in get
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     return self._get(image_id)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/utils.py", line 649, in inner
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     return RequestIdProxy(wrapped(*args, **kwargs))
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 190, in _get
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     resp, body = self.http_client.get(url, headers=header)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py", line 395, in get
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     return self.request(url, 'GET', **kwargs)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 380, in request
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     return self._handle_response(resp)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 120, in _handle_response
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     raise exc.from_response(resp, resp.content)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver glanceclient.exc.HTTPNotFound: HTTP 404 Not Found: No image found with ID 37de6150-c476-48f0-b3f4-9bad2557dfc2
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver 
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver 
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3082, in snapshot
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     self._image_api.update(context, image_id, metadata,
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 1243, in update
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     return session.update(context, image_id, image_info, data=data,
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 693, in update
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     _reraise_translated_image_exception(image_id)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     raise new_exc.with_traceback(exc_trace)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 691, in update
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     image = self._update_v2(context, sent_service_image_meta, data)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 700, in _update_v2
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     image = self._client.call(
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 191, in call
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     result = getattr(controller, method)(*args, **kwargs)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 440, in update
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     unvalidated_image = self.get(image_id)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 197, in get
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     return self._get(image_id)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/utils.py", line 649, in inner
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     return RequestIdProxy(wrapped(*args, **kwargs))
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 190, in _get
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     resp, body = self.http_client.get(url, headers=header)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py", line 395, in get
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     return self.request(url, 'GET', **kwargs)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 380, in request
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     return self._handle_response(resp)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 120, in _handle_response
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver     raise exc.from_response(resp, resp.content)
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver nova.exception.ImageNotFound: Image 37de6150-c476-48f0-b3f4-9bad2557dfc2 could not be found.
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.211 2 ERROR nova.virt.libvirt.driver #033[00m
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.303 2 DEBUG nova.storage.rbd_utils [None req-5b69e2f5-e249-430c-bc5c-15c5b6d10939 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] removing snapshot(snap) on rbd image(37de6150-c476-48f0-b3f4-9bad2557dfc2) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.503 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:46.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.504 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.504 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.540 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.540 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.541 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:13:46 np0005465988 nova_compute[236126]: 2025-10-02 12:13:46.542 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:46.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:46Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:54:d9:0e 10.100.0.9
Oct  2 08:13:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:46Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:54:d9:0e 10.100.0.9
Oct  2 08:13:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:13:47 np0005465988 nova_compute[236126]: 2025-10-02 12:13:47.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e206 e206: 3 total, 3 up, 3 in
Oct  2 08:13:48 np0005465988 nova_compute[236126]: 2025-10-02 12:13:48.320 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Updating instance_info_cache with network_info: [{"id": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "address": "fa:16:3e:54:d9:0e", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9dcd32c-a9", "ovs_interfaceid": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:48 np0005465988 nova_compute[236126]: 2025-10-02 12:13:48.339 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:13:48 np0005465988 nova_compute[236126]: 2025-10-02 12:13:48.339 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:13:48 np0005465988 nova_compute[236126]: 2025-10-02 12:13:48.340 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:48.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:48.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:49 np0005465988 nova_compute[236126]: 2025-10-02 12:13:49.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:50.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:50.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:13:52 np0005465988 nova_compute[236126]: 2025-10-02 12:13:52.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:52 np0005465988 nova_compute[236126]: 2025-10-02 12:13:52.396 2 WARNING nova.compute.manager [None req-5b69e2f5-e249-430c-bc5c-15c5b6d10939 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Image not found during snapshot: nova.exception.ImageNotFound: Image 37de6150-c476-48f0-b3f4-9bad2557dfc2 could not be found.#033[00m
Oct  2 08:13:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:52.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:52.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:53 np0005465988 nova_compute[236126]: 2025-10-02 12:13:53.316 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:53 np0005465988 nova_compute[236126]: 2025-10-02 12:13:53.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:53 np0005465988 nova_compute[236126]: 2025-10-02 12:13:53.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:13:53 np0005465988 podman[261148]: 2025-10-02 12:13:53.551856712 +0000 UTC m=+0.068766039 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:13:53 np0005465988 podman[261147]: 2025-10-02 12:13:53.579487275 +0000 UTC m=+0.096484874 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0)
Oct  2 08:13:53 np0005465988 podman[261146]: 2025-10-02 12:13:53.619182058 +0000 UTC m=+0.135681573 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:13:53 np0005465988 nova_compute[236126]: 2025-10-02 12:13:53.652 2 DEBUG oslo_concurrency.lockutils [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:53 np0005465988 nova_compute[236126]: 2025-10-02 12:13:53.653 2 DEBUG oslo_concurrency.lockutils [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:53 np0005465988 nova_compute[236126]: 2025-10-02 12:13:53.653 2 DEBUG oslo_concurrency.lockutils [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:53 np0005465988 nova_compute[236126]: 2025-10-02 12:13:53.654 2 DEBUG oslo_concurrency.lockutils [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:53 np0005465988 nova_compute[236126]: 2025-10-02 12:13:53.654 2 DEBUG oslo_concurrency.lockutils [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:53 np0005465988 nova_compute[236126]: 2025-10-02 12:13:53.655 2 INFO nova.compute.manager [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Terminating instance#033[00m
Oct  2 08:13:53 np0005465988 nova_compute[236126]: 2025-10-02 12:13:53.656 2 DEBUG nova.compute.manager [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:13:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e207 e207: 3 total, 3 up, 3 in
Oct  2 08:13:54 np0005465988 kernel: tapa9dcd32c-a9 (unregistering): left promiscuous mode
Oct  2 08:13:54 np0005465988 NetworkManager[45041]: <info>  [1759407234.4573] device (tapa9dcd32c-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:13:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:54.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:54Z|00147|binding|INFO|Releasing lport a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 from this chassis (sb_readonly=0)
Oct  2 08:13:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:54Z|00148|binding|INFO|Setting lport a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 down in Southbound
Oct  2 08:13:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:13:54Z|00149|binding|INFO|Removing iface tapa9dcd32c-a9 ovn-installed in OVS
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:54.532 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:d9:0e 10.100.0.9'], port_security=['fa:16:3e:54:d9:0e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55d20ae21b6d4f0abfff3bccc371ee7a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a015800-2f8b-4fd4-818b-829a4dcb7912', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9678136b-02f9-4c61-b96e-15935f11dca7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:54.533 142124 INFO neutron.agent.ovn.metadata.agent [-] Port a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 in datapath 6d00de8e-203c-4e94-b60f-36ba9ccef805 unbound from our chassis#033[00m
Oct  2 08:13:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:54.535 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d00de8e-203c-4e94-b60f-36ba9ccef805, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:13:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:54.536 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[df05bd38-9b51-40ba-9889-7826828e0b33]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:54.537 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805 namespace which is not needed anymore#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005465988 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000003b.scope: Deactivated successfully.
Oct  2 08:13:54 np0005465988 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000003b.scope: Consumed 14.436s CPU time.
Oct  2 08:13:54 np0005465988 systemd-machined[192594]: Machine qemu-22-instance-0000003b terminated.
Oct  2 08:13:54 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[260777]: [NOTICE]   (260781) : haproxy version is 2.8.14-c23fe91
Oct  2 08:13:54 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[260777]: [NOTICE]   (260781) : path to executable is /usr/sbin/haproxy
Oct  2 08:13:54 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[260777]: [WARNING]  (260781) : Exiting Master process...
Oct  2 08:13:54 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[260777]: [WARNING]  (260781) : Exiting Master process...
Oct  2 08:13:54 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[260777]: [ALERT]    (260781) : Current worker (260783) exited with code 143 (Terminated)
Oct  2 08:13:54 np0005465988 neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805[260777]: [WARNING]  (260781) : All workers exited. Exiting... (0)
Oct  2 08:13:54 np0005465988 systemd[1]: libpod-ede192eb82bfc0004a955d57765ab02298e687b613b5a4b2bdca9bef97898869.scope: Deactivated successfully.
Oct  2 08:13:54 np0005465988 podman[261237]: 2025-10-02 12:13:54.690974945 +0000 UTC m=+0.053025971 container died ede192eb82bfc0004a955d57765ab02298e687b613b5a4b2bdca9bef97898869 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.700 2 INFO nova.virt.libvirt.driver [-] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Instance destroyed successfully.#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.700 2 DEBUG nova.objects.instance [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lazy-loading 'resources' on Instance uuid 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:54.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:54 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ede192eb82bfc0004a955d57765ab02298e687b613b5a4b2bdca9bef97898869-userdata-shm.mount: Deactivated successfully.
Oct  2 08:13:54 np0005465988 systemd[1]: var-lib-containers-storage-overlay-9531c9dc07ee1b00d98cfc94d56ed6e29b1bd0d5a9f3a6c2884cc833894367e9-merged.mount: Deactivated successfully.
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.728 2 DEBUG nova.virt.libvirt.vif [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:13:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-628122184',display_name='tempest-ImagesTestJSON-server-628122184',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-628122184',id=59,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:13:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='55d20ae21b6d4f0abfff3bccc371ee7a',ramdisk_id='',reservation_id='r-euvb0b1z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0
',owner_project_name='tempest-ImagesTestJSON-2116266493',owner_user_name='tempest-ImagesTestJSON-2116266493-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:13:52Z,user_data=None,user_id='0df47040f1ff4ce69a6fbdfd9eba4955',uuid=8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "address": "fa:16:3e:54:d9:0e", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9dcd32c-a9", "ovs_interfaceid": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.728 2 DEBUG nova.network.os_vif_util [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Converting VIF {"id": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "address": "fa:16:3e:54:d9:0e", "network": {"id": "6d00de8e-203c-4e94-b60f-36ba9ccef805", "bridge": "br-int", "label": "tempest-ImagesTestJSON-833660691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d20ae21b6d4f0abfff3bccc371ee7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9dcd32c-a9", "ovs_interfaceid": "a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.729 2 DEBUG nova.network.os_vif_util [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:54:d9:0e,bridge_name='br-int',has_traffic_filtering=True,id=a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57,network=Network(6d00de8e-203c-4e94-b60f-36ba9ccef805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9dcd32c-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.730 2 DEBUG os_vif [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:d9:0e,bridge_name='br-int',has_traffic_filtering=True,id=a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57,network=Network(6d00de8e-203c-4e94-b60f-36ba9ccef805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9dcd32c-a9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.732 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa9dcd32c-a9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:54 np0005465988 podman[261237]: 2025-10-02 12:13:54.737093985 +0000 UTC m=+0.099145001 container cleanup ede192eb82bfc0004a955d57765ab02298e687b613b5a4b2bdca9bef97898869 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.740 2 INFO os_vif [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:d9:0e,bridge_name='br-int',has_traffic_filtering=True,id=a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57,network=Network(6d00de8e-203c-4e94-b60f-36ba9ccef805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9dcd32c-a9')#033[00m
Oct  2 08:13:54 np0005465988 systemd[1]: libpod-conmon-ede192eb82bfc0004a955d57765ab02298e687b613b5a4b2bdca9bef97898869.scope: Deactivated successfully.
Oct  2 08:13:54 np0005465988 podman[261283]: 2025-10-02 12:13:54.805619756 +0000 UTC m=+0.041683982 container remove ede192eb82bfc0004a955d57765ab02298e687b613b5a4b2bdca9bef97898869 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:13:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:54.811 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a862081e-2f4c-400d-bd05-28a6b2430bca]: (4, ('Thu Oct  2 12:13:54 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805 (ede192eb82bfc0004a955d57765ab02298e687b613b5a4b2bdca9bef97898869)\nede192eb82bfc0004a955d57765ab02298e687b613b5a4b2bdca9bef97898869\nThu Oct  2 12:13:54 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805 (ede192eb82bfc0004a955d57765ab02298e687b613b5a4b2bdca9bef97898869)\nede192eb82bfc0004a955d57765ab02298e687b613b5a4b2bdca9bef97898869\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:54.813 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c422d262-d26f-44f1-878d-e89b49800fc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:54.814 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d00de8e-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:54 np0005465988 kernel: tap6d00de8e-20: left promiscuous mode
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:54.834 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[88ac740a-c999-4b94-a009-f1713a6700e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:54.861 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fd041093-e683-4653-9c92-bf5d913d07c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:54.863 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b342465f-3964-49ba-8634-8c6f993ed79d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:54.882 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a51bd1-14e6-4502-b85c-edd68e1e5d3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527146, 'reachable_time': 17221, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261310, 'error': None, 'target': 'ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:54 np0005465988 systemd[1]: run-netns-ovnmeta\x2d6d00de8e\x2d203c\x2d4e94\x2db60f\x2d36ba9ccef805.mount: Deactivated successfully.
Oct  2 08:13:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:54.886 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6d00de8e-203c-4e94-b60f-36ba9ccef805 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:13:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:13:54.887 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[a3fc2214-7bc9-4da7-92ac-913a9ea13cb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.896 2 DEBUG nova.compute.manager [req-0a186cdd-b4e9-42bd-bdf4-b3d508e64139 req-b58ed03b-7607-4359-98ee-f41d2babe593 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Received event network-vif-unplugged-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.897 2 DEBUG oslo_concurrency.lockutils [req-0a186cdd-b4e9-42bd-bdf4-b3d508e64139 req-b58ed03b-7607-4359-98ee-f41d2babe593 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.897 2 DEBUG oslo_concurrency.lockutils [req-0a186cdd-b4e9-42bd-bdf4-b3d508e64139 req-b58ed03b-7607-4359-98ee-f41d2babe593 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.898 2 DEBUG oslo_concurrency.lockutils [req-0a186cdd-b4e9-42bd-bdf4-b3d508e64139 req-b58ed03b-7607-4359-98ee-f41d2babe593 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.898 2 DEBUG nova.compute.manager [req-0a186cdd-b4e9-42bd-bdf4-b3d508e64139 req-b58ed03b-7607-4359-98ee-f41d2babe593 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] No waiting events found dispatching network-vif-unplugged-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:54 np0005465988 nova_compute[236126]: 2025-10-02 12:13:54.898 2 DEBUG nova.compute.manager [req-0a186cdd-b4e9-42bd-bdf4-b3d508e64139 req-b58ed03b-7607-4359-98ee-f41d2babe593 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Received event network-vif-unplugged-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:13:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:13:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2933500776' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:13:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:13:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2933500776' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:13:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:56.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:56.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:57 np0005465988 nova_compute[236126]: 2025-10-02 12:13:57.032 2 DEBUG nova.compute.manager [req-1bd0e0f9-5817-4c21-8a07-68bba3155c25 req-6a25e939-ac03-4355-8313-b4b2e9c839e7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Received event network-vif-plugged-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:57 np0005465988 nova_compute[236126]: 2025-10-02 12:13:57.032 2 DEBUG oslo_concurrency.lockutils [req-1bd0e0f9-5817-4c21-8a07-68bba3155c25 req-6a25e939-ac03-4355-8313-b4b2e9c839e7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:57 np0005465988 nova_compute[236126]: 2025-10-02 12:13:57.032 2 DEBUG oslo_concurrency.lockutils [req-1bd0e0f9-5817-4c21-8a07-68bba3155c25 req-6a25e939-ac03-4355-8313-b4b2e9c839e7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:57 np0005465988 nova_compute[236126]: 2025-10-02 12:13:57.033 2 DEBUG oslo_concurrency.lockutils [req-1bd0e0f9-5817-4c21-8a07-68bba3155c25 req-6a25e939-ac03-4355-8313-b4b2e9c839e7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:57 np0005465988 nova_compute[236126]: 2025-10-02 12:13:57.033 2 DEBUG nova.compute.manager [req-1bd0e0f9-5817-4c21-8a07-68bba3155c25 req-6a25e939-ac03-4355-8313-b4b2e9c839e7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] No waiting events found dispatching network-vif-plugged-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:57 np0005465988 nova_compute[236126]: 2025-10-02 12:13:57.033 2 WARNING nova.compute.manager [req-1bd0e0f9-5817-4c21-8a07-68bba3155c25 req-6a25e939-ac03-4355-8313-b4b2e9c839e7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Received unexpected event network-vif-plugged-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:13:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:13:57 np0005465988 nova_compute[236126]: 2025-10-02 12:13:57.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:58 np0005465988 nova_compute[236126]: 2025-10-02 12:13:58.297 2 INFO nova.virt.libvirt.driver [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Deleting instance files /var/lib/nova/instances/8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_del#033[00m
Oct  2 08:13:58 np0005465988 nova_compute[236126]: 2025-10-02 12:13:58.298 2 INFO nova.virt.libvirt.driver [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Deletion of /var/lib/nova/instances/8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09_del complete#033[00m
Oct  2 08:13:58 np0005465988 nova_compute[236126]: 2025-10-02 12:13:58.357 2 INFO nova.compute.manager [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Took 4.70 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:13:58 np0005465988 nova_compute[236126]: 2025-10-02 12:13:58.358 2 DEBUG oslo.service.loopingcall [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:13:58 np0005465988 nova_compute[236126]: 2025-10-02 12:13:58.359 2 DEBUG nova.compute.manager [-] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:13:58 np0005465988 nova_compute[236126]: 2025-10-02 12:13:58.359 2 DEBUG nova.network.neutron [-] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:13:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:13:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:13:58.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:13:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e208 e208: 3 total, 3 up, 3 in
Oct  2 08:13:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:13:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:13:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:13:58.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:13:59 np0005465988 nova_compute[236126]: 2025-10-02 12:13:59.049 2 DEBUG nova.network.neutron [-] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:59 np0005465988 nova_compute[236126]: 2025-10-02 12:13:59.074 2 INFO nova.compute.manager [-] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Took 0.71 seconds to deallocate network for instance.#033[00m
Oct  2 08:13:59 np0005465988 nova_compute[236126]: 2025-10-02 12:13:59.119 2 DEBUG nova.compute.manager [req-0b618175-e815-4f68-9bb3-e4f3349c1ff2 req-a0125f2d-ca02-406d-87e6-4bdc41b5e1c6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Received event network-vif-deleted-a9dcd32c-a9d6-4e8e-9304-0e76a8f89d57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:59 np0005465988 nova_compute[236126]: 2025-10-02 12:13:59.140 2 DEBUG oslo_concurrency.lockutils [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:59 np0005465988 nova_compute[236126]: 2025-10-02 12:13:59.141 2 DEBUG oslo_concurrency.lockutils [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:59 np0005465988 nova_compute[236126]: 2025-10-02 12:13:59.305 2 DEBUG oslo_concurrency.processutils [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:59 np0005465988 nova_compute[236126]: 2025-10-02 12:13:59.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:13:59 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3227020014' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:13:59 np0005465988 nova_compute[236126]: 2025-10-02 12:13:59.776 2 DEBUG oslo_concurrency.processutils [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:59 np0005465988 nova_compute[236126]: 2025-10-02 12:13:59.783 2 DEBUG nova.compute.provider_tree [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:13:59 np0005465988 nova_compute[236126]: 2025-10-02 12:13:59.800 2 DEBUG nova.scheduler.client.report [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:13:59 np0005465988 nova_compute[236126]: 2025-10-02 12:13:59.823 2 DEBUG oslo_concurrency.lockutils [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:59 np0005465988 nova_compute[236126]: 2025-10-02 12:13:59.844 2 INFO nova.scheduler.client.report [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Deleted allocations for instance 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09#033[00m
Oct  2 08:13:59 np0005465988 nova_compute[236126]: 2025-10-02 12:13:59.932 2 DEBUG oslo_concurrency.lockutils [None req-6587a586-74ac-4fbe-b71c-ac1baacaa15e 0df47040f1ff4ce69a6fbdfd9eba4955 55d20ae21b6d4f0abfff3bccc371ee7a - - default default] Lock "8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:00.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:00.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:14:02 np0005465988 nova_compute[236126]: 2025-10-02 12:14:02.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:02.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:02.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:04.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:04.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:04 np0005465988 nova_compute[236126]: 2025-10-02 12:14:04.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:05 np0005465988 nova_compute[236126]: 2025-10-02 12:14:05.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:06.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 08:14:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2401.0 total, 600.0 interval#012Cumulative writes: 25K writes, 104K keys, 25K commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.04 MB/s#012Cumulative WAL: 25K writes, 8811 syncs, 2.91 writes per sync, written: 0.10 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 13K writes, 52K keys, 13K commit groups, 1.0 writes per commit group, ingest: 56.05 MB, 0.09 MB/s#012Interval WAL: 13K writes, 5353 syncs, 2.54 writes per sync, written: 0.05 GB, 0.09 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct  2 08:14:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:06.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:14:07 np0005465988 nova_compute[236126]: 2025-10-02 12:14:07.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:14:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:08.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:14:08 np0005465988 podman[261390]: 2025-10-02 12:14:08.566249372 +0000 UTC m=+0.093690384 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:14:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:08.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:09 np0005465988 nova_compute[236126]: 2025-10-02 12:14:09.699 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407234.6979196, 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:09 np0005465988 nova_compute[236126]: 2025-10-02 12:14:09.700 2 INFO nova.compute.manager [-] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:14:09 np0005465988 nova_compute[236126]: 2025-10-02 12:14:09.727 2 DEBUG nova.compute.manager [None req-b2a174c0-73c6-4843-8983-e10c8d393069 - - - - - -] [instance: 8af0743c-0b6a-4e74-a2c7-f8b4ccd83d09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:09 np0005465988 nova_compute[236126]: 2025-10-02 12:14:09.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:10.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:10.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:11 np0005465988 nova_compute[236126]: 2025-10-02 12:14:11.587 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:11 np0005465988 nova_compute[236126]: 2025-10-02 12:14:11.588 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:11 np0005465988 nova_compute[236126]: 2025-10-02 12:14:11.607 2 DEBUG nova.compute.manager [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:14:11 np0005465988 nova_compute[236126]: 2025-10-02 12:14:11.694 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:11 np0005465988 nova_compute[236126]: 2025-10-02 12:14:11.695 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:11 np0005465988 nova_compute[236126]: 2025-10-02 12:14:11.704 2 DEBUG nova.virt.hardware [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:14:11 np0005465988 nova_compute[236126]: 2025-10-02 12:14:11.705 2 INFO nova.compute.claims [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:14:11 np0005465988 nova_compute[236126]: 2025-10-02 12:14:11.814 2 DEBUG oslo_concurrency.processutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:14:12 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3971733623' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:14:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.300 2 DEBUG oslo_concurrency.processutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.307 2 DEBUG nova.compute.provider_tree [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.325 2 DEBUG nova.scheduler.client.report [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.368 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.369 2 DEBUG nova.compute.manager [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.428 2 DEBUG nova.compute.manager [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.429 2 DEBUG nova.network.neutron [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.448 2 INFO nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.467 2 DEBUG nova.compute.manager [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:14:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:12.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.575 2 DEBUG nova.compute.manager [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.577 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.578 2 INFO nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Creating image(s)
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.605 2 DEBUG nova.storage.rbd_utils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] rbd image 612f24ca-960e-4fef-81d6-aefbd0d68bf5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.635 2 DEBUG nova.storage.rbd_utils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] rbd image 612f24ca-960e-4fef-81d6-aefbd0d68bf5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.665 2 DEBUG nova.storage.rbd_utils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] rbd image 612f24ca-960e-4fef-81d6-aefbd0d68bf5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.669 2 DEBUG oslo_concurrency.processutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.692 2 DEBUG nova.policy [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd107dd863d2e4a56853a0b758cb2c110', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ba87091f122a4afabe0a62682078fece', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.730 2 DEBUG oslo_concurrency.processutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.731 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.731 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.732 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:14:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:12.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.758 2 DEBUG nova.storage.rbd_utils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] rbd image 612f24ca-960e-4fef-81d6-aefbd0d68bf5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:14:12 np0005465988 nova_compute[236126]: 2025-10-02 12:14:12.762 2 DEBUG oslo_concurrency.processutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 612f24ca-960e-4fef-81d6-aefbd0d68bf5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:14:13 np0005465988 nova_compute[236126]: 2025-10-02 12:14:13.267 2 DEBUG oslo_concurrency.processutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 612f24ca-960e-4fef-81d6-aefbd0d68bf5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:14:13 np0005465988 nova_compute[236126]: 2025-10-02 12:14:13.352 2 DEBUG nova.storage.rbd_utils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] resizing rbd image 612f24ca-960e-4fef-81d6-aefbd0d68bf5_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct  2 08:14:13 np0005465988 nova_compute[236126]: 2025-10-02 12:14:13.479 2 DEBUG nova.objects.instance [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lazy-loading 'migration_context' on Instance uuid 612f24ca-960e-4fef-81d6-aefbd0d68bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:14:13 np0005465988 nova_compute[236126]: 2025-10-02 12:14:13.497 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:14:13 np0005465988 nova_compute[236126]: 2025-10-02 12:14:13.498 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Ensure instance console log exists: /var/lib/nova/instances/612f24ca-960e-4fef-81d6-aefbd0d68bf5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:14:13 np0005465988 nova_compute[236126]: 2025-10-02 12:14:13.499 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:14:13 np0005465988 nova_compute[236126]: 2025-10-02 12:14:13.499 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:14:13 np0005465988 nova_compute[236126]: 2025-10-02 12:14:13.499 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:14:13 np0005465988 nova_compute[236126]: 2025-10-02 12:14:13.720 2 DEBUG nova.network.neutron [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Successfully created port: 9bb43112-c321-4894-a7e6-6e7cdbb47eb5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:14:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:14.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:14:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:14.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:14:14 np0005465988 nova_compute[236126]: 2025-10-02 12:14:14.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:14:14 np0005465988 nova_compute[236126]: 2025-10-02 12:14:14.997 2 DEBUG nova.network.neutron [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Successfully created port: 8efa0909-383b-4c50-82de-99064aa6894d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:14:16 np0005465988 nova_compute[236126]: 2025-10-02 12:14:16.112 2 DEBUG nova.network.neutron [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Successfully created port: f57ded32-4a17-4f1b-b0eb-06069110bc4c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:14:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:16.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:16.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:14:17 np0005465988 nova_compute[236126]: 2025-10-02 12:14:17.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:14:18 np0005465988 nova_compute[236126]: 2025-10-02 12:14:18.271 2 DEBUG nova.network.neutron [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Successfully updated port: 9bb43112-c321-4894-a7e6-6e7cdbb47eb5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:14:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:18.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:18.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:19 np0005465988 nova_compute[236126]: 2025-10-02 12:14:19.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:14:19 np0005465988 nova_compute[236126]: 2025-10-02 12:14:19.940 2 DEBUG nova.network.neutron [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Successfully updated port: 8efa0909-383b-4c50-82de-99064aa6894d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:14:20 np0005465988 nova_compute[236126]: 2025-10-02 12:14:20.507 2 DEBUG nova.compute.manager [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-changed-9bb43112-c321-4894-a7e6-6e7cdbb47eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:14:20 np0005465988 nova_compute[236126]: 2025-10-02 12:14:20.508 2 DEBUG nova.compute.manager [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Refreshing instance network info cache due to event network-changed-9bb43112-c321-4894-a7e6-6e7cdbb47eb5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:14:20 np0005465988 nova_compute[236126]: 2025-10-02 12:14:20.508 2 DEBUG oslo_concurrency.lockutils [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-612f24ca-960e-4fef-81d6-aefbd0d68bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:14:20 np0005465988 nova_compute[236126]: 2025-10-02 12:14:20.508 2 DEBUG oslo_concurrency.lockutils [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-612f24ca-960e-4fef-81d6-aefbd0d68bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:14:20 np0005465988 nova_compute[236126]: 2025-10-02 12:14:20.508 2 DEBUG nova.network.neutron [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Refreshing network info cache for port 9bb43112-c321-4894-a7e6-6e7cdbb47eb5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:14:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:20.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:20.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:20 np0005465988 nova_compute[236126]: 2025-10-02 12:14:20.760 2 DEBUG nova.network.neutron [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:14:21 np0005465988 nova_compute[236126]: 2025-10-02 12:14:21.069 2 DEBUG nova.network.neutron [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Successfully updated port: f57ded32-4a17-4f1b-b0eb-06069110bc4c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:14:21 np0005465988 nova_compute[236126]: 2025-10-02 12:14:21.114 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "refresh_cache-612f24ca-960e-4fef-81d6-aefbd0d68bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:14:21 np0005465988 nova_compute[236126]: 2025-10-02 12:14:21.338 2 DEBUG nova.network.neutron [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:14:21 np0005465988 nova_compute[236126]: 2025-10-02 12:14:21.367 2 DEBUG oslo_concurrency.lockutils [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-612f24ca-960e-4fef-81d6-aefbd0d68bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:14:21 np0005465988 nova_compute[236126]: 2025-10-02 12:14:21.368 2 DEBUG nova.compute.manager [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-changed-8efa0909-383b-4c50-82de-99064aa6894d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:14:21 np0005465988 nova_compute[236126]: 2025-10-02 12:14:21.369 2 DEBUG nova.compute.manager [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Refreshing instance network info cache due to event network-changed-8efa0909-383b-4c50-82de-99064aa6894d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:14:21 np0005465988 nova_compute[236126]: 2025-10-02 12:14:21.369 2 DEBUG oslo_concurrency.lockutils [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-612f24ca-960e-4fef-81d6-aefbd0d68bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:14:21 np0005465988 nova_compute[236126]: 2025-10-02 12:14:21.370 2 DEBUG oslo_concurrency.lockutils [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-612f24ca-960e-4fef-81d6-aefbd0d68bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:14:21 np0005465988 nova_compute[236126]: 2025-10-02 12:14:21.370 2 DEBUG nova.network.neutron [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Refreshing network info cache for port 8efa0909-383b-4c50-82de-99064aa6894d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:14:21 np0005465988 nova_compute[236126]: 2025-10-02 12:14:21.871 2 DEBUG nova.network.neutron [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:14:22 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:14:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:14:22 np0005465988 nova_compute[236126]: 2025-10-02 12:14:22.296 2 DEBUG nova.network.neutron [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:14:22 np0005465988 nova_compute[236126]: 2025-10-02 12:14:22.345 2 DEBUG oslo_concurrency.lockutils [req-07fdd191-5b7c-4c45-9a5e-0e2097ac5039 req-50eadd96-fa32-475d-a445-50c414b30b5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-612f24ca-960e-4fef-81d6-aefbd0d68bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:14:22 np0005465988 nova_compute[236126]: 2025-10-02 12:14:22.346 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquired lock "refresh_cache-612f24ca-960e-4fef-81d6-aefbd0d68bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:14:22 np0005465988 nova_compute[236126]: 2025-10-02 12:14:22.346 2 DEBUG nova.network.neutron [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:14:22 np0005465988 nova_compute[236126]: 2025-10-02 12:14:22.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:14:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:22.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:22 np0005465988 nova_compute[236126]: 2025-10-02 12:14:22.651 2 DEBUG nova.compute.manager [req-e9964445-4921-421e-a398-d11dc77dd7ea req-f54c57b2-9c7c-4946-858e-1d89b3c6795a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-changed-f57ded32-4a17-4f1b-b0eb-06069110bc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:14:22 np0005465988 nova_compute[236126]: 2025-10-02 12:14:22.652 2 DEBUG nova.compute.manager [req-e9964445-4921-421e-a398-d11dc77dd7ea req-f54c57b2-9c7c-4946-858e-1d89b3c6795a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Refreshing instance network info cache due to event network-changed-f57ded32-4a17-4f1b-b0eb-06069110bc4c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:14:22 np0005465988 nova_compute[236126]: 2025-10-02 12:14:22.652 2 DEBUG oslo_concurrency.lockutils [req-e9964445-4921-421e-a398-d11dc77dd7ea req-f54c57b2-9c7c-4946-858e-1d89b3c6795a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-612f24ca-960e-4fef-81d6-aefbd0d68bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:14:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:22.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:22 np0005465988 nova_compute[236126]: 2025-10-02 12:14:22.860 2 DEBUG nova.network.neutron [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:14:24 np0005465988 podman[261659]: 2025-10-02 12:14:24.557833954 +0000 UTC m=+0.083748244 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, io.buildah.version=1.41.3)
Oct  2 08:14:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:24.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:24 np0005465988 podman[261658]: 2025-10-02 12:14:24.575865737 +0000 UTC m=+0.114064244 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:14:24 np0005465988 podman[261660]: 2025-10-02 12:14:24.576175917 +0000 UTC m=+0.096057432 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:14:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:24.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:24 np0005465988 nova_compute[236126]: 2025-10-02 12:14:24.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.081 2 DEBUG nova.network.neutron [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Updating instance_info_cache with network_info: [{"id": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "address": "fa:16:3e:41:7a:fb", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb43112-c3", "ovs_interfaceid": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8efa0909-383b-4c50-82de-99064aa6894d", "address": "fa:16:3e:8a:59:4d", "network": {"id": "6e4fcc17-40ff-4975-9ebb-75ca0833226a", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1713615272", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8efa0909-38", "ovs_interfaceid": "8efa0909-383b-4c50-82de-99064aa6894d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "address": "fa:16:3e:77:d3:ba", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf57ded32-4a", "ovs_interfaceid": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.138 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Releasing lock "refresh_cache-612f24ca-960e-4fef-81d6-aefbd0d68bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.138 2 DEBUG nova.compute.manager [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Instance network_info: |[{"id": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "address": "fa:16:3e:41:7a:fb", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb43112-c3", "ovs_interfaceid": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8efa0909-383b-4c50-82de-99064aa6894d", "address": "fa:16:3e:8a:59:4d", "network": {"id": "6e4fcc17-40ff-4975-9ebb-75ca0833226a", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1713615272", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": 
"ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8efa0909-38", "ovs_interfaceid": "8efa0909-383b-4c50-82de-99064aa6894d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "address": "fa:16:3e:77:d3:ba", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf57ded32-4a", "ovs_interfaceid": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.139 2 DEBUG oslo_concurrency.lockutils [req-e9964445-4921-421e-a398-d11dc77dd7ea req-f54c57b2-9c7c-4946-858e-1d89b3c6795a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-612f24ca-960e-4fef-81d6-aefbd0d68bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.139 2 DEBUG nova.network.neutron [req-e9964445-4921-421e-a398-d11dc77dd7ea req-f54c57b2-9c7c-4946-858e-1d89b3c6795a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Refreshing network info cache for port f57ded32-4a17-4f1b-b0eb-06069110bc4c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.145 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Start _get_guest_xml network_info=[{"id": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "address": "fa:16:3e:41:7a:fb", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb43112-c3", "ovs_interfaceid": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8efa0909-383b-4c50-82de-99064aa6894d", "address": "fa:16:3e:8a:59:4d", "network": {"id": "6e4fcc17-40ff-4975-9ebb-75ca0833226a", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1713615272", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8efa0909-38", "ovs_interfaceid": "8efa0909-383b-4c50-82de-99064aa6894d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "address": "fa:16:3e:77:d3:ba", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf57ded32-4a", "ovs_interfaceid": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.152 2 WARNING nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.159 2 DEBUG nova.virt.libvirt.host [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.159 2 DEBUG nova.virt.libvirt.host [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.165 2 DEBUG nova.virt.libvirt.host [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.165 2 DEBUG nova.virt.libvirt.host [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.167 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.167 2 DEBUG nova.virt.hardware [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.168 2 DEBUG nova.virt.hardware [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.168 2 DEBUG nova.virt.hardware [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.169 2 DEBUG nova.virt.hardware [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.169 2 DEBUG nova.virt.hardware [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.169 2 DEBUG nova.virt.hardware [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.170 2 DEBUG nova.virt.hardware [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.170 2 DEBUG nova.virt.hardware [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.170 2 DEBUG nova.virt.hardware [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.171 2 DEBUG nova.virt.hardware [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.171 2 DEBUG nova.virt.hardware [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.174 2 DEBUG oslo_concurrency.processutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:14:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:26.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:14:26 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1707467713' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.634 2 DEBUG oslo_concurrency.processutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.679 2 DEBUG nova.storage.rbd_utils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] rbd image 612f24ca-960e-4fef-81d6-aefbd0d68bf5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:14:26 np0005465988 nova_compute[236126]: 2025-10-02 12:14:26.684 2 DEBUG oslo_concurrency.processutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:14:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:26.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:14:27 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4252505577' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.135 2 DEBUG oslo_concurrency.processutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.138 2 DEBUG nova.virt.libvirt.vif [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-601760978',display_name='tempest-ServersTestMultiNic-server-601760978',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-601760978',id=62,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-li7f2xvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-1178748303-proj
ect-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:12Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=612f24ca-960e-4fef-81d6-aefbd0d68bf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "address": "fa:16:3e:41:7a:fb", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb43112-c3", "ovs_interfaceid": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.138 2 DEBUG nova.network.os_vif_util [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "address": "fa:16:3e:41:7a:fb", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb43112-c3", "ovs_interfaceid": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.140 2 DEBUG nova.network.os_vif_util [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:7a:fb,bridge_name='br-int',has_traffic_filtering=True,id=9bb43112-c321-4894-a7e6-6e7cdbb47eb5,network=Network(66508b86-a2fe-4ff7-9652-19f5686c951d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb43112-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.141 2 DEBUG nova.virt.libvirt.vif [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-601760978',display_name='tempest-ServersTestMultiNic-server-601760978',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-601760978',id=62,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-li7f2xvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-1178748303-proj
ect-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:12Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=612f24ca-960e-4fef-81d6-aefbd0d68bf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8efa0909-383b-4c50-82de-99064aa6894d", "address": "fa:16:3e:8a:59:4d", "network": {"id": "6e4fcc17-40ff-4975-9ebb-75ca0833226a", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1713615272", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8efa0909-38", "ovs_interfaceid": "8efa0909-383b-4c50-82de-99064aa6894d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.142 2 DEBUG nova.network.os_vif_util [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "8efa0909-383b-4c50-82de-99064aa6894d", "address": "fa:16:3e:8a:59:4d", "network": {"id": "6e4fcc17-40ff-4975-9ebb-75ca0833226a", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1713615272", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8efa0909-38", "ovs_interfaceid": "8efa0909-383b-4c50-82de-99064aa6894d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.143 2 DEBUG nova.network.os_vif_util [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:59:4d,bridge_name='br-int',has_traffic_filtering=True,id=8efa0909-383b-4c50-82de-99064aa6894d,network=Network(6e4fcc17-40ff-4975-9ebb-75ca0833226a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8efa0909-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.144 2 DEBUG nova.virt.libvirt.vif [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-601760978',display_name='tempest-ServersTestMultiNic-server-601760978',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-601760978',id=62,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-li7f2xvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-1178748303-proj
ect-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:12Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=612f24ca-960e-4fef-81d6-aefbd0d68bf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "address": "fa:16:3e:77:d3:ba", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf57ded32-4a", "ovs_interfaceid": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.144 2 DEBUG nova.network.os_vif_util [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "address": "fa:16:3e:77:d3:ba", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf57ded32-4a", "ovs_interfaceid": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.145 2 DEBUG nova.network.os_vif_util [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:d3:ba,bridge_name='br-int',has_traffic_filtering=True,id=f57ded32-4a17-4f1b-b0eb-06069110bc4c,network=Network(66508b86-a2fe-4ff7-9652-19f5686c951d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf57ded32-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.147 2 DEBUG nova.objects.instance [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lazy-loading 'pci_devices' on Instance uuid 612f24ca-960e-4fef-81d6-aefbd0d68bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.162 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  <uuid>612f24ca-960e-4fef-81d6-aefbd0d68bf5</uuid>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  <name>instance-0000003e</name>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServersTestMultiNic-server-601760978</nova:name>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:14:26</nova:creationTime>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <nova:user uuid="d107dd863d2e4a56853a0b758cb2c110">tempest-ServersTestMultiNic-1178748303-project-member</nova:user>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <nova:project uuid="ba87091f122a4afabe0a62682078fece">tempest-ServersTestMultiNic-1178748303</nova:project>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <nova:port uuid="9bb43112-c321-4894-a7e6-6e7cdbb47eb5">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.70" ipVersion="4"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <nova:port uuid="8efa0909-383b-4c50-82de-99064aa6894d">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.1.69" ipVersion="4"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <nova:port uuid="f57ded32-4a17-4f1b-b0eb-06069110bc4c">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.177" ipVersion="4"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <entry name="serial">612f24ca-960e-4fef-81d6-aefbd0d68bf5</entry>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <entry name="uuid">612f24ca-960e-4fef-81d6-aefbd0d68bf5</entry>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/612f24ca-960e-4fef-81d6-aefbd0d68bf5_disk">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/612f24ca-960e-4fef-81d6-aefbd0d68bf5_disk.config">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:41:7a:fb"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <target dev="tap9bb43112-c3"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:8a:59:4d"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <target dev="tap8efa0909-38"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:77:d3:ba"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <target dev="tapf57ded32-4a"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/612f24ca-960e-4fef-81d6-aefbd0d68bf5/console.log" append="off"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:14:27 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:14:27 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:14:27 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:14:27 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.164 2 DEBUG nova.compute.manager [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Preparing to wait for external event network-vif-plugged-9bb43112-c321-4894-a7e6-6e7cdbb47eb5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.165 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.166 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.166 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.167 2 DEBUG nova.compute.manager [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Preparing to wait for external event network-vif-plugged-8efa0909-383b-4c50-82de-99064aa6894d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.167 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.168 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.168 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.169 2 DEBUG nova.compute.manager [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Preparing to wait for external event network-vif-plugged-f57ded32-4a17-4f1b-b0eb-06069110bc4c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.169 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.170 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.170 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.172 2 DEBUG nova.virt.libvirt.vif [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-601760978',display_name='tempest-ServersTestMultiNic-server-601760978',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-601760978',id=62,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-li7f2xvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-11787
48303-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:12Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=612f24ca-960e-4fef-81d6-aefbd0d68bf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "address": "fa:16:3e:41:7a:fb", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb43112-c3", "ovs_interfaceid": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.173 2 DEBUG nova.network.os_vif_util [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "address": "fa:16:3e:41:7a:fb", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb43112-c3", "ovs_interfaceid": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.174 2 DEBUG nova.network.os_vif_util [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:7a:fb,bridge_name='br-int',has_traffic_filtering=True,id=9bb43112-c321-4894-a7e6-6e7cdbb47eb5,network=Network(66508b86-a2fe-4ff7-9652-19f5686c951d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb43112-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.175 2 DEBUG os_vif [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:7a:fb,bridge_name='br-int',has_traffic_filtering=True,id=9bb43112-c321-4894-a7e6-6e7cdbb47eb5,network=Network(66508b86-a2fe-4ff7-9652-19f5686c951d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb43112-c3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.177 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.178 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.185 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bb43112-c3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.186 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9bb43112-c3, col_values=(('external_ids', {'iface-id': '9bb43112-c321-4894-a7e6-6e7cdbb47eb5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:41:7a:fb', 'vm-uuid': '612f24ca-960e-4fef-81d6-aefbd0d68bf5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:27 np0005465988 NetworkManager[45041]: <info>  [1759407267.1899] manager: (tap9bb43112-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.198 2 INFO os_vif [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:7a:fb,bridge_name='br-int',has_traffic_filtering=True,id=9bb43112-c321-4894-a7e6-6e7cdbb47eb5,network=Network(66508b86-a2fe-4ff7-9652-19f5686c951d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb43112-c3')#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.199 2 DEBUG nova.virt.libvirt.vif [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-601760978',display_name='tempest-ServersTestMultiNic-server-601760978',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-601760978',id=62,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-li7f2xvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-11787
48303-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:12Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=612f24ca-960e-4fef-81d6-aefbd0d68bf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8efa0909-383b-4c50-82de-99064aa6894d", "address": "fa:16:3e:8a:59:4d", "network": {"id": "6e4fcc17-40ff-4975-9ebb-75ca0833226a", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1713615272", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8efa0909-38", "ovs_interfaceid": "8efa0909-383b-4c50-82de-99064aa6894d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.200 2 DEBUG nova.network.os_vif_util [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "8efa0909-383b-4c50-82de-99064aa6894d", "address": "fa:16:3e:8a:59:4d", "network": {"id": "6e4fcc17-40ff-4975-9ebb-75ca0833226a", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1713615272", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8efa0909-38", "ovs_interfaceid": "8efa0909-383b-4c50-82de-99064aa6894d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.201 2 DEBUG nova.network.os_vif_util [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:59:4d,bridge_name='br-int',has_traffic_filtering=True,id=8efa0909-383b-4c50-82de-99064aa6894d,network=Network(6e4fcc17-40ff-4975-9ebb-75ca0833226a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8efa0909-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.202 2 DEBUG os_vif [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:59:4d,bridge_name='br-int',has_traffic_filtering=True,id=8efa0909-383b-4c50-82de-99064aa6894d,network=Network(6e4fcc17-40ff-4975-9ebb-75ca0833226a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8efa0909-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.203 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.203 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.207 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8efa0909-38, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.208 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8efa0909-38, col_values=(('external_ids', {'iface-id': '8efa0909-383b-4c50-82de-99064aa6894d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8a:59:4d', 'vm-uuid': '612f24ca-960e-4fef-81d6-aefbd0d68bf5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:27 np0005465988 NetworkManager[45041]: <info>  [1759407267.2110] manager: (tap8efa0909-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.221 2 INFO os_vif [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:59:4d,bridge_name='br-int',has_traffic_filtering=True,id=8efa0909-383b-4c50-82de-99064aa6894d,network=Network(6e4fcc17-40ff-4975-9ebb-75ca0833226a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8efa0909-38')#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.222 2 DEBUG nova.virt.libvirt.vif [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-601760978',display_name='tempest-ServersTestMultiNic-server-601760978',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-601760978',id=62,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-li7f2xvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-11787
48303-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:12Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=612f24ca-960e-4fef-81d6-aefbd0d68bf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "address": "fa:16:3e:77:d3:ba", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf57ded32-4a", "ovs_interfaceid": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.223 2 DEBUG nova.network.os_vif_util [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "address": "fa:16:3e:77:d3:ba", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf57ded32-4a", "ovs_interfaceid": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.224 2 DEBUG nova.network.os_vif_util [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:d3:ba,bridge_name='br-int',has_traffic_filtering=True,id=f57ded32-4a17-4f1b-b0eb-06069110bc4c,network=Network(66508b86-a2fe-4ff7-9652-19f5686c951d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf57ded32-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.224 2 DEBUG os_vif [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:d3:ba,bridge_name='br-int',has_traffic_filtering=True,id=f57ded32-4a17-4f1b-b0eb-06069110bc4c,network=Network(66508b86-a2fe-4ff7-9652-19f5686c951d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf57ded32-4a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.225 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.226 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.229 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf57ded32-4a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.230 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf57ded32-4a, col_values=(('external_ids', {'iface-id': 'f57ded32-4a17-4f1b-b0eb-06069110bc4c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:d3:ba', 'vm-uuid': '612f24ca-960e-4fef-81d6-aefbd0d68bf5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:27 np0005465988 NetworkManager[45041]: <info>  [1759407267.2328] manager: (tapf57ded32-4a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.249 2 INFO os_vif [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:d3:ba,bridge_name='br-int',has_traffic_filtering=True,id=f57ded32-4a17-4f1b-b0eb-06069110bc4c,network=Network(66508b86-a2fe-4ff7-9652-19f5686c951d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf57ded32-4a')#033[00m
Oct  2 08:14:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.327 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.328 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.328 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] No VIF found with MAC fa:16:3e:41:7a:fb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.328 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] No VIF found with MAC fa:16:3e:8a:59:4d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.329 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] No VIF found with MAC fa:16:3e:77:d3:ba, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.330 2 INFO nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Using config drive#033[00m
Oct  2 08:14:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:27.340 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:27.342 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:27.342 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.369 2 DEBUG nova.storage.rbd_utils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] rbd image 612f24ca-960e-4fef-81d6-aefbd0d68bf5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:27.519 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:27.520 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.773 2 INFO nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Creating config drive at /var/lib/nova/instances/612f24ca-960e-4fef-81d6-aefbd0d68bf5/disk.config#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.782 2 DEBUG oslo_concurrency.processutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/612f24ca-960e-4fef-81d6-aefbd0d68bf5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpviqk34ws execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.838 2 DEBUG nova.network.neutron [req-e9964445-4921-421e-a398-d11dc77dd7ea req-f54c57b2-9c7c-4946-858e-1d89b3c6795a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Updated VIF entry in instance network info cache for port f57ded32-4a17-4f1b-b0eb-06069110bc4c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.839 2 DEBUG nova.network.neutron [req-e9964445-4921-421e-a398-d11dc77dd7ea req-f54c57b2-9c7c-4946-858e-1d89b3c6795a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Updating instance_info_cache with network_info: [{"id": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "address": "fa:16:3e:41:7a:fb", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb43112-c3", "ovs_interfaceid": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8efa0909-383b-4c50-82de-99064aa6894d", "address": "fa:16:3e:8a:59:4d", "network": {"id": "6e4fcc17-40ff-4975-9ebb-75ca0833226a", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1713615272", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8efa0909-38", "ovs_interfaceid": "8efa0909-383b-4c50-82de-99064aa6894d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "address": "fa:16:3e:77:d3:ba", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf57ded32-4a", "ovs_interfaceid": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.861 2 DEBUG oslo_concurrency.lockutils [req-e9964445-4921-421e-a398-d11dc77dd7ea req-f54c57b2-9c7c-4946-858e-1d89b3c6795a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-612f24ca-960e-4fef-81d6-aefbd0d68bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.930 2 DEBUG oslo_concurrency.processutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/612f24ca-960e-4fef-81d6-aefbd0d68bf5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpviqk34ws" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.982 2 DEBUG nova.storage.rbd_utils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] rbd image 612f24ca-960e-4fef-81d6-aefbd0d68bf5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:14:27 np0005465988 nova_compute[236126]: 2025-10-02 12:14:27.988 2 DEBUG oslo_concurrency.processutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/612f24ca-960e-4fef-81d6-aefbd0d68bf5/disk.config 612f24ca-960e-4fef-81d6-aefbd0d68bf5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.289 2 DEBUG oslo_concurrency.processutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/612f24ca-960e-4fef-81d6-aefbd0d68bf5/disk.config 612f24ca-960e-4fef-81d6-aefbd0d68bf5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.301s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.292 2 INFO nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Deleting local config drive /var/lib/nova/instances/612f24ca-960e-4fef-81d6-aefbd0d68bf5/disk.config because it was imported into RBD.#033[00m
Oct  2 08:14:28 np0005465988 NetworkManager[45041]: <info>  [1759407268.3757] manager: (tap9bb43112-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Oct  2 08:14:28 np0005465988 kernel: tap9bb43112-c3: entered promiscuous mode
Oct  2 08:14:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:28Z|00150|binding|INFO|Claiming lport 9bb43112-c321-4894-a7e6-6e7cdbb47eb5 for this chassis.
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:28Z|00151|binding|INFO|9bb43112-c321-4894-a7e6-6e7cdbb47eb5: Claiming fa:16:3e:41:7a:fb 10.100.0.70
Oct  2 08:14:28 np0005465988 NetworkManager[45041]: <info>  [1759407268.4034] manager: (tap8efa0909-38): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.404 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:7a:fb 10.100.0.70'], port_security=['fa:16:3e:41:7a:fb 10.100.0.70'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.70/24', 'neutron:device_id': '612f24ca-960e-4fef-81d6-aefbd0d68bf5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66508b86-a2fe-4ff7-9652-19f5686c951d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba87091f122a4afabe0a62682078fece', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2e045e2-9997-47d6-8453-aba0f450cdb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff777e6a-8eba-4e87-806f-e65647879b1e, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=9bb43112-c321-4894-a7e6-6e7cdbb47eb5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.406 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 9bb43112-c321-4894-a7e6-6e7cdbb47eb5 in datapath 66508b86-a2fe-4ff7-9652-19f5686c951d bound to our chassis#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.409 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66508b86-a2fe-4ff7-9652-19f5686c951d#033[00m
Oct  2 08:14:28 np0005465988 kernel: tap8efa0909-38: entered promiscuous mode
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.428 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2c44571a-63f0-4965-8d1b-d82517d4272d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.430 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap66508b86-a1 in ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:14:28 np0005465988 NetworkManager[45041]: <info>  [1759407268.4311] manager: (tapf57ded32-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Oct  2 08:14:28 np0005465988 systemd-udevd[261872]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:14:28 np0005465988 systemd-udevd[261873]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.436 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap66508b86-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.436 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e3191066-3f54-4171-a3e8-7d96b69680a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.437 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f2a6d406-91d6-42f7-ad2d-c063e654e201]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 systemd-udevd[261874]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.453 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[1d958667-5a62-457d-81d7-d8400c95a435]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 NetworkManager[45041]: <info>  [1759407268.4718] device (tap8efa0909-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:14:28 np0005465988 NetworkManager[45041]: <info>  [1759407268.4743] device (tap8efa0909-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:14:28 np0005465988 NetworkManager[45041]: <info>  [1759407268.4755] device (tap9bb43112-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:28Z|00152|binding|INFO|Claiming lport 8efa0909-383b-4c50-82de-99064aa6894d for this chassis.
Oct  2 08:14:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:28Z|00153|binding|INFO|8efa0909-383b-4c50-82de-99064aa6894d: Claiming fa:16:3e:8a:59:4d 10.100.1.69
Oct  2 08:14:28 np0005465988 kernel: tapf57ded32-4a: entered promiscuous mode
Oct  2 08:14:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:28Z|00154|if_status|INFO|Not updating pb chassis for f57ded32-4a17-4f1b-b0eb-06069110bc4c now as sb is readonly
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:28 np0005465988 NetworkManager[45041]: <info>  [1759407268.4884] device (tap9bb43112-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:14:28 np0005465988 NetworkManager[45041]: <info>  [1759407268.4888] device (tapf57ded32-4a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:14:28 np0005465988 NetworkManager[45041]: <info>  [1759407268.4896] device (tapf57ded32-4a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:14:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:28Z|00155|binding|INFO|Claiming lport f57ded32-4a17-4f1b-b0eb-06069110bc4c for this chassis.
Oct  2 08:14:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:28Z|00156|binding|INFO|f57ded32-4a17-4f1b-b0eb-06069110bc4c: Claiming fa:16:3e:77:d3:ba 10.100.0.177
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.491 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:59:4d 10.100.1.69'], port_security=['fa:16:3e:8a:59:4d 10.100.1.69'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.69/24', 'neutron:device_id': '612f24ca-960e-4fef-81d6-aefbd0d68bf5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e4fcc17-40ff-4975-9ebb-75ca0833226a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba87091f122a4afabe0a62682078fece', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2e045e2-9997-47d6-8453-aba0f450cdb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57d03c00-c8a7-410c-947f-3ee823d01fd3, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=8efa0909-383b-4c50-82de-99064aa6894d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.491 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6478c7ad-c6f3-44d4-bf02-e450faef2b1b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:28Z|00157|binding|INFO|Setting lport 9bb43112-c321-4894-a7e6-6e7cdbb47eb5 ovn-installed in OVS
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:28Z|00158|binding|INFO|Setting lport 9bb43112-c321-4894-a7e6-6e7cdbb47eb5 up in Southbound
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.503 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:d3:ba 10.100.0.177'], port_security=['fa:16:3e:77:d3:ba 10.100.0.177'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.177/24', 'neutron:device_id': '612f24ca-960e-4fef-81d6-aefbd0d68bf5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66508b86-a2fe-4ff7-9652-19f5686c951d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba87091f122a4afabe0a62682078fece', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2e045e2-9997-47d6-8453-aba0f450cdb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff777e6a-8eba-4e87-806f-e65647879b1e, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f57ded32-4a17-4f1b-b0eb-06069110bc4c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:28 np0005465988 systemd-machined[192594]: New machine qemu-23-instance-0000003e.
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.523 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:28 np0005465988 systemd[1]: Started Virtual Machine qemu-23-instance-0000003e.
Oct  2 08:14:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:28Z|00159|binding|INFO|Setting lport 8efa0909-383b-4c50-82de-99064aa6894d ovn-installed in OVS
Oct  2 08:14:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:28Z|00160|binding|INFO|Setting lport 8efa0909-383b-4c50-82de-99064aa6894d up in Southbound
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.536 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[80397e2d-5fc4-4c4f-be32-e85b9aea9cd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.545 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f309aa5b-45e5-44ee-bc6b-296dcf244792]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 NetworkManager[45041]: <info>  [1759407268.5468] manager: (tap66508b86-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Oct  2 08:14:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:28Z|00161|binding|INFO|Setting lport f57ded32-4a17-4f1b-b0eb-06069110bc4c ovn-installed in OVS
Oct  2 08:14:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:28Z|00162|binding|INFO|Setting lport f57ded32-4a17-4f1b-b0eb-06069110bc4c up in Southbound
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:28.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.595 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ef21d1bb-9b67-4be1-9456-cac85ff0d887]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.599 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d18c2368-735c-431d-961e-e1009d814e38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 NetworkManager[45041]: <info>  [1759407268.6301] device (tap66508b86-a0): carrier: link connected
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.639 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[277b0eab-8c54-4219-9a70-2b0259009d83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.667 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[640286a8-be25-46e8-ae44-e80d73cef08d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66508b86-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:1e:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532896, 'reachable_time': 24249, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261910, 'error': None, 'target': 'ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.696 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[351e70e8-916f-4cf6-b392-e2a6fb89c202]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:1eb2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532896, 'tstamp': 532896}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261911, 'error': None, 'target': 'ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.729 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fe203436-c858-4819-9e16-055d7cd38e31]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66508b86-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:1e:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532896, 'reachable_time': 24249, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 261912, 'error': None, 'target': 'ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:28.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.775 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[96681c68-03d2-4a68-b56d-664d56ff2ba1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.784 2 DEBUG nova.compute.manager [req-ea01bb17-b6e6-4139-b435-64d168c88dd1 req-405853fe-5065-428b-8cc4-3e08390a7b30 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-plugged-8efa0909-383b-4c50-82de-99064aa6894d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.784 2 DEBUG oslo_concurrency.lockutils [req-ea01bb17-b6e6-4139-b435-64d168c88dd1 req-405853fe-5065-428b-8cc4-3e08390a7b30 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.784 2 DEBUG oslo_concurrency.lockutils [req-ea01bb17-b6e6-4139-b435-64d168c88dd1 req-405853fe-5065-428b-8cc4-3e08390a7b30 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.785 2 DEBUG oslo_concurrency.lockutils [req-ea01bb17-b6e6-4139-b435-64d168c88dd1 req-405853fe-5065-428b-8cc4-3e08390a7b30 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.785 2 DEBUG nova.compute.manager [req-ea01bb17-b6e6-4139-b435-64d168c88dd1 req-405853fe-5065-428b-8cc4-3e08390a7b30 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Processing event network-vif-plugged-8efa0909-383b-4c50-82de-99064aa6894d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.894 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[181779f3-43e2-4ce5-ab59-54c79fc2492c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.901 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66508b86-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.902 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.903 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66508b86-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:28 np0005465988 kernel: tap66508b86-a0: entered promiscuous mode
Oct  2 08:14:28 np0005465988 NetworkManager[45041]: <info>  [1759407268.9074] manager: (tap66508b86-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.915 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66508b86-a0, col_values=(('external_ids', {'iface-id': 'd1bbefcc-6af1-4dad-8de6-8a7cacd6a890'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:28Z|00163|binding|INFO|Releasing lport d1bbefcc-6af1-4dad-8de6-8a7cacd6a890 from this chassis (sb_readonly=0)
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:28 np0005465988 nova_compute[236126]: 2025-10-02 12:14:28.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.948 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/66508b86-a2fe-4ff7-9652-19f5686c951d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/66508b86-a2fe-4ff7-9652-19f5686c951d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.949 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[23f485c2-53b3-4b12-9706-90ebea2f52f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.950 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-66508b86-a2fe-4ff7-9652-19f5686c951d
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/66508b86-a2fe-4ff7-9652-19f5686c951d.pid.haproxy
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 66508b86-a2fe-4ff7-9652-19f5686c951d
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:14:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:28.952 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d', 'env', 'PROCESS_TAG=haproxy-66508b86-a2fe-4ff7-9652-19f5686c951d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/66508b86-a2fe-4ff7-9652-19f5686c951d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:14:29 np0005465988 podman[261988]: 2025-10-02 12:14:29.358861494 +0000 UTC m=+0.057700437 container create 40aa2b585fb71247d72f5c7430ef42294e868a9ee0ae03a9c09265ab5835b50c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:14:29 np0005465988 systemd[1]: Started libpod-conmon-40aa2b585fb71247d72f5c7430ef42294e868a9ee0ae03a9c09265ab5835b50c.scope.
Oct  2 08:14:29 np0005465988 podman[261988]: 2025-10-02 12:14:29.324328361 +0000 UTC m=+0.023167344 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:14:29 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:14:29 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0eb8475acefb4453da69dd4c753e34e8bd937f7722c244add0909e2b56551a4a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:14:29 np0005465988 podman[261988]: 2025-10-02 12:14:29.521090897 +0000 UTC m=+0.219929870 container init 40aa2b585fb71247d72f5c7430ef42294e868a9ee0ae03a9c09265ab5835b50c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0)
Oct  2 08:14:29 np0005465988 podman[261988]: 2025-10-02 12:14:29.52944357 +0000 UTC m=+0.228282513 container start 40aa2b585fb71247d72f5c7430ef42294e868a9ee0ae03a9c09265ab5835b50c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:14:29 np0005465988 neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d[262004]: [NOTICE]   (262008) : New worker (262010) forked
Oct  2 08:14:29 np0005465988 neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d[262004]: [NOTICE]   (262008) : Loading success.
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.608 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 8efa0909-383b-4c50-82de-99064aa6894d in datapath 6e4fcc17-40ff-4975-9ebb-75ca0833226a unbound from our chassis#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.612 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6e4fcc17-40ff-4975-9ebb-75ca0833226a#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.629 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[642aee34-e03d-4fd7-ba65-35a8f0bbd7c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.630 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6e4fcc17-41 in ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.633 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6e4fcc17-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.633 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[18ee44c9-9bee-4d83-a73d-510d73f79096]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.635 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1322e8-047b-444e-b81a-29ace9f3ab73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.652 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[a015826f-6148-46b1-8030-62d881edc64f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.682 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7873f87b-353e-45fa-b013-c331dbef50ec]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.716 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d3eef728-a413-4489-81a8-dd5069fdd5a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 NetworkManager[45041]: <info>  [1759407269.7251] manager: (tap6e4fcc17-40): new Veth device (/org/freedesktop/NetworkManager/Devices/93)
Oct  2 08:14:29 np0005465988 systemd-udevd[261898]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.725 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6fcf4595-689d-40ae-a946-7fe451d3c9cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.766 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d758ef07-90d2-482b-9572-a203a68189ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.770 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7e4b90-50e9-4d10-b097-51961d0745f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.771 2 DEBUG nova.compute.manager [req-e90ab2fc-a7c9-4aa7-a29b-002549a18503 req-9274ed5f-a2be-4686-992b-3365cc9bd16b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-plugged-9bb43112-c321-4894-a7e6-6e7cdbb47eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.771 2 DEBUG oslo_concurrency.lockutils [req-e90ab2fc-a7c9-4aa7-a29b-002549a18503 req-9274ed5f-a2be-4686-992b-3365cc9bd16b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.771 2 DEBUG oslo_concurrency.lockutils [req-e90ab2fc-a7c9-4aa7-a29b-002549a18503 req-9274ed5f-a2be-4686-992b-3365cc9bd16b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.772 2 DEBUG oslo_concurrency.lockutils [req-e90ab2fc-a7c9-4aa7-a29b-002549a18503 req-9274ed5f-a2be-4686-992b-3365cc9bd16b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.772 2 DEBUG nova.compute.manager [req-e90ab2fc-a7c9-4aa7-a29b-002549a18503 req-9274ed5f-a2be-4686-992b-3365cc9bd16b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Processing event network-vif-plugged-9bb43112-c321-4894-a7e6-6e7cdbb47eb5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.772 2 DEBUG nova.compute.manager [req-e90ab2fc-a7c9-4aa7-a29b-002549a18503 req-9274ed5f-a2be-4686-992b-3365cc9bd16b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-plugged-9bb43112-c321-4894-a7e6-6e7cdbb47eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.772 2 DEBUG oslo_concurrency.lockutils [req-e90ab2fc-a7c9-4aa7-a29b-002549a18503 req-9274ed5f-a2be-4686-992b-3365cc9bd16b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.773 2 DEBUG oslo_concurrency.lockutils [req-e90ab2fc-a7c9-4aa7-a29b-002549a18503 req-9274ed5f-a2be-4686-992b-3365cc9bd16b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.773 2 DEBUG oslo_concurrency.lockutils [req-e90ab2fc-a7c9-4aa7-a29b-002549a18503 req-9274ed5f-a2be-4686-992b-3365cc9bd16b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.773 2 DEBUG nova.compute.manager [req-e90ab2fc-a7c9-4aa7-a29b-002549a18503 req-9274ed5f-a2be-4686-992b-3365cc9bd16b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] No event matching network-vif-plugged-9bb43112-c321-4894-a7e6-6e7cdbb47eb5 in dict_keys([('network-vif-plugged', 'f57ded32-4a17-4f1b-b0eb-06069110bc4c')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.773 2 WARNING nova.compute.manager [req-e90ab2fc-a7c9-4aa7-a29b-002549a18503 req-9274ed5f-a2be-4686-992b-3365cc9bd16b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received unexpected event network-vif-plugged-9bb43112-c321-4894-a7e6-6e7cdbb47eb5 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:14:29 np0005465988 NetworkManager[45041]: <info>  [1759407269.8096] device (tap6e4fcc17-40): carrier: link connected
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.815 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9df597e8-540a-4f8f-9fa2-076e5ab8594c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.836 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407269.8358252, 612f24ca-960e-4fef-81d6-aefbd0d68bf5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.836 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] VM Started (Lifecycle Event)#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.839 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[535614a9-0886-4789-b9df-211307eaad38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6e4fcc17-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:73:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533014, 'reachable_time': 35530, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262029, 'error': None, 'target': 'ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.857 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ccdce265-3d38-4d4b-b2dd-230fdb8903ef]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:737b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533014, 'tstamp': 533014}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262030, 'error': None, 'target': 'ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.861 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.865 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407269.8359058, 612f24ca-960e-4fef-81d6-aefbd0d68bf5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.866 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.875 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a5bed95f-d344-4b28-88c0-cf561bbce5c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6e4fcc17-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:73:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533014, 'reachable_time': 35530, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 262031, 'error': None, 'target': 'ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.894 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.897 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.915 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6b76c985-d848-4452-bd44-d270f17fbe0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.943 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.983 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[26249161-f1d9-4ab8-9d9e-5e8658392cee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.986 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e4fcc17-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.986 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.987 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e4fcc17-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:29 np0005465988 NetworkManager[45041]: <info>  [1759407269.9906] manager: (tap6e4fcc17-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Oct  2 08:14:29 np0005465988 kernel: tap6e4fcc17-40: entered promiscuous mode
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.995 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6e4fcc17-40, col_values=(('external_ids', {'iface-id': '3fb69fb8-817e-41f4-8b58-f0acf0c26599'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:29 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:29Z|00164|binding|INFO|Releasing lport 3fb69fb8-817e-41f4-8b58-f0acf0c26599 from this chassis (sb_readonly=0)
Oct  2 08:14:29 np0005465988 nova_compute[236126]: 2025-10-02 12:14:29.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:29.998 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6e4fcc17-40ff-4975-9ebb-75ca0833226a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6e4fcc17-40ff-4975-9ebb-75ca0833226a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.000 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1b9d5205-9b27-4a2a-aebe-784df2255273]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.001 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-6e4fcc17-40ff-4975-9ebb-75ca0833226a
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/6e4fcc17-40ff-4975-9ebb-75ca0833226a.pid.haproxy
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 6e4fcc17-40ff-4975-9ebb-75ca0833226a
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.002 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a', 'env', 'PROCESS_TAG=haproxy-6e4fcc17-40ff-4975-9ebb-75ca0833226a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6e4fcc17-40ff-4975-9ebb-75ca0833226a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005465988 podman[262064]: 2025-10-02 12:14:30.45470145 +0000 UTC m=+0.067097019 container create 18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:14:30 np0005465988 systemd[1]: Started libpod-conmon-18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2.scope.
Oct  2 08:14:30 np0005465988 podman[262064]: 2025-10-02 12:14:30.419004383 +0000 UTC m=+0.031400012 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:14:30 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:14:30 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a5bfa1b7beb4fc6668cc9e3232baf6c73a2e5db4dac4829e822ff06da8a18e3e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:14:30 np0005465988 podman[262064]: 2025-10-02 12:14:30.548089373 +0000 UTC m=+0.160485012 container init 18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:14:30 np0005465988 podman[262064]: 2025-10-02 12:14:30.556424875 +0000 UTC m=+0.168820474 container start 18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:14:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:30 np0005465988 neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a[262080]: [NOTICE]   (262084) : New worker (262086) forked
Oct  2 08:14:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:30 np0005465988 neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a[262080]: [NOTICE]   (262084) : Loading success.
Oct  2 08:14:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:30.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.625 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f57ded32-4a17-4f1b-b0eb-06069110bc4c in datapath 66508b86-a2fe-4ff7-9652-19f5686c951d unbound from our chassis#033[00m
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.627 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66508b86-a2fe-4ff7-9652-19f5686c951d#033[00m
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.650 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ebb47952-d787-4092-a8b9-fb704150817b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.694 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[803467e5-b05b-41ca-8eb7-e6f5f5d99885]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.698 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[22941f28-a86f-4d5d-a378-b0f1d8a2d493]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.745 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4e110b-952d-43bf-bf54-c6bf52fb91ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:14:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:30.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.768 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4dec4a33-28d7-42ce-a567-f6a72bd3038a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66508b86-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:1e:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 7, 'tx_packets': 5, 'rx_bytes': 742, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 7, 'tx_packets': 5, 'rx_bytes': 742, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532896, 'reachable_time': 24249, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 7, 'inoctets': 644, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 7, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 644, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 7, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262100, 'error': None, 'target': 'ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.789 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1f1e6416-1489-4b06-9996-0f40fad52fd6]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tap66508b86-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532916, 'tstamp': 532916}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262101, 'error': None, 'target': 'ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66508b86-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532921, 'tstamp': 532921}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262101, 'error': None, 'target': 'ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.790 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66508b86-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.794 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66508b86-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.794 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.794 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66508b86-a0, col_values=(('external_ids', {'iface-id': 'd1bbefcc-6af1-4dad-8de6-8a7cacd6a890'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:30.795 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.893 2 DEBUG nova.compute.manager [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-plugged-8efa0909-383b-4c50-82de-99064aa6894d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.893 2 DEBUG oslo_concurrency.lockutils [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.894 2 DEBUG oslo_concurrency.lockutils [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.894 2 DEBUG oslo_concurrency.lockutils [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.895 2 DEBUG nova.compute.manager [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] No event matching network-vif-plugged-8efa0909-383b-4c50-82de-99064aa6894d in dict_keys([('network-vif-plugged', 'f57ded32-4a17-4f1b-b0eb-06069110bc4c')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.895 2 WARNING nova.compute.manager [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received unexpected event network-vif-plugged-8efa0909-383b-4c50-82de-99064aa6894d for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.895 2 DEBUG nova.compute.manager [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-plugged-f57ded32-4a17-4f1b-b0eb-06069110bc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.896 2 DEBUG oslo_concurrency.lockutils [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.896 2 DEBUG oslo_concurrency.lockutils [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.896 2 DEBUG oslo_concurrency.lockutils [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.897 2 DEBUG nova.compute.manager [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Processing event network-vif-plugged-f57ded32-4a17-4f1b-b0eb-06069110bc4c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.897 2 DEBUG nova.compute.manager [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-plugged-f57ded32-4a17-4f1b-b0eb-06069110bc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.898 2 DEBUG oslo_concurrency.lockutils [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.898 2 DEBUG oslo_concurrency.lockutils [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.899 2 DEBUG oslo_concurrency.lockutils [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.899 2 DEBUG nova.compute.manager [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] No waiting events found dispatching network-vif-plugged-f57ded32-4a17-4f1b-b0eb-06069110bc4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.899 2 WARNING nova.compute.manager [req-f2be10f2-e0ed-4535-ba81-f3d5277fc73c req-dee4ea75-3d03-4b0d-919c-3c9437a5e349 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received unexpected event network-vif-plugged-f57ded32-4a17-4f1b-b0eb-06069110bc4c for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.900 2 DEBUG nova.compute.manager [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.905 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407270.905422, 612f24ca-960e-4fef-81d6-aefbd0d68bf5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.906 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.910 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.917 2 INFO nova.virt.libvirt.driver [-] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Instance spawned successfully.#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.917 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.940 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.950 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.954 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.955 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.956 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.956 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.957 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:30 np0005465988 nova_compute[236126]: 2025-10-02 12:14:30.957 2 DEBUG nova.virt.libvirt.driver [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.001 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.048 2 INFO nova.compute.manager [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Took 18.47 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.049 2 DEBUG nova.compute.manager [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.113 2 INFO nova.compute.manager [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Took 19.45 seconds to build instance.#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.136 2 DEBUG oslo_concurrency.lockutils [None req-01dd5dd0-3ef5-4790-ad1e-f533045cfc7d d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.804 2 DEBUG oslo_concurrency.lockutils [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.804 2 DEBUG oslo_concurrency.lockutils [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.805 2 DEBUG oslo_concurrency.lockutils [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.805 2 DEBUG oslo_concurrency.lockutils [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.805 2 DEBUG oslo_concurrency.lockutils [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.806 2 INFO nova.compute.manager [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Terminating instance#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.807 2 DEBUG nova.compute.manager [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:14:31 np0005465988 kernel: tap9bb43112-c3 (unregistering): left promiscuous mode
Oct  2 08:14:31 np0005465988 NetworkManager[45041]: <info>  [1759407271.8453] device (tap9bb43112-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:14:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:31Z|00165|binding|INFO|Releasing lport 9bb43112-c321-4894-a7e6-6e7cdbb47eb5 from this chassis (sb_readonly=0)
Oct  2 08:14:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:31Z|00166|binding|INFO|Setting lport 9bb43112-c321-4894-a7e6-6e7cdbb47eb5 down in Southbound
Oct  2 08:14:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:31Z|00167|binding|INFO|Removing iface tap9bb43112-c3 ovn-installed in OVS
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:31.864 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:7a:fb 10.100.0.70'], port_security=['fa:16:3e:41:7a:fb 10.100.0.70'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.70/24', 'neutron:device_id': '612f24ca-960e-4fef-81d6-aefbd0d68bf5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66508b86-a2fe-4ff7-9652-19f5686c951d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba87091f122a4afabe0a62682078fece', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2e045e2-9997-47d6-8453-aba0f450cdb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff777e6a-8eba-4e87-806f-e65647879b1e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=9bb43112-c321-4894-a7e6-6e7cdbb47eb5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:31.865 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 9bb43112-c321-4894-a7e6-6e7cdbb47eb5 in datapath 66508b86-a2fe-4ff7-9652-19f5686c951d unbound from our chassis#033[00m
Oct  2 08:14:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:31.867 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66508b86-a2fe-4ff7-9652-19f5686c951d#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005465988 kernel: tap8efa0909-38 (unregistering): left promiscuous mode
Oct  2 08:14:31 np0005465988 NetworkManager[45041]: <info>  [1759407271.8875] device (tap8efa0909-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:31.896 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1d791c05-2dbe-4a83-907d-d0658fe6b07b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:31Z|00168|binding|INFO|Releasing lport 8efa0909-383b-4c50-82de-99064aa6894d from this chassis (sb_readonly=0)
Oct  2 08:14:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:31Z|00169|binding|INFO|Setting lport 8efa0909-383b-4c50-82de-99064aa6894d down in Southbound
Oct  2 08:14:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:31Z|00170|binding|INFO|Removing iface tap8efa0909-38 ovn-installed in OVS
Oct  2 08:14:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:31.901 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:59:4d 10.100.1.69'], port_security=['fa:16:3e:8a:59:4d 10.100.1.69'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.69/24', 'neutron:device_id': '612f24ca-960e-4fef-81d6-aefbd0d68bf5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e4fcc17-40ff-4975-9ebb-75ca0833226a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba87091f122a4afabe0a62682078fece', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2e045e2-9997-47d6-8453-aba0f450cdb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57d03c00-c8a7-410c-947f-3ee823d01fd3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=8efa0909-383b-4c50-82de-99064aa6894d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005465988 kernel: tapf57ded32-4a (unregistering): left promiscuous mode
Oct  2 08:14:31 np0005465988 NetworkManager[45041]: <info>  [1759407271.9286] device (tapf57ded32-4a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:31.933 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[70b6a7ba-22fc-4fda-b590-b6b184f5914d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:31.937 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5a81c811-a9c3-4fe4-a795-293efd8df0e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:31Z|00171|binding|INFO|Releasing lport f57ded32-4a17-4f1b-b0eb-06069110bc4c from this chassis (sb_readonly=0)
Oct  2 08:14:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:31Z|00172|binding|INFO|Setting lport f57ded32-4a17-4f1b-b0eb-06069110bc4c down in Southbound
Oct  2 08:14:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:14:31Z|00173|binding|INFO|Removing iface tapf57ded32-4a ovn-installed in OVS
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:31.953 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:d3:ba 10.100.0.177'], port_security=['fa:16:3e:77:d3:ba 10.100.0.177'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.177/24', 'neutron:device_id': '612f24ca-960e-4fef-81d6-aefbd0d68bf5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66508b86-a2fe-4ff7-9652-19f5686c951d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba87091f122a4afabe0a62682078fece', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2e045e2-9997-47d6-8453-aba0f450cdb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff777e6a-8eba-4e87-806f-e65647879b1e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f57ded32-4a17-4f1b-b0eb-06069110bc4c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:31 np0005465988 nova_compute[236126]: 2025-10-02 12:14:31.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:31.969 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[1240ccd0-1bc8-4133-bac2-d3f13702c0d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:31.991 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d17ffee3-8ebd-4af9-90ae-f3496f349607]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66508b86-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:1e:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532896, 'reachable_time': 24249, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262128, 'error': None, 'target': 'ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005465988 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Oct  2 08:14:31 np0005465988 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000003e.scope: Consumed 2.105s CPU time.
Oct  2 08:14:32 np0005465988 systemd-machined[192594]: Machine qemu-23-instance-0000003e terminated.
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.008 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[23e565f9-2563-4430-86d7-a26d6127c025]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tap66508b86-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532916, 'tstamp': 532916}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262129, 'error': None, 'target': 'ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66508b86-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532921, 'tstamp': 532921}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262129, 'error': None, 'target': 'ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.010 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66508b86-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.022 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66508b86-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.022 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.022 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66508b86-a0, col_values=(('external_ids', {'iface-id': 'd1bbefcc-6af1-4dad-8de6-8a7cacd6a890'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.023 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:32 np0005465988 NetworkManager[45041]: <info>  [1759407272.0245] manager: (tap9bb43112-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.025 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 8efa0909-383b-4c50-82de-99064aa6894d in datapath 6e4fcc17-40ff-4975-9ebb-75ca0833226a unbound from our chassis#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.027 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6e4fcc17-40ff-4975-9ebb-75ca0833226a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.027 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7c6b6be1-992b-431e-bb7a-5e61a5779738]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.028 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a namespace which is not needed anymore#033[00m
Oct  2 08:14:32 np0005465988 NetworkManager[45041]: <info>  [1759407272.0431] manager: (tap8efa0909-38): new Tun device (/org/freedesktop/NetworkManager/Devices/96)
Oct  2 08:14:32 np0005465988 NetworkManager[45041]: <info>  [1759407272.0531] manager: (tapf57ded32-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.067 2 INFO nova.virt.libvirt.driver [-] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Instance destroyed successfully.#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.067 2 DEBUG nova.objects.instance [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lazy-loading 'resources' on Instance uuid 612f24ca-960e-4fef-81d6-aefbd0d68bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.081 2 DEBUG nova.virt.libvirt.vif [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:14:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-601760978',display_name='tempest-ServersTestMultiNic-server-601760978',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-601760978',id=62,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:14:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-li7f2xvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-1178748303-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:14:31Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=612f24ca-960e-4fef-81d6-aefbd0d68bf5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "address": "fa:16:3e:41:7a:fb", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb43112-c3", "ovs_interfaceid": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.081 2 DEBUG nova.network.os_vif_util [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "address": "fa:16:3e:41:7a:fb", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb43112-c3", "ovs_interfaceid": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.082 2 DEBUG nova.network.os_vif_util [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:7a:fb,bridge_name='br-int',has_traffic_filtering=True,id=9bb43112-c321-4894-a7e6-6e7cdbb47eb5,network=Network(66508b86-a2fe-4ff7-9652-19f5686c951d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb43112-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.082 2 DEBUG os_vif [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:7a:fb,bridge_name='br-int',has_traffic_filtering=True,id=9bb43112-c321-4894-a7e6-6e7cdbb47eb5,network=Network(66508b86-a2fe-4ff7-9652-19f5686c951d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb43112-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.084 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bb43112-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.096 2 INFO os_vif [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:7a:fb,bridge_name='br-int',has_traffic_filtering=True,id=9bb43112-c321-4894-a7e6-6e7cdbb47eb5,network=Network(66508b86-a2fe-4ff7-9652-19f5686c951d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb43112-c3')#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.097 2 DEBUG nova.virt.libvirt.vif [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:14:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-601760978',display_name='tempest-ServersTestMultiNic-server-601760978',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-601760978',id=62,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:14:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-li7f2xvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-1178748303-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:14:31Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=612f24ca-960e-4fef-81d6-aefbd0d68bf5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8efa0909-383b-4c50-82de-99064aa6894d", "address": "fa:16:3e:8a:59:4d", "network": {"id": "6e4fcc17-40ff-4975-9ebb-75ca0833226a", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1713615272", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8efa0909-38", "ovs_interfaceid": "8efa0909-383b-4c50-82de-99064aa6894d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.098 2 DEBUG nova.network.os_vif_util [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "8efa0909-383b-4c50-82de-99064aa6894d", "address": "fa:16:3e:8a:59:4d", "network": {"id": "6e4fcc17-40ff-4975-9ebb-75ca0833226a", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1713615272", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8efa0909-38", "ovs_interfaceid": "8efa0909-383b-4c50-82de-99064aa6894d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.098 2 DEBUG nova.network.os_vif_util [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:59:4d,bridge_name='br-int',has_traffic_filtering=True,id=8efa0909-383b-4c50-82de-99064aa6894d,network=Network(6e4fcc17-40ff-4975-9ebb-75ca0833226a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8efa0909-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.099 2 DEBUG os_vif [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:59:4d,bridge_name='br-int',has_traffic_filtering=True,id=8efa0909-383b-4c50-82de-99064aa6894d,network=Network(6e4fcc17-40ff-4975-9ebb-75ca0833226a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8efa0909-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.100 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8efa0909-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.110 2 INFO os_vif [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:59:4d,bridge_name='br-int',has_traffic_filtering=True,id=8efa0909-383b-4c50-82de-99064aa6894d,network=Network(6e4fcc17-40ff-4975-9ebb-75ca0833226a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8efa0909-38')#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.111 2 DEBUG nova.virt.libvirt.vif [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:14:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-601760978',display_name='tempest-ServersTestMultiNic-server-601760978',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-601760978',id=62,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:14:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-li7f2xvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-1178748303-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:14:31Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=612f24ca-960e-4fef-81d6-aefbd0d68bf5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "address": "fa:16:3e:77:d3:ba", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf57ded32-4a", "ovs_interfaceid": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.112 2 DEBUG nova.network.os_vif_util [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "address": "fa:16:3e:77:d3:ba", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf57ded32-4a", "ovs_interfaceid": "f57ded32-4a17-4f1b-b0eb-06069110bc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.113 2 DEBUG nova.network.os_vif_util [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:d3:ba,bridge_name='br-int',has_traffic_filtering=True,id=f57ded32-4a17-4f1b-b0eb-06069110bc4c,network=Network(66508b86-a2fe-4ff7-9652-19f5686c951d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf57ded32-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.114 2 DEBUG os_vif [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:d3:ba,bridge_name='br-int',has_traffic_filtering=True,id=f57ded32-4a17-4f1b-b0eb-06069110bc4c,network=Network(66508b86-a2fe-4ff7-9652-19f5686c951d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf57ded32-4a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.115 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf57ded32-4a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.122 2 INFO os_vif [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:d3:ba,bridge_name='br-int',has_traffic_filtering=True,id=f57ded32-4a17-4f1b-b0eb-06069110bc4c,network=Network(66508b86-a2fe-4ff7-9652-19f5686c951d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf57ded32-4a')#033[00m
Oct  2 08:14:32 np0005465988 neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a[262080]: [NOTICE]   (262084) : haproxy version is 2.8.14-c23fe91
Oct  2 08:14:32 np0005465988 neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a[262080]: [NOTICE]   (262084) : path to executable is /usr/sbin/haproxy
Oct  2 08:14:32 np0005465988 neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a[262080]: [WARNING]  (262084) : Exiting Master process...
Oct  2 08:14:32 np0005465988 neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a[262080]: [ALERT]    (262084) : Current worker (262086) exited with code 143 (Terminated)
Oct  2 08:14:32 np0005465988 neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a[262080]: [WARNING]  (262084) : All workers exited. Exiting... (0)
Oct  2 08:14:32 np0005465988 systemd[1]: libpod-18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2.scope: Deactivated successfully.
Oct  2 08:14:32 np0005465988 conmon[262080]: conmon 18c1214276fea8373d3f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2.scope/container/memory.events
Oct  2 08:14:32 np0005465988 podman[262192]: 2025-10-02 12:14:32.192717254 +0000 UTC m=+0.043799564 container died 18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:14:32 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2-userdata-shm.mount: Deactivated successfully.
Oct  2 08:14:32 np0005465988 systemd[1]: var-lib-containers-storage-overlay-a5bfa1b7beb4fc6668cc9e3232baf6c73a2e5db4dac4829e822ff06da8a18e3e-merged.mount: Deactivated successfully.
Oct  2 08:14:32 np0005465988 podman[262192]: 2025-10-02 12:14:32.227927637 +0000 UTC m=+0.079009957 container cleanup 18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:14:32 np0005465988 systemd[1]: libpod-conmon-18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2.scope: Deactivated successfully.
Oct  2 08:14:32 np0005465988 podman[262232]: 2025-10-02 12:14:32.285812548 +0000 UTC m=+0.037831240 container remove 18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 08:14:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.293 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[48e525a7-9e82-420d-ad84-b86edfb8f68d]: (4, ('Thu Oct  2 12:14:32 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a (18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2)\n18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2\nThu Oct  2 12:14:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a (18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2)\n18c1214276fea8373d3f8bc681b848f03052a1bf6234641507e3f7da412e37c2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.294 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e40c5bff-be6a-497c-9397-13bf0716b6fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.295 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e4fcc17-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 kernel: tap6e4fcc17-40: left promiscuous mode
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.380 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[51ad17d4-3cb4-4c96-b934-82d7eb65c113]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.410 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b64396ba-b5b4-4e57-8363-6167c968cd9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.412 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d2bbdedb-2647-45b0-b77d-17dcd289798d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.429 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cc4c05f3-5335-49fa-b14e-fdc371d445e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533003, 'reachable_time': 43224, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262247, 'error': None, 'target': 'ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.432 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6e4fcc17-40ff-4975-9ebb-75ca0833226a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:14:32 np0005465988 systemd[1]: run-netns-ovnmeta\x2d6e4fcc17\x2d40ff\x2d4975\x2d9ebb\x2d75ca0833226a.mount: Deactivated successfully.
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.433 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[28d0f5b4-32ce-45e1-856e-cbc6d6c01db4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.434 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f57ded32-4a17-4f1b-b0eb-06069110bc4c in datapath 66508b86-a2fe-4ff7-9652-19f5686c951d unbound from our chassis#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.436 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 66508b86-a2fe-4ff7-9652-19f5686c951d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.437 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7b35dee2-1c4a-4b36-b053-60fc1a98d0cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.438 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d namespace which is not needed anymore#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d[262004]: [NOTICE]   (262008) : haproxy version is 2.8.14-c23fe91
Oct  2 08:14:32 np0005465988 neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d[262004]: [NOTICE]   (262008) : path to executable is /usr/sbin/haproxy
Oct  2 08:14:32 np0005465988 neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d[262004]: [WARNING]  (262008) : Exiting Master process...
Oct  2 08:14:32 np0005465988 neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d[262004]: [ALERT]    (262008) : Current worker (262010) exited with code 143 (Terminated)
Oct  2 08:14:32 np0005465988 neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d[262004]: [WARNING]  (262008) : All workers exited. Exiting... (0)
Oct  2 08:14:32 np0005465988 systemd[1]: libpod-40aa2b585fb71247d72f5c7430ef42294e868a9ee0ae03a9c09265ab5835b50c.scope: Deactivated successfully.
Oct  2 08:14:32 np0005465988 podman[262264]: 2025-10-02 12:14:32.581098377 +0000 UTC m=+0.048390527 container died 40aa2b585fb71247d72f5c7430ef42294e868a9ee0ae03a9c09265ab5835b50c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:14:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:32.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:32 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40aa2b585fb71247d72f5c7430ef42294e868a9ee0ae03a9c09265ab5835b50c-userdata-shm.mount: Deactivated successfully.
Oct  2 08:14:32 np0005465988 systemd[1]: var-lib-containers-storage-overlay-0eb8475acefb4453da69dd4c753e34e8bd937f7722c244add0909e2b56551a4a-merged.mount: Deactivated successfully.
Oct  2 08:14:32 np0005465988 podman[262264]: 2025-10-02 12:14:32.629243696 +0000 UTC m=+0.096535846 container cleanup 40aa2b585fb71247d72f5c7430ef42294e868a9ee0ae03a9c09265ab5835b50c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:14:32 np0005465988 systemd[1]: libpod-conmon-40aa2b585fb71247d72f5c7430ef42294e868a9ee0ae03a9c09265ab5835b50c.scope: Deactivated successfully.
Oct  2 08:14:32 np0005465988 podman[262292]: 2025-10-02 12:14:32.722951778 +0000 UTC m=+0.059961943 container remove 40aa2b585fb71247d72f5c7430ef42294e868a9ee0ae03a9c09265ab5835b50c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.731 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[03fe477e-2c56-4c72-b6cf-ca2f6a61850d]: (4, ('Thu Oct  2 12:14:32 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d (40aa2b585fb71247d72f5c7430ef42294e868a9ee0ae03a9c09265ab5835b50c)\n40aa2b585fb71247d72f5c7430ef42294e868a9ee0ae03a9c09265ab5835b50c\nThu Oct  2 12:14:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d (40aa2b585fb71247d72f5c7430ef42294e868a9ee0ae03a9c09265ab5835b50c)\n40aa2b585fb71247d72f5c7430ef42294e868a9ee0ae03a9c09265ab5835b50c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.734 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2d3e7fe8-01f6-46f6-8f98-983c1b7fedb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.735 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66508b86-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 kernel: tap66508b86-a0: left promiscuous mode
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.746 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a237d4f5-81ab-4b05-946b-62ca8d20b092]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 nova_compute[236126]: 2025-10-02 12:14:32.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:32.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.776 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[47e13118-313d-49ac-89b0-0f60bd9a7c03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.778 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b7f29dec-919b-4252-9dc4-6b054669223b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.796 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5dcf65b1-da01-4c7c-a19f-ffaacb4fd0cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532885, 'reachable_time': 19644, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262309, 'error': None, 'target': 'ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.799 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-66508b86-a2fe-4ff7-9652-19f5686c951d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:14:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:14:32.799 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[687f0c36-081c-4618-8b35-5acc0672af0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.126 2 DEBUG nova.compute.manager [req-f51c07c1-1183-4b02-b192-f525c49c38b9 req-91841c14-3e5f-4497-9b03-f4e61ea1e912 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-unplugged-8efa0909-383b-4c50-82de-99064aa6894d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.126 2 DEBUG oslo_concurrency.lockutils [req-f51c07c1-1183-4b02-b192-f525c49c38b9 req-91841c14-3e5f-4497-9b03-f4e61ea1e912 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.127 2 DEBUG oslo_concurrency.lockutils [req-f51c07c1-1183-4b02-b192-f525c49c38b9 req-91841c14-3e5f-4497-9b03-f4e61ea1e912 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.127 2 DEBUG oslo_concurrency.lockutils [req-f51c07c1-1183-4b02-b192-f525c49c38b9 req-91841c14-3e5f-4497-9b03-f4e61ea1e912 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.127 2 DEBUG nova.compute.manager [req-f51c07c1-1183-4b02-b192-f525c49c38b9 req-91841c14-3e5f-4497-9b03-f4e61ea1e912 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] No waiting events found dispatching network-vif-unplugged-8efa0909-383b-4c50-82de-99064aa6894d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.128 2 DEBUG nova.compute.manager [req-f51c07c1-1183-4b02-b192-f525c49c38b9 req-91841c14-3e5f-4497-9b03-f4e61ea1e912 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-unplugged-8efa0909-383b-4c50-82de-99064aa6894d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:14:33 np0005465988 systemd[1]: run-netns-ovnmeta\x2d66508b86\x2da2fe\x2d4ff7\x2d9652\x2d19f5686c951d.mount: Deactivated successfully.
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.388 2 INFO nova.virt.libvirt.driver [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Deleting instance files /var/lib/nova/instances/612f24ca-960e-4fef-81d6-aefbd0d68bf5_del#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.388 2 INFO nova.virt.libvirt.driver [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Deletion of /var/lib/nova/instances/612f24ca-960e-4fef-81d6-aefbd0d68bf5_del complete#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.393 2 DEBUG nova.compute.manager [req-dd176561-8bc0-444e-ab3d-1a7c31265fda req-957f84f1-7d89-4758-83cb-7bbf6cbf510b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-unplugged-9bb43112-c321-4894-a7e6-6e7cdbb47eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.394 2 DEBUG oslo_concurrency.lockutils [req-dd176561-8bc0-444e-ab3d-1a7c31265fda req-957f84f1-7d89-4758-83cb-7bbf6cbf510b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.394 2 DEBUG oslo_concurrency.lockutils [req-dd176561-8bc0-444e-ab3d-1a7c31265fda req-957f84f1-7d89-4758-83cb-7bbf6cbf510b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.394 2 DEBUG oslo_concurrency.lockutils [req-dd176561-8bc0-444e-ab3d-1a7c31265fda req-957f84f1-7d89-4758-83cb-7bbf6cbf510b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.394 2 DEBUG nova.compute.manager [req-dd176561-8bc0-444e-ab3d-1a7c31265fda req-957f84f1-7d89-4758-83cb-7bbf6cbf510b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] No waiting events found dispatching network-vif-unplugged-9bb43112-c321-4894-a7e6-6e7cdbb47eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.395 2 DEBUG nova.compute.manager [req-dd176561-8bc0-444e-ab3d-1a7c31265fda req-957f84f1-7d89-4758-83cb-7bbf6cbf510b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-unplugged-9bb43112-c321-4894-a7e6-6e7cdbb47eb5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.445 2 INFO nova.compute.manager [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Took 1.64 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.445 2 DEBUG oslo.service.loopingcall [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.446 2 DEBUG nova.compute.manager [-] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:14:33 np0005465988 nova_compute[236126]: 2025-10-02 12:14:33.446 2 DEBUG nova.network.neutron [-] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:14:34 np0005465988 nova_compute[236126]: 2025-10-02 12:14:34.467 2 DEBUG nova.compute.manager [req-ae931e8a-9529-4fed-b1f3-45752aa61c28 req-edb36495-0ca8-49e2-8fa9-b6694c168971 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-deleted-f57ded32-4a17-4f1b-b0eb-06069110bc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:34 np0005465988 nova_compute[236126]: 2025-10-02 12:14:34.468 2 INFO nova.compute.manager [req-ae931e8a-9529-4fed-b1f3-45752aa61c28 req-edb36495-0ca8-49e2-8fa9-b6694c168971 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Neutron deleted interface f57ded32-4a17-4f1b-b0eb-06069110bc4c; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:14:34 np0005465988 nova_compute[236126]: 2025-10-02 12:14:34.469 2 DEBUG nova.network.neutron [req-ae931e8a-9529-4fed-b1f3-45752aa61c28 req-edb36495-0ca8-49e2-8fa9-b6694c168971 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Updating instance_info_cache with network_info: [{"id": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "address": "fa:16:3e:41:7a:fb", "network": {"id": "66508b86-a2fe-4ff7-9652-19f5686c951d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-909997459", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb43112-c3", "ovs_interfaceid": "9bb43112-c321-4894-a7e6-6e7cdbb47eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8efa0909-383b-4c50-82de-99064aa6894d", "address": "fa:16:3e:8a:59:4d", "network": {"id": "6e4fcc17-40ff-4975-9ebb-75ca0833226a", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1713615272", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8efa0909-38", "ovs_interfaceid": "8efa0909-383b-4c50-82de-99064aa6894d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:34 np0005465988 nova_compute[236126]: 2025-10-02 12:14:34.566 2 DEBUG nova.compute.manager [req-ae931e8a-9529-4fed-b1f3-45752aa61c28 req-edb36495-0ca8-49e2-8fa9-b6694c168971 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Detach interface failed, port_id=f57ded32-4a17-4f1b-b0eb-06069110bc4c, reason: Instance 612f24ca-960e-4fef-81d6-aefbd0d68bf5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:14:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:34.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:34.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.281 2 DEBUG nova.compute.manager [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-plugged-8efa0909-383b-4c50-82de-99064aa6894d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.281 2 DEBUG oslo_concurrency.lockutils [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.282 2 DEBUG oslo_concurrency.lockutils [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.283 2 DEBUG oslo_concurrency.lockutils [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.283 2 DEBUG nova.compute.manager [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] No waiting events found dispatching network-vif-plugged-8efa0909-383b-4c50-82de-99064aa6894d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.284 2 WARNING nova.compute.manager [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received unexpected event network-vif-plugged-8efa0909-383b-4c50-82de-99064aa6894d for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.284 2 DEBUG nova.compute.manager [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-unplugged-f57ded32-4a17-4f1b-b0eb-06069110bc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.285 2 DEBUG oslo_concurrency.lockutils [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.285 2 DEBUG oslo_concurrency.lockutils [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.286 2 DEBUG oslo_concurrency.lockutils [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.286 2 DEBUG nova.compute.manager [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] No waiting events found dispatching network-vif-unplugged-f57ded32-4a17-4f1b-b0eb-06069110bc4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.287 2 DEBUG nova.compute.manager [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-unplugged-f57ded32-4a17-4f1b-b0eb-06069110bc4c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.287 2 DEBUG nova.compute.manager [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-plugged-f57ded32-4a17-4f1b-b0eb-06069110bc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.288 2 DEBUG oslo_concurrency.lockutils [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.289 2 DEBUG oslo_concurrency.lockutils [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.289 2 DEBUG oslo_concurrency.lockutils [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.289 2 DEBUG nova.compute.manager [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] No waiting events found dispatching network-vif-plugged-f57ded32-4a17-4f1b-b0eb-06069110bc4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.290 2 WARNING nova.compute.manager [req-4b925fae-d485-4faf-9b0c-d48e581f423f req-97dcde9e-0212-4a42-880b-c7da91d4b83e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received unexpected event network-vif-plugged-f57ded32-4a17-4f1b-b0eb-06069110bc4c for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.431 2 DEBUG nova.network.neutron [-] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.602 2 DEBUG nova.compute.manager [req-e22d7669-0ecb-43d5-b3cc-ff768941454e req-7077834a-b184-4b8a-9da7-4252aea0a762 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-plugged-9bb43112-c321-4894-a7e6-6e7cdbb47eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.603 2 DEBUG oslo_concurrency.lockutils [req-e22d7669-0ecb-43d5-b3cc-ff768941454e req-7077834a-b184-4b8a-9da7-4252aea0a762 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.603 2 DEBUG oslo_concurrency.lockutils [req-e22d7669-0ecb-43d5-b3cc-ff768941454e req-7077834a-b184-4b8a-9da7-4252aea0a762 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.604 2 DEBUG oslo_concurrency.lockutils [req-e22d7669-0ecb-43d5-b3cc-ff768941454e req-7077834a-b184-4b8a-9da7-4252aea0a762 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.605 2 DEBUG nova.compute.manager [req-e22d7669-0ecb-43d5-b3cc-ff768941454e req-7077834a-b184-4b8a-9da7-4252aea0a762 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] No waiting events found dispatching network-vif-plugged-9bb43112-c321-4894-a7e6-6e7cdbb47eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.605 2 WARNING nova.compute.manager [req-e22d7669-0ecb-43d5-b3cc-ff768941454e req-7077834a-b184-4b8a-9da7-4252aea0a762 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received unexpected event network-vif-plugged-9bb43112-c321-4894-a7e6-6e7cdbb47eb5 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.624 2 INFO nova.compute.manager [-] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Took 2.18 seconds to deallocate network for instance.#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.742 2 DEBUG oslo_concurrency.lockutils [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.743 2 DEBUG oslo_concurrency.lockutils [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:35 np0005465988 nova_compute[236126]: 2025-10-02 12:14:35.801 2 DEBUG oslo_concurrency.processutils [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:14:36 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1618711799' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:14:36 np0005465988 nova_compute[236126]: 2025-10-02 12:14:36.249 2 DEBUG oslo_concurrency.processutils [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:36 np0005465988 nova_compute[236126]: 2025-10-02 12:14:36.256 2 DEBUG nova.compute.provider_tree [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:14:36 np0005465988 nova_compute[236126]: 2025-10-02 12:14:36.429 2 DEBUG nova.scheduler.client.report [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:14:36 np0005465988 nova_compute[236126]: 2025-10-02 12:14:36.562 2 DEBUG oslo_concurrency.lockutils [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.820s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:36.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:36 np0005465988 nova_compute[236126]: 2025-10-02 12:14:36.687 2 INFO nova.scheduler.client.report [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Deleted allocations for instance 612f24ca-960e-4fef-81d6-aefbd0d68bf5#033[00m
Oct  2 08:14:36 np0005465988 nova_compute[236126]: 2025-10-02 12:14:36.693 2 DEBUG nova.compute.manager [req-9218c1e0-c2cb-4c93-b6c8-fcf28e982cca req-581bf3b8-7423-43fd-bd50-c08fddea44a4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-deleted-8efa0909-383b-4c50-82de-99064aa6894d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:36 np0005465988 nova_compute[236126]: 2025-10-02 12:14:36.694 2 DEBUG nova.compute.manager [req-9218c1e0-c2cb-4c93-b6c8-fcf28e982cca req-581bf3b8-7423-43fd-bd50-c08fddea44a4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Received event network-vif-deleted-9bb43112-c321-4894-a7e6-6e7cdbb47eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:36.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:37 np0005465988 nova_compute[236126]: 2025-10-02 12:14:37.007 2 DEBUG oslo_concurrency.lockutils [None req-a6a5de92-08ef-460a-8c59-77f1928b679b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "612f24ca-960e-4fef-81d6-aefbd0d68bf5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:37 np0005465988 nova_compute[236126]: 2025-10-02 12:14:37.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:14:37 np0005465988 nova_compute[236126]: 2025-10-02 12:14:37.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:38.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:38.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:38 np0005465988 nova_compute[236126]: 2025-10-02 12:14:38.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:38.962055) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407278962138, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2223, "num_deletes": 258, "total_data_size": 4926773, "memory_usage": 5019024, "flush_reason": "Manual Compaction"}
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407278976351, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 2055007, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33673, "largest_seqno": 35891, "table_properties": {"data_size": 2047946, "index_size": 3815, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 18673, "raw_average_key_size": 21, "raw_value_size": 2032262, "raw_average_value_size": 2352, "num_data_blocks": 166, "num_entries": 864, "num_filter_entries": 864, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759407114, "oldest_key_time": 1759407114, "file_creation_time": 1759407278, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 14405 microseconds, and 9432 cpu microseconds.
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:38.976468) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 2055007 bytes OK
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:38.976493) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:38.977934) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:38.977955) EVENT_LOG_v1 {"time_micros": 1759407278977948, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:38.977977) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 4916858, prev total WAL file size 4939387, number of live WAL files 2.
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:38.982430) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303035' seq:72057594037927935, type:22 .. '6D6772737461740031323537' seq:0, type:0; will stop at (end)
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(2006KB)], [63(9901KB)]
Oct  2 08:14:38 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407278982518, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 12194370, "oldest_snapshot_seqno": -1}
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6024 keys, 9496029 bytes, temperature: kUnknown
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407279042077, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 9496029, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9456150, "index_size": 23705, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15109, "raw_key_size": 153355, "raw_average_key_size": 25, "raw_value_size": 9348401, "raw_average_value_size": 1551, "num_data_blocks": 957, "num_entries": 6024, "num_filter_entries": 6024, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759407278, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:39.042442) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 9496029 bytes
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:39.044057) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 204.3 rd, 159.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 9.7 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(10.6) write-amplify(4.6) OK, records in: 6480, records dropped: 456 output_compression: NoCompression
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:39.044087) EVENT_LOG_v1 {"time_micros": 1759407279044073, "job": 38, "event": "compaction_finished", "compaction_time_micros": 59676, "compaction_time_cpu_micros": 41991, "output_level": 6, "num_output_files": 1, "total_output_size": 9496029, "num_input_records": 6480, "num_output_records": 6024, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407279045067, "job": 38, "event": "table_file_deletion", "file_number": 65}
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407279048567, "job": 38, "event": "table_file_deletion", "file_number": 63}
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:38.982126) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:39.048760) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:39.048771) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:39.048775) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:39.048781) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:39.048789) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:14:39 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:14:39 np0005465988 nova_compute[236126]: 2025-10-02 12:14:39.491 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:39 np0005465988 nova_compute[236126]: 2025-10-02 12:14:39.492 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:39 np0005465988 podman[262517]: 2025-10-02 12:14:39.539157774 +0000 UTC m=+0.062003412 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:14:39 np0005465988 nova_compute[236126]: 2025-10-02 12:14:39.856 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:40 np0005465988 nova_compute[236126]: 2025-10-02 12:14:40.498 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:14:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:14:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:14:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:14:40 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:14:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:40.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:40.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e209 e209: 3 total, 3 up, 3 in
Oct  2 08:14:41 np0005465988 nova_compute[236126]: 2025-10-02 12:14:41.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:41 np0005465988 nova_compute[236126]: 2025-10-02 12:14:41.545 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:41 np0005465988 nova_compute[236126]: 2025-10-02 12:14:41.546 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:41 np0005465988 nova_compute[236126]: 2025-10-02 12:14:41.546 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:41 np0005465988 nova_compute[236126]: 2025-10-02 12:14:41.546 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:14:41 np0005465988 nova_compute[236126]: 2025-10-02 12:14:41.546 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:14:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2550400584' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.028 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.223 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.224 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4669MB free_disk=20.90121078491211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.224 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.224 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e209 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.376 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.377 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.404 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.424 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.424 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.440 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.476 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.500 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:42 np0005465988 nova_compute[236126]: 2025-10-02 12:14:42.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:42.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:42.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:14:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2913259531' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:14:43 np0005465988 nova_compute[236126]: 2025-10-02 12:14:43.013 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:43 np0005465988 nova_compute[236126]: 2025-10-02 12:14:43.019 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:14:43 np0005465988 nova_compute[236126]: 2025-10-02 12:14:43.051 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:14:43 np0005465988 nova_compute[236126]: 2025-10-02 12:14:43.072 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:14:43 np0005465988 nova_compute[236126]: 2025-10-02 12:14:43.072 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.848s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:44.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:44.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:45 np0005465988 nova_compute[236126]: 2025-10-02 12:14:45.072 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:45 np0005465988 nova_compute[236126]: 2025-10-02 12:14:45.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:45 np0005465988 nova_compute[236126]: 2025-10-02 12:14:45.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:45 np0005465988 nova_compute[236126]: 2025-10-02 12:14:45.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:14:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:46.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:46.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:47 np0005465988 nova_compute[236126]: 2025-10-02 12:14:47.066 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407272.0655482, 612f24ca-960e-4fef-81d6-aefbd0d68bf5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:47 np0005465988 nova_compute[236126]: 2025-10-02 12:14:47.067 2 INFO nova.compute.manager [-] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:14:47 np0005465988 nova_compute[236126]: 2025-10-02 12:14:47.090 2 DEBUG nova.compute.manager [None req-e0e9d70c-b9b9-4bad-a9ba-2d54a718e270 - - - - - -] [instance: 612f24ca-960e-4fef-81d6-aefbd0d68bf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:47 np0005465988 nova_compute[236126]: 2025-10-02 12:14:47.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e209 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:14:47 np0005465988 nova_compute[236126]: 2025-10-02 12:14:47.470 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:47 np0005465988 nova_compute[236126]: 2025-10-02 12:14:47.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:14:47 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1652653518' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:14:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e210 e210: 3 total, 3 up, 3 in
Oct  2 08:14:48 np0005465988 nova_compute[236126]: 2025-10-02 12:14:48.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:48 np0005465988 nova_compute[236126]: 2025-10-02 12:14:48.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:14:48 np0005465988 nova_compute[236126]: 2025-10-02 12:14:48.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:14:48 np0005465988 nova_compute[236126]: 2025-10-02 12:14:48.536 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:14:48 np0005465988 nova_compute[236126]: 2025-10-02 12:14:48.575 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:48 np0005465988 nova_compute[236126]: 2025-10-02 12:14:48.576 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:48.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:48 np0005465988 nova_compute[236126]: 2025-10-02 12:14:48.622 2 DEBUG nova.compute.manager [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:14:48 np0005465988 nova_compute[236126]: 2025-10-02 12:14:48.755 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:48 np0005465988 nova_compute[236126]: 2025-10-02 12:14:48.756 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:48 np0005465988 nova_compute[236126]: 2025-10-02 12:14:48.764 2 DEBUG nova.virt.hardware [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:14:48 np0005465988 nova_compute[236126]: 2025-10-02 12:14:48.765 2 INFO nova.compute.claims [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:14:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:14:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:48.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:14:48 np0005465988 nova_compute[236126]: 2025-10-02 12:14:48.938 2 DEBUG oslo_concurrency.processutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:49 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:14:49 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:14:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:14:49 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2811471746' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.421 2 DEBUG oslo_concurrency.processutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.431 2 DEBUG nova.compute.provider_tree [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.452 2 DEBUG nova.scheduler.client.report [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.529 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.531 2 DEBUG nova.compute.manager [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.601 2 DEBUG nova.compute.manager [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.601 2 DEBUG nova.network.neutron [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.632 2 INFO nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.660 2 DEBUG nova.compute.manager [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.827 2 DEBUG nova.compute.manager [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.829 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.830 2 INFO nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Creating image(s)#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.864 2 DEBUG nova.storage.rbd_utils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] rbd image 7a82919b-9ebf-48bc-9416-81be5ea3fc18_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.890 2 DEBUG nova.storage.rbd_utils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] rbd image 7a82919b-9ebf-48bc-9416-81be5ea3fc18_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.914 2 DEBUG nova.storage.rbd_utils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] rbd image 7a82919b-9ebf-48bc-9416-81be5ea3fc18_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.921 2 DEBUG oslo_concurrency.processutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:49 np0005465988 nova_compute[236126]: 2025-10-02 12:14:49.961 2 DEBUG nova.policy [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd107dd863d2e4a56853a0b758cb2c110', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ba87091f122a4afabe0a62682078fece', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:14:50 np0005465988 nova_compute[236126]: 2025-10-02 12:14:50.013 2 DEBUG oslo_concurrency.processutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:50 np0005465988 nova_compute[236126]: 2025-10-02 12:14:50.014 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:50 np0005465988 nova_compute[236126]: 2025-10-02 12:14:50.015 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:50 np0005465988 nova_compute[236126]: 2025-10-02 12:14:50.015 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:50 np0005465988 nova_compute[236126]: 2025-10-02 12:14:50.043 2 DEBUG nova.storage.rbd_utils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] rbd image 7a82919b-9ebf-48bc-9416-81be5ea3fc18_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:14:50 np0005465988 nova_compute[236126]: 2025-10-02 12:14:50.048 2 DEBUG oslo_concurrency.processutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 7a82919b-9ebf-48bc-9416-81be5ea3fc18_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e211 e211: 3 total, 3 up, 3 in
Oct  2 08:14:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:50.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:50.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:50 np0005465988 nova_compute[236126]: 2025-10-02 12:14:50.869 2 DEBUG nova.network.neutron [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Successfully created port: 2b1d96dd-80af-466e-bcec-84d96c0ace5a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:14:51 np0005465988 nova_compute[236126]: 2025-10-02 12:14:51.682 2 DEBUG nova.network.neutron [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Successfully created port: 47f72a44-d893-407b-9a47-a23914996fd8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:14:52 np0005465988 nova_compute[236126]: 2025-10-02 12:14:52.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:14:52 np0005465988 nova_compute[236126]: 2025-10-02 12:14:52.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:52.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:52.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:53 np0005465988 nova_compute[236126]: 2025-10-02 12:14:53.291 2 DEBUG nova.network.neutron [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Successfully updated port: 2b1d96dd-80af-466e-bcec-84d96c0ace5a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:14:53 np0005465988 nova_compute[236126]: 2025-10-02 12:14:53.452 2 DEBUG nova.compute.manager [req-8eb58a30-25e3-41f4-8b0f-1a87e115d000 req-e17b58ef-1db5-4ba2-8792-503ff690b654 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-changed-2b1d96dd-80af-466e-bcec-84d96c0ace5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:53 np0005465988 nova_compute[236126]: 2025-10-02 12:14:53.452 2 DEBUG nova.compute.manager [req-8eb58a30-25e3-41f4-8b0f-1a87e115d000 req-e17b58ef-1db5-4ba2-8792-503ff690b654 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Refreshing instance network info cache due to event network-changed-2b1d96dd-80af-466e-bcec-84d96c0ace5a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:14:53 np0005465988 nova_compute[236126]: 2025-10-02 12:14:53.453 2 DEBUG oslo_concurrency.lockutils [req-8eb58a30-25e3-41f4-8b0f-1a87e115d000 req-e17b58ef-1db5-4ba2-8792-503ff690b654 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-7a82919b-9ebf-48bc-9416-81be5ea3fc18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:14:53 np0005465988 nova_compute[236126]: 2025-10-02 12:14:53.453 2 DEBUG oslo_concurrency.lockutils [req-8eb58a30-25e3-41f4-8b0f-1a87e115d000 req-e17b58ef-1db5-4ba2-8792-503ff690b654 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-7a82919b-9ebf-48bc-9416-81be5ea3fc18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:14:53 np0005465988 nova_compute[236126]: 2025-10-02 12:14:53.453 2 DEBUG nova.network.neutron [req-8eb58a30-25e3-41f4-8b0f-1a87e115d000 req-e17b58ef-1db5-4ba2-8792-503ff690b654 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Refreshing network info cache for port 2b1d96dd-80af-466e-bcec-84d96c0ace5a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:14:54 np0005465988 nova_compute[236126]: 2025-10-02 12:14:54.172 2 DEBUG nova.network.neutron [req-8eb58a30-25e3-41f4-8b0f-1a87e115d000 req-e17b58ef-1db5-4ba2-8792-503ff690b654 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:14:54 np0005465988 nova_compute[236126]: 2025-10-02 12:14:54.335 2 DEBUG oslo_concurrency.processutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 7a82919b-9ebf-48bc-9416-81be5ea3fc18_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.287s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:54 np0005465988 nova_compute[236126]: 2025-10-02 12:14:54.429 2 DEBUG nova.storage.rbd_utils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] resizing rbd image 7a82919b-9ebf-48bc-9416-81be5ea3fc18_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:14:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:54.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:54.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:55 np0005465988 nova_compute[236126]: 2025-10-02 12:14:55.021 2 DEBUG nova.network.neutron [req-8eb58a30-25e3-41f4-8b0f-1a87e115d000 req-e17b58ef-1db5-4ba2-8792-503ff690b654 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:55 np0005465988 nova_compute[236126]: 2025-10-02 12:14:55.067 2 DEBUG oslo_concurrency.lockutils [req-8eb58a30-25e3-41f4-8b0f-1a87e115d000 req-e17b58ef-1db5-4ba2-8792-503ff690b654 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-7a82919b-9ebf-48bc-9416-81be5ea3fc18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:14:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:14:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/299602845' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:14:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:14:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/299602845' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:14:55 np0005465988 nova_compute[236126]: 2025-10-02 12:14:55.309 2 DEBUG nova.network.neutron [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Successfully updated port: 47f72a44-d893-407b-9a47-a23914996fd8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:14:55 np0005465988 nova_compute[236126]: 2025-10-02 12:14:55.411 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "refresh_cache-7a82919b-9ebf-48bc-9416-81be5ea3fc18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:14:55 np0005465988 nova_compute[236126]: 2025-10-02 12:14:55.412 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquired lock "refresh_cache-7a82919b-9ebf-48bc-9416-81be5ea3fc18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:14:55 np0005465988 nova_compute[236126]: 2025-10-02 12:14:55.412 2 DEBUG nova.network.neutron [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:14:55 np0005465988 podman[262812]: 2025-10-02 12:14:55.543571906 +0000 UTC m=+0.073321545 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:14:55 np0005465988 podman[262813]: 2025-10-02 12:14:55.545604665 +0000 UTC m=+0.070684508 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:14:55 np0005465988 nova_compute[236126]: 2025-10-02 12:14:55.550 2 DEBUG nova.compute.manager [req-c7b4b08d-8819-4ff8-8a28-c8641a6a2972 req-22552b5c-f00a-42eb-8d01-98f3cfcd7f24 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-changed-47f72a44-d893-407b-9a47-a23914996fd8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:55 np0005465988 nova_compute[236126]: 2025-10-02 12:14:55.550 2 DEBUG nova.compute.manager [req-c7b4b08d-8819-4ff8-8a28-c8641a6a2972 req-22552b5c-f00a-42eb-8d01-98f3cfcd7f24 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Refreshing instance network info cache due to event network-changed-47f72a44-d893-407b-9a47-a23914996fd8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:14:55 np0005465988 nova_compute[236126]: 2025-10-02 12:14:55.550 2 DEBUG oslo_concurrency.lockutils [req-c7b4b08d-8819-4ff8-8a28-c8641a6a2972 req-22552b5c-f00a-42eb-8d01-98f3cfcd7f24 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-7a82919b-9ebf-48bc-9416-81be5ea3fc18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:14:55 np0005465988 podman[262811]: 2025-10-02 12:14:55.582601072 +0000 UTC m=+0.114663619 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:14:55 np0005465988 nova_compute[236126]: 2025-10-02 12:14:55.644 2 DEBUG nova.network.neutron [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:14:56 np0005465988 nova_compute[236126]: 2025-10-02 12:14:56.287 2 DEBUG nova.objects.instance [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lazy-loading 'migration_context' on Instance uuid 7a82919b-9ebf-48bc-9416-81be5ea3fc18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:14:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e212 e212: 3 total, 3 up, 3 in
Oct  2 08:14:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:56.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:14:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:56.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:14:57 np0005465988 nova_compute[236126]: 2025-10-02 12:14:57.094 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:14:57 np0005465988 nova_compute[236126]: 2025-10-02 12:14:57.094 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Ensure instance console log exists: /var/lib/nova/instances/7a82919b-9ebf-48bc-9416-81be5ea3fc18/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:14:57 np0005465988 nova_compute[236126]: 2025-10-02 12:14:57.094 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:57 np0005465988 nova_compute[236126]: 2025-10-02 12:14:57.095 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:57 np0005465988 nova_compute[236126]: 2025-10-02 12:14:57.095 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:57 np0005465988 nova_compute[236126]: 2025-10-02 12:14:57.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:14:57 np0005465988 nova_compute[236126]: 2025-10-02 12:14:57.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:14:58.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:14:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:14:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:14:58.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:14:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e213 e213: 3 total, 3 up, 3 in
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.300317) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407299300393, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 559, "num_deletes": 252, "total_data_size": 804714, "memory_usage": 816216, "flush_reason": "Manual Compaction"}
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407299306816, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 520019, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 35896, "largest_seqno": 36450, "table_properties": {"data_size": 517003, "index_size": 988, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7610, "raw_average_key_size": 19, "raw_value_size": 510696, "raw_average_value_size": 1336, "num_data_blocks": 43, "num_entries": 382, "num_filter_entries": 382, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759407278, "oldest_key_time": 1759407278, "file_creation_time": 1759407299, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 6550 microseconds, and 3165 cpu microseconds.
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.306865) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 520019 bytes OK
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.306887) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.308542) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.308569) EVENT_LOG_v1 {"time_micros": 1759407299308561, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.308594) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 801401, prev total WAL file size 801401, number of live WAL files 2.
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.309335) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(507KB)], [66(9273KB)]
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407299309416, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 10016048, "oldest_snapshot_seqno": -1}
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 5885 keys, 8184739 bytes, temperature: kUnknown
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407299361256, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 8184739, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8146924, "index_size": 22025, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14725, "raw_key_size": 151301, "raw_average_key_size": 25, "raw_value_size": 8042647, "raw_average_value_size": 1366, "num_data_blocks": 880, "num_entries": 5885, "num_filter_entries": 5885, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759407299, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.361946) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 8184739 bytes
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.363609) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 191.4 rd, 156.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 9.1 +0.0 blob) out(7.8 +0.0 blob), read-write-amplify(35.0) write-amplify(15.7) OK, records in: 6406, records dropped: 521 output_compression: NoCompression
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.363661) EVENT_LOG_v1 {"time_micros": 1759407299363641, "job": 40, "event": "compaction_finished", "compaction_time_micros": 52319, "compaction_time_cpu_micros": 37189, "output_level": 6, "num_output_files": 1, "total_output_size": 8184739, "num_input_records": 6406, "num_output_records": 5885, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407299364686, "job": 40, "event": "table_file_deletion", "file_number": 68}
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407299368313, "job": 40, "event": "table_file_deletion", "file_number": 66}
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.309247) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.368352) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.368356) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.368392) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.368394) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:14:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:14:59.368396) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:15:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:00.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:00.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:00 np0005465988 nova_compute[236126]: 2025-10-02 12:15:00.958 2 DEBUG nova.network.neutron [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Updating instance_info_cache with network_info: [{"id": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "address": "fa:16:3e:e7:2b:4d", "network": {"id": "2f21da81-29bc-4940-a16a-f2ffdd3e65c5", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1339475481", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.93", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b1d96dd-80", "ovs_interfaceid": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "47f72a44-d893-407b-9a47-a23914996fd8", "address": "fa:16:3e:41:46:9b", "network": {"id": "0d17f40d-92af-4ebd-87c4-5dfcec0eed15", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-876470846", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47f72a44-d8", "ovs_interfaceid": "47f72a44-d893-407b-9a47-a23914996fd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:00 np0005465988 nova_compute[236126]: 2025-10-02 12:15:00.995 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Releasing lock "refresh_cache-7a82919b-9ebf-48bc-9416-81be5ea3fc18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:00 np0005465988 nova_compute[236126]: 2025-10-02 12:15:00.996 2 DEBUG nova.compute.manager [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Instance network_info: |[{"id": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "address": "fa:16:3e:e7:2b:4d", "network": {"id": "2f21da81-29bc-4940-a16a-f2ffdd3e65c5", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1339475481", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.93", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b1d96dd-80", "ovs_interfaceid": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "47f72a44-d893-407b-9a47-a23914996fd8", "address": "fa:16:3e:41:46:9b", "network": {"id": "0d17f40d-92af-4ebd-87c4-5dfcec0eed15", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-876470846", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": 
"ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47f72a44-d8", "ovs_interfaceid": "47f72a44-d893-407b-9a47-a23914996fd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:15:00 np0005465988 nova_compute[236126]: 2025-10-02 12:15:00.996 2 DEBUG oslo_concurrency.lockutils [req-c7b4b08d-8819-4ff8-8a28-c8641a6a2972 req-22552b5c-f00a-42eb-8d01-98f3cfcd7f24 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-7a82919b-9ebf-48bc-9416-81be5ea3fc18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:00 np0005465988 nova_compute[236126]: 2025-10-02 12:15:00.996 2 DEBUG nova.network.neutron [req-c7b4b08d-8819-4ff8-8a28-c8641a6a2972 req-22552b5c-f00a-42eb-8d01-98f3cfcd7f24 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Refreshing network info cache for port 47f72a44-d893-407b-9a47-a23914996fd8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.001 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Start _get_guest_xml network_info=[{"id": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "address": "fa:16:3e:e7:2b:4d", "network": {"id": "2f21da81-29bc-4940-a16a-f2ffdd3e65c5", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1339475481", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.93", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b1d96dd-80", "ovs_interfaceid": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "47f72a44-d893-407b-9a47-a23914996fd8", "address": "fa:16:3e:41:46:9b", "network": {"id": "0d17f40d-92af-4ebd-87c4-5dfcec0eed15", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-876470846", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47f72a44-d8", "ovs_interfaceid": "47f72a44-d893-407b-9a47-a23914996fd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.006 2 WARNING nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.018 2 DEBUG nova.virt.libvirt.host [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.019 2 DEBUG nova.virt.libvirt.host [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.029 2 DEBUG nova.virt.libvirt.host [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.030 2 DEBUG nova.virt.libvirt.host [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.032 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.032 2 DEBUG nova.virt.hardware [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.033 2 DEBUG nova.virt.hardware [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.033 2 DEBUG nova.virt.hardware [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.034 2 DEBUG nova.virt.hardware [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.034 2 DEBUG nova.virt.hardware [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.034 2 DEBUG nova.virt.hardware [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.035 2 DEBUG nova.virt.hardware [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.035 2 DEBUG nova.virt.hardware [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.035 2 DEBUG nova.virt.hardware [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.036 2 DEBUG nova.virt.hardware [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.036 2 DEBUG nova.virt.hardware [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.040 2 DEBUG oslo_concurrency.processutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:15:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3679141303' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.491 2 DEBUG oslo_concurrency.processutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.517 2 DEBUG nova.storage.rbd_utils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] rbd image 7a82919b-9ebf-48bc-9416-81be5ea3fc18_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.522 2 DEBUG oslo_concurrency.processutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:15:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3454801387' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.974 2 DEBUG oslo_concurrency.processutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.976 2 DEBUG nova.virt.libvirt.vif [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2093437597',display_name='tempest-ServersTestMultiNic-server-2093437597',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2093437597',id=65,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-g9uyprv7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-1178748303-p
roject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:49Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=7a82919b-9ebf-48bc-9416-81be5ea3fc18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "address": "fa:16:3e:e7:2b:4d", "network": {"id": "2f21da81-29bc-4940-a16a-f2ffdd3e65c5", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1339475481", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.93", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b1d96dd-80", "ovs_interfaceid": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.977 2 DEBUG nova.network.os_vif_util [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "address": "fa:16:3e:e7:2b:4d", "network": {"id": "2f21da81-29bc-4940-a16a-f2ffdd3e65c5", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1339475481", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.93", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b1d96dd-80", "ovs_interfaceid": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.978 2 DEBUG nova.network.os_vif_util [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:2b:4d,bridge_name='br-int',has_traffic_filtering=True,id=2b1d96dd-80af-466e-bcec-84d96c0ace5a,network=Network(2f21da81-29bc-4940-a16a-f2ffdd3e65c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b1d96dd-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.979 2 DEBUG nova.virt.libvirt.vif [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2093437597',display_name='tempest-ServersTestMultiNic-server-2093437597',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2093437597',id=65,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-g9uyprv7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-1178748303-p
roject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:49Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=7a82919b-9ebf-48bc-9416-81be5ea3fc18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47f72a44-d893-407b-9a47-a23914996fd8", "address": "fa:16:3e:41:46:9b", "network": {"id": "0d17f40d-92af-4ebd-87c4-5dfcec0eed15", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-876470846", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47f72a44-d8", "ovs_interfaceid": "47f72a44-d893-407b-9a47-a23914996fd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.979 2 DEBUG nova.network.os_vif_util [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "47f72a44-d893-407b-9a47-a23914996fd8", "address": "fa:16:3e:41:46:9b", "network": {"id": "0d17f40d-92af-4ebd-87c4-5dfcec0eed15", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-876470846", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47f72a44-d8", "ovs_interfaceid": "47f72a44-d893-407b-9a47-a23914996fd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.980 2 DEBUG nova.network.os_vif_util [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:46:9b,bridge_name='br-int',has_traffic_filtering=True,id=47f72a44-d893-407b-9a47-a23914996fd8,network=Network(0d17f40d-92af-4ebd-87c4-5dfcec0eed15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47f72a44-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:01 np0005465988 nova_compute[236126]: 2025-10-02 12:15:01.981 2 DEBUG nova.objects.instance [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lazy-loading 'pci_devices' on Instance uuid 7a82919b-9ebf-48bc-9416-81be5ea3fc18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.001 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  <uuid>7a82919b-9ebf-48bc-9416-81be5ea3fc18</uuid>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  <name>instance-00000041</name>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServersTestMultiNic-server-2093437597</nova:name>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:15:01</nova:creationTime>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <nova:user uuid="d107dd863d2e4a56853a0b758cb2c110">tempest-ServersTestMultiNic-1178748303-project-member</nova:user>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <nova:project uuid="ba87091f122a4afabe0a62682078fece">tempest-ServersTestMultiNic-1178748303</nova:project>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <nova:port uuid="2b1d96dd-80af-466e-bcec-84d96c0ace5a">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.93" ipVersion="4"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <nova:port uuid="47f72a44-d893-407b-9a47-a23914996fd8">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.1.97" ipVersion="4"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <entry name="serial">7a82919b-9ebf-48bc-9416-81be5ea3fc18</entry>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <entry name="uuid">7a82919b-9ebf-48bc-9416-81be5ea3fc18</entry>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/7a82919b-9ebf-48bc-9416-81be5ea3fc18_disk">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/7a82919b-9ebf-48bc-9416-81be5ea3fc18_disk.config">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:e7:2b:4d"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <target dev="tap2b1d96dd-80"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:41:46:9b"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <target dev="tap47f72a44-d8"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/7a82919b-9ebf-48bc-9416-81be5ea3fc18/console.log" append="off"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:15:02 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:15:02 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:15:02 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:15:02 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.002 2 DEBUG nova.compute.manager [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Preparing to wait for external event network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.003 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.003 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.003 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.004 2 DEBUG nova.compute.manager [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Preparing to wait for external event network-vif-plugged-47f72a44-d893-407b-9a47-a23914996fd8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.004 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.004 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.004 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.005 2 DEBUG nova.virt.libvirt.vif [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2093437597',display_name='tempest-ServersTestMultiNic-server-2093437597',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2093437597',id=65,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-g9uyprv7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-1178748303-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:49Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=7a82919b-9ebf-48bc-9416-81be5ea3fc18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "address": "fa:16:3e:e7:2b:4d", "network": {"id": "2f21da81-29bc-4940-a16a-f2ffdd3e65c5", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1339475481", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.93", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b1d96dd-80", "ovs_interfaceid": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.005 2 DEBUG nova.network.os_vif_util [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "address": "fa:16:3e:e7:2b:4d", "network": {"id": "2f21da81-29bc-4940-a16a-f2ffdd3e65c5", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1339475481", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.93", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b1d96dd-80", "ovs_interfaceid": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.006 2 DEBUG nova.network.os_vif_util [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:2b:4d,bridge_name='br-int',has_traffic_filtering=True,id=2b1d96dd-80af-466e-bcec-84d96c0ace5a,network=Network(2f21da81-29bc-4940-a16a-f2ffdd3e65c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b1d96dd-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.006 2 DEBUG os_vif [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:2b:4d,bridge_name='br-int',has_traffic_filtering=True,id=2b1d96dd-80af-466e-bcec-84d96c0ace5a,network=Network(2f21da81-29bc-4940-a16a-f2ffdd3e65c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b1d96dd-80') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.007 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.007 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.011 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2b1d96dd-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.011 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2b1d96dd-80, col_values=(('external_ids', {'iface-id': '2b1d96dd-80af-466e-bcec-84d96c0ace5a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:2b:4d', 'vm-uuid': '7a82919b-9ebf-48bc-9416-81be5ea3fc18'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:02 np0005465988 NetworkManager[45041]: <info>  [1759407302.0140] manager: (tap2b1d96dd-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.021 2 INFO os_vif [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:2b:4d,bridge_name='br-int',has_traffic_filtering=True,id=2b1d96dd-80af-466e-bcec-84d96c0ace5a,network=Network(2f21da81-29bc-4940-a16a-f2ffdd3e65c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b1d96dd-80')#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.022 2 DEBUG nova.virt.libvirt.vif [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2093437597',display_name='tempest-ServersTestMultiNic-server-2093437597',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2093437597',id=65,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-g9uyprv7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-1178748303-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:49Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=7a82919b-9ebf-48bc-9416-81be5ea3fc18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47f72a44-d893-407b-9a47-a23914996fd8", "address": "fa:16:3e:41:46:9b", "network": {"id": "0d17f40d-92af-4ebd-87c4-5dfcec0eed15", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-876470846", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47f72a44-d8", "ovs_interfaceid": "47f72a44-d893-407b-9a47-a23914996fd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.022 2 DEBUG nova.network.os_vif_util [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "47f72a44-d893-407b-9a47-a23914996fd8", "address": "fa:16:3e:41:46:9b", "network": {"id": "0d17f40d-92af-4ebd-87c4-5dfcec0eed15", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-876470846", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47f72a44-d8", "ovs_interfaceid": "47f72a44-d893-407b-9a47-a23914996fd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.023 2 DEBUG nova.network.os_vif_util [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:46:9b,bridge_name='br-int',has_traffic_filtering=True,id=47f72a44-d893-407b-9a47-a23914996fd8,network=Network(0d17f40d-92af-4ebd-87c4-5dfcec0eed15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47f72a44-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.023 2 DEBUG os_vif [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:46:9b,bridge_name='br-int',has_traffic_filtering=True,id=47f72a44-d893-407b-9a47-a23914996fd8,network=Network(0d17f40d-92af-4ebd-87c4-5dfcec0eed15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47f72a44-d8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.024 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.025 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.027 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap47f72a44-d8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.027 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap47f72a44-d8, col_values=(('external_ids', {'iface-id': '47f72a44-d893-407b-9a47-a23914996fd8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:41:46:9b', 'vm-uuid': '7a82919b-9ebf-48bc-9416-81be5ea3fc18'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:02 np0005465988 NetworkManager[45041]: <info>  [1759407302.0305] manager: (tap47f72a44-d8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.036 2 INFO os_vif [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:46:9b,bridge_name='br-int',has_traffic_filtering=True,id=47f72a44-d893-407b-9a47-a23914996fd8,network=Network(0d17f40d-92af-4ebd-87c4-5dfcec0eed15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47f72a44-d8')#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.095 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.096 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.096 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] No VIF found with MAC fa:16:3e:e7:2b:4d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.096 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] No VIF found with MAC fa:16:3e:41:46:9b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.097 2 INFO nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Using config drive#033[00m
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.126 2 DEBUG nova.storage.rbd_utils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] rbd image 7a82919b-9ebf-48bc-9416-81be5ea3fc18_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:15:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e213 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:15:02 np0005465988 nova_compute[236126]: 2025-10-02 12:15:02.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:02.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:02.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:03 np0005465988 nova_compute[236126]: 2025-10-02 12:15:03.081 2 INFO nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Creating config drive at /var/lib/nova/instances/7a82919b-9ebf-48bc-9416-81be5ea3fc18/disk.config#033[00m
Oct  2 08:15:03 np0005465988 nova_compute[236126]: 2025-10-02 12:15:03.086 2 DEBUG oslo_concurrency.processutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7a82919b-9ebf-48bc-9416-81be5ea3fc18/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp03laeu6t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:03 np0005465988 nova_compute[236126]: 2025-10-02 12:15:03.224 2 DEBUG oslo_concurrency.processutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7a82919b-9ebf-48bc-9416-81be5ea3fc18/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp03laeu6t" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:03 np0005465988 nova_compute[236126]: 2025-10-02 12:15:03.266 2 DEBUG nova.storage.rbd_utils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] rbd image 7a82919b-9ebf-48bc-9416-81be5ea3fc18_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:15:03 np0005465988 nova_compute[236126]: 2025-10-02 12:15:03.270 2 DEBUG oslo_concurrency.processutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7a82919b-9ebf-48bc-9416-81be5ea3fc18/disk.config 7a82919b-9ebf-48bc-9416-81be5ea3fc18_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e214 e214: 3 total, 3 up, 3 in
Oct  2 08:15:03 np0005465988 nova_compute[236126]: 2025-10-02 12:15:03.737 2 DEBUG nova.network.neutron [req-c7b4b08d-8819-4ff8-8a28-c8641a6a2972 req-22552b5c-f00a-42eb-8d01-98f3cfcd7f24 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Updated VIF entry in instance network info cache for port 47f72a44-d893-407b-9a47-a23914996fd8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:15:03 np0005465988 nova_compute[236126]: 2025-10-02 12:15:03.738 2 DEBUG nova.network.neutron [req-c7b4b08d-8819-4ff8-8a28-c8641a6a2972 req-22552b5c-f00a-42eb-8d01-98f3cfcd7f24 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Updating instance_info_cache with network_info: [{"id": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "address": "fa:16:3e:e7:2b:4d", "network": {"id": "2f21da81-29bc-4940-a16a-f2ffdd3e65c5", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1339475481", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.93", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b1d96dd-80", "ovs_interfaceid": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "47f72a44-d893-407b-9a47-a23914996fd8", "address": "fa:16:3e:41:46:9b", "network": {"id": "0d17f40d-92af-4ebd-87c4-5dfcec0eed15", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-876470846", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47f72a44-d8", "ovs_interfaceid": "47f72a44-d893-407b-9a47-a23914996fd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:03 np0005465988 nova_compute[236126]: 2025-10-02 12:15:03.768 2 DEBUG oslo_concurrency.lockutils [req-c7b4b08d-8819-4ff8-8a28-c8641a6a2972 req-22552b5c-f00a-42eb-8d01-98f3cfcd7f24 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-7a82919b-9ebf-48bc-9416-81be5ea3fc18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:03 np0005465988 nova_compute[236126]: 2025-10-02 12:15:03.874 2 DEBUG oslo_concurrency.processutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7a82919b-9ebf-48bc-9416-81be5ea3fc18/disk.config 7a82919b-9ebf-48bc-9416-81be5ea3fc18_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.604s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:03 np0005465988 nova_compute[236126]: 2025-10-02 12:15:03.875 2 INFO nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Deleting local config drive /var/lib/nova/instances/7a82919b-9ebf-48bc-9416-81be5ea3fc18/disk.config because it was imported into RBD.#033[00m
Oct  2 08:15:03 np0005465988 kernel: tap2b1d96dd-80: entered promiscuous mode
Oct  2 08:15:03 np0005465988 NetworkManager[45041]: <info>  [1759407303.9492] manager: (tap2b1d96dd-80): new Tun device (/org/freedesktop/NetworkManager/Devices/100)
Oct  2 08:15:03 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:03Z|00174|binding|INFO|Claiming lport 2b1d96dd-80af-466e-bcec-84d96c0ace5a for this chassis.
Oct  2 08:15:03 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:03Z|00175|binding|INFO|2b1d96dd-80af-466e-bcec-84d96c0ace5a: Claiming fa:16:3e:e7:2b:4d 10.100.0.93
Oct  2 08:15:03 np0005465988 nova_compute[236126]: 2025-10-02 12:15:03.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:03.981 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:2b:4d 10.100.0.93'], port_security=['fa:16:3e:e7:2b:4d 10.100.0.93'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.93/24', 'neutron:device_id': '7a82919b-9ebf-48bc-9416-81be5ea3fc18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f21da81-29bc-4940-a16a-f2ffdd3e65c5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba87091f122a4afabe0a62682078fece', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2e045e2-9997-47d6-8453-aba0f450cdb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0505e98-51cb-4c2b-b458-c879d6cb0928, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=2b1d96dd-80af-466e-bcec-84d96c0ace5a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:03 np0005465988 NetworkManager[45041]: <info>  [1759407303.9839] manager: (tap47f72a44-d8): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Oct  2 08:15:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:03.983 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 2b1d96dd-80af-466e-bcec-84d96c0ace5a in datapath 2f21da81-29bc-4940-a16a-f2ffdd3e65c5 bound to our chassis#033[00m
Oct  2 08:15:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:03.987 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2f21da81-29bc-4940-a16a-f2ffdd3e65c5#033[00m
Oct  2 08:15:04 np0005465988 systemd-udevd[263086]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:15:04 np0005465988 systemd-udevd[263087]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.005 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8a9a03ee-c73c-48ad-9d22-3951cf3587d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.006 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2f21da81-21 in ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.008 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2f21da81-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.008 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[798aeec7-4c39-4c8f-b3ec-548a6b584886]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.010 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c868728b-c2df-4d85-b8fd-2b3595424415]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 kernel: tap47f72a44-d8: entered promiscuous mode
Oct  2 08:15:04 np0005465988 systemd-machined[192594]: New machine qemu-24-instance-00000041.
Oct  2 08:15:04 np0005465988 NetworkManager[45041]: <info>  [1759407304.0377] device (tap47f72a44-d8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.039 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[c581bf50-5f96-4c78-bbb8-749273f809c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:04Z|00176|binding|INFO|Claiming lport 47f72a44-d893-407b-9a47-a23914996fd8 for this chassis.
Oct  2 08:15:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:04Z|00177|binding|INFO|47f72a44-d893-407b-9a47-a23914996fd8: Claiming fa:16:3e:41:46:9b 10.100.1.97
Oct  2 08:15:04 np0005465988 nova_compute[236126]: 2025-10-02 12:15:04.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:04 np0005465988 NetworkManager[45041]: <info>  [1759407304.0536] device (tap2b1d96dd-80): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:15:04 np0005465988 systemd[1]: Started Virtual Machine qemu-24-instance-00000041.
Oct  2 08:15:04 np0005465988 NetworkManager[45041]: <info>  [1759407304.0551] device (tap47f72a44-d8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:15:04 np0005465988 NetworkManager[45041]: <info>  [1759407304.0560] device (tap2b1d96dd-80): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:15:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:04Z|00178|binding|INFO|Setting lport 2b1d96dd-80af-466e-bcec-84d96c0ace5a ovn-installed in OVS
Oct  2 08:15:04 np0005465988 nova_compute[236126]: 2025-10-02 12:15:04.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.064 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:46:9b 10.100.1.97'], port_security=['fa:16:3e:41:46:9b 10.100.1.97'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.97/24', 'neutron:device_id': '7a82919b-9ebf-48bc-9416-81be5ea3fc18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d17f40d-92af-4ebd-87c4-5dfcec0eed15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba87091f122a4afabe0a62682078fece', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2e045e2-9997-47d6-8453-aba0f450cdb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11dd8f30-12dc-4708-8370-c41b02466921, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=47f72a44-d893-407b-9a47-a23914996fd8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:04Z|00179|binding|INFO|Setting lport 2b1d96dd-80af-466e-bcec-84d96c0ace5a up in Southbound
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.076 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dc0e5a9c-bdbe-47af-b92c-341e72fc701a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:04Z|00180|binding|INFO|Setting lport 47f72a44-d893-407b-9a47-a23914996fd8 ovn-installed in OVS
Oct  2 08:15:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:04Z|00181|binding|INFO|Setting lport 47f72a44-d893-407b-9a47-a23914996fd8 up in Southbound
Oct  2 08:15:04 np0005465988 nova_compute[236126]: 2025-10-02 12:15:04.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.122 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[3de4a41b-b6bb-4c33-9b38-cf26e82e2eeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.126 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[57f94c16-b108-420c-81b9-46f7036ea40b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 NetworkManager[45041]: <info>  [1759407304.1402] manager: (tap2f21da81-20): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.160 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[7df1fb5e-93e7-40e7-8fd0-b46538aed2ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.163 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2e4d39ba-bf8b-4d61-ae2a-faea5722c951]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 NetworkManager[45041]: <info>  [1759407304.1898] device (tap2f21da81-20): carrier: link connected
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.196 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c6449b49-6fa4-4676-9afc-2d5fb36b3467]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.222 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a229e610-f13c-4529-a9ee-27b538b2b7b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2f21da81-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:20:b3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536452, 'reachable_time': 24483, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263122, 'error': None, 'target': 'ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.237 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0b6b7c0f-d1f6-4ecc-8084-00e830b57bbc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:20b3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 536452, 'tstamp': 536452}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 263123, 'error': None, 'target': 'ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.260 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ad7f3575-6ae9-4200-8210-e63dd3f2a832]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2f21da81-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:20:b3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536452, 'reachable_time': 24483, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 263124, 'error': None, 'target': 'ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.295 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e2058cb1-6edc-4f19-b8d8-3cceedeb6359]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.368 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[18f5fdf9-8755-4e27-a4d1-9c96e051abdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.369 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2f21da81-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.369 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.370 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2f21da81-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:04 np0005465988 nova_compute[236126]: 2025-10-02 12:15:04.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:04 np0005465988 kernel: tap2f21da81-20: entered promiscuous mode
Oct  2 08:15:04 np0005465988 NetworkManager[45041]: <info>  [1759407304.3724] manager: (tap2f21da81-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Oct  2 08:15:04 np0005465988 nova_compute[236126]: 2025-10-02 12:15:04.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.376 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2f21da81-20, col_values=(('external_ids', {'iface-id': '55b93d99-a115-4541-aa1d-1e079bab887f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:04 np0005465988 nova_compute[236126]: 2025-10-02 12:15:04.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:04Z|00182|binding|INFO|Releasing lport 55b93d99-a115-4541-aa1d-1e079bab887f from this chassis (sb_readonly=0)
Oct  2 08:15:04 np0005465988 nova_compute[236126]: 2025-10-02 12:15:04.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.399 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2f21da81-29bc-4940-a16a-f2ffdd3e65c5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2f21da81-29bc-4940-a16a-f2ffdd3e65c5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.399 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[65f47619-0f76-454a-9103-5f963f30c560]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.400 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-2f21da81-29bc-4940-a16a-f2ffdd3e65c5
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/2f21da81-29bc-4940-a16a-f2ffdd3e65c5.pid.haproxy
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 2f21da81-29bc-4940-a16a-f2ffdd3e65c5
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.401 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5', 'env', 'PROCESS_TAG=haproxy-2f21da81-29bc-4940-a16a-f2ffdd3e65c5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2f21da81-29bc-4940-a16a-f2ffdd3e65c5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:15:04 np0005465988 nova_compute[236126]: 2025-10-02 12:15:04.476 2 DEBUG nova.compute.manager [req-0fbffb3d-9a5e-4c1c-9435-59f92a3638b8 req-98f0f463-4259-4d8f-8b2e-f98e07776d88 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:04 np0005465988 nova_compute[236126]: 2025-10-02 12:15:04.477 2 DEBUG oslo_concurrency.lockutils [req-0fbffb3d-9a5e-4c1c-9435-59f92a3638b8 req-98f0f463-4259-4d8f-8b2e-f98e07776d88 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:04 np0005465988 nova_compute[236126]: 2025-10-02 12:15:04.483 2 DEBUG oslo_concurrency.lockutils [req-0fbffb3d-9a5e-4c1c-9435-59f92a3638b8 req-98f0f463-4259-4d8f-8b2e-f98e07776d88 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:04 np0005465988 nova_compute[236126]: 2025-10-02 12:15:04.483 2 DEBUG oslo_concurrency.lockutils [req-0fbffb3d-9a5e-4c1c-9435-59f92a3638b8 req-98f0f463-4259-4d8f-8b2e-f98e07776d88 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:04 np0005465988 nova_compute[236126]: 2025-10-02 12:15:04.484 2 DEBUG nova.compute.manager [req-0fbffb3d-9a5e-4c1c-9435-59f92a3638b8 req-98f0f463-4259-4d8f-8b2e-f98e07776d88 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Processing event network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:15:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:04.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:04 np0005465988 podman[263198]: 2025-10-02 12:15:04.794451907 +0000 UTC m=+0.051434628 container create a5f06b14cd2bbc42308c09ce37ef15e55955dabf093566fc06ed34a44195f87c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:15:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:04.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:04 np0005465988 systemd[1]: Started libpod-conmon-a5f06b14cd2bbc42308c09ce37ef15e55955dabf093566fc06ed34a44195f87c.scope.
Oct  2 08:15:04 np0005465988 podman[263198]: 2025-10-02 12:15:04.767915714 +0000 UTC m=+0.024898425 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:15:04 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:15:04 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12f50a11c04cee8c44506d1dfb16b550b64e656096b07aafc9af62b6e5d0be01/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:15:04 np0005465988 podman[263198]: 2025-10-02 12:15:04.897313051 +0000 UTC m=+0.154295832 container init a5f06b14cd2bbc42308c09ce37ef15e55955dabf093566fc06ed34a44195f87c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:15:04 np0005465988 podman[263198]: 2025-10-02 12:15:04.904396367 +0000 UTC m=+0.161379118 container start a5f06b14cd2bbc42308c09ce37ef15e55955dabf093566fc06ed34a44195f87c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:15:04 np0005465988 neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5[263214]: [NOTICE]   (263218) : New worker (263220) forked
Oct  2 08:15:04 np0005465988 neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5[263214]: [NOTICE]   (263218) : Loading success.
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.974 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 47f72a44-d893-407b-9a47-a23914996fd8 in datapath 0d17f40d-92af-4ebd-87c4-5dfcec0eed15 unbound from our chassis#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.977 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0d17f40d-92af-4ebd-87c4-5dfcec0eed15#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.989 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ead4379f-44ae-48f5-be9e-e82321d0a4c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.990 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0d17f40d-91 in ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.992 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0d17f40d-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.992 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f82a7143-9759-4e3a-b038-55756993e326]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:04.994 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0e1b9d28-2550-4d75-a487-4388d27864a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.007 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[2085330d-8d6f-407f-beb7-ad9a9b9f3af3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.020 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bd19855d-501f-49bb-8909-ccfcf5f0a921]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.056 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[1d932fed-bdc1-4b5f-8a7b-e68608b5edbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:05 np0005465988 systemd-udevd[263107]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:15:05 np0005465988 NetworkManager[45041]: <info>  [1759407305.0722] manager: (tap0d17f40d-90): new Veth device (/org/freedesktop/NetworkManager/Devices/104)
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.066 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5348e895-3367-404d-8187-94ce2bcff318]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.114 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d725fbf0-42e9-414b-b1a7-af95c37ff430]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:05 np0005465988 nova_compute[236126]: 2025-10-02 12:15:05.116 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407305.1157205, 7a82919b-9ebf-48bc-9416-81be5ea3fc18 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:05 np0005465988 nova_compute[236126]: 2025-10-02 12:15:05.117 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] VM Started (Lifecycle Event)#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.118 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8052145b-75a4-4254-89db-94b50eb36a12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:05 np0005465988 nova_compute[236126]: 2025-10-02 12:15:05.139 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:05 np0005465988 nova_compute[236126]: 2025-10-02 12:15:05.143 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407305.1159275, 7a82919b-9ebf-48bc-9416-81be5ea3fc18 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:05 np0005465988 nova_compute[236126]: 2025-10-02 12:15:05.143 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:15:05 np0005465988 NetworkManager[45041]: <info>  [1759407305.1479] device (tap0d17f40d-90): carrier: link connected
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.154 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c4e8d986-cf52-4f40-b8ac-4e63ba434b72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:05 np0005465988 nova_compute[236126]: 2025-10-02 12:15:05.173 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.176 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e9356fe1-68dc-4789-a0d4-d2177dc91650]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0d17f40d-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:7b:75'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536547, 'reachable_time': 27140, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263239, 'error': None, 'target': 'ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:05 np0005465988 nova_compute[236126]: 2025-10-02 12:15:05.178 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.194 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7d1aacbe-1457-4bbf-9133-6de6a21b42cb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:7b75'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 536547, 'tstamp': 536547}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 263240, 'error': None, 'target': 'ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:05 np0005465988 nova_compute[236126]: 2025-10-02 12:15:05.208 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.212 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[905c33ea-7989-4611-b463-c8da6fe7eb7d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0d17f40d-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:7b:75'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536547, 'reachable_time': 27140, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 263241, 'error': None, 'target': 'ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.254 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5971b5c0-422b-457a-ac7e-a8b0fd2e69c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.322 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dc3cdf28-eade-4c5f-beeb-eca86d0b42e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.323 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d17f40d-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.323 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.324 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d17f40d-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:05 np0005465988 NetworkManager[45041]: <info>  [1759407305.3263] manager: (tap0d17f40d-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Oct  2 08:15:05 np0005465988 kernel: tap0d17f40d-90: entered promiscuous mode
Oct  2 08:15:05 np0005465988 nova_compute[236126]: 2025-10-02 12:15:05.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.329 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0d17f40d-90, col_values=(('external_ids', {'iface-id': '5560d662-0dd6-4c0d-aa23-f575bfcd6933'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:05 np0005465988 nova_compute[236126]: 2025-10-02 12:15:05.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:05Z|00183|binding|INFO|Releasing lport 5560d662-0dd6-4c0d-aa23-f575bfcd6933 from this chassis (sb_readonly=0)
Oct  2 08:15:05 np0005465988 nova_compute[236126]: 2025-10-02 12:15:05.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.355 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0d17f40d-92af-4ebd-87c4-5dfcec0eed15.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0d17f40d-92af-4ebd-87c4-5dfcec0eed15.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.357 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dad1b152-f8bd-4f7d-aad7-7343bf79848c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.357 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-0d17f40d-92af-4ebd-87c4-5dfcec0eed15
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/0d17f40d-92af-4ebd-87c4-5dfcec0eed15.pid.haproxy
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 0d17f40d-92af-4ebd-87c4-5dfcec0eed15
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:15:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:05.358 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15', 'env', 'PROCESS_TAG=haproxy-0d17f40d-92af-4ebd-87c4-5dfcec0eed15', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0d17f40d-92af-4ebd-87c4-5dfcec0eed15.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:15:05 np0005465988 podman[263273]: 2025-10-02 12:15:05.798900743 +0000 UTC m=+0.073368197 container create f482e4a697c3658d54ba5a5fef0f8577975c8bf799f1e044c43518dde0d07027 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:15:05 np0005465988 systemd[1]: Started libpod-conmon-f482e4a697c3658d54ba5a5fef0f8577975c8bf799f1e044c43518dde0d07027.scope.
Oct  2 08:15:05 np0005465988 podman[263273]: 2025-10-02 12:15:05.766416497 +0000 UTC m=+0.040883951 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:15:05 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:15:05 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b3a6f90b1cd37677f0c2ae5898a7d9f8bcd3bba3b31e1fa9847f5aac49c9f16/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:15:05 np0005465988 podman[263273]: 2025-10-02 12:15:05.911671605 +0000 UTC m=+0.186139099 container init f482e4a697c3658d54ba5a5fef0f8577975c8bf799f1e044c43518dde0d07027 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:15:05 np0005465988 podman[263273]: 2025-10-02 12:15:05.921514662 +0000 UTC m=+0.195982116 container start f482e4a697c3658d54ba5a5fef0f8577975c8bf799f1e044c43518dde0d07027 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:15:05 np0005465988 neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15[263288]: [NOTICE]   (263292) : New worker (263294) forked
Oct  2 08:15:05 np0005465988 neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15[263288]: [NOTICE]   (263292) : Loading success.
Oct  2 08:15:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:06.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.674 2 DEBUG nova.compute.manager [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.674 2 DEBUG oslo_concurrency.lockutils [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.675 2 DEBUG oslo_concurrency.lockutils [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.675 2 DEBUG oslo_concurrency.lockutils [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.676 2 DEBUG nova.compute.manager [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] No event matching network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a in dict_keys([('network-vif-plugged', '47f72a44-d893-407b-9a47-a23914996fd8')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.676 2 WARNING nova.compute.manager [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received unexpected event network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.676 2 DEBUG nova.compute.manager [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-vif-plugged-47f72a44-d893-407b-9a47-a23914996fd8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.676 2 DEBUG oslo_concurrency.lockutils [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.677 2 DEBUG oslo_concurrency.lockutils [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.677 2 DEBUG oslo_concurrency.lockutils [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.677 2 DEBUG nova.compute.manager [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Processing event network-vif-plugged-47f72a44-d893-407b-9a47-a23914996fd8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.677 2 DEBUG nova.compute.manager [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-vif-plugged-47f72a44-d893-407b-9a47-a23914996fd8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.678 2 DEBUG oslo_concurrency.lockutils [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.678 2 DEBUG oslo_concurrency.lockutils [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.678 2 DEBUG oslo_concurrency.lockutils [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.679 2 DEBUG nova.compute.manager [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] No waiting events found dispatching network-vif-plugged-47f72a44-d893-407b-9a47-a23914996fd8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.679 2 WARNING nova.compute.manager [req-2a14b56d-39c0-4c0e-9f6e-2ef2b0c39252 req-d1370ef9-92cf-40cc-8028-316517b36fe2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received unexpected event network-vif-plugged-47f72a44-d893-407b-9a47-a23914996fd8 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.680 2 DEBUG nova.compute.manager [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.685 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.685 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407306.685413, 7a82919b-9ebf-48bc-9416-81be5ea3fc18 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.685 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.690 2 INFO nova.virt.libvirt.driver [-] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Instance spawned successfully.#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.690 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.715 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.721 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.724 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.725 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.726 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.726 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.727 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.727 2 DEBUG nova.virt.libvirt.driver [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.777 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:15:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:06.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.876 2 INFO nova.compute.manager [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Took 17.05 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:15:06 np0005465988 nova_compute[236126]: 2025-10-02 12:15:06.877 2 DEBUG nova.compute.manager [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:07 np0005465988 nova_compute[236126]: 2025-10-02 12:15:07.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:07 np0005465988 nova_compute[236126]: 2025-10-02 12:15:07.068 2 INFO nova.compute.manager [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Took 18.37 seconds to build instance.#033[00m
Oct  2 08:15:07 np0005465988 nova_compute[236126]: 2025-10-02 12:15:07.140 2 DEBUG oslo_concurrency.lockutils [None req-598c06fe-53cf-4e1c-9612-ee1b313d9a6b d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:15:07 np0005465988 nova_compute[236126]: 2025-10-02 12:15:07.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:08.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:08.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:09 np0005465988 nova_compute[236126]: 2025-10-02 12:15:09.769 2 DEBUG oslo_concurrency.lockutils [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:09 np0005465988 nova_compute[236126]: 2025-10-02 12:15:09.770 2 DEBUG oslo_concurrency.lockutils [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:09 np0005465988 nova_compute[236126]: 2025-10-02 12:15:09.770 2 DEBUG oslo_concurrency.lockutils [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:09 np0005465988 nova_compute[236126]: 2025-10-02 12:15:09.771 2 DEBUG oslo_concurrency.lockutils [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:09 np0005465988 nova_compute[236126]: 2025-10-02 12:15:09.772 2 DEBUG oslo_concurrency.lockutils [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:09 np0005465988 nova_compute[236126]: 2025-10-02 12:15:09.774 2 INFO nova.compute.manager [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Terminating instance#033[00m
Oct  2 08:15:09 np0005465988 nova_compute[236126]: 2025-10-02 12:15:09.776 2 DEBUG nova.compute.manager [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:15:09 np0005465988 kernel: tap2b1d96dd-80 (unregistering): left promiscuous mode
Oct  2 08:15:09 np0005465988 NetworkManager[45041]: <info>  [1759407309.8304] device (tap2b1d96dd-80): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:15:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:09Z|00184|binding|INFO|Releasing lport 2b1d96dd-80af-466e-bcec-84d96c0ace5a from this chassis (sb_readonly=0)
Oct  2 08:15:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:09Z|00185|binding|INFO|Setting lport 2b1d96dd-80af-466e-bcec-84d96c0ace5a down in Southbound
Oct  2 08:15:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:09Z|00186|binding|INFO|Removing iface tap2b1d96dd-80 ovn-installed in OVS
Oct  2 08:15:09 np0005465988 nova_compute[236126]: 2025-10-02 12:15:09.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:09 np0005465988 nova_compute[236126]: 2025-10-02 12:15:09.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:09.849 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:2b:4d 10.100.0.93'], port_security=['fa:16:3e:e7:2b:4d 10.100.0.93'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.93/24', 'neutron:device_id': '7a82919b-9ebf-48bc-9416-81be5ea3fc18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f21da81-29bc-4940-a16a-f2ffdd3e65c5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba87091f122a4afabe0a62682078fece', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2e045e2-9997-47d6-8453-aba0f450cdb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0505e98-51cb-4c2b-b458-c879d6cb0928, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=2b1d96dd-80af-466e-bcec-84d96c0ace5a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:09.851 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 2b1d96dd-80af-466e-bcec-84d96c0ace5a in datapath 2f21da81-29bc-4940-a16a-f2ffdd3e65c5 unbound from our chassis#033[00m
Oct  2 08:15:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:09.854 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2f21da81-29bc-4940-a16a-f2ffdd3e65c5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:15:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:09.855 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a71d89-8850-432e-bbad-10498e303955]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:09.855 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5 namespace which is not needed anymore#033[00m
Oct  2 08:15:09 np0005465988 nova_compute[236126]: 2025-10-02 12:15:09.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:09 np0005465988 kernel: tap47f72a44-d8 (unregistering): left promiscuous mode
Oct  2 08:15:09 np0005465988 NetworkManager[45041]: <info>  [1759407309.8694] device (tap47f72a44-d8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:15:09 np0005465988 nova_compute[236126]: 2025-10-02 12:15:09.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:09Z|00187|binding|INFO|Releasing lport 47f72a44-d893-407b-9a47-a23914996fd8 from this chassis (sb_readonly=0)
Oct  2 08:15:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:09Z|00188|binding|INFO|Setting lport 47f72a44-d893-407b-9a47-a23914996fd8 down in Southbound
Oct  2 08:15:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:09Z|00189|binding|INFO|Removing iface tap47f72a44-d8 ovn-installed in OVS
Oct  2 08:15:09 np0005465988 nova_compute[236126]: 2025-10-02 12:15:09.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:09.936 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:46:9b 10.100.1.97'], port_security=['fa:16:3e:41:46:9b 10.100.1.97'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.97/24', 'neutron:device_id': '7a82919b-9ebf-48bc-9416-81be5ea3fc18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d17f40d-92af-4ebd-87c4-5dfcec0eed15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba87091f122a4afabe0a62682078fece', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2e045e2-9997-47d6-8453-aba0f450cdb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11dd8f30-12dc-4708-8370-c41b02466921, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=47f72a44-d893-407b-9a47-a23914996fd8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:09 np0005465988 nova_compute[236126]: 2025-10-02 12:15:09.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:09 np0005465988 podman[263305]: 2025-10-02 12:15:09.949230203 +0000 UTC m=+0.094593994 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:15:09 np0005465988 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000041.scope: Deactivated successfully.
Oct  2 08:15:09 np0005465988 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000041.scope: Consumed 4.071s CPU time.
Oct  2 08:15:09 np0005465988 systemd-machined[192594]: Machine qemu-24-instance-00000041 terminated.
Oct  2 08:15:09 np0005465988 kernel: tap2b1d96dd-80: entered promiscuous mode
Oct  2 08:15:09 np0005465988 NetworkManager[45041]: <info>  [1759407309.9951] manager: (tap2b1d96dd-80): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Oct  2 08:15:09 np0005465988 kernel: tap2b1d96dd-80 (unregistering): left promiscuous mode
Oct  2 08:15:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:10Z|00190|binding|INFO|Claiming lport 2b1d96dd-80af-466e-bcec-84d96c0ace5a for this chassis.
Oct  2 08:15:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:10Z|00191|binding|INFO|2b1d96dd-80af-466e-bcec-84d96c0ace5a: Claiming fa:16:3e:e7:2b:4d 10.100.0.93
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.011 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:2b:4d 10.100.0.93'], port_security=['fa:16:3e:e7:2b:4d 10.100.0.93'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.93/24', 'neutron:device_id': '7a82919b-9ebf-48bc-9416-81be5ea3fc18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f21da81-29bc-4940-a16a-f2ffdd3e65c5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba87091f122a4afabe0a62682078fece', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2e045e2-9997-47d6-8453-aba0f450cdb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0505e98-51cb-4c2b-b458-c879d6cb0928, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=2b1d96dd-80af-466e-bcec-84d96c0ace5a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:10 np0005465988 NetworkManager[45041]: <info>  [1759407310.0208] manager: (tap47f72a44-d8): new Tun device (/org/freedesktop/NetworkManager/Devices/107)
Oct  2 08:15:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:10Z|00192|binding|INFO|Setting lport 2b1d96dd-80af-466e-bcec-84d96c0ace5a ovn-installed in OVS
Oct  2 08:15:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:10Z|00193|binding|INFO|Setting lport 2b1d96dd-80af-466e-bcec-84d96c0ace5a up in Southbound
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:10Z|00194|binding|INFO|Releasing lport 2b1d96dd-80af-466e-bcec-84d96c0ace5a from this chassis (sb_readonly=1)
Oct  2 08:15:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:10Z|00195|if_status|INFO|Not setting lport 2b1d96dd-80af-466e-bcec-84d96c0ace5a down as sb is readonly
Oct  2 08:15:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:10Z|00196|binding|INFO|Removing iface tap2b1d96dd-80 ovn-installed in OVS
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:10Z|00197|binding|INFO|Releasing lport 2b1d96dd-80af-466e-bcec-84d96c0ace5a from this chassis (sb_readonly=0)
Oct  2 08:15:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:10Z|00198|binding|INFO|Setting lport 2b1d96dd-80af-466e-bcec-84d96c0ace5a down in Southbound
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.041 2 INFO nova.virt.libvirt.driver [-] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Instance destroyed successfully.#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.041 2 DEBUG nova.objects.instance [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lazy-loading 'resources' on Instance uuid 7a82919b-9ebf-48bc-9416-81be5ea3fc18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.048 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:2b:4d 10.100.0.93'], port_security=['fa:16:3e:e7:2b:4d 10.100.0.93'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.93/24', 'neutron:device_id': '7a82919b-9ebf-48bc-9416-81be5ea3fc18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f21da81-29bc-4940-a16a-f2ffdd3e65c5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba87091f122a4afabe0a62682078fece', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2e045e2-9997-47d6-8453-aba0f450cdb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0505e98-51cb-4c2b-b458-c879d6cb0928, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=2b1d96dd-80af-466e-bcec-84d96c0ace5a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:10 np0005465988 neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5[263214]: [NOTICE]   (263218) : haproxy version is 2.8.14-c23fe91
Oct  2 08:15:10 np0005465988 neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5[263214]: [NOTICE]   (263218) : path to executable is /usr/sbin/haproxy
Oct  2 08:15:10 np0005465988 neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5[263214]: [WARNING]  (263218) : Exiting Master process...
Oct  2 08:15:10 np0005465988 neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5[263214]: [ALERT]    (263218) : Current worker (263220) exited with code 143 (Terminated)
Oct  2 08:15:10 np0005465988 neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5[263214]: [WARNING]  (263218) : All workers exited. Exiting... (0)
Oct  2 08:15:10 np0005465988 systemd[1]: libpod-a5f06b14cd2bbc42308c09ce37ef15e55955dabf093566fc06ed34a44195f87c.scope: Deactivated successfully.
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.057 2 DEBUG nova.virt.libvirt.vif [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2093437597',display_name='tempest-ServersTestMultiNic-server-2093437597',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2093437597',id=65,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:15:06Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-g9uyprv7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-1178748303-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:15:06Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=7a82919b-9ebf-48bc-9416-81be5ea3fc18,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "address": "fa:16:3e:e7:2b:4d", "network": {"id": "2f21da81-29bc-4940-a16a-f2ffdd3e65c5", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1339475481", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.93", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b1d96dd-80", "ovs_interfaceid": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.058 2 DEBUG nova.network.os_vif_util [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "address": "fa:16:3e:e7:2b:4d", "network": {"id": "2f21da81-29bc-4940-a16a-f2ffdd3e65c5", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1339475481", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.93", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b1d96dd-80", "ovs_interfaceid": "2b1d96dd-80af-466e-bcec-84d96c0ace5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.059 2 DEBUG nova.network.os_vif_util [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:2b:4d,bridge_name='br-int',has_traffic_filtering=True,id=2b1d96dd-80af-466e-bcec-84d96c0ace5a,network=Network(2f21da81-29bc-4940-a16a-f2ffdd3e65c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b1d96dd-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.059 2 DEBUG os_vif [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:2b:4d,bridge_name='br-int',has_traffic_filtering=True,id=2b1d96dd-80af-466e-bcec-84d96c0ace5a,network=Network(2f21da81-29bc-4940-a16a-f2ffdd3e65c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b1d96dd-80') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:15:10 np0005465988 podman[263357]: 2025-10-02 12:15:10.060427789 +0000 UTC m=+0.054986501 container died a5f06b14cd2bbc42308c09ce37ef15e55955dabf093566fc06ed34a44195f87c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.062 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2b1d96dd-80, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.069 2 INFO os_vif [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:2b:4d,bridge_name='br-int',has_traffic_filtering=True,id=2b1d96dd-80af-466e-bcec-84d96c0ace5a,network=Network(2f21da81-29bc-4940-a16a-f2ffdd3e65c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b1d96dd-80')#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.070 2 DEBUG nova.virt.libvirt.vif [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2093437597',display_name='tempest-ServersTestMultiNic-server-2093437597',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2093437597',id=65,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:15:06Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ba87091f122a4afabe0a62682078fece',ramdisk_id='',reservation_id='r-g9uyprv7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1178748303',owner_user_name='tempest-ServersTestMultiNic-1178748303-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:15:06Z,user_data=None,user_id='d107dd863d2e4a56853a0b758cb2c110',uuid=7a82919b-9ebf-48bc-9416-81be5ea3fc18,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47f72a44-d893-407b-9a47-a23914996fd8", "address": "fa:16:3e:41:46:9b", "network": {"id": "0d17f40d-92af-4ebd-87c4-5dfcec0eed15", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-876470846", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47f72a44-d8", "ovs_interfaceid": "47f72a44-d893-407b-9a47-a23914996fd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.071 2 DEBUG nova.network.os_vif_util [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converting VIF {"id": "47f72a44-d893-407b-9a47-a23914996fd8", "address": "fa:16:3e:41:46:9b", "network": {"id": "0d17f40d-92af-4ebd-87c4-5dfcec0eed15", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-876470846", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ba87091f122a4afabe0a62682078fece", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47f72a44-d8", "ovs_interfaceid": "47f72a44-d893-407b-9a47-a23914996fd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.071 2 DEBUG nova.network.os_vif_util [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:46:9b,bridge_name='br-int',has_traffic_filtering=True,id=47f72a44-d893-407b-9a47-a23914996fd8,network=Network(0d17f40d-92af-4ebd-87c4-5dfcec0eed15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47f72a44-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.072 2 DEBUG os_vif [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:46:9b,bridge_name='br-int',has_traffic_filtering=True,id=47f72a44-d893-407b-9a47-a23914996fd8,network=Network(0d17f40d-92af-4ebd-87c4-5dfcec0eed15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47f72a44-d8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.074 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47f72a44-d8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.080 2 INFO os_vif [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:46:9b,bridge_name='br-int',has_traffic_filtering=True,id=47f72a44-d893-407b-9a47-a23914996fd8,network=Network(0d17f40d-92af-4ebd-87c4-5dfcec0eed15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47f72a44-d8')#033[00m
Oct  2 08:15:10 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a5f06b14cd2bbc42308c09ce37ef15e55955dabf093566fc06ed34a44195f87c-userdata-shm.mount: Deactivated successfully.
Oct  2 08:15:10 np0005465988 systemd[1]: var-lib-containers-storage-overlay-12f50a11c04cee8c44506d1dfb16b550b64e656096b07aafc9af62b6e5d0be01-merged.mount: Deactivated successfully.
Oct  2 08:15:10 np0005465988 podman[263357]: 2025-10-02 12:15:10.114466502 +0000 UTC m=+0.109025184 container cleanup a5f06b14cd2bbc42308c09ce37ef15e55955dabf093566fc06ed34a44195f87c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:15:10 np0005465988 systemd[1]: libpod-conmon-a5f06b14cd2bbc42308c09ce37ef15e55955dabf093566fc06ed34a44195f87c.scope: Deactivated successfully.
Oct  2 08:15:10 np0005465988 podman[263418]: 2025-10-02 12:15:10.188291461 +0000 UTC m=+0.051623933 container remove a5f06b14cd2bbc42308c09ce37ef15e55955dabf093566fc06ed34a44195f87c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.195 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0312f9b9-acb3-4e4b-82ae-b9fd72e159ee]: (4, ('Thu Oct  2 12:15:09 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5 (a5f06b14cd2bbc42308c09ce37ef15e55955dabf093566fc06ed34a44195f87c)\na5f06b14cd2bbc42308c09ce37ef15e55955dabf093566fc06ed34a44195f87c\nThu Oct  2 12:15:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5 (a5f06b14cd2bbc42308c09ce37ef15e55955dabf093566fc06ed34a44195f87c)\na5f06b14cd2bbc42308c09ce37ef15e55955dabf093566fc06ed34a44195f87c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.198 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[808ffe24-a0dc-43ec-90fa-2265caacda36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.200 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2f21da81-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:10 np0005465988 kernel: tap2f21da81-20: left promiscuous mode
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.222 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c0ad141b-81e6-4647-88ec-4bf5e6434022]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.253 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3293460e-0220-4e1f-8177-d300ce9a5953]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.254 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fc941a45-5f43-4758-9ee9-bd039c126951]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.273 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[790641c9-1486-4b28-b423-d538e45f591c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536444, 'reachable_time': 36701, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263436, 'error': None, 'target': 'ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 systemd[1]: run-netns-ovnmeta\x2d2f21da81\x2d29bc\x2d4940\x2da16a\x2df2ffdd3e65c5.mount: Deactivated successfully.
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.281 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2f21da81-29bc-4940-a16a-f2ffdd3e65c5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.281 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[2d169c30-efb8-48eb-81e0-811e8fd1727b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.283 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 47f72a44-d893-407b-9a47-a23914996fd8 in datapath 0d17f40d-92af-4ebd-87c4-5dfcec0eed15 unbound from our chassis#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.285 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0d17f40d-92af-4ebd-87c4-5dfcec0eed15, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.286 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[01d66e93-de8d-4213-b581-e341086ca82b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.287 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15 namespace which is not needed anymore#033[00m
Oct  2 08:15:10 np0005465988 neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15[263288]: [NOTICE]   (263292) : haproxy version is 2.8.14-c23fe91
Oct  2 08:15:10 np0005465988 neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15[263288]: [NOTICE]   (263292) : path to executable is /usr/sbin/haproxy
Oct  2 08:15:10 np0005465988 neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15[263288]: [WARNING]  (263292) : Exiting Master process...
Oct  2 08:15:10 np0005465988 neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15[263288]: [ALERT]    (263292) : Current worker (263294) exited with code 143 (Terminated)
Oct  2 08:15:10 np0005465988 neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15[263288]: [WARNING]  (263292) : All workers exited. Exiting... (0)
Oct  2 08:15:10 np0005465988 systemd[1]: libpod-f482e4a697c3658d54ba5a5fef0f8577975c8bf799f1e044c43518dde0d07027.scope: Deactivated successfully.
Oct  2 08:15:10 np0005465988 podman[263455]: 2025-10-02 12:15:10.449161544 +0000 UTC m=+0.058767831 container died f482e4a697c3658d54ba5a5fef0f8577975c8bf799f1e044c43518dde0d07027 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:15:10 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f482e4a697c3658d54ba5a5fef0f8577975c8bf799f1e044c43518dde0d07027-userdata-shm.mount: Deactivated successfully.
Oct  2 08:15:10 np0005465988 systemd[1]: var-lib-containers-storage-overlay-9b3a6f90b1cd37677f0c2ae5898a7d9f8bcd3bba3b31e1fa9847f5aac49c9f16-merged.mount: Deactivated successfully.
Oct  2 08:15:10 np0005465988 podman[263455]: 2025-10-02 12:15:10.496989456 +0000 UTC m=+0.106595733 container cleanup f482e4a697c3658d54ba5a5fef0f8577975c8bf799f1e044c43518dde0d07027 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:15:10 np0005465988 systemd[1]: libpod-conmon-f482e4a697c3658d54ba5a5fef0f8577975c8bf799f1e044c43518dde0d07027.scope: Deactivated successfully.
Oct  2 08:15:10 np0005465988 podman[263486]: 2025-10-02 12:15:10.56651492 +0000 UTC m=+0.045956059 container remove f482e4a697c3658d54ba5a5fef0f8577975c8bf799f1e044c43518dde0d07027 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.570 2 INFO nova.virt.libvirt.driver [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Deleting instance files /var/lib/nova/instances/7a82919b-9ebf-48bc-9416-81be5ea3fc18_del#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.571 2 INFO nova.virt.libvirt.driver [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Deletion of /var/lib/nova/instances/7a82919b-9ebf-48bc-9416-81be5ea3fc18_del complete#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.576 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8e4e249a-0018-40e4-9306-a067f71a734d]: (4, ('Thu Oct  2 12:15:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15 (f482e4a697c3658d54ba5a5fef0f8577975c8bf799f1e044c43518dde0d07027)\nf482e4a697c3658d54ba5a5fef0f8577975c8bf799f1e044c43518dde0d07027\nThu Oct  2 12:15:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15 (f482e4a697c3658d54ba5a5fef0f8577975c8bf799f1e044c43518dde0d07027)\nf482e4a697c3658d54ba5a5fef0f8577975c8bf799f1e044c43518dde0d07027\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.578 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8dd25fda-7991-4036-9480-d383449bb8ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.579 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d17f40d-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:10 np0005465988 kernel: tap0d17f40d-90: left promiscuous mode
Oct  2 08:15:10 np0005465988 nova_compute[236126]: 2025-10-02 12:15:10.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.610 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ec969b00-c67b-485f-b6f0-2156f0f678eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:10.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.646 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[510b2180-3cd7-4fcb-b862-a4bb20b2c7ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.648 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a29e0ecc-3d92-4828-bfa0-322587f36b90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.669 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e7bcf6d9-c35e-4d75-9493-12d334d59b7e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536537, 'reachable_time': 34070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263501, 'error': None, 'target': 'ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.672 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0d17f40d-92af-4ebd-87c4-5dfcec0eed15 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.672 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[4312d3f4-f30e-4355-a25b-d8d308c011f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.673 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 2b1d96dd-80af-466e-bcec-84d96c0ace5a in datapath 2f21da81-29bc-4940-a16a-f2ffdd3e65c5 unbound from our chassis#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.674 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2f21da81-29bc-4940-a16a-f2ffdd3e65c5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.675 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c6f58a55-5099-4250-9c37-27e3e6973dae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.675 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 2b1d96dd-80af-466e-bcec-84d96c0ace5a in datapath 2f21da81-29bc-4940-a16a-f2ffdd3e65c5 unbound from our chassis#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.676 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2f21da81-29bc-4940-a16a-f2ffdd3e65c5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:15:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:10.677 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2e59eedd-d83d-44de-beb1-e4b37b5d6491]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:15:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:10.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:15:11 np0005465988 systemd[1]: run-netns-ovnmeta\x2d0d17f40d\x2d92af\x2d4ebd\x2d87c4\x2d5dfcec0eed15.mount: Deactivated successfully.
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.406 2 INFO nova.compute.manager [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Took 1.63 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.406 2 DEBUG oslo.service.loopingcall [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.407 2 DEBUG nova.compute.manager [-] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.407 2 DEBUG nova.network.neutron [-] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.554 2 DEBUG nova.compute.manager [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-vif-unplugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.555 2 DEBUG oslo_concurrency.lockutils [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.555 2 DEBUG oslo_concurrency.lockutils [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.556 2 DEBUG oslo_concurrency.lockutils [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.556 2 DEBUG nova.compute.manager [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] No waiting events found dispatching network-vif-unplugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.556 2 DEBUG nova.compute.manager [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-vif-unplugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.557 2 DEBUG nova.compute.manager [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.557 2 DEBUG oslo_concurrency.lockutils [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.557 2 DEBUG oslo_concurrency.lockutils [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.558 2 DEBUG oslo_concurrency.lockutils [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.558 2 DEBUG nova.compute.manager [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] No waiting events found dispatching network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.558 2 WARNING nova.compute.manager [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received unexpected event network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.558 2 DEBUG nova.compute.manager [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-vif-unplugged-47f72a44-d893-407b-9a47-a23914996fd8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.558 2 DEBUG oslo_concurrency.lockutils [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.559 2 DEBUG oslo_concurrency.lockutils [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.559 2 DEBUG oslo_concurrency.lockutils [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.559 2 DEBUG nova.compute.manager [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] No waiting events found dispatching network-vif-unplugged-47f72a44-d893-407b-9a47-a23914996fd8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:11 np0005465988 nova_compute[236126]: 2025-10-02 12:15:11.559 2 DEBUG nova.compute.manager [req-d2ca23b5-8f55-4fb6-92b1-e1a412c7fd2b req-0a02689a-7ed3-4bc1-be54-82455f9d16f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-vif-unplugged-47f72a44-d893-407b-9a47-a23914996fd8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:15:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:15:12 np0005465988 nova_compute[236126]: 2025-10-02 12:15:12.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:12.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:12.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.585 2 DEBUG nova.network.neutron [-] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.613 2 INFO nova.compute.manager [-] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Took 2.21 seconds to deallocate network for instance.#033[00m
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.656 2 DEBUG oslo_concurrency.lockutils [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.657 2 DEBUG oslo_concurrency.lockutils [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.690 2 DEBUG nova.compute.manager [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-vif-plugged-47f72a44-d893-407b-9a47-a23914996fd8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.691 2 DEBUG oslo_concurrency.lockutils [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.692 2 DEBUG oslo_concurrency.lockutils [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.692 2 DEBUG oslo_concurrency.lockutils [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.693 2 DEBUG nova.compute.manager [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] No waiting events found dispatching network-vif-plugged-47f72a44-d893-407b-9a47-a23914996fd8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.693 2 WARNING nova.compute.manager [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received unexpected event network-vif-plugged-47f72a44-d893-407b-9a47-a23914996fd8 for instance with vm_state deleted and task_state None.
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.693 2 DEBUG nova.compute.manager [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.694 2 DEBUG oslo_concurrency.lockutils [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.694 2 DEBUG oslo_concurrency.lockutils [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.695 2 DEBUG oslo_concurrency.lockutils [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.695 2 DEBUG nova.compute.manager [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] No waiting events found dispatching network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.696 2 WARNING nova.compute.manager [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received unexpected event network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a for instance with vm_state deleted and task_state None.
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.696 2 DEBUG nova.compute.manager [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.697 2 DEBUG oslo_concurrency.lockutils [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.697 2 DEBUG oslo_concurrency.lockutils [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.697 2 DEBUG oslo_concurrency.lockutils [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.698 2 DEBUG nova.compute.manager [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] No waiting events found dispatching network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.698 2 WARNING nova.compute.manager [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received unexpected event network-vif-plugged-2b1d96dd-80af-466e-bcec-84d96c0ace5a for instance with vm_state deleted and task_state None.
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.699 2 DEBUG nova.compute.manager [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-vif-deleted-2b1d96dd-80af-466e-bcec-84d96c0ace5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.699 2 DEBUG nova.compute.manager [req-36ffd0df-61ab-4e88-bfdf-824361819740 req-a1ecb180-0404-420c-b16a-de0e3b12a66b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Received event network-vif-deleted-47f72a44-d893-407b-9a47-a23914996fd8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:15:13 np0005465988 nova_compute[236126]: 2025-10-02 12:15:13.730 2 DEBUG oslo_concurrency.processutils [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:15:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:15:14 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2995499838' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:15:14 np0005465988 nova_compute[236126]: 2025-10-02 12:15:14.189 2 DEBUG oslo_concurrency.processutils [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:15:14 np0005465988 nova_compute[236126]: 2025-10-02 12:15:14.195 2 DEBUG nova.compute.provider_tree [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:15:14 np0005465988 nova_compute[236126]: 2025-10-02 12:15:14.215 2 DEBUG nova.scheduler.client.report [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:15:14 np0005465988 nova_compute[236126]: 2025-10-02 12:15:14.234 2 DEBUG oslo_concurrency.lockutils [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:14 np0005465988 nova_compute[236126]: 2025-10-02 12:15:14.264 2 INFO nova.scheduler.client.report [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Deleted allocations for instance 7a82919b-9ebf-48bc-9416-81be5ea3fc18
Oct  2 08:15:14 np0005465988 nova_compute[236126]: 2025-10-02 12:15:14.342 2 DEBUG oslo_concurrency.lockutils [None req-364cfa08-605c-40e7-b7c9-be7b510c78a2 d107dd863d2e4a56853a0b758cb2c110 ba87091f122a4afabe0a62682078fece - - default default] Lock "7a82919b-9ebf-48bc-9416-81be5ea3fc18" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:14.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:14.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:15 np0005465988 nova_compute[236126]: 2025-10-02 12:15:15.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:16.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:16.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:15:17 np0005465988 nova_compute[236126]: 2025-10-02 12:15:17.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:17 np0005465988 nova_compute[236126]: 2025-10-02 12:15:17.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.106 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "c8781322-d594-4313-9032-0f1c3f66aad1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.107 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.132 2 DEBUG nova.compute.manager [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.202 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.202 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.212 2 DEBUG nova.virt.hardware [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.212 2 INFO nova.compute.claims [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.325 2 DEBUG oslo_concurrency.processutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:15:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:18.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:15:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3477163817' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.764 2 DEBUG oslo_concurrency.processutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.774 2 DEBUG nova.compute.provider_tree [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.802 2 DEBUG nova.scheduler.client.report [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:15:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:18.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.827 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.625s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.828 2 DEBUG nova.compute.manager [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.877 2 DEBUG nova.compute.manager [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.878 2 DEBUG nova.network.neutron [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.906 2 INFO nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:15:18 np0005465988 nova_compute[236126]: 2025-10-02 12:15:18.937 2 DEBUG nova.compute.manager [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.055 2 DEBUG nova.compute.manager [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.058 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.059 2 INFO nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Creating image(s)
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.087 2 DEBUG nova.storage.rbd_utils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] rbd image c8781322-d594-4313-9032-0f1c3f66aad1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.117 2 DEBUG nova.storage.rbd_utils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] rbd image c8781322-d594-4313-9032-0f1c3f66aad1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.149 2 DEBUG nova.storage.rbd_utils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] rbd image c8781322-d594-4313-9032-0f1c3f66aad1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.153 2 DEBUG oslo_concurrency.processutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.228 2 DEBUG oslo_concurrency.processutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.229 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.230 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.231 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.268 2 DEBUG nova.storage.rbd_utils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] rbd image c8781322-d594-4313-9032-0f1c3f66aad1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.272 2 DEBUG oslo_concurrency.processutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 c8781322-d594-4313-9032-0f1c3f66aad1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.309 2 DEBUG nova.policy [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '72c74994085d4fc697ddd4acddfa7a11', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3204d74f349d47fda3152d9d7fbea43e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.709 2 DEBUG oslo_concurrency.processutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 c8781322-d594-4313-9032-0f1c3f66aad1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.804 2 DEBUG nova.network.neutron [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Successfully created port: d5948eb9-6ab3-472b-8ac8-d00a9906df99 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:15:19 np0005465988 nova_compute[236126]: 2025-10-02 12:15:19.811 2 DEBUG nova.storage.rbd_utils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] resizing rbd image c8781322-d594-4313-9032-0f1c3f66aad1_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct  2 08:15:20 np0005465988 nova_compute[236126]: 2025-10-02 12:15:20.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:20 np0005465988 nova_compute[236126]: 2025-10-02 12:15:20.110 2 DEBUG nova.objects.instance [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lazy-loading 'migration_context' on Instance uuid c8781322-d594-4313-9032-0f1c3f66aad1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:15:20 np0005465988 nova_compute[236126]: 2025-10-02 12:15:20.125 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:15:20 np0005465988 nova_compute[236126]: 2025-10-02 12:15:20.126 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Ensure instance console log exists: /var/lib/nova/instances/c8781322-d594-4313-9032-0f1c3f66aad1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:15:20 np0005465988 nova_compute[236126]: 2025-10-02 12:15:20.127 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:15:20 np0005465988 nova_compute[236126]: 2025-10-02 12:15:20.127 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:20 np0005465988 nova_compute[236126]: 2025-10-02 12:15:20.128 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:20.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:20.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:20 np0005465988 nova_compute[236126]: 2025-10-02 12:15:20.984 2 DEBUG nova.network.neutron [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Successfully updated port: d5948eb9-6ab3-472b-8ac8-d00a9906df99 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:15:21 np0005465988 nova_compute[236126]: 2025-10-02 12:15:21.011 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "refresh_cache-c8781322-d594-4313-9032-0f1c3f66aad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:21 np0005465988 nova_compute[236126]: 2025-10-02 12:15:21.011 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquired lock "refresh_cache-c8781322-d594-4313-9032-0f1c3f66aad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:21 np0005465988 nova_compute[236126]: 2025-10-02 12:15:21.011 2 DEBUG nova.network.neutron [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:15:21 np0005465988 nova_compute[236126]: 2025-10-02 12:15:21.147 2 DEBUG nova.compute.manager [req-a36dfbef-3371-451d-8c9e-5bda38145eb8 req-c1b6128a-b834-4e5d-a427-24bc36ddfdee d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Received event network-changed-d5948eb9-6ab3-472b-8ac8-d00a9906df99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:21 np0005465988 nova_compute[236126]: 2025-10-02 12:15:21.148 2 DEBUG nova.compute.manager [req-a36dfbef-3371-451d-8c9e-5bda38145eb8 req-c1b6128a-b834-4e5d-a427-24bc36ddfdee d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Refreshing instance network info cache due to event network-changed-d5948eb9-6ab3-472b-8ac8-d00a9906df99. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:15:21 np0005465988 nova_compute[236126]: 2025-10-02 12:15:21.148 2 DEBUG oslo_concurrency.lockutils [req-a36dfbef-3371-451d-8c9e-5bda38145eb8 req-c1b6128a-b834-4e5d-a427-24bc36ddfdee d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-c8781322-d594-4313-9032-0f1c3f66aad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:21 np0005465988 nova_compute[236126]: 2025-10-02 12:15:21.210 2 DEBUG nova.network.neutron [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:15:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:15:22 np0005465988 nova_compute[236126]: 2025-10-02 12:15:22.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:22.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:22.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.729 2 DEBUG nova.network.neutron [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Updating instance_info_cache with network_info: [{"id": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "address": "fa:16:3e:f9:cd:99", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5948eb9-6a", "ovs_interfaceid": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.754 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Releasing lock "refresh_cache-c8781322-d594-4313-9032-0f1c3f66aad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.755 2 DEBUG nova.compute.manager [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Instance network_info: |[{"id": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "address": "fa:16:3e:f9:cd:99", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5948eb9-6a", "ovs_interfaceid": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.756 2 DEBUG oslo_concurrency.lockutils [req-a36dfbef-3371-451d-8c9e-5bda38145eb8 req-c1b6128a-b834-4e5d-a427-24bc36ddfdee d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-c8781322-d594-4313-9032-0f1c3f66aad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.756 2 DEBUG nova.network.neutron [req-a36dfbef-3371-451d-8c9e-5bda38145eb8 req-c1b6128a-b834-4e5d-a427-24bc36ddfdee d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Refreshing network info cache for port d5948eb9-6ab3-472b-8ac8-d00a9906df99 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.761 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Start _get_guest_xml network_info=[{"id": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "address": "fa:16:3e:f9:cd:99", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5948eb9-6a", "ovs_interfaceid": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.770 2 WARNING nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.776 2 DEBUG nova.virt.libvirt.host [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.777 2 DEBUG nova.virt.libvirt.host [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.788 2 DEBUG nova.virt.libvirt.host [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.789 2 DEBUG nova.virt.libvirt.host [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.791 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.791 2 DEBUG nova.virt.hardware [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.792 2 DEBUG nova.virt.hardware [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.793 2 DEBUG nova.virt.hardware [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.793 2 DEBUG nova.virt.hardware [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.794 2 DEBUG nova.virt.hardware [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.794 2 DEBUG nova.virt.hardware [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.795 2 DEBUG nova.virt.hardware [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.795 2 DEBUG nova.virt.hardware [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.796 2 DEBUG nova.virt.hardware [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.796 2 DEBUG nova.virt.hardware [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.797 2 DEBUG nova.virt.hardware [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:15:23 np0005465988 nova_compute[236126]: 2025-10-02 12:15:23.802 2 DEBUG oslo_concurrency.processutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:15:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/550724469' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.388 2 DEBUG oslo_concurrency.processutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.587s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.416 2 DEBUG nova.storage.rbd_utils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] rbd image c8781322-d594-4313-9032-0f1c3f66aad1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.420 2 DEBUG oslo_concurrency.processutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:15:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:24.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:15:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:24.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:15:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1953675138' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.872 2 DEBUG oslo_concurrency.processutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.874 2 DEBUG nova.virt.libvirt.vif [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:15:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-223869166',display_name='tempest-ImagesOneServerNegativeTestJSON-server-223869166',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-223869166',id=66,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3204d74f349d47fda3152d9d7fbea43e',ramdisk_id='',reservation_id='r-mw0831qq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1213912851',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1213912851-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:15:18Z,user_data=None,user_id='72c74994085d4fc697ddd4acddfa7a11',uuid=c8781322-d594-4313-9032-0f1c3f66aad1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "address": "fa:16:3e:f9:cd:99", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5948eb9-6a", "ovs_interfaceid": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.874 2 DEBUG nova.network.os_vif_util [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Converting VIF {"id": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "address": "fa:16:3e:f9:cd:99", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5948eb9-6a", "ovs_interfaceid": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.875 2 DEBUG nova.network.os_vif_util [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:cd:99,bridge_name='br-int',has_traffic_filtering=True,id=d5948eb9-6ab3-472b-8ac8-d00a9906df99,network=Network(34395819-7251-4d97-acea-2b98c07c277f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5948eb9-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.876 2 DEBUG nova.objects.instance [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lazy-loading 'pci_devices' on Instance uuid c8781322-d594-4313-9032-0f1c3f66aad1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.898 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  <uuid>c8781322-d594-4313-9032-0f1c3f66aad1</uuid>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  <name>instance-00000042</name>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-223869166</nova:name>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:15:23</nova:creationTime>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <nova:user uuid="72c74994085d4fc697ddd4acddfa7a11">tempest-ImagesOneServerNegativeTestJSON-1213912851-project-member</nova:user>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <nova:project uuid="3204d74f349d47fda3152d9d7fbea43e">tempest-ImagesOneServerNegativeTestJSON-1213912851</nova:project>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <nova:port uuid="d5948eb9-6ab3-472b-8ac8-d00a9906df99">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <entry name="serial">c8781322-d594-4313-9032-0f1c3f66aad1</entry>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <entry name="uuid">c8781322-d594-4313-9032-0f1c3f66aad1</entry>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/c8781322-d594-4313-9032-0f1c3f66aad1_disk">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/c8781322-d594-4313-9032-0f1c3f66aad1_disk.config">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:f9:cd:99"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <target dev="tapd5948eb9-6a"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/c8781322-d594-4313-9032-0f1c3f66aad1/console.log" append="off"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:15:24 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:15:24 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:15:24 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:15:24 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.900 2 DEBUG nova.compute.manager [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Preparing to wait for external event network-vif-plugged-d5948eb9-6ab3-472b-8ac8-d00a9906df99 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.901 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.901 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.902 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.903 2 DEBUG nova.virt.libvirt.vif [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:15:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-223869166',display_name='tempest-ImagesOneServerNegativeTestJSON-server-223869166',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-223869166',id=66,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3204d74f349d47fda3152d9d7fbea43e',ramdisk_id='',reservation_id='r-mw0831qq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1213912851',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1213912851-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:15:18Z,user_data=None,user_id='72c74994085d4fc697ddd4acddfa7a11',uuid=c8781322-d594-4313-9032-0f1c3f66aad1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "address": "fa:16:3e:f9:cd:99", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5948eb9-6a", "ovs_interfaceid": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.904 2 DEBUG nova.network.os_vif_util [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Converting VIF {"id": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "address": "fa:16:3e:f9:cd:99", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5948eb9-6a", "ovs_interfaceid": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.905 2 DEBUG nova.network.os_vif_util [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:cd:99,bridge_name='br-int',has_traffic_filtering=True,id=d5948eb9-6ab3-472b-8ac8-d00a9906df99,network=Network(34395819-7251-4d97-acea-2b98c07c277f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5948eb9-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.906 2 DEBUG os_vif [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:cd:99,bridge_name='br-int',has_traffic_filtering=True,id=d5948eb9-6ab3-472b-8ac8-d00a9906df99,network=Network(34395819-7251-4d97-acea-2b98c07c277f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5948eb9-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.908 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.909 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.915 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd5948eb9-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.916 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd5948eb9-6a, col_values=(('external_ids', {'iface-id': 'd5948eb9-6ab3-472b-8ac8-d00a9906df99', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:cd:99', 'vm-uuid': 'c8781322-d594-4313-9032-0f1c3f66aad1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:24 np0005465988 NetworkManager[45041]: <info>  [1759407324.9196] manager: (tapd5948eb9-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:24 np0005465988 nova_compute[236126]: 2025-10-02 12:15:24.928 2 INFO os_vif [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:cd:99,bridge_name='br-int',has_traffic_filtering=True,id=d5948eb9-6ab3-472b-8ac8-d00a9906df99,network=Network(34395819-7251-4d97-acea-2b98c07c277f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5948eb9-6a')#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.022 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.023 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.023 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] No VIF found with MAC fa:16:3e:f9:cd:99, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.024 2 INFO nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Using config drive#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.053 2 DEBUG nova.storage.rbd_utils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] rbd image c8781322-d594-4313-9032-0f1c3f66aad1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.061 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407310.0338016, 7a82919b-9ebf-48bc-9416-81be5ea3fc18 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.061 2 INFO nova.compute.manager [-] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.097 2 DEBUG nova.compute.manager [None req-45d9a9e3-087d-4443-bdad-d825266bdea3 - - - - - -] [instance: 7a82919b-9ebf-48bc-9416-81be5ea3fc18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.537 2 INFO nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Creating config drive at /var/lib/nova/instances/c8781322-d594-4313-9032-0f1c3f66aad1/disk.config#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.544 2 DEBUG oslo_concurrency.processutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c8781322-d594-4313-9032-0f1c3f66aad1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi1563yu5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.655 2 DEBUG nova.network.neutron [req-a36dfbef-3371-451d-8c9e-5bda38145eb8 req-c1b6128a-b834-4e5d-a427-24bc36ddfdee d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Updated VIF entry in instance network info cache for port d5948eb9-6ab3-472b-8ac8-d00a9906df99. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.657 2 DEBUG nova.network.neutron [req-a36dfbef-3371-451d-8c9e-5bda38145eb8 req-c1b6128a-b834-4e5d-a427-24bc36ddfdee d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Updating instance_info_cache with network_info: [{"id": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "address": "fa:16:3e:f9:cd:99", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5948eb9-6a", "ovs_interfaceid": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.691 2 DEBUG oslo_concurrency.lockutils [req-a36dfbef-3371-451d-8c9e-5bda38145eb8 req-c1b6128a-b834-4e5d-a427-24bc36ddfdee d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-c8781322-d594-4313-9032-0f1c3f66aad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.703 2 DEBUG oslo_concurrency.processutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c8781322-d594-4313-9032-0f1c3f66aad1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi1563yu5" returned: 0 in 0.159s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.743 2 DEBUG nova.storage.rbd_utils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] rbd image c8781322-d594-4313-9032-0f1c3f66aad1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.748 2 DEBUG oslo_concurrency.processutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c8781322-d594-4313-9032-0f1c3f66aad1/disk.config c8781322-d594-4313-9032-0f1c3f66aad1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.969 2 DEBUG oslo_concurrency.processutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c8781322-d594-4313-9032-0f1c3f66aad1/disk.config c8781322-d594-4313-9032-0f1c3f66aad1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.221s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:25 np0005465988 nova_compute[236126]: 2025-10-02 12:15:25.971 2 INFO nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Deleting local config drive /var/lib/nova/instances/c8781322-d594-4313-9032-0f1c3f66aad1/disk.config because it was imported into RBD.#033[00m
Oct  2 08:15:26 np0005465988 kernel: tapd5948eb9-6a: entered promiscuous mode
Oct  2 08:15:26 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:26Z|00199|binding|INFO|Claiming lport d5948eb9-6ab3-472b-8ac8-d00a9906df99 for this chassis.
Oct  2 08:15:26 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:26Z|00200|binding|INFO|d5948eb9-6ab3-472b-8ac8-d00a9906df99: Claiming fa:16:3e:f9:cd:99 10.100.0.9
Oct  2 08:15:26 np0005465988 nova_compute[236126]: 2025-10-02 12:15:26.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:26 np0005465988 NetworkManager[45041]: <info>  [1759407326.0539] manager: (tapd5948eb9-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/109)
Oct  2 08:15:26 np0005465988 nova_compute[236126]: 2025-10-02 12:15:26.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.073 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:cd:99 10.100.0.9'], port_security=['fa:16:3e:f9:cd:99 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c8781322-d594-4313-9032-0f1c3f66aad1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34395819-7251-4d97-acea-2b98c07c277f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3204d74f349d47fda3152d9d7fbea43e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e7f67ea6-7556-4acf-963b-57a7d344e510', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c7b3003-1bab-46e9-ac89-20b4b6aed5f5, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=d5948eb9-6ab3-472b-8ac8-d00a9906df99) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.075 142124 INFO neutron.agent.ovn.metadata.agent [-] Port d5948eb9-6ab3-472b-8ac8-d00a9906df99 in datapath 34395819-7251-4d97-acea-2b98c07c277f bound to our chassis
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.077 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 34395819-7251-4d97-acea-2b98c07c277f
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.095 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[47e58c73-e86f-4e8e-93d6-4fbe7f93cc4e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.096 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap34395819-71 in ovnmeta-34395819-7251-4d97-acea-2b98c07c277f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct  2 08:15:26 np0005465988 systemd-udevd[263948]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.098 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap34395819-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.098 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[02999059-da04-4dac-8214-ddb3de611abc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.098 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3156522e-6c86-42f8-a4f6-c5c87e571217]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 systemd-machined[192594]: New machine qemu-25-instance-00000042.
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.115 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[c8789dc1-fadf-42ee-8594-db1e168dc1ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 podman[263899]: 2025-10-02 12:15:26.117570182 +0000 UTC m=+0.092469473 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, tcib_managed=true)
Oct  2 08:15:26 np0005465988 NetworkManager[45041]: <info>  [1759407326.1225] device (tapd5948eb9-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:15:26 np0005465988 NetworkManager[45041]: <info>  [1759407326.1235] device (tapd5948eb9-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:15:26 np0005465988 systemd[1]: Started Virtual Machine qemu-25-instance-00000042.
Oct  2 08:15:26 np0005465988 podman[263897]: 2025-10-02 12:15:26.146908305 +0000 UTC m=+0.126126442 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:15:26 np0005465988 nova_compute[236126]: 2025-10-02 12:15:26.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.149 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e287bf-f048-403d-a429-8a2612de5d3d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:26Z|00201|binding|INFO|Setting lport d5948eb9-6ab3-472b-8ac8-d00a9906df99 ovn-installed in OVS
Oct  2 08:15:26 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:26Z|00202|binding|INFO|Setting lport d5948eb9-6ab3-472b-8ac8-d00a9906df99 up in Southbound
Oct  2 08:15:26 np0005465988 nova_compute[236126]: 2025-10-02 12:15:26.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:26 np0005465988 podman[263893]: 2025-10-02 12:15:26.161952803 +0000 UTC m=+0.140793359 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.188 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e0aaf91a-5d7a-4dae-ab29-7adcac073008]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 systemd-udevd[263966]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.195 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b6412681-f3ff-4904-8cf8-6363af8998b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 NetworkManager[45041]: <info>  [1759407326.1977] manager: (tap34395819-70): new Veth device (/org/freedesktop/NetworkManager/Devices/110)
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.235 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2318aa54-4235-4597-ba83-18f673076f84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.240 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[6eabfee4-cf6f-4135-aa85-f516862ce8d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 NetworkManager[45041]: <info>  [1759407326.2691] device (tap34395819-70): carrier: link connected
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.276 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[888e2f00-5f39-421c-8e2a-8d6792af84fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.295 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[085ca2f7-5b80-43c5-8b08-7e2f8272c938]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap34395819-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:6c:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538660, 'reachable_time': 24251, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264003, 'error': None, 'target': 'ovnmeta-34395819-7251-4d97-acea-2b98c07c277f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.311 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[902b514d-b032-48af-a970-efaa5be7f4e5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feae:6c0c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 538660, 'tstamp': 538660}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264004, 'error': None, 'target': 'ovnmeta-34395819-7251-4d97-acea-2b98c07c277f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.331 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fb971cb9-f081-4190-9ba6-674eec77d4a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap34395819-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:6c:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538660, 'reachable_time': 24251, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 264005, 'error': None, 'target': 'ovnmeta-34395819-7251-4d97-acea-2b98c07c277f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.366 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c3148e04-d606-48ac-bfe4-a9cb9380b155]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.449 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dc2a093e-919e-46fb-9927-9ed7ace4f57d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.451 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap34395819-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.451 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.452 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap34395819-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:15:26 np0005465988 NetworkManager[45041]: <info>  [1759407326.4829] manager: (tap34395819-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Oct  2 08:15:26 np0005465988 kernel: tap34395819-70: entered promiscuous mode
Oct  2 08:15:26 np0005465988 nova_compute[236126]: 2025-10-02 12:15:26.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:26 np0005465988 nova_compute[236126]: 2025-10-02 12:15:26.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.490 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap34395819-70, col_values=(('external_ids', {'iface-id': '595d0160-be54-4c2f-8674-117a5a5028e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:15:26 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:26Z|00203|binding|INFO|Releasing lport 595d0160-be54-4c2f-8674-117a5a5028e1 from this chassis (sb_readonly=0)
Oct  2 08:15:26 np0005465988 nova_compute[236126]: 2025-10-02 12:15:26.518 2 DEBUG nova.compute.manager [req-de957d5c-f0e6-4e83-b068-ead3d4193687 req-0a834534-85f1-4097-8a1a-95bfa20ad6d6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Received event network-vif-plugged-d5948eb9-6ab3-472b-8ac8-d00a9906df99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:15:26 np0005465988 nova_compute[236126]: 2025-10-02 12:15:26.518 2 DEBUG oslo_concurrency.lockutils [req-de957d5c-f0e6-4e83-b068-ead3d4193687 req-0a834534-85f1-4097-8a1a-95bfa20ad6d6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:15:26 np0005465988 nova_compute[236126]: 2025-10-02 12:15:26.520 2 DEBUG oslo_concurrency.lockutils [req-de957d5c-f0e6-4e83-b068-ead3d4193687 req-0a834534-85f1-4097-8a1a-95bfa20ad6d6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:26 np0005465988 nova_compute[236126]: 2025-10-02 12:15:26.520 2 DEBUG oslo_concurrency.lockutils [req-de957d5c-f0e6-4e83-b068-ead3d4193687 req-0a834534-85f1-4097-8a1a-95bfa20ad6d6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:26 np0005465988 nova_compute[236126]: 2025-10-02 12:15:26.520 2 DEBUG nova.compute.manager [req-de957d5c-f0e6-4e83-b068-ead3d4193687 req-0a834534-85f1-4097-8a1a-95bfa20ad6d6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Processing event network-vif-plugged-d5948eb9-6ab3-472b-8ac8-d00a9906df99 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:15:26 np0005465988 nova_compute[236126]: 2025-10-02 12:15:26.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.520 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/34395819-7251-4d97-acea-2b98c07c277f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/34395819-7251-4d97-acea-2b98c07c277f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.522 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[99f2f65b-2cc4-4253-9a77-746a2b74d026]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.523 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-34395819-7251-4d97-acea-2b98c07c277f
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/34395819-7251-4d97-acea-2b98c07c277f.pid.haproxy
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 34395819-7251-4d97-acea-2b98c07c277f
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  2 08:15:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:26.524 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-34395819-7251-4d97-acea-2b98c07c277f', 'env', 'PROCESS_TAG=haproxy-34395819-7251-4d97-acea-2b98c07c277f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/34395819-7251-4d97-acea-2b98c07c277f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  2 08:15:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:26.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:26.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:26 np0005465988 podman[264080]: 2025-10-02 12:15:26.898254535 +0000 UTC m=+0.049094350 container create 229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:15:26 np0005465988 systemd[1]: Started libpod-conmon-229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7.scope.
Oct  2 08:15:26 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:15:26 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fde8fb67a33fefa76bb5dfb09a8fc9038c29161b56e594fdd88616a4a23e15a5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:15:26 np0005465988 podman[264080]: 2025-10-02 12:15:26.871612399 +0000 UTC m=+0.022452264 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:15:26 np0005465988 podman[264080]: 2025-10-02 12:15:26.97401694 +0000 UTC m=+0.124856775 container init 229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:15:26 np0005465988 podman[264080]: 2025-10-02 12:15:26.981675263 +0000 UTC m=+0.132515078 container start 229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:15:27 np0005465988 neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f[264094]: [NOTICE]   (264098) : New worker (264100) forked
Oct  2 08:15:27 np0005465988 neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f[264094]: [NOTICE]   (264098) : Loading success.
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.065 2 DEBUG nova.compute.manager [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.067 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407327.0652907, c8781322-d594-4313-9032-0f1c3f66aad1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.067 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] VM Started (Lifecycle Event)#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.071 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.079 2 INFO nova.virt.libvirt.driver [-] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Instance spawned successfully.#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.080 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.120 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.124 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.183 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.183 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407327.0665984, c8781322-d594-4313-9032-0f1c3f66aad1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.184 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.191 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.191 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.191 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.192 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.192 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.193 2 DEBUG nova.virt.libvirt.driver [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.220 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.224 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407327.0706744, c8781322-d594-4313-9032-0f1c3f66aad1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.224 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.268 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.273 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.279 2 INFO nova.compute.manager [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Took 8.22 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.279 2 DEBUG nova.compute.manager [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.297 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:15:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:27.341 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:27.343 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:27.343 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.384 2 INFO nova.compute.manager [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Took 9.20 seconds to build instance.#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:27 np0005465988 nova_compute[236126]: 2025-10-02 12:15:27.855 2 DEBUG oslo_concurrency.lockutils [None req-18eb4ffc-6e5d-4b02-8ed5-3b2e575ffcda 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.748s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:28.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:28.754 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:28.757 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:15:28 np0005465988 nova_compute[236126]: 2025-10-02 12:15:28.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:28.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:29 np0005465988 nova_compute[236126]: 2025-10-02 12:15:29.294 2 DEBUG nova.compute.manager [req-01ca60a2-07b1-49cb-a72b-e93125824b34 req-061d88b7-7bd1-4123-84ad-97944c20560e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Received event network-vif-plugged-d5948eb9-6ab3-472b-8ac8-d00a9906df99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:29 np0005465988 nova_compute[236126]: 2025-10-02 12:15:29.295 2 DEBUG oslo_concurrency.lockutils [req-01ca60a2-07b1-49cb-a72b-e93125824b34 req-061d88b7-7bd1-4123-84ad-97944c20560e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:29 np0005465988 nova_compute[236126]: 2025-10-02 12:15:29.296 2 DEBUG oslo_concurrency.lockutils [req-01ca60a2-07b1-49cb-a72b-e93125824b34 req-061d88b7-7bd1-4123-84ad-97944c20560e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:29 np0005465988 nova_compute[236126]: 2025-10-02 12:15:29.296 2 DEBUG oslo_concurrency.lockutils [req-01ca60a2-07b1-49cb-a72b-e93125824b34 req-061d88b7-7bd1-4123-84ad-97944c20560e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:29 np0005465988 nova_compute[236126]: 2025-10-02 12:15:29.297 2 DEBUG nova.compute.manager [req-01ca60a2-07b1-49cb-a72b-e93125824b34 req-061d88b7-7bd1-4123-84ad-97944c20560e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] No waiting events found dispatching network-vif-plugged-d5948eb9-6ab3-472b-8ac8-d00a9906df99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:29 np0005465988 nova_compute[236126]: 2025-10-02 12:15:29.297 2 WARNING nova.compute.manager [req-01ca60a2-07b1-49cb-a72b-e93125824b34 req-061d88b7-7bd1-4123-84ad-97944c20560e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Received unexpected event network-vif-plugged-d5948eb9-6ab3-472b-8ac8-d00a9906df99 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:15:29 np0005465988 nova_compute[236126]: 2025-10-02 12:15:29.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:30.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:30.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:31 np0005465988 nova_compute[236126]: 2025-10-02 12:15:31.971 2 DEBUG nova.compute.manager [None req-1389904a-565a-428d-983b-003488f9259b 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:32 np0005465988 nova_compute[236126]: 2025-10-02 12:15:32.029 2 INFO nova.compute.manager [None req-1389904a-565a-428d-983b-003488f9259b 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] instance snapshotting#033[00m
Oct  2 08:15:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:15:32 np0005465988 nova_compute[236126]: 2025-10-02 12:15:32.404 2 INFO nova.virt.libvirt.driver [None req-1389904a-565a-428d-983b-003488f9259b 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Beginning live snapshot process#033[00m
Oct  2 08:15:32 np0005465988 nova_compute[236126]: 2025-10-02 12:15:32.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:32.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:32.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:33.760 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:33 np0005465988 nova_compute[236126]: 2025-10-02 12:15:33.913 2 DEBUG nova.virt.libvirt.imagebackend [None req-1389904a-565a-428d-983b-003488f9259b 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] No parent info for c2d0c2bc-fe21-4689-86ae-d6728c15874c; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Oct  2 08:15:34 np0005465988 nova_compute[236126]: 2025-10-02 12:15:34.267 2 DEBUG nova.storage.rbd_utils [None req-1389904a-565a-428d-983b-003488f9259b 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] creating snapshot(5856cd134ab34ae1870d7b7260ea79d6) on rbd image(c8781322-d594-4313-9032-0f1c3f66aad1_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:15:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:34.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:34.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e215 e215: 3 total, 3 up, 3 in
Oct  2 08:15:34 np0005465988 nova_compute[236126]: 2025-10-02 12:15:34.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:35 np0005465988 nova_compute[236126]: 2025-10-02 12:15:35.301 2 DEBUG nova.storage.rbd_utils [None req-1389904a-565a-428d-983b-003488f9259b 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] cloning vms/c8781322-d594-4313-9032-0f1c3f66aad1_disk@5856cd134ab34ae1870d7b7260ea79d6 to images/0b7cb45b-4756-4516-aac2-0c994d6cef99 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:15:35 np0005465988 nova_compute[236126]: 2025-10-02 12:15:35.538 2 DEBUG nova.storage.rbd_utils [None req-1389904a-565a-428d-983b-003488f9259b 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] flattening images/0b7cb45b-4756-4516-aac2-0c994d6cef99 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:15:36 np0005465988 nova_compute[236126]: 2025-10-02 12:15:36.434 2 DEBUG nova.storage.rbd_utils [None req-1389904a-565a-428d-983b-003488f9259b 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] removing snapshot(5856cd134ab34ae1870d7b7260ea79d6) on rbd image(c8781322-d594-4313-9032-0f1c3f66aad1_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:15:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:36.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:36.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e216 e216: 3 total, 3 up, 3 in
Oct  2 08:15:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:15:37 np0005465988 nova_compute[236126]: 2025-10-02 12:15:37.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:38 np0005465988 nova_compute[236126]: 2025-10-02 12:15:38.312 2 DEBUG nova.storage.rbd_utils [None req-1389904a-565a-428d-983b-003488f9259b 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] creating snapshot(snap) on rbd image(0b7cb45b-4756-4516-aac2-0c994d6cef99) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:15:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:38.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:38.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e217 e217: 3 total, 3 up, 3 in
Oct  2 08:15:39 np0005465988 nova_compute[236126]: 2025-10-02 12:15:39.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:40 np0005465988 podman[264306]: 2025-10-02 12:15:40.57687141 +0000 UTC m=+0.096861890 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:15:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:40.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver [None req-1389904a-565a-428d-983b-003488f9259b 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Failed to snapshot image: nova.exception.ImageNotFound: Image 0b7cb45b-4756-4516-aac2-0c994d6cef99 could not be found.
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 691, in update
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     image = self._update_v2(context, sent_service_image_meta, data)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 700, in _update_v2
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     image = self._client.call(
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 191, in call
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     result = getattr(controller, method)(*args, **kwargs)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 440, in update
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     unvalidated_image = self.get(image_id)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 197, in get
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     return self._get(image_id)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/utils.py", line 649, in inner
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     return RequestIdProxy(wrapped(*args, **kwargs))
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 190, in _get
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     resp, body = self.http_client.get(url, headers=header)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py", line 395, in get
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     return self.request(url, 'GET', **kwargs)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 380, in request
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     return self._handle_response(resp)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 120, in _handle_response
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     raise exc.from_response(resp, resp.content)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver glanceclient.exc.HTTPNotFound: HTTP 404 Not Found: No image found with ID 0b7cb45b-4756-4516-aac2-0c994d6cef99
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver 
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver 
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3082, in snapshot
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     self._image_api.update(context, image_id, metadata,
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 1243, in update
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     return session.update(context, image_id, image_info, data=data,
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 693, in update
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     _reraise_translated_image_exception(image_id)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     raise new_exc.with_traceback(exc_trace)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 691, in update
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     image = self._update_v2(context, sent_service_image_meta, data)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 700, in _update_v2
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     image = self._client.call(
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 191, in call
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     result = getattr(controller, method)(*args, **kwargs)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 440, in update
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     unvalidated_image = self.get(image_id)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 197, in get
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     return self._get(image_id)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/utils.py", line 649, in inner
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     return RequestIdProxy(wrapped(*args, **kwargs))
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 190, in _get
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     resp, body = self.http_client.get(url, headers=header)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py", line 395, in get
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     return self.request(url, 'GET', **kwargs)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 380, in request
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     return self._handle_response(resp)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 120, in _handle_response
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver     raise exc.from_response(resp, resp.content)
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver nova.exception.ImageNotFound: Image 0b7cb45b-4756-4516-aac2-0c994d6cef99 could not be found.
Oct  2 08:15:40 np0005465988 nova_compute[236126]: 2025-10-02 12:15:40.694 2 ERROR nova.virt.libvirt.driver #033[00m
Oct  2 08:15:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:40.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:41 np0005465988 nova_compute[236126]: 2025-10-02 12:15:41.087 2 DEBUG nova.storage.rbd_utils [None req-1389904a-565a-428d-983b-003488f9259b 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] removing snapshot(snap) on rbd image(0b7cb45b-4756-4516-aac2-0c994d6cef99) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:15:41 np0005465988 nova_compute[236126]: 2025-10-02 12:15:41.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:41 np0005465988 nova_compute[236126]: 2025-10-02 12:15:41.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:41 np0005465988 nova_compute[236126]: 2025-10-02 12:15:41.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:41 np0005465988 nova_compute[236126]: 2025-10-02 12:15:41.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:41 np0005465988 nova_compute[236126]: 2025-10-02 12:15:41.508 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:41 np0005465988 nova_compute[236126]: 2025-10-02 12:15:41.509 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:41 np0005465988 nova_compute[236126]: 2025-10-02 12:15:41.509 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:41 np0005465988 nova_compute[236126]: 2025-10-02 12:15:41.510 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:15:41 np0005465988 nova_compute[236126]: 2025-10-02 12:15:41.510 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:15:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3157430924' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:15:42 np0005465988 nova_compute[236126]: 2025-10-02 12:15:42.055 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:42 np0005465988 nova_compute[236126]: 2025-10-02 12:15:42.160 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000042 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:15:42 np0005465988 nova_compute[236126]: 2025-10-02 12:15:42.160 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000042 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:15:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e218 e218: 3 total, 3 up, 3 in
Oct  2 08:15:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:15:42 np0005465988 nova_compute[236126]: 2025-10-02 12:15:42.417 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:15:42 np0005465988 nova_compute[236126]: 2025-10-02 12:15:42.418 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4461MB free_disk=20.897754669189453GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:15:42 np0005465988 nova_compute[236126]: 2025-10-02 12:15:42.418 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:42 np0005465988 nova_compute[236126]: 2025-10-02 12:15:42.418 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:42 np0005465988 nova_compute[236126]: 2025-10-02 12:15:42.512 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance c8781322-d594-4313-9032-0f1c3f66aad1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:15:42 np0005465988 nova_compute[236126]: 2025-10-02 12:15:42.513 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:15:42 np0005465988 nova_compute[236126]: 2025-10-02 12:15:42.513 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:15:42 np0005465988 nova_compute[236126]: 2025-10-02 12:15:42.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:42 np0005465988 nova_compute[236126]: 2025-10-02 12:15:42.601 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:42.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:42 np0005465988 nova_compute[236126]: 2025-10-02 12:15:42.773 2 WARNING nova.compute.manager [None req-1389904a-565a-428d-983b-003488f9259b 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Image not found during snapshot: nova.exception.ImageNotFound: Image 0b7cb45b-4756-4516-aac2-0c994d6cef99 could not be found.#033[00m
Oct  2 08:15:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:42.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:42 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:42Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f9:cd:99 10.100.0.9
Oct  2 08:15:42 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:42Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f9:cd:99 10.100.0.9
Oct  2 08:15:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:15:43 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/397792814' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:15:43 np0005465988 nova_compute[236126]: 2025-10-02 12:15:43.067 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:43 np0005465988 nova_compute[236126]: 2025-10-02 12:15:43.075 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:15:43 np0005465988 nova_compute[236126]: 2025-10-02 12:15:43.095 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:15:43 np0005465988 nova_compute[236126]: 2025-10-02 12:15:43.148 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:15:43 np0005465988 nova_compute[236126]: 2025-10-02 12:15:43.149 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e219 e219: 3 total, 3 up, 3 in
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.274 2 DEBUG oslo_concurrency.lockutils [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "c8781322-d594-4313-9032-0f1c3f66aad1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.274 2 DEBUG oslo_concurrency.lockutils [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.274 2 DEBUG oslo_concurrency.lockutils [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.274 2 DEBUG oslo_concurrency.lockutils [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.275 2 DEBUG oslo_concurrency.lockutils [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.276 2 INFO nova.compute.manager [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Terminating instance#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.277 2 DEBUG nova.compute.manager [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:15:44 np0005465988 kernel: tapd5948eb9-6a (unregistering): left promiscuous mode
Oct  2 08:15:44 np0005465988 NetworkManager[45041]: <info>  [1759407344.3413] device (tapd5948eb9-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:15:44 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:44Z|00204|binding|INFO|Releasing lport d5948eb9-6ab3-472b-8ac8-d00a9906df99 from this chassis (sb_readonly=0)
Oct  2 08:15:44 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:44Z|00205|binding|INFO|Setting lport d5948eb9-6ab3-472b-8ac8-d00a9906df99 down in Southbound
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:44 np0005465988 ovn_controller[132601]: 2025-10-02T12:15:44Z|00206|binding|INFO|Removing iface tapd5948eb9-6a ovn-installed in OVS
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:44.367 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:cd:99 10.100.0.9'], port_security=['fa:16:3e:f9:cd:99 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c8781322-d594-4313-9032-0f1c3f66aad1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34395819-7251-4d97-acea-2b98c07c277f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3204d74f349d47fda3152d9d7fbea43e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e7f67ea6-7556-4acf-963b-57a7d344e510', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c7b3003-1bab-46e9-ac89-20b4b6aed5f5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=d5948eb9-6ab3-472b-8ac8-d00a9906df99) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:44.370 142124 INFO neutron.agent.ovn.metadata.agent [-] Port d5948eb9-6ab3-472b-8ac8-d00a9906df99 in datapath 34395819-7251-4d97-acea-2b98c07c277f unbound from our chassis#033[00m
Oct  2 08:15:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:44.373 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 34395819-7251-4d97-acea-2b98c07c277f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:15:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:44.374 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a232580b-0f01-4f74-85ec-5b03ff70bee4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:44.375 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-34395819-7251-4d97-acea-2b98c07c277f namespace which is not needed anymore#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:44 np0005465988 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000042.scope: Deactivated successfully.
Oct  2 08:15:44 np0005465988 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000042.scope: Consumed 13.447s CPU time.
Oct  2 08:15:44 np0005465988 systemd-machined[192594]: Machine qemu-25-instance-00000042 terminated.
Oct  2 08:15:44 np0005465988 neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f[264094]: [NOTICE]   (264098) : haproxy version is 2.8.14-c23fe91
Oct  2 08:15:44 np0005465988 neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f[264094]: [NOTICE]   (264098) : path to executable is /usr/sbin/haproxy
Oct  2 08:15:44 np0005465988 neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f[264094]: [WARNING]  (264098) : Exiting Master process...
Oct  2 08:15:44 np0005465988 neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f[264094]: [ALERT]    (264098) : Current worker (264100) exited with code 143 (Terminated)
Oct  2 08:15:44 np0005465988 neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f[264094]: [WARNING]  (264098) : All workers exited. Exiting... (0)
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:44 np0005465988 systemd[1]: libpod-229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7.scope: Deactivated successfully.
Oct  2 08:15:44 np0005465988 conmon[264094]: conmon 229b3132d79d65f0ed7a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7.scope/container/memory.events
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:44 np0005465988 podman[264434]: 2025-10-02 12:15:44.505282251 +0000 UTC m=+0.041272902 container died 229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.525 2 INFO nova.virt.libvirt.driver [-] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Instance destroyed successfully.#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.525 2 DEBUG nova.objects.instance [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lazy-loading 'resources' on Instance uuid c8781322-d594-4313-9032-0f1c3f66aad1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:44 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7-userdata-shm.mount: Deactivated successfully.
Oct  2 08:15:44 np0005465988 systemd[1]: var-lib-containers-storage-overlay-fde8fb67a33fefa76bb5dfb09a8fc9038c29161b56e594fdd88616a4a23e15a5-merged.mount: Deactivated successfully.
Oct  2 08:15:44 np0005465988 podman[264434]: 2025-10-02 12:15:44.540331471 +0000 UTC m=+0.076322152 container cleanup 229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.560 2 DEBUG nova.virt.libvirt.vif [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:15:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-223869166',display_name='tempest-ImagesOneServerNegativeTestJSON-server-223869166',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-223869166',id=66,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:15:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3204d74f349d47fda3152d9d7fbea43e',ramdisk_id='',reservation_id='r-mw0831qq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1213912851',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1213912851-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:15:42Z,user_data=None,user_id='72c74994085d4fc697ddd4acddfa7a11',uuid=c8781322-d594-4313-9032-0f1c3f66aad1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "address": "fa:16:3e:f9:cd:99", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5948eb9-6a", "ovs_interfaceid": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.561 2 DEBUG nova.network.os_vif_util [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Converting VIF {"id": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "address": "fa:16:3e:f9:cd:99", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5948eb9-6a", "ovs_interfaceid": "d5948eb9-6ab3-472b-8ac8-d00a9906df99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.562 2 DEBUG nova.network.os_vif_util [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:cd:99,bridge_name='br-int',has_traffic_filtering=True,id=d5948eb9-6ab3-472b-8ac8-d00a9906df99,network=Network(34395819-7251-4d97-acea-2b98c07c277f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5948eb9-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:44 np0005465988 systemd[1]: libpod-conmon-229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7.scope: Deactivated successfully.
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.562 2 DEBUG os_vif [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:cd:99,bridge_name='br-int',has_traffic_filtering=True,id=d5948eb9-6ab3-472b-8ac8-d00a9906df99,network=Network(34395819-7251-4d97-acea-2b98c07c277f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5948eb9-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.565 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5948eb9-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.571 2 INFO os_vif [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:cd:99,bridge_name='br-int',has_traffic_filtering=True,id=d5948eb9-6ab3-472b-8ac8-d00a9906df99,network=Network(34395819-7251-4d97-acea-2b98c07c277f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5948eb9-6a')#033[00m
Oct  2 08:15:44 np0005465988 podman[264474]: 2025-10-02 12:15:44.622910735 +0000 UTC m=+0.049051599 container remove 229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:15:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:44.630 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ca0048-9fd4-4398-9757-dd51ddf8afd3]: (4, ('Thu Oct  2 12:15:44 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f (229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7)\n229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7\nThu Oct  2 12:15:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f (229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7)\n229b3132d79d65f0ed7a24f71348adce33c38da885b65544adfa99f948776df7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:44.632 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3f491900-f9e6-474d-a00d-803a3f2d6c25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:44.634 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap34395819-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:44 np0005465988 kernel: tap34395819-70: left promiscuous mode
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:44.658 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ec5f1428-7b04-4176-86a2-938da2574138]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:44.684 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ce01e2e1-dff6-46d1-97d3-7299cf84c1c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:44.686 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cb1437ec-5cf9-49b5-93bb-ccb4d83ae6e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:44.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.704 2 DEBUG nova.compute.manager [req-cdda8a4b-7fab-4a4b-aea9-2a6b2fe9c304 req-f1b12f93-aaa5-46ee-9d31-0f3861c9722b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Received event network-vif-unplugged-d5948eb9-6ab3-472b-8ac8-d00a9906df99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.704 2 DEBUG oslo_concurrency.lockutils [req-cdda8a4b-7fab-4a4b-aea9-2a6b2fe9c304 req-f1b12f93-aaa5-46ee-9d31-0f3861c9722b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.705 2 DEBUG oslo_concurrency.lockutils [req-cdda8a4b-7fab-4a4b-aea9-2a6b2fe9c304 req-f1b12f93-aaa5-46ee-9d31-0f3861c9722b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.705 2 DEBUG oslo_concurrency.lockutils [req-cdda8a4b-7fab-4a4b-aea9-2a6b2fe9c304 req-f1b12f93-aaa5-46ee-9d31-0f3861c9722b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.705 2 DEBUG nova.compute.manager [req-cdda8a4b-7fab-4a4b-aea9-2a6b2fe9c304 req-f1b12f93-aaa5-46ee-9d31-0f3861c9722b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] No waiting events found dispatching network-vif-unplugged-d5948eb9-6ab3-472b-8ac8-d00a9906df99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:44 np0005465988 nova_compute[236126]: 2025-10-02 12:15:44.705 2 DEBUG nova.compute.manager [req-cdda8a4b-7fab-4a4b-aea9-2a6b2fe9c304 req-f1b12f93-aaa5-46ee-9d31-0f3861c9722b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Received event network-vif-unplugged-d5948eb9-6ab3-472b-8ac8-d00a9906df99 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:15:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:44.710 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[999c7e84-95b6-4215-9554-912425139067]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538651, 'reachable_time': 21842, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264508, 'error': None, 'target': 'ovnmeta-34395819-7251-4d97-acea-2b98c07c277f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:44.712 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-34395819-7251-4d97-acea-2b98c07c277f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:15:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:15:44.712 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[93313592-0287-455c-ac64-3fc1f0c4bac4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:44 np0005465988 systemd[1]: run-netns-ovnmeta\x2d34395819\x2d7251\x2d4d97\x2dacea\x2d2b98c07c277f.mount: Deactivated successfully.
Oct  2 08:15:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:44.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:45 np0005465988 nova_compute[236126]: 2025-10-02 12:15:45.227 2 INFO nova.virt.libvirt.driver [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Deleting instance files /var/lib/nova/instances/c8781322-d594-4313-9032-0f1c3f66aad1_del#033[00m
Oct  2 08:15:45 np0005465988 nova_compute[236126]: 2025-10-02 12:15:45.228 2 INFO nova.virt.libvirt.driver [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Deletion of /var/lib/nova/instances/c8781322-d594-4313-9032-0f1c3f66aad1_del complete#033[00m
Oct  2 08:15:45 np0005465988 nova_compute[236126]: 2025-10-02 12:15:45.320 2 INFO nova.compute.manager [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Took 1.04 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:15:45 np0005465988 nova_compute[236126]: 2025-10-02 12:15:45.321 2 DEBUG oslo.service.loopingcall [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:15:45 np0005465988 nova_compute[236126]: 2025-10-02 12:15:45.322 2 DEBUG nova.compute.manager [-] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:15:45 np0005465988 nova_compute[236126]: 2025-10-02 12:15:45.322 2 DEBUG nova.network.neutron [-] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:15:46 np0005465988 nova_compute[236126]: 2025-10-02 12:15:46.149 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:46 np0005465988 nova_compute[236126]: 2025-10-02 12:15:46.150 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:46 np0005465988 nova_compute[236126]: 2025-10-02 12:15:46.150 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:15:46 np0005465988 nova_compute[236126]: 2025-10-02 12:15:46.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:46.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:46.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:47 np0005465988 nova_compute[236126]: 2025-10-02 12:15:47.095 2 DEBUG nova.compute.manager [req-9e47260f-5704-4291-8b69-dab5f1a0a8a7 req-4514efba-3a53-4ae5-a094-173ac4183c6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Received event network-vif-plugged-d5948eb9-6ab3-472b-8ac8-d00a9906df99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:47 np0005465988 nova_compute[236126]: 2025-10-02 12:15:47.095 2 DEBUG oslo_concurrency.lockutils [req-9e47260f-5704-4291-8b69-dab5f1a0a8a7 req-4514efba-3a53-4ae5-a094-173ac4183c6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:47 np0005465988 nova_compute[236126]: 2025-10-02 12:15:47.095 2 DEBUG oslo_concurrency.lockutils [req-9e47260f-5704-4291-8b69-dab5f1a0a8a7 req-4514efba-3a53-4ae5-a094-173ac4183c6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:47 np0005465988 nova_compute[236126]: 2025-10-02 12:15:47.096 2 DEBUG oslo_concurrency.lockutils [req-9e47260f-5704-4291-8b69-dab5f1a0a8a7 req-4514efba-3a53-4ae5-a094-173ac4183c6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:47 np0005465988 nova_compute[236126]: 2025-10-02 12:15:47.096 2 DEBUG nova.compute.manager [req-9e47260f-5704-4291-8b69-dab5f1a0a8a7 req-4514efba-3a53-4ae5-a094-173ac4183c6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] No waiting events found dispatching network-vif-plugged-d5948eb9-6ab3-472b-8ac8-d00a9906df99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:47 np0005465988 nova_compute[236126]: 2025-10-02 12:15:47.096 2 WARNING nova.compute.manager [req-9e47260f-5704-4291-8b69-dab5f1a0a8a7 req-4514efba-3a53-4ae5-a094-173ac4183c6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Received unexpected event network-vif-plugged-d5948eb9-6ab3-472b-8ac8-d00a9906df99 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:15:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:15:47 np0005465988 nova_compute[236126]: 2025-10-02 12:15:47.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:48 np0005465988 nova_compute[236126]: 2025-10-02 12:15:48.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:48 np0005465988 nova_compute[236126]: 2025-10-02 12:15:48.642 2 DEBUG nova.network.neutron [-] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:48 np0005465988 nova_compute[236126]: 2025-10-02 12:15:48.664 2 INFO nova.compute.manager [-] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Took 3.34 seconds to deallocate network for instance.#033[00m
Oct  2 08:15:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e220 e220: 3 total, 3 up, 3 in
Oct  2 08:15:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:48.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:48 np0005465988 nova_compute[236126]: 2025-10-02 12:15:48.844 2 DEBUG oslo_concurrency.lockutils [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:48 np0005465988 nova_compute[236126]: 2025-10-02 12:15:48.845 2 DEBUG oslo_concurrency.lockutils [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:48.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:48 np0005465988 nova_compute[236126]: 2025-10-02 12:15:48.910 2 DEBUG oslo_concurrency.processutils [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:49 np0005465988 nova_compute[236126]: 2025-10-02 12:15:49.319 2 DEBUG nova.compute.manager [req-b1e64bce-34ac-417c-b774-86fece7e38a3 req-7ff6d82e-37fe-4f03-890a-c13ce528fbfc d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Received event network-vif-deleted-d5948eb9-6ab3-472b-8ac8-d00a9906df99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:15:49 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3264171307' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:15:49 np0005465988 nova_compute[236126]: 2025-10-02 12:15:49.374 2 DEBUG oslo_concurrency.processutils [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:49 np0005465988 nova_compute[236126]: 2025-10-02 12:15:49.380 2 DEBUG nova.compute.provider_tree [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:15:49 np0005465988 nova_compute[236126]: 2025-10-02 12:15:49.402 2 DEBUG nova.scheduler.client.report [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:15:49 np0005465988 nova_compute[236126]: 2025-10-02 12:15:49.428 2 DEBUG oslo_concurrency.lockutils [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:49 np0005465988 nova_compute[236126]: 2025-10-02 12:15:49.464 2 INFO nova.scheduler.client.report [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Deleted allocations for instance c8781322-d594-4313-9032-0f1c3f66aad1#033[00m
Oct  2 08:15:49 np0005465988 nova_compute[236126]: 2025-10-02 12:15:49.544 2 DEBUG oslo_concurrency.lockutils [None req-15653020-f7d0-4447-a8cc-81ac2013291c 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "c8781322-d594-4313-9032-0f1c3f66aad1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:49 np0005465988 nova_compute[236126]: 2025-10-02 12:15:49.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:50 np0005465988 nova_compute[236126]: 2025-10-02 12:15:50.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:50 np0005465988 nova_compute[236126]: 2025-10-02 12:15:50.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:15:50 np0005465988 nova_compute[236126]: 2025-10-02 12:15:50.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:15:50 np0005465988 nova_compute[236126]: 2025-10-02 12:15:50.519 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:15:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:50.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:50.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:50 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:15:50 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:15:50 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct  2 08:15:50 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct  2 08:15:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct  2 08:15:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:15:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:15:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:15:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:15:52 np0005465988 nova_compute[236126]: 2025-10-02 12:15:52.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:52.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:52.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:53 np0005465988 nova_compute[236126]: 2025-10-02 12:15:53.514 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:53 np0005465988 nova_compute[236126]: 2025-10-02 12:15:53.683 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:53 np0005465988 nova_compute[236126]: 2025-10-02 12:15:53.684 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:53 np0005465988 nova_compute[236126]: 2025-10-02 12:15:53.703 2 DEBUG nova.compute.manager [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:15:53 np0005465988 nova_compute[236126]: 2025-10-02 12:15:53.805 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:53 np0005465988 nova_compute[236126]: 2025-10-02 12:15:53.805 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:53 np0005465988 nova_compute[236126]: 2025-10-02 12:15:53.815 2 DEBUG nova.virt.hardware [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:15:53 np0005465988 nova_compute[236126]: 2025-10-02 12:15:53.815 2 INFO nova.compute.claims [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:15:53 np0005465988 nova_compute[236126]: 2025-10-02 12:15:53.952 2 DEBUG oslo_concurrency.processutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:15:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3542693089' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.417 2 DEBUG oslo_concurrency.processutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.427 2 DEBUG nova.compute.provider_tree [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.444 2 DEBUG nova.scheduler.client.report [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.494 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.494 2 DEBUG nova.compute.manager [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.549 2 DEBUG nova.compute.manager [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.549 2 DEBUG nova.network.neutron [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.570 2 INFO nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.590 2 DEBUG nova.compute.manager [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.703 2 DEBUG nova.compute.manager [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.705 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.706 2 INFO nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Creating image(s)#033[00m
Oct  2 08:15:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:54.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.741 2 DEBUG nova.storage.rbd_utils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] rbd image e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.776 2 DEBUG nova.storage.rbd_utils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] rbd image e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.804 2 DEBUG nova.storage.rbd_utils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] rbd image e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.810 2 DEBUG oslo_concurrency.processutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:54.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.906 2 DEBUG oslo_concurrency.processutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.908 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.909 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.910 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.949 2 DEBUG nova.storage.rbd_utils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] rbd image e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:15:54 np0005465988 nova_compute[236126]: 2025-10-02 12:15:54.954 2 DEBUG oslo_concurrency.processutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:55 np0005465988 nova_compute[236126]: 2025-10-02 12:15:55.021 2 DEBUG nova.policy [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '72c74994085d4fc697ddd4acddfa7a11', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3204d74f349d47fda3152d9d7fbea43e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:15:55 np0005465988 nova_compute[236126]: 2025-10-02 12:15:55.345 2 DEBUG oslo_concurrency.processutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.390s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:55 np0005465988 nova_compute[236126]: 2025-10-02 12:15:55.417 2 DEBUG nova.storage.rbd_utils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] resizing rbd image e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:15:55 np0005465988 nova_compute[236126]: 2025-10-02 12:15:55.542 2 DEBUG nova.objects.instance [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lazy-loading 'migration_context' on Instance uuid e6452d21-4672-41d0-90d6-f3ce7c7bf0e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:55 np0005465988 nova_compute[236126]: 2025-10-02 12:15:55.599 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:15:55 np0005465988 nova_compute[236126]: 2025-10-02 12:15:55.600 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Ensure instance console log exists: /var/lib/nova/instances/e6452d21-4672-41d0-90d6-f3ce7c7bf0e1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:15:55 np0005465988 nova_compute[236126]: 2025-10-02 12:15:55.601 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:55 np0005465988 nova_compute[236126]: 2025-10-02 12:15:55.602 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:55 np0005465988 nova_compute[236126]: 2025-10-02 12:15:55.602 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:55 np0005465988 nova_compute[236126]: 2025-10-02 12:15:55.821 2 DEBUG nova.network.neutron [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Successfully created port: 8ba41eed-9444-4fdf-b14f-d774a574becd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:15:56 np0005465988 podman[264980]: 2025-10-02 12:15:56.533956801 +0000 UTC m=+0.064810497 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:15:56 np0005465988 podman[264979]: 2025-10-02 12:15:56.536760822 +0000 UTC m=+0.067656290 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:15:56 np0005465988 podman[264978]: 2025-10-02 12:15:56.566492268 +0000 UTC m=+0.097384226 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:15:56 np0005465988 nova_compute[236126]: 2025-10-02 12:15:56.572 2 DEBUG nova.network.neutron [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Successfully updated port: 8ba41eed-9444-4fdf-b14f-d774a574becd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:15:56 np0005465988 nova_compute[236126]: 2025-10-02 12:15:56.590 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "refresh_cache-e6452d21-4672-41d0-90d6-f3ce7c7bf0e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:56 np0005465988 nova_compute[236126]: 2025-10-02 12:15:56.590 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquired lock "refresh_cache-e6452d21-4672-41d0-90d6-f3ce7c7bf0e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:56 np0005465988 nova_compute[236126]: 2025-10-02 12:15:56.590 2 DEBUG nova.network.neutron [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:15:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:15:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:56.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:15:56 np0005465988 nova_compute[236126]: 2025-10-02 12:15:56.718 2 DEBUG nova.compute.manager [req-941530b7-2d16-4ae8-b4ff-85a3687dd034 req-3d02f2d6-defd-4512-877e-b22f0c52f637 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Received event network-changed-8ba41eed-9444-4fdf-b14f-d774a574becd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:56 np0005465988 nova_compute[236126]: 2025-10-02 12:15:56.720 2 DEBUG nova.compute.manager [req-941530b7-2d16-4ae8-b4ff-85a3687dd034 req-3d02f2d6-defd-4512-877e-b22f0c52f637 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Refreshing instance network info cache due to event network-changed-8ba41eed-9444-4fdf-b14f-d774a574becd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:15:56 np0005465988 nova_compute[236126]: 2025-10-02 12:15:56.720 2 DEBUG oslo_concurrency.lockutils [req-941530b7-2d16-4ae8-b4ff-85a3687dd034 req-3d02f2d6-defd-4512-877e-b22f0c52f637 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-e6452d21-4672-41d0-90d6-f3ce7c7bf0e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:56 np0005465988 nova_compute[236126]: 2025-10-02 12:15:56.802 2 DEBUG nova.network.neutron [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:15:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:56.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:15:57 np0005465988 nova_compute[236126]: 2025-10-02 12:15:57.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:57 np0005465988 nova_compute[236126]: 2025-10-02 12:15:57.946 2 DEBUG nova.network.neutron [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Updating instance_info_cache with network_info: [{"id": "8ba41eed-9444-4fdf-b14f-d774a574becd", "address": "fa:16:3e:3a:ff:c5", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ba41eed-94", "ovs_interfaceid": "8ba41eed-9444-4fdf-b14f-d774a574becd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:57 np0005465988 nova_compute[236126]: 2025-10-02 12:15:57.979 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Releasing lock "refresh_cache-e6452d21-4672-41d0-90d6-f3ce7c7bf0e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:57 np0005465988 nova_compute[236126]: 2025-10-02 12:15:57.980 2 DEBUG nova.compute.manager [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Instance network_info: |[{"id": "8ba41eed-9444-4fdf-b14f-d774a574becd", "address": "fa:16:3e:3a:ff:c5", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ba41eed-94", "ovs_interfaceid": "8ba41eed-9444-4fdf-b14f-d774a574becd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:15:57 np0005465988 nova_compute[236126]: 2025-10-02 12:15:57.980 2 DEBUG oslo_concurrency.lockutils [req-941530b7-2d16-4ae8-b4ff-85a3687dd034 req-3d02f2d6-defd-4512-877e-b22f0c52f637 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-e6452d21-4672-41d0-90d6-f3ce7c7bf0e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:57 np0005465988 nova_compute[236126]: 2025-10-02 12:15:57.981 2 DEBUG nova.network.neutron [req-941530b7-2d16-4ae8-b4ff-85a3687dd034 req-3d02f2d6-defd-4512-877e-b22f0c52f637 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Refreshing network info cache for port 8ba41eed-9444-4fdf-b14f-d774a574becd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:15:57 np0005465988 nova_compute[236126]: 2025-10-02 12:15:57.986 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Start _get_guest_xml network_info=[{"id": "8ba41eed-9444-4fdf-b14f-d774a574becd", "address": "fa:16:3e:3a:ff:c5", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ba41eed-94", "ovs_interfaceid": "8ba41eed-9444-4fdf-b14f-d774a574becd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:15:57 np0005465988 nova_compute[236126]: 2025-10-02 12:15:57.992 2 WARNING nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:57.999 2 DEBUG nova.virt.libvirt.host [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.000 2 DEBUG nova.virt.libvirt.host [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.005 2 DEBUG nova.virt.libvirt.host [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.006 2 DEBUG nova.virt.libvirt.host [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.008 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.008 2 DEBUG nova.virt.hardware [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.009 2 DEBUG nova.virt.hardware [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.009 2 DEBUG nova.virt.hardware [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.009 2 DEBUG nova.virt.hardware [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.010 2 DEBUG nova.virt.hardware [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.010 2 DEBUG nova.virt.hardware [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.010 2 DEBUG nova.virt.hardware [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.011 2 DEBUG nova.virt.hardware [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.011 2 DEBUG nova.virt.hardware [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.011 2 DEBUG nova.virt.hardware [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.012 2 DEBUG nova.virt.hardware [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.016 2 DEBUG oslo_concurrency.processutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:15:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3453082838' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:15:58 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:15:58 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.496 2 DEBUG oslo_concurrency.processutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.533 2 DEBUG nova.storage.rbd_utils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] rbd image e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:15:58 np0005465988 nova_compute[236126]: 2025-10-02 12:15:58.538 2 DEBUG oslo_concurrency.processutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:15:58.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:15:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:15:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:15:58.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:15:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:15:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1066816460' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.007 2 DEBUG oslo_concurrency.processutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.011 2 DEBUG nova.virt.libvirt.vif [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:15:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-2048086633',display_name='tempest-ImagesOneServerNegativeTestJSON-server-2048086633',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-2048086633',id=68,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3204d74f349d47fda3152d9d7fbea43e',ramdisk_id='',reservation_id='r-mdwiujaz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1213912851',owner_use
r_name='tempest-ImagesOneServerNegativeTestJSON-1213912851-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:15:54Z,user_data=None,user_id='72c74994085d4fc697ddd4acddfa7a11',uuid=e6452d21-4672-41d0-90d6-f3ce7c7bf0e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8ba41eed-9444-4fdf-b14f-d774a574becd", "address": "fa:16:3e:3a:ff:c5", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ba41eed-94", "ovs_interfaceid": "8ba41eed-9444-4fdf-b14f-d774a574becd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.013 2 DEBUG nova.network.os_vif_util [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Converting VIF {"id": "8ba41eed-9444-4fdf-b14f-d774a574becd", "address": "fa:16:3e:3a:ff:c5", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ba41eed-94", "ovs_interfaceid": "8ba41eed-9444-4fdf-b14f-d774a574becd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.015 2 DEBUG nova.network.os_vif_util [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:ff:c5,bridge_name='br-int',has_traffic_filtering=True,id=8ba41eed-9444-4fdf-b14f-d774a574becd,network=Network(34395819-7251-4d97-acea-2b98c07c277f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ba41eed-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.018 2 DEBUG nova.objects.instance [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lazy-loading 'pci_devices' on Instance uuid e6452d21-4672-41d0-90d6-f3ce7c7bf0e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.049 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  <uuid>e6452d21-4672-41d0-90d6-f3ce7c7bf0e1</uuid>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  <name>instance-00000044</name>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-2048086633</nova:name>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:15:57</nova:creationTime>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <nova:user uuid="72c74994085d4fc697ddd4acddfa7a11">tempest-ImagesOneServerNegativeTestJSON-1213912851-project-member</nova:user>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <nova:project uuid="3204d74f349d47fda3152d9d7fbea43e">tempest-ImagesOneServerNegativeTestJSON-1213912851</nova:project>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <nova:port uuid="8ba41eed-9444-4fdf-b14f-d774a574becd">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <entry name="serial">e6452d21-4672-41d0-90d6-f3ce7c7bf0e1</entry>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <entry name="uuid">e6452d21-4672-41d0-90d6-f3ce7c7bf0e1</entry>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_disk">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_disk.config">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:3a:ff:c5"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <target dev="tap8ba41eed-94"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/e6452d21-4672-41d0-90d6-f3ce7c7bf0e1/console.log" append="off"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:15:59 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:15:59 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:15:59 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:15:59 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.051 2 DEBUG nova.compute.manager [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Preparing to wait for external event network-vif-plugged-8ba41eed-9444-4fdf-b14f-d774a574becd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.051 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.052 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.052 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.053 2 DEBUG nova.virt.libvirt.vif [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:15:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-2048086633',display_name='tempest-ImagesOneServerNegativeTestJSON-server-2048086633',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-2048086633',id=68,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3204d74f349d47fda3152d9d7fbea43e',ramdisk_id='',reservation_id='r-mdwiujaz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1213912851',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1213912851-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:15:54Z,user_data=None,user_id='72c74994085d4fc697ddd4acddfa7a11',uuid=e6452d21-4672-41d0-90d6-f3ce7c7bf0e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8ba41eed-9444-4fdf-b14f-d774a574becd", "address": "fa:16:3e:3a:ff:c5", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ba41eed-94", "ovs_interfaceid": "8ba41eed-9444-4fdf-b14f-d774a574becd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.053 2 DEBUG nova.network.os_vif_util [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Converting VIF {"id": "8ba41eed-9444-4fdf-b14f-d774a574becd", "address": "fa:16:3e:3a:ff:c5", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ba41eed-94", "ovs_interfaceid": "8ba41eed-9444-4fdf-b14f-d774a574becd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.054 2 DEBUG nova.network.os_vif_util [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:ff:c5,bridge_name='br-int',has_traffic_filtering=True,id=8ba41eed-9444-4fdf-b14f-d774a574becd,network=Network(34395819-7251-4d97-acea-2b98c07c277f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ba41eed-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.054 2 DEBUG os_vif [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:ff:c5,bridge_name='br-int',has_traffic_filtering=True,id=8ba41eed-9444-4fdf-b14f-d774a574becd,network=Network(34395819-7251-4d97-acea-2b98c07c277f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ba41eed-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.056 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.056 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.061 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8ba41eed-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.062 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8ba41eed-94, col_values=(('external_ids', {'iface-id': '8ba41eed-9444-4fdf-b14f-d774a574becd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3a:ff:c5', 'vm-uuid': 'e6452d21-4672-41d0-90d6-f3ce7c7bf0e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:59 np0005465988 NetworkManager[45041]: <info>  [1759407359.0692] manager: (tap8ba41eed-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.082 2 INFO os_vif [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:ff:c5,bridge_name='br-int',has_traffic_filtering=True,id=8ba41eed-9444-4fdf-b14f-d774a574becd,network=Network(34395819-7251-4d97-acea-2b98c07c277f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ba41eed-94')#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.142 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.143 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.143 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] No VIF found with MAC fa:16:3e:3a:ff:c5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.144 2 INFO nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Using config drive#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.170 2 DEBUG nova.storage.rbd_utils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] rbd image e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.514 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407344.5133848, c8781322-d594-4313-9032-0f1c3f66aad1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.515 2 INFO nova.compute.manager [-] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.566 2 DEBUG nova.compute.manager [None req-2608e9e6-589c-4309-adb3-c547fb59c30d - - - - - -] [instance: c8781322-d594-4313-9032-0f1c3f66aad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.693 2 INFO nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Creating config drive at /var/lib/nova/instances/e6452d21-4672-41d0-90d6-f3ce7c7bf0e1/disk.config#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.702 2 DEBUG oslo_concurrency.processutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e6452d21-4672-41d0-90d6-f3ce7c7bf0e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprns4umu1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.851 2 DEBUG oslo_concurrency.processutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e6452d21-4672-41d0-90d6-f3ce7c7bf0e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprns4umu1" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.901 2 DEBUG nova.storage.rbd_utils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] rbd image e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:15:59 np0005465988 nova_compute[236126]: 2025-10-02 12:15:59.906 2 DEBUG oslo_concurrency.processutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e6452d21-4672-41d0-90d6-f3ce7c7bf0e1/disk.config e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:00 np0005465988 nova_compute[236126]: 2025-10-02 12:16:00.114 2 DEBUG oslo_concurrency.processutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e6452d21-4672-41d0-90d6-f3ce7c7bf0e1/disk.config e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.207s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:00 np0005465988 nova_compute[236126]: 2025-10-02 12:16:00.116 2 INFO nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Deleting local config drive /var/lib/nova/instances/e6452d21-4672-41d0-90d6-f3ce7c7bf0e1/disk.config because it was imported into RBD.#033[00m
Oct  2 08:16:00 np0005465988 kernel: tap8ba41eed-94: entered promiscuous mode
Oct  2 08:16:00 np0005465988 NetworkManager[45041]: <info>  [1759407360.2016] manager: (tap8ba41eed-94): new Tun device (/org/freedesktop/NetworkManager/Devices/113)
Oct  2 08:16:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:00Z|00207|binding|INFO|Claiming lport 8ba41eed-9444-4fdf-b14f-d774a574becd for this chassis.
Oct  2 08:16:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:00Z|00208|binding|INFO|8ba41eed-9444-4fdf-b14f-d774a574becd: Claiming fa:16:3e:3a:ff:c5 10.100.0.6
Oct  2 08:16:00 np0005465988 nova_compute[236126]: 2025-10-02 12:16:00.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.217 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:ff:c5 10.100.0.6'], port_security=['fa:16:3e:3a:ff:c5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e6452d21-4672-41d0-90d6-f3ce7c7bf0e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34395819-7251-4d97-acea-2b98c07c277f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3204d74f349d47fda3152d9d7fbea43e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e7f67ea6-7556-4acf-963b-57a7d344e510', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c7b3003-1bab-46e9-ac89-20b4b6aed5f5, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=8ba41eed-9444-4fdf-b14f-d774a574becd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.219 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 8ba41eed-9444-4fdf-b14f-d774a574becd in datapath 34395819-7251-4d97-acea-2b98c07c277f bound to our chassis#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.222 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 34395819-7251-4d97-acea-2b98c07c277f#033[00m
Oct  2 08:16:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:00Z|00209|binding|INFO|Setting lport 8ba41eed-9444-4fdf-b14f-d774a574becd ovn-installed in OVS
Oct  2 08:16:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:00Z|00210|binding|INFO|Setting lport 8ba41eed-9444-4fdf-b14f-d774a574becd up in Southbound
Oct  2 08:16:00 np0005465988 nova_compute[236126]: 2025-10-02 12:16:00.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005465988 nova_compute[236126]: 2025-10-02 12:16:00.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.242 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[22749c00-892f-4153-bea0-449da132ec65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.243 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap34395819-71 in ovnmeta-34395819-7251-4d97-acea-2b98c07c277f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.249 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap34395819-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.250 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[813359db-5088-4448-b419-8b27b6cec3c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 nova_compute[236126]: 2025-10-02 12:16:00.249 2 DEBUG nova.network.neutron [req-941530b7-2d16-4ae8-b4ff-85a3687dd034 req-3d02f2d6-defd-4512-877e-b22f0c52f637 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Updated VIF entry in instance network info cache for port 8ba41eed-9444-4fdf-b14f-d774a574becd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:16:00 np0005465988 nova_compute[236126]: 2025-10-02 12:16:00.250 2 DEBUG nova.network.neutron [req-941530b7-2d16-4ae8-b4ff-85a3687dd034 req-3d02f2d6-defd-4512-877e-b22f0c52f637 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Updating instance_info_cache with network_info: [{"id": "8ba41eed-9444-4fdf-b14f-d774a574becd", "address": "fa:16:3e:3a:ff:c5", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ba41eed-94", "ovs_interfaceid": "8ba41eed-9444-4fdf-b14f-d774a574becd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.251 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a0fa01a0-3937-482c-8cb7-85f1e1958472]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 systemd-udevd[265285]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:00 np0005465988 systemd-machined[192594]: New machine qemu-26-instance-00000044.
Oct  2 08:16:00 np0005465988 nova_compute[236126]: 2025-10-02 12:16:00.270 2 DEBUG oslo_concurrency.lockutils [req-941530b7-2d16-4ae8-b4ff-85a3687dd034 req-3d02f2d6-defd-4512-877e-b22f0c52f637 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-e6452d21-4672-41d0-90d6-f3ce7c7bf0e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.271 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[dcc41b44-eadc-40fe-aae9-d8b99404647e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 systemd[1]: Started Virtual Machine qemu-26-instance-00000044.
Oct  2 08:16:00 np0005465988 NetworkManager[45041]: <info>  [1759407360.2852] device (tap8ba41eed-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:16:00 np0005465988 NetworkManager[45041]: <info>  [1759407360.2866] device (tap8ba41eed-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.312 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[70de8168-3448-43d1-8923-f8f7029a4538]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.354 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[14a96450-5d27-43ea-949d-2bf1d38f433d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.364 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ba464f02-9200-4d54-9d32-f724bbbd40e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 systemd-udevd[265288]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:00 np0005465988 NetworkManager[45041]: <info>  [1759407360.3656] manager: (tap34395819-70): new Veth device (/org/freedesktop/NetworkManager/Devices/114)
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.402 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[0260ad08-3270-4364-81bb-cc4ea4122495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.407 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[df29b286-809f-4ecc-bbd2-9a72e770dfd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 NetworkManager[45041]: <info>  [1759407360.4331] device (tap34395819-70): carrier: link connected
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.442 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ae39c1b6-b7f3-4aad-9f67-b6b8974fc901]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.465 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9cdf1874-12bc-4cd3-b173-ac26d646e23b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap34395819-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:6c:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542076, 'reachable_time': 19253, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265316, 'error': None, 'target': 'ovnmeta-34395819-7251-4d97-acea-2b98c07c277f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.486 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ac5a8f7c-1273-444e-9cab-62cdc45ec608]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feae:6c0c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542076, 'tstamp': 542076}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 265317, 'error': None, 'target': 'ovnmeta-34395819-7251-4d97-acea-2b98c07c277f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.510 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f60c03b1-5183-4739-893e-2e8f9000f240]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap34395819-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:6c:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542076, 'reachable_time': 19253, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 265333, 'error': None, 'target': 'ovnmeta-34395819-7251-4d97-acea-2b98c07c277f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.541 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d4d13737-78f1-43d3-9d93-79974565dda4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.616 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[67fcea27-6b8b-4a5c-a05c-a30bfad8b934]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.618 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap34395819-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.619 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.619 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap34395819-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:00 np0005465988 NetworkManager[45041]: <info>  [1759407360.6222] manager: (tap34395819-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Oct  2 08:16:00 np0005465988 kernel: tap34395819-70: entered promiscuous mode
Oct  2 08:16:00 np0005465988 nova_compute[236126]: 2025-10-02 12:16:00.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005465988 nova_compute[236126]: 2025-10-02 12:16:00.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.626 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap34395819-70, col_values=(('external_ids', {'iface-id': '595d0160-be54-4c2f-8674-117a5a5028e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:00 np0005465988 nova_compute[236126]: 2025-10-02 12:16:00.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:00Z|00211|binding|INFO|Releasing lport 595d0160-be54-4c2f-8674-117a5a5028e1 from this chassis (sb_readonly=0)
Oct  2 08:16:00 np0005465988 nova_compute[236126]: 2025-10-02 12:16:00.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.645 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/34395819-7251-4d97-acea-2b98c07c277f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/34395819-7251-4d97-acea-2b98c07c277f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.646 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d36afde0-889e-42ea-9da2-cf8b7b8b79eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.652 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-34395819-7251-4d97-acea-2b98c07c277f
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/34395819-7251-4d97-acea-2b98c07c277f.pid.haproxy
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 34395819-7251-4d97-acea-2b98c07c277f
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:16:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:00.653 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-34395819-7251-4d97-acea-2b98c07c277f', 'env', 'PROCESS_TAG=haproxy-34395819-7251-4d97-acea-2b98c07c277f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/34395819-7251-4d97-acea-2b98c07c277f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:16:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:00.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:00.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:01 np0005465988 nova_compute[236126]: 2025-10-02 12:16:01.079 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407361.0788658, e6452d21-4672-41d0-90d6-f3ce7c7bf0e1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:01 np0005465988 nova_compute[236126]: 2025-10-02 12:16:01.080 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] VM Started (Lifecycle Event)#033[00m
Oct  2 08:16:01 np0005465988 nova_compute[236126]: 2025-10-02 12:16:01.104 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:01 np0005465988 nova_compute[236126]: 2025-10-02 12:16:01.111 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407361.0803976, e6452d21-4672-41d0-90d6-f3ce7c7bf0e1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:01 np0005465988 nova_compute[236126]: 2025-10-02 12:16:01.112 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:16:01 np0005465988 podman[265393]: 2025-10-02 12:16:01.131864809 +0000 UTC m=+0.091297588 container create adad24fe9187e6e6d87ec4284263beaca079b5eefc3bcdd2374f8eeb2d903d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:16:01 np0005465988 nova_compute[236126]: 2025-10-02 12:16:01.140 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:01 np0005465988 nova_compute[236126]: 2025-10-02 12:16:01.149 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:01 np0005465988 podman[265393]: 2025-10-02 12:16:01.088872098 +0000 UTC m=+0.048304937 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:16:01 np0005465988 systemd[1]: Started libpod-conmon-adad24fe9187e6e6d87ec4284263beaca079b5eefc3bcdd2374f8eeb2d903d03.scope.
Oct  2 08:16:01 np0005465988 nova_compute[236126]: 2025-10-02 12:16:01.197 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:01 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:16:01 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d3c5cb09e08c21a23afe3844065953cc2267d54bacc47d33cb07e00a68b4fde/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:16:01 np0005465988 podman[265393]: 2025-10-02 12:16:01.264970033 +0000 UTC m=+0.224402792 container init adad24fe9187e6e6d87ec4284263beaca079b5eefc3bcdd2374f8eeb2d903d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:16:01 np0005465988 podman[265393]: 2025-10-02 12:16:01.27140914 +0000 UTC m=+0.230841879 container start adad24fe9187e6e6d87ec4284263beaca079b5eefc3bcdd2374f8eeb2d903d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:16:01 np0005465988 neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f[265409]: [NOTICE]   (265413) : New worker (265415) forked
Oct  2 08:16:01 np0005465988 neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f[265409]: [NOTICE]   (265413) : Loading success.
Oct  2 08:16:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.628 2 DEBUG nova.compute.manager [req-eeeea906-89ef-4a4d-b581-9f277c8cef63 req-9c0d398e-9aa7-4a33-bbaf-67e819a89c4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Received event network-vif-plugged-8ba41eed-9444-4fdf-b14f-d774a574becd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.629 2 DEBUG oslo_concurrency.lockutils [req-eeeea906-89ef-4a4d-b581-9f277c8cef63 req-9c0d398e-9aa7-4a33-bbaf-67e819a89c4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.630 2 DEBUG oslo_concurrency.lockutils [req-eeeea906-89ef-4a4d-b581-9f277c8cef63 req-9c0d398e-9aa7-4a33-bbaf-67e819a89c4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.630 2 DEBUG oslo_concurrency.lockutils [req-eeeea906-89ef-4a4d-b581-9f277c8cef63 req-9c0d398e-9aa7-4a33-bbaf-67e819a89c4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.631 2 DEBUG nova.compute.manager [req-eeeea906-89ef-4a4d-b581-9f277c8cef63 req-9c0d398e-9aa7-4a33-bbaf-67e819a89c4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Processing event network-vif-plugged-8ba41eed-9444-4fdf-b14f-d774a574becd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.631 2 DEBUG nova.compute.manager [req-eeeea906-89ef-4a4d-b581-9f277c8cef63 req-9c0d398e-9aa7-4a33-bbaf-67e819a89c4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Received event network-vif-plugged-8ba41eed-9444-4fdf-b14f-d774a574becd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.631 2 DEBUG oslo_concurrency.lockutils [req-eeeea906-89ef-4a4d-b581-9f277c8cef63 req-9c0d398e-9aa7-4a33-bbaf-67e819a89c4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.632 2 DEBUG oslo_concurrency.lockutils [req-eeeea906-89ef-4a4d-b581-9f277c8cef63 req-9c0d398e-9aa7-4a33-bbaf-67e819a89c4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.632 2 DEBUG oslo_concurrency.lockutils [req-eeeea906-89ef-4a4d-b581-9f277c8cef63 req-9c0d398e-9aa7-4a33-bbaf-67e819a89c4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.632 2 DEBUG nova.compute.manager [req-eeeea906-89ef-4a4d-b581-9f277c8cef63 req-9c0d398e-9aa7-4a33-bbaf-67e819a89c4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] No waiting events found dispatching network-vif-plugged-8ba41eed-9444-4fdf-b14f-d774a574becd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.633 2 WARNING nova.compute.manager [req-eeeea906-89ef-4a4d-b581-9f277c8cef63 req-9c0d398e-9aa7-4a33-bbaf-67e819a89c4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Received unexpected event network-vif-plugged-8ba41eed-9444-4fdf-b14f-d774a574becd for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.634 2 DEBUG nova.compute.manager [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.638 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407362.6386206, e6452d21-4672-41d0-90d6-f3ce7c7bf0e1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.639 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.642 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.648 2 INFO nova.virt.libvirt.driver [-] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Instance spawned successfully.#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.648 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.664 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.673 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.679 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.680 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.681 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.682 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.682 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.683 2 DEBUG nova.virt.libvirt.driver [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.694 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:02.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.771 2 INFO nova.compute.manager [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Took 8.07 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.771 2 DEBUG nova.compute.manager [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:16:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:02.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.865 2 INFO nova.compute.manager [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Took 9.10 seconds to build instance.#033[00m
Oct  2 08:16:02 np0005465988 nova_compute[236126]: 2025-10-02 12:16:02.930 2 DEBUG oslo_concurrency.lockutils [None req-7c4afe42-b42a-432e-9683-d9ebdfbf6256 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.581 2 DEBUG oslo_concurrency.lockutils [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.582 2 DEBUG oslo_concurrency.lockutils [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.582 2 DEBUG oslo_concurrency.lockutils [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.583 2 DEBUG oslo_concurrency.lockutils [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.583 2 DEBUG oslo_concurrency.lockutils [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.584 2 INFO nova.compute.manager [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Terminating instance#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.585 2 DEBUG nova.compute.manager [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:16:04 np0005465988 kernel: tap8ba41eed-94 (unregistering): left promiscuous mode
Oct  2 08:16:04 np0005465988 NetworkManager[45041]: <info>  [1759407364.6337] device (tap8ba41eed-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:16:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:04Z|00212|binding|INFO|Releasing lport 8ba41eed-9444-4fdf-b14f-d774a574becd from this chassis (sb_readonly=0)
Oct  2 08:16:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:04Z|00213|binding|INFO|Setting lport 8ba41eed-9444-4fdf-b14f-d774a574becd down in Southbound
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:04Z|00214|binding|INFO|Removing iface tap8ba41eed-94 ovn-installed in OVS
Oct  2 08:16:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:04.655 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:ff:c5 10.100.0.6'], port_security=['fa:16:3e:3a:ff:c5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e6452d21-4672-41d0-90d6-f3ce7c7bf0e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34395819-7251-4d97-acea-2b98c07c277f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3204d74f349d47fda3152d9d7fbea43e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e7f67ea6-7556-4acf-963b-57a7d344e510', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c7b3003-1bab-46e9-ac89-20b4b6aed5f5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=8ba41eed-9444-4fdf-b14f-d774a574becd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:04.657 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 8ba41eed-9444-4fdf-b14f-d774a574becd in datapath 34395819-7251-4d97-acea-2b98c07c277f unbound from our chassis#033[00m
Oct  2 08:16:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:04.661 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 34395819-7251-4d97-acea-2b98c07c277f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:16:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:04.662 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0280654c-d521-48b3-a411-e509f176d78c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:04.663 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-34395819-7251-4d97-acea-2b98c07c277f namespace which is not needed anymore#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:04.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:04 np0005465988 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000044.scope: Deactivated successfully.
Oct  2 08:16:04 np0005465988 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000044.scope: Consumed 2.852s CPU time.
Oct  2 08:16:04 np0005465988 systemd-machined[192594]: Machine qemu-26-instance-00000044 terminated.
Oct  2 08:16:04 np0005465988 neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f[265409]: [NOTICE]   (265413) : haproxy version is 2.8.14-c23fe91
Oct  2 08:16:04 np0005465988 neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f[265409]: [NOTICE]   (265413) : path to executable is /usr/sbin/haproxy
Oct  2 08:16:04 np0005465988 neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f[265409]: [WARNING]  (265413) : Exiting Master process...
Oct  2 08:16:04 np0005465988 neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f[265409]: [ALERT]    (265413) : Current worker (265415) exited with code 143 (Terminated)
Oct  2 08:16:04 np0005465988 neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f[265409]: [WARNING]  (265413) : All workers exited. Exiting... (0)
Oct  2 08:16:04 np0005465988 systemd[1]: libpod-adad24fe9187e6e6d87ec4284263beaca079b5eefc3bcdd2374f8eeb2d903d03.scope: Deactivated successfully.
Oct  2 08:16:04 np0005465988 podman[265448]: 2025-10-02 12:16:04.825839397 +0000 UTC m=+0.056927788 container died adad24fe9187e6e6d87ec4284263beaca079b5eefc3bcdd2374f8eeb2d903d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.824 2 INFO nova.virt.libvirt.driver [-] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Instance destroyed successfully.#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.825 2 DEBUG nova.objects.instance [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lazy-loading 'resources' on Instance uuid e6452d21-4672-41d0-90d6-f3ce7c7bf0e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:04 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-adad24fe9187e6e6d87ec4284263beaca079b5eefc3bcdd2374f8eeb2d903d03-userdata-shm.mount: Deactivated successfully.
Oct  2 08:16:04 np0005465988 systemd[1]: var-lib-containers-storage-overlay-8d3c5cb09e08c21a23afe3844065953cc2267d54bacc47d33cb07e00a68b4fde-merged.mount: Deactivated successfully.
Oct  2 08:16:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:04.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:04 np0005465988 podman[265448]: 2025-10-02 12:16:04.866282404 +0000 UTC m=+0.097370795 container cleanup adad24fe9187e6e6d87ec4284263beaca079b5eefc3bcdd2374f8eeb2d903d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:16:04 np0005465988 systemd[1]: libpod-conmon-adad24fe9187e6e6d87ec4284263beaca079b5eefc3bcdd2374f8eeb2d903d03.scope: Deactivated successfully.
Oct  2 08:16:04 np0005465988 podman[265487]: 2025-10-02 12:16:04.924734866 +0000 UTC m=+0.036631857 container remove adad24fe9187e6e6d87ec4284263beaca079b5eefc3bcdd2374f8eeb2d903d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:16:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:04.931 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2c7630b5-72c0-4c28-b48f-4a72e9475ee1]: (4, ('Thu Oct  2 12:16:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f (adad24fe9187e6e6d87ec4284263beaca079b5eefc3bcdd2374f8eeb2d903d03)\nadad24fe9187e6e6d87ec4284263beaca079b5eefc3bcdd2374f8eeb2d903d03\nThu Oct  2 12:16:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-34395819-7251-4d97-acea-2b98c07c277f (adad24fe9187e6e6d87ec4284263beaca079b5eefc3bcdd2374f8eeb2d903d03)\nadad24fe9187e6e6d87ec4284263beaca079b5eefc3bcdd2374f8eeb2d903d03\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:04.932 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[36dd968f-7d2c-4d65-8684-89154a6090f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:04.933 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap34395819-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:04 np0005465988 kernel: tap34395819-70: left promiscuous mode
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:04.956 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c660e9a1-1498-4960-ae86-fbe55d3902a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.971 2 DEBUG nova.compute.manager [req-b647ede1-52fb-45ae-9d0e-192fc9b0e48e req-a763c316-ff76-41f8-b0e0-03a9ef1157d2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Received event network-vif-unplugged-8ba41eed-9444-4fdf-b14f-d774a574becd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.971 2 DEBUG oslo_concurrency.lockutils [req-b647ede1-52fb-45ae-9d0e-192fc9b0e48e req-a763c316-ff76-41f8-b0e0-03a9ef1157d2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.971 2 DEBUG oslo_concurrency.lockutils [req-b647ede1-52fb-45ae-9d0e-192fc9b0e48e req-a763c316-ff76-41f8-b0e0-03a9ef1157d2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.972 2 DEBUG oslo_concurrency.lockutils [req-b647ede1-52fb-45ae-9d0e-192fc9b0e48e req-a763c316-ff76-41f8-b0e0-03a9ef1157d2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.972 2 DEBUG nova.compute.manager [req-b647ede1-52fb-45ae-9d0e-192fc9b0e48e req-a763c316-ff76-41f8-b0e0-03a9ef1157d2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] No waiting events found dispatching network-vif-unplugged-8ba41eed-9444-4fdf-b14f-d774a574becd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:04 np0005465988 nova_compute[236126]: 2025-10-02 12:16:04.972 2 DEBUG nova.compute.manager [req-b647ede1-52fb-45ae-9d0e-192fc9b0e48e req-a763c316-ff76-41f8-b0e0-03a9ef1157d2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Received event network-vif-unplugged-8ba41eed-9444-4fdf-b14f-d774a574becd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:16:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:04.994 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ccf174ca-5582-4aa3-a9be-3b023fd8463f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:04.995 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[db635d69-e6d9-44a9-9994-3e5f21448b50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:05.008 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c58f33ea-eca1-4f68-bb1f-07856b43d8c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542067, 'reachable_time': 37534, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265505, 'error': None, 'target': 'ovnmeta-34395819-7251-4d97-acea-2b98c07c277f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:05.011 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-34395819-7251-4d97-acea-2b98c07c277f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:16:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:05.011 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[01cdf2c5-a779-495b-bee1-ba3b55eea063]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:05 np0005465988 systemd[1]: run-netns-ovnmeta\x2d34395819\x2d7251\x2d4d97\x2dacea\x2d2b98c07c277f.mount: Deactivated successfully.
Oct  2 08:16:05 np0005465988 nova_compute[236126]: 2025-10-02 12:16:05.027 2 DEBUG nova.virt.libvirt.vif [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:15:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-2048086633',display_name='tempest-ImagesOneServerNegativeTestJSON-server-2048086633',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-2048086633',id=68,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3204d74f349d47fda3152d9d7fbea43e',ramdisk_id='',reservation_id='r-mdwiujaz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1213912851',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1213912851-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:16:02Z,user_data=None,user_id='72c74994085d4fc697ddd4acddfa7a11',uuid=e6452d21-4672-41d0-90d6-f3ce7c7bf0e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8ba41eed-9444-4fdf-b14f-d774a574becd", "address": "fa:16:3e:3a:ff:c5", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ba41eed-94", "ovs_interfaceid": "8ba41eed-9444-4fdf-b14f-d774a574becd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:16:05 np0005465988 nova_compute[236126]: 2025-10-02 12:16:05.027 2 DEBUG nova.network.os_vif_util [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Converting VIF {"id": "8ba41eed-9444-4fdf-b14f-d774a574becd", "address": "fa:16:3e:3a:ff:c5", "network": {"id": "34395819-7251-4d97-acea-2b98c07c277f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1049268166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3204d74f349d47fda3152d9d7fbea43e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ba41eed-94", "ovs_interfaceid": "8ba41eed-9444-4fdf-b14f-d774a574becd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:05 np0005465988 nova_compute[236126]: 2025-10-02 12:16:05.029 2 DEBUG nova.network.os_vif_util [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:ff:c5,bridge_name='br-int',has_traffic_filtering=True,id=8ba41eed-9444-4fdf-b14f-d774a574becd,network=Network(34395819-7251-4d97-acea-2b98c07c277f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ba41eed-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:05 np0005465988 nova_compute[236126]: 2025-10-02 12:16:05.030 2 DEBUG os_vif [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:ff:c5,bridge_name='br-int',has_traffic_filtering=True,id=8ba41eed-9444-4fdf-b14f-d774a574becd,network=Network(34395819-7251-4d97-acea-2b98c07c277f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ba41eed-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:16:05 np0005465988 nova_compute[236126]: 2025-10-02 12:16:05.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:05 np0005465988 nova_compute[236126]: 2025-10-02 12:16:05.034 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8ba41eed-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:05 np0005465988 nova_compute[236126]: 2025-10-02 12:16:05.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:05 np0005465988 nova_compute[236126]: 2025-10-02 12:16:05.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:16:05 np0005465988 nova_compute[236126]: 2025-10-02 12:16:05.096 2 INFO os_vif [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:ff:c5,bridge_name='br-int',has_traffic_filtering=True,id=8ba41eed-9444-4fdf-b14f-d774a574becd,network=Network(34395819-7251-4d97-acea-2b98c07c277f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ba41eed-94')#033[00m
Oct  2 08:16:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:06.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:06.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:07 np0005465988 nova_compute[236126]: 2025-10-02 12:16:07.175 2 DEBUG nova.compute.manager [req-9a0ebef1-0ad2-4dde-a944-34875a6c9a38 req-4e411d32-5c9c-4510-8f34-759cc042616f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Received event network-vif-plugged-8ba41eed-9444-4fdf-b14f-d774a574becd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:07 np0005465988 nova_compute[236126]: 2025-10-02 12:16:07.176 2 DEBUG oslo_concurrency.lockutils [req-9a0ebef1-0ad2-4dde-a944-34875a6c9a38 req-4e411d32-5c9c-4510-8f34-759cc042616f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:07 np0005465988 nova_compute[236126]: 2025-10-02 12:16:07.176 2 DEBUG oslo_concurrency.lockutils [req-9a0ebef1-0ad2-4dde-a944-34875a6c9a38 req-4e411d32-5c9c-4510-8f34-759cc042616f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:07 np0005465988 nova_compute[236126]: 2025-10-02 12:16:07.176 2 DEBUG oslo_concurrency.lockutils [req-9a0ebef1-0ad2-4dde-a944-34875a6c9a38 req-4e411d32-5c9c-4510-8f34-759cc042616f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:07 np0005465988 nova_compute[236126]: 2025-10-02 12:16:07.176 2 DEBUG nova.compute.manager [req-9a0ebef1-0ad2-4dde-a944-34875a6c9a38 req-4e411d32-5c9c-4510-8f34-759cc042616f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] No waiting events found dispatching network-vif-plugged-8ba41eed-9444-4fdf-b14f-d774a574becd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:07 np0005465988 nova_compute[236126]: 2025-10-02 12:16:07.177 2 WARNING nova.compute.manager [req-9a0ebef1-0ad2-4dde-a944-34875a6c9a38 req-4e411d32-5c9c-4510-8f34-759cc042616f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Received unexpected event network-vif-plugged-8ba41eed-9444-4fdf-b14f-d774a574becd for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:16:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:16:07 np0005465988 nova_compute[236126]: 2025-10-02 12:16:07.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:07 np0005465988 nova_compute[236126]: 2025-10-02 12:16:07.609 2 INFO nova.virt.libvirt.driver [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Deleting instance files /var/lib/nova/instances/e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_del#033[00m
Oct  2 08:16:07 np0005465988 nova_compute[236126]: 2025-10-02 12:16:07.610 2 INFO nova.virt.libvirt.driver [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Deletion of /var/lib/nova/instances/e6452d21-4672-41d0-90d6-f3ce7c7bf0e1_del complete#033[00m
Oct  2 08:16:07 np0005465988 nova_compute[236126]: 2025-10-02 12:16:07.685 2 INFO nova.compute.manager [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Took 3.10 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:16:07 np0005465988 nova_compute[236126]: 2025-10-02 12:16:07.686 2 DEBUG oslo.service.loopingcall [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:16:07 np0005465988 nova_compute[236126]: 2025-10-02 12:16:07.686 2 DEBUG nova.compute.manager [-] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:16:07 np0005465988 nova_compute[236126]: 2025-10-02 12:16:07.687 2 DEBUG nova.network.neutron [-] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:16:08 np0005465988 nova_compute[236126]: 2025-10-02 12:16:08.435 2 DEBUG nova.network.neutron [-] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:08 np0005465988 nova_compute[236126]: 2025-10-02 12:16:08.451 2 INFO nova.compute.manager [-] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Took 0.76 seconds to deallocate network for instance.#033[00m
Oct  2 08:16:08 np0005465988 nova_compute[236126]: 2025-10-02 12:16:08.518 2 DEBUG oslo_concurrency.lockutils [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:08 np0005465988 nova_compute[236126]: 2025-10-02 12:16:08.519 2 DEBUG oslo_concurrency.lockutils [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:08 np0005465988 nova_compute[236126]: 2025-10-02 12:16:08.640 2 DEBUG oslo_concurrency.processutils [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:08.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:08.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:16:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3560707684' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:16:09 np0005465988 nova_compute[236126]: 2025-10-02 12:16:09.114 2 DEBUG oslo_concurrency.processutils [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:09 np0005465988 nova_compute[236126]: 2025-10-02 12:16:09.122 2 DEBUG nova.compute.provider_tree [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:09 np0005465988 nova_compute[236126]: 2025-10-02 12:16:09.141 2 DEBUG nova.scheduler.client.report [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:09 np0005465988 nova_compute[236126]: 2025-10-02 12:16:09.168 2 DEBUG oslo_concurrency.lockutils [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:09 np0005465988 nova_compute[236126]: 2025-10-02 12:16:09.195 2 INFO nova.scheduler.client.report [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Deleted allocations for instance e6452d21-4672-41d0-90d6-f3ce7c7bf0e1#033[00m
Oct  2 08:16:09 np0005465988 nova_compute[236126]: 2025-10-02 12:16:09.274 2 DEBUG nova.compute.manager [req-78a853e3-60fa-492a-a39f-2fede8259675 req-b55a7157-81e9-45e2-b0a4-1cdaa8337771 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Received event network-vif-deleted-8ba41eed-9444-4fdf-b14f-d774a574becd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:09 np0005465988 nova_compute[236126]: 2025-10-02 12:16:09.289 2 DEBUG oslo_concurrency.lockutils [None req-250d7f86-8d45-4ac9-906d-c91a02578953 72c74994085d4fc697ddd4acddfa7a11 3204d74f349d47fda3152d9d7fbea43e - - default default] Lock "e6452d21-4672-41d0-90d6-f3ce7c7bf0e1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:10 np0005465988 nova_compute[236126]: 2025-10-02 12:16:10.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:10.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:10.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:11 np0005465988 podman[265551]: 2025-10-02 12:16:11.525200132 +0000 UTC m=+0.054970681 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:16:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:16:12 np0005465988 nova_compute[236126]: 2025-10-02 12:16:12.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:16:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:12.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:16:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:12.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:16:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:14.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:16:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:14.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:15 np0005465988 nova_compute[236126]: 2025-10-02 12:16:15.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:16:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:16.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:16:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:16.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:16 np0005465988 nova_compute[236126]: 2025-10-02 12:16:16.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:16:17 np0005465988 nova_compute[236126]: 2025-10-02 12:16:17.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:17.861 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:17 np0005465988 nova_compute[236126]: 2025-10-02 12:16:17.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:17.863 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:16:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:16:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:18.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:16:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:18.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:19 np0005465988 nova_compute[236126]: 2025-10-02 12:16:19.822 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407364.8207893, e6452d21-4672-41d0-90d6-f3ce7c7bf0e1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:19 np0005465988 nova_compute[236126]: 2025-10-02 12:16:19.823 2 INFO nova.compute.manager [-] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:16:19 np0005465988 nova_compute[236126]: 2025-10-02 12:16:19.841 2 DEBUG nova.compute.manager [None req-8b9ddc67-a418-47bf-9de4-7a183f068d43 - - - - - -] [instance: e6452d21-4672-41d0-90d6-f3ce7c7bf0e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:20 np0005465988 nova_compute[236126]: 2025-10-02 12:16:20.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:16:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:20.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:16:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:20.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:21.866 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:16:22 np0005465988 nova_compute[236126]: 2025-10-02 12:16:22.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:22.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:22.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e221 e221: 3 total, 3 up, 3 in
Oct  2 08:16:23 np0005465988 nova_compute[236126]: 2025-10-02 12:16:23.315 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquiring lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:23 np0005465988 nova_compute[236126]: 2025-10-02 12:16:23.315 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:23 np0005465988 nova_compute[236126]: 2025-10-02 12:16:23.342 2 DEBUG nova.compute.manager [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:16:23 np0005465988 nova_compute[236126]: 2025-10-02 12:16:23.431 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:23 np0005465988 nova_compute[236126]: 2025-10-02 12:16:23.432 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:23 np0005465988 nova_compute[236126]: 2025-10-02 12:16:23.443 2 DEBUG nova.virt.hardware [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:16:23 np0005465988 nova_compute[236126]: 2025-10-02 12:16:23.444 2 INFO nova.compute.claims [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:16:23 np0005465988 nova_compute[236126]: 2025-10-02 12:16:23.603 2 DEBUG oslo_concurrency.processutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:16:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/758705718' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.066 2 DEBUG oslo_concurrency.processutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.073 2 DEBUG nova.compute.provider_tree [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.102 2 DEBUG nova.scheduler.client.report [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.138 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.705s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.139 2 DEBUG nova.compute.manager [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.206 2 DEBUG nova.compute.manager [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.207 2 DEBUG nova.network.neutron [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.233 2 INFO nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.254 2 DEBUG nova.compute.manager [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.345 2 DEBUG nova.compute.manager [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.346 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.347 2 INFO nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Creating image(s)#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.383 2 DEBUG nova.storage.rbd_utils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] rbd image d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.424 2 DEBUG nova.storage.rbd_utils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] rbd image d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.474 2 DEBUG nova.storage.rbd_utils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] rbd image d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.480 2 DEBUG oslo_concurrency.processutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.516 2 DEBUG nova.policy [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8e627e098f2b465d97099fee8f489b71', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bddf59854a7d4eb1a85ece547923cea0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.551 2 DEBUG oslo_concurrency.processutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.553 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.554 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.554 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.594 2 DEBUG nova.storage.rbd_utils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] rbd image d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:16:24 np0005465988 nova_compute[236126]: 2025-10-02 12:16:24.599 2 DEBUG oslo_concurrency.processutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:24.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:24.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:25 np0005465988 nova_compute[236126]: 2025-10-02 12:16:25.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:25 np0005465988 nova_compute[236126]: 2025-10-02 12:16:25.308 2 DEBUG oslo_concurrency.processutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.710s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:25 np0005465988 nova_compute[236126]: 2025-10-02 12:16:25.420 2 DEBUG nova.storage.rbd_utils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] resizing rbd image d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:16:25 np0005465988 nova_compute[236126]: 2025-10-02 12:16:25.570 2 DEBUG nova.objects.instance [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lazy-loading 'migration_context' on Instance uuid d0fb1236-bd41-4efe-8e6a-bb900eb86960 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:25 np0005465988 nova_compute[236126]: 2025-10-02 12:16:25.593 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:16:25 np0005465988 nova_compute[236126]: 2025-10-02 12:16:25.594 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Ensure instance console log exists: /var/lib/nova/instances/d0fb1236-bd41-4efe-8e6a-bb900eb86960/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:16:25 np0005465988 nova_compute[236126]: 2025-10-02 12:16:25.594 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:25 np0005465988 nova_compute[236126]: 2025-10-02 12:16:25.595 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:25 np0005465988 nova_compute[236126]: 2025-10-02 12:16:25.596 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:25 np0005465988 nova_compute[236126]: 2025-10-02 12:16:25.683 2 DEBUG nova.network.neutron [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Successfully created port: ac8da617-201c-4081-8414-4b18e26dfb4a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:16:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e222 e222: 3 total, 3 up, 3 in
Oct  2 08:16:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:26.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:26.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:26 np0005465988 nova_compute[236126]: 2025-10-02 12:16:26.911 2 DEBUG nova.network.neutron [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Successfully updated port: ac8da617-201c-4081-8414-4b18e26dfb4a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:16:26 np0005465988 nova_compute[236126]: 2025-10-02 12:16:26.944 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquiring lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:26 np0005465988 nova_compute[236126]: 2025-10-02 12:16:26.945 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquired lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:26 np0005465988 nova_compute[236126]: 2025-10-02 12:16:26.945 2 DEBUG nova.network.neutron [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:16:27 np0005465988 nova_compute[236126]: 2025-10-02 12:16:27.125 2 DEBUG nova.compute.manager [req-c2b7df1b-f75d-43c5-b269-ced8e0581dd4 req-a2833fc7-5366-4c5f-89de-6fb14f99fc6a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received event network-changed-ac8da617-201c-4081-8414-4b18e26dfb4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:27 np0005465988 nova_compute[236126]: 2025-10-02 12:16:27.125 2 DEBUG nova.compute.manager [req-c2b7df1b-f75d-43c5-b269-ced8e0581dd4 req-a2833fc7-5366-4c5f-89de-6fb14f99fc6a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Refreshing instance network info cache due to event network-changed-ac8da617-201c-4081-8414-4b18e26dfb4a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:16:27 np0005465988 nova_compute[236126]: 2025-10-02 12:16:27.125 2 DEBUG oslo_concurrency.lockutils [req-c2b7df1b-f75d-43c5-b269-ced8e0581dd4 req-a2833fc7-5366-4c5f-89de-6fb14f99fc6a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:27 np0005465988 nova_compute[236126]: 2025-10-02 12:16:27.258 2 DEBUG nova.network.neutron [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:16:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:16:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:27.342 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:27.343 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:27.343 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:27 np0005465988 podman[265817]: 2025-10-02 12:16:27.567096232 +0000 UTC m=+0.094188042 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  2 08:16:27 np0005465988 podman[265818]: 2025-10-02 12:16:27.569285076 +0000 UTC m=+0.095503491 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:16:27 np0005465988 podman[265816]: 2025-10-02 12:16:27.574759476 +0000 UTC m=+0.115361069 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:16:27 np0005465988 nova_compute[236126]: 2025-10-02 12:16:27.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:28.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.842 2 DEBUG nova.network.neutron [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updating instance_info_cache with network_info: [{"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.868 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Releasing lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.869 2 DEBUG nova.compute.manager [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Instance network_info: |[{"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.870 2 DEBUG oslo_concurrency.lockutils [req-c2b7df1b-f75d-43c5-b269-ced8e0581dd4 req-a2833fc7-5366-4c5f-89de-6fb14f99fc6a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.871 2 DEBUG nova.network.neutron [req-c2b7df1b-f75d-43c5-b269-ced8e0581dd4 req-a2833fc7-5366-4c5f-89de-6fb14f99fc6a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Refreshing network info cache for port ac8da617-201c-4081-8414-4b18e26dfb4a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.874 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Start _get_guest_xml network_info=[{"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.881 2 WARNING nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:16:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:16:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:28.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.886 2 DEBUG nova.virt.libvirt.host [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.887 2 DEBUG nova.virt.libvirt.host [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.897 2 DEBUG nova.virt.libvirt.host [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.898 2 DEBUG nova.virt.libvirt.host [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.900 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.900 2 DEBUG nova.virt.hardware [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.901 2 DEBUG nova.virt.hardware [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.902 2 DEBUG nova.virt.hardware [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.902 2 DEBUG nova.virt.hardware [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.903 2 DEBUG nova.virt.hardware [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.903 2 DEBUG nova.virt.hardware [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.904 2 DEBUG nova.virt.hardware [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.904 2 DEBUG nova.virt.hardware [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.904 2 DEBUG nova.virt.hardware [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.905 2 DEBUG nova.virt.hardware [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.905 2 DEBUG nova.virt.hardware [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:16:28 np0005465988 nova_compute[236126]: 2025-10-02 12:16:28.909 2 DEBUG oslo_concurrency.processutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e223 e223: 3 total, 3 up, 3 in
Oct  2 08:16:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:16:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4277439159' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.370 2 DEBUG oslo_concurrency.processutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.401 2 DEBUG nova.storage.rbd_utils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] rbd image d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.405 2 DEBUG oslo_concurrency.processutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:16:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/435369187' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.859 2 DEBUG oslo_concurrency.processutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.864 2 DEBUG nova.virt.libvirt.vif [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1313126078',display_name='tempest-tempest.common.compute-instance-1313126078',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1313126078',id=70,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBATuG0Igs/aClVmsQdGPZDgNmvV7Bgyq0Y4nFm5M4crDAlAvWuUKcpXWJIzqspIHqd22HRQCaWbqLnJi4PrcYo9nS6OUtnl8L8k0zpd1d2KcZKsipjVMcOXUmRyImfEBhg==',key_name='tempest-keypair-716714059',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bddf59854a7d4eb1a85ece547923cea0',ramdisk_id='',reservation_id='r-vu95vf5t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-751325164',owner_user_name='tempest-AttachInterfacesTestJSON-751325164-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8e627e098f2b465d97099fee8f489b71',uuid=d0fb1236-bd41-4efe-8e6a-bb900eb86960,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.865 2 DEBUG nova.network.os_vif_util [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Converting VIF {"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.868 2 DEBUG nova.network.os_vif_util [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:f8:40,bridge_name='br-int',has_traffic_filtering=True,id=ac8da617-201c-4081-8414-4b18e26dfb4a,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac8da617-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.871 2 DEBUG nova.objects.instance [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lazy-loading 'pci_devices' on Instance uuid d0fb1236-bd41-4efe-8e6a-bb900eb86960 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.894 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  <uuid>d0fb1236-bd41-4efe-8e6a-bb900eb86960</uuid>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  <name>instance-00000046</name>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <nova:name>tempest-tempest.common.compute-instance-1313126078</nova:name>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:16:28</nova:creationTime>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <nova:user uuid="8e627e098f2b465d97099fee8f489b71">tempest-AttachInterfacesTestJSON-751325164-project-member</nova:user>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <nova:project uuid="bddf59854a7d4eb1a85ece547923cea0">tempest-AttachInterfacesTestJSON-751325164</nova:project>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <nova:port uuid="ac8da617-201c-4081-8414-4b18e26dfb4a">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <entry name="serial">d0fb1236-bd41-4efe-8e6a-bb900eb86960</entry>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <entry name="uuid">d0fb1236-bd41-4efe-8e6a-bb900eb86960</entry>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk.config">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:15:f8:40"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <target dev="tapac8da617-20"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/d0fb1236-bd41-4efe-8e6a-bb900eb86960/console.log" append="off"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:16:29 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:16:29 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:16:29 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:16:29 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.897 2 DEBUG nova.compute.manager [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Preparing to wait for external event network-vif-plugged-ac8da617-201c-4081-8414-4b18e26dfb4a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.897 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquiring lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.898 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.898 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.899 2 DEBUG nova.virt.libvirt.vif [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1313126078',display_name='tempest-tempest.common.compute-instance-1313126078',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1313126078',id=70,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBATuG0Igs/aClVmsQdGPZDgNmvV7Bgyq0Y4nFm5M4crDAlAvWuUKcpXWJIzqspIHqd22HRQCaWbqLnJi4PrcYo9nS6OUtnl8L8k0zpd1d2KcZKsipjVMcOXUmRyImfEBhg==',key_name='tempest-keypair-716714059',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bddf59854a7d4eb1a85ece547923cea0',ramdisk_id='',reservation_id='r-vu95vf5t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-751325164',owner_user_name='tempest-AttachInterfacesTestJSON-751325164-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8e627e098f2b465d97099fee8f489b71',uuid=d0fb1236-bd41-4efe-8e6a-bb900eb86960,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.899 2 DEBUG nova.network.os_vif_util [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Converting VIF {"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.900 2 DEBUG nova.network.os_vif_util [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:f8:40,bridge_name='br-int',has_traffic_filtering=True,id=ac8da617-201c-4081-8414-4b18e26dfb4a,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac8da617-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.900 2 DEBUG os_vif [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:f8:40,bridge_name='br-int',has_traffic_filtering=True,id=ac8da617-201c-4081-8414-4b18e26dfb4a,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac8da617-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.902 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.902 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.905 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac8da617-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.905 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapac8da617-20, col_values=(('external_ids', {'iface-id': 'ac8da617-201c-4081-8414-4b18e26dfb4a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:f8:40', 'vm-uuid': 'd0fb1236-bd41-4efe-8e6a-bb900eb86960'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:29 np0005465988 NetworkManager[45041]: <info>  [1759407389.9455] manager: (tapac8da617-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:29 np0005465988 nova_compute[236126]: 2025-10-02 12:16:29.953 2 INFO os_vif [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:f8:40,bridge_name='br-int',has_traffic_filtering=True,id=ac8da617-201c-4081-8414-4b18e26dfb4a,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac8da617-20')#033[00m
Oct  2 08:16:30 np0005465988 nova_compute[236126]: 2025-10-02 12:16:30.019 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:30 np0005465988 nova_compute[236126]: 2025-10-02 12:16:30.020 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:30 np0005465988 nova_compute[236126]: 2025-10-02 12:16:30.020 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] No VIF found with MAC fa:16:3e:15:f8:40, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:16:30 np0005465988 nova_compute[236126]: 2025-10-02 12:16:30.021 2 INFO nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Using config drive#033[00m
Oct  2 08:16:30 np0005465988 nova_compute[236126]: 2025-10-02 12:16:30.056 2 DEBUG nova.storage.rbd_utils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] rbd image d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:16:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:30.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:30.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:30 np0005465988 nova_compute[236126]: 2025-10-02 12:16:30.937 2 INFO nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Creating config drive at /var/lib/nova/instances/d0fb1236-bd41-4efe-8e6a-bb900eb86960/disk.config#033[00m
Oct  2 08:16:30 np0005465988 nova_compute[236126]: 2025-10-02 12:16:30.943 2 DEBUG oslo_concurrency.processutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d0fb1236-bd41-4efe-8e6a-bb900eb86960/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcnbm8dka execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.031 2 DEBUG nova.network.neutron [req-c2b7df1b-f75d-43c5-b269-ced8e0581dd4 req-a2833fc7-5366-4c5f-89de-6fb14f99fc6a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updated VIF entry in instance network info cache for port ac8da617-201c-4081-8414-4b18e26dfb4a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.032 2 DEBUG nova.network.neutron [req-c2b7df1b-f75d-43c5-b269-ced8e0581dd4 req-a2833fc7-5366-4c5f-89de-6fb14f99fc6a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updating instance_info_cache with network_info: [{"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.049 2 DEBUG oslo_concurrency.lockutils [req-c2b7df1b-f75d-43c5-b269-ced8e0581dd4 req-a2833fc7-5366-4c5f-89de-6fb14f99fc6a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.095 2 DEBUG oslo_concurrency.processutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d0fb1236-bd41-4efe-8e6a-bb900eb86960/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcnbm8dka" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.122 2 DEBUG nova.storage.rbd_utils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] rbd image d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.125 2 DEBUG oslo_concurrency.processutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d0fb1236-bd41-4efe-8e6a-bb900eb86960/disk.config d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.273 2 DEBUG oslo_concurrency.processutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d0fb1236-bd41-4efe-8e6a-bb900eb86960/disk.config d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.274 2 INFO nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Deleting local config drive /var/lib/nova/instances/d0fb1236-bd41-4efe-8e6a-bb900eb86960/disk.config because it was imported into RBD.#033[00m
Oct  2 08:16:31 np0005465988 NetworkManager[45041]: <info>  [1759407391.3365] manager: (tapac8da617-20): new Tun device (/org/freedesktop/NetworkManager/Devices/117)
Oct  2 08:16:31 np0005465988 kernel: tapac8da617-20: entered promiscuous mode
Oct  2 08:16:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:31Z|00215|binding|INFO|Claiming lport ac8da617-201c-4081-8414-4b18e26dfb4a for this chassis.
Oct  2 08:16:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:31Z|00216|binding|INFO|ac8da617-201c-4081-8414-4b18e26dfb4a: Claiming fa:16:3e:15:f8:40 10.100.0.3
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.358 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:f8:40 10.100.0.3'], port_security=['fa:16:3e:15:f8:40 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd0fb1236-bd41-4efe-8e6a-bb900eb86960', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee114210-598c-482f-83c7-26c3363a45c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bddf59854a7d4eb1a85ece547923cea0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fd3b62a4-b346-4888-8f6a-ca787221af6a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6f6a61f-52bf-4639-80ed-e59f98940f90, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=ac8da617-201c-4081-8414-4b18e26dfb4a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.359 142124 INFO neutron.agent.ovn.metadata.agent [-] Port ac8da617-201c-4081-8414-4b18e26dfb4a in datapath ee114210-598c-482f-83c7-26c3363a45c4 bound to our chassis#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.361 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee114210-598c-482f-83c7-26c3363a45c4#033[00m
Oct  2 08:16:31 np0005465988 systemd-udevd[266017]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:31 np0005465988 systemd-machined[192594]: New machine qemu-27-instance-00000046.
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.374 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7158888e-6cef-4d08-b630-02c50d2db460]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.375 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee114210-51 in ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.379 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee114210-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.380 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3a90ce59-5370-40e2-8148-302f61b2b5bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.381 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[894c0a64-54ef-4c1e-aafc-4d586984d186]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 systemd[1]: Started Virtual Machine qemu-27-instance-00000046.
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.393 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[2719b869-ac5d-434c-9bd2-cd3878182534]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 NetworkManager[45041]: <info>  [1759407391.3963] device (tapac8da617-20): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:16:31 np0005465988 NetworkManager[45041]: <info>  [1759407391.3971] device (tapac8da617-20): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:16:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:31Z|00217|binding|INFO|Setting lport ac8da617-201c-4081-8414-4b18e26dfb4a ovn-installed in OVS
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.431 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e7da283f-faf5-462a-9305-e56d525a4e31]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:31Z|00218|binding|INFO|Setting lport ac8da617-201c-4081-8414-4b18e26dfb4a up in Southbound
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.480 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[16769def-f9c8-404f-a008-175ba323eee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.487 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[383dc5d1-503d-48ba-b499-7e06e346e184]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 NetworkManager[45041]: <info>  [1759407391.4889] manager: (tapee114210-50): new Veth device (/org/freedesktop/NetworkManager/Devices/118)
Oct  2 08:16:31 np0005465988 systemd-udevd[266020]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.529 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f5ac6052-99d2-45dc-8a99-7311a1368ea3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.534 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[7a36386d-dd68-4978-965c-5c9f0baef4f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 NetworkManager[45041]: <info>  [1759407391.5690] device (tapee114210-50): carrier: link connected
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.577 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[0bfc811e-b654-49ce-bfb6-62f67a98f699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.608 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b7e7bb39-9890-43c1-ba52-ae14af7d10e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee114210-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:86:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545190, 'reachable_time': 27061, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266051, 'error': None, 'target': 'ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.630 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[78830cb8-26ae-471e-adf2-d65a58445aec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fead:8699'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 545190, 'tstamp': 545190}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 266052, 'error': None, 'target': 'ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.658 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1b1f6739-b00e-4976-a211-69d25fb5a112]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee114210-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:86:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545190, 'reachable_time': 27061, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 266053, 'error': None, 'target': 'ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.708 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cf1f186b-2bb6-48d7-8be5-d35bc825cb6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.739 2 DEBUG nova.compute.manager [req-c31d541a-7bac-407b-8fbb-9a9956df611d req-d7280f67-3e43-4754-a55a-81d6b27caf45 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received event network-vif-plugged-ac8da617-201c-4081-8414-4b18e26dfb4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.740 2 DEBUG oslo_concurrency.lockutils [req-c31d541a-7bac-407b-8fbb-9a9956df611d req-d7280f67-3e43-4754-a55a-81d6b27caf45 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.740 2 DEBUG oslo_concurrency.lockutils [req-c31d541a-7bac-407b-8fbb-9a9956df611d req-d7280f67-3e43-4754-a55a-81d6b27caf45 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.741 2 DEBUG oslo_concurrency.lockutils [req-c31d541a-7bac-407b-8fbb-9a9956df611d req-d7280f67-3e43-4754-a55a-81d6b27caf45 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.741 2 DEBUG nova.compute.manager [req-c31d541a-7bac-407b-8fbb-9a9956df611d req-d7280f67-3e43-4754-a55a-81d6b27caf45 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Processing event network-vif-plugged-ac8da617-201c-4081-8414-4b18e26dfb4a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.807 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[565bad4f-60f6-42b4-b836-7d2a61e386d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.809 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee114210-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.810 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.811 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee114210-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:31 np0005465988 kernel: tapee114210-50: entered promiscuous mode
Oct  2 08:16:31 np0005465988 NetworkManager[45041]: <info>  [1759407391.8143] manager: (tapee114210-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.817 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee114210-50, col_values=(('external_ids', {'iface-id': '544b20f9-e292-46bc-a770-180c318d534e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:31Z|00219|binding|INFO|Releasing lport 544b20f9-e292-46bc-a770-180c318d534e from this chassis (sb_readonly=0)
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.822 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee114210-598c-482f-83c7-26c3363a45c4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee114210-598c-482f-83c7-26c3363a45c4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.823 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[302a1ed4-e387-4a9b-9c26-83bdaca0f4fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.824 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-ee114210-598c-482f-83c7-26c3363a45c4
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/ee114210-598c-482f-83c7-26c3363a45c4.pid.haproxy
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID ee114210-598c-482f-83c7-26c3363a45c4
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:16:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:31.828 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4', 'env', 'PROCESS_TAG=haproxy-ee114210-598c-482f-83c7-26c3363a45c4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee114210-598c-482f-83c7-26c3363a45c4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:16:31 np0005465988 nova_compute[236126]: 2025-10-02 12:16:31.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:16:32 np0005465988 podman[266127]: 2025-10-02 12:16:32.323149723 +0000 UTC m=+0.068076682 container create e7fd444e6ab01eb8064343f9f87f8e89240317b17b8d44f63115aa2cca7b1a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:16:32 np0005465988 systemd[1]: Started libpod-conmon-e7fd444e6ab01eb8064343f9f87f8e89240317b17b8d44f63115aa2cca7b1a51.scope.
Oct  2 08:16:32 np0005465988 podman[266127]: 2025-10-02 12:16:32.29520221 +0000 UTC m=+0.040129209 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:16:32 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:16:32 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9261db82d16f56b57727a2ea2d60bfcdc4371cfe6284e2958325b8e937bb082/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:16:32 np0005465988 podman[266127]: 2025-10-02 12:16:32.432751463 +0000 UTC m=+0.177678482 container init e7fd444e6ab01eb8064343f9f87f8e89240317b17b8d44f63115aa2cca7b1a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:16:32 np0005465988 podman[266127]: 2025-10-02 12:16:32.437873692 +0000 UTC m=+0.182800671 container start e7fd444e6ab01eb8064343f9f87f8e89240317b17b8d44f63115aa2cca7b1a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:16:32 np0005465988 neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4[266142]: [NOTICE]   (266146) : New worker (266148) forked
Oct  2 08:16:32 np0005465988 neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4[266142]: [NOTICE]   (266146) : Loading success.
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.712 2 DEBUG nova.compute.manager [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.713 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407392.7133293, d0fb1236-bd41-4efe-8e6a-bb900eb86960 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.713 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] VM Started (Lifecycle Event)#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.716 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.720 2 INFO nova.virt.libvirt.driver [-] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Instance spawned successfully.#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.720 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.732 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.737 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.773 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.774 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407392.715677, d0fb1236-bd41-4efe-8e6a-bb900eb86960 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.774 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:16:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:32.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.780 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.781 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.784 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.785 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.785 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.786 2 DEBUG nova.virt.libvirt.driver [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.795 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.800 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407392.7158365, d0fb1236-bd41-4efe-8e6a-bb900eb86960 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.800 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.844 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.855 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.868 2 INFO nova.compute.manager [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Took 8.52 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.868 2 DEBUG nova.compute.manager [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:32.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.897 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.963 2 INFO nova.compute.manager [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Took 9.57 seconds to build instance.#033[00m
Oct  2 08:16:32 np0005465988 nova_compute[236126]: 2025-10-02 12:16:32.993 2 DEBUG oslo_concurrency.lockutils [None req-659350fe-7302-49f4-a1db-8b137a4d4dae 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:33 np0005465988 nova_compute[236126]: 2025-10-02 12:16:33.854 2 DEBUG nova.compute.manager [req-4bdb7cdd-0fec-431f-b9bd-24c1d99e5c09 req-9047091a-d622-4269-a0fb-00fe2fc9a5e1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received event network-vif-plugged-ac8da617-201c-4081-8414-4b18e26dfb4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:33 np0005465988 nova_compute[236126]: 2025-10-02 12:16:33.855 2 DEBUG oslo_concurrency.lockutils [req-4bdb7cdd-0fec-431f-b9bd-24c1d99e5c09 req-9047091a-d622-4269-a0fb-00fe2fc9a5e1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:33 np0005465988 nova_compute[236126]: 2025-10-02 12:16:33.855 2 DEBUG oslo_concurrency.lockutils [req-4bdb7cdd-0fec-431f-b9bd-24c1d99e5c09 req-9047091a-d622-4269-a0fb-00fe2fc9a5e1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:33 np0005465988 nova_compute[236126]: 2025-10-02 12:16:33.855 2 DEBUG oslo_concurrency.lockutils [req-4bdb7cdd-0fec-431f-b9bd-24c1d99e5c09 req-9047091a-d622-4269-a0fb-00fe2fc9a5e1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:33 np0005465988 nova_compute[236126]: 2025-10-02 12:16:33.856 2 DEBUG nova.compute.manager [req-4bdb7cdd-0fec-431f-b9bd-24c1d99e5c09 req-9047091a-d622-4269-a0fb-00fe2fc9a5e1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] No waiting events found dispatching network-vif-plugged-ac8da617-201c-4081-8414-4b18e26dfb4a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:33 np0005465988 nova_compute[236126]: 2025-10-02 12:16:33.856 2 WARNING nova.compute.manager [req-4bdb7cdd-0fec-431f-b9bd-24c1d99e5c09 req-9047091a-d622-4269-a0fb-00fe2fc9a5e1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received unexpected event network-vif-plugged-ac8da617-201c-4081-8414-4b18e26dfb4a for instance with vm_state active and task_state None.#033[00m
Oct  2 08:16:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:34.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:34.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:34 np0005465988 nova_compute[236126]: 2025-10-02 12:16:34.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:35 np0005465988 NetworkManager[45041]: <info>  [1759407395.2794] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Oct  2 08:16:35 np0005465988 NetworkManager[45041]: <info>  [1759407395.2811] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Oct  2 08:16:35 np0005465988 nova_compute[236126]: 2025-10-02 12:16:35.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:35 np0005465988 nova_compute[236126]: 2025-10-02 12:16:35.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:35Z|00220|binding|INFO|Releasing lport 544b20f9-e292-46bc-a770-180c318d534e from this chassis (sb_readonly=0)
Oct  2 08:16:35 np0005465988 nova_compute[236126]: 2025-10-02 12:16:35.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:36 np0005465988 nova_compute[236126]: 2025-10-02 12:16:36.066 2 DEBUG nova.compute.manager [req-d455f2eb-8fb6-456f-a725-588c80d757cc req-145fbeb6-107a-4444-a812-7d5c5c686a5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received event network-changed-ac8da617-201c-4081-8414-4b18e26dfb4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:36 np0005465988 nova_compute[236126]: 2025-10-02 12:16:36.066 2 DEBUG nova.compute.manager [req-d455f2eb-8fb6-456f-a725-588c80d757cc req-145fbeb6-107a-4444-a812-7d5c5c686a5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Refreshing instance network info cache due to event network-changed-ac8da617-201c-4081-8414-4b18e26dfb4a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:16:36 np0005465988 nova_compute[236126]: 2025-10-02 12:16:36.067 2 DEBUG oslo_concurrency.lockutils [req-d455f2eb-8fb6-456f-a725-588c80d757cc req-145fbeb6-107a-4444-a812-7d5c5c686a5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:36 np0005465988 nova_compute[236126]: 2025-10-02 12:16:36.067 2 DEBUG oslo_concurrency.lockutils [req-d455f2eb-8fb6-456f-a725-588c80d757cc req-145fbeb6-107a-4444-a812-7d5c5c686a5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:36 np0005465988 nova_compute[236126]: 2025-10-02 12:16:36.067 2 DEBUG nova.network.neutron [req-d455f2eb-8fb6-456f-a725-588c80d757cc req-145fbeb6-107a-4444-a812-7d5c5c686a5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Refreshing network info cache for port ac8da617-201c-4081-8414-4b18e26dfb4a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:16:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:36.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:36.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:16:37 np0005465988 nova_compute[236126]: 2025-10-02 12:16:37.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:37 np0005465988 nova_compute[236126]: 2025-10-02 12:16:37.827 2 DEBUG nova.network.neutron [req-d455f2eb-8fb6-456f-a725-588c80d757cc req-145fbeb6-107a-4444-a812-7d5c5c686a5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updated VIF entry in instance network info cache for port ac8da617-201c-4081-8414-4b18e26dfb4a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:16:37 np0005465988 nova_compute[236126]: 2025-10-02 12:16:37.827 2 DEBUG nova.network.neutron [req-d455f2eb-8fb6-456f-a725-588c80d757cc req-145fbeb6-107a-4444-a812-7d5c5c686a5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updating instance_info_cache with network_info: [{"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:37 np0005465988 nova_compute[236126]: 2025-10-02 12:16:37.863 2 DEBUG oslo_concurrency.lockutils [req-d455f2eb-8fb6-456f-a725-588c80d757cc req-145fbeb6-107a-4444-a812-7d5c5c686a5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:38.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:38.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:39 np0005465988 nova_compute[236126]: 2025-10-02 12:16:39.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:40 np0005465988 nova_compute[236126]: 2025-10-02 12:16:40.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:40.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:40.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:41 np0005465988 nova_compute[236126]: 2025-10-02 12:16:41.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:41 np0005465988 nova_compute[236126]: 2025-10-02 12:16:41.500 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:41 np0005465988 nova_compute[236126]: 2025-10-02 12:16:41.501 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:41 np0005465988 nova_compute[236126]: 2025-10-02 12:16:41.501 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:41 np0005465988 nova_compute[236126]: 2025-10-02 12:16:41.502 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:16:41 np0005465988 nova_compute[236126]: 2025-10-02 12:16:41.503 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:16:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2394753011' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.020 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e224 e224: 3 total, 3 up, 3 in
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.091 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000046 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.092 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000046 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:16:42 np0005465988 podman[266236]: 2025-10-02 12:16:42.129671385 +0000 UTC m=+0.063876510 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.266 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.269 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4468MB free_disk=20.91382598876953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.269 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.270 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e224 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.376 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance d0fb1236-bd41-4efe-8e6a-bb900eb86960 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.377 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.377 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.428 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:42.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:16:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1734109519' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.837 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.842 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.867 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:42.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.912 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:16:42 np0005465988 nova_compute[236126]: 2025-10-02 12:16:42.913 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:16:43 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2421507239' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:16:43 np0005465988 nova_compute[236126]: 2025-10-02 12:16:43.915 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:43 np0005465988 nova_compute[236126]: 2025-10-02 12:16:43.916 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:43 np0005465988 nova_compute[236126]: 2025-10-02 12:16:43.916 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:44 np0005465988 nova_compute[236126]: 2025-10-02 12:16:44.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:44.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:16:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:44.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:16:44 np0005465988 nova_compute[236126]: 2025-10-02 12:16:44.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e225 e225: 3 total, 3 up, 3 in
Oct  2 08:16:45 np0005465988 nova_compute[236126]: 2025-10-02 12:16:45.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:45 np0005465988 nova_compute[236126]: 2025-10-02 12:16:45.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:45 np0005465988 nova_compute[236126]: 2025-10-02 12:16:45.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:16:46 np0005465988 nova_compute[236126]: 2025-10-02 12:16:46.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e226 e226: 3 total, 3 up, 3 in
Oct  2 08:16:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:46Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:15:f8:40 10.100.0.3
Oct  2 08:16:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:46Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:f8:40 10.100.0.3
Oct  2 08:16:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:46.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:46.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.008 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.009 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.037 2 DEBUG nova.compute.manager [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.150 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.151 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.158 2 DEBUG nova.virt.hardware [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.158 2 INFO nova.compute.claims [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:16:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e226 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.345 2 DEBUG oslo_concurrency.processutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:16:47 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4044166331' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.807 2 DEBUG oslo_concurrency.processutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.815 2 DEBUG nova.compute.provider_tree [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.836 2 DEBUG nova.scheduler.client.report [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.879 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.728s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.880 2 DEBUG nova.compute.manager [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.943 2 DEBUG nova.compute.manager [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.944 2 DEBUG nova.network.neutron [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.971 2 INFO nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:16:47 np0005465988 nova_compute[236126]: 2025-10-02 12:16:47.998 2 DEBUG nova.compute.manager [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.084 2 DEBUG nova.compute.manager [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.085 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.085 2 INFO nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Creating image(s)#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.116 2 DEBUG nova.storage.rbd_utils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] rbd image bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.149 2 DEBUG nova.storage.rbd_utils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] rbd image bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.181 2 DEBUG nova.storage.rbd_utils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] rbd image bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.185 2 DEBUG oslo_concurrency.processutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.224 2 DEBUG nova.policy [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '69d8e29c6d3747e98a5985a584f4c814', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8efba404696b40fbbaa6431b934b87f1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.275 2 DEBUG oslo_concurrency.processutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.276 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.276 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.277 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.311 2 DEBUG nova.storage.rbd_utils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] rbd image bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.315 2 DEBUG oslo_concurrency.processutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:16:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:48.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.831 2 DEBUG oslo_concurrency.processutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:48.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:48 np0005465988 nova_compute[236126]: 2025-10-02 12:16:48.918 2 DEBUG nova.storage.rbd_utils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] resizing rbd image bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:16:49 np0005465988 nova_compute[236126]: 2025-10-02 12:16:49.093 2 DEBUG nova.objects.instance [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lazy-loading 'migration_context' on Instance uuid bc4239f5-3cf2-4325-803c-73121f7e0ee0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:49 np0005465988 nova_compute[236126]: 2025-10-02 12:16:49.121 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:16:49 np0005465988 nova_compute[236126]: 2025-10-02 12:16:49.122 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Ensure instance console log exists: /var/lib/nova/instances/bc4239f5-3cf2-4325-803c-73121f7e0ee0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:16:49 np0005465988 nova_compute[236126]: 2025-10-02 12:16:49.122 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:49 np0005465988 nova_compute[236126]: 2025-10-02 12:16:49.123 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:49 np0005465988 nova_compute[236126]: 2025-10-02 12:16:49.123 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:49 np0005465988 nova_compute[236126]: 2025-10-02 12:16:49.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:50 np0005465988 nova_compute[236126]: 2025-10-02 12:16:50.471 2 DEBUG nova.network.neutron [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Successfully created port: 576bdab0-26cd-4663-8dd5-149075e0d45d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:16:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:16:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:50.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:16:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:50.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e227 e227: 3 total, 3 up, 3 in
Oct  2 08:16:51 np0005465988 nova_compute[236126]: 2025-10-02 12:16:51.770 2 DEBUG nova.compute.manager [req-eb6af3f1-6023-475d-aa08-4cb304252989 req-acef8f51-e052-4207-bbf2-a4ae1b100fb5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received event network-changed-ac8da617-201c-4081-8414-4b18e26dfb4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:51 np0005465988 nova_compute[236126]: 2025-10-02 12:16:51.771 2 DEBUG nova.compute.manager [req-eb6af3f1-6023-475d-aa08-4cb304252989 req-acef8f51-e052-4207-bbf2-a4ae1b100fb5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Refreshing instance network info cache due to event network-changed-ac8da617-201c-4081-8414-4b18e26dfb4a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:16:51 np0005465988 nova_compute[236126]: 2025-10-02 12:16:51.771 2 DEBUG oslo_concurrency.lockutils [req-eb6af3f1-6023-475d-aa08-4cb304252989 req-acef8f51-e052-4207-bbf2-a4ae1b100fb5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:51 np0005465988 nova_compute[236126]: 2025-10-02 12:16:51.772 2 DEBUG oslo_concurrency.lockutils [req-eb6af3f1-6023-475d-aa08-4cb304252989 req-acef8f51-e052-4207-bbf2-a4ae1b100fb5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:51 np0005465988 nova_compute[236126]: 2025-10-02 12:16:51.772 2 DEBUG nova.network.neutron [req-eb6af3f1-6023-475d-aa08-4cb304252989 req-acef8f51-e052-4207-bbf2-a4ae1b100fb5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Refreshing network info cache for port ac8da617-201c-4081-8414-4b18e26dfb4a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:16:51 np0005465988 nova_compute[236126]: 2025-10-02 12:16:51.780 2 DEBUG nova.network.neutron [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Successfully updated port: 576bdab0-26cd-4663-8dd5-149075e0d45d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:16:51 np0005465988 nova_compute[236126]: 2025-10-02 12:16:51.802 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:51 np0005465988 nova_compute[236126]: 2025-10-02 12:16:51.803 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquired lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:51 np0005465988 nova_compute[236126]: 2025-10-02 12:16:51.803 2 DEBUG nova.network.neutron [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:16:52 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #48. Immutable memtables: 5.
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.009 2 DEBUG nova.network.neutron [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:16:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.492 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e228 e228: 3 total, 3 up, 3 in
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.691 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:16:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:52.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:16:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:16:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:52.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.936 2 DEBUG nova.network.neutron [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Updating instance_info_cache with network_info: [{"id": "576bdab0-26cd-4663-8dd5-149075e0d45d", "address": "fa:16:3e:5d:e9:df", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap576bdab0-26", "ovs_interfaceid": "576bdab0-26cd-4663-8dd5-149075e0d45d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.954 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Releasing lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.955 2 DEBUG nova.compute.manager [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Instance network_info: |[{"id": "576bdab0-26cd-4663-8dd5-149075e0d45d", "address": "fa:16:3e:5d:e9:df", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap576bdab0-26", "ovs_interfaceid": "576bdab0-26cd-4663-8dd5-149075e0d45d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.957 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Start _get_guest_xml network_info=[{"id": "576bdab0-26cd-4663-8dd5-149075e0d45d", "address": "fa:16:3e:5d:e9:df", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap576bdab0-26", "ovs_interfaceid": "576bdab0-26cd-4663-8dd5-149075e0d45d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.962 2 WARNING nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.968 2 DEBUG nova.virt.libvirt.host [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.968 2 DEBUG nova.virt.libvirt.host [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.973 2 DEBUG nova.virt.libvirt.host [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.974 2 DEBUG nova.virt.libvirt.host [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.975 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.975 2 DEBUG nova.virt.hardware [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.976 2 DEBUG nova.virt.hardware [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.976 2 DEBUG nova.virt.hardware [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.976 2 DEBUG nova.virt.hardware [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.976 2 DEBUG nova.virt.hardware [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.976 2 DEBUG nova.virt.hardware [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.977 2 DEBUG nova.virt.hardware [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.977 2 DEBUG nova.virt.hardware [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.977 2 DEBUG nova.virt.hardware [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.977 2 DEBUG nova.virt.hardware [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.977 2 DEBUG nova.virt.hardware [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:16:52 np0005465988 nova_compute[236126]: 2025-10-02 12:16:52.982 2 DEBUG oslo_concurrency.processutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:16:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1216961707' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.415 2 DEBUG oslo_concurrency.processutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.457 2 DEBUG nova.storage.rbd_utils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] rbd image bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.464 2 DEBUG oslo_concurrency.processutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e229 e229: 3 total, 3 up, 3 in
Oct  2 08:16:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e230 e230: 3 total, 3 up, 3 in
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.772 2 DEBUG nova.network.neutron [req-eb6af3f1-6023-475d-aa08-4cb304252989 req-acef8f51-e052-4207-bbf2-a4ae1b100fb5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updated VIF entry in instance network info cache for port ac8da617-201c-4081-8414-4b18e26dfb4a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.773 2 DEBUG nova.network.neutron [req-eb6af3f1-6023-475d-aa08-4cb304252989 req-acef8f51-e052-4207-bbf2-a4ae1b100fb5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updating instance_info_cache with network_info: [{"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.809 2 DEBUG oslo_concurrency.lockutils [req-eb6af3f1-6023-475d-aa08-4cb304252989 req-acef8f51-e052-4207-bbf2-a4ae1b100fb5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.809 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.810 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.810 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d0fb1236-bd41-4efe-8e6a-bb900eb86960 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.882 2 DEBUG nova.compute.manager [req-de9e3a2c-1392-4278-a41d-70ffe0eec28f req-0ea3bfb8-ead5-4910-8c18-672c89da15f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Received event network-changed-576bdab0-26cd-4663-8dd5-149075e0d45d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.882 2 DEBUG nova.compute.manager [req-de9e3a2c-1392-4278-a41d-70ffe0eec28f req-0ea3bfb8-ead5-4910-8c18-672c89da15f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Refreshing instance network info cache due to event network-changed-576bdab0-26cd-4663-8dd5-149075e0d45d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.882 2 DEBUG oslo_concurrency.lockutils [req-de9e3a2c-1392-4278-a41d-70ffe0eec28f req-0ea3bfb8-ead5-4910-8c18-672c89da15f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.883 2 DEBUG oslo_concurrency.lockutils [req-de9e3a2c-1392-4278-a41d-70ffe0eec28f req-0ea3bfb8-ead5-4910-8c18-672c89da15f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.883 2 DEBUG nova.network.neutron [req-de9e3a2c-1392-4278-a41d-70ffe0eec28f req-0ea3bfb8-ead5-4910-8c18-672c89da15f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Refreshing network info cache for port 576bdab0-26cd-4663-8dd5-149075e0d45d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:16:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:16:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/325269327' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.929 2 DEBUG oslo_concurrency.processutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.931 2 DEBUG nova.virt.libvirt.vif [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-2050580051',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-2050580051',id=75,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8efba404696b40fbbaa6431b934b87f1',ramdisk_id='',reservation_id='r-zp3eerp0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-153154373',owner_user_name='tempest-ServerBootFromVolumeStabl
eRescueTest-153154373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:48Z,user_data=None,user_id='69d8e29c6d3747e98a5985a584f4c814',uuid=bc4239f5-3cf2-4325-803c-73121f7e0ee0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "576bdab0-26cd-4663-8dd5-149075e0d45d", "address": "fa:16:3e:5d:e9:df", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap576bdab0-26", "ovs_interfaceid": "576bdab0-26cd-4663-8dd5-149075e0d45d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.931 2 DEBUG nova.network.os_vif_util [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Converting VIF {"id": "576bdab0-26cd-4663-8dd5-149075e0d45d", "address": "fa:16:3e:5d:e9:df", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap576bdab0-26", "ovs_interfaceid": "576bdab0-26cd-4663-8dd5-149075e0d45d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.932 2 DEBUG nova.network.os_vif_util [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:e9:df,bridge_name='br-int',has_traffic_filtering=True,id=576bdab0-26cd-4663-8dd5-149075e0d45d,network=Network(f1725bd8-7d9d-45cc-b992-0cd3db0e30f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap576bdab0-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.934 2 DEBUG nova.objects.instance [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lazy-loading 'pci_devices' on Instance uuid bc4239f5-3cf2-4325-803c-73121f7e0ee0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.952 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  <uuid>bc4239f5-3cf2-4325-803c-73121f7e0ee0</uuid>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  <name>instance-0000004b</name>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerBootFromVolumeStableRescueTest-server-2050580051</nova:name>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:16:52</nova:creationTime>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <nova:user uuid="69d8e29c6d3747e98a5985a584f4c814">tempest-ServerBootFromVolumeStableRescueTest-153154373-project-member</nova:user>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <nova:project uuid="8efba404696b40fbbaa6431b934b87f1">tempest-ServerBootFromVolumeStableRescueTest-153154373</nova:project>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <nova:port uuid="576bdab0-26cd-4663-8dd5-149075e0d45d">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <entry name="serial">bc4239f5-3cf2-4325-803c-73121f7e0ee0</entry>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <entry name="uuid">bc4239f5-3cf2-4325-803c-73121f7e0ee0</entry>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk.config">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:5d:e9:df"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <target dev="tap576bdab0-26"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/bc4239f5-3cf2-4325-803c-73121f7e0ee0/console.log" append="off"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:16:53 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:16:53 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:16:53 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:16:53 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.953 2 DEBUG nova.compute.manager [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Preparing to wait for external event network-vif-plugged-576bdab0-26cd-4663-8dd5-149075e0d45d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.954 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.954 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.955 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.957 2 DEBUG nova.virt.libvirt.vif [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-2050580051',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-2050580051',id=75,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8efba404696b40fbbaa6431b934b87f1',ramdisk_id='',reservation_id='r-zp3eerp0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-153154373',owner_user_name='tempest-ServerBootFromV
olumeStableRescueTest-153154373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:48Z,user_data=None,user_id='69d8e29c6d3747e98a5985a584f4c814',uuid=bc4239f5-3cf2-4325-803c-73121f7e0ee0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "576bdab0-26cd-4663-8dd5-149075e0d45d", "address": "fa:16:3e:5d:e9:df", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap576bdab0-26", "ovs_interfaceid": "576bdab0-26cd-4663-8dd5-149075e0d45d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.957 2 DEBUG nova.network.os_vif_util [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Converting VIF {"id": "576bdab0-26cd-4663-8dd5-149075e0d45d", "address": "fa:16:3e:5d:e9:df", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap576bdab0-26", "ovs_interfaceid": "576bdab0-26cd-4663-8dd5-149075e0d45d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.958 2 DEBUG nova.network.os_vif_util [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:e9:df,bridge_name='br-int',has_traffic_filtering=True,id=576bdab0-26cd-4663-8dd5-149075e0d45d,network=Network(f1725bd8-7d9d-45cc-b992-0cd3db0e30f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap576bdab0-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.959 2 DEBUG os_vif [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:e9:df,bridge_name='br-int',has_traffic_filtering=True,id=576bdab0-26cd-4663-8dd5-149075e0d45d,network=Network(f1725bd8-7d9d-45cc-b992-0cd3db0e30f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap576bdab0-26') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.961 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.961 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.966 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap576bdab0-26, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.967 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap576bdab0-26, col_values=(('external_ids', {'iface-id': '576bdab0-26cd-4663-8dd5-149075e0d45d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5d:e9:df', 'vm-uuid': 'bc4239f5-3cf2-4325-803c-73121f7e0ee0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:53 np0005465988 NetworkManager[45041]: <info>  [1759407413.9707] manager: (tap576bdab0-26): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:53 np0005465988 nova_compute[236126]: 2025-10-02 12:16:53.981 2 INFO os_vif [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:e9:df,bridge_name='br-int',has_traffic_filtering=True,id=576bdab0-26cd-4663-8dd5-149075e0d45d,network=Network(f1725bd8-7d9d-45cc-b992-0cd3db0e30f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap576bdab0-26')#033[00m
Oct  2 08:16:54 np0005465988 nova_compute[236126]: 2025-10-02 12:16:54.036 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:54 np0005465988 nova_compute[236126]: 2025-10-02 12:16:54.037 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:54 np0005465988 nova_compute[236126]: 2025-10-02 12:16:54.037 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] No VIF found with MAC fa:16:3e:5d:e9:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:16:54 np0005465988 nova_compute[236126]: 2025-10-02 12:16:54.038 2 INFO nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Using config drive#033[00m
Oct  2 08:16:54 np0005465988 nova_compute[236126]: 2025-10-02 12:16:54.078 2 DEBUG nova.storage.rbd_utils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] rbd image bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:16:54 np0005465988 nova_compute[236126]: 2025-10-02 12:16:54.795 2 INFO nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Creating config drive at /var/lib/nova/instances/bc4239f5-3cf2-4325-803c-73121f7e0ee0/disk.config#033[00m
Oct  2 08:16:54 np0005465988 nova_compute[236126]: 2025-10-02 12:16:54.801 2 DEBUG oslo_concurrency.processutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bc4239f5-3cf2-4325-803c-73121f7e0ee0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpki4r1qc8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:16:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:54.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:16:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:16:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:54.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:16:54 np0005465988 nova_compute[236126]: 2025-10-02 12:16:54.933 2 DEBUG oslo_concurrency.processutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bc4239f5-3cf2-4325-803c-73121f7e0ee0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpki4r1qc8" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:54 np0005465988 nova_compute[236126]: 2025-10-02 12:16:54.967 2 DEBUG nova.storage.rbd_utils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] rbd image bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:16:54 np0005465988 nova_compute[236126]: 2025-10-02 12:16:54.973 2 DEBUG oslo_concurrency.processutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bc4239f5-3cf2-4325-803c-73121f7e0ee0/disk.config bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:16:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4088404689' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:16:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:16:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4088404689' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:16:55 np0005465988 nova_compute[236126]: 2025-10-02 12:16:55.180 2 DEBUG oslo_concurrency.processutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bc4239f5-3cf2-4325-803c-73121f7e0ee0/disk.config bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:55 np0005465988 nova_compute[236126]: 2025-10-02 12:16:55.181 2 INFO nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Deleting local config drive /var/lib/nova/instances/bc4239f5-3cf2-4325-803c-73121f7e0ee0/disk.config because it was imported into RBD.#033[00m
Oct  2 08:16:55 np0005465988 NetworkManager[45041]: <info>  [1759407415.2481] manager: (tap576bdab0-26): new Tun device (/org/freedesktop/NetworkManager/Devices/123)
Oct  2 08:16:55 np0005465988 kernel: tap576bdab0-26: entered promiscuous mode
Oct  2 08:16:55 np0005465988 nova_compute[236126]: 2025-10-02 12:16:55.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:55 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:55Z|00221|binding|INFO|Claiming lport 576bdab0-26cd-4663-8dd5-149075e0d45d for this chassis.
Oct  2 08:16:55 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:55Z|00222|binding|INFO|576bdab0-26cd-4663-8dd5-149075e0d45d: Claiming fa:16:3e:5d:e9:df 10.100.0.7
Oct  2 08:16:55 np0005465988 systemd-udevd[266608]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:55 np0005465988 nova_compute[236126]: 2025-10-02 12:16:55.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.317 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:e9:df 10.100.0.7'], port_security=['fa:16:3e:5d:e9:df 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'bc4239f5-3cf2-4325-803c-73121f7e0ee0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8efba404696b40fbbaa6431b934b87f1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3d4d8d91-6fd2-4ab6-a30c-6640fa44e7f5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a5d3722e-d182-43fd-9a86-fa7ed68becec, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=576bdab0-26cd-4663-8dd5-149075e0d45d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.319 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 576bdab0-26cd-4663-8dd5-149075e0d45d in datapath f1725bd8-7d9d-45cc-b992-0cd3db0e30f0 bound to our chassis#033[00m
Oct  2 08:16:55 np0005465988 systemd-machined[192594]: New machine qemu-28-instance-0000004b.
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.322 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f1725bd8-7d9d-45cc-b992-0cd3db0e30f0#033[00m
Oct  2 08:16:55 np0005465988 NetworkManager[45041]: <info>  [1759407415.3258] device (tap576bdab0-26): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:16:55 np0005465988 NetworkManager[45041]: <info>  [1759407415.3272] device (tap576bdab0-26): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:16:55 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:55Z|00223|binding|INFO|Setting lport 576bdab0-26cd-4663-8dd5-149075e0d45d ovn-installed in OVS
Oct  2 08:16:55 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:55Z|00224|binding|INFO|Setting lport 576bdab0-26cd-4663-8dd5-149075e0d45d up in Southbound
Oct  2 08:16:55 np0005465988 nova_compute[236126]: 2025-10-02 12:16:55.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:55 np0005465988 systemd[1]: Started Virtual Machine qemu-28-instance-0000004b.
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.337 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fc218a0f-ba6e-40ab-9934-8146582dc7c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.338 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf1725bd8-71 in ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.340 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf1725bd8-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.340 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4180f32d-9ca3-4799-bb36-b0c24b4938d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.342 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dfb7a01e-a7a8-43a1-ac6c-70da9df1ce2c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.354 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd93d1d-a419-43e7-8933-7d8d90258cd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.378 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ee9a64d9-93e9-44d5-9468-875424cfc4c5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.403 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4bb1efdf-793d-4b45-8101-78e7ff0208f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 systemd-udevd[266612]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.408 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e86ed3bf-8206-4919-8626-1e1bdd1e00a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 NetworkManager[45041]: <info>  [1759407415.4097] manager: (tapf1725bd8-70): new Veth device (/org/freedesktop/NetworkManager/Devices/124)
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.443 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2f8f6969-202a-42f5-8f0f-970d94cf723e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.446 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4c358d2b-6f5c-4191-989b-456d74d32541]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 NetworkManager[45041]: <info>  [1759407415.4707] device (tapf1725bd8-70): carrier: link connected
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.476 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[02a046a7-69d9-4da4-9f14-0250c0802ad0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.496 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[741d370a-7ac8-4471-9080-aeba4e0b6ac2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf1725bd8-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:76:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547580, 'reachable_time': 31646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266642, 'error': None, 'target': 'ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.517 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[aad9fa56-f12d-455c-a6c4-af5970379c23]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef7:76f0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547580, 'tstamp': 547580}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 266643, 'error': None, 'target': 'ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.535 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f5e049-fc3e-441f-a680-32b178d53a78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf1725bd8-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:76:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547580, 'reachable_time': 31646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 266651, 'error': None, 'target': 'ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.570 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[893bba70-de7b-44cb-9c02-4f23c8c3e11f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.638 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4418683d-eec5-450e-8948-d66af5cebd86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.640 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1725bd8-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.640 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.640 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf1725bd8-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:55 np0005465988 NetworkManager[45041]: <info>  [1759407415.6436] manager: (tapf1725bd8-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/125)
Oct  2 08:16:55 np0005465988 nova_compute[236126]: 2025-10-02 12:16:55.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:55 np0005465988 kernel: tapf1725bd8-70: entered promiscuous mode
Oct  2 08:16:55 np0005465988 nova_compute[236126]: 2025-10-02 12:16:55.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.649 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf1725bd8-70, col_values=(('external_ids', {'iface-id': '421cd6e3-75aa-44e1-b552-d119c4fcd629'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:55 np0005465988 ovn_controller[132601]: 2025-10-02T12:16:55Z|00225|binding|INFO|Releasing lport 421cd6e3-75aa-44e1-b552-d119c4fcd629 from this chassis (sb_readonly=0)
Oct  2 08:16:55 np0005465988 nova_compute[236126]: 2025-10-02 12:16:55.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:55 np0005465988 nova_compute[236126]: 2025-10-02 12:16:55.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.674 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f1725bd8-7d9d-45cc-b992-0cd3db0e30f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f1725bd8-7d9d-45cc-b992-0cd3db0e30f0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.675 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0b1087af-7b16-48e6-8125-d0083e69d48e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.676 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/f1725bd8-7d9d-45cc-b992-0cd3db0e30f0.pid.haproxy
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID f1725bd8-7d9d-45cc-b992-0cd3db0e30f0
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:16:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:16:55.677 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'env', 'PROCESS_TAG=haproxy-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f1725bd8-7d9d-45cc-b992-0cd3db0e30f0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:16:56 np0005465988 podman[266718]: 2025-10-02 12:16:56.081598995 +0000 UTC m=+0.060125631 container create 4c181ae41368c628e7af680436025eb67e10558e2c153a44e71c3442fbf5d071 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:16:56 np0005465988 nova_compute[236126]: 2025-10-02 12:16:56.124 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407416.1241524, bc4239f5-3cf2-4325-803c-73121f7e0ee0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:56 np0005465988 nova_compute[236126]: 2025-10-02 12:16:56.125 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] VM Started (Lifecycle Event)#033[00m
Oct  2 08:16:56 np0005465988 systemd[1]: Started libpod-conmon-4c181ae41368c628e7af680436025eb67e10558e2c153a44e71c3442fbf5d071.scope.
Oct  2 08:16:56 np0005465988 nova_compute[236126]: 2025-10-02 12:16:56.143 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:56 np0005465988 podman[266718]: 2025-10-02 12:16:56.051091487 +0000 UTC m=+0.029618113 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:16:56 np0005465988 nova_compute[236126]: 2025-10-02 12:16:56.148 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407416.1249816, bc4239f5-3cf2-4325-803c-73121f7e0ee0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:56 np0005465988 nova_compute[236126]: 2025-10-02 12:16:56.149 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:16:56 np0005465988 nova_compute[236126]: 2025-10-02 12:16:56.164 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:56 np0005465988 nova_compute[236126]: 2025-10-02 12:16:56.167 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:56 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:16:56 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5ca592ba44181e5982bad4e05ac1164bed6c5106ff51f20567b3d1bb1c87d6cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:16:56 np0005465988 nova_compute[236126]: 2025-10-02 12:16:56.194 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:56 np0005465988 podman[266718]: 2025-10-02 12:16:56.209052184 +0000 UTC m=+0.187578830 container init 4c181ae41368c628e7af680436025eb67e10558e2c153a44e71c3442fbf5d071 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:16:56 np0005465988 podman[266718]: 2025-10-02 12:16:56.21611717 +0000 UTC m=+0.194643776 container start 4c181ae41368c628e7af680436025eb67e10558e2c153a44e71c3442fbf5d071 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:16:56 np0005465988 neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0[266733]: [NOTICE]   (266737) : New worker (266739) forked
Oct  2 08:16:56 np0005465988 neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0[266733]: [NOTICE]   (266737) : Loading success.
Oct  2 08:16:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:56.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:56.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.148 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updating instance_info_cache with network_info: [{"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.177 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.178 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.205 2 DEBUG oslo_concurrency.lockutils [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquiring lock "interface-d0fb1236-bd41-4efe-8e6a-bb900eb86960-2a8b7c9c-6681-4c28-8c65-774622c5e92c" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.206 2 DEBUG oslo_concurrency.lockutils [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "interface-d0fb1236-bd41-4efe-8e6a-bb900eb86960-2a8b7c9c-6681-4c28-8c65-774622c5e92c" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.207 2 DEBUG nova.objects.instance [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lazy-loading 'flavor' on Instance uuid d0fb1236-bd41-4efe-8e6a-bb900eb86960 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.573 2 DEBUG nova.network.neutron [req-de9e3a2c-1392-4278-a41d-70ffe0eec28f req-0ea3bfb8-ead5-4910-8c18-672c89da15f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Updated VIF entry in instance network info cache for port 576bdab0-26cd-4663-8dd5-149075e0d45d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.575 2 DEBUG nova.network.neutron [req-de9e3a2c-1392-4278-a41d-70ffe0eec28f req-0ea3bfb8-ead5-4910-8c18-672c89da15f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Updating instance_info_cache with network_info: [{"id": "576bdab0-26cd-4663-8dd5-149075e0d45d", "address": "fa:16:3e:5d:e9:df", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap576bdab0-26", "ovs_interfaceid": "576bdab0-26cd-4663-8dd5-149075e0d45d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.589 2 DEBUG oslo_concurrency.lockutils [req-de9e3a2c-1392-4278-a41d-70ffe0eec28f req-0ea3bfb8-ead5-4910-8c18-672c89da15f2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.736 2 DEBUG nova.compute.manager [req-949caa0b-5377-4357-90e7-628ff4457bcf req-4e886f67-de8e-45b4-aa4e-56b20b4d3448 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received event network-changed-ac8da617-201c-4081-8414-4b18e26dfb4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.736 2 DEBUG nova.compute.manager [req-949caa0b-5377-4357-90e7-628ff4457bcf req-4e886f67-de8e-45b4-aa4e-56b20b4d3448 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Refreshing instance network info cache due to event network-changed-ac8da617-201c-4081-8414-4b18e26dfb4a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.737 2 DEBUG oslo_concurrency.lockutils [req-949caa0b-5377-4357-90e7-628ff4457bcf req-4e886f67-de8e-45b4-aa4e-56b20b4d3448 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.737 2 DEBUG oslo_concurrency.lockutils [req-949caa0b-5377-4357-90e7-628ff4457bcf req-4e886f67-de8e-45b4-aa4e-56b20b4d3448 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.737 2 DEBUG nova.network.neutron [req-949caa0b-5377-4357-90e7-628ff4457bcf req-4e886f67-de8e-45b4-aa4e-56b20b4d3448 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Refreshing network info cache for port ac8da617-201c-4081-8414-4b18e26dfb4a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.910 2 DEBUG nova.objects.instance [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lazy-loading 'pci_requests' on Instance uuid d0fb1236-bd41-4efe-8e6a-bb900eb86960 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:57 np0005465988 nova_compute[236126]: 2025-10-02 12:16:57.924 2 DEBUG nova.network.neutron [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:16:58 np0005465988 podman[266824]: 2025-10-02 12:16:58.165100127 +0000 UTC m=+0.070930625 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:16:58 np0005465988 podman[266825]: 2025-10-02 12:16:58.192403892 +0000 UTC m=+0.077244939 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:16:58 np0005465988 podman[266823]: 2025-10-02 12:16:58.203969659 +0000 UTC m=+0.102851505 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 08:16:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e231 e231: 3 total, 3 up, 3 in
Oct  2 08:16:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:16:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:16:58.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:16:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:16:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:16:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:16:58.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:16:58 np0005465988 nova_compute[236126]: 2025-10-02 12:16:58.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.045 2 DEBUG nova.policy [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8e627e098f2b465d97099fee8f489b71', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bddf59854a7d4eb1a85ece547923cea0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.823 2 DEBUG nova.compute.manager [req-65ce47a2-9d9c-46d6-bb34-390cb7331a53 req-6b06f671-82c8-4cc1-aeb0-39c0085af721 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Received event network-vif-plugged-576bdab0-26cd-4663-8dd5-149075e0d45d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.823 2 DEBUG oslo_concurrency.lockutils [req-65ce47a2-9d9c-46d6-bb34-390cb7331a53 req-6b06f671-82c8-4cc1-aeb0-39c0085af721 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.824 2 DEBUG oslo_concurrency.lockutils [req-65ce47a2-9d9c-46d6-bb34-390cb7331a53 req-6b06f671-82c8-4cc1-aeb0-39c0085af721 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.824 2 DEBUG oslo_concurrency.lockutils [req-65ce47a2-9d9c-46d6-bb34-390cb7331a53 req-6b06f671-82c8-4cc1-aeb0-39c0085af721 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.825 2 DEBUG nova.compute.manager [req-65ce47a2-9d9c-46d6-bb34-390cb7331a53 req-6b06f671-82c8-4cc1-aeb0-39c0085af721 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Processing event network-vif-plugged-576bdab0-26cd-4663-8dd5-149075e0d45d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.826 2 DEBUG nova.compute.manager [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.831 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407419.8309255, bc4239f5-3cf2-4325-803c-73121f7e0ee0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.831 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.835 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:16:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:16:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:16:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.841 2 INFO nova.virt.libvirt.driver [-] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Instance spawned successfully.#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.841 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.850 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.855 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.868 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.869 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.871 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.871 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.873 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.874 2 DEBUG nova.virt.libvirt.driver [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.883 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e232 e232: 3 total, 3 up, 3 in
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.967 2 INFO nova.compute.manager [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Took 11.88 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:16:59 np0005465988 nova_compute[236126]: 2025-10-02 12:16:59.967 2 DEBUG nova.compute.manager [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.047 2 INFO nova.compute.manager [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Took 12.92 seconds to build instance.#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.073 2 DEBUG oslo_concurrency.lockutils [None req-7500af30-5602-4006-8512-bdaf81eee90f 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.691 2 DEBUG nova.network.neutron [req-949caa0b-5377-4357-90e7-628ff4457bcf req-4e886f67-de8e-45b4-aa4e-56b20b4d3448 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updated VIF entry in instance network info cache for port ac8da617-201c-4081-8414-4b18e26dfb4a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.692 2 DEBUG nova.network.neutron [req-949caa0b-5377-4357-90e7-628ff4457bcf req-4e886f67-de8e-45b4-aa4e-56b20b4d3448 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updating instance_info_cache with network_info: [{"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.710 2 DEBUG oslo_concurrency.lockutils [req-949caa0b-5377-4357-90e7-628ff4457bcf req-4e886f67-de8e-45b4-aa4e-56b20b4d3448 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.710 2 DEBUG nova.compute.manager [req-949caa0b-5377-4357-90e7-628ff4457bcf req-4e886f67-de8e-45b4-aa4e-56b20b4d3448 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Received event network-vif-plugged-576bdab0-26cd-4663-8dd5-149075e0d45d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.710 2 DEBUG oslo_concurrency.lockutils [req-949caa0b-5377-4357-90e7-628ff4457bcf req-4e886f67-de8e-45b4-aa4e-56b20b4d3448 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.710 2 DEBUG oslo_concurrency.lockutils [req-949caa0b-5377-4357-90e7-628ff4457bcf req-4e886f67-de8e-45b4-aa4e-56b20b4d3448 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.711 2 DEBUG oslo_concurrency.lockutils [req-949caa0b-5377-4357-90e7-628ff4457bcf req-4e886f67-de8e-45b4-aa4e-56b20b4d3448 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.711 2 DEBUG nova.compute.manager [req-949caa0b-5377-4357-90e7-628ff4457bcf req-4e886f67-de8e-45b4-aa4e-56b20b4d3448 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] No waiting events found dispatching network-vif-plugged-576bdab0-26cd-4663-8dd5-149075e0d45d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.711 2 WARNING nova.compute.manager [req-949caa0b-5377-4357-90e7-628ff4457bcf req-4e886f67-de8e-45b4-aa4e-56b20b4d3448 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Received unexpected event network-vif-plugged-576bdab0-26cd-4663-8dd5-149075e0d45d for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.739 2 DEBUG nova.network.neutron [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Successfully updated port: 2a8b7c9c-6681-4c28-8c65-774622c5e92c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.756 2 DEBUG oslo_concurrency.lockutils [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquiring lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.757 2 DEBUG oslo_concurrency.lockutils [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquired lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.757 2 DEBUG nova.network.neutron [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:17:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:00.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.823 2 DEBUG nova.compute.manager [req-508213fd-517b-4a7e-827f-2f13376c34dd req-f96dc009-4959-4522-98eb-a509565e6c95 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received event network-changed-2a8b7c9c-6681-4c28-8c65-774622c5e92c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.824 2 DEBUG nova.compute.manager [req-508213fd-517b-4a7e-827f-2f13376c34dd req-f96dc009-4959-4522-98eb-a509565e6c95 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Refreshing instance network info cache due to event network-changed-2a8b7c9c-6681-4c28-8c65-774622c5e92c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.824 2 DEBUG oslo_concurrency.lockutils [req-508213fd-517b-4a7e-827f-2f13376c34dd req-f96dc009-4959-4522-98eb-a509565e6c95 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:00.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:00 np0005465988 nova_compute[236126]: 2025-10-02 12:17:00.960 2 WARNING nova.network.neutron [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] ee114210-598c-482f-83c7-26c3363a45c4 already exists in list: networks containing: ['ee114210-598c-482f-83c7-26c3363a45c4']. ignoring it#033[00m
Oct  2 08:17:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e233 e233: 3 total, 3 up, 3 in
Oct  2 08:17:01 np0005465988 nova_compute[236126]: 2025-10-02 12:17:01.316 2 DEBUG nova.compute.manager [None req-c9b4af91-8935-4f40-a2e9-de7bf6a36efb 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:01 np0005465988 nova_compute[236126]: 2025-10-02 12:17:01.364 2 INFO nova.compute.manager [None req-c9b4af91-8935-4f40-a2e9-de7bf6a36efb 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] instance snapshotting#033[00m
Oct  2 08:17:01 np0005465988 nova_compute[236126]: 2025-10-02 12:17:01.589 2 INFO nova.virt.libvirt.driver [None req-c9b4af91-8935-4f40-a2e9-de7bf6a36efb 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Beginning live snapshot process#033[00m
Oct  2 08:17:01 np0005465988 nova_compute[236126]: 2025-10-02 12:17:01.836 2 DEBUG nova.virt.libvirt.imagebackend [None req-c9b4af91-8935-4f40-a2e9-de7bf6a36efb 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] No parent info for c2d0c2bc-fe21-4689-86ae-d6728c15874c; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Oct  2 08:17:02 np0005465988 nova_compute[236126]: 2025-10-02 12:17:02.101 2 DEBUG nova.storage.rbd_utils [None req-c9b4af91-8935-4f40-a2e9-de7bf6a36efb 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] creating snapshot(a999d017e3924e959ba2f4590c04c0dd) on rbd image(bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:17:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e233 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:17:02 np0005465988 nova_compute[236126]: 2025-10-02 12:17:02.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:02.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:02.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e234 e234: 3 total, 3 up, 3 in
Oct  2 08:17:03 np0005465988 nova_compute[236126]: 2025-10-02 12:17:03.487 2 DEBUG nova.storage.rbd_utils [None req-c9b4af91-8935-4f40-a2e9-de7bf6a36efb 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] cloning vms/bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk@a999d017e3924e959ba2f4590c04c0dd to images/e7ad18e6-654f-48ae-a957-a50e1a2c7a2d clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:17:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e235 e235: 3 total, 3 up, 3 in
Oct  2 08:17:03 np0005465988 nova_compute[236126]: 2025-10-02 12:17:03.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:03 np0005465988 nova_compute[236126]: 2025-10-02 12:17:03.992 2 DEBUG nova.storage.rbd_utils [None req-c9b4af91-8935-4f40-a2e9-de7bf6a36efb 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] flattening images/e7ad18e6-654f-48ae-a957-a50e1a2c7a2d flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.239 2 DEBUG nova.network.neutron [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updating instance_info_cache with network_info: [{"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "address": "fa:16:3e:2e:88:f2", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8b7c9c-66", "ovs_interfaceid": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.260 2 DEBUG oslo_concurrency.lockutils [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Releasing lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.262 2 DEBUG oslo_concurrency.lockutils [req-508213fd-517b-4a7e-827f-2f13376c34dd req-f96dc009-4959-4522-98eb-a509565e6c95 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.263 2 DEBUG nova.network.neutron [req-508213fd-517b-4a7e-827f-2f13376c34dd req-f96dc009-4959-4522-98eb-a509565e6c95 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Refreshing network info cache for port 2a8b7c9c-6681-4c28-8c65-774622c5e92c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.268 2 DEBUG nova.virt.libvirt.vif [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1313126078',display_name='tempest-tempest.common.compute-instance-1313126078',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1313126078',id=70,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBATuG0Igs/aClVmsQdGPZDgNmvV7Bgyq0Y4nFm5M4crDAlAvWuUKcpXWJIzqspIHqd22HRQCaWbqLnJi4PrcYo9nS6OUtnl8L8k0zpd1d2KcZKsipjVMcOXUmRyImfEBhg==',key_name='tempest-keypair-716714059',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='bddf59854a7d4eb1a85ece547923cea0',ramdisk_id='',reservation_id='r-vu95vf5t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-751325164',owner_user_name='tempest-AttachInterfacesTestJSON-751325164-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:16:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8e627e098f2b465d97099fee8f489b71',uuid=d0fb1236-bd41-4efe-8e6a-bb900eb86960,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "address": "fa:16:3e:2e:88:f2", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8b7c9c-66", "ovs_interfaceid": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.269 2 DEBUG nova.network.os_vif_util [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Converting VIF {"id": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "address": "fa:16:3e:2e:88:f2", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8b7c9c-66", "ovs_interfaceid": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.270 2 DEBUG nova.network.os_vif_util [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:88:f2,bridge_name='br-int',has_traffic_filtering=True,id=2a8b7c9c-6681-4c28-8c65-774622c5e92c,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2a8b7c9c-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.270 2 DEBUG os_vif [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:88:f2,bridge_name='br-int',has_traffic_filtering=True,id=2a8b7c9c-6681-4c28-8c65-774622c5e92c,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2a8b7c9c-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.272 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.272 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.276 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a8b7c9c-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.277 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2a8b7c9c-66, col_values=(('external_ids', {'iface-id': '2a8b7c9c-6681-4c28-8c65-774622c5e92c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:88:f2', 'vm-uuid': 'd0fb1236-bd41-4efe-8e6a-bb900eb86960'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:04 np0005465988 NetworkManager[45041]: <info>  [1759407424.3055] manager: (tap2a8b7c9c-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.315 2 INFO os_vif [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:88:f2,bridge_name='br-int',has_traffic_filtering=True,id=2a8b7c9c-6681-4c28-8c65-774622c5e92c,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2a8b7c9c-66')#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.316 2 DEBUG nova.virt.libvirt.vif [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1313126078',display_name='tempest-tempest.common.compute-instance-1313126078',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1313126078',id=70,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBATuG0Igs/aClVmsQdGPZDgNmvV7Bgyq0Y4nFm5M4crDAlAvWuUKcpXWJIzqspIHqd22HRQCaWbqLnJi4PrcYo9nS6OUtnl8L8k0zpd1d2KcZKsipjVMcOXUmRyImfEBhg==',key_name='tempest-keypair-716714059',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='bddf59854a7d4eb1a85ece547923cea0',ramdisk_id='',reservation_id='r-vu95vf5t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-751325164',owner_user_name='tempest-AttachInterfacesTestJSON-751325164-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:16:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8e627e098f2b465d97099fee8f489b71',uuid=d0fb1236-bd41-4efe-8e6a-bb900eb86960,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "address": "fa:16:3e:2e:88:f2", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8b7c9c-66", "ovs_interfaceid": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.317 2 DEBUG nova.network.os_vif_util [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Converting VIF {"id": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "address": "fa:16:3e:2e:88:f2", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8b7c9c-66", "ovs_interfaceid": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.318 2 DEBUG nova.network.os_vif_util [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:88:f2,bridge_name='br-int',has_traffic_filtering=True,id=2a8b7c9c-6681-4c28-8c65-774622c5e92c,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2a8b7c9c-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.322 2 DEBUG nova.virt.libvirt.guest [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] attach device xml: <interface type="ethernet">
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  <mac address="fa:16:3e:2e:88:f2"/>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  <model type="virtio"/>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  <mtu size="1442"/>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  <target dev="tap2a8b7c9c-66"/>
Oct  2 08:17:04 np0005465988 nova_compute[236126]: </interface>
Oct  2 08:17:04 np0005465988 nova_compute[236126]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:17:04 np0005465988 kernel: tap2a8b7c9c-66: entered promiscuous mode
Oct  2 08:17:04 np0005465988 NetworkManager[45041]: <info>  [1759407424.3481] manager: (tap2a8b7c9c-66): new Tun device (/org/freedesktop/NetworkManager/Devices/127)
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:17:04Z|00226|binding|INFO|Claiming lport 2a8b7c9c-6681-4c28-8c65-774622c5e92c for this chassis.
Oct  2 08:17:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:17:04Z|00227|binding|INFO|2a8b7c9c-6681-4c28-8c65-774622c5e92c: Claiming fa:16:3e:2e:88:f2 10.100.0.8
Oct  2 08:17:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:04.363 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:88:f2 10.100.0.8'], port_security=['fa:16:3e:2e:88:f2 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1498800616', 'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'd0fb1236-bd41-4efe-8e6a-bb900eb86960', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee114210-598c-482f-83c7-26c3363a45c4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1498800616', 'neutron:project_id': 'bddf59854a7d4eb1a85ece547923cea0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '20d10bb5-2bfc-4d22-a5af-27378413539d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6f6a61f-52bf-4639-80ed-e59f98940f90, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=2a8b7c9c-6681-4c28-8c65-774622c5e92c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:04.367 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 2a8b7c9c-6681-4c28-8c65-774622c5e92c in datapath ee114210-598c-482f-83c7-26c3363a45c4 bound to our chassis#033[00m
Oct  2 08:17:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:04.375 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee114210-598c-482f-83c7-26c3363a45c4#033[00m
Oct  2 08:17:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:17:04Z|00228|binding|INFO|Setting lport 2a8b7c9c-6681-4c28-8c65-774622c5e92c ovn-installed in OVS
Oct  2 08:17:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:17:04Z|00229|binding|INFO|Setting lport 2a8b7c9c-6681-4c28-8c65-774622c5e92c up in Southbound
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:04 np0005465988 systemd-udevd[267107]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:17:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:04.399 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[de2925ee-3616-47f9-a6b7-4fbb66c656d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:04 np0005465988 NetworkManager[45041]: <info>  [1759407424.4156] device (tap2a8b7c9c-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:17:04 np0005465988 NetworkManager[45041]: <info>  [1759407424.4165] device (tap2a8b7c9c-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:17:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:04.432 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8bb436e6-8d06-42b2-b202-765d5ab73d56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:04.441 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[0ace1f45-0b9d-4c4d-babc-a7c8d040b21e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.456 2 DEBUG nova.virt.libvirt.driver [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.456 2 DEBUG nova.virt.libvirt.driver [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.457 2 DEBUG nova.virt.libvirt.driver [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] No VIF found with MAC fa:16:3e:15:f8:40, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.457 2 DEBUG nova.virt.libvirt.driver [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] No VIF found with MAC fa:16:3e:2e:88:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:17:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:04.477 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[de0f308e-9f9d-44c1-8ee6-ef391219dff5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.490 2 DEBUG nova.virt.libvirt.guest [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  <nova:name>tempest-tempest.common.compute-instance-1313126078</nova:name>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  <nova:creationTime>2025-10-02 12:17:04</nova:creationTime>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  <nova:flavor name="m1.nano">
Oct  2 08:17:04 np0005465988 nova_compute[236126]:    <nova:memory>128</nova:memory>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:    <nova:disk>1</nova:disk>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:    <nova:swap>0</nova:swap>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  </nova:flavor>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  <nova:owner>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:    <nova:user uuid="8e627e098f2b465d97099fee8f489b71">tempest-AttachInterfacesTestJSON-751325164-project-member</nova:user>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:    <nova:project uuid="bddf59854a7d4eb1a85ece547923cea0">tempest-AttachInterfacesTestJSON-751325164</nova:project>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  </nova:owner>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  <nova:ports>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:    <nova:port uuid="ac8da617-201c-4081-8414-4b18e26dfb4a">
Oct  2 08:17:04 np0005465988 nova_compute[236126]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:    </nova:port>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:    <nova:port uuid="2a8b7c9c-6681-4c28-8c65-774622c5e92c">
Oct  2 08:17:04 np0005465988 nova_compute[236126]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:    </nova:port>
Oct  2 08:17:04 np0005465988 nova_compute[236126]:  </nova:ports>
Oct  2 08:17:04 np0005465988 nova_compute[236126]: </nova:instance>
Oct  2 08:17:04 np0005465988 nova_compute[236126]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:17:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:04.500 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[23e6c33e-2f9b-41eb-9ca3-bfea939ddb4c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee114210-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:86:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545190, 'reachable_time': 27061, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267114, 'error': None, 'target': 'ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.523 2 DEBUG oslo_concurrency.lockutils [None req-bb4e381f-5405-4b2d-978a-85f6b9c85e94 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "interface-d0fb1236-bd41-4efe-8e6a-bb900eb86960-2a8b7c9c-6681-4c28-8c65-774622c5e92c" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:04.525 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d72d8ab4-537f-4ca0-a083-b24a568a28ae]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee114210-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 545208, 'tstamp': 545208}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267115, 'error': None, 'target': 'ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee114210-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 545212, 'tstamp': 545212}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267115, 'error': None, 'target': 'ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:04.527 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee114210-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:04.529 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee114210-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:04.530 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:17:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:04.530 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee114210-50, col_values=(('external_ids', {'iface-id': '544b20f9-e292-46bc-a770-180c318d534e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:04.531 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:17:04 np0005465988 nova_compute[236126]: 2025-10-02 12:17:04.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:04.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:04.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.121 2 DEBUG nova.compute.manager [req-2cfdd219-34fd-4a20-a642-ab45c23d2262 req-d8643338-52e9-4a60-8db4-a1d03ecabd7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received event network-vif-plugged-2a8b7c9c-6681-4c28-8c65-774622c5e92c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.122 2 DEBUG oslo_concurrency.lockutils [req-2cfdd219-34fd-4a20-a642-ab45c23d2262 req-d8643338-52e9-4a60-8db4-a1d03ecabd7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.122 2 DEBUG oslo_concurrency.lockutils [req-2cfdd219-34fd-4a20-a642-ab45c23d2262 req-d8643338-52e9-4a60-8db4-a1d03ecabd7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.122 2 DEBUG oslo_concurrency.lockutils [req-2cfdd219-34fd-4a20-a642-ab45c23d2262 req-d8643338-52e9-4a60-8db4-a1d03ecabd7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.122 2 DEBUG nova.compute.manager [req-2cfdd219-34fd-4a20-a642-ab45c23d2262 req-d8643338-52e9-4a60-8db4-a1d03ecabd7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] No waiting events found dispatching network-vif-plugged-2a8b7c9c-6681-4c28-8c65-774622c5e92c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.123 2 WARNING nova.compute.manager [req-2cfdd219-34fd-4a20-a642-ab45c23d2262 req-d8643338-52e9-4a60-8db4-a1d03ecabd7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received unexpected event network-vif-plugged-2a8b7c9c-6681-4c28-8c65-774622c5e92c for instance with vm_state active and task_state None.#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.261 2 DEBUG nova.storage.rbd_utils [None req-c9b4af91-8935-4f40-a2e9-de7bf6a36efb 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] removing snapshot(a999d017e3924e959ba2f4590c04c0dd) on rbd image(bc4239f5-3cf2-4325-803c-73121f7e0ee0_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.652 2 DEBUG oslo_concurrency.lockutils [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquiring lock "interface-d0fb1236-bd41-4efe-8e6a-bb900eb86960-2a8b7c9c-6681-4c28-8c65-774622c5e92c" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.652 2 DEBUG oslo_concurrency.lockutils [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "interface-d0fb1236-bd41-4efe-8e6a-bb900eb86960-2a8b7c9c-6681-4c28-8c65-774622c5e92c" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.672 2 DEBUG nova.objects.instance [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lazy-loading 'flavor' on Instance uuid d0fb1236-bd41-4efe-8e6a-bb900eb86960 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.701 2 DEBUG nova.virt.libvirt.vif [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1313126078',display_name='tempest-tempest.common.compute-instance-1313126078',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1313126078',id=70,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBATuG0Igs/aClVmsQdGPZDgNmvV7Bgyq0Y4nFm5M4crDAlAvWuUKcpXWJIzqspIHqd22HRQCaWbqLnJi4PrcYo9nS6OUtnl8L8k0zpd1d2KcZKsipjVMcOXUmRyImfEBhg==',key_name='tempest-keypair-716714059',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bddf59854a7d4eb1a85ece547923cea0',ramdisk_id='',reservation_id='r-vu95vf5t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-751325164',owner_user_name='tempest-AttachInterfacesTestJSON-751325164-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:16:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8e627e098f2b465d97099fee8f489b71',uuid=d0fb1236-bd41-4efe-8e6a-bb900eb86960,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "address": "fa:16:3e:2e:88:f2", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8b7c9c-66", "ovs_interfaceid": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.701 2 DEBUG nova.network.os_vif_util [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Converting VIF {"id": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "address": "fa:16:3e:2e:88:f2", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8b7c9c-66", "ovs_interfaceid": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.702 2 DEBUG nova.network.os_vif_util [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:88:f2,bridge_name='br-int',has_traffic_filtering=True,id=2a8b7c9c-6681-4c28-8c65-774622c5e92c,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2a8b7c9c-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.708 2 DEBUG nova.virt.libvirt.guest [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2e:88:f2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a8b7c9c-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.711 2 DEBUG nova.virt.libvirt.guest [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2e:88:f2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a8b7c9c-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.714 2 DEBUG nova.virt.libvirt.driver [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Attempting to detach device tap2a8b7c9c-66 from instance d0fb1236-bd41-4efe-8e6a-bb900eb86960 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.714 2 DEBUG nova.virt.libvirt.guest [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] detach device xml: <interface type="ethernet">
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <mac address="fa:16:3e:2e:88:f2"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <model type="virtio"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <mtu size="1442"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <target dev="tap2a8b7c9c-66"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]: </interface>
Oct  2 08:17:05 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.722 2 DEBUG nova.virt.libvirt.guest [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2e:88:f2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a8b7c9c-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.726 2 DEBUG nova.virt.libvirt.guest [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2e:88:f2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a8b7c9c-66"/></interface>not found in domain: <domain type='kvm' id='27'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <name>instance-00000046</name>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <uuid>d0fb1236-bd41-4efe-8e6a-bb900eb86960</uuid>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:name>tempest-tempest.common.compute-instance-1313126078</nova:name>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:creationTime>2025-10-02 12:17:04</nova:creationTime>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:flavor name="m1.nano">
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:memory>128</nova:memory>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:disk>1</nova:disk>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:swap>0</nova:swap>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </nova:flavor>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:owner>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:user uuid="8e627e098f2b465d97099fee8f489b71">tempest-AttachInterfacesTestJSON-751325164-project-member</nova:user>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:project uuid="bddf59854a7d4eb1a85ece547923cea0">tempest-AttachInterfacesTestJSON-751325164</nova:project>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </nova:owner>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:ports>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:port uuid="ac8da617-201c-4081-8414-4b18e26dfb4a">
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </nova:port>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:port uuid="2a8b7c9c-6681-4c28-8c65-774622c5e92c">
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </nova:port>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </nova:ports>
Oct  2 08:17:05 np0005465988 nova_compute[236126]: </nova:instance>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <memory unit='KiB'>131072</memory>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <resource>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <partition>/machine</partition>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </resource>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <sysinfo type='smbios'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <entry name='serial'>d0fb1236-bd41-4efe-8e6a-bb900eb86960</entry>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <entry name='uuid'>d0fb1236-bd41-4efe-8e6a-bb900eb86960</entry>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <boot dev='hd'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <smbios mode='sysinfo'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <vmcoreinfo state='on'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <feature policy='require' name='x2apic'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <feature policy='require' name='vme'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <clock offset='utc'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <timer name='hpet' present='no'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <on_reboot>restart</on_reboot>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <on_crash>destroy</on_crash>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <disk type='network' device='disk'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <auth username='openstack'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <secret type='ceph' uuid='fd4c5763-22d1-50ea-ad0b-96a3dc3040b2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <source protocol='rbd' name='vms/d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk' index='2'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <host name='192.168.122.100' port='6789'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <host name='192.168.122.102' port='6789'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <host name='192.168.122.101' port='6789'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target dev='vda' bus='virtio'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='virtio-disk0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <disk type='network' device='cdrom'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <auth username='openstack'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <secret type='ceph' uuid='fd4c5763-22d1-50ea-ad0b-96a3dc3040b2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <source protocol='rbd' name='vms/d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk.config' index='1'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <host name='192.168.122.100' port='6789'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <host name='192.168.122.102' port='6789'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <host name='192.168.122.101' port='6789'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target dev='sda' bus='sata'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <readonly/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='sata0-0-0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pcie.0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='1' port='0x10'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='2' port='0x11'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='3' port='0x12'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.3'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='4' port='0x13'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.4'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='5' port='0x14'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.5'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='6' port='0x15'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.6'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='7' port='0x16'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.7'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='8' port='0x17'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.8'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='9' port='0x18'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.9'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='10' port='0x19'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.10'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='11' port='0x1a'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.11'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='12' port='0x1b'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.12'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='13' port='0x1c'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.13'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='14' port='0x1d'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.14'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='15' port='0x1e'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.15'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='16' port='0x1f'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.16'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='17' port='0x20'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.17'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='18' port='0x21'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.18'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='19' port='0x22'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.19'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='20' port='0x23'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.20'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='21' port='0x24'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.21'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='22' port='0x25'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.22'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='23' port='0x26'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.23'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='24' port='0x27'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.24'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='25' port='0x28'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.25'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-pci-bridge'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.26'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='usb'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='sata' index='0'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='ide'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <interface type='ethernet'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <mac address='fa:16:3e:15:f8:40'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target dev='tapac8da617-20'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model type='virtio'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <mtu size='1442'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='net0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <interface type='ethernet'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <mac address='fa:16:3e:2e:88:f2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target dev='tap2a8b7c9c-66'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model type='virtio'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <mtu size='1442'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='net1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <serial type='pty'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <source path='/dev/pts/0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <log file='/var/lib/nova/instances/d0fb1236-bd41-4efe-8e6a-bb900eb86960/console.log' append='off'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target type='isa-serial' port='0'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <model name='isa-serial'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      </target>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='serial0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <console type='pty' tty='/dev/pts/0'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <source path='/dev/pts/0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <log file='/var/lib/nova/instances/d0fb1236-bd41-4efe-8e6a-bb900eb86960/console.log' append='off'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target type='serial' port='0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='serial0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </console>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <input type='tablet' bus='usb'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='input0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </input>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <input type='mouse' bus='ps2'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='input1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </input>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <input type='keyboard' bus='ps2'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='input2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </input>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <listen type='address' address='::0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </graphics>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <audio id='1' type='none'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='video0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <watchdog model='itco' action='reset'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='watchdog0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </watchdog>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <memballoon model='virtio'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <stats period='10'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='balloon0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <rng model='virtio'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='rng0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <label>system_u:system_r:svirt_t:s0:c774,c906</label>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c774,c906</imagelabel>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </seclabel>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <label>+107:+107</label>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </seclabel>
Oct  2 08:17:05 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:17:05 np0005465988 nova_compute[236126]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.726 2 INFO nova.virt.libvirt.driver [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Successfully detached device tap2a8b7c9c-66 from instance d0fb1236-bd41-4efe-8e6a-bb900eb86960 from the persistent domain config.
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.726 2 DEBUG nova.virt.libvirt.driver [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] (1/8): Attempting to detach device tap2a8b7c9c-66 with device alias net1 from instance d0fb1236-bd41-4efe-8e6a-bb900eb86960 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.727 2 DEBUG nova.virt.libvirt.guest [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] detach device xml: <interface type="ethernet">
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <mac address="fa:16:3e:2e:88:f2"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <model type="virtio"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <mtu size="1442"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <target dev="tap2a8b7c9c-66"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]: </interface>
Oct  2 08:17:05 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct  2 08:17:05 np0005465988 kernel: tap2a8b7c9c-66 (unregistering): left promiscuous mode
Oct  2 08:17:05 np0005465988 NetworkManager[45041]: <info>  [1759407425.8054] device (tap2a8b7c9c-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.816 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Received event <DeviceRemovedEvent: 1759407425.8156626, d0fb1236-bd41-4efe-8e6a-bb900eb86960 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.820 2 DEBUG nova.virt.libvirt.driver [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Start waiting for the detach event from libvirt for device tap2a8b7c9c-66 with device alias net1 for instance d0fb1236-bd41-4efe-8e6a-bb900eb86960 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.821 2 DEBUG nova.virt.libvirt.guest [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2e:88:f2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a8b7c9c-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct  2 08:17:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:17:05Z|00230|binding|INFO|Releasing lport 2a8b7c9c-6681-4c28-8c65-774622c5e92c from this chassis (sb_readonly=0)
Oct  2 08:17:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:17:05Z|00231|binding|INFO|Setting lport 2a8b7c9c-6681-4c28-8c65-774622c5e92c down in Southbound
Oct  2 08:17:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:17:05Z|00232|binding|INFO|Removing iface tap2a8b7c9c-66 ovn-installed in OVS
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.827 2 DEBUG nova.virt.libvirt.guest [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2e:88:f2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a8b7c9c-66"/></interface>not found in domain: <domain type='kvm' id='27'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <name>instance-00000046</name>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <uuid>d0fb1236-bd41-4efe-8e6a-bb900eb86960</uuid>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:name>tempest-tempest.common.compute-instance-1313126078</nova:name>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:creationTime>2025-10-02 12:17:04</nova:creationTime>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:flavor name="m1.nano">
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:memory>128</nova:memory>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:disk>1</nova:disk>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:swap>0</nova:swap>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </nova:flavor>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:owner>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:user uuid="8e627e098f2b465d97099fee8f489b71">tempest-AttachInterfacesTestJSON-751325164-project-member</nova:user>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:project uuid="bddf59854a7d4eb1a85ece547923cea0">tempest-AttachInterfacesTestJSON-751325164</nova:project>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </nova:owner>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:ports>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:port uuid="ac8da617-201c-4081-8414-4b18e26dfb4a">
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </nova:port>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:port uuid="2a8b7c9c-6681-4c28-8c65-774622c5e92c">
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </nova:port>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </nova:ports>
Oct  2 08:17:05 np0005465988 nova_compute[236126]: </nova:instance>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <memory unit='KiB'>131072</memory>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <resource>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <partition>/machine</partition>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </resource>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <sysinfo type='smbios'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <entry name='serial'>d0fb1236-bd41-4efe-8e6a-bb900eb86960</entry>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <entry name='uuid'>d0fb1236-bd41-4efe-8e6a-bb900eb86960</entry>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <boot dev='hd'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <smbios mode='sysinfo'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <vmcoreinfo state='on'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <feature policy='require' name='x2apic'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <feature policy='require' name='vme'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <clock offset='utc'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <timer name='hpet' present='no'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <on_reboot>restart</on_reboot>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <on_crash>destroy</on_crash>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <disk type='network' device='disk'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <auth username='openstack'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <secret type='ceph' uuid='fd4c5763-22d1-50ea-ad0b-96a3dc3040b2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <source protocol='rbd' name='vms/d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk' index='2'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <host name='192.168.122.100' port='6789'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <host name='192.168.122.102' port='6789'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <host name='192.168.122.101' port='6789'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target dev='vda' bus='virtio'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='virtio-disk0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <disk type='network' device='cdrom'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <auth username='openstack'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <secret type='ceph' uuid='fd4c5763-22d1-50ea-ad0b-96a3dc3040b2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <source protocol='rbd' name='vms/d0fb1236-bd41-4efe-8e6a-bb900eb86960_disk.config' index='1'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <host name='192.168.122.100' port='6789'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <host name='192.168.122.102' port='6789'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <host name='192.168.122.101' port='6789'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target dev='sda' bus='sata'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <readonly/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='sata0-0-0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pcie.0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='1' port='0x10'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='2' port='0x11'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='3' port='0x12'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.3'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='4' port='0x13'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.4'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='5' port='0x14'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.5'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='6' port='0x15'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.6'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='7' port='0x16'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.7'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='8' port='0x17'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.8'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='9' port='0x18'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.9'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='10' port='0x19'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.10'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='11' port='0x1a'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.11'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='12' port='0x1b'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.12'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='13' port='0x1c'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.13'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='14' port='0x1d'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.14'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='15' port='0x1e'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.15'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='16' port='0x1f'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.16'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='17' port='0x20'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.17'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='18' port='0x21'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.18'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='19' port='0x22'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.19'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='20' port='0x23'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.20'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='21' port='0x24'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.21'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='22' port='0x25'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.22'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='23' port='0x26'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.23'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='24' port='0x27'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.24'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-root-port'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target chassis='25' port='0x28'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.25'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model name='pcie-pci-bridge'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='pci.26'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='usb'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <controller type='sata' index='0'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='ide'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </controller>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <interface type='ethernet'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <mac address='fa:16:3e:15:f8:40'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target dev='tapac8da617-20'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model type='virtio'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <mtu size='1442'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='net0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <serial type='pty'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <source path='/dev/pts/0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <log file='/var/lib/nova/instances/d0fb1236-bd41-4efe-8e6a-bb900eb86960/console.log' append='off'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target type='isa-serial' port='0'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:        <model name='isa-serial'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      </target>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='serial0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <console type='pty' tty='/dev/pts/0'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <source path='/dev/pts/0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <log file='/var/lib/nova/instances/d0fb1236-bd41-4efe-8e6a-bb900eb86960/console.log' append='off'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <target type='serial' port='0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='serial0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </console>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <input type='tablet' bus='usb'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='input0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </input>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <input type='mouse' bus='ps2'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='input1'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </input>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <input type='keyboard' bus='ps2'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='input2'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </input>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <listen type='address' address='::0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </graphics>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <audio id='1' type='none'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='video0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <watchdog model='itco' action='reset'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='watchdog0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </watchdog>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <memballoon model='virtio'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <stats period='10'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='balloon0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <rng model='virtio'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <alias name='rng0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <label>system_u:system_r:svirt_t:s0:c774,c906</label>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c774,c906</imagelabel>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </seclabel>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <label>+107:+107</label>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </seclabel>
Oct  2 08:17:05 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:17:05 np0005465988 nova_compute[236126]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.827 2 INFO nova.virt.libvirt.driver [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Successfully detached device tap2a8b7c9c-66 from instance d0fb1236-bd41-4efe-8e6a-bb900eb86960 from the live domain config.#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.828 2 DEBUG nova.virt.libvirt.vif [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1313126078',display_name='tempest-tempest.common.compute-instance-1313126078',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1313126078',id=70,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBATuG0Igs/aClVmsQdGPZDgNmvV7Bgyq0Y4nFm5M4crDAlAvWuUKcpXWJIzqspIHqd22HRQCaWbqLnJi4PrcYo9nS6OUtnl8L8k0zpd1d2KcZKsipjVMcOXUmRyImfEBhg==',key_name='tempest-keypair-716714059',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bddf59854a7d4eb1a85ece547923cea0',ramdisk_id='',reservation_id='r-vu95vf5t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-751325164',owner_user_name='tempest-AttachInterfacesTestJSON-751325164-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:16:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8e627e098f2b465d97099fee8f489b71',uuid=d0fb1236-bd41-4efe-8e6a-bb900eb86960,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "address": "fa:16:3e:2e:88:f2", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8b7c9c-66", "ovs_interfaceid": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.828 2 DEBUG nova.network.os_vif_util [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Converting VIF {"id": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "address": "fa:16:3e:2e:88:f2", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8b7c9c-66", "ovs_interfaceid": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:17:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:05.831 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:88:f2 10.100.0.8'], port_security=['fa:16:3e:2e:88:f2 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1498800616', 'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'd0fb1236-bd41-4efe-8e6a-bb900eb86960', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee114210-598c-482f-83c7-26c3363a45c4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1498800616', 'neutron:project_id': 'bddf59854a7d4eb1a85ece547923cea0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '20d10bb5-2bfc-4d22-a5af-27378413539d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6f6a61f-52bf-4639-80ed-e59f98940f90, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=2a8b7c9c-6681-4c28-8c65-774622c5e92c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:05.832 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 2a8b7c9c-6681-4c28-8c65-774622c5e92c in datapath ee114210-598c-482f-83c7-26c3363a45c4 unbound from our chassis#033[00m
Oct  2 08:17:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:05.835 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee114210-598c-482f-83c7-26c3363a45c4#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.829 2 DEBUG nova.network.os_vif_util [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:88:f2,bridge_name='br-int',has_traffic_filtering=True,id=2a8b7c9c-6681-4c28-8c65-774622c5e92c,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2a8b7c9c-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.835 2 DEBUG os_vif [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:88:f2,bridge_name='br-int',has_traffic_filtering=True,id=2a8b7c9c-6681-4c28-8c65-774622c5e92c,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2a8b7c9c-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.838 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a8b7c9c-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.847 2 INFO os_vif [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:88:f2,bridge_name='br-int',has_traffic_filtering=True,id=2a8b7c9c-6681-4c28-8c65-774622c5e92c,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2a8b7c9c-66')#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.848 2 DEBUG nova.virt.libvirt.guest [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:name>tempest-tempest.common.compute-instance-1313126078</nova:name>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:creationTime>2025-10-02 12:17:05</nova:creationTime>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:flavor name="m1.nano">
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:memory>128</nova:memory>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:disk>1</nova:disk>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:swap>0</nova:swap>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </nova:flavor>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:owner>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:user uuid="8e627e098f2b465d97099fee8f489b71">tempest-AttachInterfacesTestJSON-751325164-project-member</nova:user>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:project uuid="bddf59854a7d4eb1a85ece547923cea0">tempest-AttachInterfacesTestJSON-751325164</nova:project>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </nova:owner>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  <nova:ports>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    <nova:port uuid="ac8da617-201c-4081-8414-4b18e26dfb4a">
Oct  2 08:17:05 np0005465988 nova_compute[236126]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:    </nova:port>
Oct  2 08:17:05 np0005465988 nova_compute[236126]:  </nova:ports>
Oct  2 08:17:05 np0005465988 nova_compute[236126]: </nova:instance>
Oct  2 08:17:05 np0005465988 nova_compute[236126]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:17:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:05.854 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6959355e-0427-480e-a4fe-30ab5f858e93]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:05.896 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d46553aa-605d-490d-a1f0-b6aa97974d5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:05.900 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[3f097623-cfa9-47f3-963f-f111e3f89f22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:05.934 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf40940-403e-4cd1-8242-2c06ae25516c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:05.962 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[45f2eb92-a463-49e0-b026-b2e7586c4e73]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee114210-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:86:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 916, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 916, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545190, 'reachable_time': 27061, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267194, 'error': None, 'target': 'ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:05.985 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[39d3d3c1-558b-4f16-8a25-fad4df97e6c5]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee114210-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 545208, 'tstamp': 545208}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267195, 'error': None, 'target': 'ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee114210-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 545212, 'tstamp': 545212}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267195, 'error': None, 'target': 'ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:05.987 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee114210-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:05.990 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee114210-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:05.990 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:17:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:05.990 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee114210-50, col_values=(('external_ids', {'iface-id': '544b20f9-e292-46bc-a770-180c318d534e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:05.991 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:17:05 np0005465988 nova_compute[236126]: 2025-10-02 12:17:05.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:17:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:17:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e236 e236: 3 total, 3 up, 3 in
Oct  2 08:17:06 np0005465988 nova_compute[236126]: 2025-10-02 12:17:06.309 2 DEBUG nova.storage.rbd_utils [None req-c9b4af91-8935-4f40-a2e9-de7bf6a36efb 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] creating snapshot(snap) on rbd image(e7ad18e6-654f-48ae-a957-a50e1a2c7a2d) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:17:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:06.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:06.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:07 np0005465988 nova_compute[236126]: 2025-10-02 12:17:07.256 2 DEBUG nova.compute.manager [req-82d6c64a-192e-4835-af6e-130be3954eb7 req-3c79348f-cd94-4955-9968-f3ff538e6d5c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received event network-vif-plugged-2a8b7c9c-6681-4c28-8c65-774622c5e92c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:07 np0005465988 nova_compute[236126]: 2025-10-02 12:17:07.256 2 DEBUG oslo_concurrency.lockutils [req-82d6c64a-192e-4835-af6e-130be3954eb7 req-3c79348f-cd94-4955-9968-f3ff538e6d5c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:07 np0005465988 nova_compute[236126]: 2025-10-02 12:17:07.257 2 DEBUG oslo_concurrency.lockutils [req-82d6c64a-192e-4835-af6e-130be3954eb7 req-3c79348f-cd94-4955-9968-f3ff538e6d5c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:07 np0005465988 nova_compute[236126]: 2025-10-02 12:17:07.257 2 DEBUG oslo_concurrency.lockutils [req-82d6c64a-192e-4835-af6e-130be3954eb7 req-3c79348f-cd94-4955-9968-f3ff538e6d5c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:07 np0005465988 nova_compute[236126]: 2025-10-02 12:17:07.257 2 DEBUG nova.compute.manager [req-82d6c64a-192e-4835-af6e-130be3954eb7 req-3c79348f-cd94-4955-9968-f3ff538e6d5c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] No waiting events found dispatching network-vif-plugged-2a8b7c9c-6681-4c28-8c65-774622c5e92c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:07 np0005465988 nova_compute[236126]: 2025-10-02 12:17:07.257 2 WARNING nova.compute.manager [req-82d6c64a-192e-4835-af6e-130be3954eb7 req-3c79348f-cd94-4955-9968-f3ff538e6d5c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received unexpected event network-vif-plugged-2a8b7c9c-6681-4c28-8c65-774622c5e92c for instance with vm_state active and task_state None.#033[00m
Oct  2 08:17:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e237 e237: 3 total, 3 up, 3 in
Oct  2 08:17:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:17:07 np0005465988 nova_compute[236126]: 2025-10-02 12:17:07.384 2 DEBUG nova.network.neutron [req-508213fd-517b-4a7e-827f-2f13376c34dd req-f96dc009-4959-4522-98eb-a509565e6c95 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updated VIF entry in instance network info cache for port 2a8b7c9c-6681-4c28-8c65-774622c5e92c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:17:07 np0005465988 nova_compute[236126]: 2025-10-02 12:17:07.385 2 DEBUG nova.network.neutron [req-508213fd-517b-4a7e-827f-2f13376c34dd req-f96dc009-4959-4522-98eb-a509565e6c95 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updating instance_info_cache with network_info: [{"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "address": "fa:16:3e:2e:88:f2", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8b7c9c-66", "ovs_interfaceid": "2a8b7c9c-6681-4c28-8c65-774622c5e92c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:07 np0005465988 nova_compute[236126]: 2025-10-02 12:17:07.401 2 DEBUG oslo_concurrency.lockutils [req-508213fd-517b-4a7e-827f-2f13376c34dd req-f96dc009-4959-4522-98eb-a509565e6c95 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:07 np0005465988 nova_compute[236126]: 2025-10-02 12:17:07.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:08 np0005465988 nova_compute[236126]: 2025-10-02 12:17:08.253 2 DEBUG oslo_concurrency.lockutils [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquiring lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:08 np0005465988 nova_compute[236126]: 2025-10-02 12:17:08.253 2 DEBUG oslo_concurrency.lockutils [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquired lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:08 np0005465988 nova_compute[236126]: 2025-10-02 12:17:08.253 2 DEBUG nova.network.neutron [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:17:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e238 e238: 3 total, 3 up, 3 in
Oct  2 08:17:08 np0005465988 nova_compute[236126]: 2025-10-02 12:17:08.751 2 INFO nova.virt.libvirt.driver [None req-c9b4af91-8935-4f40-a2e9-de7bf6a36efb 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Snapshot image upload complete#033[00m
Oct  2 08:17:08 np0005465988 nova_compute[236126]: 2025-10-02 12:17:08.751 2 INFO nova.compute.manager [None req-c9b4af91-8935-4f40-a2e9-de7bf6a36efb 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Took 7.39 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 08:17:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:08.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:08.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:10 np0005465988 nova_compute[236126]: 2025-10-02 12:17:10.235 2 INFO nova.network.neutron [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Port 2a8b7c9c-6681-4c28-8c65-774622c5e92c from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Oct  2 08:17:10 np0005465988 nova_compute[236126]: 2025-10-02 12:17:10.236 2 DEBUG nova.network.neutron [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updating instance_info_cache with network_info: [{"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:10 np0005465988 nova_compute[236126]: 2025-10-02 12:17:10.256 2 DEBUG oslo_concurrency.lockutils [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Releasing lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:10 np0005465988 nova_compute[236126]: 2025-10-02 12:17:10.280 2 DEBUG oslo_concurrency.lockutils [None req-9714817b-e52f-42b5-94e5-2371671fd402 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "interface-d0fb1236-bd41-4efe-8e6a-bb900eb86960-2a8b7c9c-6681-4c28-8c65-774622c5e92c" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 4.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:10.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:10 np0005465988 nova_compute[236126]: 2025-10-02 12:17:10.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:10.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:11 np0005465988 nova_compute[236126]: 2025-10-02 12:17:11.634 2 DEBUG nova.compute.manager [req-126a1cdb-6f0a-4f8d-9037-28fa52fe4622 req-8727fe52-ddf3-4059-8f2c-4cecd815d2de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received event network-changed-ac8da617-201c-4081-8414-4b18e26dfb4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:11 np0005465988 nova_compute[236126]: 2025-10-02 12:17:11.635 2 DEBUG nova.compute.manager [req-126a1cdb-6f0a-4f8d-9037-28fa52fe4622 req-8727fe52-ddf3-4059-8f2c-4cecd815d2de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Refreshing instance network info cache due to event network-changed-ac8da617-201c-4081-8414-4b18e26dfb4a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:17:11 np0005465988 nova_compute[236126]: 2025-10-02 12:17:11.635 2 DEBUG oslo_concurrency.lockutils [req-126a1cdb-6f0a-4f8d-9037-28fa52fe4622 req-8727fe52-ddf3-4059-8f2c-4cecd815d2de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:11 np0005465988 nova_compute[236126]: 2025-10-02 12:17:11.636 2 DEBUG oslo_concurrency.lockutils [req-126a1cdb-6f0a-4f8d-9037-28fa52fe4622 req-8727fe52-ddf3-4059-8f2c-4cecd815d2de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:11 np0005465988 nova_compute[236126]: 2025-10-02 12:17:11.636 2 DEBUG nova.network.neutron [req-126a1cdb-6f0a-4f8d-9037-28fa52fe4622 req-8727fe52-ddf3-4059-8f2c-4cecd815d2de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Refreshing network info cache for port ac8da617-201c-4081-8414-4b18e26dfb4a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:17:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e239 e239: 3 total, 3 up, 3 in
Oct  2 08:17:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:17:12 np0005465988 podman[267219]: 2025-10-02 12:17:12.525147456 +0000 UTC m=+0.058847094 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, 
io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:17:12 np0005465988 nova_compute[236126]: 2025-10-02 12:17:12.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e240 e240: 3 total, 3 up, 3 in
Oct  2 08:17:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:12.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:12.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:13 np0005465988 nova_compute[236126]: 2025-10-02 12:17:13.070 2 DEBUG nova.network.neutron [req-126a1cdb-6f0a-4f8d-9037-28fa52fe4622 req-8727fe52-ddf3-4059-8f2c-4cecd815d2de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updated VIF entry in instance network info cache for port ac8da617-201c-4081-8414-4b18e26dfb4a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:17:13 np0005465988 nova_compute[236126]: 2025-10-02 12:17:13.071 2 DEBUG nova.network.neutron [req-126a1cdb-6f0a-4f8d-9037-28fa52fe4622 req-8727fe52-ddf3-4059-8f2c-4cecd815d2de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updating instance_info_cache with network_info: [{"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:13 np0005465988 nova_compute[236126]: 2025-10-02 12:17:13.092 2 DEBUG oslo_concurrency.lockutils [req-126a1cdb-6f0a-4f8d-9037-28fa52fe4622 req-8727fe52-ddf3-4059-8f2c-4cecd815d2de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-d0fb1236-bd41-4efe-8e6a-bb900eb86960" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:13 np0005465988 ovn_controller[132601]: 2025-10-02T12:17:13Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5d:e9:df 10.100.0.7
Oct  2 08:17:13 np0005465988 ovn_controller[132601]: 2025-10-02T12:17:13Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5d:e9:df 10.100.0.7
Oct  2 08:17:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e241 e241: 3 total, 3 up, 3 in
Oct  2 08:17:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:14.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e242 e242: 3 total, 3 up, 3 in
Oct  2 08:17:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:14.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:15 np0005465988 nova_compute[236126]: 2025-10-02 12:17:15.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:16.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:16.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:17:17 np0005465988 nova_compute[236126]: 2025-10-02 12:17:17.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e243 e243: 3 total, 3 up, 3 in
Oct  2 08:17:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:18.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:18.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:19.806 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:19 np0005465988 nova_compute[236126]: 2025-10-02 12:17:19.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:19.809 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:17:20 np0005465988 nova_compute[236126]: 2025-10-02 12:17:20.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:20.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:20.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:17:22 np0005465988 nova_compute[236126]: 2025-10-02 12:17:22.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:22.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:22.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e244 e244: 3 total, 3 up, 3 in
Oct  2 08:17:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:24.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:24.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e245 e245: 3 total, 3 up, 3 in
Oct  2 08:17:25 np0005465988 nova_compute[236126]: 2025-10-02 12:17:25.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:26.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:26.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:17:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:27.344 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:27.345 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:27.346 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:27 np0005465988 nova_compute[236126]: 2025-10-02 12:17:27.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e246 e246: 3 total, 3 up, 3 in
Oct  2 08:17:28 np0005465988 podman[267297]: 2025-10-02 12:17:28.548950412 +0000 UTC m=+0.077384664 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:17:28 np0005465988 podman[267298]: 2025-10-02 12:17:28.552191116 +0000 UTC m=+0.065509438 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Oct  2 08:17:28 np0005465988 podman[267296]: 2025-10-02 12:17:28.585339041 +0000 UTC m=+0.117683186 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:17:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:28.813 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:28.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:28.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e247 e247: 3 total, 3 up, 3 in
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.504 2 DEBUG oslo_concurrency.lockutils [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquiring lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.505 2 DEBUG oslo_concurrency.lockutils [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.505 2 DEBUG oslo_concurrency.lockutils [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquiring lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.505 2 DEBUG oslo_concurrency.lockutils [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.506 2 DEBUG oslo_concurrency.lockutils [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.507 2 INFO nova.compute.manager [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Terminating instance#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.508 2 DEBUG nova.compute.manager [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:17:29 np0005465988 kernel: tapac8da617-20 (unregistering): left promiscuous mode
Oct  2 08:17:29 np0005465988 NetworkManager[45041]: <info>  [1759407449.5857] device (tapac8da617-20): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:17:29 np0005465988 ovn_controller[132601]: 2025-10-02T12:17:29Z|00233|binding|INFO|Releasing lport ac8da617-201c-4081-8414-4b18e26dfb4a from this chassis (sb_readonly=0)
Oct  2 08:17:29 np0005465988 ovn_controller[132601]: 2025-10-02T12:17:29Z|00234|binding|INFO|Setting lport ac8da617-201c-4081-8414-4b18e26dfb4a down in Southbound
Oct  2 08:17:29 np0005465988 ovn_controller[132601]: 2025-10-02T12:17:29Z|00235|binding|INFO|Removing iface tapac8da617-20 ovn-installed in OVS
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:29.604 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:f8:40 10.100.0.3'], port_security=['fa:16:3e:15:f8:40 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd0fb1236-bd41-4efe-8e6a-bb900eb86960', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee114210-598c-482f-83c7-26c3363a45c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bddf59854a7d4eb1a85ece547923cea0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fd3b62a4-b346-4888-8f6a-ca787221af6a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6f6a61f-52bf-4639-80ed-e59f98940f90, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=ac8da617-201c-4081-8414-4b18e26dfb4a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:29.605 142124 INFO neutron.agent.ovn.metadata.agent [-] Port ac8da617-201c-4081-8414-4b18e26dfb4a in datapath ee114210-598c-482f-83c7-26c3363a45c4 unbound from our chassis#033[00m
Oct  2 08:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:29.607 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee114210-598c-482f-83c7-26c3363a45c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:29.608 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7a24ecba-a563-464b-aa06-54b9c9cc620b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:29.609 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4 namespace which is not needed anymore#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:29 np0005465988 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000046.scope: Deactivated successfully.
Oct  2 08:17:29 np0005465988 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000046.scope: Consumed 15.587s CPU time.
Oct  2 08:17:29 np0005465988 systemd-machined[192594]: Machine qemu-27-instance-00000046 terminated.
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.749 2 INFO nova.virt.libvirt.driver [-] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Instance destroyed successfully.#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.751 2 DEBUG nova.objects.instance [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lazy-loading 'resources' on Instance uuid d0fb1236-bd41-4efe-8e6a-bb900eb86960 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.773 2 DEBUG nova.virt.libvirt.vif [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1313126078',display_name='tempest-tempest.common.compute-instance-1313126078',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1313126078',id=70,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBATuG0Igs/aClVmsQdGPZDgNmvV7Bgyq0Y4nFm5M4crDAlAvWuUKcpXWJIzqspIHqd22HRQCaWbqLnJi4PrcYo9nS6OUtnl8L8k0zpd1d2KcZKsipjVMcOXUmRyImfEBhg==',key_name='tempest-keypair-716714059',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bddf59854a7d4eb1a85ece547923cea0',ramdisk_id='',reservation_id='r-vu95vf5t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-751325164',owner_user_name='tempest-AttachInterfacesTestJSON-751325164-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:16:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8e627e098f2b465d97099fee8f489b71',uuid=d0fb1236-bd41-4efe-8e6a-bb900eb86960,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.774 2 DEBUG nova.network.os_vif_util [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Converting VIF {"id": "ac8da617-201c-4081-8414-4b18e26dfb4a", "address": "fa:16:3e:15:f8:40", "network": {"id": "ee114210-598c-482f-83c7-26c3363a45c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-985096952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddf59854a7d4eb1a85ece547923cea0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8da617-20", "ovs_interfaceid": "ac8da617-201c-4081-8414-4b18e26dfb4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.775 2 DEBUG nova.network.os_vif_util [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:f8:40,bridge_name='br-int',has_traffic_filtering=True,id=ac8da617-201c-4081-8414-4b18e26dfb4a,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac8da617-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.775 2 DEBUG os_vif [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:f8:40,bridge_name='br-int',has_traffic_filtering=True,id=ac8da617-201c-4081-8414-4b18e26dfb4a,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac8da617-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.777 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac8da617-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.784 2 INFO os_vif [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:f8:40,bridge_name='br-int',has_traffic_filtering=True,id=ac8da617-201c-4081-8414-4b18e26dfb4a,network=Network(ee114210-598c-482f-83c7-26c3363a45c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac8da617-20')#033[00m
Oct  2 08:17:29 np0005465988 neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4[266142]: [NOTICE]   (266146) : haproxy version is 2.8.14-c23fe91
Oct  2 08:17:29 np0005465988 neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4[266142]: [NOTICE]   (266146) : path to executable is /usr/sbin/haproxy
Oct  2 08:17:29 np0005465988 neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4[266142]: [WARNING]  (266146) : Exiting Master process...
Oct  2 08:17:29 np0005465988 neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4[266142]: [ALERT]    (266146) : Current worker (266148) exited with code 143 (Terminated)
Oct  2 08:17:29 np0005465988 neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4[266142]: [WARNING]  (266146) : All workers exited. Exiting... (0)
Oct  2 08:17:29 np0005465988 systemd[1]: libpod-e7fd444e6ab01eb8064343f9f87f8e89240317b17b8d44f63115aa2cca7b1a51.scope: Deactivated successfully.
Oct  2 08:17:29 np0005465988 podman[267385]: 2025-10-02 12:17:29.799770198 +0000 UTC m=+0.069153554 container died e7fd444e6ab01eb8064343f9f87f8e89240317b17b8d44f63115aa2cca7b1a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:17:29 np0005465988 systemd[1]: var-lib-containers-storage-overlay-a9261db82d16f56b57727a2ea2d60bfcdc4371cfe6284e2958325b8e937bb082-merged.mount: Deactivated successfully.
Oct  2 08:17:29 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e7fd444e6ab01eb8064343f9f87f8e89240317b17b8d44f63115aa2cca7b1a51-userdata-shm.mount: Deactivated successfully.
Oct  2 08:17:29 np0005465988 podman[267385]: 2025-10-02 12:17:29.845565621 +0000 UTC m=+0.114948967 container cleanup e7fd444e6ab01eb8064343f9f87f8e89240317b17b8d44f63115aa2cca7b1a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:17:29 np0005465988 systemd[1]: libpod-conmon-e7fd444e6ab01eb8064343f9f87f8e89240317b17b8d44f63115aa2cca7b1a51.scope: Deactivated successfully.
Oct  2 08:17:29 np0005465988 podman[267447]: 2025-10-02 12:17:29.92762746 +0000 UTC m=+0.050315356 container remove e7fd444e6ab01eb8064343f9f87f8e89240317b17b8d44f63115aa2cca7b1a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:29.935 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c0574aa6-1db7-4539-96fe-6ed6fa25bd52]: (4, ('Thu Oct  2 12:17:29 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4 (e7fd444e6ab01eb8064343f9f87f8e89240317b17b8d44f63115aa2cca7b1a51)\ne7fd444e6ab01eb8064343f9f87f8e89240317b17b8d44f63115aa2cca7b1a51\nThu Oct  2 12:17:29 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4 (e7fd444e6ab01eb8064343f9f87f8e89240317b17b8d44f63115aa2cca7b1a51)\ne7fd444e6ab01eb8064343f9f87f8e89240317b17b8d44f63115aa2cca7b1a51\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:29.938 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[79c27ea3-3e79-41c8-a924-556be54ec9c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:29.939 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee114210-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:29 np0005465988 kernel: tapee114210-50: left promiscuous mode
Oct  2 08:17:29 np0005465988 nova_compute[236126]: 2025-10-02 12:17:29.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:29.982 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c2cac840-2227-4dfc-adc7-f3cab81e9055]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:30 np0005465988 nova_compute[236126]: 2025-10-02 12:17:30.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:30 np0005465988 nova_compute[236126]: 2025-10-02 12:17:30.008 2 DEBUG nova.compute.manager [req-691801ec-0616-46b3-b6b2-89a147d00352 req-94ab6a30-bbfc-4ddf-be3e-da2cef844c62 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received event network-vif-unplugged-ac8da617-201c-4081-8414-4b18e26dfb4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:30 np0005465988 nova_compute[236126]: 2025-10-02 12:17:30.009 2 DEBUG oslo_concurrency.lockutils [req-691801ec-0616-46b3-b6b2-89a147d00352 req-94ab6a30-bbfc-4ddf-be3e-da2cef844c62 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:30 np0005465988 nova_compute[236126]: 2025-10-02 12:17:30.009 2 DEBUG oslo_concurrency.lockutils [req-691801ec-0616-46b3-b6b2-89a147d00352 req-94ab6a30-bbfc-4ddf-be3e-da2cef844c62 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:30 np0005465988 nova_compute[236126]: 2025-10-02 12:17:30.009 2 DEBUG oslo_concurrency.lockutils [req-691801ec-0616-46b3-b6b2-89a147d00352 req-94ab6a30-bbfc-4ddf-be3e-da2cef844c62 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:30 np0005465988 nova_compute[236126]: 2025-10-02 12:17:30.009 2 DEBUG nova.compute.manager [req-691801ec-0616-46b3-b6b2-89a147d00352 req-94ab6a30-bbfc-4ddf-be3e-da2cef844c62 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] No waiting events found dispatching network-vif-unplugged-ac8da617-201c-4081-8414-4b18e26dfb4a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:30 np0005465988 nova_compute[236126]: 2025-10-02 12:17:30.009 2 DEBUG nova.compute.manager [req-691801ec-0616-46b3-b6b2-89a147d00352 req-94ab6a30-bbfc-4ddf-be3e-da2cef844c62 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received event network-vif-unplugged-ac8da617-201c-4081-8414-4b18e26dfb4a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:17:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:30.013 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[45480374-c06d-4917-b535-39af6204e806]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:30.014 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[033e0e2c-5e46-4b17-a279-4443ed2062ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:30.038 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1f98f43b-6ddc-4044-8495-838836cb5137]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545180, 'reachable_time': 19839, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267462, 'error': None, 'target': 'ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:30.041 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee114210-598c-482f-83c7-26c3363a45c4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:17:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:17:30.041 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[5c5969e8-36ef-47e4-8878-fd80d0c9f9c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:30 np0005465988 systemd[1]: run-netns-ovnmeta\x2dee114210\x2d598c\x2d482f\x2d83c7\x2d26c3363a45c4.mount: Deactivated successfully.
Oct  2 08:17:30 np0005465988 nova_compute[236126]: 2025-10-02 12:17:30.305 2 INFO nova.virt.libvirt.driver [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Deleting instance files /var/lib/nova/instances/d0fb1236-bd41-4efe-8e6a-bb900eb86960_del#033[00m
Oct  2 08:17:30 np0005465988 nova_compute[236126]: 2025-10-02 12:17:30.306 2 INFO nova.virt.libvirt.driver [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Deletion of /var/lib/nova/instances/d0fb1236-bd41-4efe-8e6a-bb900eb86960_del complete#033[00m
Oct  2 08:17:30 np0005465988 nova_compute[236126]: 2025-10-02 12:17:30.426 2 INFO nova.compute.manager [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Took 0.92 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:17:30 np0005465988 nova_compute[236126]: 2025-10-02 12:17:30.427 2 DEBUG oslo.service.loopingcall [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:17:30 np0005465988 nova_compute[236126]: 2025-10-02 12:17:30.427 2 DEBUG nova.compute.manager [-] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:17:30 np0005465988 nova_compute[236126]: 2025-10-02 12:17:30.428 2 DEBUG nova.network.neutron [-] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:17:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:30.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:30.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:31 np0005465988 nova_compute[236126]: 2025-10-02 12:17:31.819 2 DEBUG nova.network.neutron [-] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:31 np0005465988 nova_compute[236126]: 2025-10-02 12:17:31.849 2 INFO nova.compute.manager [-] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Took 1.42 seconds to deallocate network for instance.#033[00m
Oct  2 08:17:31 np0005465988 nova_compute[236126]: 2025-10-02 12:17:31.895 2 DEBUG oslo_concurrency.lockutils [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:31 np0005465988 nova_compute[236126]: 2025-10-02 12:17:31.895 2 DEBUG oslo_concurrency.lockutils [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:31 np0005465988 nova_compute[236126]: 2025-10-02 12:17:31.914 2 DEBUG nova.compute.manager [req-f8d151c6-61cf-4f59-b744-a786e5a3eba4 req-40af3792-5b82-446f-a391-3de35523a2e5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received event network-vif-deleted-ac8da617-201c-4081-8414-4b18e26dfb4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:31 np0005465988 nova_compute[236126]: 2025-10-02 12:17:31.987 2 DEBUG oslo_concurrency.processutils [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:32 np0005465988 nova_compute[236126]: 2025-10-02 12:17:32.177 2 DEBUG nova.compute.manager [req-f290f71d-4f39-4707-b3f4-600cc8f2a969 req-55276dd6-2c46-4c78-9033-006710492c4c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received event network-vif-plugged-ac8da617-201c-4081-8414-4b18e26dfb4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:32 np0005465988 nova_compute[236126]: 2025-10-02 12:17:32.178 2 DEBUG oslo_concurrency.lockutils [req-f290f71d-4f39-4707-b3f4-600cc8f2a969 req-55276dd6-2c46-4c78-9033-006710492c4c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:32 np0005465988 nova_compute[236126]: 2025-10-02 12:17:32.178 2 DEBUG oslo_concurrency.lockutils [req-f290f71d-4f39-4707-b3f4-600cc8f2a969 req-55276dd6-2c46-4c78-9033-006710492c4c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:32 np0005465988 nova_compute[236126]: 2025-10-02 12:17:32.178 2 DEBUG oslo_concurrency.lockutils [req-f290f71d-4f39-4707-b3f4-600cc8f2a969 req-55276dd6-2c46-4c78-9033-006710492c4c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:32 np0005465988 nova_compute[236126]: 2025-10-02 12:17:32.178 2 DEBUG nova.compute.manager [req-f290f71d-4f39-4707-b3f4-600cc8f2a969 req-55276dd6-2c46-4c78-9033-006710492c4c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] No waiting events found dispatching network-vif-plugged-ac8da617-201c-4081-8414-4b18e26dfb4a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:32 np0005465988 nova_compute[236126]: 2025-10-02 12:17:32.179 2 WARNING nova.compute.manager [req-f290f71d-4f39-4707-b3f4-600cc8f2a969 req-55276dd6-2c46-4c78-9033-006710492c4c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Received unexpected event network-vif-plugged-ac8da617-201c-4081-8414-4b18e26dfb4a for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:17:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:17:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:17:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/638047331' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:17:32 np0005465988 nova_compute[236126]: 2025-10-02 12:17:32.467 2 DEBUG oslo_concurrency.processutils [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:32 np0005465988 nova_compute[236126]: 2025-10-02 12:17:32.473 2 DEBUG nova.compute.provider_tree [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:17:32 np0005465988 nova_compute[236126]: 2025-10-02 12:17:32.501 2 DEBUG nova.scheduler.client.report [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:17:32 np0005465988 nova_compute[236126]: 2025-10-02 12:17:32.534 2 DEBUG oslo_concurrency.lockutils [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:32 np0005465988 nova_compute[236126]: 2025-10-02 12:17:32.580 2 INFO nova.scheduler.client.report [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Deleted allocations for instance d0fb1236-bd41-4efe-8e6a-bb900eb86960#033[00m
Oct  2 08:17:32 np0005465988 nova_compute[236126]: 2025-10-02 12:17:32.677 2 DEBUG oslo_concurrency.lockutils [None req-59cb2825-c676-4098-aee8-9544f1d6f375 8e627e098f2b465d97099fee8f489b71 bddf59854a7d4eb1a85ece547923cea0 - - default default] Lock "d0fb1236-bd41-4efe-8e6a-bb900eb86960" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:32 np0005465988 nova_compute[236126]: 2025-10-02 12:17:32.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:17:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:32.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:17:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:17:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:32.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:17:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e248 e248: 3 total, 3 up, 3 in
Oct  2 08:17:34 np0005465988 nova_compute[236126]: 2025-10-02 12:17:34.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:34.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:34.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:36.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:36.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:17:37 np0005465988 nova_compute[236126]: 2025-10-02 12:17:37.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 e249: 3 total, 3 up, 3 in
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.805128) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407458805250, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 2393, "num_deletes": 268, "total_data_size": 5137116, "memory_usage": 5218696, "flush_reason": "Manual Compaction"}
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407458827256, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 3369958, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36455, "largest_seqno": 38843, "table_properties": {"data_size": 3359936, "index_size": 6388, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 21114, "raw_average_key_size": 20, "raw_value_size": 3339827, "raw_average_value_size": 3319, "num_data_blocks": 275, "num_entries": 1006, "num_filter_entries": 1006, "num_deletions": 268, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759407299, "oldest_key_time": 1759407299, "file_creation_time": 1759407458, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 22175 microseconds, and 14036 cpu microseconds.
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.827320) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 3369958 bytes OK
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.827351) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.829714) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.829738) EVENT_LOG_v1 {"time_micros": 1759407458829731, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.829762) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 5126433, prev total WAL file size 5126433, number of live WAL files 2.
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.832020) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303034' seq:72057594037927935, type:22 .. '6C6F676D0031323535' seq:0, type:0; will stop at (end)
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(3290KB)], [69(7992KB)]
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407458832075, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 11554697, "oldest_snapshot_seqno": -1}
Oct  2 08:17:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:38.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6346 keys, 11404182 bytes, temperature: kUnknown
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407458915192, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 11404182, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11359370, "index_size": 27870, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15877, "raw_key_size": 162302, "raw_average_key_size": 25, "raw_value_size": 11243144, "raw_average_value_size": 1771, "num_data_blocks": 1122, "num_entries": 6346, "num_filter_entries": 6346, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759407458, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.915641) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 11404182 bytes
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.916970) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 138.7 rd, 136.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 7.8 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(6.8) write-amplify(3.4) OK, records in: 6891, records dropped: 545 output_compression: NoCompression
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.917001) EVENT_LOG_v1 {"time_micros": 1759407458916987, "job": 42, "event": "compaction_finished", "compaction_time_micros": 83321, "compaction_time_cpu_micros": 48642, "output_level": 6, "num_output_files": 1, "total_output_size": 11404182, "num_input_records": 6891, "num_output_records": 6346, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407458918233, "job": 42, "event": "table_file_deletion", "file_number": 71}
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407458921145, "job": 42, "event": "table_file_deletion", "file_number": 69}
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.831903) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.921214) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.921222) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.921225) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.921228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:17:38 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:38.921231) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:17:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:38.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:39 np0005465988 nova_compute[236126]: 2025-10-02 12:17:39.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:40.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:40.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:17:42 np0005465988 nova_compute[236126]: 2025-10-02 12:17:42.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:42 np0005465988 nova_compute[236126]: 2025-10-02 12:17:42.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:42 np0005465988 nova_compute[236126]: 2025-10-02 12:17:42.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:42 np0005465988 nova_compute[236126]: 2025-10-02 12:17:42.764 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:42 np0005465988 nova_compute[236126]: 2025-10-02 12:17:42.765 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:42 np0005465988 nova_compute[236126]: 2025-10-02 12:17:42.765 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:42 np0005465988 nova_compute[236126]: 2025-10-02 12:17:42.766 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:17:42 np0005465988 nova_compute[236126]: 2025-10-02 12:17:42.766 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:42.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:42.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:17:43 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1166877677' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:17:43 np0005465988 nova_compute[236126]: 2025-10-02 12:17:43.243 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:43 np0005465988 nova_compute[236126]: 2025-10-02 12:17:43.347 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000004b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:17:43 np0005465988 nova_compute[236126]: 2025-10-02 12:17:43.348 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000004b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:17:43 np0005465988 podman[267566]: 2025-10-02 12:17:43.408489639 +0000 UTC m=+0.100905998 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  2 08:17:43 np0005465988 nova_compute[236126]: 2025-10-02 12:17:43.528 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:17:43 np0005465988 nova_compute[236126]: 2025-10-02 12:17:43.529 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4374MB free_disk=20.942459106445312GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:17:43 np0005465988 nova_compute[236126]: 2025-10-02 12:17:43.530 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:43 np0005465988 nova_compute[236126]: 2025-10-02 12:17:43.530 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:43 np0005465988 nova_compute[236126]: 2025-10-02 12:17:43.642 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance bc4239f5-3cf2-4325-803c-73121f7e0ee0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:17:43 np0005465988 nova_compute[236126]: 2025-10-02 12:17:43.643 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:17:43 np0005465988 nova_compute[236126]: 2025-10-02 12:17:43.643 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:17:43 np0005465988 nova_compute[236126]: 2025-10-02 12:17:43.702 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:17:44 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3984017574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:17:44 np0005465988 nova_compute[236126]: 2025-10-02 12:17:44.200 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:44 np0005465988 nova_compute[236126]: 2025-10-02 12:17:44.208 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:17:44 np0005465988 ovn_controller[132601]: 2025-10-02T12:17:44Z|00236|binding|INFO|Releasing lport 421cd6e3-75aa-44e1-b552-d119c4fcd629 from this chassis (sb_readonly=0)
Oct  2 08:17:44 np0005465988 nova_compute[236126]: 2025-10-02 12:17:44.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:44 np0005465988 nova_compute[236126]: 2025-10-02 12:17:44.249 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:17:44 np0005465988 nova_compute[236126]: 2025-10-02 12:17:44.295 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:17:44 np0005465988 nova_compute[236126]: 2025-10-02 12:17:44.296 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.766s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:44 np0005465988 ovn_controller[132601]: 2025-10-02T12:17:44Z|00237|binding|INFO|Releasing lport 421cd6e3-75aa-44e1-b552-d119c4fcd629 from this chassis (sb_readonly=0)
Oct  2 08:17:44 np0005465988 nova_compute[236126]: 2025-10-02 12:17:44.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:44 np0005465988 nova_compute[236126]: 2025-10-02 12:17:44.746 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407449.7448397, d0fb1236-bd41-4efe-8e6a-bb900eb86960 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:17:44 np0005465988 nova_compute[236126]: 2025-10-02 12:17:44.747 2 INFO nova.compute.manager [-] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:17:44 np0005465988 nova_compute[236126]: 2025-10-02 12:17:44.776 2 DEBUG nova.compute.manager [None req-6c17aa21-694b-4471-8837-621378bfa1cd - - - - - -] [instance: d0fb1236-bd41-4efe-8e6a-bb900eb86960] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:44 np0005465988 nova_compute[236126]: 2025-10-02 12:17:44.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:44.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:17:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:44.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:17:46 np0005465988 nova_compute[236126]: 2025-10-02 12:17:46.297 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:46 np0005465988 nova_compute[236126]: 2025-10-02 12:17:46.298 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:46 np0005465988 nova_compute[236126]: 2025-10-02 12:17:46.298 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:46 np0005465988 nova_compute[236126]: 2025-10-02 12:17:46.299 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:17:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:46.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:46.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:17:47 np0005465988 nova_compute[236126]: 2025-10-02 12:17:47.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:47 np0005465988 nova_compute[236126]: 2025-10-02 12:17:47.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:48 np0005465988 nova_compute[236126]: 2025-10-02 12:17:48.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:48 np0005465988 nova_compute[236126]: 2025-10-02 12:17:48.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:48.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:48.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:49 np0005465988 nova_compute[236126]: 2025-10-02 12:17:49.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:50.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:50.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:17:52 np0005465988 nova_compute[236126]: 2025-10-02 12:17:52.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:52 np0005465988 nova_compute[236126]: 2025-10-02 12:17:52.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:17:52 np0005465988 nova_compute[236126]: 2025-10-02 12:17:52.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:17:52 np0005465988 nova_compute[236126]: 2025-10-02 12:17:52.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:52.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:52.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:53 np0005465988 nova_compute[236126]: 2025-10-02 12:17:53.241 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:53 np0005465988 nova_compute[236126]: 2025-10-02 12:17:53.241 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:53 np0005465988 nova_compute[236126]: 2025-10-02 12:17:53.242 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:17:53 np0005465988 nova_compute[236126]: 2025-10-02 12:17:53.242 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bc4239f5-3cf2-4325-803c-73121f7e0ee0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:54 np0005465988 nova_compute[236126]: 2025-10-02 12:17:54.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:54.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:54.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.763070) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407475763138, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 427, "num_deletes": 251, "total_data_size": 468013, "memory_usage": 476232, "flush_reason": "Manual Compaction"}
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407475769391, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 308298, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38848, "largest_seqno": 39270, "table_properties": {"data_size": 305884, "index_size": 514, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 5950, "raw_average_key_size": 18, "raw_value_size": 301121, "raw_average_value_size": 949, "num_data_blocks": 23, "num_entries": 317, "num_filter_entries": 317, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759407458, "oldest_key_time": 1759407458, "file_creation_time": 1759407475, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 6386 microseconds, and 3126 cpu microseconds.
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.769458) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 308298 bytes OK
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.769491) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.771847) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.771862) EVENT_LOG_v1 {"time_micros": 1759407475771858, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.771883) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 465313, prev total WAL file size 465313, number of live WAL files 2.
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.772430) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(301KB)], [72(10MB)]
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407475772486, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 11712480, "oldest_snapshot_seqno": -1}
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6153 keys, 9867203 bytes, temperature: kUnknown
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407475838943, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 9867203, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9825029, "index_size": 25680, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15429, "raw_key_size": 158998, "raw_average_key_size": 25, "raw_value_size": 9713456, "raw_average_value_size": 1578, "num_data_blocks": 1023, "num_entries": 6153, "num_filter_entries": 6153, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759407475, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.839169) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 9867203 bytes
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.840794) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 176.1 rd, 148.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 10.9 +0.0 blob) out(9.4 +0.0 blob), read-write-amplify(70.0) write-amplify(32.0) OK, records in: 6663, records dropped: 510 output_compression: NoCompression
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.840809) EVENT_LOG_v1 {"time_micros": 1759407475840801, "job": 44, "event": "compaction_finished", "compaction_time_micros": 66523, "compaction_time_cpu_micros": 42692, "output_level": 6, "num_output_files": 1, "total_output_size": 9867203, "num_input_records": 6663, "num_output_records": 6153, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407475840955, "job": 44, "event": "table_file_deletion", "file_number": 74}
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407475842755, "job": 44, "event": "table_file_deletion", "file_number": 72}
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.772265) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.842802) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.842807) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.842809) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.842811) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:17:55 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:17:55.842813) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:17:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:17:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:56.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:17:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:56.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:17:57 np0005465988 nova_compute[236126]: 2025-10-02 12:17:57.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:58 np0005465988 nova_compute[236126]: 2025-10-02 12:17:58.505 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Updating instance_info_cache with network_info: [{"id": "576bdab0-26cd-4663-8dd5-149075e0d45d", "address": "fa:16:3e:5d:e9:df", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap576bdab0-26", "ovs_interfaceid": "576bdab0-26cd-4663-8dd5-149075e0d45d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:17:58.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:17:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:17:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:17:58.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:17:59 np0005465988 nova_compute[236126]: 2025-10-02 12:17:59.492 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:59 np0005465988 nova_compute[236126]: 2025-10-02 12:17:59.492 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:17:59 np0005465988 podman[267666]: 2025-10-02 12:17:59.558118055 +0000 UTC m=+0.080469633 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:17:59 np0005465988 podman[267667]: 2025-10-02 12:17:59.564228953 +0000 UTC m=+0.075181309 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:17:59 np0005465988 podman[267665]: 2025-10-02 12:17:59.596642007 +0000 UTC m=+0.115597216 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:17:59 np0005465988 nova_compute[236126]: 2025-10-02 12:17:59.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:00.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:00.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:18:02 np0005465988 nova_compute[236126]: 2025-10-02 12:18:02.487 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:02 np0005465988 nova_compute[236126]: 2025-10-02 12:18:02.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:02.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:02.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:04 np0005465988 nova_compute[236126]: 2025-10-02 12:18:04.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:04.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:04.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:06.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:18:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:18:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:18:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:06.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:18:07 np0005465988 nova_compute[236126]: 2025-10-02 12:18:07.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:08.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:08.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:09 np0005465988 nova_compute[236126]: 2025-10-02 12:18:09.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:10.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:10.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:18:12 np0005465988 nova_compute[236126]: 2025-10-02 12:18:12.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:12.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:12.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:13 np0005465988 podman[267865]: 2025-10-02 12:18:13.547476304 +0000 UTC m=+0.079754343 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  2 08:18:14 np0005465988 nova_compute[236126]: 2025-10-02 12:18:14.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:14.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:14.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:18:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:18:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:16.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:16.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:18:17 np0005465988 nova_compute[236126]: 2025-10-02 12:18:17.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:18.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:18.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:19 np0005465988 nova_compute[236126]: 2025-10-02 12:18:19.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:18:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:20.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:18:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:21.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:18:22 np0005465988 nova_compute[236126]: 2025-10-02 12:18:22.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:22 np0005465988 nova_compute[236126]: 2025-10-02 12:18:22.811 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Acquiring lock "7578f6f0-7071-49b7-978d-18b295ee6504" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:22 np0005465988 nova_compute[236126]: 2025-10-02 12:18:22.812 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:22 np0005465988 nova_compute[236126]: 2025-10-02 12:18:22.849 2 DEBUG nova.compute.manager [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:18:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:22.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:22 np0005465988 nova_compute[236126]: 2025-10-02 12:18:22.972 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:22 np0005465988 nova_compute[236126]: 2025-10-02 12:18:22.973 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:22 np0005465988 nova_compute[236126]: 2025-10-02 12:18:22.983 2 DEBUG nova.virt.hardware [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:18:22 np0005465988 nova_compute[236126]: 2025-10-02 12:18:22.984 2 INFO nova.compute.claims [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:18:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:23.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.154 2 DEBUG oslo_concurrency.processutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:18:23 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1481952339' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.613 2 DEBUG oslo_concurrency.processutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.621 2 DEBUG nova.compute.provider_tree [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.648 2 DEBUG nova.scheduler.client.report [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.678 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.705s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.680 2 DEBUG nova.compute.manager [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.733 2 DEBUG nova.compute.manager [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.734 2 DEBUG nova.network.neutron [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.765 2 INFO nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.790 2 DEBUG nova.compute.manager [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.918 2 DEBUG nova.compute.manager [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.920 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.920 2 INFO nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Creating image(s)#033[00m
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.958 2 DEBUG nova.storage.rbd_utils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] rbd image 7578f6f0-7071-49b7-978d-18b295ee6504_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:18:23 np0005465988 nova_compute[236126]: 2025-10-02 12:18:23.986 2 DEBUG nova.storage.rbd_utils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] rbd image 7578f6f0-7071-49b7-978d-18b295ee6504_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.014 2 DEBUG nova.storage.rbd_utils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] rbd image 7578f6f0-7071-49b7-978d-18b295ee6504_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.019 2 DEBUG oslo_concurrency.processutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.068 2 DEBUG nova.policy [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cc0b544e9ae24eb3b638e87ee4428677', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '471de2e25ff442629af9380f2f732266', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.104 2 DEBUG oslo_concurrency.processutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.105 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.106 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.106 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.138 2 DEBUG nova.storage.rbd_utils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] rbd image 7578f6f0-7071-49b7-978d-18b295ee6504_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.147 2 DEBUG oslo_concurrency.processutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 7578f6f0-7071-49b7-978d-18b295ee6504_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.588 2 DEBUG oslo_concurrency.processutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 7578f6f0-7071-49b7-978d-18b295ee6504_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.699 2 DEBUG nova.storage.rbd_utils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] resizing rbd image 7578f6f0-7071-49b7-978d-18b295ee6504_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.877 2 DEBUG nova.objects.instance [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lazy-loading 'migration_context' on Instance uuid 7578f6f0-7071-49b7-978d-18b295ee6504 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.897 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.898 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Ensure instance console log exists: /var/lib/nova/instances/7578f6f0-7071-49b7-978d-18b295ee6504/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.898 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.898 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:24 np0005465988 nova_compute[236126]: 2025-10-02 12:18:24.899 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:24.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:25.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:25 np0005465988 nova_compute[236126]: 2025-10-02 12:18:25.147 2 DEBUG nova.network.neutron [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Successfully created port: 1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:18:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:18:25 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2452682330' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:18:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:18:25 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2452682330' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:18:26 np0005465988 nova_compute[236126]: 2025-10-02 12:18:26.502 2 DEBUG nova.network.neutron [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Successfully updated port: 1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:18:26 np0005465988 nova_compute[236126]: 2025-10-02 12:18:26.531 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Acquiring lock "refresh_cache-7578f6f0-7071-49b7-978d-18b295ee6504" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:26 np0005465988 nova_compute[236126]: 2025-10-02 12:18:26.531 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Acquired lock "refresh_cache-7578f6f0-7071-49b7-978d-18b295ee6504" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:26 np0005465988 nova_compute[236126]: 2025-10-02 12:18:26.532 2 DEBUG nova.network.neutron [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:18:26 np0005465988 nova_compute[236126]: 2025-10-02 12:18:26.624 2 DEBUG nova.compute.manager [req-ad07e478-7d12-4296-bc62-e77d4d87f3ac req-27427414-98e9-497a-897b-4e5b31e85dda d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Received event network-changed-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:26 np0005465988 nova_compute[236126]: 2025-10-02 12:18:26.625 2 DEBUG nova.compute.manager [req-ad07e478-7d12-4296-bc62-e77d4d87f3ac req-27427414-98e9-497a-897b-4e5b31e85dda d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Refreshing instance network info cache due to event network-changed-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:18:26 np0005465988 nova_compute[236126]: 2025-10-02 12:18:26.626 2 DEBUG oslo_concurrency.lockutils [req-ad07e478-7d12-4296-bc62-e77d4d87f3ac req-27427414-98e9-497a-897b-4e5b31e85dda d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-7578f6f0-7071-49b7-978d-18b295ee6504" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:26 np0005465988 nova_compute[236126]: 2025-10-02 12:18:26.726 2 DEBUG nova.network.neutron [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:18:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:26.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:27.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:18:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:27.345 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:27.345 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:27.346 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.812 2 DEBUG nova.network.neutron [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Updating instance_info_cache with network_info: [{"id": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "address": "fa:16:3e:1f:8d:6a", "network": {"id": "920d3a27-b1de-4488-b5a8-5e4b191c002f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1569617558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "471de2e25ff442629af9380f2f732266", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a030840-9c", "ovs_interfaceid": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.865 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Releasing lock "refresh_cache-7578f6f0-7071-49b7-978d-18b295ee6504" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.866 2 DEBUG nova.compute.manager [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Instance network_info: |[{"id": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "address": "fa:16:3e:1f:8d:6a", "network": {"id": "920d3a27-b1de-4488-b5a8-5e4b191c002f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1569617558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "471de2e25ff442629af9380f2f732266", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a030840-9c", "ovs_interfaceid": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.867 2 DEBUG oslo_concurrency.lockutils [req-ad07e478-7d12-4296-bc62-e77d4d87f3ac req-27427414-98e9-497a-897b-4e5b31e85dda d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-7578f6f0-7071-49b7-978d-18b295ee6504" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.867 2 DEBUG nova.network.neutron [req-ad07e478-7d12-4296-bc62-e77d4d87f3ac req-27427414-98e9-497a-897b-4e5b31e85dda d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Refreshing network info cache for port 1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.871 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Start _get_guest_xml network_info=[{"id": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "address": "fa:16:3e:1f:8d:6a", "network": {"id": "920d3a27-b1de-4488-b5a8-5e4b191c002f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1569617558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "471de2e25ff442629af9380f2f732266", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a030840-9c", "ovs_interfaceid": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.876 2 WARNING nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.882 2 DEBUG nova.virt.libvirt.host [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.883 2 DEBUG nova.virt.libvirt.host [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.890 2 DEBUG nova.virt.libvirt.host [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.892 2 DEBUG nova.virt.libvirt.host [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.894 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.894 2 DEBUG nova.virt.hardware [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.895 2 DEBUG nova.virt.hardware [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.896 2 DEBUG nova.virt.hardware [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.896 2 DEBUG nova.virt.hardware [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.897 2 DEBUG nova.virt.hardware [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.897 2 DEBUG nova.virt.hardware [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.898 2 DEBUG nova.virt.hardware [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.898 2 DEBUG nova.virt.hardware [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.899 2 DEBUG nova.virt.hardware [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.899 2 DEBUG nova.virt.hardware [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.900 2 DEBUG nova.virt.hardware [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:18:27 np0005465988 nova_compute[236126]: 2025-10-02 12:18:27.904 2 DEBUG oslo_concurrency.processutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:18:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3420369335' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.315 2 DEBUG oslo_concurrency.processutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.342 2 DEBUG nova.storage.rbd_utils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] rbd image 7578f6f0-7071-49b7-978d-18b295ee6504_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.346 2 DEBUG oslo_concurrency.processutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:28.525 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:28.527 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:18:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:18:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3055176312' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.766 2 DEBUG oslo_concurrency.processutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.767 2 DEBUG nova.virt.libvirt.vif [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=77,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOPzNvArjDwB5GGouXJpK0Bsm1t83Q+rifkdytktL5z4u+NuQhVZvcYvNFKqBRjaWTLV7SHp3bRWSwSZCcOS2SZN7nTs9gaJAdyxMArsETFuEpdtp7W8CwM3ARCJqinHLw==',key_name='tempest-keypair-1055754341',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='471de2e25ff442629af9380f2f732266',ramdisk_id='',reservation_id='r-03xhdtzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-2097430904',owner_user_name='tempest-ServersTestFqdnHostnames-2097430904-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cc0b544e9ae24eb3b638e87ee4428677',uuid=7578f6f0-7071-49b7-978d-18b295ee6504,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "address": "fa:16:3e:1f:8d:6a", "network": {"id": "920d3a27-b1de-4488-b5a8-5e4b191c002f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1569617558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "471de2e25ff442629af9380f2f732266", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a030840-9c", "ovs_interfaceid": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.768 2 DEBUG nova.network.os_vif_util [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Converting VIF {"id": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "address": "fa:16:3e:1f:8d:6a", "network": {"id": "920d3a27-b1de-4488-b5a8-5e4b191c002f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1569617558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "471de2e25ff442629af9380f2f732266", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a030840-9c", "ovs_interfaceid": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.768 2 DEBUG nova.network.os_vif_util [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:8d:6a,bridge_name='br-int',has_traffic_filtering=True,id=1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb,network=Network(920d3a27-b1de-4488-b5a8-5e4b191c002f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a030840-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.770 2 DEBUG nova.objects.instance [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7578f6f0-7071-49b7-978d-18b295ee6504 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.796 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  <uuid>7578f6f0-7071-49b7-978d-18b295ee6504</uuid>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  <name>instance-0000004d</name>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <nova:name>guest-instance-1.domain.com</nova:name>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:18:27</nova:creationTime>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <nova:user uuid="cc0b544e9ae24eb3b638e87ee4428677">tempest-ServersTestFqdnHostnames-2097430904-project-member</nova:user>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <nova:project uuid="471de2e25ff442629af9380f2f732266">tempest-ServersTestFqdnHostnames-2097430904</nova:project>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <nova:port uuid="1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <entry name="serial">7578f6f0-7071-49b7-978d-18b295ee6504</entry>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <entry name="uuid">7578f6f0-7071-49b7-978d-18b295ee6504</entry>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/7578f6f0-7071-49b7-978d-18b295ee6504_disk">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/7578f6f0-7071-49b7-978d-18b295ee6504_disk.config">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:1f:8d:6a"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <target dev="tap1a030840-9c"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/7578f6f0-7071-49b7-978d-18b295ee6504/console.log" append="off"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:18:28 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:18:28 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:18:28 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:18:28 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.797 2 DEBUG nova.compute.manager [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Preparing to wait for external event network-vif-plugged-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.798 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Acquiring lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.798 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.799 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.801 2 DEBUG nova.virt.libvirt.vif [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=77,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOPzNvArjDwB5GGouXJpK0Bsm1t83Q+rifkdytktL5z4u+NuQhVZvcYvNFKqBRjaWTLV7SHp3bRWSwSZCcOS2SZN7nTs9gaJAdyxMArsETFuEpdtp7W8CwM3ARCJqinHLw==',key_name='tempest-keypair-1055754341',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='471de2e25ff442629af9380f2f732266',ramdisk_id='',reservation_id='r-03xhdtzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-2097430904',owner_user_name='tempest-ServersTestFqdnHostnames-2097430904-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cc0b544e9ae24eb3b638e87ee4428677',uuid=7578f6f0-7071-49b7-978d-18b295ee6504,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "address": "fa:16:3e:1f:8d:6a", "network": {"id": "920d3a27-b1de-4488-b5a8-5e4b191c002f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1569617558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "471de2e25ff442629af9380f2f732266", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a030840-9c", "ovs_interfaceid": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.801 2 DEBUG nova.network.os_vif_util [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Converting VIF {"id": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "address": "fa:16:3e:1f:8d:6a", "network": {"id": "920d3a27-b1de-4488-b5a8-5e4b191c002f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1569617558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "471de2e25ff442629af9380f2f732266", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a030840-9c", "ovs_interfaceid": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.803 2 DEBUG nova.network.os_vif_util [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:8d:6a,bridge_name='br-int',has_traffic_filtering=True,id=1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb,network=Network(920d3a27-b1de-4488-b5a8-5e4b191c002f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a030840-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.803 2 DEBUG os_vif [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:8d:6a,bridge_name='br-int',has_traffic_filtering=True,id=1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb,network=Network(920d3a27-b1de-4488-b5a8-5e4b191c002f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a030840-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.805 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.806 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.811 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a030840-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.812 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1a030840-9c, col_values=(('external_ids', {'iface-id': '1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:8d:6a', 'vm-uuid': '7578f6f0-7071-49b7-978d-18b295ee6504'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:28 np0005465988 NetworkManager[45041]: <info>  [1759407508.8153] manager: (tap1a030840-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.823 2 INFO os_vif [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:8d:6a,bridge_name='br-int',has_traffic_filtering=True,id=1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb,network=Network(920d3a27-b1de-4488-b5a8-5e4b191c002f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a030840-9c')#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.894 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.894 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.894 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] No VIF found with MAC fa:16:3e:1f:8d:6a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.895 2 INFO nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Using config drive#033[00m
Oct  2 08:18:28 np0005465988 nova_compute[236126]: 2025-10-02 12:18:28.923 2 DEBUG nova.storage.rbd_utils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] rbd image 7578f6f0-7071-49b7-978d-18b295ee6504_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:18:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:28.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:29.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:29 np0005465988 nova_compute[236126]: 2025-10-02 12:18:29.817 2 INFO nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Creating config drive at /var/lib/nova/instances/7578f6f0-7071-49b7-978d-18b295ee6504/disk.config#033[00m
Oct  2 08:18:29 np0005465988 nova_compute[236126]: 2025-10-02 12:18:29.823 2 DEBUG oslo_concurrency.processutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7578f6f0-7071-49b7-978d-18b295ee6504/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8ldz4gid execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:29 np0005465988 nova_compute[236126]: 2025-10-02 12:18:29.969 2 DEBUG oslo_concurrency.processutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7578f6f0-7071-49b7-978d-18b295ee6504/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8ldz4gid" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:30 np0005465988 nova_compute[236126]: 2025-10-02 12:18:30.045 2 DEBUG nova.storage.rbd_utils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] rbd image 7578f6f0-7071-49b7-978d-18b295ee6504_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:18:30 np0005465988 nova_compute[236126]: 2025-10-02 12:18:30.049 2 DEBUG oslo_concurrency.processutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7578f6f0-7071-49b7-978d-18b295ee6504/disk.config 7578f6f0-7071-49b7-978d-18b295ee6504_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:30 np0005465988 nova_compute[236126]: 2025-10-02 12:18:30.498 2 DEBUG nova.network.neutron [req-ad07e478-7d12-4296-bc62-e77d4d87f3ac req-27427414-98e9-497a-897b-4e5b31e85dda d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Updated VIF entry in instance network info cache for port 1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:18:30 np0005465988 nova_compute[236126]: 2025-10-02 12:18:30.500 2 DEBUG nova.network.neutron [req-ad07e478-7d12-4296-bc62-e77d4d87f3ac req-27427414-98e9-497a-897b-4e5b31e85dda d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Updating instance_info_cache with network_info: [{"id": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "address": "fa:16:3e:1f:8d:6a", "network": {"id": "920d3a27-b1de-4488-b5a8-5e4b191c002f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1569617558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "471de2e25ff442629af9380f2f732266", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a030840-9c", "ovs_interfaceid": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:30 np0005465988 nova_compute[236126]: 2025-10-02 12:18:30.559 2 DEBUG oslo_concurrency.lockutils [req-ad07e478-7d12-4296-bc62-e77d4d87f3ac req-27427414-98e9-497a-897b-4e5b31e85dda d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-7578f6f0-7071-49b7-978d-18b295ee6504" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:30 np0005465988 podman[268304]: 2025-10-02 12:18:30.561298182 +0000 UTC m=+0.085359986 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct  2 08:18:30 np0005465988 podman[268303]: 2025-10-02 12:18:30.572530979 +0000 UTC m=+0.092369110 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:18:30 np0005465988 podman[268302]: 2025-10-02 12:18:30.605587671 +0000 UTC m=+0.132842177 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Oct  2 08:18:30 np0005465988 nova_compute[236126]: 2025-10-02 12:18:30.653 2 DEBUG oslo_concurrency.processutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7578f6f0-7071-49b7-978d-18b295ee6504/disk.config 7578f6f0-7071-49b7-978d-18b295ee6504_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.605s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:30 np0005465988 nova_compute[236126]: 2025-10-02 12:18:30.654 2 INFO nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Deleting local config drive /var/lib/nova/instances/7578f6f0-7071-49b7-978d-18b295ee6504/disk.config because it was imported into RBD.#033[00m
Oct  2 08:18:30 np0005465988 kernel: tap1a030840-9c: entered promiscuous mode
Oct  2 08:18:30 np0005465988 NetworkManager[45041]: <info>  [1759407510.7178] manager: (tap1a030840-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/129)
Oct  2 08:18:30 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:30Z|00238|binding|INFO|Claiming lport 1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb for this chassis.
Oct  2 08:18:30 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:30Z|00239|binding|INFO|1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb: Claiming fa:16:3e:1f:8d:6a 10.100.0.7
Oct  2 08:18:30 np0005465988 nova_compute[236126]: 2025-10-02 12:18:30.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.738 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:8d:6a 10.100.0.7'], port_security=['fa:16:3e:1f:8d:6a 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '7578f6f0-7071-49b7-978d-18b295ee6504', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-920d3a27-b1de-4488-b5a8-5e4b191c002f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '471de2e25ff442629af9380f2f732266', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e43ced7a-abcb-4b60-ad8d-9a684b716ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30574d55-bb55-4fa5-b5be-dcc8a522270a, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.740 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb in datapath 920d3a27-b1de-4488-b5a8-5e4b191c002f bound to our chassis#033[00m
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.742 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 920d3a27-b1de-4488-b5a8-5e4b191c002f#033[00m
Oct  2 08:18:30 np0005465988 systemd-udevd[268384]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:18:30 np0005465988 systemd-machined[192594]: New machine qemu-29-instance-0000004d.
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.757 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e19a0b-cfd2-40d3-a1d7-9cfa352f31ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.758 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap920d3a27-b1 in ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.760 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap920d3a27-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.760 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[87fdc478-c39c-4a8d-888b-063bbd8468fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.761 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dba509bc-d201-45da-9d39-8a1d28dced74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:30 np0005465988 NetworkManager[45041]: <info>  [1759407510.7685] device (tap1a030840-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:18:30 np0005465988 systemd[1]: Started Virtual Machine qemu-29-instance-0000004d.
Oct  2 08:18:30 np0005465988 NetworkManager[45041]: <info>  [1759407510.7701] device (tap1a030840-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.777 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[b6ccab42-2084-489c-b400-069c379bfae4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.807 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[15de498a-0949-4456-9174-5f84cab1c87f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:30 np0005465988 nova_compute[236126]: 2025-10-02 12:18:30.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:30 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:30Z|00240|binding|INFO|Setting lport 1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb ovn-installed in OVS
Oct  2 08:18:30 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:30Z|00241|binding|INFO|Setting lport 1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb up in Southbound
Oct  2 08:18:30 np0005465988 nova_compute[236126]: 2025-10-02 12:18:30.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.846 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f8e9755e-942b-4247-b9ef-0f02b557a6e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.853 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f35e7ed3-7fb5-47c3-ba1a-8487c85c4c3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:30 np0005465988 systemd-udevd[268388]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:18:30 np0005465988 NetworkManager[45041]: <info>  [1759407510.8551] manager: (tap920d3a27-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/130)
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.899 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f02b6bc4-3568-4d33-9d2b-f0a054b7045a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.903 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2b6ddf-cd96-46b4-a9ee-3187f54e3042]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:30 np0005465988 NetworkManager[45041]: <info>  [1759407510.9320] device (tap920d3a27-b0): carrier: link connected
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.941 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[dbd4b025-af0a-483f-90dd-195b508db565]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.963 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[233aeb77-ac84-4493-8c63-219603d63f38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap920d3a27-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:70:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557126, 'reachable_time': 33440, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268418, 'error': None, 'target': 'ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:30.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:30.975 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[04530509-8637-4d3a-9b18-a3ec2243c94e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:70f7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 557126, 'tstamp': 557126}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 268419, 'error': None, 'target': 'ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:31.000 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[03eeb16c-4cdf-48ef-8e2a-bd50c5a8e861]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap920d3a27-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:70:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557126, 'reachable_time': 33440, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 268420, 'error': None, 'target': 'ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:31.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:31.034 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7b08f9b9-7d60-4a2e-860d-490a1bd9d969]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:31 np0005465988 nova_compute[236126]: 2025-10-02 12:18:31.045 2 DEBUG nova.compute.manager [req-fba4b85b-3c06-4262-a19d-91b75de6cde4 req-d69f5cd2-f2d2-4f91-b9a1-1b04fae76ec7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Received event network-vif-plugged-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:31 np0005465988 nova_compute[236126]: 2025-10-02 12:18:31.045 2 DEBUG oslo_concurrency.lockutils [req-fba4b85b-3c06-4262-a19d-91b75de6cde4 req-d69f5cd2-f2d2-4f91-b9a1-1b04fae76ec7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:31 np0005465988 nova_compute[236126]: 2025-10-02 12:18:31.045 2 DEBUG oslo_concurrency.lockutils [req-fba4b85b-3c06-4262-a19d-91b75de6cde4 req-d69f5cd2-f2d2-4f91-b9a1-1b04fae76ec7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:31 np0005465988 nova_compute[236126]: 2025-10-02 12:18:31.046 2 DEBUG oslo_concurrency.lockutils [req-fba4b85b-3c06-4262-a19d-91b75de6cde4 req-d69f5cd2-f2d2-4f91-b9a1-1b04fae76ec7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:31 np0005465988 nova_compute[236126]: 2025-10-02 12:18:31.046 2 DEBUG nova.compute.manager [req-fba4b85b-3c06-4262-a19d-91b75de6cde4 req-d69f5cd2-f2d2-4f91-b9a1-1b04fae76ec7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Processing event network-vif-plugged-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:31.112 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e41c870e-f762-4caa-b0a2-173e1cbbe00f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:31.114 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap920d3a27-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:31.114 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:31.115 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap920d3a27-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:31 np0005465988 nova_compute[236126]: 2025-10-02 12:18:31.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:31 np0005465988 NetworkManager[45041]: <info>  [1759407511.1179] manager: (tap920d3a27-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Oct  2 08:18:31 np0005465988 kernel: tap920d3a27-b0: entered promiscuous mode
Oct  2 08:18:31 np0005465988 nova_compute[236126]: 2025-10-02 12:18:31.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:31.122 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap920d3a27-b0, col_values=(('external_ids', {'iface-id': '1d451e2b-54e7-4816-a195-e1cd79650f9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:31 np0005465988 nova_compute[236126]: 2025-10-02 12:18:31.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:31Z|00242|binding|INFO|Releasing lport 1d451e2b-54e7-4816-a195-e1cd79650f9c from this chassis (sb_readonly=0)
Oct  2 08:18:31 np0005465988 nova_compute[236126]: 2025-10-02 12:18:31.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:31.155 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/920d3a27-b1de-4488-b5a8-5e4b191c002f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/920d3a27-b1de-4488-b5a8-5e4b191c002f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:31.156 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b100e816-eabe-45b5-b099-541c2fac8f6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:31.157 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-920d3a27-b1de-4488-b5a8-5e4b191c002f
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/920d3a27-b1de-4488-b5a8-5e4b191c002f.pid.haproxy
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 920d3a27-b1de-4488-b5a8-5e4b191c002f
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:18:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:31.158 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f', 'env', 'PROCESS_TAG=haproxy-920d3a27-b1de-4488-b5a8-5e4b191c002f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/920d3a27-b1de-4488-b5a8-5e4b191c002f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:18:31 np0005465988 podman[268486]: 2025-10-02 12:18:31.588654265 +0000 UTC m=+0.068557377 container create 37c8d5be9872c4b0561ea0bbd1be71f5912759420aedb2342a0a486863627e62 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS)
Oct  2 08:18:31 np0005465988 systemd[1]: Started libpod-conmon-37c8d5be9872c4b0561ea0bbd1be71f5912759420aedb2342a0a486863627e62.scope.
Oct  2 08:18:31 np0005465988 podman[268486]: 2025-10-02 12:18:31.555021406 +0000 UTC m=+0.034924498 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:18:31 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:18:31 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e88f3d5468a8fef3f66a90e8efe63efa8cb7c26907f6726b500e185a879764d4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:18:31 np0005465988 podman[268486]: 2025-10-02 12:18:31.703493347 +0000 UTC m=+0.183396429 container init 37c8d5be9872c4b0561ea0bbd1be71f5912759420aedb2342a0a486863627e62 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  2 08:18:31 np0005465988 podman[268486]: 2025-10-02 12:18:31.711548912 +0000 UTC m=+0.191451984 container start 37c8d5be9872c4b0561ea0bbd1be71f5912759420aedb2342a0a486863627e62 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:18:31 np0005465988 neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f[268507]: [NOTICE]   (268511) : New worker (268513) forked
Oct  2 08:18:31 np0005465988 neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f[268507]: [NOTICE]   (268511) : Loading success.
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.031 2 DEBUG nova.compute.manager [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.033 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407512.0307436, 7578f6f0-7071-49b7-978d-18b295ee6504 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.033 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] VM Started (Lifecycle Event)#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.038 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.043 2 INFO nova.virt.libvirt.driver [-] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Instance spawned successfully.#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.043 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.076 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.084 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.090 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.090 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.091 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.092 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.092 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.093 2 DEBUG nova.virt.libvirt.driver [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.133 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.134 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407512.0310912, 7578f6f0-7071-49b7-978d-18b295ee6504 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.134 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.197 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.204 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407512.036494, 7578f6f0-7071-49b7-978d-18b295ee6504 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.204 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.213 2 INFO nova.compute.manager [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Took 8.29 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.214 2 DEBUG nova.compute.manager [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.230 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.235 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.262 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.305 2 INFO nova.compute.manager [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Took 9.38 seconds to build instance.#033[00m
Oct  2 08:18:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.323 2 DEBUG oslo_concurrency.lockutils [None req-82e78909-c414-4e38-a768-30cccae6174b cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.511s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:32.531 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:32 np0005465988 nova_compute[236126]: 2025-10-02 12:18:32.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:32.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:33.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:33 np0005465988 nova_compute[236126]: 2025-10-02 12:18:33.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:34 np0005465988 nova_compute[236126]: 2025-10-02 12:18:34.001 2 DEBUG nova.compute.manager [req-6c007cd1-b3a2-4d57-92fd-3d45ab910685 req-51d635df-a43e-4f18-a494-f4040b9278ad d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Received event network-vif-plugged-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:18:34 np0005465988 nova_compute[236126]: 2025-10-02 12:18:34.002 2 DEBUG oslo_concurrency.lockutils [req-6c007cd1-b3a2-4d57-92fd-3d45ab910685 req-51d635df-a43e-4f18-a494-f4040b9278ad d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:18:34 np0005465988 nova_compute[236126]: 2025-10-02 12:18:34.003 2 DEBUG oslo_concurrency.lockutils [req-6c007cd1-b3a2-4d57-92fd-3d45ab910685 req-51d635df-a43e-4f18-a494-f4040b9278ad d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:18:34 np0005465988 nova_compute[236126]: 2025-10-02 12:18:34.004 2 DEBUG oslo_concurrency.lockutils [req-6c007cd1-b3a2-4d57-92fd-3d45ab910685 req-51d635df-a43e-4f18-a494-f4040b9278ad d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:18:34 np0005465988 nova_compute[236126]: 2025-10-02 12:18:34.004 2 DEBUG nova.compute.manager [req-6c007cd1-b3a2-4d57-92fd-3d45ab910685 req-51d635df-a43e-4f18-a494-f4040b9278ad d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] No waiting events found dispatching network-vif-plugged-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:18:34 np0005465988 nova_compute[236126]: 2025-10-02 12:18:34.005 2 WARNING nova.compute.manager [req-6c007cd1-b3a2-4d57-92fd-3d45ab910685 req-51d635df-a43e-4f18-a494-f4040b9278ad d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Received unexpected event network-vif-plugged-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb for instance with vm_state active and task_state None.
Oct  2 08:18:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:34.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:35.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:36 np0005465988 NetworkManager[45041]: <info>  [1759407516.4824] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Oct  2 08:18:36 np0005465988 NetworkManager[45041]: <info>  [1759407516.4836] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/133)
Oct  2 08:18:36 np0005465988 nova_compute[236126]: 2025-10-02 12:18:36.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:36 np0005465988 nova_compute[236126]: 2025-10-02 12:18:36.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:36Z|00243|binding|INFO|Releasing lport 1d451e2b-54e7-4816-a195-e1cd79650f9c from this chassis (sb_readonly=0)
Oct  2 08:18:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:36Z|00244|binding|INFO|Releasing lport 421cd6e3-75aa-44e1-b552-d119c4fcd629 from this chassis (sb_readonly=0)
Oct  2 08:18:36 np0005465988 nova_compute[236126]: 2025-10-02 12:18:36.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:36.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:37.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:18:37 np0005465988 nova_compute[236126]: 2025-10-02 12:18:37.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:37 np0005465988 nova_compute[236126]: 2025-10-02 12:18:37.815 2 DEBUG nova.compute.manager [req-e37b98ff-cfcc-443a-9a40-dd2e7dd6366e req-1d4c62a3-033d-441d-9eb2-585b9cfb3e35 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Received event network-changed-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:18:37 np0005465988 nova_compute[236126]: 2025-10-02 12:18:37.817 2 DEBUG nova.compute.manager [req-e37b98ff-cfcc-443a-9a40-dd2e7dd6366e req-1d4c62a3-033d-441d-9eb2-585b9cfb3e35 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Refreshing instance network info cache due to event network-changed-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:18:37 np0005465988 nova_compute[236126]: 2025-10-02 12:18:37.817 2 DEBUG oslo_concurrency.lockutils [req-e37b98ff-cfcc-443a-9a40-dd2e7dd6366e req-1d4c62a3-033d-441d-9eb2-585b9cfb3e35 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-7578f6f0-7071-49b7-978d-18b295ee6504" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:18:37 np0005465988 nova_compute[236126]: 2025-10-02 12:18:37.818 2 DEBUG oslo_concurrency.lockutils [req-e37b98ff-cfcc-443a-9a40-dd2e7dd6366e req-1d4c62a3-033d-441d-9eb2-585b9cfb3e35 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-7578f6f0-7071-49b7-978d-18b295ee6504" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:18:37 np0005465988 nova_compute[236126]: 2025-10-02 12:18:37.818 2 DEBUG nova.network.neutron [req-e37b98ff-cfcc-443a-9a40-dd2e7dd6366e req-1d4c62a3-033d-441d-9eb2-585b9cfb3e35 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Refreshing network info cache for port 1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:18:38 np0005465988 nova_compute[236126]: 2025-10-02 12:18:38.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:38.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:39 np0005465988 nova_compute[236126]: 2025-10-02 12:18:39.011 2 DEBUG nova.network.neutron [req-e37b98ff-cfcc-443a-9a40-dd2e7dd6366e req-1d4c62a3-033d-441d-9eb2-585b9cfb3e35 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Updated VIF entry in instance network info cache for port 1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:18:39 np0005465988 nova_compute[236126]: 2025-10-02 12:18:39.012 2 DEBUG nova.network.neutron [req-e37b98ff-cfcc-443a-9a40-dd2e7dd6366e req-1d4c62a3-033d-441d-9eb2-585b9cfb3e35 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Updating instance_info_cache with network_info: [{"id": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "address": "fa:16:3e:1f:8d:6a", "network": {"id": "920d3a27-b1de-4488-b5a8-5e4b191c002f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1569617558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "471de2e25ff442629af9380f2f732266", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a030840-9c", "ovs_interfaceid": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:18:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:39.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:39 np0005465988 nova_compute[236126]: 2025-10-02 12:18:39.039 2 DEBUG oslo_concurrency.lockutils [req-e37b98ff-cfcc-443a-9a40-dd2e7dd6366e req-1d4c62a3-033d-441d-9eb2-585b9cfb3e35 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-7578f6f0-7071-49b7-978d-18b295ee6504" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:18:40 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:40Z|00245|binding|INFO|Releasing lport 1d451e2b-54e7-4816-a195-e1cd79650f9c from this chassis (sb_readonly=0)
Oct  2 08:18:40 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:40Z|00246|binding|INFO|Releasing lport 421cd6e3-75aa-44e1-b552-d119c4fcd629 from this chassis (sb_readonly=0)
Oct  2 08:18:40 np0005465988 nova_compute[236126]: 2025-10-02 12:18:40.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:40.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:41.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:18:42 np0005465988 nova_compute[236126]: 2025-10-02 12:18:42.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:18:42 np0005465988 nova_compute[236126]: 2025-10-02 12:18:42.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:18:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:42.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:18:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:43.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:43 np0005465988 nova_compute[236126]: 2025-10-02 12:18:43.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:44 np0005465988 nova_compute[236126]: 2025-10-02 12:18:44.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:18:44 np0005465988 nova_compute[236126]: 2025-10-02 12:18:44.554 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:18:44 np0005465988 nova_compute[236126]: 2025-10-02 12:18:44.555 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:18:44 np0005465988 nova_compute[236126]: 2025-10-02 12:18:44.555 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:18:44 np0005465988 nova_compute[236126]: 2025-10-02 12:18:44.555 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  2 08:18:44 np0005465988 nova_compute[236126]: 2025-10-02 12:18:44.555 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:18:44 np0005465988 podman[268579]: 2025-10-02 12:18:44.589408798 +0000 UTC m=+0.104195613 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:18:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:44.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:45.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:18:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/942433783' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:18:45 np0005465988 nova_compute[236126]: 2025-10-02 12:18:45.053 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:18:45 np0005465988 nova_compute[236126]: 2025-10-02 12:18:45.203 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000004d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Oct  2 08:18:45 np0005465988 nova_compute[236126]: 2025-10-02 12:18:45.204 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000004d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Oct  2 08:18:45 np0005465988 nova_compute[236126]: 2025-10-02 12:18:45.209 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000004b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Oct  2 08:18:45 np0005465988 nova_compute[236126]: 2025-10-02 12:18:45.210 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000004b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Oct  2 08:18:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:45Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1f:8d:6a 10.100.0.7
Oct  2 08:18:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:45Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:8d:6a 10.100.0.7
Oct  2 08:18:45 np0005465988 nova_compute[236126]: 2025-10-02 12:18:45.422 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:18:45 np0005465988 nova_compute[236126]: 2025-10-02 12:18:45.423 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4217MB free_disk=20.92169952392578GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  2 08:18:45 np0005465988 nova_compute[236126]: 2025-10-02 12:18:45.423 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:18:45 np0005465988 nova_compute[236126]: 2025-10-02 12:18:45.424 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:18:45 np0005465988 nova_compute[236126]: 2025-10-02 12:18:45.642 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance bc4239f5-3cf2-4325-803c-73121f7e0ee0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct  2 08:18:45 np0005465988 nova_compute[236126]: 2025-10-02 12:18:45.643 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 7578f6f0-7071-49b7-978d-18b295ee6504 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct  2 08:18:45 np0005465988 nova_compute[236126]: 2025-10-02 12:18:45.643 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 08:18:45 np0005465988 nova_compute[236126]: 2025-10-02 12:18:45.644 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:18:45 np0005465988 nova_compute[236126]: 2025-10-02 12:18:45.826 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:18:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:18:46 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1135563654' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:18:46 np0005465988 nova_compute[236126]: 2025-10-02 12:18:46.332 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:18:46 np0005465988 nova_compute[236126]: 2025-10-02 12:18:46.338 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:18:46 np0005465988 nova_compute[236126]: 2025-10-02 12:18:46.353 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:18:46 np0005465988 nova_compute[236126]: 2025-10-02 12:18:46.373 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:18:46 np0005465988 nova_compute[236126]: 2025-10-02 12:18:46.373 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.949s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:18:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:46.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:47.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:18:47 np0005465988 nova_compute[236126]: 2025-10-02 12:18:47.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:48 np0005465988 nova_compute[236126]: 2025-10-02 12:18:48.374 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:18:48 np0005465988 nova_compute[236126]: 2025-10-02 12:18:48.375 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:18:48 np0005465988 nova_compute[236126]: 2025-10-02 12:18:48.375 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:18:48 np0005465988 nova_compute[236126]: 2025-10-02 12:18:48.376 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:18:48 np0005465988 nova_compute[236126]: 2025-10-02 12:18:48.376 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 08:18:48 np0005465988 nova_compute[236126]: 2025-10-02 12:18:48.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:18:48 np0005465988 nova_compute[236126]: 2025-10-02 12:18:48.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:48.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:49.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:49 np0005465988 nova_compute[236126]: 2025-10-02 12:18:49.382 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "e116d367-5ae9-4ce2-9d33-3936fd3de658" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:49 np0005465988 nova_compute[236126]: 2025-10-02 12:18:49.382 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:49 np0005465988 nova_compute[236126]: 2025-10-02 12:18:49.397 2 DEBUG nova.compute.manager [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:18:49 np0005465988 nova_compute[236126]: 2025-10-02 12:18:49.458 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:49 np0005465988 nova_compute[236126]: 2025-10-02 12:18:49.458 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:49 np0005465988 nova_compute[236126]: 2025-10-02 12:18:49.467 2 DEBUG nova.virt.hardware [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:18:49 np0005465988 nova_compute[236126]: 2025-10-02 12:18:49.467 2 INFO nova.compute.claims [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:18:49 np0005465988 nova_compute[236126]: 2025-10-02 12:18:49.471 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:49 np0005465988 nova_compute[236126]: 2025-10-02 12:18:49.592 2 DEBUG oslo_concurrency.processutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:18:50 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1625404945' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.069 2 DEBUG oslo_concurrency.processutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.074 2 DEBUG nova.compute.provider_tree [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.088 2 DEBUG nova.scheduler.client.report [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.120 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.661s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.120 2 DEBUG nova.compute.manager [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.156 2 DEBUG nova.compute.manager [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.157 2 DEBUG nova.network.neutron [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.178 2 INFO nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.205 2 DEBUG nova.compute.manager [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.317 2 DEBUG nova.compute.manager [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.319 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.319 2 INFO nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Creating image(s)#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.371 2 DEBUG nova.storage.rbd_utils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] rbd image e116d367-5ae9-4ce2-9d33-3936fd3de658_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.405 2 DEBUG nova.storage.rbd_utils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] rbd image e116d367-5ae9-4ce2-9d33-3936fd3de658_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.439 2 DEBUG nova.storage.rbd_utils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] rbd image e116d367-5ae9-4ce2-9d33-3936fd3de658_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.444 2 DEBUG oslo_concurrency.processutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.470 2 DEBUG nova.policy [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '69d8e29c6d3747e98a5985a584f4c814', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8efba404696b40fbbaa6431b934b87f1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.514 2 DEBUG oslo_concurrency.processutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.515 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.515 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.516 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.550 2 DEBUG nova.storage.rbd_utils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] rbd image e116d367-5ae9-4ce2-9d33-3936fd3de658_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:18:50 np0005465988 nova_compute[236126]: 2025-10-02 12:18:50.555 2 DEBUG oslo_concurrency.processutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 e116d367-5ae9-4ce2-9d33-3936fd3de658_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:51.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:51.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:51 np0005465988 nova_compute[236126]: 2025-10-02 12:18:51.174 2 DEBUG oslo_concurrency.processutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 e116d367-5ae9-4ce2-9d33-3936fd3de658_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.619s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:51 np0005465988 nova_compute[236126]: 2025-10-02 12:18:51.268 2 DEBUG nova.storage.rbd_utils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] resizing rbd image e116d367-5ae9-4ce2-9d33-3936fd3de658_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:18:51 np0005465988 nova_compute[236126]: 2025-10-02 12:18:51.393 2 DEBUG nova.objects.instance [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lazy-loading 'migration_context' on Instance uuid e116d367-5ae9-4ce2-9d33-3936fd3de658 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:51 np0005465988 nova_compute[236126]: 2025-10-02 12:18:51.410 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:18:51 np0005465988 nova_compute[236126]: 2025-10-02 12:18:51.410 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Ensure instance console log exists: /var/lib/nova/instances/e116d367-5ae9-4ce2-9d33-3936fd3de658/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:18:51 np0005465988 nova_compute[236126]: 2025-10-02 12:18:51.410 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:51 np0005465988 nova_compute[236126]: 2025-10-02 12:18:51.411 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:51 np0005465988 nova_compute[236126]: 2025-10-02 12:18:51.411 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.340 2 DEBUG nova.network.neutron [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Successfully created port: 0b38303d-0e2e-47e2-84f1-d431f795968b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.385 2 DEBUG oslo_concurrency.lockutils [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Acquiring lock "7578f6f0-7071-49b7-978d-18b295ee6504" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.385 2 DEBUG oslo_concurrency.lockutils [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.386 2 DEBUG oslo_concurrency.lockutils [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Acquiring lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.386 2 DEBUG oslo_concurrency.lockutils [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.387 2 DEBUG oslo_concurrency.lockutils [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.389 2 INFO nova.compute.manager [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Terminating instance#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.390 2 DEBUG nova.compute.manager [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:18:52 np0005465988 kernel: tap1a030840-9c (unregistering): left promiscuous mode
Oct  2 08:18:52 np0005465988 NetworkManager[45041]: <info>  [1759407532.4800] device (tap1a030840-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:18:52 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:52Z|00247|binding|INFO|Releasing lport 1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb from this chassis (sb_readonly=0)
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:52 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:52Z|00248|binding|INFO|Setting lport 1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb down in Southbound
Oct  2 08:18:52 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:52Z|00249|binding|INFO|Removing iface tap1a030840-9c ovn-installed in OVS
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:52.509 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:8d:6a 10.100.0.7'], port_security=['fa:16:3e:1f:8d:6a 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '7578f6f0-7071-49b7-978d-18b295ee6504', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-920d3a27-b1de-4488-b5a8-5e4b191c002f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '471de2e25ff442629af9380f2f732266', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e43ced7a-abcb-4b60-ad8d-9a684b716ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.214'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30574d55-bb55-4fa5-b5be-dcc8a522270a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:52.511 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb in datapath 920d3a27-b1de-4488-b5a8-5e4b191c002f unbound from our chassis#033[00m
Oct  2 08:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:52.513 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 920d3a27-b1de-4488-b5a8-5e4b191c002f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:52.515 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[151090cd-979a-493d-9088-e5bc189c1fd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:52.516 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f namespace which is not needed anymore#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:52 np0005465988 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Oct  2 08:18:52 np0005465988 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000004d.scope: Consumed 14.528s CPU time.
Oct  2 08:18:52 np0005465988 systemd-machined[192594]: Machine qemu-29-instance-0000004d terminated.
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.624 2 INFO nova.virt.libvirt.driver [-] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Instance destroyed successfully.#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.625 2 DEBUG nova.objects.instance [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lazy-loading 'resources' on Instance uuid 7578f6f0-7071-49b7-978d-18b295ee6504 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.640 2 DEBUG nova.virt.libvirt.vif [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=77,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOPzNvArjDwB5GGouXJpK0Bsm1t83Q+rifkdytktL5z4u+NuQhVZvcYvNFKqBRjaWTLV7SHp3bRWSwSZCcOS2SZN7nTs9gaJAdyxMArsETFuEpdtp7W8CwM3ARCJqinHLw==',key_name='tempest-keypair-1055754341',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:18:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='471de2e25ff442629af9380f2f732266',ramdisk_id='',reservation_id='r-03xhdtzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestFqdnHostnames-2097430904',owner_user_name='tempest-ServersTestFqdnHostnames-2097430904-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:18:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cc0b544e9ae24eb3b638e87ee4428677',uuid=7578f6f0-7071-49b7-978d-18b295ee6504,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "address": "fa:16:3e:1f:8d:6a", "network": {"id": "920d3a27-b1de-4488-b5a8-5e4b191c002f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1569617558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "471de2e25ff442629af9380f2f732266", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a030840-9c", "ovs_interfaceid": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.640 2 DEBUG nova.network.os_vif_util [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Converting VIF {"id": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "address": "fa:16:3e:1f:8d:6a", "network": {"id": "920d3a27-b1de-4488-b5a8-5e4b191c002f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1569617558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "471de2e25ff442629af9380f2f732266", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a030840-9c", "ovs_interfaceid": "1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.641 2 DEBUG nova.network.os_vif_util [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:8d:6a,bridge_name='br-int',has_traffic_filtering=True,id=1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb,network=Network(920d3a27-b1de-4488-b5a8-5e4b191c002f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a030840-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.641 2 DEBUG os_vif [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:8d:6a,bridge_name='br-int',has_traffic_filtering=True,id=1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb,network=Network(920d3a27-b1de-4488-b5a8-5e4b191c002f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a030840-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.643 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a030840-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.650 2 INFO os_vif [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:8d:6a,bridge_name='br-int',has_traffic_filtering=True,id=1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb,network=Network(920d3a27-b1de-4488-b5a8-5e4b191c002f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a030840-9c')#033[00m
Oct  2 08:18:52 np0005465988 neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f[268507]: [NOTICE]   (268511) : haproxy version is 2.8.14-c23fe91
Oct  2 08:18:52 np0005465988 neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f[268507]: [NOTICE]   (268511) : path to executable is /usr/sbin/haproxy
Oct  2 08:18:52 np0005465988 neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f[268507]: [WARNING]  (268511) : Exiting Master process...
Oct  2 08:18:52 np0005465988 neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f[268507]: [ALERT]    (268511) : Current worker (268513) exited with code 143 (Terminated)
Oct  2 08:18:52 np0005465988 neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f[268507]: [WARNING]  (268511) : All workers exited. Exiting... (0)
Oct  2 08:18:52 np0005465988 systemd[1]: libpod-37c8d5be9872c4b0561ea0bbd1be71f5912759420aedb2342a0a486863627e62.scope: Deactivated successfully.
Oct  2 08:18:52 np0005465988 podman[268860]: 2025-10-02 12:18:52.675427542 +0000 UTC m=+0.057707291 container died 37c8d5be9872c4b0561ea0bbd1be71f5912759420aedb2342a0a486863627e62 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:18:52 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-37c8d5be9872c4b0561ea0bbd1be71f5912759420aedb2342a0a486863627e62-userdata-shm.mount: Deactivated successfully.
Oct  2 08:18:52 np0005465988 systemd[1]: var-lib-containers-storage-overlay-e88f3d5468a8fef3f66a90e8efe63efa8cb7c26907f6726b500e185a879764d4-merged.mount: Deactivated successfully.
Oct  2 08:18:52 np0005465988 podman[268860]: 2025-10-02 12:18:52.72552509 +0000 UTC m=+0.107804839 container cleanup 37c8d5be9872c4b0561ea0bbd1be71f5912759420aedb2342a0a486863627e62 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:18:52 np0005465988 systemd[1]: libpod-conmon-37c8d5be9872c4b0561ea0bbd1be71f5912759420aedb2342a0a486863627e62.scope: Deactivated successfully.
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:52 np0005465988 podman[268914]: 2025-10-02 12:18:52.82306993 +0000 UTC m=+0.072851842 container remove 37c8d5be9872c4b0561ea0bbd1be71f5912759420aedb2342a0a486863627e62 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:52.831 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8b70c553-b46f-4963-bb9f-712623c7b14d]: (4, ('Thu Oct  2 12:18:52 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f (37c8d5be9872c4b0561ea0bbd1be71f5912759420aedb2342a0a486863627e62)\n37c8d5be9872c4b0561ea0bbd1be71f5912759420aedb2342a0a486863627e62\nThu Oct  2 12:18:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f (37c8d5be9872c4b0561ea0bbd1be71f5912759420aedb2342a0a486863627e62)\n37c8d5be9872c4b0561ea0bbd1be71f5912759420aedb2342a0a486863627e62\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:52.833 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9c6df647-e61c-4669-865e-70f052fd5809]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:52.834 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap920d3a27-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:52 np0005465988 kernel: tap920d3a27-b0: left promiscuous mode
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:52.858 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[131678fa-35c4-407b-b8ab-73a6d378f91b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.862 2 DEBUG nova.compute.manager [req-0a54b2f2-2861-446a-89d7-ddcb76510b86 req-bdf915c7-4d58-4fc4-93ff-5ea699ed7fe1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Received event network-vif-unplugged-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.862 2 DEBUG oslo_concurrency.lockutils [req-0a54b2f2-2861-446a-89d7-ddcb76510b86 req-bdf915c7-4d58-4fc4-93ff-5ea699ed7fe1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.863 2 DEBUG oslo_concurrency.lockutils [req-0a54b2f2-2861-446a-89d7-ddcb76510b86 req-bdf915c7-4d58-4fc4-93ff-5ea699ed7fe1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.864 2 DEBUG oslo_concurrency.lockutils [req-0a54b2f2-2861-446a-89d7-ddcb76510b86 req-bdf915c7-4d58-4fc4-93ff-5ea699ed7fe1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.864 2 DEBUG nova.compute.manager [req-0a54b2f2-2861-446a-89d7-ddcb76510b86 req-bdf915c7-4d58-4fc4-93ff-5ea699ed7fe1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] No waiting events found dispatching network-vif-unplugged-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:52 np0005465988 nova_compute[236126]: 2025-10-02 12:18:52.865 2 DEBUG nova.compute.manager [req-0a54b2f2-2861-446a-89d7-ddcb76510b86 req-bdf915c7-4d58-4fc4-93ff-5ea699ed7fe1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Received event network-vif-unplugged-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:52.898 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4da406ac-d6cc-40c5-b0de-7ee3080c0d1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:52.899 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9dd48bd0-fde7-46bf-aa67-083298743e08]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:52.918 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d89007d1-40c9-4661-9a55-9394b188be1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557116, 'reachable_time': 23708, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268929, 'error': None, 'target': 'ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:52.921 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-920d3a27-b1de-4488-b5a8-5e4b191c002f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:52.921 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[2ab8df76-dbc5-48b4-9148-2081bd961d3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:52 np0005465988 systemd[1]: run-netns-ovnmeta\x2d920d3a27\x2db1de\x2d4488\x2db5a8\x2d5e4b191c002f.mount: Deactivated successfully.
Oct  2 08:18:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:53.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:53.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:53 np0005465988 nova_compute[236126]: 2025-10-02 12:18:53.333 2 INFO nova.virt.libvirt.driver [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Deleting instance files /var/lib/nova/instances/7578f6f0-7071-49b7-978d-18b295ee6504_del#033[00m
Oct  2 08:18:53 np0005465988 nova_compute[236126]: 2025-10-02 12:18:53.334 2 INFO nova.virt.libvirt.driver [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Deletion of /var/lib/nova/instances/7578f6f0-7071-49b7-978d-18b295ee6504_del complete#033[00m
Oct  2 08:18:53 np0005465988 nova_compute[236126]: 2025-10-02 12:18:53.398 2 INFO nova.compute.manager [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Took 1.01 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:18:53 np0005465988 nova_compute[236126]: 2025-10-02 12:18:53.399 2 DEBUG oslo.service.loopingcall [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:18:53 np0005465988 nova_compute[236126]: 2025-10-02 12:18:53.399 2 DEBUG nova.compute.manager [-] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:18:53 np0005465988 nova_compute[236126]: 2025-10-02 12:18:53.400 2 DEBUG nova.network.neutron [-] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.095 2 DEBUG nova.network.neutron [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Successfully updated port: 0b38303d-0e2e-47e2-84f1-d431f795968b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.147 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "refresh_cache-e116d367-5ae9-4ce2-9d33-3936fd3de658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.148 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquired lock "refresh_cache-e116d367-5ae9-4ce2-9d33-3936fd3de658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.148 2 DEBUG nova.network.neutron [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.288 2 DEBUG nova.compute.manager [req-26b81797-1c0d-47cf-8dad-6e1e088811db req-1f948bde-f20a-4784-8b37-a71052378e3c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Received event network-changed-0b38303d-0e2e-47e2-84f1-d431f795968b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.289 2 DEBUG nova.compute.manager [req-26b81797-1c0d-47cf-8dad-6e1e088811db req-1f948bde-f20a-4784-8b37-a71052378e3c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Refreshing instance network info cache due to event network-changed-0b38303d-0e2e-47e2-84f1-d431f795968b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.289 2 DEBUG oslo_concurrency.lockutils [req-26b81797-1c0d-47cf-8dad-6e1e088811db req-1f948bde-f20a-4784-8b37-a71052378e3c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-e116d367-5ae9-4ce2-9d33-3936fd3de658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.458 2 DEBUG nova.network.neutron [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.504 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.504 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.779 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.779 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.781 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.782 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bc4239f5-3cf2-4325-803c-73121f7e0ee0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.880 2 DEBUG nova.network.neutron [-] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:54 np0005465988 nova_compute[236126]: 2025-10-02 12:18:54.927 2 INFO nova.compute.manager [-] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Took 1.53 seconds to deallocate network for instance.#033[00m
Oct  2 08:18:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:18:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:55.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.015 2 DEBUG oslo_concurrency.lockutils [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.016 2 DEBUG oslo_concurrency.lockutils [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.033 2 DEBUG nova.compute.manager [req-10cf60dc-f7e0-49e4-b78d-e1e625395ed1 req-1dba071c-e975-4a8f-aa9d-7bf12ddc1fa8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Received event network-vif-plugged-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.033 2 DEBUG oslo_concurrency.lockutils [req-10cf60dc-f7e0-49e4-b78d-e1e625395ed1 req-1dba071c-e975-4a8f-aa9d-7bf12ddc1fa8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.034 2 DEBUG oslo_concurrency.lockutils [req-10cf60dc-f7e0-49e4-b78d-e1e625395ed1 req-1dba071c-e975-4a8f-aa9d-7bf12ddc1fa8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.034 2 DEBUG oslo_concurrency.lockutils [req-10cf60dc-f7e0-49e4-b78d-e1e625395ed1 req-1dba071c-e975-4a8f-aa9d-7bf12ddc1fa8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.034 2 DEBUG nova.compute.manager [req-10cf60dc-f7e0-49e4-b78d-e1e625395ed1 req-1dba071c-e975-4a8f-aa9d-7bf12ddc1fa8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] No waiting events found dispatching network-vif-plugged-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.035 2 WARNING nova.compute.manager [req-10cf60dc-f7e0-49e4-b78d-e1e625395ed1 req-1dba071c-e975-4a8f-aa9d-7bf12ddc1fa8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Received unexpected event network-vif-plugged-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:18:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:18:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:55.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.129 2 DEBUG oslo_concurrency.processutils [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:18:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1652796999' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.657 2 DEBUG oslo_concurrency.processutils [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.665 2 DEBUG nova.compute.provider_tree [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.681 2 DEBUG nova.scheduler.client.report [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.708 2 DEBUG oslo_concurrency.lockutils [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.693s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.736 2 INFO nova.scheduler.client.report [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Deleted allocations for instance 7578f6f0-7071-49b7-978d-18b295ee6504#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.743 2 DEBUG nova.network.neutron [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Updating instance_info_cache with network_info: [{"id": "0b38303d-0e2e-47e2-84f1-d431f795968b", "address": "fa:16:3e:62:fc:3e", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b38303d-0e", "ovs_interfaceid": "0b38303d-0e2e-47e2-84f1-d431f795968b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.774 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Releasing lock "refresh_cache-e116d367-5ae9-4ce2-9d33-3936fd3de658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.775 2 DEBUG nova.compute.manager [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Instance network_info: |[{"id": "0b38303d-0e2e-47e2-84f1-d431f795968b", "address": "fa:16:3e:62:fc:3e", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b38303d-0e", "ovs_interfaceid": "0b38303d-0e2e-47e2-84f1-d431f795968b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.775 2 DEBUG oslo_concurrency.lockutils [req-26b81797-1c0d-47cf-8dad-6e1e088811db req-1f948bde-f20a-4784-8b37-a71052378e3c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-e116d367-5ae9-4ce2-9d33-3936fd3de658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.775 2 DEBUG nova.network.neutron [req-26b81797-1c0d-47cf-8dad-6e1e088811db req-1f948bde-f20a-4784-8b37-a71052378e3c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Refreshing network info cache for port 0b38303d-0e2e-47e2-84f1-d431f795968b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.779 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Start _get_guest_xml network_info=[{"id": "0b38303d-0e2e-47e2-84f1-d431f795968b", "address": "fa:16:3e:62:fc:3e", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b38303d-0e", "ovs_interfaceid": "0b38303d-0e2e-47e2-84f1-d431f795968b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.783 2 WARNING nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.788 2 DEBUG nova.virt.libvirt.host [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.788 2 DEBUG nova.virt.libvirt.host [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.792 2 DEBUG nova.virt.libvirt.host [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.793 2 DEBUG nova.virt.libvirt.host [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.794 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.795 2 DEBUG nova.virt.hardware [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.795 2 DEBUG nova.virt.hardware [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.795 2 DEBUG nova.virt.hardware [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.796 2 DEBUG nova.virt.hardware [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.796 2 DEBUG nova.virt.hardware [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.796 2 DEBUG nova.virt.hardware [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.797 2 DEBUG nova.virt.hardware [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.797 2 DEBUG nova.virt.hardware [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.797 2 DEBUG nova.virt.hardware [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.798 2 DEBUG nova.virt.hardware [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.798 2 DEBUG nova.virt.hardware [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.802 2 DEBUG oslo_concurrency.processutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:55 np0005465988 nova_compute[236126]: 2025-10-02 12:18:55.846 2 DEBUG oslo_concurrency.lockutils [None req-cd5a7774-8001-4f88-8f77-04ec0068d874 cc0b544e9ae24eb3b638e87ee4428677 471de2e25ff442629af9380f2f732266 - - default default] Lock "7578f6f0-7071-49b7-978d-18b295ee6504" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:18:56 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2916340495' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.282 2 DEBUG oslo_concurrency.processutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.310 2 DEBUG nova.storage.rbd_utils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] rbd image e116d367-5ae9-4ce2-9d33-3936fd3de658_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.314 2 DEBUG oslo_concurrency.processutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.399 2 DEBUG nova.compute.manager [req-42dc10ee-9029-4fab-8bd0-cbd7f218fb73 req-83ce4754-fa88-4305-8f43-5f51db548a3e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Received event network-vif-deleted-1a030840-9cd2-4f7f-a10b-8ecd5be0d9fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:18:56 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/844289913' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.753 2 DEBUG oslo_concurrency.processutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.754 2 DEBUG nova.virt.libvirt.vif [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-818842927',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-818842927',id=79,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8efba404696b40fbbaa6431b934b87f1',ramdisk_id='',reservation_id='r-t7p63nhq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-153154373',owner_user_name='tempest-ServerBootFromVolumeStableR
escueTest-153154373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:50Z,user_data=None,user_id='69d8e29c6d3747e98a5985a584f4c814',uuid=e116d367-5ae9-4ce2-9d33-3936fd3de658,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b38303d-0e2e-47e2-84f1-d431f795968b", "address": "fa:16:3e:62:fc:3e", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b38303d-0e", "ovs_interfaceid": "0b38303d-0e2e-47e2-84f1-d431f795968b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.754 2 DEBUG nova.network.os_vif_util [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Converting VIF {"id": "0b38303d-0e2e-47e2-84f1-d431f795968b", "address": "fa:16:3e:62:fc:3e", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b38303d-0e", "ovs_interfaceid": "0b38303d-0e2e-47e2-84f1-d431f795968b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.755 2 DEBUG nova.network.os_vif_util [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:fc:3e,bridge_name='br-int',has_traffic_filtering=True,id=0b38303d-0e2e-47e2-84f1-d431f795968b,network=Network(f1725bd8-7d9d-45cc-b992-0cd3db0e30f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b38303d-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.756 2 DEBUG nova.objects.instance [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lazy-loading 'pci_devices' on Instance uuid e116d367-5ae9-4ce2-9d33-3936fd3de658 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.771 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  <uuid>e116d367-5ae9-4ce2-9d33-3936fd3de658</uuid>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  <name>instance-0000004f</name>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerBootFromVolumeStableRescueTest-server-818842927</nova:name>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:18:55</nova:creationTime>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <nova:user uuid="69d8e29c6d3747e98a5985a584f4c814">tempest-ServerBootFromVolumeStableRescueTest-153154373-project-member</nova:user>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <nova:project uuid="8efba404696b40fbbaa6431b934b87f1">tempest-ServerBootFromVolumeStableRescueTest-153154373</nova:project>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <nova:port uuid="0b38303d-0e2e-47e2-84f1-d431f795968b">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <entry name="serial">e116d367-5ae9-4ce2-9d33-3936fd3de658</entry>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <entry name="uuid">e116d367-5ae9-4ce2-9d33-3936fd3de658</entry>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/e116d367-5ae9-4ce2-9d33-3936fd3de658_disk">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/e116d367-5ae9-4ce2-9d33-3936fd3de658_disk.config">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:62:fc:3e"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <target dev="tap0b38303d-0e"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/e116d367-5ae9-4ce2-9d33-3936fd3de658/console.log" append="off"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:18:56 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:18:56 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:18:56 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:18:56 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.772 2 DEBUG nova.compute.manager [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Preparing to wait for external event network-vif-plugged-0b38303d-0e2e-47e2-84f1-d431f795968b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.772 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.773 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.773 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.773 2 DEBUG nova.virt.libvirt.vif [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-818842927',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-818842927',id=79,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8efba404696b40fbbaa6431b934b87f1',ramdisk_id='',reservation_id='r-t7p63nhq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-153154373',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-153154373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:50Z,user_data=None,user_id='69d8e29c6d3747e98a5985a584f4c814',uuid=e116d367-5ae9-4ce2-9d33-3936fd3de658,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b38303d-0e2e-47e2-84f1-d431f795968b", "address": "fa:16:3e:62:fc:3e", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b38303d-0e", "ovs_interfaceid": "0b38303d-0e2e-47e2-84f1-d431f795968b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.774 2 DEBUG nova.network.os_vif_util [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Converting VIF {"id": "0b38303d-0e2e-47e2-84f1-d431f795968b", "address": "fa:16:3e:62:fc:3e", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b38303d-0e", "ovs_interfaceid": "0b38303d-0e2e-47e2-84f1-d431f795968b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.774 2 DEBUG nova.network.os_vif_util [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:fc:3e,bridge_name='br-int',has_traffic_filtering=True,id=0b38303d-0e2e-47e2-84f1-d431f795968b,network=Network(f1725bd8-7d9d-45cc-b992-0cd3db0e30f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b38303d-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.775 2 DEBUG os_vif [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:fc:3e,bridge_name='br-int',has_traffic_filtering=True,id=0b38303d-0e2e-47e2-84f1-d431f795968b,network=Network(f1725bd8-7d9d-45cc-b992-0cd3db0e30f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b38303d-0e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.776 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.776 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.779 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b38303d-0e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.780 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b38303d-0e, col_values=(('external_ids', {'iface-id': '0b38303d-0e2e-47e2-84f1-d431f795968b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:fc:3e', 'vm-uuid': 'e116d367-5ae9-4ce2-9d33-3936fd3de658'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:56 np0005465988 NetworkManager[45041]: <info>  [1759407536.7827] manager: (tap0b38303d-0e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.792 2 INFO os_vif [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:fc:3e,bridge_name='br-int',has_traffic_filtering=True,id=0b38303d-0e2e-47e2-84f1-d431f795968b,network=Network(f1725bd8-7d9d-45cc-b992-0cd3db0e30f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b38303d-0e')#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.870 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.871 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.871 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] No VIF found with MAC fa:16:3e:62:fc:3e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.871 2 INFO nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Using config drive#033[00m
Oct  2 08:18:56 np0005465988 nova_compute[236126]: 2025-10-02 12:18:56.900 2 DEBUG nova.storage.rbd_utils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] rbd image e116d367-5ae9-4ce2-9d33-3936fd3de658_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:18:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:57.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:57.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:57 np0005465988 nova_compute[236126]: 2025-10-02 12:18:57.050 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Updating instance_info_cache with network_info: [{"id": "576bdab0-26cd-4663-8dd5-149075e0d45d", "address": "fa:16:3e:5d:e9:df", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap576bdab0-26", "ovs_interfaceid": "576bdab0-26cd-4663-8dd5-149075e0d45d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:57 np0005465988 nova_compute[236126]: 2025-10-02 12:18:57.071 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:57 np0005465988 nova_compute[236126]: 2025-10-02 12:18:57.071 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:18:57 np0005465988 nova_compute[236126]: 2025-10-02 12:18:57.072 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:57 np0005465988 nova_compute[236126]: 2025-10-02 12:18:57.073 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:18:57 np0005465988 nova_compute[236126]: 2025-10-02 12:18:57.097 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:18:57 np0005465988 nova_compute[236126]: 2025-10-02 12:18:57.098 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:18:57 np0005465988 nova_compute[236126]: 2025-10-02 12:18:57.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:57 np0005465988 nova_compute[236126]: 2025-10-02 12:18:57.946 2 INFO nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Creating config drive at /var/lib/nova/instances/e116d367-5ae9-4ce2-9d33-3936fd3de658/disk.config#033[00m
Oct  2 08:18:57 np0005465988 nova_compute[236126]: 2025-10-02 12:18:57.952 2 DEBUG oslo_concurrency.processutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e116d367-5ae9-4ce2-9d33-3936fd3de658/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyaz2atjv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:58 np0005465988 nova_compute[236126]: 2025-10-02 12:18:58.096 2 DEBUG oslo_concurrency.processutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e116d367-5ae9-4ce2-9d33-3936fd3de658/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyaz2atjv" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:58 np0005465988 nova_compute[236126]: 2025-10-02 12:18:58.129 2 DEBUG nova.storage.rbd_utils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] rbd image e116d367-5ae9-4ce2-9d33-3936fd3de658_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:18:58 np0005465988 nova_compute[236126]: 2025-10-02 12:18:58.133 2 DEBUG oslo_concurrency.processutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e116d367-5ae9-4ce2-9d33-3936fd3de658/disk.config e116d367-5ae9-4ce2-9d33-3936fd3de658_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:58 np0005465988 nova_compute[236126]: 2025-10-02 12:18:58.217 2 DEBUG nova.network.neutron [req-26b81797-1c0d-47cf-8dad-6e1e088811db req-1f948bde-f20a-4784-8b37-a71052378e3c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Updated VIF entry in instance network info cache for port 0b38303d-0e2e-47e2-84f1-d431f795968b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:18:58 np0005465988 nova_compute[236126]: 2025-10-02 12:18:58.218 2 DEBUG nova.network.neutron [req-26b81797-1c0d-47cf-8dad-6e1e088811db req-1f948bde-f20a-4784-8b37-a71052378e3c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Updating instance_info_cache with network_info: [{"id": "0b38303d-0e2e-47e2-84f1-d431f795968b", "address": "fa:16:3e:62:fc:3e", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b38303d-0e", "ovs_interfaceid": "0b38303d-0e2e-47e2-84f1-d431f795968b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:58 np0005465988 nova_compute[236126]: 2025-10-02 12:18:58.233 2 DEBUG oslo_concurrency.lockutils [req-26b81797-1c0d-47cf-8dad-6e1e088811db req-1f948bde-f20a-4784-8b37-a71052378e3c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-e116d367-5ae9-4ce2-9d33-3936fd3de658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:18:59.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:18:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:18:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:18:59.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:18:59 np0005465988 nova_compute[236126]: 2025-10-02 12:18:59.064 2 DEBUG oslo_concurrency.processutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e116d367-5ae9-4ce2-9d33-3936fd3de658/disk.config e116d367-5ae9-4ce2-9d33-3936fd3de658_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.931s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:59 np0005465988 nova_compute[236126]: 2025-10-02 12:18:59.066 2 INFO nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Deleting local config drive /var/lib/nova/instances/e116d367-5ae9-4ce2-9d33-3936fd3de658/disk.config because it was imported into RBD.#033[00m
Oct  2 08:18:59 np0005465988 kernel: tap0b38303d-0e: entered promiscuous mode
Oct  2 08:18:59 np0005465988 NetworkManager[45041]: <info>  [1759407539.1378] manager: (tap0b38303d-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/135)
Oct  2 08:18:59 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:59Z|00250|binding|INFO|Claiming lport 0b38303d-0e2e-47e2-84f1-d431f795968b for this chassis.
Oct  2 08:18:59 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:59Z|00251|binding|INFO|0b38303d-0e2e-47e2-84f1-d431f795968b: Claiming fa:16:3e:62:fc:3e 10.100.0.3
Oct  2 08:18:59 np0005465988 nova_compute[236126]: 2025-10-02 12:18:59.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:59.155 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:fc:3e 10.100.0.3'], port_security=['fa:16:3e:62:fc:3e 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e116d367-5ae9-4ce2-9d33-3936fd3de658', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8efba404696b40fbbaa6431b934b87f1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3d4d8d91-6fd2-4ab6-a30c-6640fa44e7f5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a5d3722e-d182-43fd-9a86-fa7ed68becec, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=0b38303d-0e2e-47e2-84f1-d431f795968b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:59.158 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 0b38303d-0e2e-47e2-84f1-d431f795968b in datapath f1725bd8-7d9d-45cc-b992-0cd3db0e30f0 bound to our chassis#033[00m
Oct  2 08:18:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:59.160 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f1725bd8-7d9d-45cc-b992-0cd3db0e30f0#033[00m
Oct  2 08:18:59 np0005465988 nova_compute[236126]: 2025-10-02 12:18:59.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:59 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:59Z|00252|binding|INFO|Setting lport 0b38303d-0e2e-47e2-84f1-d431f795968b ovn-installed in OVS
Oct  2 08:18:59 np0005465988 ovn_controller[132601]: 2025-10-02T12:18:59Z|00253|binding|INFO|Setting lport 0b38303d-0e2e-47e2-84f1-d431f795968b up in Southbound
Oct  2 08:18:59 np0005465988 nova_compute[236126]: 2025-10-02 12:18:59.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:59 np0005465988 systemd-udevd[269143]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:18:59 np0005465988 systemd-machined[192594]: New machine qemu-30-instance-0000004f.
Oct  2 08:18:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:59.191 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[67976d98-ede2-437d-b719-22a673328a39]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:59 np0005465988 systemd[1]: Started Virtual Machine qemu-30-instance-0000004f.
Oct  2 08:18:59 np0005465988 NetworkManager[45041]: <info>  [1759407539.1993] device (tap0b38303d-0e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:18:59 np0005465988 NetworkManager[45041]: <info>  [1759407539.2007] device (tap0b38303d-0e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:18:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:59.232 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[48f145fa-5705-4c5f-b8a8-9bfd5ccb25c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:59.235 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c5971432-ac8b-4594-b6ad-1eebe3ff38d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:59.269 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[525ef226-5cea-4b7d-a8c3-c2d4470f5cb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:59.295 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5c067493-a057-496e-8868-c5983a89e57c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf1725bd8-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:76:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547580, 'reachable_time': 19051, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 269156, 'error': None, 'target': 'ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:59.315 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9fbb0ea7-6b25-4c2e-8539-ee0b31a989db]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf1725bd8-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547593, 'tstamp': 547593}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 269157, 'error': None, 'target': 'ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf1725bd8-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547596, 'tstamp': 547596}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 269157, 'error': None, 'target': 'ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:59.317 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1725bd8-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:59 np0005465988 nova_compute[236126]: 2025-10-02 12:18:59.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:59 np0005465988 nova_compute[236126]: 2025-10-02 12:18:59.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:59 np0005465988 nova_compute[236126]: 2025-10-02 12:18:59.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:59.345 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf1725bd8-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:59.345 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:59.346 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf1725bd8-70, col_values=(('external_ids', {'iface-id': '421cd6e3-75aa-44e1-b552-d119c4fcd629'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:18:59.346 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:19:00 np0005465988 nova_compute[236126]: 2025-10-02 12:19:00.366 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407540.3658867, e116d367-5ae9-4ce2-9d33-3936fd3de658 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:00 np0005465988 nova_compute[236126]: 2025-10-02 12:19:00.367 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] VM Started (Lifecycle Event)#033[00m
Oct  2 08:19:00 np0005465988 nova_compute[236126]: 2025-10-02 12:19:00.391 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:00 np0005465988 nova_compute[236126]: 2025-10-02 12:19:00.395 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407540.3669634, e116d367-5ae9-4ce2-9d33-3936fd3de658 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:00 np0005465988 nova_compute[236126]: 2025-10-02 12:19:00.396 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:19:00 np0005465988 nova_compute[236126]: 2025-10-02 12:19:00.416 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:00 np0005465988 nova_compute[236126]: 2025-10-02 12:19:00.420 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:19:00 np0005465988 nova_compute[236126]: 2025-10-02 12:19:00.439 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:19:00 np0005465988 nova_compute[236126]: 2025-10-02 12:19:00.506 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:00 np0005465988 nova_compute[236126]: 2025-10-02 12:19:00.507 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:19:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:01.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:01.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:01 np0005465988 podman[269203]: 2025-10-02 12:19:01.576319106 +0000 UTC m=+0.093377229 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:19:01 np0005465988 podman[269202]: 2025-10-02 12:19:01.575663007 +0000 UTC m=+0.097606872 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 08:19:01 np0005465988 podman[269201]: 2025-10-02 12:19:01.605301049 +0000 UTC m=+0.127250534 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:19:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:19:01Z|00254|binding|INFO|Releasing lport 421cd6e3-75aa-44e1-b552-d119c4fcd629 from this chassis (sb_readonly=0)
Oct  2 08:19:01 np0005465988 nova_compute[236126]: 2025-10-02 12:19:01.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:01 np0005465988 nova_compute[236126]: 2025-10-02 12:19:01.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:19:01Z|00255|binding|INFO|Releasing lport 421cd6e3-75aa-44e1-b552-d119c4fcd629 from this chassis (sb_readonly=0)
Oct  2 08:19:01 np0005465988 nova_compute[236126]: 2025-10-02 12:19:01.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.492 2 DEBUG nova.compute.manager [req-1862a187-6d55-41cc-ad1b-2a8d0749a5e2 req-68df72e8-c6cf-497a-9e2f-d4305449a8c0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Received event network-vif-plugged-0b38303d-0e2e-47e2-84f1-d431f795968b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.492 2 DEBUG oslo_concurrency.lockutils [req-1862a187-6d55-41cc-ad1b-2a8d0749a5e2 req-68df72e8-c6cf-497a-9e2f-d4305449a8c0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.493 2 DEBUG oslo_concurrency.lockutils [req-1862a187-6d55-41cc-ad1b-2a8d0749a5e2 req-68df72e8-c6cf-497a-9e2f-d4305449a8c0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.493 2 DEBUG oslo_concurrency.lockutils [req-1862a187-6d55-41cc-ad1b-2a8d0749a5e2 req-68df72e8-c6cf-497a-9e2f-d4305449a8c0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.493 2 DEBUG nova.compute.manager [req-1862a187-6d55-41cc-ad1b-2a8d0749a5e2 req-68df72e8-c6cf-497a-9e2f-d4305449a8c0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Processing event network-vif-plugged-0b38303d-0e2e-47e2-84f1-d431f795968b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.493 2 DEBUG nova.compute.manager [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.497 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407542.4974794, e116d367-5ae9-4ce2-9d33-3936fd3de658 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.497 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.499 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.502 2 INFO nova.virt.libvirt.driver [-] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Instance spawned successfully.#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.502 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.524 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.524 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.524 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.525 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.525 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.525 2 DEBUG nova.virt.libvirt.driver [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.529 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.531 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.572 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.598 2 INFO nova.compute.manager [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Took 12.28 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.599 2 DEBUG nova.compute.manager [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.666 2 INFO nova.compute.manager [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Took 13.23 seconds to build instance.#033[00m
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.689 2 DEBUG oslo_concurrency.lockutils [None req-da1451d1-b5a7-4634-b2df-74c7efc11930 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.307s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:02 np0005465988 ceph-mgr[76715]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3158772141
Oct  2 08:19:02 np0005465988 nova_compute[236126]: 2025-10-02 12:19:02.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:03.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:03.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:04 np0005465988 nova_compute[236126]: 2025-10-02 12:19:04.681 2 DEBUG nova.compute.manager [req-79d516e5-6b92-40b1-875f-54ccbaec24ef req-e4593fc5-3773-4b42-996e-cd8a934dfde4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Received event network-vif-plugged-0b38303d-0e2e-47e2-84f1-d431f795968b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:04 np0005465988 nova_compute[236126]: 2025-10-02 12:19:04.682 2 DEBUG oslo_concurrency.lockutils [req-79d516e5-6b92-40b1-875f-54ccbaec24ef req-e4593fc5-3773-4b42-996e-cd8a934dfde4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:04 np0005465988 nova_compute[236126]: 2025-10-02 12:19:04.683 2 DEBUG oslo_concurrency.lockutils [req-79d516e5-6b92-40b1-875f-54ccbaec24ef req-e4593fc5-3773-4b42-996e-cd8a934dfde4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:04 np0005465988 nova_compute[236126]: 2025-10-02 12:19:04.683 2 DEBUG oslo_concurrency.lockutils [req-79d516e5-6b92-40b1-875f-54ccbaec24ef req-e4593fc5-3773-4b42-996e-cd8a934dfde4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:04 np0005465988 nova_compute[236126]: 2025-10-02 12:19:04.683 2 DEBUG nova.compute.manager [req-79d516e5-6b92-40b1-875f-54ccbaec24ef req-e4593fc5-3773-4b42-996e-cd8a934dfde4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] No waiting events found dispatching network-vif-plugged-0b38303d-0e2e-47e2-84f1-d431f795968b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:04 np0005465988 nova_compute[236126]: 2025-10-02 12:19:04.684 2 WARNING nova.compute.manager [req-79d516e5-6b92-40b1-875f-54ccbaec24ef req-e4593fc5-3773-4b42-996e-cd8a934dfde4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Received unexpected event network-vif-plugged-0b38303d-0e2e-47e2-84f1-d431f795968b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:19:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:05.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:05.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:05 np0005465988 nova_compute[236126]: 2025-10-02 12:19:05.130 2 DEBUG nova.compute.manager [None req-15a036a4-9e0d-488b-ad53-ef6c72423013 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:05 np0005465988 nova_compute[236126]: 2025-10-02 12:19:05.170 2 INFO nova.compute.manager [None req-15a036a4-9e0d-488b-ad53-ef6c72423013 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] instance snapshotting#033[00m
Oct  2 08:19:05 np0005465988 nova_compute[236126]: 2025-10-02 12:19:05.466 2 INFO nova.virt.libvirt.driver [None req-15a036a4-9e0d-488b-ad53-ef6c72423013 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Beginning live snapshot process#033[00m
Oct  2 08:19:05 np0005465988 nova_compute[236126]: 2025-10-02 12:19:05.633 2 DEBUG nova.virt.libvirt.imagebackend [None req-15a036a4-9e0d-488b-ad53-ef6c72423013 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] No parent info for c2d0c2bc-fe21-4689-86ae-d6728c15874c; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Oct  2 08:19:05 np0005465988 nova_compute[236126]: 2025-10-02 12:19:05.817 2 DEBUG nova.storage.rbd_utils [None req-15a036a4-9e0d-488b-ad53-ef6c72423013 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] creating snapshot(f4f36887667546bebe527379fd0ed747) on rbd image(e116d367-5ae9-4ce2-9d33-3936fd3de658_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:19:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e250 e250: 3 total, 3 up, 3 in
Oct  2 08:19:06 np0005465988 nova_compute[236126]: 2025-10-02 12:19:06.037 2 DEBUG nova.storage.rbd_utils [None req-15a036a4-9e0d-488b-ad53-ef6c72423013 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] cloning vms/e116d367-5ae9-4ce2-9d33-3936fd3de658_disk@f4f36887667546bebe527379fd0ed747 to images/8aa41919-15c5-43d9-ac12-d18997b6c8f0 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:19:06 np0005465988 nova_compute[236126]: 2025-10-02 12:19:06.164 2 DEBUG nova.storage.rbd_utils [None req-15a036a4-9e0d-488b-ad53-ef6c72423013 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] flattening images/8aa41919-15c5-43d9-ac12-d18997b6c8f0 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:19:06 np0005465988 nova_compute[236126]: 2025-10-02 12:19:06.456 2 DEBUG nova.storage.rbd_utils [None req-15a036a4-9e0d-488b-ad53-ef6c72423013 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] removing snapshot(f4f36887667546bebe527379fd0ed747) on rbd image(e116d367-5ae9-4ce2-9d33-3936fd3de658_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:19:06 np0005465988 nova_compute[236126]: 2025-10-02 12:19:06.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:07.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:07.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e251 e251: 3 total, 3 up, 3 in
Oct  2 08:19:07 np0005465988 nova_compute[236126]: 2025-10-02 12:19:07.148 2 DEBUG nova.storage.rbd_utils [None req-15a036a4-9e0d-488b-ad53-ef6c72423013 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] creating snapshot(snap) on rbd image(8aa41919-15c5-43d9-ac12-d18997b6c8f0) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:19:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:19:07 np0005465988 nova_compute[236126]: 2025-10-02 12:19:07.623 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407532.6203945, 7578f6f0-7071-49b7-978d-18b295ee6504 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:07 np0005465988 nova_compute[236126]: 2025-10-02 12:19:07.624 2 INFO nova.compute.manager [-] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:19:07 np0005465988 nova_compute[236126]: 2025-10-02 12:19:07.646 2 DEBUG nova.compute.manager [None req-018fc339-930b-4779-8a92-5e5fae7d3345 - - - - - -] [instance: 7578f6f0-7071-49b7-978d-18b295ee6504] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:07 np0005465988 nova_compute[236126]: 2025-10-02 12:19:07.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e252 e252: 3 total, 3 up, 3 in
Oct  2 08:19:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:09.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:09.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:10 np0005465988 nova_compute[236126]: 2025-10-02 12:19:10.280 2 INFO nova.virt.libvirt.driver [None req-15a036a4-9e0d-488b-ad53-ef6c72423013 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Snapshot image upload complete#033[00m
Oct  2 08:19:10 np0005465988 nova_compute[236126]: 2025-10-02 12:19:10.282 2 INFO nova.compute.manager [None req-15a036a4-9e0d-488b-ad53-ef6c72423013 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Took 5.11 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 08:19:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:11.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:11.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:11 np0005465988 nova_compute[236126]: 2025-10-02 12:19:11.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:19:12 np0005465988 nova_compute[236126]: 2025-10-02 12:19:12.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:19:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:13.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:19:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:13.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 e253: 3 total, 3 up, 3 in
Oct  2 08:19:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:15.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:15.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:15 np0005465988 podman[269437]: 2025-10-02 12:19:15.304832451 +0000 UTC m=+0.072305356 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:19:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:19:16Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:62:fc:3e 10.100.0.3
Oct  2 08:19:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:19:16Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:62:fc:3e 10.100.0.3
Oct  2 08:19:16 np0005465988 nova_compute[236126]: 2025-10-02 12:19:16.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:17.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:17.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:19:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:19:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:19:17 np0005465988 nova_compute[236126]: 2025-10-02 12:19:17.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:19.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:19.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:19:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:19:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:19:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:19:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:19:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:19:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:21.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:19:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:21.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:21 np0005465988 nova_compute[236126]: 2025-10-02 12:19:21.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:19:22 np0005465988 nova_compute[236126]: 2025-10-02 12:19:22.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:23.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:23.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:25.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:25.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:26 np0005465988 nova_compute[236126]: 2025-10-02 12:19:26.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:27.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:27.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:19:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:19:27.347 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:19:27.347 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:19:27.348 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:27 np0005465988 nova_compute[236126]: 2025-10-02 12:19:27.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:29.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:29.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5e6b6f0 =====
Oct  2 08:19:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:31.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5e6b6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:31 np0005465988 radosgw[82571]: beast: 0x7fcca5e6b6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:31.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:31 np0005465988 nova_compute[236126]: 2025-10-02 12:19:31.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:19:32 np0005465988 podman[269625]: 2025-10-02 12:19:32.53944869 +0000 UTC m=+0.068334750 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:19:32 np0005465988 podman[269626]: 2025-10-02 12:19:32.540327985 +0000 UTC m=+0.071053259 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:19:32 np0005465988 podman[269624]: 2025-10-02 12:19:32.615219745 +0000 UTC m=+0.141146949 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:19:32 np0005465988 nova_compute[236126]: 2025-10-02 12:19:32.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:33.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:33.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:35.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:35.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:36 np0005465988 nova_compute[236126]: 2025-10-02 12:19:36.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:37.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:19:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:37.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:19:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:19:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:19:37.578 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:19:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:19:37.579 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:19:37 np0005465988 nova_compute[236126]: 2025-10-02 12:19:37.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:37 np0005465988 nova_compute[236126]: 2025-10-02 12:19:37.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:19:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:19:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:39.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:39.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:19:40 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3169995338' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:19:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:19:40 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3169995338' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:19:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.003000087s ======
Oct  2 08:19:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:41.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000087s
Oct  2 08:19:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:41.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:41 np0005465988 nova_compute[236126]: 2025-10-02 12:19:41.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:19:42 np0005465988 nova_compute[236126]: 2025-10-02 12:19:42.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:43.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:43.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:44 np0005465988 nova_compute[236126]: 2025-10-02 12:19:44.500 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:44 np0005465988 nova_compute[236126]: 2025-10-02 12:19:44.501 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:44 np0005465988 nova_compute[236126]: 2025-10-02 12:19:44.568 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:44 np0005465988 nova_compute[236126]: 2025-10-02 12:19:44.569 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:44 np0005465988 nova_compute[236126]: 2025-10-02 12:19:44.569 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:44 np0005465988 nova_compute[236126]: 2025-10-02 12:19:44.569 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:19:44 np0005465988 nova_compute[236126]: 2025-10-02 12:19:44.570 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:19:44 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1846814930' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.012 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:45.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:45.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.095 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000004f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.096 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000004f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.099 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000004b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.100 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000004b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.317 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.318 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4143MB free_disk=20.897136688232422GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.319 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.319 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.515 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance bc4239f5-3cf2-4325-803c-73121f7e0ee0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.515 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance e116d367-5ae9-4ce2-9d33-3936fd3de658 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.515 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.515 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.538 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:19:45 np0005465988 podman[269817]: 2025-10-02 12:19:45.539407272 +0000 UTC m=+0.068323180 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.556 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.556 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.573 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.598 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:19:45 np0005465988 nova_compute[236126]: 2025-10-02 12:19:45.644 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:19:46 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2271156186' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:19:46 np0005465988 nova_compute[236126]: 2025-10-02 12:19:46.078 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:46 np0005465988 nova_compute[236126]: 2025-10-02 12:19:46.085 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:19:46 np0005465988 nova_compute[236126]: 2025-10-02 12:19:46.113 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:19:46 np0005465988 nova_compute[236126]: 2025-10-02 12:19:46.141 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:19:46 np0005465988 nova_compute[236126]: 2025-10-02 12:19:46.141 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.822s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:46 np0005465988 nova_compute[236126]: 2025-10-02 12:19:46.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:47.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:47.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:19:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:19:47.580 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:47 np0005465988 nova_compute[236126]: 2025-10-02 12:19:47.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:49.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:49.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:49 np0005465988 nova_compute[236126]: 2025-10-02 12:19:49.114 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:49 np0005465988 nova_compute[236126]: 2025-10-02 12:19:49.115 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:49 np0005465988 nova_compute[236126]: 2025-10-02 12:19:49.115 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:49 np0005465988 nova_compute[236126]: 2025-10-02 12:19:49.116 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:49 np0005465988 nova_compute[236126]: 2025-10-02 12:19:49.116 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:19:49 np0005465988 nova_compute[236126]: 2025-10-02 12:19:49.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:51.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:51.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:51 np0005465988 nova_compute[236126]: 2025-10-02 12:19:51.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:51 np0005465988 nova_compute[236126]: 2025-10-02 12:19:51.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:19:52 np0005465988 nova_compute[236126]: 2025-10-02 12:19:52.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:53.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:53.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:54 np0005465988 nova_compute[236126]: 2025-10-02 12:19:54.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:54 np0005465988 nova_compute[236126]: 2025-10-02 12:19:54.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:19:54 np0005465988 nova_compute[236126]: 2025-10-02 12:19:54.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:19:55 np0005465988 nova_compute[236126]: 2025-10-02 12:19:55.087 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:19:55 np0005465988 nova_compute[236126]: 2025-10-02 12:19:55.087 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:19:55 np0005465988 nova_compute[236126]: 2025-10-02 12:19:55.087 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:19:55 np0005465988 nova_compute[236126]: 2025-10-02 12:19:55.088 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bc4239f5-3cf2-4325-803c-73121f7e0ee0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:19:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:55.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:19:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:55.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:56 np0005465988 nova_compute[236126]: 2025-10-02 12:19:56.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:57.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:57.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:19:57 np0005465988 nova_compute[236126]: 2025-10-02 12:19:57.404 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Updating instance_info_cache with network_info: [{"id": "576bdab0-26cd-4663-8dd5-149075e0d45d", "address": "fa:16:3e:5d:e9:df", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap576bdab0-26", "ovs_interfaceid": "576bdab0-26cd-4663-8dd5-149075e0d45d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:19:57 np0005465988 nova_compute[236126]: 2025-10-02 12:19:57.443 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-bc4239f5-3cf2-4325-803c-73121f7e0ee0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:19:57 np0005465988 nova_compute[236126]: 2025-10-02 12:19:57.444 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:19:57 np0005465988 nova_compute[236126]: 2025-10-02 12:19:57.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:19:59.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:19:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:19:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:19:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:19:59.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:00 np0005465988 nova_compute[236126]: 2025-10-02 12:20:00.442 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:00 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 08:20:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:01.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:01.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:01 np0005465988 nova_compute[236126]: 2025-10-02 12:20:01.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:20:02 np0005465988 nova_compute[236126]: 2025-10-02 12:20:02.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:03.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:03.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:03 np0005465988 podman[269920]: 2025-10-02 12:20:03.538276572 +0000 UTC m=+0.068315920 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:20:03 np0005465988 podman[269919]: 2025-10-02 12:20:03.569664306 +0000 UTC m=+0.096018946 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid)
Oct  2 08:20:03 np0005465988 podman[269918]: 2025-10-02 12:20:03.586395573 +0000 UTC m=+0.118525111 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 08:20:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:05.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:05.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:06 np0005465988 nova_compute[236126]: 2025-10-02 12:20:06.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:07.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:07.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:20:07 np0005465988 nova_compute[236126]: 2025-10-02 12:20:07.600 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:07 np0005465988 nova_compute[236126]: 2025-10-02 12:20:07.601 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:07 np0005465988 nova_compute[236126]: 2025-10-02 12:20:07.606 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:07 np0005465988 nova_compute[236126]: 2025-10-02 12:20:07.606 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:07 np0005465988 nova_compute[236126]: 2025-10-02 12:20:07.625 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:20:07 np0005465988 nova_compute[236126]: 2025-10-02 12:20:07.629 2 DEBUG nova.compute.manager [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:20:07 np0005465988 nova_compute[236126]: 2025-10-02 12:20:07.876 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:07 np0005465988 nova_compute[236126]: 2025-10-02 12:20:07.877 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:07 np0005465988 nova_compute[236126]: 2025-10-02 12:20:07.891 2 DEBUG nova.virt.hardware [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:20:07 np0005465988 nova_compute[236126]: 2025-10-02 12:20:07.892 2 INFO nova.compute.claims [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:20:07 np0005465988 nova_compute[236126]: 2025-10-02 12:20:07.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:07 np0005465988 nova_compute[236126]: 2025-10-02 12:20:07.913 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.050 2 DEBUG oslo_concurrency.processutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:20:08 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1183189716' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.484 2 DEBUG oslo_concurrency.processutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.494 2 DEBUG nova.compute.provider_tree [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.518 2 DEBUG nova.scheduler.client.report [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.576 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.578 2 DEBUG nova.compute.manager [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.585 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.595 2 DEBUG nova.virt.hardware [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.595 2 INFO nova.compute.claims [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.650 2 DEBUG nova.compute.manager [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.651 2 DEBUG nova.network.neutron [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.669 2 INFO nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.693 2 DEBUG nova.compute.manager [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.755 2 DEBUG oslo_concurrency.processutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.797 2 DEBUG nova.compute.manager [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.800 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.801 2 INFO nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Creating image(s)#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.845 2 DEBUG nova.storage.rbd_utils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] rbd image 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.878 2 DEBUG nova.storage.rbd_utils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] rbd image 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.915 2 DEBUG nova.storage.rbd_utils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] rbd image 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:20:08 np0005465988 nova_compute[236126]: 2025-10-02 12:20:08.921 2 DEBUG oslo_concurrency.processutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.011 2 DEBUG oslo_concurrency.processutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.012 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.013 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.013 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.046 2 DEBUG nova.storage.rbd_utils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] rbd image 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.052 2 DEBUG oslo_concurrency.processutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:09.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:09.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:20:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3938907317' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.235 2 DEBUG oslo_concurrency.processutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.249 2 DEBUG nova.policy [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a9f7faffac7240869a0196df1ddda7e5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1c2c11ebecb14f3188f35ea473c4ca02', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.265 2 DEBUG nova.compute.provider_tree [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.284 2 DEBUG nova.scheduler.client.report [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.317 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.318 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.367 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.368 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.388 2 INFO nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.413 2 DEBUG oslo_concurrency.processutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.361s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.480 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.487 2 DEBUG nova.storage.rbd_utils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] resizing rbd image 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.529 2 INFO nova.virt.block_device [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Booting with volume 376f4726-152b-47c1-8d56-d366a7938d77 at /dev/vda#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.615 2 DEBUG nova.objects.instance [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lazy-loading 'migration_context' on Instance uuid 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.627 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.627 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Ensure instance console log exists: /var/lib/nova/instances/6a800c5e-ac3d-4b10-aa80-ce9ebcff614c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.628 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.628 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.629 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.679 2 DEBUG os_brick.utils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.681 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.699 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.700 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[e6929910-eb4c-440c-8943-bd901d3e7034]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.701 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.711 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.711 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[920a7b11-d9b4-4589-b232-27fd6ea7266f]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.714 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.723 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.723 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[e7ff05ed-d6b8-4f23-baad-1a7da6ba3293]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.725 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[a43b5c28-fce9-468a-be6e-78e59bc3ed91]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.726 2 DEBUG oslo_concurrency.processutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.762 2 DEBUG oslo_concurrency.processutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] CMD "nvme version" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.764 2 DEBUG os_brick.initiator.connectors.lightos [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.765 2 DEBUG os_brick.initiator.connectors.lightos [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.765 2 DEBUG os_brick.initiator.connectors.lightos [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.765 2 DEBUG os_brick.utils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] <== get_connector_properties: return (85ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:20:09 np0005465988 nova_compute[236126]: 2025-10-02 12:20:09.766 2 DEBUG nova.virt.block_device [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating existing volume attachment record: 67d5bc9c-c6b3-4b48-8f9e-e6ca9555744d _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:20:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:11.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:11.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.148 2 DEBUG nova.policy [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2a7f7518ce70488fb4f63af1a3bef131', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '84f71f6076f7425db7653ac203257df0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.195 2 INFO nova.virt.block_device [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Booting with volume df389b61-10b1-4346-9fa3-6333d5b9d5cf at /dev/vdb
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.393 2 DEBUG os_brick.utils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.395 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.408 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.408 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[053b40f0-7481-450b-9808-b60fb3dbdae8]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.411 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.425 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.425 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[91f5325e-6d44-466b-9806-f56809263a1f]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.427 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.438 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.439 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[fc97fe69-3935-46c2-be99-1774ba1fb85b]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.441 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[f22011fc-3fed-42f0-b7c4-08d4428fc4e1]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.442 2 DEBUG oslo_concurrency.processutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.497 2 DEBUG oslo_concurrency.processutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] CMD "nvme version" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.502 2 DEBUG os_brick.initiator.connectors.lightos [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.503 2 DEBUG os_brick.initiator.connectors.lightos [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.503 2 DEBUG os_brick.initiator.connectors.lightos [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.504 2 DEBUG os_brick.utils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] <== get_connector_properties: return (109ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.505 2 DEBUG nova.virt.block_device [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating existing volume attachment record: 1d4eb9f7-4a35-4822-9150-ca4533c919eb _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Oct  2 08:20:11 np0005465988 nova_compute[236126]: 2025-10-02 12:20:11.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.287 2 DEBUG nova.network.neutron [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Successfully created port: 674f6c7a-e53a-448e-9da4-840cd15594d1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:20:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.640 2 INFO nova.virt.block_device [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Booting with volume 7a48c418-78a2-476b-af7b-983febde623b at /dev/vdc
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.767 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Successfully created port: c51fc487-eedd-421d-b8cc-d0a322b4a129 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.795 2 DEBUG os_brick.utils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.797 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.810 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.811 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[2e426ebe-32a6-4247-8f93-a2c9ce755ab2]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.813 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.822 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.823 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[1fb48639-1a61-421e-bb20-65053f71906f]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.825 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.835 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.836 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[081785eb-c542-4e0b-bdb8-af3cd7b86713]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.837 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[0b9364bb-1040-4430-bcd7-79d92ae70a88]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.838 2 DEBUG oslo_concurrency.processutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.870 2 DEBUG oslo_concurrency.processutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] CMD "nvme version" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.873 2 DEBUG os_brick.initiator.connectors.lightos [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.873 2 DEBUG os_brick.initiator.connectors.lightos [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.874 2 DEBUG os_brick.initiator.connectors.lightos [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.874 2 DEBUG os_brick.utils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] <== get_connector_properties: return (78ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.875 2 DEBUG nova.virt.block_device [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating existing volume attachment record: bca78dd3-e5b2-45fd-a77e-e3e4d05f19e3 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Oct  2 08:20:12 np0005465988 nova_compute[236126]: 2025-10-02 12:20:12.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:13.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:13.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:13 np0005465988 nova_compute[236126]: 2025-10-02 12:20:13.156 2 DEBUG nova.network.neutron [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Successfully updated port: 674f6c7a-e53a-448e-9da4-840cd15594d1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:20:13 np0005465988 nova_compute[236126]: 2025-10-02 12:20:13.220 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "refresh_cache-6a800c5e-ac3d-4b10-aa80-ce9ebcff614c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:20:13 np0005465988 nova_compute[236126]: 2025-10-02 12:20:13.221 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquired lock "refresh_cache-6a800c5e-ac3d-4b10-aa80-ce9ebcff614c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:20:13 np0005465988 nova_compute[236126]: 2025-10-02 12:20:13.221 2 DEBUG nova.network.neutron [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:20:13 np0005465988 nova_compute[236126]: 2025-10-02 12:20:13.432 2 DEBUG nova.network.neutron [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:20:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:20:13 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1957980323' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:20:13 np0005465988 nova_compute[236126]: 2025-10-02 12:20:13.722 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Successfully created port: c5debb4f-7f40-48f2-afb5-efa11af6cc4c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:20:13 np0005465988 nova_compute[236126]: 2025-10-02 12:20:13.952 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:20:13 np0005465988 nova_compute[236126]: 2025-10-02 12:20:13.954 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:20:13 np0005465988 nova_compute[236126]: 2025-10-02 12:20:13.955 2 INFO nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Creating image(s)
Oct  2 08:20:13 np0005465988 nova_compute[236126]: 2025-10-02 12:20:13.955 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Oct  2 08:20:13 np0005465988 nova_compute[236126]: 2025-10-02 12:20:13.956 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Ensure instance console log exists: /var/lib/nova/instances/969ba235-be4a-44e1-a6f2-7c5922b9661e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:20:13 np0005465988 nova_compute[236126]: 2025-10-02 12:20:13.956 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:13 np0005465988 nova_compute[236126]: 2025-10-02 12:20:13.956 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:13 np0005465988 nova_compute[236126]: 2025-10-02 12:20:13.957 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.420 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Successfully created port: fd3698a6-a68d-42ed-b217-f5bdc4163195 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.480 2 DEBUG nova.network.neutron [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Updating instance_info_cache with network_info: [{"id": "674f6c7a-e53a-448e-9da4-840cd15594d1", "address": "fa:16:3e:f6:bb:49", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674f6c7a-e5", "ovs_interfaceid": "674f6c7a-e53a-448e-9da4-840cd15594d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.505 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Releasing lock "refresh_cache-6a800c5e-ac3d-4b10-aa80-ce9ebcff614c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.506 2 DEBUG nova.compute.manager [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Instance network_info: |[{"id": "674f6c7a-e53a-448e-9da4-840cd15594d1", "address": "fa:16:3e:f6:bb:49", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674f6c7a-e5", "ovs_interfaceid": "674f6c7a-e53a-448e-9da4-840cd15594d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.508 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Start _get_guest_xml network_info=[{"id": "674f6c7a-e53a-448e-9da4-840cd15594d1", "address": "fa:16:3e:f6:bb:49", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674f6c7a-e5", "ovs_interfaceid": "674f6c7a-e53a-448e-9da4-840cd15594d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.515 2 WARNING nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.521 2 DEBUG nova.virt.libvirt.host [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.523 2 DEBUG nova.virt.libvirt.host [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.528 2 DEBUG nova.virt.libvirt.host [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.529 2 DEBUG nova.virt.libvirt.host [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.531 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.531 2 DEBUG nova.virt.hardware [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.532 2 DEBUG nova.virt.hardware [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.533 2 DEBUG nova.virt.hardware [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.533 2 DEBUG nova.virt.hardware [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.534 2 DEBUG nova.virt.hardware [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.534 2 DEBUG nova.virt.hardware [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.535 2 DEBUG nova.virt.hardware [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.535 2 DEBUG nova.virt.hardware [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.536 2 DEBUG nova.virt.hardware [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.536 2 DEBUG nova.virt.hardware [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.537 2 DEBUG nova.virt.hardware [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:20:14 np0005465988 nova_compute[236126]: 2025-10-02 12:20:14.542 2 DEBUG oslo_concurrency.processutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:20:15 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2471755034' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.047 2 DEBUG oslo_concurrency.processutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.091 2 DEBUG nova.storage.rbd_utils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] rbd image 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.097 2 DEBUG oslo_concurrency.processutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:15.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:15.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.297 2 DEBUG nova.compute.manager [req-caaef0e6-0d0b-4296-8b1d-7c20e6ba4825 req-ffb1246a-1d7f-4a8a-aa78-54fb579f15d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received event network-changed-674f6c7a-e53a-448e-9da4-840cd15594d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.298 2 DEBUG nova.compute.manager [req-caaef0e6-0d0b-4296-8b1d-7c20e6ba4825 req-ffb1246a-1d7f-4a8a-aa78-54fb579f15d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Refreshing instance network info cache due to event network-changed-674f6c7a-e53a-448e-9da4-840cd15594d1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.299 2 DEBUG oslo_concurrency.lockutils [req-caaef0e6-0d0b-4296-8b1d-7c20e6ba4825 req-ffb1246a-1d7f-4a8a-aa78-54fb579f15d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-6a800c5e-ac3d-4b10-aa80-ce9ebcff614c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.300 2 DEBUG oslo_concurrency.lockutils [req-caaef0e6-0d0b-4296-8b1d-7c20e6ba4825 req-ffb1246a-1d7f-4a8a-aa78-54fb579f15d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-6a800c5e-ac3d-4b10-aa80-ce9ebcff614c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.301 2 DEBUG nova.network.neutron [req-caaef0e6-0d0b-4296-8b1d-7c20e6ba4825 req-ffb1246a-1d7f-4a8a-aa78-54fb579f15d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Refreshing network info cache for port 674f6c7a-e53a-448e-9da4-840cd15594d1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:20:15 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1087788064' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.564 2 DEBUG oslo_concurrency.processutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.565 2 DEBUG nova.virt.libvirt.vif [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:20:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-649338514',display_name='tempest-DeleteServersTestJSON-server-649338514',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-649338514',id=84,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1c2c11ebecb14f3188f35ea473c4ca02',ramdisk_id='',reservation_id='r-ag3d9qch',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1602490521',owner_user_name='tempest-DeleteServersTestJSON-1602490521-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:08Z,user_data=None,user_id='a9f7faffac7240869a0196df1ddda7e5',uuid=6a800c5e-ac3d-4b10-aa80-ce9ebcff614c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "674f6c7a-e53a-448e-9da4-840cd15594d1", "address": "fa:16:3e:f6:bb:49", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674f6c7a-e5", "ovs_interfaceid": "674f6c7a-e53a-448e-9da4-840cd15594d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.565 2 DEBUG nova.network.os_vif_util [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Converting VIF {"id": "674f6c7a-e53a-448e-9da4-840cd15594d1", "address": "fa:16:3e:f6:bb:49", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674f6c7a-e5", "ovs_interfaceid": "674f6c7a-e53a-448e-9da4-840cd15594d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.566 2 DEBUG nova.network.os_vif_util [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:bb:49,bridge_name='br-int',has_traffic_filtering=True,id=674f6c7a-e53a-448e-9da4-840cd15594d1,network=Network(7754c79a-cca5-48c7-9169-831eaad23ccc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap674f6c7a-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.567 2 DEBUG nova.objects.instance [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.580 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  <uuid>6a800c5e-ac3d-4b10-aa80-ce9ebcff614c</uuid>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  <name>instance-00000054</name>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <nova:name>tempest-DeleteServersTestJSON-server-649338514</nova:name>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:20:14</nova:creationTime>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <nova:user uuid="a9f7faffac7240869a0196df1ddda7e5">tempest-DeleteServersTestJSON-1602490521-project-member</nova:user>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <nova:project uuid="1c2c11ebecb14f3188f35ea473c4ca02">tempest-DeleteServersTestJSON-1602490521</nova:project>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <nova:port uuid="674f6c7a-e53a-448e-9da4-840cd15594d1">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <entry name="serial">6a800c5e-ac3d-4b10-aa80-ce9ebcff614c</entry>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <entry name="uuid">6a800c5e-ac3d-4b10-aa80-ce9ebcff614c</entry>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_disk">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_disk.config">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:f6:bb:49"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <target dev="tap674f6c7a-e5"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/6a800c5e-ac3d-4b10-aa80-ce9ebcff614c/console.log" append="off"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:20:15 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:20:15 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:20:15 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:20:15 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.581 2 DEBUG nova.compute.manager [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Preparing to wait for external event network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.582 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.582 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.582 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.583 2 DEBUG nova.virt.libvirt.vif [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:20:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-649338514',display_name='tempest-DeleteServersTestJSON-server-649338514',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-649338514',id=84,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1c2c11ebecb14f3188f35ea473c4ca02',ramdisk_id='',reservation_id='r-ag3d9qch',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1602490521',owner_user_name='tempest-DeleteServersTestJSON-1602490521-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:08Z,user_data=None,user_id='a9f7faffac7240869a0196df1ddda7e5',uuid=6a800c5e-ac3d-4b10-aa80-ce9ebcff614c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "674f6c7a-e53a-448e-9da4-840cd15594d1", "address": "fa:16:3e:f6:bb:49", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674f6c7a-e5", "ovs_interfaceid": "674f6c7a-e53a-448e-9da4-840cd15594d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.584 2 DEBUG nova.network.os_vif_util [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Converting VIF {"id": "674f6c7a-e53a-448e-9da4-840cd15594d1", "address": "fa:16:3e:f6:bb:49", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674f6c7a-e5", "ovs_interfaceid": "674f6c7a-e53a-448e-9da4-840cd15594d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.584 2 DEBUG nova.network.os_vif_util [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:bb:49,bridge_name='br-int',has_traffic_filtering=True,id=674f6c7a-e53a-448e-9da4-840cd15594d1,network=Network(7754c79a-cca5-48c7-9169-831eaad23ccc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap674f6c7a-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.585 2 DEBUG os_vif [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:bb:49,bridge_name='br-int',has_traffic_filtering=True,id=674f6c7a-e53a-448e-9da4-840cd15594d1,network=Network(7754c79a-cca5-48c7-9169-831eaad23ccc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap674f6c7a-e5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.586 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.587 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.591 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap674f6c7a-e5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.592 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap674f6c7a-e5, col_values=(('external_ids', {'iface-id': '674f6c7a-e53a-448e-9da4-840cd15594d1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:bb:49', 'vm-uuid': '6a800c5e-ac3d-4b10-aa80-ce9ebcff614c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:15 np0005465988 NetworkManager[45041]: <info>  [1759407615.5943] manager: (tap674f6c7a-e5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/136)
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.602 2 INFO os_vif [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:bb:49,bridge_name='br-int',has_traffic_filtering=True,id=674f6c7a-e53a-448e-9da4-840cd15594d1,network=Network(7754c79a-cca5-48c7-9169-831eaad23ccc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap674f6c7a-e5')#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.656 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.657 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.658 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] No VIF found with MAC fa:16:3e:f6:bb:49, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.659 2 INFO nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Using config drive#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.701 2 DEBUG nova.storage.rbd_utils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] rbd image 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:20:15 np0005465988 nova_compute[236126]: 2025-10-02 12:20:15.710 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Successfully created port: f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:20:16 np0005465988 nova_compute[236126]: 2025-10-02 12:20:16.163 2 INFO nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Creating config drive at /var/lib/nova/instances/6a800c5e-ac3d-4b10-aa80-ce9ebcff614c/disk.config#033[00m
Oct  2 08:20:16 np0005465988 nova_compute[236126]: 2025-10-02 12:20:16.173 2 DEBUG oslo_concurrency.processutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6a800c5e-ac3d-4b10-aa80-ce9ebcff614c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwtwmy2vt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:16 np0005465988 nova_compute[236126]: 2025-10-02 12:20:16.331 2 DEBUG oslo_concurrency.processutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6a800c5e-ac3d-4b10-aa80-ce9ebcff614c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwtwmy2vt" returned: 0 in 0.158s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:16 np0005465988 nova_compute[236126]: 2025-10-02 12:20:16.366 2 DEBUG nova.storage.rbd_utils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] rbd image 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:20:16 np0005465988 nova_compute[236126]: 2025-10-02 12:20:16.371 2 DEBUG oslo_concurrency.processutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6a800c5e-ac3d-4b10-aa80-ce9ebcff614c/disk.config 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:16 np0005465988 podman[270333]: 2025-10-02 12:20:16.515631857 +0000 UTC m=+0.055450975 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:20:16 np0005465988 nova_compute[236126]: 2025-10-02 12:20:16.579 2 DEBUG oslo_concurrency.processutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6a800c5e-ac3d-4b10-aa80-ce9ebcff614c/disk.config 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.208s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:16 np0005465988 nova_compute[236126]: 2025-10-02 12:20:16.579 2 INFO nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Deleting local config drive /var/lib/nova/instances/6a800c5e-ac3d-4b10-aa80-ce9ebcff614c/disk.config because it was imported into RBD.#033[00m
Oct  2 08:20:16 np0005465988 kernel: tap674f6c7a-e5: entered promiscuous mode
Oct  2 08:20:16 np0005465988 NetworkManager[45041]: <info>  [1759407616.6428] manager: (tap674f6c7a-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/137)
Oct  2 08:20:16 np0005465988 nova_compute[236126]: 2025-10-02 12:20:16.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:16Z|00256|binding|INFO|Claiming lport 674f6c7a-e53a-448e-9da4-840cd15594d1 for this chassis.
Oct  2 08:20:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:16Z|00257|binding|INFO|674f6c7a-e53a-448e-9da4-840cd15594d1: Claiming fa:16:3e:f6:bb:49 10.100.0.4
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.664 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:bb:49 10.100.0.4'], port_security=['fa:16:3e:f6:bb:49 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '6a800c5e-ac3d-4b10-aa80-ce9ebcff614c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7754c79a-cca5-48c7-9169-831eaad23ccc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c2c11ebecb14f3188f35ea473c4ca02', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c0d053f-a096-4f8c-8162-5ef19e29b5d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45b5774e-2213-45dd-ab74-f2a3868d167c, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=674f6c7a-e53a-448e-9da4-840cd15594d1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.667 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 674f6c7a-e53a-448e-9da4-840cd15594d1 in datapath 7754c79a-cca5-48c7-9169-831eaad23ccc bound to our chassis#033[00m
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.670 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7754c79a-cca5-48c7-9169-831eaad23ccc#033[00m
Oct  2 08:20:16 np0005465988 systemd-udevd[270375]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.683 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[730da8fe-42df-4874-849b-6822f6d3b218]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.684 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7754c79a-c1 in ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.686 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7754c79a-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.686 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d70b98bf-5f3f-4a01-8b27-9e5c490be141]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.687 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ecfc0c94-0164-420d-97d8-039a2c4b854f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005465988 systemd-machined[192594]: New machine qemu-31-instance-00000054.
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.703 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[a504cc99-195a-44a8-b373-90cf0a1bdadb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005465988 NetworkManager[45041]: <info>  [1759407616.7074] device (tap674f6c7a-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:20:16 np0005465988 NetworkManager[45041]: <info>  [1759407616.7085] device (tap674f6c7a-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:20:16 np0005465988 systemd[1]: Started Virtual Machine qemu-31-instance-00000054.
Oct  2 08:20:16 np0005465988 nova_compute[236126]: 2025-10-02 12:20:16.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:16Z|00258|binding|INFO|Setting lport 674f6c7a-e53a-448e-9da4-840cd15594d1 ovn-installed in OVS
Oct  2 08:20:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:16Z|00259|binding|INFO|Setting lport 674f6c7a-e53a-448e-9da4-840cd15594d1 up in Southbound
Oct  2 08:20:16 np0005465988 nova_compute[236126]: 2025-10-02 12:20:16.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.732 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2edd64ca-c94d-4670-be7f-ea1d7a3abc1e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.773 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[cbd56b6a-bee5-439a-8253-496d8fc70cce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005465988 systemd-udevd[270382]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:20:16 np0005465988 NetworkManager[45041]: <info>  [1759407616.7850] manager: (tap7754c79a-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/138)
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.781 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5d4e4cc6-0b43-47a8-9c6e-c2f6931ac75e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.815 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ff0c01-1ab7-4106-9c12-f8b3c84b1ccb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.818 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d6df79-d04c-470b-a60a-266dc783e05d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005465988 NetworkManager[45041]: <info>  [1759407616.8418] device (tap7754c79a-c0): carrier: link connected
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.849 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[09a26a7d-af9f-4399-aed0-186033e6e5ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.867 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb9c108-06f0-4902-9780-eed58ba00cb5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7754c79a-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:b0:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567717, 'reachable_time': 17614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270410, 'error': None, 'target': 'ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.883 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9d519799-f4ec-4179-ad6d-e1f7bcca66a8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:b018'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567717, 'tstamp': 567717}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270411, 'error': None, 'target': 'ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.901 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dd5a159a-c7c1-472f-ac75-37d6d35b7623]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7754c79a-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:b0:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567717, 'reachable_time': 17614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 270413, 'error': None, 'target': 'ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:16.943 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[16860458-a3ca-469b-8ebf-2448b67e8cca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:17.017 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[72069f02-59ba-48f3-b8ae-0af406636b35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:17.018 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7754c79a-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:17.019 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:17.019 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7754c79a-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:17 np0005465988 kernel: tap7754c79a-c0: entered promiscuous mode
Oct  2 08:20:17 np0005465988 NetworkManager[45041]: <info>  [1759407617.0235] manager: (tap7754c79a-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/139)
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:17.027 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7754c79a-c0, col_values=(('external_ids', {'iface-id': 'b1ce5636-6283-470c-ab5e-aac212c1256d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:17 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:17Z|00260|binding|INFO|Releasing lport b1ce5636-6283-470c-ab5e-aac212c1256d from this chassis (sb_readonly=0)
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:17.031 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7754c79a-cca5-48c7-9169-831eaad23ccc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7754c79a-cca5-48c7-9169-831eaad23ccc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:17.033 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[63c8065d-82a4-4f41-bbd0-5ec826321096]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:17.034 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-7754c79a-cca5-48c7-9169-831eaad23ccc
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/7754c79a-cca5-48c7-9169-831eaad23ccc.pid.haproxy
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 7754c79a-cca5-48c7-9169-831eaad23ccc
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:20:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:17.035 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc', 'env', 'PROCESS_TAG=haproxy-7754c79a-cca5-48c7-9169-831eaad23ccc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7754c79a-cca5-48c7-9169-831eaad23ccc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:17.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:17.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.346 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Successfully created port: f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:20:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:20:17 np0005465988 podman[270485]: 2025-10-02 12:20:17.461448007 +0000 UTC m=+0.063827139 container create 95cc8e71fd82c2fd59e46f679596e135334d41b8304d75122745e607d7e21b30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:20:17 np0005465988 systemd[1]: Started libpod-conmon-95cc8e71fd82c2fd59e46f679596e135334d41b8304d75122745e607d7e21b30.scope.
Oct  2 08:20:17 np0005465988 podman[270485]: 2025-10-02 12:20:17.425541602 +0000 UTC m=+0.027920764 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:20:17 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:20:17 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a3c4521ddb96806f13a395d9f3b9fbedc654ae92b0d04a9fa87294fd6c354f0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:20:17 np0005465988 podman[270485]: 2025-10-02 12:20:17.553758353 +0000 UTC m=+0.156137585 container init 95cc8e71fd82c2fd59e46f679596e135334d41b8304d75122745e607d7e21b30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.556 2 DEBUG nova.compute.manager [req-27ea990a-c2d1-4ee7-9899-6c60169c6e20 req-316ec21a-330a-4277-a5ef-0333d7754336 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received event network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.556 2 DEBUG oslo_concurrency.lockutils [req-27ea990a-c2d1-4ee7-9899-6c60169c6e20 req-316ec21a-330a-4277-a5ef-0333d7754336 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.557 2 DEBUG oslo_concurrency.lockutils [req-27ea990a-c2d1-4ee7-9899-6c60169c6e20 req-316ec21a-330a-4277-a5ef-0333d7754336 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.557 2 DEBUG oslo_concurrency.lockutils [req-27ea990a-c2d1-4ee7-9899-6c60169c6e20 req-316ec21a-330a-4277-a5ef-0333d7754336 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.558 2 DEBUG nova.compute.manager [req-27ea990a-c2d1-4ee7-9899-6c60169c6e20 req-316ec21a-330a-4277-a5ef-0333d7754336 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Processing event network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.559 2 DEBUG nova.network.neutron [req-caaef0e6-0d0b-4296-8b1d-7c20e6ba4825 req-ffb1246a-1d7f-4a8a-aa78-54fb579f15d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Updated VIF entry in instance network info cache for port 674f6c7a-e53a-448e-9da4-840cd15594d1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.560 2 DEBUG nova.network.neutron [req-caaef0e6-0d0b-4296-8b1d-7c20e6ba4825 req-ffb1246a-1d7f-4a8a-aa78-54fb579f15d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Updating instance_info_cache with network_info: [{"id": "674f6c7a-e53a-448e-9da4-840cd15594d1", "address": "fa:16:3e:f6:bb:49", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674f6c7a-e5", "ovs_interfaceid": "674f6c7a-e53a-448e-9da4-840cd15594d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:17 np0005465988 podman[270485]: 2025-10-02 12:20:17.562836788 +0000 UTC m=+0.165215950 container start 95cc8e71fd82c2fd59e46f679596e135334d41b8304d75122745e607d7e21b30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.587 2 DEBUG oslo_concurrency.lockutils [req-caaef0e6-0d0b-4296-8b1d-7c20e6ba4825 req-ffb1246a-1d7f-4a8a-aa78-54fb579f15d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-6a800c5e-ac3d-4b10-aa80-ce9ebcff614c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:17 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[270500]: [NOTICE]   (270504) : New worker (270506) forked
Oct  2 08:20:17 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[270500]: [NOTICE]   (270504) : Loading success.
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.738 2 DEBUG nova.compute.manager [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.738 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407617.7371914, 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.739 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] VM Started (Lifecycle Event)#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.744 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.748 2 INFO nova.virt.libvirt.driver [-] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Instance spawned successfully.#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.748 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.773 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.779 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.783 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.783 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.784 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.784 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.785 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.785 2 DEBUG nova.virt.libvirt.driver [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.821 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.822 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407617.7417269, 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.822 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.932 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.935 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407617.7436292, 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.935 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.983 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.987 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.990 2 INFO nova.compute.manager [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Took 9.19 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:20:17 np0005465988 nova_compute[236126]: 2025-10-02 12:20:17.990 2 DEBUG nova.compute.manager [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:18 np0005465988 nova_compute[236126]: 2025-10-02 12:20:18.007 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:18 np0005465988 nova_compute[236126]: 2025-10-02 12:20:18.071 2 INFO nova.compute.manager [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Took 10.24 seconds to build instance.#033[00m
Oct  2 08:20:18 np0005465988 nova_compute[236126]: 2025-10-02 12:20:18.091 2 DEBUG oslo_concurrency.lockutils [None req-0642b18c-2428-462c-9d74-6e361a7d696f a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.485s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:18 np0005465988 nova_compute[236126]: 2025-10-02 12:20:18.560 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Successfully updated port: c51fc487-eedd-421d-b8cc-d0a322b4a129 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:20:18 np0005465988 nova_compute[236126]: 2025-10-02 12:20:18.676 2 DEBUG nova.compute.manager [req-f0c8ec8e-37ab-4351-8f54-615442502cea req-e5c5a3de-4abc-4a05-8cc7-c952a5d3f46e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-changed-c51fc487-eedd-421d-b8cc-d0a322b4a129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:18 np0005465988 nova_compute[236126]: 2025-10-02 12:20:18.676 2 DEBUG nova.compute.manager [req-f0c8ec8e-37ab-4351-8f54-615442502cea req-e5c5a3de-4abc-4a05-8cc7-c952a5d3f46e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing instance network info cache due to event network-changed-c51fc487-eedd-421d-b8cc-d0a322b4a129. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:20:18 np0005465988 nova_compute[236126]: 2025-10-02 12:20:18.676 2 DEBUG oslo_concurrency.lockutils [req-f0c8ec8e-37ab-4351-8f54-615442502cea req-e5c5a3de-4abc-4a05-8cc7-c952a5d3f46e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:18 np0005465988 nova_compute[236126]: 2025-10-02 12:20:18.676 2 DEBUG oslo_concurrency.lockutils [req-f0c8ec8e-37ab-4351-8f54-615442502cea req-e5c5a3de-4abc-4a05-8cc7-c952a5d3f46e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:18 np0005465988 nova_compute[236126]: 2025-10-02 12:20:18.677 2 DEBUG nova.network.neutron [req-f0c8ec8e-37ab-4351-8f54-615442502cea req-e5c5a3de-4abc-4a05-8cc7-c952a5d3f46e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing network info cache for port c51fc487-eedd-421d-b8cc-d0a322b4a129 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:19.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:19.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.256 2 DEBUG nova.network.neutron [req-f0c8ec8e-37ab-4351-8f54-615442502cea req-e5c5a3de-4abc-4a05-8cc7-c952a5d3f46e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.692 2 DEBUG nova.compute.manager [req-1c25b126-7e8d-4c57-b915-596c5349764a req-dd24c8ef-e33c-4ca3-a56b-f421e4fc1f58 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received event network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.693 2 DEBUG oslo_concurrency.lockutils [req-1c25b126-7e8d-4c57-b915-596c5349764a req-dd24c8ef-e33c-4ca3-a56b-f421e4fc1f58 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.693 2 DEBUG oslo_concurrency.lockutils [req-1c25b126-7e8d-4c57-b915-596c5349764a req-dd24c8ef-e33c-4ca3-a56b-f421e4fc1f58 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.694 2 DEBUG oslo_concurrency.lockutils [req-1c25b126-7e8d-4c57-b915-596c5349764a req-dd24c8ef-e33c-4ca3-a56b-f421e4fc1f58 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.694 2 DEBUG nova.compute.manager [req-1c25b126-7e8d-4c57-b915-596c5349764a req-dd24c8ef-e33c-4ca3-a56b-f421e4fc1f58 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] No waiting events found dispatching network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.694 2 WARNING nova.compute.manager [req-1c25b126-7e8d-4c57-b915-596c5349764a req-dd24c8ef-e33c-4ca3-a56b-f421e4fc1f58 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received unexpected event network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.704 2 DEBUG nova.network.neutron [req-f0c8ec8e-37ab-4351-8f54-615442502cea req-e5c5a3de-4abc-4a05-8cc7-c952a5d3f46e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.733 2 DEBUG oslo_concurrency.lockutils [req-f0c8ec8e-37ab-4351-8f54-615442502cea req-e5c5a3de-4abc-4a05-8cc7-c952a5d3f46e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.858 2 DEBUG oslo_concurrency.lockutils [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.858 2 DEBUG oslo_concurrency.lockutils [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.859 2 DEBUG oslo_concurrency.lockutils [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.859 2 DEBUG oslo_concurrency.lockutils [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.859 2 DEBUG oslo_concurrency.lockutils [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.860 2 INFO nova.compute.manager [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Terminating instance#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.861 2 DEBUG nova.compute.manager [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:20:19 np0005465988 kernel: tap674f6c7a-e5 (unregistering): left promiscuous mode
Oct  2 08:20:19 np0005465988 NetworkManager[45041]: <info>  [1759407619.9100] device (tap674f6c7a-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:20:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:19Z|00261|binding|INFO|Releasing lport 674f6c7a-e53a-448e-9da4-840cd15594d1 from this chassis (sb_readonly=0)
Oct  2 08:20:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:19Z|00262|binding|INFO|Setting lport 674f6c7a-e53a-448e-9da4-840cd15594d1 down in Southbound
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:19Z|00263|binding|INFO|Removing iface tap674f6c7a-e5 ovn-installed in OVS
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:19.937 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:bb:49 10.100.0.4'], port_security=['fa:16:3e:f6:bb:49 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '6a800c5e-ac3d-4b10-aa80-ce9ebcff614c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7754c79a-cca5-48c7-9169-831eaad23ccc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c2c11ebecb14f3188f35ea473c4ca02', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c0d053f-a096-4f8c-8162-5ef19e29b5d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45b5774e-2213-45dd-ab74-f2a3868d167c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=674f6c7a-e53a-448e-9da4-840cd15594d1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:19.941 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 674f6c7a-e53a-448e-9da4-840cd15594d1 in datapath 7754c79a-cca5-48c7-9169-831eaad23ccc unbound from our chassis#033[00m
Oct  2 08:20:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:19.946 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7754c79a-cca5-48c7-9169-831eaad23ccc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:20:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:19.948 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8548f405-d087-4463-a5c0-5e6875c9cfbc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:19.949 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc namespace which is not needed anymore#033[00m
Oct  2 08:20:19 np0005465988 nova_compute[236126]: 2025-10-02 12:20:19.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005465988 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000054.scope: Deactivated successfully.
Oct  2 08:20:20 np0005465988 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000054.scope: Consumed 3.191s CPU time.
Oct  2 08:20:20 np0005465988 systemd-machined[192594]: Machine qemu-31-instance-00000054 terminated.
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.072 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Successfully updated port: ac2ac5dc-08f6-4faa-9427-e87be9c9d933 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:20:20 np0005465988 kernel: tap674f6c7a-e5: entered promiscuous mode
Oct  2 08:20:20 np0005465988 NetworkManager[45041]: <info>  [1759407620.0859] manager: (tap674f6c7a-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/140)
Oct  2 08:20:20 np0005465988 kernel: tap674f6c7a-e5 (unregistering): left promiscuous mode
Oct  2 08:20:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:20Z|00264|binding|INFO|Claiming lport 674f6c7a-e53a-448e-9da4-840cd15594d1 for this chassis.
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:20Z|00265|binding|INFO|674f6c7a-e53a-448e-9da4-840cd15594d1: Claiming fa:16:3e:f6:bb:49 10.100.0.4
Oct  2 08:20:20 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[270500]: [NOTICE]   (270504) : haproxy version is 2.8.14-c23fe91
Oct  2 08:20:20 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[270500]: [NOTICE]   (270504) : path to executable is /usr/sbin/haproxy
Oct  2 08:20:20 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[270500]: [WARNING]  (270504) : Exiting Master process...
Oct  2 08:20:20 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[270500]: [WARNING]  (270504) : Exiting Master process...
Oct  2 08:20:20 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[270500]: [ALERT]    (270504) : Current worker (270506) exited with code 143 (Terminated)
Oct  2 08:20:20 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[270500]: [WARNING]  (270504) : All workers exited. Exiting... (0)
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.147 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:bb:49 10.100.0.4'], port_security=['fa:16:3e:f6:bb:49 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '6a800c5e-ac3d-4b10-aa80-ce9ebcff614c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7754c79a-cca5-48c7-9169-831eaad23ccc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c2c11ebecb14f3188f35ea473c4ca02', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c0d053f-a096-4f8c-8162-5ef19e29b5d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45b5774e-2213-45dd-ab74-f2a3868d167c, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=674f6c7a-e53a-448e-9da4-840cd15594d1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:20 np0005465988 systemd[1]: libpod-95cc8e71fd82c2fd59e46f679596e135334d41b8304d75122745e607d7e21b30.scope: Deactivated successfully.
Oct  2 08:20:20 np0005465988 podman[270589]: 2025-10-02 12:20:20.152187554 +0000 UTC m=+0.074091718 container died 95cc8e71fd82c2fd59e46f679596e135334d41b8304d75122745e607d7e21b30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.164 2 INFO nova.virt.libvirt.driver [-] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Instance destroyed successfully.#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.164 2 DEBUG nova.objects.instance [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lazy-loading 'resources' on Instance uuid 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:20Z|00266|binding|INFO|Setting lport 674f6c7a-e53a-448e-9da4-840cd15594d1 ovn-installed in OVS
Oct  2 08:20:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:20Z|00267|binding|INFO|Setting lport 674f6c7a-e53a-448e-9da4-840cd15594d1 up in Southbound
Oct  2 08:20:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:20Z|00268|binding|INFO|Releasing lport 674f6c7a-e53a-448e-9da4-840cd15594d1 from this chassis (sb_readonly=1)
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:20Z|00269|if_status|INFO|Dropped 3 log messages in last 310 seconds (most recently, 310 seconds ago) due to excessive rate
Oct  2 08:20:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:20Z|00270|if_status|INFO|Not setting lport 674f6c7a-e53a-448e-9da4-840cd15594d1 down as sb is readonly
Oct  2 08:20:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:20Z|00271|binding|INFO|Removing iface tap674f6c7a-e5 ovn-installed in OVS
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:20Z|00272|binding|INFO|Releasing lport 674f6c7a-e53a-448e-9da4-840cd15594d1 from this chassis (sb_readonly=0)
Oct  2 08:20:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:20Z|00273|binding|INFO|Setting lport 674f6c7a-e53a-448e-9da4-840cd15594d1 down in Southbound
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.178 2 DEBUG nova.virt.libvirt.vif [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-649338514',display_name='tempest-DeleteServersTestJSON-server-649338514',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-649338514',id=84,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:20:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1c2c11ebecb14f3188f35ea473c4ca02',ramdisk_id='',reservation_id='r-ag3d9qch',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-1602490521',owner_user_name='tempest-DeleteServersTestJSON-1602490521-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:20:18Z,user_data=None,user_id='a9f7faffac7240869a0196df1ddda7e5',uuid=6a800c5e-ac3d-4b10-aa80-ce9ebcff614c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "674f6c7a-e53a-448e-9da4-840cd15594d1", "address": "fa:16:3e:f6:bb:49", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674f6c7a-e5", "ovs_interfaceid": "674f6c7a-e53a-448e-9da4-840cd15594d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.179 2 DEBUG nova.network.os_vif_util [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Converting VIF {"id": "674f6c7a-e53a-448e-9da4-840cd15594d1", "address": "fa:16:3e:f6:bb:49", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674f6c7a-e5", "ovs_interfaceid": "674f6c7a-e53a-448e-9da4-840cd15594d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.179 2 DEBUG nova.network.os_vif_util [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:bb:49,bridge_name='br-int',has_traffic_filtering=True,id=674f6c7a-e53a-448e-9da4-840cd15594d1,network=Network(7754c79a-cca5-48c7-9169-831eaad23ccc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap674f6c7a-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.180 2 DEBUG os_vif [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:bb:49,bridge_name='br-int',has_traffic_filtering=True,id=674f6c7a-e53a-448e-9da4-840cd15594d1,network=Network(7754c79a-cca5-48c7-9169-831eaad23ccc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap674f6c7a-e5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.181 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap674f6c7a-e5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.188 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:bb:49 10.100.0.4'], port_security=['fa:16:3e:f6:bb:49 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '6a800c5e-ac3d-4b10-aa80-ce9ebcff614c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7754c79a-cca5-48c7-9169-831eaad23ccc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c2c11ebecb14f3188f35ea473c4ca02', 'neutron:revision_number': '5', 'neutron:security_group_ids': '3c0d053f-a096-4f8c-8162-5ef19e29b5d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45b5774e-2213-45dd-ab74-f2a3868d167c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=674f6c7a-e53a-448e-9da4-840cd15594d1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.191 2 INFO os_vif [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:bb:49,bridge_name='br-int',has_traffic_filtering=True,id=674f6c7a-e53a-448e-9da4-840cd15594d1,network=Network(7754c79a-cca5-48c7-9169-831eaad23ccc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap674f6c7a-e5')#033[00m
Oct  2 08:20:20 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-95cc8e71fd82c2fd59e46f679596e135334d41b8304d75122745e607d7e21b30-userdata-shm.mount: Deactivated successfully.
Oct  2 08:20:20 np0005465988 systemd[1]: var-lib-containers-storage-overlay-9a3c4521ddb96806f13a395d9f3b9fbedc654ae92b0d04a9fa87294fd6c354f0-merged.mount: Deactivated successfully.
Oct  2 08:20:20 np0005465988 podman[270589]: 2025-10-02 12:20:20.216795804 +0000 UTC m=+0.138699978 container cleanup 95cc8e71fd82c2fd59e46f679596e135334d41b8304d75122745e607d7e21b30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:20:20 np0005465988 systemd[1]: libpod-conmon-95cc8e71fd82c2fd59e46f679596e135334d41b8304d75122745e607d7e21b30.scope: Deactivated successfully.
Oct  2 08:20:20 np0005465988 podman[270634]: 2025-10-02 12:20:20.301429638 +0000 UTC m=+0.050902583 container remove 95cc8e71fd82c2fd59e46f679596e135334d41b8304d75122745e607d7e21b30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.308 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f496019e-9f52-4928-b3bf-2ceb8a3f84e4]: (4, ('Thu Oct  2 12:20:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc (95cc8e71fd82c2fd59e46f679596e135334d41b8304d75122745e607d7e21b30)\n95cc8e71fd82c2fd59e46f679596e135334d41b8304d75122745e607d7e21b30\nThu Oct  2 12:20:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc (95cc8e71fd82c2fd59e46f679596e135334d41b8304d75122745e607d7e21b30)\n95cc8e71fd82c2fd59e46f679596e135334d41b8304d75122745e607d7e21b30\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.311 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cf8d8813-5657-4199-923b-0ff6eaf427be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.312 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7754c79a-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005465988 kernel: tap7754c79a-c0: left promiscuous mode
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.334 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[da9857e2-fed8-4b06-9e73-83e7bc52a002]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.362 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9513108f-6834-4a59-81bd-76a5936269a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.364 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[284b4b9e-a946-4bc8-aeab-ce61e7dafc1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.388 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b650fa33-1b76-417a-a5ff-5d79c0787cbf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567709, 'reachable_time': 18150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270652, 'error': None, 'target': 'ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005465988 systemd[1]: run-netns-ovnmeta\x2d7754c79a\x2dcca5\x2d48c7\x2d9169\x2d831eaad23ccc.mount: Deactivated successfully.
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.394 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.394 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[0ce77b58-9561-4642-8614-9aceca8e5a12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.396 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 674f6c7a-e53a-448e-9da4-840cd15594d1 in datapath 7754c79a-cca5-48c7-9169-831eaad23ccc unbound from our chassis#033[00m
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.398 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7754c79a-cca5-48c7-9169-831eaad23ccc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.399 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4b58e6d1-d7f0-4440-821d-2ab7b9640486]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.399 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 674f6c7a-e53a-448e-9da4-840cd15594d1 in datapath 7754c79a-cca5-48c7-9169-831eaad23ccc unbound from our chassis#033[00m
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.401 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7754c79a-cca5-48c7-9169-831eaad23ccc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:20:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:20.402 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[01dbc620-1e32-4ea5-991e-10847a57675d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.765 2 DEBUG nova.compute.manager [req-aa93f5f3-d8e1-4de6-9177-5603bfa79f91 req-dd923124-ec71-451b-b435-4441d9373298 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-changed-ac2ac5dc-08f6-4faa-9427-e87be9c9d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.766 2 DEBUG nova.compute.manager [req-aa93f5f3-d8e1-4de6-9177-5603bfa79f91 req-dd923124-ec71-451b-b435-4441d9373298 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing instance network info cache due to event network-changed-ac2ac5dc-08f6-4faa-9427-e87be9c9d933. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.766 2 DEBUG oslo_concurrency.lockutils [req-aa93f5f3-d8e1-4de6-9177-5603bfa79f91 req-dd923124-ec71-451b-b435-4441d9373298 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.766 2 DEBUG oslo_concurrency.lockutils [req-aa93f5f3-d8e1-4de6-9177-5603bfa79f91 req-dd923124-ec71-451b-b435-4441d9373298 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.767 2 DEBUG nova.network.neutron [req-aa93f5f3-d8e1-4de6-9177-5603bfa79f91 req-dd923124-ec71-451b-b435-4441d9373298 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing network info cache for port ac2ac5dc-08f6-4faa-9427-e87be9c9d933 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.825 2 INFO nova.virt.libvirt.driver [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Deleting instance files /var/lib/nova/instances/6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_del#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.826 2 INFO nova.virt.libvirt.driver [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Deletion of /var/lib/nova/instances/6a800c5e-ac3d-4b10-aa80-ce9ebcff614c_del complete#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.892 2 INFO nova.compute.manager [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Took 1.03 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.893 2 DEBUG oslo.service.loopingcall [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.893 2 DEBUG nova.compute.manager [-] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:20:20 np0005465988 nova_compute[236126]: 2025-10-02 12:20:20.894 2 DEBUG nova.network.neutron [-] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.049 2 DEBUG nova.network.neutron [req-aa93f5f3-d8e1-4de6-9177-5603bfa79f91 req-dd923124-ec71-451b-b435-4441d9373298 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.077 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Successfully updated port: 0170ec24-0bde-4eb7-b349-a8f304853e0d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:20:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000059s ======
Oct  2 08:20:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:21.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000059s
Oct  2 08:20:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:21.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.464 2 DEBUG nova.network.neutron [req-aa93f5f3-d8e1-4de6-9177-5603bfa79f91 req-dd923124-ec71-451b-b435-4441d9373298 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.483 2 DEBUG oslo_concurrency.lockutils [req-aa93f5f3-d8e1-4de6-9177-5603bfa79f91 req-dd923124-ec71-451b-b435-4441d9373298 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.759 2 DEBUG nova.network.neutron [-] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.785 2 INFO nova.compute.manager [-] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Took 0.89 seconds to deallocate network for instance.#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.792 2 DEBUG nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received event network-vif-unplugged-674f6c7a-e53a-448e-9da4-840cd15594d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.792 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.792 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.793 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.793 2 DEBUG nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] No waiting events found dispatching network-vif-unplugged-674f6c7a-e53a-448e-9da4-840cd15594d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.793 2 DEBUG nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received event network-vif-unplugged-674f6c7a-e53a-448e-9da4-840cd15594d1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.793 2 DEBUG nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received event network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.793 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.794 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.794 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.794 2 DEBUG nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] No waiting events found dispatching network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.794 2 WARNING nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received unexpected event network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.794 2 DEBUG nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received event network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.795 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.795 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.795 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.795 2 DEBUG nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] No waiting events found dispatching network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.795 2 WARNING nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received unexpected event network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.796 2 DEBUG nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received event network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.796 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.796 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.796 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.796 2 DEBUG nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] No waiting events found dispatching network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.797 2 WARNING nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received unexpected event network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.797 2 DEBUG nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received event network-vif-unplugged-674f6c7a-e53a-448e-9da4-840cd15594d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.797 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.797 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.797 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.797 2 DEBUG nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] No waiting events found dispatching network-vif-unplugged-674f6c7a-e53a-448e-9da4-840cd15594d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.798 2 DEBUG nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received event network-vif-unplugged-674f6c7a-e53a-448e-9da4-840cd15594d1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.798 2 DEBUG nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received event network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.798 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.798 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.798 2 DEBUG oslo_concurrency.lockutils [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.799 2 DEBUG nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] No waiting events found dispatching network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.799 2 WARNING nova.compute.manager [req-5e5ec96c-7ce7-46b1-8dbd-086ecf0c42b7 req-9078c477-4660-4f6f-8567-4c9e49d6aa0b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received unexpected event network-vif-plugged-674f6c7a-e53a-448e-9da4-840cd15594d1 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.847 2 DEBUG oslo_concurrency.lockutils [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.847 2 DEBUG oslo_concurrency.lockutils [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:21 np0005465988 nova_compute[236126]: 2025-10-02 12:20:21.954 2 DEBUG oslo_concurrency.processutils [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:22 np0005465988 nova_compute[236126]: 2025-10-02 12:20:22.082 2 DEBUG nova.compute.manager [req-c4d485f4-7a23-4f63-b641-68ea2c1c3572 req-2f5f7ca6-21fb-4878-b8be-69d814d52943 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Received event network-vif-deleted-674f6c7a-e53a-448e-9da4-840cd15594d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:22 np0005465988 nova_compute[236126]: 2025-10-02 12:20:22.106 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Successfully updated port: c5debb4f-7f40-48f2-afb5-efa11af6cc4c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:20:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:20:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:20:22 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2938752026' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:20:22 np0005465988 nova_compute[236126]: 2025-10-02 12:20:22.395 2 DEBUG oslo_concurrency.processutils [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:22 np0005465988 nova_compute[236126]: 2025-10-02 12:20:22.402 2 DEBUG nova.compute.provider_tree [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:22 np0005465988 nova_compute[236126]: 2025-10-02 12:20:22.424 2 DEBUG nova.scheduler.client.report [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:22 np0005465988 nova_compute[236126]: 2025-10-02 12:20:22.448 2 DEBUG oslo_concurrency.lockutils [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:22 np0005465988 nova_compute[236126]: 2025-10-02 12:20:22.487 2 INFO nova.scheduler.client.report [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Deleted allocations for instance 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c#033[00m
Oct  2 08:20:22 np0005465988 nova_compute[236126]: 2025-10-02 12:20:22.573 2 DEBUG oslo_concurrency.lockutils [None req-c76a857d-9313-41a8-bf0a-51fd4f5cec6e a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "6a800c5e-ac3d-4b10-aa80-ce9ebcff614c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.714s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:22 np0005465988 nova_compute[236126]: 2025-10-02 12:20:22.889 2 DEBUG nova.compute.manager [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-changed-0170ec24-0bde-4eb7-b349-a8f304853e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:22 np0005465988 nova_compute[236126]: 2025-10-02 12:20:22.889 2 DEBUG nova.compute.manager [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing instance network info cache due to event network-changed-0170ec24-0bde-4eb7-b349-a8f304853e0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:20:22 np0005465988 nova_compute[236126]: 2025-10-02 12:20:22.889 2 DEBUG oslo_concurrency.lockutils [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:22 np0005465988 nova_compute[236126]: 2025-10-02 12:20:22.890 2 DEBUG oslo_concurrency.lockutils [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:22 np0005465988 nova_compute[236126]: 2025-10-02 12:20:22.890 2 DEBUG nova.network.neutron [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing network info cache for port 0170ec24-0bde-4eb7-b349-a8f304853e0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:22 np0005465988 nova_compute[236126]: 2025-10-02 12:20:22.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:20:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:23.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:20:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:23.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:23 np0005465988 nova_compute[236126]: 2025-10-02 12:20:23.204 2 DEBUG nova.network.neutron [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:20:23 np0005465988 nova_compute[236126]: 2025-10-02 12:20:23.543 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Successfully updated port: fd3698a6-a68d-42ed-b217-f5bdc4163195 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:20:23 np0005465988 nova_compute[236126]: 2025-10-02 12:20:23.642 2 DEBUG nova.network.neutron [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:23 np0005465988 nova_compute[236126]: 2025-10-02 12:20:23.658 2 DEBUG oslo_concurrency.lockutils [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:23 np0005465988 nova_compute[236126]: 2025-10-02 12:20:23.658 2 DEBUG nova.compute.manager [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-changed-c5debb4f-7f40-48f2-afb5-efa11af6cc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:23 np0005465988 nova_compute[236126]: 2025-10-02 12:20:23.659 2 DEBUG nova.compute.manager [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing instance network info cache due to event network-changed-c5debb4f-7f40-48f2-afb5-efa11af6cc4c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:20:23 np0005465988 nova_compute[236126]: 2025-10-02 12:20:23.659 2 DEBUG oslo_concurrency.lockutils [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:23 np0005465988 nova_compute[236126]: 2025-10-02 12:20:23.659 2 DEBUG oslo_concurrency.lockutils [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:23 np0005465988 nova_compute[236126]: 2025-10-02 12:20:23.659 2 DEBUG nova.network.neutron [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing network info cache for port c5debb4f-7f40-48f2-afb5-efa11af6cc4c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:23 np0005465988 nova_compute[236126]: 2025-10-02 12:20:23.813 2 DEBUG nova.network.neutron [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:20:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e254 e254: 3 total, 3 up, 3 in
Oct  2 08:20:24 np0005465988 nova_compute[236126]: 2025-10-02 12:20:24.216 2 DEBUG nova.network.neutron [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:24 np0005465988 nova_compute[236126]: 2025-10-02 12:20:24.230 2 DEBUG oslo_concurrency.lockutils [req-eb927336-ee98-47cc-bfda-a2ef9829143b req-74b28a01-dd39-4b6f-947e-fbdeabbcb2b8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:25 np0005465988 nova_compute[236126]: 2025-10-02 12:20:25.013 2 DEBUG nova.compute.manager [req-fc8c0c1e-5429-4c7a-9176-f958b455ff2b req-90cedc40-3164-44fa-9362-c0a5a17d237c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-changed-fd3698a6-a68d-42ed-b217-f5bdc4163195 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:25 np0005465988 nova_compute[236126]: 2025-10-02 12:20:25.014 2 DEBUG nova.compute.manager [req-fc8c0c1e-5429-4c7a-9176-f958b455ff2b req-90cedc40-3164-44fa-9362-c0a5a17d237c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing instance network info cache due to event network-changed-fd3698a6-a68d-42ed-b217-f5bdc4163195. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:20:25 np0005465988 nova_compute[236126]: 2025-10-02 12:20:25.015 2 DEBUG oslo_concurrency.lockutils [req-fc8c0c1e-5429-4c7a-9176-f958b455ff2b req-90cedc40-3164-44fa-9362-c0a5a17d237c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:25 np0005465988 nova_compute[236126]: 2025-10-02 12:20:25.015 2 DEBUG oslo_concurrency.lockutils [req-fc8c0c1e-5429-4c7a-9176-f958b455ff2b req-90cedc40-3164-44fa-9362-c0a5a17d237c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:25 np0005465988 nova_compute[236126]: 2025-10-02 12:20:25.016 2 DEBUG nova.network.neutron [req-fc8c0c1e-5429-4c7a-9176-f958b455ff2b req-90cedc40-3164-44fa-9362-c0a5a17d237c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing network info cache for port fd3698a6-a68d-42ed-b217-f5bdc4163195 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:25.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:25.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:25 np0005465988 nova_compute[236126]: 2025-10-02 12:20:25.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:25 np0005465988 nova_compute[236126]: 2025-10-02 12:20:25.210 2 DEBUG nova.network.neutron [req-fc8c0c1e-5429-4c7a-9176-f958b455ff2b req-90cedc40-3164-44fa-9362-c0a5a17d237c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:20:25 np0005465988 nova_compute[236126]: 2025-10-02 12:20:25.279 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Successfully updated port: f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:20:25 np0005465988 nova_compute[236126]: 2025-10-02 12:20:25.606 2 DEBUG nova.network.neutron [req-fc8c0c1e-5429-4c7a-9176-f958b455ff2b req-90cedc40-3164-44fa-9362-c0a5a17d237c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:25 np0005465988 nova_compute[236126]: 2025-10-02 12:20:25.633 2 DEBUG oslo_concurrency.lockutils [req-fc8c0c1e-5429-4c7a-9176-f958b455ff2b req-90cedc40-3164-44fa-9362-c0a5a17d237c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.045 2 DEBUG oslo_concurrency.lockutils [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "e116d367-5ae9-4ce2-9d33-3936fd3de658" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.046 2 DEBUG oslo_concurrency.lockutils [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.046 2 DEBUG oslo_concurrency.lockutils [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.046 2 DEBUG oslo_concurrency.lockutils [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.047 2 DEBUG oslo_concurrency.lockutils [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.048 2 INFO nova.compute.manager [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Terminating instance#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.049 2 DEBUG nova.compute.manager [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:20:26 np0005465988 kernel: tap0b38303d-0e (unregistering): left promiscuous mode
Oct  2 08:20:26 np0005465988 NetworkManager[45041]: <info>  [1759407626.1097] device (tap0b38303d-0e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:20:26 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:26Z|00274|binding|INFO|Releasing lport 0b38303d-0e2e-47e2-84f1-d431f795968b from this chassis (sb_readonly=0)
Oct  2 08:20:26 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:26Z|00275|binding|INFO|Setting lport 0b38303d-0e2e-47e2-84f1-d431f795968b down in Southbound
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:26 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:26Z|00276|binding|INFO|Removing iface tap0b38303d-0e ovn-installed in OVS
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:26.146 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:fc:3e 10.100.0.3'], port_security=['fa:16:3e:62:fc:3e 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e116d367-5ae9-4ce2-9d33-3936fd3de658', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8efba404696b40fbbaa6431b934b87f1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3d4d8d91-6fd2-4ab6-a30c-6640fa44e7f5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a5d3722e-d182-43fd-9a86-fa7ed68becec, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=0b38303d-0e2e-47e2-84f1-d431f795968b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:26.150 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 0b38303d-0e2e-47e2-84f1-d431f795968b in datapath f1725bd8-7d9d-45cc-b992-0cd3db0e30f0 unbound from our chassis#033[00m
Oct  2 08:20:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:26.154 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f1725bd8-7d9d-45cc-b992-0cd3db0e30f0#033[00m
Oct  2 08:20:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:26.184 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[60ab1b9b-2b95-4901-8611-8bdc837bf57e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.188 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Successfully updated port: f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:20:26 np0005465988 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000004f.scope: Deactivated successfully.
Oct  2 08:20:26 np0005465988 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000004f.scope: Consumed 17.238s CPU time.
Oct  2 08:20:26 np0005465988 systemd-machined[192594]: Machine qemu-30-instance-0000004f terminated.
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.209 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquiring lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.209 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquired lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.209 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:20:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:26.225 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[036ccdcf-e56f-4190-844d-d698df5c0dcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:26.229 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[a98ebc68-964f-4b93-b390-b153d2a42e44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:26.262 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d13c0f11-b28a-4fab-9321-dd8c47db3baa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.323 2 INFO nova.virt.libvirt.driver [-] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Instance destroyed successfully.#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.324 2 DEBUG nova.objects.instance [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lazy-loading 'resources' on Instance uuid e116d367-5ae9-4ce2-9d33-3936fd3de658 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:26.325 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f03d5d4a-b8ee-4f0a-9910-a3ff96032741]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf1725bd8-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:76:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547580, 'reachable_time': 19051, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270692, 'error': None, 'target': 'ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.349 2 DEBUG nova.virt.libvirt.vif [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-818842927',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-818842927',id=79,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:19:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8efba404696b40fbbaa6431b934b87f1',ramdisk_id='',reservation_id='r-t7p63nhq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min
_ram='0',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-153154373',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-153154373-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:19:10Z,user_data=None,user_id='69d8e29c6d3747e98a5985a584f4c814',uuid=e116d367-5ae9-4ce2-9d33-3936fd3de658,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b38303d-0e2e-47e2-84f1-d431f795968b", "address": "fa:16:3e:62:fc:3e", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b38303d-0e", "ovs_interfaceid": "0b38303d-0e2e-47e2-84f1-d431f795968b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.349 2 DEBUG nova.network.os_vif_util [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Converting VIF {"id": "0b38303d-0e2e-47e2-84f1-d431f795968b", "address": "fa:16:3e:62:fc:3e", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b38303d-0e", "ovs_interfaceid": "0b38303d-0e2e-47e2-84f1-d431f795968b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.350 2 DEBUG nova.network.os_vif_util [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:fc:3e,bridge_name='br-int',has_traffic_filtering=True,id=0b38303d-0e2e-47e2-84f1-d431f795968b,network=Network(f1725bd8-7d9d-45cc-b992-0cd3db0e30f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b38303d-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.350 2 DEBUG os_vif [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:fc:3e,bridge_name='br-int',has_traffic_filtering=True,id=0b38303d-0e2e-47e2-84f1-d431f795968b,network=Network(f1725bd8-7d9d-45cc-b992-0cd3db0e30f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b38303d-0e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:20:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:26.350 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1a9698ae-336e-4eb4-b391-609993ae383f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf1725bd8-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547593, 'tstamp': 547593}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270699, 'error': None, 'target': 'ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf1725bd8-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547596, 'tstamp': 547596}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270699, 'error': None, 'target': 'ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:26.352 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1725bd8-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.353 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b38303d-0e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:26.356 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf1725bd8-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:26.356 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:26.356 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf1725bd8-70, col_values=(('external_ids', {'iface-id': '421cd6e3-75aa-44e1-b552-d119c4fcd629'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:26.356 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.361 2 INFO os_vif [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:fc:3e,bridge_name='br-int',has_traffic_filtering=True,id=0b38303d-0e2e-47e2-84f1-d431f795968b,network=Network(f1725bd8-7d9d-45cc-b992-0cd3db0e30f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b38303d-0e')#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.395 2 DEBUG nova.compute.manager [req-b963b2b3-35c6-4054-9263-5135914430f6 req-23ed3e23-f985-4f64-93cc-6173adecee3e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Received event network-vif-unplugged-0b38303d-0e2e-47e2-84f1-d431f795968b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.395 2 DEBUG oslo_concurrency.lockutils [req-b963b2b3-35c6-4054-9263-5135914430f6 req-23ed3e23-f985-4f64-93cc-6173adecee3e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.396 2 DEBUG oslo_concurrency.lockutils [req-b963b2b3-35c6-4054-9263-5135914430f6 req-23ed3e23-f985-4f64-93cc-6173adecee3e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.396 2 DEBUG oslo_concurrency.lockutils [req-b963b2b3-35c6-4054-9263-5135914430f6 req-23ed3e23-f985-4f64-93cc-6173adecee3e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.396 2 DEBUG nova.compute.manager [req-b963b2b3-35c6-4054-9263-5135914430f6 req-23ed3e23-f985-4f64-93cc-6173adecee3e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] No waiting events found dispatching network-vif-unplugged-0b38303d-0e2e-47e2-84f1-d431f795968b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.397 2 DEBUG nova.compute.manager [req-b963b2b3-35c6-4054-9263-5135914430f6 req-23ed3e23-f985-4f64-93cc-6173adecee3e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Received event network-vif-unplugged-0b38303d-0e2e-47e2-84f1-d431f795968b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.401 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.943 2 INFO nova.virt.libvirt.driver [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Deleting instance files /var/lib/nova/instances/e116d367-5ae9-4ce2-9d33-3936fd3de658_del#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.944 2 INFO nova.virt.libvirt.driver [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Deletion of /var/lib/nova/instances/e116d367-5ae9-4ce2-9d33-3936fd3de658_del complete#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.993 2 INFO nova.compute.manager [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Took 0.94 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.993 2 DEBUG oslo.service.loopingcall [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.994 2 DEBUG nova.compute.manager [-] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:20:26 np0005465988 nova_compute[236126]: 2025-10-02 12:20:26.994 2 DEBUG nova.network.neutron [-] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:20:27 np0005465988 nova_compute[236126]: 2025-10-02 12:20:27.111 2 DEBUG nova.compute.manager [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-changed-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:27 np0005465988 nova_compute[236126]: 2025-10-02 12:20:27.112 2 DEBUG nova.compute.manager [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing instance network info cache due to event network-changed-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:20:27 np0005465988 nova_compute[236126]: 2025-10-02 12:20:27.112 2 DEBUG oslo_concurrency.lockutils [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:27.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:27.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:20:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:27.348 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:27.349 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:27.349 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:27 np0005465988 nova_compute[236126]: 2025-10-02 12:20:27.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:27 np0005465988 nova_compute[236126]: 2025-10-02 12:20:27.987 2 DEBUG nova.network.neutron [-] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.044 2 INFO nova.compute.manager [-] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Took 1.05 seconds to deallocate network for instance.#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.099 2 DEBUG oslo_concurrency.lockutils [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.100 2 DEBUG oslo_concurrency.lockutils [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.118 2 DEBUG nova.compute.manager [req-28d888fd-ae5d-425c-b6ee-825abf5be833 req-77cb8869-89da-48df-9e4b-aaf7affbb767 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Received event network-vif-deleted-0b38303d-0e2e-47e2-84f1-d431f795968b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.216 2 DEBUG oslo_concurrency.processutils [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.562 2 DEBUG nova.compute.manager [req-b9f450e8-5a84-484a-8042-ae6701ef3efe req-cd560b91-ac62-4f8d-870c-4595da21ad83 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Received event network-vif-plugged-0b38303d-0e2e-47e2-84f1-d431f795968b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.563 2 DEBUG oslo_concurrency.lockutils [req-b9f450e8-5a84-484a-8042-ae6701ef3efe req-cd560b91-ac62-4f8d-870c-4595da21ad83 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.563 2 DEBUG oslo_concurrency.lockutils [req-b9f450e8-5a84-484a-8042-ae6701ef3efe req-cd560b91-ac62-4f8d-870c-4595da21ad83 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.564 2 DEBUG oslo_concurrency.lockutils [req-b9f450e8-5a84-484a-8042-ae6701ef3efe req-cd560b91-ac62-4f8d-870c-4595da21ad83 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.564 2 DEBUG nova.compute.manager [req-b9f450e8-5a84-484a-8042-ae6701ef3efe req-cd560b91-ac62-4f8d-870c-4595da21ad83 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] No waiting events found dispatching network-vif-plugged-0b38303d-0e2e-47e2-84f1-d431f795968b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.564 2 WARNING nova.compute.manager [req-b9f450e8-5a84-484a-8042-ae6701ef3efe req-cd560b91-ac62-4f8d-870c-4595da21ad83 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Received unexpected event network-vif-plugged-0b38303d-0e2e-47e2-84f1-d431f795968b for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:20:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:20:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/25203635' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.711 2 DEBUG oslo_concurrency.processutils [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.720 2 DEBUG nova.compute.provider_tree [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.738 2 DEBUG nova.scheduler.client.report [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.763 2 DEBUG oslo_concurrency.lockutils [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.791 2 INFO nova.scheduler.client.report [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Deleted allocations for instance e116d367-5ae9-4ce2-9d33-3936fd3de658#033[00m
Oct  2 08:20:28 np0005465988 nova_compute[236126]: 2025-10-02 12:20:28.854 2 DEBUG oslo_concurrency.lockutils [None req-cb6a1c6d-dad4-454e-b819-328da41a0d90 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "e116d367-5ae9-4ce2-9d33-3936fd3de658" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.808s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:29.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:29.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:31.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:31.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:31 np0005465988 nova_compute[236126]: 2025-10-02 12:20:31.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:20:32 np0005465988 nova_compute[236126]: 2025-10-02 12:20:32.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:33.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:33.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.634 2 DEBUG nova.network.neutron [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating instance_info_cache with network_info: [{"id": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "address": "fa:16:3e:95:6f:4c", "network": {"id": "082f75aa-3cb3-4aac-903c-8187fdb62a93", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-550185625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51fc487-ee", "ovs_interfaceid": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "address": "fa:16:3e:75:0f:50", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5debb4f-7f", "ovs_interfaceid": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "address": "fa:16:3e:20:95:32", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0c51ef0-db", "ovs_interfaceid": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.681 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Releasing lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.682 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Instance network_info: |[{"id": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "address": "fa:16:3e:95:6f:4c", "network": {"id": "082f75aa-3cb3-4aac-903c-8187fdb62a93", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-550185625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51fc487-ee", "ovs_interfaceid": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "address": "fa:16:3e:75:0f:50", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5debb4f-7f", "ovs_interfaceid": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "address": "fa:16:3e:20:95:32", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0c51ef0-db", "ovs_interfaceid": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.684 2 DEBUG oslo_concurrency.lockutils [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.684 2 DEBUG nova.network.neutron [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing network info cache for port f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.695 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Start _get_guest_xml network_info=[{"id": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "address": "fa:16:3e:95:6f:4c", "network": {"id": "082f75aa-3cb3-4aac-903c-8187fdb62a93", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-550185625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51fc487-ee", "ovs_interfaceid": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "address": "fa:16:3e:75:0f:50", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5debb4f-7f", "ovs_interfaceid": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "address": "fa:16:3e:20:95:32", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0c51ef0-db", "ovs_interfaceid": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk', 'boot_index': '2'}, '/dev/vdc': {'bus': 'virtio', 'dev': 'vdc', 'type': 'disk', 'boot_index': '3'}, 
'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T
Oct  2 08:20:33 np0005465988 nova_compute[236126]: min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '67d5bc9c-c6b3-4b48-8f9e-e6ca9555744d', 'disk_bus': 'virtio', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-376f4726-152b-47c1-8d56-d366a7938d77', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '376f4726-152b-47c1-8d56-d366a7938d77', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'attached_at': '', 'detached_at': '', 'volume_id': '376f4726-152b-47c1-8d56-d366a7938d77', 'serial': '376f4726-152b-47c1-8d56-d366a7938d77'}, 'volume_type': None}, {'device_type': 'disk', 'boot_index': 1, 'mount_device': '/dev/vdb', 'attachment_id': '1d4eb9f7-4a35-4822-9150-ca4533c919eb', 'disk_bus': 'virtio', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-df389b61-10b1-4346-9fa3-6333d5b9d5cf', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'df389b61-10b1-4346-9fa3-6333d5b9d5cf', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 
'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'attached_at': '', 'detached_at': '', 'volume_id': 'df389b61-10b1-4346-9fa3-6333d5b9d5cf', 'serial': 'df389b61-10b1-4346-9fa3-6333d5b9d5cf'}, 'volume_type': None}, {'device_type': 'disk', 'boot_index': 2, 'mount_device': '/dev/vdc', 'attachment_id': 'bca78dd3-e5b2-45fd-a77e-e3e4d05f19e3', 'disk_bus': 'virtio', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-7a48c418-78a2-476b-af7b-983febde623b', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '7a48c418-78a2-476b-af7b-983febde623b', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'attached_at': '', 'detached_at': '', 'volume_id': '7a48c418-78a2-476b-af7b-983febde623b', 'serial': '7a48c418-78a2-476b-af7b-983febde623b'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.703 2 WARNING nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.707 2 DEBUG nova.virt.libvirt.host [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.708 2 DEBUG nova.virt.libvirt.host [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.711 2 DEBUG nova.virt.libvirt.host [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.712 2 DEBUG nova.virt.libvirt.host [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.714 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.714 2 DEBUG nova.virt.hardware [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.714 2 DEBUG nova.virt.hardware [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.714 2 DEBUG nova.virt.hardware [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.715 2 DEBUG nova.virt.hardware [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.715 2 DEBUG nova.virt.hardware [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.715 2 DEBUG nova.virt.hardware [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.715 2 DEBUG nova.virt.hardware [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.715 2 DEBUG nova.virt.hardware [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.716 2 DEBUG nova.virt.hardware [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.716 2 DEBUG nova.virt.hardware [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.716 2 DEBUG nova.virt.hardware [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.745 2 DEBUG nova.storage.rbd_utils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] rbd image 969ba235-be4a-44e1-a6f2-7c5922b9661e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:20:33 np0005465988 nova_compute[236126]: 2025-10-02 12:20:33.750 2 DEBUG oslo_concurrency.processutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e255 e255: 3 total, 3 up, 3 in
Oct  2 08:20:33 np0005465988 rsyslogd[1008]: message too long (8192) with configured size 8096, begin of message is: 2025-10-02 12:20:33.695 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7 [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  2 08:20:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:20:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1608713711' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.230 2 DEBUG oslo_concurrency.processutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.314 2 DEBUG nova.virt.libvirt.vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "address": "fa:16:3e:95:6f:4c", "network": {"id": "082f75aa-3cb3-4aac-903c-8187fdb62a93", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-550185625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51fc487-ee", "ovs_interfaceid": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.315 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "address": "fa:16:3e:95:6f:4c", "network": {"id": "082f75aa-3cb3-4aac-903c-8187fdb62a93", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-550185625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51fc487-ee", "ovs_interfaceid": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.316 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:6f:4c,bridge_name='br-int',has_traffic_filtering=True,id=c51fc487-eedd-421d-b8cc-d0a322b4a129,network=Network(082f75aa-3cb3-4aac-903c-8187fdb62a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc51fc487-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.318 2 DEBUG nova.virt.libvirt.vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.319 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.320 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:bc:e6,bridge_name='br-int',has_traffic_filtering=True,id=ac2ac5dc-08f6-4faa-9427-e87be9c9d933,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapac2ac5dc-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.321 2 DEBUG nova.virt.libvirt.vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.322 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.324 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:5f:d8,bridge_name='br-int',has_traffic_filtering=True,id=0170ec24-0bde-4eb7-b349-a8f304853e0d,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0170ec24-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.326 2 DEBUG nova.virt.libvirt.vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "address": "fa:16:3e:75:0f:50", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5debb4f-7f", "ovs_interfaceid": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.326 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "address": "fa:16:3e:75:0f:50", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5debb4f-7f", "ovs_interfaceid": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.327 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:75:0f:50,bridge_name='br-int',has_traffic_filtering=True,id=c5debb4f-7f40-48f2-afb5-efa11af6cc4c,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5debb4f-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.329 2 DEBUG nova.virt.libvirt.vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.329 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.330 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:ab:4e,bridge_name='br-int',has_traffic_filtering=True,id=fd3698a6-a68d-42ed-b217-f5bdc4163195,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd3698a6-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.332 2 DEBUG nova.virt.libvirt.vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',imag
e_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.332 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.333 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:9b:57,bridge_name='br-int',has_traffic_filtering=True,id=f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b,network=Network(3f8de4fc-b69f-4cee-bee5-9f9b5275933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf89c24d0-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.334 2 DEBUG nova.virt.libvirt.vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',imag
e_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "address": "fa:16:3e:20:95:32", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0c51ef0-db", "ovs_interfaceid": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.334 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "address": "fa:16:3e:20:95:32", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0c51ef0-db", "ovs_interfaceid": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.335 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:95:32,bridge_name='br-int',has_traffic_filtering=True,id=f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6,network=Network(3f8de4fc-b69f-4cee-bee5-9f9b5275933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0c51ef0-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.336 2 DEBUG nova.objects.instance [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 969ba235-be4a-44e1-a6f2-7c5922b9661e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.362 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  <uuid>969ba235-be4a-44e1-a6f2-7c5922b9661e</uuid>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  <name>instance-00000053</name>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <nova:name>tempest-device-tagging-server-1607195983</nova:name>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:20:33</nova:creationTime>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <nova:user uuid="2a7f7518ce70488fb4f63af1a3bef131">tempest-TaggedBootDevicesTest_v242-87022127-project-member</nova:user>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <nova:project uuid="84f71f6076f7425db7653ac203257df0">tempest-TaggedBootDevicesTest_v242-87022127</nova:project>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <nova:port uuid="c51fc487-eedd-421d-b8cc-d0a322b4a129">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <nova:port uuid="ac2ac5dc-08f6-4faa-9427-e87be9c9d933">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.1.1.88" ipVersion="4"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <nova:port uuid="0170ec24-0bde-4eb7-b349-a8f304853e0d">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.1.1.46" ipVersion="4"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <nova:port uuid="c5debb4f-7f40-48f2-afb5-efa11af6cc4c">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.1.1.226" ipVersion="4"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <nova:port uuid="fd3698a6-a68d-42ed-b217-f5bdc4163195">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.1.1.172" ipVersion="4"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <nova:port uuid="f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.2.2.100" ipVersion="4"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <nova:port uuid="f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.2.2.200" ipVersion="4"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <entry name="serial">969ba235-be4a-44e1-a6f2-7c5922b9661e</entry>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <entry name="uuid">969ba235-be4a-44e1-a6f2-7c5922b9661e</entry>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/969ba235-be4a-44e1-a6f2-7c5922b9661e_disk.config">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-376f4726-152b-47c1-8d56-d366a7938d77">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <serial>376f4726-152b-47c1-8d56-d366a7938d77</serial>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-df389b61-10b1-4346-9fa3-6333d5b9d5cf">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <target dev="vdb" bus="virtio"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <serial>df389b61-10b1-4346-9fa3-6333d5b9d5cf</serial>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-7a48c418-78a2-476b-af7b-983febde623b">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <target dev="vdc" bus="virtio"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <serial>7a48c418-78a2-476b-af7b-983febde623b</serial>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:95:6f:4c"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <target dev="tapc51fc487-ee"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:0a:bc:e6"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <target dev="tapac2ac5dc-08"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:36:5f:d8"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <target dev="tap0170ec24-0b"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:75:0f:50"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <target dev="tapc5debb4f-7f"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:f7:ab:4e"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <target dev="tapfd3698a6-a6"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:49:9b:57"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <target dev="tapf89c24d0-f3"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:20:95:32"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <target dev="tapf0c51ef0-db"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/969ba235-be4a-44e1-a6f2-7c5922b9661e/console.log" append="off"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:20:34 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:20:34 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:20:34 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:20:34 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.363 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Preparing to wait for external event network-vif-plugged-c51fc487-eedd-421d-b8cc-d0a322b4a129 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.363 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.364 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.364 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.364 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Preparing to wait for external event network-vif-plugged-ac2ac5dc-08f6-4faa-9427-e87be9c9d933 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.365 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.365 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.365 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.365 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Preparing to wait for external event network-vif-plugged-0170ec24-0bde-4eb7-b349-a8f304853e0d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.365 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.366 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.366 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.366 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Preparing to wait for external event network-vif-plugged-c5debb4f-7f40-48f2-afb5-efa11af6cc4c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.366 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.366 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.367 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.367 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Preparing to wait for external event network-vif-plugged-fd3698a6-a68d-42ed-b217-f5bdc4163195 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.367 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.367 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.367 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.368 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Preparing to wait for external event network-vif-plugged-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.368 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.368 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.368 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.368 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Preparing to wait for external event network-vif-plugged-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.369 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.369 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.369 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.370 2 DEBUG nova.virt.libvirt.vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "address": "fa:16:3e:95:6f:4c", "network": {"id": "082f75aa-3cb3-4aac-903c-8187fdb62a93", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-550185625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51fc487-ee", "ovs_interfaceid": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.370 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "address": "fa:16:3e:95:6f:4c", "network": {"id": "082f75aa-3cb3-4aac-903c-8187fdb62a93", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-550185625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51fc487-ee", "ovs_interfaceid": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.371 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:6f:4c,bridge_name='br-int',has_traffic_filtering=True,id=c51fc487-eedd-421d-b8cc-d0a322b4a129,network=Network(082f75aa-3cb3-4aac-903c-8187fdb62a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc51fc487-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.372 2 DEBUG os_vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:6f:4c,bridge_name='br-int',has_traffic_filtering=True,id=c51fc487-eedd-421d-b8cc-d0a322b4a129,network=Network(082f75aa-3cb3-4aac-903c-8187fdb62a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc51fc487-ee') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.373 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.374 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.377 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc51fc487-ee, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.378 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc51fc487-ee, col_values=(('external_ids', {'iface-id': 'c51fc487-eedd-421d-b8cc-d0a322b4a129', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:6f:4c', 'vm-uuid': '969ba235-be4a-44e1-a6f2-7c5922b9661e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 NetworkManager[45041]: <info>  [1759407634.3809] manager: (tapc51fc487-ee): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.388 2 INFO os_vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:6f:4c,bridge_name='br-int',has_traffic_filtering=True,id=c51fc487-eedd-421d-b8cc-d0a322b4a129,network=Network(082f75aa-3cb3-4aac-903c-8187fdb62a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc51fc487-ee')#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.390 2 DEBUG nova.virt.libvirt.vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.390 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.391 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:bc:e6,bridge_name='br-int',has_traffic_filtering=True,id=ac2ac5dc-08f6-4faa-9427-e87be9c9d933,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapac2ac5dc-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.392 2 DEBUG os_vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:bc:e6,bridge_name='br-int',has_traffic_filtering=True,id=ac2ac5dc-08f6-4faa-9427-e87be9c9d933,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapac2ac5dc-08') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.393 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.393 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.397 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac2ac5dc-08, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.397 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapac2ac5dc-08, col_values=(('external_ids', {'iface-id': 'ac2ac5dc-08f6-4faa-9427-e87be9c9d933', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:bc:e6', 'vm-uuid': '969ba235-be4a-44e1-a6f2-7c5922b9661e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 NetworkManager[45041]: <info>  [1759407634.4002] manager: (tapac2ac5dc-08): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.409 2 INFO os_vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:bc:e6,bridge_name='br-int',has_traffic_filtering=True,id=ac2ac5dc-08f6-4faa-9427-e87be9c9d933,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapac2ac5dc-08')#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.410 2 DEBUG nova.virt.libvirt.vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vi
rtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.411 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.411 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:5f:d8,bridge_name='br-int',has_traffic_filtering=True,id=0170ec24-0bde-4eb7-b349-a8f304853e0d,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0170ec24-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.412 2 DEBUG os_vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:5f:d8,bridge_name='br-int',has_traffic_filtering=True,id=0170ec24-0bde-4eb7-b349-a8f304853e0d,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0170ec24-0b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.413 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.414 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.417 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0170ec24-0b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.417 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0170ec24-0b, col_values=(('external_ids', {'iface-id': '0170ec24-0bde-4eb7-b349-a8f304853e0d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:5f:d8', 'vm-uuid': '969ba235-be4a-44e1-a6f2-7c5922b9661e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 NetworkManager[45041]: <info>  [1759407634.4203] manager: (tap0170ec24-0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.436 2 INFO os_vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:5f:d8,bridge_name='br-int',has_traffic_filtering=True,id=0170ec24-0bde-4eb7-b349-a8f304853e0d,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0170ec24-0b')#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.437 2 DEBUG nova.virt.libvirt.vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vi
rtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "address": "fa:16:3e:75:0f:50", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5debb4f-7f", "ovs_interfaceid": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.437 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "address": "fa:16:3e:75:0f:50", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5debb4f-7f", "ovs_interfaceid": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.438 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:75:0f:50,bridge_name='br-int',has_traffic_filtering=True,id=c5debb4f-7f40-48f2-afb5-efa11af6cc4c,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5debb4f-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.438 2 DEBUG os_vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:75:0f:50,bridge_name='br-int',has_traffic_filtering=True,id=c5debb4f-7f40-48f2-afb5-efa11af6cc4c,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5debb4f-7f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.438 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.439 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.441 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc5debb4f-7f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.441 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc5debb4f-7f, col_values=(('external_ids', {'iface-id': 'c5debb4f-7f40-48f2-afb5-efa11af6cc4c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:75:0f:50', 'vm-uuid': '969ba235-be4a-44e1-a6f2-7c5922b9661e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 NetworkManager[45041]: <info>  [1759407634.4439] manager: (tapc5debb4f-7f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.459 2 INFO os_vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:75:0f:50,bridge_name='br-int',has_traffic_filtering=True,id=c5debb4f-7f40-48f2-afb5-efa11af6cc4c,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5debb4f-7f')#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.460 2 DEBUG nova.virt.libvirt.vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vi
rtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.460 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.461 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:ab:4e,bridge_name='br-int',has_traffic_filtering=True,id=fd3698a6-a68d-42ed-b217-f5bdc4163195,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd3698a6-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.461 2 DEBUG os_vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:ab:4e,bridge_name='br-int',has_traffic_filtering=True,id=fd3698a6-a68d-42ed-b217-f5bdc4163195,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd3698a6-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.462 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.462 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.464 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd3698a6-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.465 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfd3698a6-a6, col_values=(('external_ids', {'iface-id': 'fd3698a6-a68d-42ed-b217-f5bdc4163195', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:ab:4e', 'vm-uuid': '969ba235-be4a-44e1-a6f2-7c5922b9661e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:34 np0005465988 NetworkManager[45041]: <info>  [1759407634.4699] manager: (tapfd3698a6-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.487 2 INFO os_vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:ab:4e,bridge_name='br-int',has_traffic_filtering=True,id=fd3698a6-a68d-42ed-b217-f5bdc4163195,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd3698a6-a6')#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.487 2 DEBUG nova.virt.libvirt.vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vi
rtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.488 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.488 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:9b:57,bridge_name='br-int',has_traffic_filtering=True,id=f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b,network=Network(3f8de4fc-b69f-4cee-bee5-9f9b5275933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf89c24d0-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.489 2 DEBUG os_vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:9b:57,bridge_name='br-int',has_traffic_filtering=True,id=f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b,network=Network(3f8de4fc-b69f-4cee-bee5-9f9b5275933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf89c24d0-f3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.489 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.490 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.492 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf89c24d0-f3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.492 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf89c24d0-f3, col_values=(('external_ids', {'iface-id': 'f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:49:9b:57', 'vm-uuid': '969ba235-be4a-44e1-a6f2-7c5922b9661e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:34 np0005465988 NetworkManager[45041]: <info>  [1759407634.4971] manager: (tapf89c24d0-f3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.520 2 INFO os_vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:9b:57,bridge_name='br-int',has_traffic_filtering=True,id=f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b,network=Network(3f8de4fc-b69f-4cee-bee5-9f9b5275933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf89c24d0-f3')#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.521 2 DEBUG nova.virt.libvirt.vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vi
rtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "address": "fa:16:3e:20:95:32", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0c51ef0-db", "ovs_interfaceid": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.522 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "address": "fa:16:3e:20:95:32", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0c51ef0-db", "ovs_interfaceid": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.522 2 DEBUG nova.network.os_vif_util [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:95:32,bridge_name='br-int',has_traffic_filtering=True,id=f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6,network=Network(3f8de4fc-b69f-4cee-bee5-9f9b5275933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0c51ef0-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.523 2 DEBUG os_vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:95:32,bridge_name='br-int',has_traffic_filtering=True,id=f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6,network=Network(3f8de4fc-b69f-4cee-bee5-9f9b5275933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0c51ef0-db') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.523 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.523 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.525 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0c51ef0-db, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.525 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf0c51ef0-db, col_values=(('external_ids', {'iface-id': 'f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:95:32', 'vm-uuid': '969ba235-be4a-44e1-a6f2-7c5922b9661e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:34 np0005465988 NetworkManager[45041]: <info>  [1759407634.5286] manager: (tapf0c51ef0-db): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:34 np0005465988 podman[270798]: 2025-10-02 12:20:34.53758455 +0000 UTC m=+0.070570855 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:20:34 np0005465988 podman[270795]: 2025-10-02 12:20:34.543181863 +0000 UTC m=+0.077454925 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.555 2 INFO os_vif [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:95:32,bridge_name='br-int',has_traffic_filtering=True,id=f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6,network=Network(3f8de4fc-b69f-4cee-bee5-9f9b5275933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0c51ef0-db')#033[00m
Oct  2 08:20:34 np0005465988 podman[270792]: 2025-10-02 12:20:34.568460559 +0000 UTC m=+0.099315072 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.637 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.638 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.639 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] No VIF found with MAC fa:16:3e:95:6f:4c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.639 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] No VIF found with MAC fa:16:3e:f7:ab:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.640 2 INFO nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Using config drive#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.668 2 DEBUG nova.storage.rbd_utils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] rbd image 969ba235-be4a-44e1-a6f2-7c5922b9661e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.675 2 DEBUG oslo_concurrency.lockutils [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.675 2 DEBUG oslo_concurrency.lockutils [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.676 2 DEBUG oslo_concurrency.lockutils [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.676 2 DEBUG oslo_concurrency.lockutils [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.676 2 DEBUG oslo_concurrency.lockutils [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.677 2 INFO nova.compute.manager [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Terminating instance#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.678 2 DEBUG nova.compute.manager [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:20:34 np0005465988 kernel: tap576bdab0-26 (unregistering): left promiscuous mode
Oct  2 08:20:34 np0005465988 NetworkManager[45041]: <info>  [1759407634.8116] device (tap576bdab0-26): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:34Z|00277|binding|INFO|Releasing lport 576bdab0-26cd-4663-8dd5-149075e0d45d from this chassis (sb_readonly=0)
Oct  2 08:20:34 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:34Z|00278|binding|INFO|Setting lport 576bdab0-26cd-4663-8dd5-149075e0d45d down in Southbound
Oct  2 08:20:34 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:34Z|00279|binding|INFO|Removing iface tap576bdab0-26 ovn-installed in OVS
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:34.861 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:e9:df 10.100.0.7'], port_security=['fa:16:3e:5d:e9:df 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'bc4239f5-3cf2-4325-803c-73121f7e0ee0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8efba404696b40fbbaa6431b934b87f1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3d4d8d91-6fd2-4ab6-a30c-6640fa44e7f5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a5d3722e-d182-43fd-9a86-fa7ed68becec, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=576bdab0-26cd-4663-8dd5-149075e0d45d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:34.862 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 576bdab0-26cd-4663-8dd5-149075e0d45d in datapath f1725bd8-7d9d-45cc-b992-0cd3db0e30f0 unbound from our chassis#033[00m
Oct  2 08:20:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:34.863 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f1725bd8-7d9d-45cc-b992-0cd3db0e30f0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:20:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:34.864 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4b1232b9-a4ac-4495-afa0-295a76ac0bd4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:34.865 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0 namespace which is not needed anymore#033[00m
Oct  2 08:20:34 np0005465988 nova_compute[236126]: 2025-10-02 12:20:34.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005465988 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Oct  2 08:20:34 np0005465988 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000004b.scope: Consumed 23.175s CPU time.
Oct  2 08:20:34 np0005465988 systemd-machined[192594]: Machine qemu-28-instance-0000004b terminated.
Oct  2 08:20:35 np0005465988 neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0[266733]: [NOTICE]   (266737) : haproxy version is 2.8.14-c23fe91
Oct  2 08:20:35 np0005465988 neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0[266733]: [NOTICE]   (266737) : path to executable is /usr/sbin/haproxy
Oct  2 08:20:35 np0005465988 neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0[266733]: [WARNING]  (266737) : Exiting Master process...
Oct  2 08:20:35 np0005465988 neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0[266733]: [ALERT]    (266737) : Current worker (266739) exited with code 143 (Terminated)
Oct  2 08:20:35 np0005465988 neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0[266733]: [WARNING]  (266737) : All workers exited. Exiting... (0)
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.055 2 INFO nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Creating config drive at /var/lib/nova/instances/969ba235-be4a-44e1-a6f2-7c5922b9661e/disk.config#033[00m
Oct  2 08:20:35 np0005465988 systemd[1]: libpod-4c181ae41368c628e7af680436025eb67e10558e2c153a44e71c3442fbf5d071.scope: Deactivated successfully.
Oct  2 08:20:35 np0005465988 podman[270935]: 2025-10-02 12:20:35.063106016 +0000 UTC m=+0.061980235 container died 4c181ae41368c628e7af680436025eb67e10558e2c153a44e71c3442fbf5d071 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.070 2 DEBUG oslo_concurrency.processutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/969ba235-be4a-44e1-a6f2-7c5922b9661e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmporggoaxx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:35 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4c181ae41368c628e7af680436025eb67e10558e2c153a44e71c3442fbf5d071-userdata-shm.mount: Deactivated successfully.
Oct  2 08:20:35 np0005465988 systemd[1]: var-lib-containers-storage-overlay-5ca592ba44181e5982bad4e05ac1164bed6c5106ff51f20567b3d1bb1c87d6cc-merged.mount: Deactivated successfully.
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.126 2 INFO nova.virt.libvirt.driver [-] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Instance destroyed successfully.#033[00m
Oct  2 08:20:35 np0005465988 podman[270935]: 2025-10-02 12:20:35.129695014 +0000 UTC m=+0.128569243 container cleanup 4c181ae41368c628e7af680436025eb67e10558e2c153a44e71c3442fbf5d071 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.129 2 DEBUG nova.objects.instance [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lazy-loading 'resources' on Instance uuid bc4239f5-3cf2-4325-803c-73121f7e0ee0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.146 2 DEBUG nova.virt.libvirt.vif [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-2050580051',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-2050580051',id=75,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8efba404696b40fbbaa6431b934b87f1',ramdisk_id='',reservation_id='r-zp3eerp0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-153154373',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-153154373-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:17:08Z,user_data=None,user_id='69d8e29c6d3747e98a5985a584f4c814',uuid=bc4239f5-3cf2-4325-803c-73121f7e0ee0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "576bdab0-26cd-4663-8dd5-149075e0d45d", "address": "fa:16:3e:5d:e9:df", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap576bdab0-26", "ovs_interfaceid": "576bdab0-26cd-4663-8dd5-149075e0d45d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:20:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.146 2 DEBUG nova.network.os_vif_util [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Converting VIF {"id": "576bdab0-26cd-4663-8dd5-149075e0d45d", "address": "fa:16:3e:5d:e9:df", "network": {"id": "f1725bd8-7d9d-45cc-b992-0cd3db0e30f0", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-64366215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8efba404696b40fbbaa6431b934b87f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap576bdab0-26", "ovs_interfaceid": "576bdab0-26cd-4663-8dd5-149075e0d45d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:35.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.147 2 DEBUG nova.network.os_vif_util [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5d:e9:df,bridge_name='br-int',has_traffic_filtering=True,id=576bdab0-26cd-4663-8dd5-149075e0d45d,network=Network(f1725bd8-7d9d-45cc-b992-0cd3db0e30f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap576bdab0-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.148 2 DEBUG os_vif [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:e9:df,bridge_name='br-int',has_traffic_filtering=True,id=576bdab0-26cd-4663-8dd5-149075e0d45d,network=Network(f1725bd8-7d9d-45cc-b992-0cd3db0e30f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap576bdab0-26') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.151 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap576bdab0-26, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:35 np0005465988 systemd[1]: libpod-conmon-4c181ae41368c628e7af680436025eb67e10558e2c153a44e71c3442fbf5d071.scope: Deactivated successfully.
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.157 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407620.1565151, 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.158 2 INFO nova.compute.manager [-] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:20:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:35.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.182 2 DEBUG nova.compute.manager [None req-0ac0fe18-7642-4e15-a1b1-dc78ea962687 - - - - - -] [instance: 6a800c5e-ac3d-4b10-aa80-ce9ebcff614c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.194 2 INFO os_vif [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:e9:df,bridge_name='br-int',has_traffic_filtering=True,id=576bdab0-26cd-4663-8dd5-149075e0d45d,network=Network(f1725bd8-7d9d-45cc-b992-0cd3db0e30f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap576bdab0-26')#033[00m
Oct  2 08:20:35 np0005465988 podman[270983]: 2025-10-02 12:20:35.201895756 +0000 UTC m=+0.044959220 container remove 4c181ae41368c628e7af680436025eb67e10558e2c153a44e71c3442fbf5d071 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.207 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cdca88d4-ee62-4a71-9bc1-08c963a272d2]: (4, ('Thu Oct  2 12:20:34 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0 (4c181ae41368c628e7af680436025eb67e10558e2c153a44e71c3442fbf5d071)\n4c181ae41368c628e7af680436025eb67e10558e2c153a44e71c3442fbf5d071\nThu Oct  2 12:20:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0 (4c181ae41368c628e7af680436025eb67e10558e2c153a44e71c3442fbf5d071)\n4c181ae41368c628e7af680436025eb67e10558e2c153a44e71c3442fbf5d071\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.209 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f8693901-c041-42f3-abb3-3db01378cde0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.210 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1725bd8-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:35 np0005465988 kernel: tapf1725bd8-70: left promiscuous mode
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.225 2 DEBUG oslo_concurrency.processutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/969ba235-be4a-44e1-a6f2-7c5922b9661e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmporggoaxx" returned: 0 in 0.155s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.257 2 DEBUG nova.storage.rbd_utils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] rbd image 969ba235-be4a-44e1-a6f2-7c5922b9661e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.259 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2e53c9ea-b74c-484f-a18d-f786cc9cadef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.267 2 DEBUG oslo_concurrency.processutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/969ba235-be4a-44e1-a6f2-7c5922b9661e/disk.config 969ba235-be4a-44e1-a6f2-7c5922b9661e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.291 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[47888470-4bea-4914-9c96-e9c8e24715e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.293 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c20c3636-5da2-43dc-a6ee-ca52afc3ba7f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.309 2 DEBUG nova.network.neutron [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updated VIF entry in instance network info cache for port f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.310 2 DEBUG nova.network.neutron [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating instance_info_cache with network_info: [{"id": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "address": "fa:16:3e:95:6f:4c", "network": {"id": "082f75aa-3cb3-4aac-903c-8187fdb62a93", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-550185625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51fc487-ee", "ovs_interfaceid": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "address": "fa:16:3e:75:0f:50", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5debb4f-7f", "ovs_interfaceid": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "address": "fa:16:3e:20:95:32", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0c51ef0-db", "ovs_interfaceid": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.317 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[91982c6d-9a9c-4062-8147-d938ce633a80]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547573, 'reachable_time': 36013, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271065, 'error': None, 'target': 'ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 systemd[1]: run-netns-ovnmeta\x2df1725bd8\x2d7d9d\x2d45cc\x2db992\x2d0cd3db0e30f0.mount: Deactivated successfully.
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.320 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f1725bd8-7d9d-45cc-b992-0cd3db0e30f0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.321 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[9e211bbe-7890-47a4-bf46-0ff1074379e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.337 2 DEBUG oslo_concurrency.lockutils [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.338 2 DEBUG nova.compute.manager [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-changed-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.338 2 DEBUG nova.compute.manager [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing instance network info cache due to event network-changed-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.338 2 DEBUG oslo_concurrency.lockutils [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.338 2 DEBUG oslo_concurrency.lockutils [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.339 2 DEBUG nova.network.neutron [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing network info cache for port f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.513 2 DEBUG oslo_concurrency.processutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/969ba235-be4a-44e1-a6f2-7c5922b9661e/disk.config 969ba235-be4a-44e1-a6f2-7c5922b9661e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.246s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.514 2 INFO nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Deleting local config drive /var/lib/nova/instances/969ba235-be4a-44e1-a6f2-7c5922b9661e/disk.config because it was imported into RBD.#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.588 2 DEBUG nova.compute.manager [req-d1619aa5-e29a-44da-b3f1-2995923c5638 req-b27b29d3-005a-40ab-943d-da79cb035440 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Received event network-vif-unplugged-576bdab0-26cd-4663-8dd5-149075e0d45d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.590 2 DEBUG oslo_concurrency.lockutils [req-d1619aa5-e29a-44da-b3f1-2995923c5638 req-b27b29d3-005a-40ab-943d-da79cb035440 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.590 2 DEBUG oslo_concurrency.lockutils [req-d1619aa5-e29a-44da-b3f1-2995923c5638 req-b27b29d3-005a-40ab-943d-da79cb035440 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.590 2 DEBUG oslo_concurrency.lockutils [req-d1619aa5-e29a-44da-b3f1-2995923c5638 req-b27b29d3-005a-40ab-943d-da79cb035440 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.591 2 DEBUG nova.compute.manager [req-d1619aa5-e29a-44da-b3f1-2995923c5638 req-b27b29d3-005a-40ab-943d-da79cb035440 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] No waiting events found dispatching network-vif-unplugged-576bdab0-26cd-4663-8dd5-149075e0d45d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.591 2 DEBUG nova.compute.manager [req-d1619aa5-e29a-44da-b3f1-2995923c5638 req-b27b29d3-005a-40ab-943d-da79cb035440 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Received event network-vif-unplugged-576bdab0-26cd-4663-8dd5-149075e0d45d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.592 2 DEBUG nova.compute.manager [req-d1619aa5-e29a-44da-b3f1-2995923c5638 req-b27b29d3-005a-40ab-943d-da79cb035440 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Received event network-vif-plugged-576bdab0-26cd-4663-8dd5-149075e0d45d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.592 2 DEBUG oslo_concurrency.lockutils [req-d1619aa5-e29a-44da-b3f1-2995923c5638 req-b27b29d3-005a-40ab-943d-da79cb035440 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.593 2 DEBUG oslo_concurrency.lockutils [req-d1619aa5-e29a-44da-b3f1-2995923c5638 req-b27b29d3-005a-40ab-943d-da79cb035440 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.593 2 DEBUG oslo_concurrency.lockutils [req-d1619aa5-e29a-44da-b3f1-2995923c5638 req-b27b29d3-005a-40ab-943d-da79cb035440 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.593 2 DEBUG nova.compute.manager [req-d1619aa5-e29a-44da-b3f1-2995923c5638 req-b27b29d3-005a-40ab-943d-da79cb035440 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] No waiting events found dispatching network-vif-plugged-576bdab0-26cd-4663-8dd5-149075e0d45d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.594 2 WARNING nova.compute.manager [req-d1619aa5-e29a-44da-b3f1-2995923c5638 req-b27b29d3-005a-40ab-943d-da79cb035440 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Received unexpected event network-vif-plugged-576bdab0-26cd-4663-8dd5-149075e0d45d for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:20:35 np0005465988 systemd-udevd[270909]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.6145] manager: (tapc51fc487-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/148)
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.6372] manager: (tapac2ac5dc-08): new Tun device (/org/freedesktop/NetworkManager/Devices/149)
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.6414] device (tapc51fc487-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.6437] device (tapc51fc487-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:20:35 np0005465988 kernel: tapc51fc487-ee: entered promiscuous mode
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00280|binding|INFO|Claiming lport c51fc487-eedd-421d-b8cc-d0a322b4a129 for this chassis.
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00281|binding|INFO|c51fc487-eedd-421d-b8cc-d0a322b4a129: Claiming fa:16:3e:95:6f:4c 10.100.0.6
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.6529] manager: (tap0170ec24-0b): new Tun device (/org/freedesktop/NetworkManager/Devices/150)
Oct  2 08:20:35 np0005465988 systemd-udevd[270907]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.663 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:6f:4c 10.100.0.6'], port_security=['fa:16:3e:95:6f:4c 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-082f75aa-3cb3-4aac-903c-8187fdb62a93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84f71f6076f7425db7653ac203257df0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a5294f6e-41c7-4853-9f7a-aaececdf82e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31127d17-755e-4278-8ea5-b82c1b47b10b, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=c51fc487-eedd-421d-b8cc-d0a322b4a129) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.665 142124 INFO neutron.agent.ovn.metadata.agent [-] Port c51fc487-eedd-421d-b8cc-d0a322b4a129 in datapath 082f75aa-3cb3-4aac-903c-8187fdb62a93 bound to our chassis#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.669 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 082f75aa-3cb3-4aac-903c-8187fdb62a93#033[00m
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.6712] manager: (tapc5debb4f-7f): new Tun device (/org/freedesktop/NetworkManager/Devices/151)
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.683 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[855bfb76-d70a-42ea-8f3b-e1718341e4a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.684 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap082f75aa-31 in ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.687 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap082f75aa-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.687 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[275216dd-c827-427d-9e73-1671e60074c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.688 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[976f8e53-c2be-415c-912c-94870509769d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.6932] manager: (tapfd3698a6-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/152)
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.702 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[aab8beeb-74ec-4961-953a-dc936a4e4afb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7090] manager: (tapf89c24d0-f3): new Tun device (/org/freedesktop/NetworkManager/Devices/153)
Oct  2 08:20:35 np0005465988 systemd-udevd[271115]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7263] manager: (tapf0c51ef0-db): new Tun device (/org/freedesktop/NetworkManager/Devices/154)
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.729 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[528cf8f4-bb80-422c-b930-94377467c034]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 kernel: tap0170ec24-0b: entered promiscuous mode
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7493] device (tap0170ec24-0b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:20:35 np0005465988 kernel: tapc5debb4f-7f: entered promiscuous mode
Oct  2 08:20:35 np0005465988 kernel: tapfd3698a6-a6: entered promiscuous mode
Oct  2 08:20:35 np0005465988 kernel: tapf0c51ef0-db: entered promiscuous mode
Oct  2 08:20:35 np0005465988 kernel: tapac2ac5dc-08: entered promiscuous mode
Oct  2 08:20:35 np0005465988 kernel: tapf89c24d0-f3: entered promiscuous mode
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7514] device (tapc5debb4f-7f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7523] device (tapfd3698a6-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7531] device (tapf0c51ef0-db): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7543] device (tap0170ec24-0b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7551] device (tapac2ac5dc-08): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00282|binding|INFO|Claiming lport fd3698a6-a68d-42ed-b217-f5bdc4163195 for this chassis.
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00283|binding|INFO|fd3698a6-a68d-42ed-b217-f5bdc4163195: Claiming fa:16:3e:f7:ab:4e 10.1.1.172
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00284|binding|INFO|Claiming lport ac2ac5dc-08f6-4faa-9427-e87be9c9d933 for this chassis.
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00285|binding|INFO|ac2ac5dc-08f6-4faa-9427-e87be9c9d933: Claiming fa:16:3e:0a:bc:e6 10.1.1.88
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00286|binding|INFO|Claiming lport f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b for this chassis.
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00287|binding|INFO|f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b: Claiming fa:16:3e:49:9b:57 10.2.2.100
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00288|binding|INFO|Claiming lport f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 for this chassis.
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00289|binding|INFO|f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6: Claiming fa:16:3e:20:95:32 10.2.2.200
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7564] device (tapf89c24d0-f3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7577] device (tapc5debb4f-7f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7582] device (tapfd3698a6-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7586] device (tapf0c51ef0-db): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7593] device (tapac2ac5dc-08): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7602] device (tapf89c24d0-f3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.759 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[b067bd92-93ce-49bd-953f-b42227e2c487]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00290|binding|INFO|Claiming lport 0170ec24-0bde-4eb7-b349-a8f304853e0d for this chassis.
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00291|binding|INFO|0170ec24-0bde-4eb7-b349-a8f304853e0d: Claiming fa:16:3e:36:5f:d8 10.1.1.46
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00292|binding|INFO|Claiming lport c5debb4f-7f40-48f2-afb5-efa11af6cc4c for this chassis.
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00293|binding|INFO|c5debb4f-7f40-48f2-afb5-efa11af6cc4c: Claiming fa:16:3e:75:0f:50 10.1.1.226
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.765 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[67568ac2-e2fd-4b7b-9f6d-f022d688915d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.7658] manager: (tap082f75aa-30): new Veth device (/org/freedesktop/NetworkManager/Devices/155)
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.768 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:5f:d8 10.1.1.46'], port_security=['fa:16:3e:36:5f:d8 10.1.1.46'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest_v242-206664704', 'neutron:cidrs': '10.1.1.46/24', 'neutron:device_id': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b73044fd-31f0-4d49-88bb-64109a59249a', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest_v242-206664704', 'neutron:project_id': '84f71f6076f7425db7653ac203257df0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b3030356-f7fc-436b-a72d-f5bb90702dae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3f6bb4ad-4d85-4fbe-8f9a-0eea7c1f3e56, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=0170ec24-0bde-4eb7-b349-a8f304853e0d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.769 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:bc:e6 10.1.1.88'], port_security=['fa:16:3e:0a:bc:e6 10.1.1.88'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest_v242-1092778721', 'neutron:cidrs': '10.1.1.88/24', 'neutron:device_id': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b73044fd-31f0-4d49-88bb-64109a59249a', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest_v242-1092778721', 'neutron:project_id': '84f71f6076f7425db7653ac203257df0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b3030356-f7fc-436b-a72d-f5bb90702dae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3f6bb4ad-4d85-4fbe-8f9a-0eea7c1f3e56, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=ac2ac5dc-08f6-4faa-9427-e87be9c9d933) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.770 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:95:32 10.2.2.200'], port_security=['fa:16:3e:20:95:32 10.2.2.200'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.200/24', 'neutron:device_id': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f8de4fc-b69f-4cee-bee5-9f9b5275933e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84f71f6076f7425db7653ac203257df0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a5294f6e-41c7-4853-9f7a-aaececdf82e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d716296-3fee-49e1-85e2-41d6ec935efa, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.771 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:0f:50 10.1.1.226'], port_security=['fa:16:3e:75:0f:50 10.1.1.226'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.226/24', 'neutron:device_id': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b73044fd-31f0-4d49-88bb-64109a59249a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84f71f6076f7425db7653ac203257df0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a5294f6e-41c7-4853-9f7a-aaececdf82e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3f6bb4ad-4d85-4fbe-8f9a-0eea7c1f3e56, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=c5debb4f-7f40-48f2-afb5-efa11af6cc4c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.773 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:9b:57 10.2.2.100'], port_security=['fa:16:3e:49:9b:57 10.2.2.100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.100/24', 'neutron:device_id': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f8de4fc-b69f-4cee-bee5-9f9b5275933e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84f71f6076f7425db7653ac203257df0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a5294f6e-41c7-4853-9f7a-aaececdf82e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d716296-3fee-49e1-85e2-41d6ec935efa, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.774 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:ab:4e 10.1.1.172'], port_security=['fa:16:3e:f7:ab:4e 10.1.1.172'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.172/24', 'neutron:device_id': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b73044fd-31f0-4d49-88bb-64109a59249a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84f71f6076f7425db7653ac203257df0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a5294f6e-41c7-4853-9f7a-aaececdf82e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3f6bb4ad-4d85-4fbe-8f9a-0eea7c1f3e56, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=fd3698a6-a68d-42ed-b217-f5bdc4163195) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00294|binding|INFO|Setting lport c51fc487-eedd-421d-b8cc-d0a322b4a129 ovn-installed in OVS
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00295|binding|INFO|Setting lport c51fc487-eedd-421d-b8cc-d0a322b4a129 up in Southbound
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:35 np0005465988 systemd-machined[192594]: New machine qemu-32-instance-00000053.
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.806 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[49af177e-20e9-4b0e-98a6-d260ceba20b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.809 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[664aef19-2e79-4416-8080-712be0df5238]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.8288] device (tap082f75aa-30): carrier: link connected
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.832 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[31071213-5ddc-4e72-90f2-0f7756275643]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.848 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b910014e-fdda-41ae-975f-2ffe065fe9aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap082f75aa-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:39:e1:5c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569616, 'reachable_time': 37422, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271158, 'error': None, 'target': 'ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.860 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2620f9ac-4df5-4ef9-9f73-2526ab5975f8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe39:e15c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569616, 'tstamp': 569616}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271159, 'error': None, 'target': 'ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 systemd[1]: Started Virtual Machine qemu-32-instance-00000053.
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.881 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[424b4ebc-ea05-4b1b-961e-b1969ea8d814]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap082f75aa-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:39:e1:5c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569616, 'reachable_time': 37422, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 271160, 'error': None, 'target': 'ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.910 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc9054f-83c1-4a9f-acc6-fcd3caa4ca7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.988 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7de196a2-54b1-4f74-9d89-82ae4920926d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.990 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap082f75aa-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.990 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.990 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap082f75aa-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:35 np0005465988 NetworkManager[45041]: <info>  [1759407635.9929] manager: (tap082f75aa-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Oct  2 08:20:35 np0005465988 kernel: tap082f75aa-30: entered promiscuous mode
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:35.996 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap082f75aa-30, col_values=(('external_ids', {'iface-id': '0683cf03-5d0f-4c2e-8b28-7912d215445a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:35 np0005465988 nova_compute[236126]: 2025-10-02 12:20:35.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:35Z|00296|binding|INFO|Releasing lport 0683cf03-5d0f-4c2e-8b28-7912d215445a from this chassis (sb_readonly=0)
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.001 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/082f75aa-3cb3-4aac-903c-8187fdb62a93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/082f75aa-3cb3-4aac-903c-8187fdb62a93.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.001 2 DEBUG nova.compute.manager [req-c437ad4d-8693-4996-95a4-37982c04b7b2 req-ba3e8518-afcc-4a3f-86fc-354cf4baf3a1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-c51fc487-eedd-421d-b8cc-d0a322b4a129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.001 2 DEBUG oslo_concurrency.lockutils [req-c437ad4d-8693-4996-95a4-37982c04b7b2 req-ba3e8518-afcc-4a3f-86fc-354cf4baf3a1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.001 2 DEBUG oslo_concurrency.lockutils [req-c437ad4d-8693-4996-95a4-37982c04b7b2 req-ba3e8518-afcc-4a3f-86fc-354cf4baf3a1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.001 2 DEBUG oslo_concurrency.lockutils [req-c437ad4d-8693-4996-95a4-37982c04b7b2 req-ba3e8518-afcc-4a3f-86fc-354cf4baf3a1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.002 2 DEBUG nova.compute.manager [req-c437ad4d-8693-4996-95a4-37982c04b7b2 req-ba3e8518-afcc-4a3f-86fc-354cf4baf3a1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Processing event network-vif-plugged-c51fc487-eedd-421d-b8cc-d0a322b4a129 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.002 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4f1b80ea-8c05-4bbe-8010-90ade15168cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.003 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-082f75aa-3cb3-4aac-903c-8187fdb62a93
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/082f75aa-3cb3-4aac-903c-8187fdb62a93.pid.haproxy
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 082f75aa-3cb3-4aac-903c-8187fdb62a93
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.004 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93', 'env', 'PROCESS_TAG=haproxy-082f75aa-3cb3-4aac-903c-8187fdb62a93', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/082f75aa-3cb3-4aac-903c-8187fdb62a93.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:36Z|00297|binding|INFO|Setting lport f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 ovn-installed in OVS
Oct  2 08:20:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:36Z|00298|binding|INFO|Setting lport f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 up in Southbound
Oct  2 08:20:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:36Z|00299|binding|INFO|Setting lport 0170ec24-0bde-4eb7-b349-a8f304853e0d ovn-installed in OVS
Oct  2 08:20:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:36Z|00300|binding|INFO|Setting lport 0170ec24-0bde-4eb7-b349-a8f304853e0d up in Southbound
Oct  2 08:20:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:36Z|00301|binding|INFO|Setting lport fd3698a6-a68d-42ed-b217-f5bdc4163195 ovn-installed in OVS
Oct  2 08:20:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:36Z|00302|binding|INFO|Setting lport fd3698a6-a68d-42ed-b217-f5bdc4163195 up in Southbound
Oct  2 08:20:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:36Z|00303|binding|INFO|Setting lport c5debb4f-7f40-48f2-afb5-efa11af6cc4c ovn-installed in OVS
Oct  2 08:20:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:36Z|00304|binding|INFO|Setting lport c5debb4f-7f40-48f2-afb5-efa11af6cc4c up in Southbound
Oct  2 08:20:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:36Z|00305|binding|INFO|Setting lport f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b ovn-installed in OVS
Oct  2 08:20:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:36Z|00306|binding|INFO|Setting lport f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b up in Southbound
Oct  2 08:20:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:36Z|00307|binding|INFO|Setting lport ac2ac5dc-08f6-4faa-9427-e87be9c9d933 ovn-installed in OVS
Oct  2 08:20:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:36Z|00308|binding|INFO|Setting lport ac2ac5dc-08f6-4faa-9427-e87be9c9d933 up in Southbound
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.438 2 INFO nova.virt.libvirt.driver [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Deleting instance files /var/lib/nova/instances/bc4239f5-3cf2-4325-803c-73121f7e0ee0_del#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.439 2 INFO nova.virt.libvirt.driver [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Deletion of /var/lib/nova/instances/bc4239f5-3cf2-4325-803c-73121f7e0ee0_del complete#033[00m
Oct  2 08:20:36 np0005465988 podman[271199]: 2025-10-02 12:20:36.439869768 +0000 UTC m=+0.058067421 container create fe36d62e44c5eaa1aaaaf1fa0a9fdd9dad0e40b46c85f062b68f8ee5770f5bc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:20:36 np0005465988 systemd[1]: Started libpod-conmon-fe36d62e44c5eaa1aaaaf1fa0a9fdd9dad0e40b46c85f062b68f8ee5770f5bc0.scope.
Oct  2 08:20:36 np0005465988 podman[271199]: 2025-10-02 12:20:36.415075446 +0000 UTC m=+0.033273109 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:20:36 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:20:36 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce9784a83c7ef3c6a6711462adcd5b1d54aa5d0257d697bb56164190825633d0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.523 2 INFO nova.compute.manager [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Took 1.84 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.524 2 DEBUG oslo.service.loopingcall [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.525 2 DEBUG nova.compute.manager [-] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.525 2 DEBUG nova.network.neutron [-] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:20:36 np0005465988 podman[271199]: 2025-10-02 12:20:36.540715573 +0000 UTC m=+0.158913236 container init fe36d62e44c5eaa1aaaaf1fa0a9fdd9dad0e40b46c85f062b68f8ee5770f5bc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:20:36 np0005465988 podman[271199]: 2025-10-02 12:20:36.547659075 +0000 UTC m=+0.165856718 container start fe36d62e44c5eaa1aaaaf1fa0a9fdd9dad0e40b46c85f062b68f8ee5770f5bc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:20:36 np0005465988 neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93[271240]: [NOTICE]   (271255) : New worker (271257) forked
Oct  2 08:20:36 np0005465988 neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93[271240]: [NOTICE]   (271255) : Loading success.
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.601 2 DEBUG nova.compute.manager [req-ce48c2bb-9941-4d9c-88cb-2772151427f7 req-d54f839c-f901-48d8-93f3-072282f6c5cc d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-ac2ac5dc-08f6-4faa-9427-e87be9c9d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.601 2 DEBUG oslo_concurrency.lockutils [req-ce48c2bb-9941-4d9c-88cb-2772151427f7 req-d54f839c-f901-48d8-93f3-072282f6c5cc d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.601 2 DEBUG oslo_concurrency.lockutils [req-ce48c2bb-9941-4d9c-88cb-2772151427f7 req-d54f839c-f901-48d8-93f3-072282f6c5cc d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.602 2 DEBUG oslo_concurrency.lockutils [req-ce48c2bb-9941-4d9c-88cb-2772151427f7 req-d54f839c-f901-48d8-93f3-072282f6c5cc d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.602 2 DEBUG nova.compute.manager [req-ce48c2bb-9941-4d9c-88cb-2772151427f7 req-d54f839c-f901-48d8-93f3-072282f6c5cc d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Processing event network-vif-plugged-ac2ac5dc-08f6-4faa-9427-e87be9c9d933 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.636 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 0170ec24-0bde-4eb7-b349-a8f304853e0d in datapath b73044fd-31f0-4d49-88bb-64109a59249a unbound from our chassis#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.639 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b73044fd-31f0-4d49-88bb-64109a59249a#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.652 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[42f53fbc-8134-426d-b3e8-40612c7ca0ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.653 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb73044fd-31 in ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.655 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb73044fd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.655 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b0cf2345-c7f2-4835-99c3-50f4551cd8f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.656 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2025ba75-efcb-4415-be79-b27608ad1a40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.667 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[95b3f167-305a-4f0c-b55b-bdc289e794b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.695 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[db93680a-2458-4ff4-b170-2179e6e8cddb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.738 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f701ffe2-3498-402f-91c0-686416332360]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.746 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[956acf94-7241-4866-aad1-e2e0d6669120]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:36 np0005465988 NetworkManager[45041]: <info>  [1759407636.7485] manager: (tapb73044fd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/157)
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.791 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9a40c63b-f9aa-4e34-a0a3-73e493279419]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.794 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e3888ad6-1db9-40b1-919d-52dc8aeb5f62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:36 np0005465988 NetworkManager[45041]: <info>  [1759407636.8202] device (tapb73044fd-30): carrier: link connected
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.822 2 DEBUG nova.network.neutron [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updated VIF entry in instance network info cache for port f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.823 2 DEBUG nova.network.neutron [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating instance_info_cache with network_info: [{"id": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "address": "fa:16:3e:95:6f:4c", "network": {"id": "082f75aa-3cb3-4aac-903c-8187fdb62a93", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-550185625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51fc487-ee", "ovs_interfaceid": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "address": "fa:16:3e:75:0f:50", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5debb4f-7f", "ovs_interfaceid": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "address": "fa:16:3e:20:95:32", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0c51ef0-db", "ovs_interfaceid": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.831 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[7c39ac7a-0fa3-485a-86f0-998d84ba1ab7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.853 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8d0c2d03-e7dc-4d3b-af82-5845b3ad6f72]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb73044fd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:59:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569715, 'reachable_time': 33333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271323, 'error': None, 'target': 'ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:36 np0005465988 nova_compute[236126]: 2025-10-02 12:20:36.858 2 DEBUG oslo_concurrency.lockutils [req-5ab2f558-c582-4d57-8f9b-417705c95570 req-50510c62-695f-43b0-a4a8-2e328024a13f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.874 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7b0c6b05-415d-48d9-bd6c-b56ff046380d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9b:5952'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569715, 'tstamp': 569715}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271324, 'error': None, 'target': 'ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.905 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ad4cc388-0acb-457f-a91e-e51228d20b6a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb73044fd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:59:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569715, 'reachable_time': 33333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 271326, 'error': None, 'target': 'ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:36.957 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[70cabaef-dfbd-4398-9d00-cb620f8d1c66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.063 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5a7701a8-8710-4050-bc70-3e74634866d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.066 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb73044fd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.066 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.068 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb73044fd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:37 np0005465988 kernel: tapb73044fd-30: entered promiscuous mode
Oct  2 08:20:37 np0005465988 NetworkManager[45041]: <info>  [1759407637.0722] manager: (tapb73044fd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/158)
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.077 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb73044fd-30, col_values=(('external_ids', {'iface-id': 'be4fbd69-8ace-4de3-94ae-0a974cc165ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:37Z|00309|binding|INFO|Releasing lport be4fbd69-8ace-4de3-94ae-0a974cc165ac from this chassis (sb_readonly=0)
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.083 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b73044fd-31f0-4d49-88bb-64109a59249a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b73044fd-31f0-4d49-88bb-64109a59249a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.084 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3355df47-dc07-4060-93c7-8e2bc5df7311]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.085 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-b73044fd-31f0-4d49-88bb-64109a59249a
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/b73044fd-31f0-4d49-88bb-64109a59249a.pid.haproxy
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID b73044fd-31f0-4d49-88bb-64109a59249a
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.087 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a', 'env', 'PROCESS_TAG=haproxy-b73044fd-31f0-4d49-88bb-64109a59249a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b73044fd-31f0-4d49-88bb-64109a59249a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:37.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:37.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.202 2 DEBUG nova.network.neutron [-] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.245 2 INFO nova.compute.manager [-] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Took 0.72 seconds to deallocate network for instance.
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.301 2 DEBUG oslo_concurrency.lockutils [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.301 2 DEBUG oslo_concurrency.lockutils [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.373 2 DEBUG oslo_concurrency.processutils [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.427 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407637.4250584, 969ba235-be4a-44e1-a6f2-7c5922b9661e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.428 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] VM Started (Lifecycle Event)
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.449 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.454 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407637.427171, 969ba235-be4a-44e1-a6f2-7c5922b9661e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.454 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] VM Paused (Lifecycle Event)
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.470 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.476 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.503 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:20:37 np0005465988 podman[271361]: 2025-10-02 12:20:37.504650841 +0000 UTC m=+0.044429754 container create eb6faace1896ad057384e035ba51ed72fbcd72af993abfeb122718bddd75acea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:20:37 np0005465988 systemd[1]: Started libpod-conmon-eb6faace1896ad057384e035ba51ed72fbcd72af993abfeb122718bddd75acea.scope.
Oct  2 08:20:37 np0005465988 podman[271361]: 2025-10-02 12:20:37.483059762 +0000 UTC m=+0.022838705 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:20:37 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:20:37 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1236f4e0fff89e802e4671c99231c6cf678d1bc5bfc895e55631724b0b694efb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:20:37 np0005465988 podman[271361]: 2025-10-02 12:20:37.603164658 +0000 UTC m=+0.142943601 container init eb6faace1896ad057384e035ba51ed72fbcd72af993abfeb122718bddd75acea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:20:37 np0005465988 podman[271361]: 2025-10-02 12:20:37.624131298 +0000 UTC m=+0.163910231 container start eb6faace1896ad057384e035ba51ed72fbcd72af993abfeb122718bddd75acea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:20:37 np0005465988 neutron-haproxy-ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a[271395]: [NOTICE]   (271399) : New worker (271401) forked
Oct  2 08:20:37 np0005465988 neutron-haproxy-ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a[271395]: [NOTICE]   (271399) : Loading success.
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.676 2 DEBUG nova.compute.manager [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.676 2 DEBUG oslo_concurrency.lockutils [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.677 2 DEBUG oslo_concurrency.lockutils [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.677 2 DEBUG oslo_concurrency.lockutils [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.678 2 DEBUG nova.compute.manager [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Processing event network-vif-plugged-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.678 2 DEBUG nova.compute.manager [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.678 2 DEBUG oslo_concurrency.lockutils [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.679 2 DEBUG oslo_concurrency.lockutils [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.679 2 DEBUG oslo_concurrency.lockutils [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.679 2 DEBUG nova.compute.manager [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No event matching network-vif-plugged-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b in dict_keys([('network-vif-plugged', '0170ec24-0bde-4eb7-b349-a8f304853e0d'), ('network-vif-plugged', 'c5debb4f-7f40-48f2-afb5-efa11af6cc4c'), ('network-vif-plugged', 'fd3698a6-a68d-42ed-b217-f5bdc4163195'), ('network-vif-plugged', 'f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.679 2 WARNING nova.compute.manager [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received unexpected event network-vif-plugged-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b for instance with vm_state building and task_state spawning.
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.680 2 DEBUG nova.compute.manager [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-fd3698a6-a68d-42ed-b217-f5bdc4163195 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.680 2 DEBUG oslo_concurrency.lockutils [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.680 2 DEBUG oslo_concurrency.lockutils [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.681 2 DEBUG oslo_concurrency.lockutils [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.681 2 DEBUG nova.compute.manager [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Processing event network-vif-plugged-fd3698a6-a68d-42ed-b217-f5bdc4163195 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.681 2 DEBUG nova.compute.manager [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-fd3698a6-a68d-42ed-b217-f5bdc4163195 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.682 2 DEBUG oslo_concurrency.lockutils [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.682 2 DEBUG oslo_concurrency.lockutils [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.682 2 DEBUG oslo_concurrency.lockutils [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.683 2 DEBUG nova.compute.manager [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No event matching network-vif-plugged-fd3698a6-a68d-42ed-b217-f5bdc4163195 in dict_keys([('network-vif-plugged', '0170ec24-0bde-4eb7-b349-a8f304853e0d'), ('network-vif-plugged', 'c5debb4f-7f40-48f2-afb5-efa11af6cc4c'), ('network-vif-plugged', 'f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.683 2 WARNING nova.compute.manager [req-54c56815-e00d-42cb-ba98-10999a36a015 req-1e1f2d33-3962-437e-9b7b-274b5f634af1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received unexpected event network-vif-plugged-fd3698a6-a68d-42ed-b217-f5bdc4163195 for instance with vm_state building and task_state spawning.
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.700 142124 INFO neutron.agent.ovn.metadata.agent [-] Port ac2ac5dc-08f6-4faa-9427-e87be9c9d933 in datapath b73044fd-31f0-4d49-88bb-64109a59249a unbound from our chassis
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.704 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b73044fd-31f0-4d49-88bb-64109a59249a
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.726 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d487e4eb-e726-4cf4-8ed0-ebc461945967]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.774 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[dc4f56c5-03fc-43d2-b20d-453dc12d7be8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.784 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[6267e491-573f-4892-8e95-c6a2ac46032d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:20:37 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1064520525' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.830 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f4175928-0480-42f0-a71a-290e305caa9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.854 2 DEBUG oslo_concurrency.processutils [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.863 2 DEBUG nova.compute.provider_tree [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.866 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b6600f4c-2a1e-491e-a12b-f816ccf16bcd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb73044fd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:59:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 6, 'rx_bytes': 306, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 6, 'rx_bytes': 306, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569715, 'reachable_time': 33333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 264, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 264, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271417, 'error': None, 'target': 'ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.894 2 DEBUG nova.scheduler.client.report [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.895 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3a6c0255-69b7-4faa-aaaa-ebf7241531e9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb73044fd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569733, 'tstamp': 569733}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271418, 'error': None, 'target': 'ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tapb73044fd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569738, 'tstamp': 569738}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271418, 'error': None, 'target': 'ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.898 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb73044fd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.903 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb73044fd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.904 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.905 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb73044fd-30, col_values=(('external_ids', {'iface-id': 'be4fbd69-8ace-4de3-94ae-0a974cc165ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.905 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.907 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 in datapath 3f8de4fc-b69f-4cee-bee5-9f9b5275933e unbound from our chassis
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.910 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3f8de4fc-b69f-4cee-bee5-9f9b5275933e
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.933 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b850fd90-daff-408f-93b7-8f2da8fa3681]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.934 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3f8de4fc-b1 in ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.937 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3f8de4fc-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.937 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[65262c26-4418-4e99-8324-f97529d39b91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:37 np0005465988 nova_compute[236126]: 2025-10-02 12:20:37.939 2 DEBUG oslo_concurrency.lockutils [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.939 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a3fcef3f-ebb8-4d6e-a01e-4e19bfe0cd19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.957 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[69f93a70-4863-4ac9-abe6-9e35565ca8b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:37.981 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[00db6f90-40c1-4ff2-9dc6-322c8a898792]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e256 e256: 3 total, 3 up, 3 in
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.003 2 INFO nova.scheduler.client.report [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Deleted allocations for instance bc4239f5-3cf2-4325-803c-73121f7e0ee0#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.028 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[344d1f1e-a779-468e-a8b6-8d35ec3ec5a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:38 np0005465988 NetworkManager[45041]: <info>  [1759407638.0444] manager: (tap3f8de4fc-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/159)
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.045 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[72e31a4f-d274-4ab5-82cd-648983aa1f5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.076 2 DEBUG oslo_concurrency.lockutils [None req-06b6cdf3-9e1f-4d44-978b-89704c4fae7a 69d8e29c6d3747e98a5985a584f4c814 8efba404696b40fbbaa6431b934b87f1 - - default default] Lock "bc4239f5-3cf2-4325-803c-73121f7e0ee0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.401s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:38 np0005465988 systemd-udevd[271426]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.088 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[456cb51e-114a-4d0f-8896-a940575abf9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.094 2 DEBUG nova.compute.manager [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-c51fc487-eedd-421d-b8cc-d0a322b4a129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.094 2 DEBUG oslo_concurrency.lockutils [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.094 2 DEBUG oslo_concurrency.lockutils [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.095 2 DEBUG oslo_concurrency.lockutils [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.095 2 DEBUG nova.compute.manager [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No event matching network-vif-plugged-c51fc487-eedd-421d-b8cc-d0a322b4a129 in dict_keys([('network-vif-plugged', '0170ec24-0bde-4eb7-b349-a8f304853e0d'), ('network-vif-plugged', 'c5debb4f-7f40-48f2-afb5-efa11af6cc4c'), ('network-vif-plugged', 'f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.095 2 WARNING nova.compute.manager [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received unexpected event network-vif-plugged-c51fc487-eedd-421d-b8cc-d0a322b4a129 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.095 2 DEBUG nova.compute.manager [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.095 2 DEBUG oslo_concurrency.lockutils [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.095 2 DEBUG oslo_concurrency.lockutils [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.093 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8dde6fb3-942d-4923-8ebe-b564066f237f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.096 2 DEBUG oslo_concurrency.lockutils [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.096 2 DEBUG nova.compute.manager [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Processing event network-vif-plugged-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.096 2 DEBUG nova.compute.manager [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.096 2 DEBUG oslo_concurrency.lockutils [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.096 2 DEBUG oslo_concurrency.lockutils [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.096 2 DEBUG oslo_concurrency.lockutils [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.097 2 DEBUG nova.compute.manager [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No event matching network-vif-plugged-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 in dict_keys([('network-vif-plugged', '0170ec24-0bde-4eb7-b349-a8f304853e0d'), ('network-vif-plugged', 'c5debb4f-7f40-48f2-afb5-efa11af6cc4c')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.097 2 WARNING nova.compute.manager [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received unexpected event network-vif-plugged-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.097 2 DEBUG nova.compute.manager [req-a9ec5b9e-998e-46c7-a418-b8cf9197f0f9 req-22ad35c0-e299-4235-9ab2-de732d117a3a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Received event network-vif-deleted-576bdab0-26cd-4663-8dd5-149075e0d45d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:38 np0005465988 NetworkManager[45041]: <info>  [1759407638.1242] device (tap3f8de4fc-b0): carrier: link connected
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.131 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5e6f965d-2469-4ddd-9eba-d364c784de05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.149 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8907b75e-150d-483d-b81d-f2bcbcfb976e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f8de4fc-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0b:05:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 94], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569845, 'reachable_time': 30912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271445, 'error': None, 'target': 'ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.168 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f68fa5e5-0897-4504-aa00-f40e9b4d3147]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0b:59e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569845, 'tstamp': 569845}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271446, 'error': None, 'target': 'ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.186 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[773c7a74-d410-4774-88a6-3f9a6256a0ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f8de4fc-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0b:05:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 94], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569845, 'reachable_time': 30912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 271447, 'error': None, 'target': 'ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.220 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8880c57a-cf01-448a-a34d-dbee917b84b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.286 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d4f92f65-4b2c-4d4d-9f2f-1804eccfe2f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.288 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f8de4fc-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.289 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.290 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f8de4fc-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:38 np0005465988 kernel: tap3f8de4fc-b0: entered promiscuous mode
Oct  2 08:20:38 np0005465988 NetworkManager[45041]: <info>  [1759407638.2940] manager: (tap3f8de4fc-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/160)
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.299 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3f8de4fc-b0, col_values=(('external_ids', {'iface-id': 'c420bb84-cdaf-431d-8fb8-c332d6fdbb10'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:38Z|00310|binding|INFO|Releasing lport c420bb84-cdaf-431d-8fb8-c332d6fdbb10 from this chassis (sb_readonly=0)
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.330 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3f8de4fc-b69f-4cee-bee5-9f9b5275933e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3f8de4fc-b69f-4cee-bee5-9f9b5275933e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.331 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3feed788-8a5a-4678-8b22-d42d830c50ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.333 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-3f8de4fc-b69f-4cee-bee5-9f9b5275933e
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/3f8de4fc-b69f-4cee-bee5-9f9b5275933e.pid.haproxy
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 3f8de4fc-b69f-4cee-bee5-9f9b5275933e
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.334 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e', 'env', 'PROCESS_TAG=haproxy-3f8de4fc-b69f-4cee-bee5-9f9b5275933e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3f8de4fc-b69f-4cee-bee5-9f9b5275933e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.700 2 DEBUG nova.compute.manager [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-ac2ac5dc-08f6-4faa-9427-e87be9c9d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.701 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.701 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.701 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.701 2 DEBUG nova.compute.manager [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No event matching network-vif-plugged-ac2ac5dc-08f6-4faa-9427-e87be9c9d933 in dict_keys([('network-vif-plugged', '0170ec24-0bde-4eb7-b349-a8f304853e0d'), ('network-vif-plugged', 'c5debb4f-7f40-48f2-afb5-efa11af6cc4c')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.702 2 WARNING nova.compute.manager [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received unexpected event network-vif-plugged-ac2ac5dc-08f6-4faa-9427-e87be9c9d933 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.702 2 DEBUG nova.compute.manager [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-0170ec24-0bde-4eb7-b349-a8f304853e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.702 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.702 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.702 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.703 2 DEBUG nova.compute.manager [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Processing event network-vif-plugged-0170ec24-0bde-4eb7-b349-a8f304853e0d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.703 2 DEBUG nova.compute.manager [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-0170ec24-0bde-4eb7-b349-a8f304853e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.703 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.703 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.704 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.704 2 DEBUG nova.compute.manager [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No event matching network-vif-plugged-0170ec24-0bde-4eb7-b349-a8f304853e0d in dict_keys([('network-vif-plugged', 'c5debb4f-7f40-48f2-afb5-efa11af6cc4c')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.704 2 WARNING nova.compute.manager [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received unexpected event network-vif-plugged-0170ec24-0bde-4eb7-b349-a8f304853e0d for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.704 2 DEBUG nova.compute.manager [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-c5debb4f-7f40-48f2-afb5-efa11af6cc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.705 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.705 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.706 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.706 2 DEBUG nova.compute.manager [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Processing event network-vif-plugged-c5debb4f-7f40-48f2-afb5-efa11af6cc4c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.706 2 DEBUG nova.compute.manager [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-c5debb4f-7f40-48f2-afb5-efa11af6cc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.706 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.706 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.706 2 DEBUG oslo_concurrency.lockutils [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.706 2 DEBUG nova.compute.manager [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-plugged-c5debb4f-7f40-48f2-afb5-efa11af6cc4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.707 2 WARNING nova.compute.manager [req-ed877dbc-aace-499b-be3c-1ed226a28ce1 req-58d58654-48cc-46e5-aa74-8776f81b7910 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received unexpected event network-vif-plugged-c5debb4f-7f40-48f2-afb5-efa11af6cc4c for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.707 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.712 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.716 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407638.7165613, 969ba235-be4a-44e1-a6f2-7c5922b9661e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.717 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.719 2 INFO nova.virt.libvirt.driver [-] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Instance spawned successfully.#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.719 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.737 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.747 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.752 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.752 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.753 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.754 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.754 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.755 2 DEBUG nova.virt.libvirt.driver [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:38 np0005465988 podman[271480]: 2025-10-02 12:20:38.763152971 +0000 UTC m=+0.073411718 container create c5e0984d49c7a3eb3316a2878ef27ab3584081d01907817567a359b167b9b574 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.784 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:38 np0005465988 systemd[1]: Started libpod-conmon-c5e0984d49c7a3eb3316a2878ef27ab3584081d01907817567a359b167b9b574.scope.
Oct  2 08:20:38 np0005465988 podman[271480]: 2025-10-02 12:20:38.729940994 +0000 UTC m=+0.040199811 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:20:38 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:20:38 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61a7a26443364e2fbecce32a7fbd5a452c008631c6cf84c7ca029f3a7f0523c9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:20:38 np0005465988 podman[271480]: 2025-10-02 12:20:38.850117232 +0000 UTC m=+0.160375969 container init c5e0984d49c7a3eb3316a2878ef27ab3584081d01907817567a359b167b9b574 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 08:20:38 np0005465988 podman[271480]: 2025-10-02 12:20:38.85690201 +0000 UTC m=+0.167160747 container start c5e0984d49c7a3eb3316a2878ef27ab3584081d01907817567a359b167b9b574 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.865 2 INFO nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Took 24.91 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.866 2 DEBUG nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:38 np0005465988 neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e[271494]: [NOTICE]   (271498) : New worker (271501) forked
Oct  2 08:20:38 np0005465988 neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e[271494]: [NOTICE]   (271498) : Loading success.
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.917 142124 INFO neutron.agent.ovn.metadata.agent [-] Port c5debb4f-7f40-48f2-afb5-efa11af6cc4c in datapath b73044fd-31f0-4d49-88bb-64109a59249a unbound from our chassis#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.920 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b73044fd-31f0-4d49-88bb-64109a59249a#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.937 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3ad93c5b-bbfc-4e80-bd12-0b97a51aafb0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.969 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d6200352-f536-4395-b240-2aeef5dc1556]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.971 2 INFO nova.compute.manager [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Took 31.13 seconds to build instance.#033[00m
Oct  2 08:20:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:38.973 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4c10ec3a-e582-452e-9f38-a83228f28c6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:38 np0005465988 nova_compute[236126]: 2025-10-02 12:20:38.992 2 DEBUG oslo_concurrency.lockutils [None req-e12ad3b3-efc7-45f2-a250-fc96b684fe25 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 31.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.015 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5ae307eb-81b6-4324-a014-57ef3dc28296]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e257 e257: 3 total, 3 up, 3 in
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.046 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[23b75502-7cc4-42ae-a2ea-a15e70df5b19]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb73044fd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:59:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 8, 'rx_bytes': 612, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 8, 'rx_bytes': 612, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569715, 'reachable_time': 33333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 528, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 528, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271515, 'error': None, 'target': 'ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.066 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8f48c53a-4e4f-4740-bd98-82c75f6d253c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb73044fd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569733, 'tstamp': 569733}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271516, 'error': None, 'target': 'ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tapb73044fd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569738, 'tstamp': 569738}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271516, 'error': None, 'target': 'ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.068 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb73044fd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.072 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb73044fd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.072 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.073 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb73044fd-30, col_values=(('external_ids', {'iface-id': 'be4fbd69-8ace-4de3-94ae-0a974cc165ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:39 np0005465988 nova_compute[236126]: 2025-10-02 12:20:39.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.073 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.075 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b in datapath 3f8de4fc-b69f-4cee-bee5-9f9b5275933e unbound from our chassis#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.077 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3f8de4fc-b69f-4cee-bee5-9f9b5275933e#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.099 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[238196cf-ffe3-4f05-8a8b-305a3d38cf88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.136 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e847320d-f34a-47ad-8710-2ba6a34b2272]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.142 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e26886ff-0a39-4b0e-b132-d6ccf8ef6d2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:39.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:39.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.176 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[eb9ac34a-838a-4e62-8f18-8385574fa8e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.200 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d33a5783-1d89-4086-931c-cf42ffff7546]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f8de4fc-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0b:05:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 306, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 306, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 94], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569845, 'reachable_time': 30912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 264, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 264, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271546, 'error': None, 'target': 'ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.228 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c68f12b2-e2bd-4018-b132-ecf2d4ec5c4b]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.2.2.2'], ['IFA_LOCAL', '10.2.2.2'], ['IFA_BROADCAST', '10.2.2.255'], ['IFA_LABEL', 'tap3f8de4fc-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569858, 'tstamp': 569858}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271548, 'error': None, 'target': 'ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3f8de4fc-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569860, 'tstamp': 569860}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271548, 'error': None, 'target': 'ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.230 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f8de4fc-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.233 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f8de4fc-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:39 np0005465988 nova_compute[236126]: 2025-10-02 12:20:39.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.234 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.234 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3f8de4fc-b0, col_values=(('external_ids', {'iface-id': 'c420bb84-cdaf-431d-8fb8-c332d6fdbb10'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.235 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.237 142124 INFO neutron.agent.ovn.metadata.agent [-] Port fd3698a6-a68d-42ed-b217-f5bdc4163195 in datapath b73044fd-31f0-4d49-88bb-64109a59249a unbound from our chassis#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.239 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b73044fd-31f0-4d49-88bb-64109a59249a#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.260 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[278334ae-364d-427e-9fe5-8e2977145fcc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.316 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[226ad45e-e087-4dc0-ae58-2840f924cb65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.321 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e998bdec-af9e-49e1-af42-44504f18e9fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.375 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[077dec74-404b-42bd-8c10-8a163e8cab21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.414 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cb71661f-ff88-43a7-a3e6-55822979bb8c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb73044fd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:59:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 10, 'rx_bytes': 612, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 10, 'rx_bytes': 612, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569715, 'reachable_time': 33333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 528, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 528, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271630, 'error': None, 'target': 'ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.441 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dd9c62cf-2b9b-4a85-a92f-fe0de0d325b3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb73044fd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569733, 'tstamp': 569733}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271661, 'error': None, 'target': 'ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tapb73044fd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569738, 'tstamp': 569738}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271661, 'error': None, 'target': 'ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.444 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb73044fd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:39 np0005465988 nova_compute[236126]: 2025-10-02 12:20:39.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:39 np0005465988 nova_compute[236126]: 2025-10-02 12:20:39.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.454 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb73044fd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.456 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.460 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb73044fd-30, col_values=(('external_ids', {'iface-id': 'be4fbd69-8ace-4de3-94ae-0a974cc165ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:39.461 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e258 e258: 3 total, 3 up, 3 in
Oct  2 08:20:40 np0005465988 nova_compute[236126]: 2025-10-02 12:20:40.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:41 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:20:41 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:20:41 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:20:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:41.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:41.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:41 np0005465988 NetworkManager[45041]: <info>  [1759407641.2094] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Oct  2 08:20:41 np0005465988 NetworkManager[45041]: <info>  [1759407641.2115] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Oct  2 08:20:41 np0005465988 nova_compute[236126]: 2025-10-02 12:20:41.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:41 np0005465988 nova_compute[236126]: 2025-10-02 12:20:41.322 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407626.3214686, e116d367-5ae9-4ce2-9d33-3936fd3de658 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:41 np0005465988 nova_compute[236126]: 2025-10-02 12:20:41.323 2 INFO nova.compute.manager [-] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:20:41 np0005465988 nova_compute[236126]: 2025-10-02 12:20:41.345 2 DEBUG nova.compute.manager [None req-5d4e45a1-4444-46f9-8ce4-fe2ce10ea8c8 - - - - - -] [instance: e116d367-5ae9-4ce2-9d33-3936fd3de658] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:41 np0005465988 nova_compute[236126]: 2025-10-02 12:20:41.480 2 DEBUG nova.compute.manager [req-54877b58-9a60-4403-979a-d4985500cb7d req-a4c116c5-ebca-4fae-ae19-084fa89500eb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-changed-c51fc487-eedd-421d-b8cc-d0a322b4a129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:41 np0005465988 nova_compute[236126]: 2025-10-02 12:20:41.480 2 DEBUG nova.compute.manager [req-54877b58-9a60-4403-979a-d4985500cb7d req-a4c116c5-ebca-4fae-ae19-084fa89500eb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing instance network info cache due to event network-changed-c51fc487-eedd-421d-b8cc-d0a322b4a129. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:20:41 np0005465988 nova_compute[236126]: 2025-10-02 12:20:41.481 2 DEBUG oslo_concurrency.lockutils [req-54877b58-9a60-4403-979a-d4985500cb7d req-a4c116c5-ebca-4fae-ae19-084fa89500eb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:41 np0005465988 nova_compute[236126]: 2025-10-02 12:20:41.481 2 DEBUG oslo_concurrency.lockutils [req-54877b58-9a60-4403-979a-d4985500cb7d req-a4c116c5-ebca-4fae-ae19-084fa89500eb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:41 np0005465988 nova_compute[236126]: 2025-10-02 12:20:41.481 2 DEBUG nova.network.neutron [req-54877b58-9a60-4403-979a-d4985500cb7d req-a4c116c5-ebca-4fae-ae19-084fa89500eb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Refreshing network info cache for port c51fc487-eedd-421d-b8cc-d0a322b4a129 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:41 np0005465988 nova_compute[236126]: 2025-10-02 12:20:41.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:41 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:41Z|00311|binding|INFO|Releasing lport 0683cf03-5d0f-4c2e-8b28-7912d215445a from this chassis (sb_readonly=0)
Oct  2 08:20:41 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:41Z|00312|binding|INFO|Releasing lport c420bb84-cdaf-431d-8fb8-c332d6fdbb10 from this chassis (sb_readonly=0)
Oct  2 08:20:41 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:41Z|00313|binding|INFO|Releasing lport be4fbd69-8ace-4de3-94ae-0a974cc165ac from this chassis (sb_readonly=0)
Oct  2 08:20:41 np0005465988 nova_compute[236126]: 2025-10-02 12:20:41.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:20:42 np0005465988 nova_compute[236126]: 2025-10-02 12:20:42.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:43.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:20:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:43.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:20:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e259 e259: 3 total, 3 up, 3 in
Oct  2 08:20:44 np0005465988 nova_compute[236126]: 2025-10-02 12:20:44.034 2 DEBUG nova.network.neutron [req-54877b58-9a60-4403-979a-d4985500cb7d req-a4c116c5-ebca-4fae-ae19-084fa89500eb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updated VIF entry in instance network info cache for port c51fc487-eedd-421d-b8cc-d0a322b4a129. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:20:44 np0005465988 nova_compute[236126]: 2025-10-02 12:20:44.035 2 DEBUG nova.network.neutron [req-54877b58-9a60-4403-979a-d4985500cb7d req-a4c116c5-ebca-4fae-ae19-084fa89500eb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating instance_info_cache with network_info: [{"id": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "address": "fa:16:3e:95:6f:4c", "network": {"id": "082f75aa-3cb3-4aac-903c-8187fdb62a93", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-550185625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51fc487-ee", "ovs_interfaceid": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "address": "fa:16:3e:75:0f:50", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.226", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5debb4f-7f", "ovs_interfaceid": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "address": "fa:16:3e:20:95:32", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0c51ef0-db", "ovs_interfaceid": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:44 np0005465988 nova_compute[236126]: 2025-10-02 12:20:44.069 2 DEBUG oslo_concurrency.lockutils [req-54877b58-9a60-4403-979a-d4985500cb7d req-a4c116c5-ebca-4fae-ae19-084fa89500eb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-969ba235-be4a-44e1-a6f2-7c5922b9661e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:44 np0005465988 nova_compute[236126]: 2025-10-02 12:20:44.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:44.131 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:44.134 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:20:44 np0005465988 nova_compute[236126]: 2025-10-02 12:20:44.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:44 np0005465988 nova_compute[236126]: 2025-10-02 12:20:44.526 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:44 np0005465988 nova_compute[236126]: 2025-10-02 12:20:44.527 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:44 np0005465988 nova_compute[236126]: 2025-10-02 12:20:44.527 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:44 np0005465988 nova_compute[236126]: 2025-10-02 12:20:44.527 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:20:44 np0005465988 nova_compute[236126]: 2025-10-02 12:20:44.528 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:44 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:44Z|00314|binding|INFO|Releasing lport 0683cf03-5d0f-4c2e-8b28-7912d215445a from this chassis (sb_readonly=0)
Oct  2 08:20:44 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:44Z|00315|binding|INFO|Releasing lport c420bb84-cdaf-431d-8fb8-c332d6fdbb10 from this chassis (sb_readonly=0)
Oct  2 08:20:44 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:44Z|00316|binding|INFO|Releasing lport be4fbd69-8ace-4de3-94ae-0a974cc165ac from this chassis (sb_readonly=0)
Oct  2 08:20:44 np0005465988 nova_compute[236126]: 2025-10-02 12:20:44.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:20:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1346637038' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.033 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.134 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000053 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.135 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000053 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.135 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000053 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.135 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000053 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:45.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e260 e260: 3 total, 3 up, 3 in
Oct  2 08:20:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:45.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.386 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.388 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4337MB free_disk=20.935108184814453GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.388 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.389 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.468 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 969ba235-be4a-44e1-a6f2-7c5922b9661e actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.469 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.469 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.529 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:20:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2418439772' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.972 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.976 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:45 np0005465988 nova_compute[236126]: 2025-10-02 12:20:45.992 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:46 np0005465988 nova_compute[236126]: 2025-10-02 12:20:46.021 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:20:46 np0005465988 nova_compute[236126]: 2025-10-02 12:20:46.021 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e261 e261: 3 total, 3 up, 3 in
Oct  2 08:20:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:47.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:47.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:20:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:20:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e262 e262: 3 total, 3 up, 3 in
Oct  2 08:20:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:20:47 np0005465988 podman[271762]: 2025-10-02 12:20:47.536323075 +0000 UTC m=+0.066833896 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent)
Oct  2 08:20:47 np0005465988 nova_compute[236126]: 2025-10-02 12:20:47.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:48 np0005465988 nova_compute[236126]: 2025-10-02 12:20:48.022 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:48 np0005465988 nova_compute[236126]: 2025-10-02 12:20:48.022 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:48 np0005465988 nova_compute[236126]: 2025-10-02 12:20:48.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:48 np0005465988 nova_compute[236126]: 2025-10-02 12:20:48.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:48 np0005465988 nova_compute[236126]: 2025-10-02 12:20:48.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:48 np0005465988 nova_compute[236126]: 2025-10-02 12:20:48.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:20:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e263 e263: 3 total, 3 up, 3 in
Oct  2 08:20:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:49.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:49.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:50 np0005465988 nova_compute[236126]: 2025-10-02 12:20:50.122 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407635.1208453, bc4239f5-3cf2-4325-803c-73121f7e0ee0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:50 np0005465988 nova_compute[236126]: 2025-10-02 12:20:50.122 2 INFO nova.compute.manager [-] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:20:50 np0005465988 nova_compute[236126]: 2025-10-02 12:20:50.149 2 DEBUG nova.compute.manager [None req-8e0dedae-3327-4dfb-9236-f4153fbbc1f3 - - - - - -] [instance: bc4239f5-3cf2-4325-803c-73121f7e0ee0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:50 np0005465988 nova_compute[236126]: 2025-10-02 12:20:50.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:50 np0005465988 nova_compute[236126]: 2025-10-02 12:20:50.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:51.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:51.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e264 e264: 3 total, 3 up, 3 in
Oct  2 08:20:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:20:52 np0005465988 nova_compute[236126]: 2025-10-02 12:20:52.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:52 np0005465988 nova_compute[236126]: 2025-10-02 12:20:52.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:20:53.136 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:20:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:53.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:20:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:53.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:53 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:53Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:49:9b:57 10.2.2.100
Oct  2 08:20:53 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:53Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:49:9b:57 10.2.2.100
Oct  2 08:20:53 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:53Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f7:ab:4e 10.1.1.172
Oct  2 08:20:53 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:53Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:ab:4e 10.1.1.172
Oct  2 08:20:53 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:53Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:20:95:32 10.2.2.200
Oct  2 08:20:53 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:53Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:20:95:32 10.2.2.200
Oct  2 08:20:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e265 e265: 3 total, 3 up, 3 in
Oct  2 08:20:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:54Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:6f:4c 10.100.0.6
Oct  2 08:20:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:54Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:6f:4c 10.100.0.6
Oct  2 08:20:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:54Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:75:0f:50 10.1.1.226
Oct  2 08:20:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:54Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:75:0f:50 10.1.1.226
Oct  2 08:20:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:54Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0a:bc:e6 10.1.1.88
Oct  2 08:20:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:54Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0a:bc:e6 10.1.1.88
Oct  2 08:20:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:54Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:36:5f:d8 10.1.1.46
Oct  2 08:20:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:20:54Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:36:5f:d8 10.1.1.46
Oct  2 08:20:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e266 e266: 3 total, 3 up, 3 in
Oct  2 08:20:55 np0005465988 nova_compute[236126]: 2025-10-02 12:20:55.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:55.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:55.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:56 np0005465988 nova_compute[236126]: 2025-10-02 12:20:56.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:56 np0005465988 nova_compute[236126]: 2025-10-02 12:20:56.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:20:56 np0005465988 nova_compute[236126]: 2025-10-02 12:20:56.619 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:20:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:57.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:20:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:57.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:20:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:20:57 np0005465988 nova_compute[236126]: 2025-10-02 12:20:57.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e267 e267: 3 total, 3 up, 3 in
Oct  2 08:20:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:20:59.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:20:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:20:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:20:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:20:59.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:00 np0005465988 nova_compute[236126]: 2025-10-02 12:21:00.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:01.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:01.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:21:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e268 e268: 3 total, 3 up, 3 in
Oct  2 08:21:02 np0005465988 nova_compute[236126]: 2025-10-02 12:21:02.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:03.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:03.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e269 e269: 3 total, 3 up, 3 in
Oct  2 08:21:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e270 e270: 3 total, 3 up, 3 in
Oct  2 08:21:05 np0005465988 nova_compute[236126]: 2025-10-02 12:21:05.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:21:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:05.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:21:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:05.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:05 np0005465988 podman[271895]: 2025-10-02 12:21:05.551991833 +0000 UTC m=+0.070094211 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.license=GPLv2)
Oct  2 08:21:05 np0005465988 podman[271894]: 2025-10-02 12:21:05.562797327 +0000 UTC m=+0.081169103 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:21:05 np0005465988 podman[271893]: 2025-10-02 12:21:05.578401722 +0000 UTC m=+0.107576153 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.098929) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407667098981, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 2515, "num_deletes": 261, "total_data_size": 5691199, "memory_usage": 5777328, "flush_reason": "Manual Compaction"}
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407667123012, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 3715692, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39275, "largest_seqno": 41785, "table_properties": {"data_size": 3705427, "index_size": 6503, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2693, "raw_key_size": 22161, "raw_average_key_size": 21, "raw_value_size": 3684699, "raw_average_value_size": 3509, "num_data_blocks": 281, "num_entries": 1050, "num_filter_entries": 1050, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759407476, "oldest_key_time": 1759407476, "file_creation_time": 1759407667, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 24150 microseconds, and 13473 cpu microseconds.
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.123072) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 3715692 bytes OK
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.123105) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.126022) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.126049) EVENT_LOG_v1 {"time_micros": 1759407667126039, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.126076) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 5680065, prev total WAL file size 5680065, number of live WAL files 2.
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.128391) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(3628KB)], [75(9635KB)]
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407667128433, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 13582895, "oldest_snapshot_seqno": -1}
Oct  2 08:21:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:07.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6669 keys, 11644434 bytes, temperature: kUnknown
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407667206591, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 11644434, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11597222, "index_size": 29436, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16709, "raw_key_size": 170870, "raw_average_key_size": 25, "raw_value_size": 11475084, "raw_average_value_size": 1720, "num_data_blocks": 1177, "num_entries": 6669, "num_filter_entries": 6669, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759407667, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.206919) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 11644434 bytes
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.208547) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 173.6 rd, 148.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 9.4 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(6.8) write-amplify(3.1) OK, records in: 7203, records dropped: 534 output_compression: NoCompression
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.208576) EVENT_LOG_v1 {"time_micros": 1759407667208562, "job": 46, "event": "compaction_finished", "compaction_time_micros": 78254, "compaction_time_cpu_micros": 44878, "output_level": 6, "num_output_files": 1, "total_output_size": 11644434, "num_input_records": 7203, "num_output_records": 6669, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407667210008, "job": 46, "event": "table_file_deletion", "file_number": 77}
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407667213291, "job": 46, "event": "table_file_deletion", "file_number": 75}
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.128245) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.213386) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.213394) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.213395) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.213397) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:21:07.213399) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:21:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:07.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:21:07 np0005465988 nova_compute[236126]: 2025-10-02 12:21:07.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e271 e271: 3 total, 3 up, 3 in
Oct  2 08:21:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:09.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:09.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:10 np0005465988 nova_compute[236126]: 2025-10-02 12:21:10.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:11.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:21:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:11.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:21:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:21:12 np0005465988 nova_compute[236126]: 2025-10-02 12:21:12.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.013 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.013 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.032 2 DEBUG nova.compute.manager [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.126 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.127 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.138 2 DEBUG nova.virt.hardware [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.138 2 INFO nova.compute.claims [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:21:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:13.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:13.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.288 2 DEBUG oslo_concurrency.processutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:21:13 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/639431983' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.732 2 DEBUG oslo_concurrency.processutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.742 2 DEBUG nova.compute.provider_tree [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.759 2 DEBUG nova.scheduler.client.report [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.785 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.786 2 DEBUG nova.compute.manager [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.829 2 DEBUG nova.compute.manager [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.830 2 DEBUG nova.network.neutron [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.856 2 INFO nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:21:13 np0005465988 nova_compute[236126]: 2025-10-02 12:21:13.880 2 DEBUG nova.compute.manager [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.014 2 DEBUG nova.compute.manager [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.017 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.018 2 INFO nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Creating image(s)#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.061 2 DEBUG nova.storage.rbd_utils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] rbd image 9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.110 2 DEBUG nova.storage.rbd_utils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] rbd image 9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.144 2 DEBUG nova.storage.rbd_utils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] rbd image 9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.149 2 DEBUG oslo_concurrency.processutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e272 e272: 3 total, 3 up, 3 in
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.190 2 DEBUG nova.policy [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a9f7faffac7240869a0196df1ddda7e5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1c2c11ebecb14f3188f35ea473c4ca02', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.245 2 DEBUG oslo_concurrency.processutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.247 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.248 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.248 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.285 2 DEBUG nova.storage.rbd_utils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] rbd image 9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.291 2 DEBUG oslo_concurrency.processutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.897 2 DEBUG oslo_concurrency.processutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.606s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:14 np0005465988 nova_compute[236126]: 2025-10-02 12:21:14.995 2 DEBUG nova.storage.rbd_utils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] resizing rbd image 9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:21:15 np0005465988 nova_compute[236126]: 2025-10-02 12:21:15.153 2 DEBUG nova.objects.instance [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lazy-loading 'migration_context' on Instance uuid 9c7a04c1-a740-4d58-bb92-b34f14ccff42 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:15 np0005465988 nova_compute[236126]: 2025-10-02 12:21:15.157 2 DEBUG nova.network.neutron [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Successfully created port: 8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:21:15 np0005465988 nova_compute[236126]: 2025-10-02 12:21:15.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:15 np0005465988 nova_compute[236126]: 2025-10-02 12:21:15.179 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:21:15 np0005465988 nova_compute[236126]: 2025-10-02 12:21:15.179 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Ensure instance console log exists: /var/lib/nova/instances/9c7a04c1-a740-4d58-bb92-b34f14ccff42/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:21:15 np0005465988 nova_compute[236126]: 2025-10-02 12:21:15.180 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:15 np0005465988 nova_compute[236126]: 2025-10-02 12:21:15.180 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:15 np0005465988 nova_compute[236126]: 2025-10-02 12:21:15.181 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:15.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:15.195 142241 DEBUG eventlet.wsgi.server [-] (142241) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Oct  2 08:21:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:15.197 142241 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0#015
Oct  2 08:21:15 np0005465988 ovn_metadata_agent[142119]: Accept: */*#015
Oct  2 08:21:15 np0005465988 ovn_metadata_agent[142119]: Connection: close#015
Oct  2 08:21:15 np0005465988 ovn_metadata_agent[142119]: Content-Type: text/plain#015
Oct  2 08:21:15 np0005465988 ovn_metadata_agent[142119]: Host: 169.254.169.254#015
Oct  2 08:21:15 np0005465988 ovn_metadata_agent[142119]: User-Agent: curl/7.84.0#015
Oct  2 08:21:15 np0005465988 ovn_metadata_agent[142119]: X-Forwarded-For: 10.100.0.6#015
Oct  2 08:21:15 np0005465988 ovn_metadata_agent[142119]: X-Ovn-Network-Id: 082f75aa-3cb3-4aac-903c-8187fdb62a93 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Oct  2 08:21:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:15.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:15.924 142241 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Oct  2 08:21:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:15.925 142241 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 2550 time: 0.7280929#033[00m
Oct  2 08:21:15 np0005465988 haproxy-metadata-proxy-082f75aa-3cb3-4aac-903c-8187fdb62a93[271257]: 10.100.0.6:48432 [02/Oct/2025:12:21:15.193] listener listener/metadata 0/0/0/731/731 200 2534 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Oct  2 08:21:16 np0005465988 nova_compute[236126]: 2025-10-02 12:21:16.048 2 DEBUG nova.network.neutron [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Successfully updated port: 8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:21:16 np0005465988 nova_compute[236126]: 2025-10-02 12:21:16.064 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "refresh_cache-9c7a04c1-a740-4d58-bb92-b34f14ccff42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:16 np0005465988 nova_compute[236126]: 2025-10-02 12:21:16.065 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquired lock "refresh_cache-9c7a04c1-a740-4d58-bb92-b34f14ccff42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:16 np0005465988 nova_compute[236126]: 2025-10-02 12:21:16.065 2 DEBUG nova.network.neutron [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:21:16 np0005465988 nova_compute[236126]: 2025-10-02 12:21:16.166 2 DEBUG nova.compute.manager [req-2ffee095-bdc9-4943-936b-ff686971f2e3 req-91825572-95b1-43c4-bdc3-9cd06d01a998 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Received event network-changed-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:16 np0005465988 nova_compute[236126]: 2025-10-02 12:21:16.166 2 DEBUG nova.compute.manager [req-2ffee095-bdc9-4943-936b-ff686971f2e3 req-91825572-95b1-43c4-bdc3-9cd06d01a998 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Refreshing instance network info cache due to event network-changed-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:21:16 np0005465988 nova_compute[236126]: 2025-10-02 12:21:16.166 2 DEBUG oslo_concurrency.lockutils [req-2ffee095-bdc9-4943-936b-ff686971f2e3 req-91825572-95b1-43c4-bdc3-9cd06d01a998 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-9c7a04c1-a740-4d58-bb92-b34f14ccff42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:16 np0005465988 nova_compute[236126]: 2025-10-02 12:21:16.221 2 DEBUG nova.network.neutron [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:21:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:17.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:17.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.644 2 DEBUG oslo_concurrency.lockutils [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.644 2 DEBUG oslo_concurrency.lockutils [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.646 2 DEBUG oslo_concurrency.lockutils [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.646 2 DEBUG oslo_concurrency.lockutils [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.646 2 DEBUG oslo_concurrency.lockutils [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.648 2 INFO nova.compute.manager [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Terminating instance#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.650 2 DEBUG nova.compute.manager [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.705 2 DEBUG nova.network.neutron [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Updating instance_info_cache with network_info: [{"id": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "address": "fa:16:3e:cd:6e:67", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c8b83a0-49", "ovs_interfaceid": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.727 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Releasing lock "refresh_cache-9c7a04c1-a740-4d58-bb92-b34f14ccff42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.728 2 DEBUG nova.compute.manager [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Instance network_info: |[{"id": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "address": "fa:16:3e:cd:6e:67", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c8b83a0-49", "ovs_interfaceid": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.729 2 DEBUG oslo_concurrency.lockutils [req-2ffee095-bdc9-4943-936b-ff686971f2e3 req-91825572-95b1-43c4-bdc3-9cd06d01a998 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-9c7a04c1-a740-4d58-bb92-b34f14ccff42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.729 2 DEBUG nova.network.neutron [req-2ffee095-bdc9-4943-936b-ff686971f2e3 req-91825572-95b1-43c4-bdc3-9cd06d01a998 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Refreshing network info cache for port 8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.735 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Start _get_guest_xml network_info=[{"id": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "address": "fa:16:3e:cd:6e:67", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c8b83a0-49", "ovs_interfaceid": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.742 2 WARNING nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.748 2 DEBUG nova.virt.libvirt.host [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.749 2 DEBUG nova.virt.libvirt.host [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.759 2 DEBUG nova.virt.libvirt.host [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.760 2 DEBUG nova.virt.libvirt.host [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.762 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.762 2 DEBUG nova.virt.hardware [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.763 2 DEBUG nova.virt.hardware [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.764 2 DEBUG nova.virt.hardware [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.764 2 DEBUG nova.virt.hardware [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.765 2 DEBUG nova.virt.hardware [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.765 2 DEBUG nova.virt.hardware [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.766 2 DEBUG nova.virt.hardware [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.766 2 DEBUG nova.virt.hardware [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.767 2 DEBUG nova.virt.hardware [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.767 2 DEBUG nova.virt.hardware [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.767 2 DEBUG nova.virt.hardware [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.772 2 DEBUG oslo_concurrency.processutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:17 np0005465988 nova_compute[236126]: 2025-10-02 12:21:17.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:17 np0005465988 kernel: tapc51fc487-ee (unregistering): left promiscuous mode
Oct  2 08:21:18 np0005465988 NetworkManager[45041]: <info>  [1759407678.0014] device (tapc51fc487-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00317|binding|INFO|Releasing lport c51fc487-eedd-421d-b8cc-d0a322b4a129 from this chassis (sb_readonly=0)
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00318|binding|INFO|Setting lport c51fc487-eedd-421d-b8cc-d0a322b4a129 down in Southbound
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00319|binding|INFO|Removing iface tapc51fc487-ee ovn-installed in OVS
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.029 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:6f:4c 10.100.0.6'], port_security=['fa:16:3e:95:6f:4c 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-082f75aa-3cb3-4aac-903c-8187fdb62a93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84f71f6076f7425db7653ac203257df0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a5294f6e-41c7-4853-9f7a-aaececdf82e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.206'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31127d17-755e-4278-8ea5-b82c1b47b10b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=c51fc487-eedd-421d-b8cc-d0a322b4a129) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.032 142124 INFO neutron.agent.ovn.metadata.agent [-] Port c51fc487-eedd-421d-b8cc-d0a322b4a129 in datapath 082f75aa-3cb3-4aac-903c-8187fdb62a93 unbound from our chassis#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.035 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 082f75aa-3cb3-4aac-903c-8187fdb62a93, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.037 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e3bbd668-7502-4b83-92a8-7bac88b64774]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.039 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93 namespace which is not needed anymore#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 kernel: tapac2ac5dc-08 (unregistering): left promiscuous mode
Oct  2 08:21:18 np0005465988 NetworkManager[45041]: <info>  [1759407678.0659] device (tapac2ac5dc-08): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00320|binding|INFO|Releasing lport ac2ac5dc-08f6-4faa-9427-e87be9c9d933 from this chassis (sb_readonly=0)
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00321|binding|INFO|Setting lport ac2ac5dc-08f6-4faa-9427-e87be9c9d933 down in Southbound
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00322|binding|INFO|Removing iface tapac2ac5dc-08 ovn-installed in OVS
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.080 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:bc:e6 10.1.1.88'], port_security=['fa:16:3e:0a:bc:e6 10.1.1.88'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest_v242-1092778721', 'neutron:cidrs': '10.1.1.88/24', 'neutron:device_id': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b73044fd-31f0-4d49-88bb-64109a59249a', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest_v242-1092778721', 'neutron:project_id': '84f71f6076f7425db7653ac203257df0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b3030356-f7fc-436b-a72d-f5bb90702dae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3f6bb4ad-4d85-4fbe-8f9a-0eea7c1f3e56, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=ac2ac5dc-08f6-4faa-9427-e87be9c9d933) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:18 np0005465988 kernel: tap0170ec24-0b (unregistering): left promiscuous mode
Oct  2 08:21:18 np0005465988 NetworkManager[45041]: <info>  [1759407678.1041] device (tap0170ec24-0b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00323|binding|INFO|Releasing lport 0170ec24-0bde-4eb7-b349-a8f304853e0d from this chassis (sb_readonly=0)
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00324|binding|INFO|Setting lport 0170ec24-0bde-4eb7-b349-a8f304853e0d down in Southbound
Oct  2 08:21:18 np0005465988 kernel: tapc5debb4f-7f (unregistering): left promiscuous mode
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00325|binding|INFO|Removing iface tap0170ec24-0b ovn-installed in OVS
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 NetworkManager[45041]: <info>  [1759407678.1265] device (tapc5debb4f-7f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.128 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:5f:d8 10.1.1.46'], port_security=['fa:16:3e:36:5f:d8 10.1.1.46'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest_v242-206664704', 'neutron:cidrs': '10.1.1.46/24', 'neutron:device_id': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b73044fd-31f0-4d49-88bb-64109a59249a', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest_v242-206664704', 'neutron:project_id': '84f71f6076f7425db7653ac203257df0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b3030356-f7fc-436b-a72d-f5bb90702dae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3f6bb4ad-4d85-4fbe-8f9a-0eea7c1f3e56, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=0170ec24-0bde-4eb7-b349-a8f304853e0d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:18 np0005465988 podman[272172]: 2025-10-02 12:21:18.144461444 +0000 UTC m=+0.103247947 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00326|binding|INFO|Releasing lport c5debb4f-7f40-48f2-afb5-efa11af6cc4c from this chassis (sb_readonly=0)
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00327|binding|INFO|Setting lport c5debb4f-7f40-48f2-afb5-efa11af6cc4c down in Southbound
Oct  2 08:21:18 np0005465988 kernel: tapfd3698a6-a6 (unregistering): left promiscuous mode
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00328|binding|INFO|Removing iface tapc5debb4f-7f ovn-installed in OVS
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 NetworkManager[45041]: <info>  [1759407678.1639] device (tapfd3698a6-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.174 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:0f:50 10.1.1.226'], port_security=['fa:16:3e:75:0f:50 10.1.1.226'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.226/24', 'neutron:device_id': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b73044fd-31f0-4d49-88bb-64109a59249a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84f71f6076f7425db7653ac203257df0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a5294f6e-41c7-4853-9f7a-aaececdf82e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3f6bb4ad-4d85-4fbe-8f9a-0eea7c1f3e56, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=c5debb4f-7f40-48f2-afb5-efa11af6cc4c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00329|binding|INFO|Releasing lport fd3698a6-a68d-42ed-b217-f5bdc4163195 from this chassis (sb_readonly=0)
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00330|binding|INFO|Setting lport fd3698a6-a68d-42ed-b217-f5bdc4163195 down in Southbound
Oct  2 08:21:18 np0005465988 kernel: tapf89c24d0-f3 (unregistering): left promiscuous mode
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00331|binding|INFO|Removing iface tapfd3698a6-a6 ovn-installed in OVS
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 NetworkManager[45041]: <info>  [1759407678.2017] device (tapf89c24d0-f3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.204 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:ab:4e 10.1.1.172'], port_security=['fa:16:3e:f7:ab:4e 10.1.1.172'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.172/24', 'neutron:device_id': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b73044fd-31f0-4d49-88bb-64109a59249a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84f71f6076f7425db7653ac203257df0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a5294f6e-41c7-4853-9f7a-aaececdf82e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3f6bb4ad-4d85-4fbe-8f9a-0eea7c1f3e56, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=fd3698a6-a68d-42ed-b217-f5bdc4163195) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:18 np0005465988 kernel: tapf0c51ef0-db (unregistering): left promiscuous mode
Oct  2 08:21:18 np0005465988 NetworkManager[45041]: <info>  [1759407678.2260] device (tapf0c51ef0-db): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:21:18 np0005465988 neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93[271240]: [NOTICE]   (271255) : haproxy version is 2.8.14-c23fe91
Oct  2 08:21:18 np0005465988 neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93[271240]: [NOTICE]   (271255) : path to executable is /usr/sbin/haproxy
Oct  2 08:21:18 np0005465988 neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93[271240]: [WARNING]  (271255) : Exiting Master process...
Oct  2 08:21:18 np0005465988 neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93[271240]: [WARNING]  (271255) : Exiting Master process...
Oct  2 08:21:18 np0005465988 neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93[271240]: [ALERT]    (271255) : Current worker (271257) exited with code 143 (Terminated)
Oct  2 08:21:18 np0005465988 neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93[271240]: [WARNING]  (271255) : All workers exited. Exiting... (0)
Oct  2 08:21:18 np0005465988 systemd[1]: libpod-fe36d62e44c5eaa1aaaaf1fa0a9fdd9dad0e40b46c85f062b68f8ee5770f5bc0.scope: Deactivated successfully.
Oct  2 08:21:18 np0005465988 podman[272225]: 2025-10-02 12:21:18.238205252 +0000 UTC m=+0.069810313 container died fe36d62e44c5eaa1aaaaf1fa0a9fdd9dad0e40b46c85f062b68f8ee5770f5bc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00332|binding|INFO|Releasing lport f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b from this chassis (sb_readonly=0)
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00333|binding|INFO|Setting lport f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b down in Southbound
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00334|binding|INFO|Removing iface tapf89c24d0-f3 ovn-installed in OVS
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.264 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:9b:57 10.2.2.100'], port_security=['fa:16:3e:49:9b:57 10.2.2.100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.100/24', 'neutron:device_id': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f8de4fc-b69f-4cee-bee5-9f9b5275933e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84f71f6076f7425db7653ac203257df0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a5294f6e-41c7-4853-9f7a-aaececdf82e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d716296-3fee-49e1-85e2-41d6ec935efa, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:18 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fe36d62e44c5eaa1aaaaf1fa0a9fdd9dad0e40b46c85f062b68f8ee5770f5bc0-userdata-shm.mount: Deactivated successfully.
Oct  2 08:21:18 np0005465988 systemd[1]: var-lib-containers-storage-overlay-ce9784a83c7ef3c6a6711462adcd5b1d54aa5d0257d697bb56164190825633d0-merged.mount: Deactivated successfully.
Oct  2 08:21:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:21:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4182312384' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:21:18 np0005465988 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000053.scope: Deactivated successfully.
Oct  2 08:21:18 np0005465988 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000053.scope: Consumed 17.472s CPU time.
Oct  2 08:21:18 np0005465988 systemd-machined[192594]: Machine qemu-32-instance-00000053 terminated.
Oct  2 08:21:18 np0005465988 podman[272225]: 2025-10-02 12:21:18.292694608 +0000 UTC m=+0.124299659 container cleanup fe36d62e44c5eaa1aaaaf1fa0a9fdd9dad0e40b46c85f062b68f8ee5770f5bc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:21:18 np0005465988 systemd[1]: libpod-conmon-fe36d62e44c5eaa1aaaaf1fa0a9fdd9dad0e40b46c85f062b68f8ee5770f5bc0.scope: Deactivated successfully.
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.323 2 DEBUG nova.compute.manager [req-2194b059-9dcb-41d1-a52b-805beec9ba4d req-39eb968b-bfbc-4f74-96ab-3cd71aaa6c71 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-unplugged-c51fc487-eedd-421d-b8cc-d0a322b4a129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.323 2 DEBUG oslo_concurrency.lockutils [req-2194b059-9dcb-41d1-a52b-805beec9ba4d req-39eb968b-bfbc-4f74-96ab-3cd71aaa6c71 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.324 2 DEBUG oslo_concurrency.lockutils [req-2194b059-9dcb-41d1-a52b-805beec9ba4d req-39eb968b-bfbc-4f74-96ab-3cd71aaa6c71 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.324 2 DEBUG oslo_concurrency.lockutils [req-2194b059-9dcb-41d1-a52b-805beec9ba4d req-39eb968b-bfbc-4f74-96ab-3cd71aaa6c71 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.324 2 DEBUG nova.compute.manager [req-2194b059-9dcb-41d1-a52b-805beec9ba4d req-39eb968b-bfbc-4f74-96ab-3cd71aaa6c71 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-unplugged-c51fc487-eedd-421d-b8cc-d0a322b4a129 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.324 2 DEBUG nova.compute.manager [req-2194b059-9dcb-41d1-a52b-805beec9ba4d req-39eb968b-bfbc-4f74-96ab-3cd71aaa6c71 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-unplugged-c51fc487-eedd-421d-b8cc-d0a322b4a129 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00335|binding|INFO|Releasing lport f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 from this chassis (sb_readonly=0)
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00336|binding|INFO|Setting lport f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 down in Southbound
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:18Z|00337|binding|INFO|Removing iface tapf0c51ef0-db ovn-installed in OVS
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.357 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:95:32 10.2.2.200'], port_security=['fa:16:3e:20:95:32 10.2.2.200'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.200/24', 'neutron:device_id': '969ba235-be4a-44e1-a6f2-7c5922b9661e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f8de4fc-b69f-4cee-bee5-9f9b5275933e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84f71f6076f7425db7653ac203257df0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a5294f6e-41c7-4853-9f7a-aaececdf82e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d716296-3fee-49e1-85e2-41d6ec935efa, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.357 2 DEBUG oslo_concurrency.processutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.585s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.385 2 DEBUG nova.storage.rbd_utils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] rbd image 9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.391 2 DEBUG oslo_concurrency.processutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:18 np0005465988 podman[272280]: 2025-10-02 12:21:18.40546411 +0000 UTC m=+0.040812098 container remove fe36d62e44c5eaa1aaaaf1fa0a9fdd9dad0e40b46c85f062b68f8ee5770f5bc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.410 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a88fb9-f86a-4bc9-8c88-ca0d64edbe8a]: (4, ('Thu Oct  2 12:21:18 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93 (fe36d62e44c5eaa1aaaaf1fa0a9fdd9dad0e40b46c85f062b68f8ee5770f5bc0)\nfe36d62e44c5eaa1aaaaf1fa0a9fdd9dad0e40b46c85f062b68f8ee5770f5bc0\nThu Oct  2 12:21:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93 (fe36d62e44c5eaa1aaaaf1fa0a9fdd9dad0e40b46c85f062b68f8ee5770f5bc0)\nfe36d62e44c5eaa1aaaaf1fa0a9fdd9dad0e40b46c85f062b68f8ee5770f5bc0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.412 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ccb58aca-b3eb-49a4-97fb-3b48981232b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.413 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap082f75aa-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:18 np0005465988 kernel: tap082f75aa-30: left promiscuous mode
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.428 2 DEBUG nova.compute.manager [req-c83972c8-3329-4777-851e-2f18319b8b35 req-548cda19-3e33-4a6b-9c66-13f799fcd9b7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-unplugged-ac2ac5dc-08f6-4faa-9427-e87be9c9d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.429 2 DEBUG oslo_concurrency.lockutils [req-c83972c8-3329-4777-851e-2f18319b8b35 req-548cda19-3e33-4a6b-9c66-13f799fcd9b7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.429 2 DEBUG oslo_concurrency.lockutils [req-c83972c8-3329-4777-851e-2f18319b8b35 req-548cda19-3e33-4a6b-9c66-13f799fcd9b7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.429 2 DEBUG oslo_concurrency.lockutils [req-c83972c8-3329-4777-851e-2f18319b8b35 req-548cda19-3e33-4a6b-9c66-13f799fcd9b7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.430 2 DEBUG nova.compute.manager [req-c83972c8-3329-4777-851e-2f18319b8b35 req-548cda19-3e33-4a6b-9c66-13f799fcd9b7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-unplugged-ac2ac5dc-08f6-4faa-9427-e87be9c9d933 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.430 2 DEBUG nova.compute.manager [req-c83972c8-3329-4777-851e-2f18319b8b35 req-548cda19-3e33-4a6b-9c66-13f799fcd9b7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-unplugged-ac2ac5dc-08f6-4faa-9427-e87be9c9d933 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.448 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[681dcc08-209e-493c-b62f-20cdb041c229]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.474 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fc44f777-a470-465b-8300-67f2a764e983]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.476 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ff2f7690-faa1-479b-abae-2060294bd23c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 NetworkManager[45041]: <info>  [1759407678.4891] manager: (tapac2ac5dc-08): new Tun device (/org/freedesktop/NetworkManager/Devices/163)
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.492 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[da630db0-05ea-409a-80f2-539a614de814]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569608, 'reachable_time': 32945, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272340, 'error': None, 'target': 'ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 systemd[1]: run-netns-ovnmeta\x2d082f75aa\x2d3cb3\x2d4aac\x2d903c\x2d8187fdb62a93.mount: Deactivated successfully.
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.498 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-082f75aa-3cb3-4aac-903c-8187fdb62a93 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.498 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[22b18d02-8e73-414a-99c4-d817677b8cf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.505 142124 INFO neutron.agent.ovn.metadata.agent [-] Port ac2ac5dc-08f6-4faa-9427-e87be9c9d933 in datapath b73044fd-31f0-4d49-88bb-64109a59249a unbound from our chassis#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.512 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b73044fd-31f0-4d49-88bb-64109a59249a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.512 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2ba251bb-931f-4c80-9c1b-131661bc2f05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.513 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a namespace which is not needed anymore#033[00m
Oct  2 08:21:18 np0005465988 NetworkManager[45041]: <info>  [1759407678.5190] manager: (tapc5debb4f-7f): new Tun device (/org/freedesktop/NetworkManager/Devices/164)
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.566 2 DEBUG nova.compute.manager [req-d630fca7-20f4-428c-8224-d9532efcd1e4 req-e5129d69-0cf5-48ef-ac35-5968fda59290 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-unplugged-fd3698a6-a68d-42ed-b217-f5bdc4163195 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.567 2 DEBUG oslo_concurrency.lockutils [req-d630fca7-20f4-428c-8224-d9532efcd1e4 req-e5129d69-0cf5-48ef-ac35-5968fda59290 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:18 np0005465988 NetworkManager[45041]: <info>  [1759407678.5691] manager: (tapf0c51ef0-db): new Tun device (/org/freedesktop/NetworkManager/Devices/165)
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.574 2 DEBUG oslo_concurrency.lockutils [req-d630fca7-20f4-428c-8224-d9532efcd1e4 req-e5129d69-0cf5-48ef-ac35-5968fda59290 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.575 2 DEBUG oslo_concurrency.lockutils [req-d630fca7-20f4-428c-8224-d9532efcd1e4 req-e5129d69-0cf5-48ef-ac35-5968fda59290 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.576 2 DEBUG nova.compute.manager [req-d630fca7-20f4-428c-8224-d9532efcd1e4 req-e5129d69-0cf5-48ef-ac35-5968fda59290 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-unplugged-fd3698a6-a68d-42ed-b217-f5bdc4163195 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.576 2 DEBUG nova.compute.manager [req-d630fca7-20f4-428c-8224-d9532efcd1e4 req-e5129d69-0cf5-48ef-ac35-5968fda59290 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-unplugged-fd3698a6-a68d-42ed-b217-f5bdc4163195 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.592 2 INFO nova.virt.libvirt.driver [-] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Instance destroyed successfully.#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.593 2 DEBUG nova.objects.instance [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lazy-loading 'resources' on Instance uuid 969ba235-be4a-44e1-a6f2-7c5922b9661e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.617 2 DEBUG nova.virt.libvirt.vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:20:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus=
'usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:20:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "address": "fa:16:3e:95:6f:4c", "network": {"id": "082f75aa-3cb3-4aac-903c-8187fdb62a93", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-550185625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51fc487-ee", "ovs_interfaceid": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.617 2 DEBUG nova.network.os_vif_util [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "address": "fa:16:3e:95:6f:4c", "network": {"id": "082f75aa-3cb3-4aac-903c-8187fdb62a93", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-550185625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51fc487-ee", "ovs_interfaceid": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.619 2 DEBUG nova.network.os_vif_util [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:95:6f:4c,bridge_name='br-int',has_traffic_filtering=True,id=c51fc487-eedd-421d-b8cc-d0a322b4a129,network=Network(082f75aa-3cb3-4aac-903c-8187fdb62a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc51fc487-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.619 2 DEBUG os_vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:6f:4c,bridge_name='br-int',has_traffic_filtering=True,id=c51fc487-eedd-421d-b8cc-d0a322b4a129,network=Network(082f75aa-3cb3-4aac-903c-8187fdb62a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc51fc487-ee') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.622 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc51fc487-ee, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.651 2 INFO os_vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:6f:4c,bridge_name='br-int',has_traffic_filtering=True,id=c51fc487-eedd-421d-b8cc-d0a322b4a129,network=Network(082f75aa-3cb3-4aac-903c-8187fdb62a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc51fc487-ee')#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.653 2 DEBUG nova.virt.libvirt.vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:20:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:20:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.653 2 DEBUG nova.network.os_vif_util [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.654 2 DEBUG nova.network.os_vif_util [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:bc:e6,bridge_name='br-int',has_traffic_filtering=True,id=ac2ac5dc-08f6-4faa-9427-e87be9c9d933,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapac2ac5dc-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.655 2 DEBUG os_vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:bc:e6,bridge_name='br-int',has_traffic_filtering=True,id=ac2ac5dc-08f6-4faa-9427-e87be9c9d933,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapac2ac5dc-08') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.657 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac2ac5dc-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:21:18 np0005465988 neutron-haproxy-ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a[271395]: [NOTICE]   (271399) : haproxy version is 2.8.14-c23fe91
Oct  2 08:21:18 np0005465988 neutron-haproxy-ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a[271395]: [NOTICE]   (271399) : path to executable is /usr/sbin/haproxy
Oct  2 08:21:18 np0005465988 neutron-haproxy-ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a[271395]: [ALERT]    (271399) : Current worker (271401) exited with code 143 (Terminated)
Oct  2 08:21:18 np0005465988 neutron-haproxy-ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a[271395]: [WARNING]  (271399) : All workers exited. Exiting... (0)
Oct  2 08:21:18 np0005465988 systemd[1]: libpod-eb6faace1896ad057384e035ba51ed72fbcd72af993abfeb122718bddd75acea.scope: Deactivated successfully.
Oct  2 08:21:18 np0005465988 podman[272449]: 2025-10-02 12:21:18.672539213 +0000 UTC m=+0.038949005 container died eb6faace1896ad057384e035ba51ed72fbcd72af993abfeb122718bddd75acea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.682 2 INFO os_vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:bc:e6,bridge_name='br-int',has_traffic_filtering=True,id=ac2ac5dc-08f6-4faa-9427-e87be9c9d933,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapac2ac5dc-08')#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.683 2 DEBUG nova.virt.libvirt.vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:20:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:20:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.684 2 DEBUG nova.network.os_vif_util [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.685 2 DEBUG nova.network.os_vif_util [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:5f:d8,bridge_name='br-int',has_traffic_filtering=True,id=0170ec24-0bde-4eb7-b349-a8f304853e0d,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0170ec24-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.685 2 DEBUG os_vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:5f:d8,bridge_name='br-int',has_traffic_filtering=True,id=0170ec24-0bde-4eb7-b349-a8f304853e0d,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0170ec24-0b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.687 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0170ec24-0b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:21:18 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eb6faace1896ad057384e035ba51ed72fbcd72af993abfeb122718bddd75acea-userdata-shm.mount: Deactivated successfully.
Oct  2 08:21:18 np0005465988 systemd[1]: var-lib-containers-storage-overlay-1236f4e0fff89e802e4671c99231c6cf678d1bc5bfc895e55631724b0b694efb-merged.mount: Deactivated successfully.
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 podman[272449]: 2025-10-02 12:21:18.710733675 +0000 UTC m=+0.077143467 container cleanup eb6faace1896ad057384e035ba51ed72fbcd72af993abfeb122718bddd75acea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.713 2 INFO os_vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:5f:d8,bridge_name='br-int',has_traffic_filtering=True,id=0170ec24-0bde-4eb7-b349-a8f304853e0d,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0170ec24-0b')#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.714 2 DEBUG nova.virt.libvirt.vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:20:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:20:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "address": "fa:16:3e:75:0f:50", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5debb4f-7f", "ovs_interfaceid": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.714 2 DEBUG nova.network.os_vif_util [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "address": "fa:16:3e:75:0f:50", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5debb4f-7f", "ovs_interfaceid": "c5debb4f-7f40-48f2-afb5-efa11af6cc4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.715 2 DEBUG nova.network.os_vif_util [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:75:0f:50,bridge_name='br-int',has_traffic_filtering=True,id=c5debb4f-7f40-48f2-afb5-efa11af6cc4c,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5debb4f-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.715 2 DEBUG os_vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:75:0f:50,bridge_name='br-int',has_traffic_filtering=True,id=c5debb4f-7f40-48f2-afb5-efa11af6cc4c,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5debb4f-7f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.717 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5debb4f-7f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:21:18 np0005465988 systemd[1]: libpod-conmon-eb6faace1896ad057384e035ba51ed72fbcd72af993abfeb122718bddd75acea.scope: Deactivated successfully.
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.735 2 INFO os_vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:75:0f:50,bridge_name='br-int',has_traffic_filtering=True,id=c5debb4f-7f40-48f2-afb5-efa11af6cc4c,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5debb4f-7f')#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.736 2 DEBUG nova.virt.libvirt.vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:20:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:20:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.736 2 DEBUG nova.network.os_vif_util [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.737 2 DEBUG nova.network.os_vif_util [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:ab:4e,bridge_name='br-int',has_traffic_filtering=True,id=fd3698a6-a68d-42ed-b217-f5bdc4163195,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd3698a6-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.737 2 DEBUG os_vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:ab:4e,bridge_name='br-int',has_traffic_filtering=True,id=fd3698a6-a68d-42ed-b217-f5bdc4163195,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd3698a6-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.739 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd3698a6-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.749 2 INFO os_vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:ab:4e,bridge_name='br-int',has_traffic_filtering=True,id=fd3698a6-a68d-42ed-b217-f5bdc4163195,network=Network(b73044fd-31f0-4d49-88bb-64109a59249a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd3698a6-a6')#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.750 2 DEBUG nova.virt.libvirt.vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:20:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus=
'usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:20:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.751 2 DEBUG nova.network.os_vif_util [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.751 2 DEBUG nova.network.os_vif_util [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:9b:57,bridge_name='br-int',has_traffic_filtering=True,id=f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b,network=Network(3f8de4fc-b69f-4cee-bee5-9f9b5275933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf89c24d0-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.752 2 DEBUG os_vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:9b:57,bridge_name='br-int',has_traffic_filtering=True,id=f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b,network=Network(3f8de4fc-b69f-4cee-bee5-9f9b5275933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf89c24d0-f3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.754 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf89c24d0-f3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.763 2 INFO os_vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:9b:57,bridge_name='br-int',has_traffic_filtering=True,id=f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b,network=Network(3f8de4fc-b69f-4cee-bee5-9f9b5275933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf89c24d0-f3')#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.764 2 DEBUG nova.virt.libvirt.vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1607195983',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1607195983',id=83,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFsu4SUzXswoym1vOrzTAH4sjnuMlhe3jmfRc2t4NC8EPniRhOpCy+VSyIJIX9FsrEqnsB88//S4cLZA6E8DwjYmqcqTbPcKHA8t0B/MbhE4klmrontjS6mQAmAwlo7Ulw==',key_name='tempest-keypair-374677317',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:20:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='84f71f6076f7425db7653ac203257df0',ramdisk_id='',reservation_id='r-n69ap67n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus=
'usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-87022127',owner_user_name='tempest-TaggedBootDevicesTest_v242-87022127-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:20:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a7f7518ce70488fb4f63af1a3bef131',uuid=969ba235-be4a-44e1-a6f2-7c5922b9661e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "address": "fa:16:3e:20:95:32", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0c51ef0-db", "ovs_interfaceid": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.764 2 DEBUG nova.network.os_vif_util [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converting VIF {"id": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "address": "fa:16:3e:20:95:32", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0c51ef0-db", "ovs_interfaceid": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.765 2 DEBUG nova.network.os_vif_util [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:95:32,bridge_name='br-int',has_traffic_filtering=True,id=f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6,network=Network(3f8de4fc-b69f-4cee-bee5-9f9b5275933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0c51ef0-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.765 2 DEBUG os_vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:95:32,bridge_name='br-int',has_traffic_filtering=True,id=f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6,network=Network(3f8de4fc-b69f-4cee-bee5-9f9b5275933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0c51ef0-db') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.770 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0c51ef0-db, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.775 2 INFO os_vif [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:95:32,bridge_name='br-int',has_traffic_filtering=True,id=f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6,network=Network(3f8de4fc-b69f-4cee-bee5-9f9b5275933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0c51ef0-db')#033[00m
Oct  2 08:21:18 np0005465988 podman[272491]: 2025-10-02 12:21:18.777959291 +0000 UTC m=+0.043426414 container remove eb6faace1896ad057384e035ba51ed72fbcd72af993abfeb122718bddd75acea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.789 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[185c0132-be66-4d77-947b-5cac944efc4b]: (4, ('Thu Oct  2 12:21:18 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a (eb6faace1896ad057384e035ba51ed72fbcd72af993abfeb122718bddd75acea)\neb6faace1896ad057384e035ba51ed72fbcd72af993abfeb122718bddd75acea\nThu Oct  2 12:21:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a (eb6faace1896ad057384e035ba51ed72fbcd72af993abfeb122718bddd75acea)\neb6faace1896ad057384e035ba51ed72fbcd72af993abfeb122718bddd75acea\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.791 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[af290ebd-a082-472e-8289-5965ed0eef74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.792 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb73044fd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:18 np0005465988 kernel: tapb73044fd-30: left promiscuous mode
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.820 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[86e852e5-ed8a-4e55-b203-4ab45f3f7a9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.851 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b5ddd749-249e-4452-a931-4b7e80634d63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.853 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb35347-57fb-466f-ae03-54f55f73d6cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.878 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c81f8a4b-5fde-4b77-b6a0-1476dd411ccb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569705, 'reachable_time': 39658, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272529, 'error': None, 'target': 'ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4082954703' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.880 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b73044fd-31f0-4d49-88bb-64109a59249a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.881 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[c31f634e-9399-453f-874a-976a5490233d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.882 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 0170ec24-0bde-4eb7-b349-a8f304853e0d in datapath b73044fd-31f0-4d49-88bb-64109a59249a unbound from our chassis#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.884 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b73044fd-31f0-4d49-88bb-64109a59249a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.885 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[09f38946-ec65-4f15-8689-dfd64c017d9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.885 142124 INFO neutron.agent.ovn.metadata.agent [-] Port c5debb4f-7f40-48f2-afb5-efa11af6cc4c in datapath b73044fd-31f0-4d49-88bb-64109a59249a unbound from our chassis#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.887 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b73044fd-31f0-4d49-88bb-64109a59249a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.888 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[01c1c4bc-6a7a-4a4d-9a86-910b4c3ae84d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.888 142124 INFO neutron.agent.ovn.metadata.agent [-] Port fd3698a6-a68d-42ed-b217-f5bdc4163195 in datapath b73044fd-31f0-4d49-88bb-64109a59249a unbound from our chassis#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.890 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b73044fd-31f0-4d49-88bb-64109a59249a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.890 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[35d26bc3-469d-43f1-94d8-8b815438ee1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.891 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b in datapath 3f8de4fc-b69f-4cee-bee5-9f9b5275933e unbound from our chassis#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.893 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3f8de4fc-b69f-4cee-bee5-9f9b5275933e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.893 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e1868133-6300-441a-88cf-622dd628af70]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:18.894 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e namespace which is not needed anymore#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.904 2 DEBUG oslo_concurrency.processutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.906 2 DEBUG nova.virt.libvirt.vif [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:21:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-97811332',display_name='tempest-DeleteServersTestJSON-server-97811332',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-97811332',id=89,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1c2c11ebecb14f3188f35ea473c4ca02',ramdisk_id='',reservation_id='r-c1yw38tm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1602490521',owner_user_name='tempest-DeleteServersTestJSON-16024905
21-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:21:13Z,user_data=None,user_id='a9f7faffac7240869a0196df1ddda7e5',uuid=9c7a04c1-a740-4d58-bb92-b34f14ccff42,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "address": "fa:16:3e:cd:6e:67", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c8b83a0-49", "ovs_interfaceid": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.906 2 DEBUG nova.network.os_vif_util [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Converting VIF {"id": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "address": "fa:16:3e:cd:6e:67", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c8b83a0-49", "ovs_interfaceid": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.906 2 DEBUG nova.network.os_vif_util [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:6e:67,bridge_name='br-int',has_traffic_filtering=True,id=8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1,network=Network(7754c79a-cca5-48c7-9169-831eaad23ccc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c8b83a0-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:18 np0005465988 nova_compute[236126]: 2025-10-02 12:21:18.908 2 DEBUG nova.objects.instance [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9c7a04c1-a740-4d58-bb92-b34f14ccff42 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.051 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  <uuid>9c7a04c1-a740-4d58-bb92-b34f14ccff42</uuid>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  <name>instance-00000059</name>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <nova:name>tempest-DeleteServersTestJSON-server-97811332</nova:name>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:21:17</nova:creationTime>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <nova:user uuid="a9f7faffac7240869a0196df1ddda7e5">tempest-DeleteServersTestJSON-1602490521-project-member</nova:user>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <nova:project uuid="1c2c11ebecb14f3188f35ea473c4ca02">tempest-DeleteServersTestJSON-1602490521</nova:project>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <nova:port uuid="8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <entry name="serial">9c7a04c1-a740-4d58-bb92-b34f14ccff42</entry>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <entry name="uuid">9c7a04c1-a740-4d58-bb92-b34f14ccff42</entry>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk.config">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:cd:6e:67"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <target dev="tap8c8b83a0-49"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/9c7a04c1-a740-4d58-bb92-b34f14ccff42/console.log" append="off"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:21:19 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:21:19 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:21:19 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:21:19 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.052 2 DEBUG nova.compute.manager [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Preparing to wait for external event network-vif-plugged-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.052 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.052 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.053 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.053 2 DEBUG nova.virt.libvirt.vif [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:21:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-97811332',display_name='tempest-DeleteServersTestJSON-server-97811332',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-97811332',id=89,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1c2c11ebecb14f3188f35ea473c4ca02',ramdisk_id='',reservation_id='r-c1yw38tm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1602490521',owner_user_name='tempest-DeleteServersTestJSO
N-1602490521-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:21:13Z,user_data=None,user_id='a9f7faffac7240869a0196df1ddda7e5',uuid=9c7a04c1-a740-4d58-bb92-b34f14ccff42,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "address": "fa:16:3e:cd:6e:67", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c8b83a0-49", "ovs_interfaceid": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.054 2 DEBUG nova.network.os_vif_util [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Converting VIF {"id": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "address": "fa:16:3e:cd:6e:67", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c8b83a0-49", "ovs_interfaceid": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:19 np0005465988 neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e[271494]: [NOTICE]   (271498) : haproxy version is 2.8.14-c23fe91
Oct  2 08:21:19 np0005465988 neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e[271494]: [NOTICE]   (271498) : path to executable is /usr/sbin/haproxy
Oct  2 08:21:19 np0005465988 neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e[271494]: [WARNING]  (271498) : Exiting Master process...
Oct  2 08:21:19 np0005465988 neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e[271494]: [WARNING]  (271498) : Exiting Master process...
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.055 2 DEBUG nova.network.os_vif_util [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:6e:67,bridge_name='br-int',has_traffic_filtering=True,id=8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1,network=Network(7754c79a-cca5-48c7-9169-831eaad23ccc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c8b83a0-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.055 2 DEBUG os_vif [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:6e:67,bridge_name='br-int',has_traffic_filtering=True,id=8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1,network=Network(7754c79a-cca5-48c7-9169-831eaad23ccc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c8b83a0-49') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.056 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.056 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.058 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c8b83a0-49, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.059 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8c8b83a0-49, col_values=(('external_ids', {'iface-id': '8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cd:6e:67', 'vm-uuid': '9c7a04c1-a740-4d58-bb92-b34f14ccff42'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:19 np0005465988 NetworkManager[45041]: <info>  [1759407679.0617] manager: (tap8c8b83a0-49): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/166)
Oct  2 08:21:19 np0005465988 neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e[271494]: [ALERT]    (271498) : Current worker (271501) exited with code 143 (Terminated)
Oct  2 08:21:19 np0005465988 neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e[271494]: [WARNING]  (271498) : All workers exited. Exiting... (0)
Oct  2 08:21:19 np0005465988 systemd[1]: libpod-c5e0984d49c7a3eb3316a2878ef27ab3584081d01907817567a359b167b9b574.scope: Deactivated successfully.
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:19 np0005465988 podman[272550]: 2025-10-02 12:21:19.068731375 +0000 UTC m=+0.062256413 container died c5e0984d49c7a3eb3316a2878ef27ab3584081d01907817567a359b167b9b574 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.068 2 INFO os_vif [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:6e:67,bridge_name='br-int',has_traffic_filtering=True,id=8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1,network=Network(7754c79a-cca5-48c7-9169-831eaad23ccc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c8b83a0-49')#033[00m
Oct  2 08:21:19 np0005465988 podman[272550]: 2025-10-02 12:21:19.110196562 +0000 UTC m=+0.103721560 container cleanup c5e0984d49c7a3eb3316a2878ef27ab3584081d01907817567a359b167b9b574 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:21:19 np0005465988 systemd[1]: libpod-conmon-c5e0984d49c7a3eb3316a2878ef27ab3584081d01907817567a359b167b9b574.scope: Deactivated successfully.
Oct  2 08:21:19 np0005465988 podman[272583]: 2025-10-02 12:21:19.190560281 +0000 UTC m=+0.054951271 container remove c5e0984d49c7a3eb3316a2878ef27ab3584081d01907817567a359b167b9b574 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:21:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:19.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:19.202 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b6b8898d-c4b9-42ee-85ee-07ad38d491a2]: (4, ('Thu Oct  2 12:21:18 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e (c5e0984d49c7a3eb3316a2878ef27ab3584081d01907817567a359b167b9b574)\nc5e0984d49c7a3eb3316a2878ef27ab3584081d01907817567a359b167b9b574\nThu Oct  2 12:21:19 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e (c5e0984d49c7a3eb3316a2878ef27ab3584081d01907817567a359b167b9b574)\nc5e0984d49c7a3eb3316a2878ef27ab3584081d01907817567a359b167b9b574\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:19.205 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[57b47cf6-a7af-4fde-8794-be35c28febf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:19.207 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f8de4fc-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:19 np0005465988 kernel: tap3f8de4fc-b0: left promiscuous mode
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.228 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.229 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.229 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] No VIF found with MAC fa:16:3e:cd:6e:67, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.230 2 INFO nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Using config drive#033[00m
Oct  2 08:21:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:19.246 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bc4b859c-695b-4e2d-84b9-c70305be0823]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:19.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.256 2 DEBUG nova.storage.rbd_utils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] rbd image 9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:19 np0005465988 systemd[1]: var-lib-containers-storage-overlay-61a7a26443364e2fbecce32a7fbd5a452c008631c6cf84c7ca029f3a7f0523c9-merged.mount: Deactivated successfully.
Oct  2 08:21:19 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c5e0984d49c7a3eb3316a2878ef27ab3584081d01907817567a359b167b9b574-userdata-shm.mount: Deactivated successfully.
Oct  2 08:21:19 np0005465988 systemd[1]: run-netns-ovnmeta\x2db73044fd\x2d31f0\x2d4d49\x2d88bb\x2d64109a59249a.mount: Deactivated successfully.
Oct  2 08:21:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:19.281 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8e1c46f4-4f87-40eb-b3ea-107faa879fd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:19.283 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2c66ceee-0258-43c4-9e68-ee8b88d96a55]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:19.301 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[71825922-1ade-439f-9853-bdef406743a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569835, 'reachable_time': 25302, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272621, 'error': None, 'target': 'ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:19 np0005465988 systemd[1]: run-netns-ovnmeta\x2d3f8de4fc\x2db69f\x2d4cee\x2dbee5\x2d9f9b5275933e.mount: Deactivated successfully.
Oct  2 08:21:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:19.303 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3f8de4fc-b69f-4cee-bee5-9f9b5275933e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:21:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:19.303 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[91ccba77-cec6-4704-a0de-5e43f300ee43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:19.307 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 in datapath 3f8de4fc-b69f-4cee-bee5-9f9b5275933e unbound from our chassis#033[00m
Oct  2 08:21:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:19.309 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3f8de4fc-b69f-4cee-bee5-9f9b5275933e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:21:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:19.310 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[890b08c5-ee41-469c-95b3-46e01bb26afc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.327 2 INFO nova.virt.libvirt.driver [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Deleting instance files /var/lib/nova/instances/969ba235-be4a-44e1-a6f2-7c5922b9661e_del#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.328 2 INFO nova.virt.libvirt.driver [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Deletion of /var/lib/nova/instances/969ba235-be4a-44e1-a6f2-7c5922b9661e_del complete#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.389 2 INFO nova.compute.manager [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Took 1.74 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.390 2 DEBUG oslo.service.loopingcall [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.390 2 DEBUG nova.compute.manager [-] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:21:19 np0005465988 nova_compute[236126]: 2025-10-02 12:21:19.391 2 DEBUG nova.network.neutron [-] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.250 2 INFO nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Creating config drive at /var/lib/nova/instances/9c7a04c1-a740-4d58-bb92-b34f14ccff42/disk.config#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.258 2 DEBUG oslo_concurrency.processutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9c7a04c1-a740-4d58-bb92-b34f14ccff42/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu2oiyq9a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.410 2 DEBUG oslo_concurrency.processutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9c7a04c1-a740-4d58-bb92-b34f14ccff42/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu2oiyq9a" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.452 2 DEBUG nova.storage.rbd_utils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] rbd image 9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.458 2 DEBUG oslo_concurrency.processutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/9c7a04c1-a740-4d58-bb92-b34f14ccff42/disk.config 9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.915 2 DEBUG nova.compute.manager [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-c51fc487-eedd-421d-b8cc-d0a322b4a129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.916 2 DEBUG oslo_concurrency.lockutils [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.917 2 DEBUG oslo_concurrency.lockutils [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.918 2 DEBUG oslo_concurrency.lockutils [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.918 2 DEBUG nova.compute.manager [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-plugged-c51fc487-eedd-421d-b8cc-d0a322b4a129 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.919 2 WARNING nova.compute.manager [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received unexpected event network-vif-plugged-c51fc487-eedd-421d-b8cc-d0a322b4a129 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.920 2 DEBUG nova.compute.manager [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-unplugged-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.921 2 DEBUG oslo_concurrency.lockutils [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.921 2 DEBUG oslo_concurrency.lockutils [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.922 2 DEBUG oslo_concurrency.lockutils [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.923 2 DEBUG nova.compute.manager [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-unplugged-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.923 2 DEBUG nova.compute.manager [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-unplugged-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.924 2 DEBUG nova.compute.manager [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.925 2 DEBUG oslo_concurrency.lockutils [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.925 2 DEBUG oslo_concurrency.lockutils [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.926 2 DEBUG oslo_concurrency.lockutils [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.927 2 DEBUG nova.compute.manager [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-plugged-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.927 2 WARNING nova.compute.manager [req-c0512739-5059-4c6e-b275-39a1a8dba97c req-4b14fb02-0260-4b0f-a92f-918aef71b264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received unexpected event network-vif-plugged-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.930 2 DEBUG nova.compute.manager [req-fad791e5-bdf1-40b1-b4af-eb824bc7b768 req-653578a1-a9fc-4c3d-8710-d176ade7290d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-ac2ac5dc-08f6-4faa-9427-e87be9c9d933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.931 2 DEBUG oslo_concurrency.lockutils [req-fad791e5-bdf1-40b1-b4af-eb824bc7b768 req-653578a1-a9fc-4c3d-8710-d176ade7290d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.931 2 DEBUG oslo_concurrency.lockutils [req-fad791e5-bdf1-40b1-b4af-eb824bc7b768 req-653578a1-a9fc-4c3d-8710-d176ade7290d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.932 2 DEBUG oslo_concurrency.lockutils [req-fad791e5-bdf1-40b1-b4af-eb824bc7b768 req-653578a1-a9fc-4c3d-8710-d176ade7290d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.932 2 DEBUG nova.compute.manager [req-fad791e5-bdf1-40b1-b4af-eb824bc7b768 req-653578a1-a9fc-4c3d-8710-d176ade7290d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-plugged-ac2ac5dc-08f6-4faa-9427-e87be9c9d933 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.933 2 WARNING nova.compute.manager [req-fad791e5-bdf1-40b1-b4af-eb824bc7b768 req-653578a1-a9fc-4c3d-8710-d176ade7290d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received unexpected event network-vif-plugged-ac2ac5dc-08f6-4faa-9427-e87be9c9d933 for instance with vm_state active and task_state deleting.
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.933 2 DEBUG nova.compute.manager [req-fad791e5-bdf1-40b1-b4af-eb824bc7b768 req-653578a1-a9fc-4c3d-8710-d176ade7290d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-unplugged-0170ec24-0bde-4eb7-b349-a8f304853e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.934 2 DEBUG oslo_concurrency.lockutils [req-fad791e5-bdf1-40b1-b4af-eb824bc7b768 req-653578a1-a9fc-4c3d-8710-d176ade7290d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.934 2 DEBUG oslo_concurrency.lockutils [req-fad791e5-bdf1-40b1-b4af-eb824bc7b768 req-653578a1-a9fc-4c3d-8710-d176ade7290d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.935 2 DEBUG oslo_concurrency.lockutils [req-fad791e5-bdf1-40b1-b4af-eb824bc7b768 req-653578a1-a9fc-4c3d-8710-d176ade7290d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.935 2 DEBUG nova.compute.manager [req-fad791e5-bdf1-40b1-b4af-eb824bc7b768 req-653578a1-a9fc-4c3d-8710-d176ade7290d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-unplugged-0170ec24-0bde-4eb7-b349-a8f304853e0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.936 2 DEBUG nova.compute.manager [req-fad791e5-bdf1-40b1-b4af-eb824bc7b768 req-653578a1-a9fc-4c3d-8710-d176ade7290d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-unplugged-0170ec24-0bde-4eb7-b349-a8f304853e0d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.936 2 DEBUG oslo_concurrency.processutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/9c7a04c1-a740-4d58-bb92-b34f14ccff42/disk.config 9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:20 np0005465988 nova_compute[236126]: 2025-10-02 12:21:20.938 2 INFO nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Deleting local config drive /var/lib/nova/instances/9c7a04c1-a740-4d58-bb92-b34f14ccff42/disk.config because it was imported into RBD.
Oct  2 08:21:21 np0005465988 kernel: tap8c8b83a0-49: entered promiscuous mode
Oct  2 08:21:21 np0005465988 NetworkManager[45041]: <info>  [1759407681.0163] manager: (tap8c8b83a0-49): new Tun device (/org/freedesktop/NetworkManager/Devices/167)
Oct  2 08:21:21 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:21Z|00338|binding|INFO|Claiming lport 8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 for this chassis.
Oct  2 08:21:21 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:21Z|00339|binding|INFO|8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1: Claiming fa:16:3e:cd:6e:67 10.100.0.6
Oct  2 08:21:21 np0005465988 nova_compute[236126]: 2025-10-02 12:21:21.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:21 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:21Z|00340|binding|INFO|Setting lport 8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 ovn-installed in OVS
Oct  2 08:21:21 np0005465988 nova_compute[236126]: 2025-10-02 12:21:21.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:21 np0005465988 nova_compute[236126]: 2025-10-02 12:21:21.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:21 np0005465988 systemd-machined[192594]: New machine qemu-33-instance-00000059.
Oct  2 08:21:21 np0005465988 systemd-udevd[272726]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:21:21 np0005465988 nova_compute[236126]: 2025-10-02 12:21:21.074 2 DEBUG nova.compute.manager [req-4d740afd-6ad1-4f50-8a7f-ec0957e67abe req-dbfd8496-3284-48a8-b9bb-7f56fb97094e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-fd3698a6-a68d-42ed-b217-f5bdc4163195 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:21:21 np0005465988 nova_compute[236126]: 2025-10-02 12:21:21.075 2 DEBUG oslo_concurrency.lockutils [req-4d740afd-6ad1-4f50-8a7f-ec0957e67abe req-dbfd8496-3284-48a8-b9bb-7f56fb97094e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:21:21 np0005465988 nova_compute[236126]: 2025-10-02 12:21:21.075 2 DEBUG oslo_concurrency.lockutils [req-4d740afd-6ad1-4f50-8a7f-ec0957e67abe req-dbfd8496-3284-48a8-b9bb-7f56fb97094e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:21:21 np0005465988 nova_compute[236126]: 2025-10-02 12:21:21.075 2 DEBUG oslo_concurrency.lockutils [req-4d740afd-6ad1-4f50-8a7f-ec0957e67abe req-dbfd8496-3284-48a8-b9bb-7f56fb97094e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:21 np0005465988 nova_compute[236126]: 2025-10-02 12:21:21.076 2 DEBUG nova.compute.manager [req-4d740afd-6ad1-4f50-8a7f-ec0957e67abe req-dbfd8496-3284-48a8-b9bb-7f56fb97094e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-plugged-fd3698a6-a68d-42ed-b217-f5bdc4163195 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:21:21 np0005465988 nova_compute[236126]: 2025-10-02 12:21:21.076 2 WARNING nova.compute.manager [req-4d740afd-6ad1-4f50-8a7f-ec0957e67abe req-dbfd8496-3284-48a8-b9bb-7f56fb97094e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received unexpected event network-vif-plugged-fd3698a6-a68d-42ed-b217-f5bdc4163195 for instance with vm_state active and task_state deleting.
Oct  2 08:21:21 np0005465988 systemd[1]: Started Virtual Machine qemu-33-instance-00000059.
Oct  2 08:21:21 np0005465988 NetworkManager[45041]: <info>  [1759407681.0860] device (tap8c8b83a0-49): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:21:21 np0005465988 NetworkManager[45041]: <info>  [1759407681.0872] device (tap8c8b83a0-49): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:21:21 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:21Z|00341|binding|INFO|Setting lport 8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 up in Southbound
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.109 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:6e:67 10.100.0.6'], port_security=['fa:16:3e:cd:6e:67 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9c7a04c1-a740-4d58-bb92-b34f14ccff42', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7754c79a-cca5-48c7-9169-831eaad23ccc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c2c11ebecb14f3188f35ea473c4ca02', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c0d053f-a096-4f8c-8162-5ef19e29b5d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45b5774e-2213-45dd-ab74-f2a3868d167c, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.112 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 in datapath 7754c79a-cca5-48c7-9169-831eaad23ccc bound to our chassis
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.115 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7754c79a-cca5-48c7-9169-831eaad23ccc
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.131 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8289b912-16c6-4be4-a38a-ed5f834acb44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.132 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7754c79a-c1 in ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.134 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7754c79a-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.135 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e5124805-bf83-48ee-a4a2-7df28464b898]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.136 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef81841-d914-4e82-b51a-47bd429896ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.153 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[fb15e771-2f44-40a8-83bf-e4c1917bc70e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.176 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e23c37-b16c-4d30-a988-48ddcfa87e8f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:21.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.217 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[524e26e3-b213-4ae3-aa5e-cfe97204aec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.225 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[757fc099-ce3a-4546-aa9d-d13c78cd929d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 NetworkManager[45041]: <info>  [1759407681.2278] manager: (tap7754c79a-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/168)
Oct  2 08:21:21 np0005465988 systemd-udevd[272728]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:21:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:21.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.270 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[00819404-0359-4d3b-8e75-7824c80aab20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.275 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ae92413e-58e1-48c6-9b01-0bc8c8a3522a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 NetworkManager[45041]: <info>  [1759407681.3103] device (tap7754c79a-c0): carrier: link connected
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.314 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[cc26412f-603e-47fd-9686-4cb0928b59d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 nova_compute[236126]: 2025-10-02 12:21:21.317 2 DEBUG nova.network.neutron [req-2ffee095-bdc9-4943-936b-ff686971f2e3 req-91825572-95b1-43c4-bdc3-9cd06d01a998 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Updated VIF entry in instance network info cache for port 8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:21:21 np0005465988 nova_compute[236126]: 2025-10-02 12:21:21.317 2 DEBUG nova.network.neutron [req-2ffee095-bdc9-4943-936b-ff686971f2e3 req-91825572-95b1-43c4-bdc3-9cd06d01a998 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Updating instance_info_cache with network_info: [{"id": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "address": "fa:16:3e:cd:6e:67", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c8b83a0-49", "ovs_interfaceid": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.343 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fb6b0583-9c93-41a3-90e8-18061448a260]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7754c79a-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:b0:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574163, 'reachable_time': 36578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272761, 'error': None, 'target': 'ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.371 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[24de5f7a-0fc1-4c22-a51b-d2a2dcd9dc2c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:b018'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574163, 'tstamp': 574163}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272762, 'error': None, 'target': 'ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.401 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bbfac72f-a2d3-437d-ae70-4ea3d7857464]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7754c79a-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:b0:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574163, 'reachable_time': 36578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 272763, 'error': None, 'target': 'ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.449 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[10da629f-0c52-40f2-81c3-a51c528a877c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 nova_compute[236126]: 2025-10-02 12:21:21.504 2 DEBUG oslo_concurrency.lockutils [req-2ffee095-bdc9-4943-936b-ff686971f2e3 req-91825572-95b1-43c4-bdc3-9cd06d01a998 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-9c7a04c1-a740-4d58-bb92-b34f14ccff42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.519 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[23ab2b5f-771e-40db-8473-d88dcbb12fba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.521 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7754c79a-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.521 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.521 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7754c79a-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:21:21 np0005465988 NetworkManager[45041]: <info>  [1759407681.5240] manager: (tap7754c79a-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Oct  2 08:21:21 np0005465988 kernel: tap7754c79a-c0: entered promiscuous mode
Oct  2 08:21:21 np0005465988 nova_compute[236126]: 2025-10-02 12:21:21.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.528 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7754c79a-c0, col_values=(('external_ids', {'iface-id': 'b1ce5636-6283-470c-ab5e-aac212c1256d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:21:21 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:21Z|00342|binding|INFO|Releasing lport b1ce5636-6283-470c-ab5e-aac212c1256d from this chassis (sb_readonly=0)
Oct  2 08:21:21 np0005465988 nova_compute[236126]: 2025-10-02 12:21:21.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.548 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7754c79a-cca5-48c7-9169-831eaad23ccc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7754c79a-cca5-48c7-9169-831eaad23ccc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.550 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a67e8042-982e-4308-9f54-d3fb3863efdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.550 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-7754c79a-cca5-48c7-9169-831eaad23ccc
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/7754c79a-cca5-48c7-9169-831eaad23ccc.pid.haproxy
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 7754c79a-cca5-48c7-9169-831eaad23ccc
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  2 08:21:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:21.552 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc', 'env', 'PROCESS_TAG=haproxy-7754c79a-cca5-48c7-9169-831eaad23ccc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7754c79a-cca5-48c7-9169-831eaad23ccc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  2 08:21:21 np0005465988 podman[272831]: 2025-10-02 12:21:21.97031321 +0000 UTC m=+0.065341123 container create eb3d4fc78edcbbcb23630545688b73548aca13febdc53caa5e578a50ee4d3cce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:21:22 np0005465988 systemd[1]: Started libpod-conmon-eb3d4fc78edcbbcb23630545688b73548aca13febdc53caa5e578a50ee4d3cce.scope.
Oct  2 08:21:22 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:21:22 np0005465988 podman[272831]: 2025-10-02 12:21:21.945507138 +0000 UTC m=+0.040535101 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:21:22 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/263d4dd3b0aa4cf61fc102162b122949cd0f1ed185b51ca2aaf041deb0e45eb0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:21:22 np0005465988 podman[272831]: 2025-10-02 12:21:22.057223809 +0000 UTC m=+0.152251752 container init eb3d4fc78edcbbcb23630545688b73548aca13febdc53caa5e578a50ee4d3cce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:21:22 np0005465988 podman[272831]: 2025-10-02 12:21:22.062829932 +0000 UTC m=+0.157857835 container start eb3d4fc78edcbbcb23630545688b73548aca13febdc53caa5e578a50ee4d3cce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:21:22 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[272852]: [NOTICE]   (272856) : New worker (272858) forked
Oct  2 08:21:22 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[272852]: [NOTICE]   (272856) : Loading success.
Oct  2 08:21:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:21:22 np0005465988 nova_compute[236126]: 2025-10-02 12:21:22.466 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407682.4647725, 9c7a04c1-a740-4d58-bb92-b34f14ccff42 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:21:22 np0005465988 nova_compute[236126]: 2025-10-02 12:21:22.466 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] VM Started (Lifecycle Event)
Oct  2 08:21:22 np0005465988 nova_compute[236126]: 2025-10-02 12:21:22.584 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:21:22 np0005465988 nova_compute[236126]: 2025-10-02 12:21:22.590 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407682.4650254, 9c7a04c1-a740-4d58-bb92-b34f14ccff42 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:21:22 np0005465988 nova_compute[236126]: 2025-10-02 12:21:22.591 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] VM Paused (Lifecycle Event)
Oct  2 08:21:22 np0005465988 nova_compute[236126]: 2025-10-02 12:21:22.718 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:21:22 np0005465988 nova_compute[236126]: 2025-10-02 12:21:22.722 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:21:22 np0005465988 nova_compute[236126]: 2025-10-02 12:21:22.915 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:21:22 np0005465988 nova_compute[236126]: 2025-10-02 12:21:22.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.098 2 DEBUG nova.compute.manager [req-30a8e77e-74bd-48fa-b79c-382285084ed3 req-fed3a0c5-b87f-4894-a8f0-d80410c385f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Received event network-vif-plugged-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.099 2 DEBUG oslo_concurrency.lockutils [req-30a8e77e-74bd-48fa-b79c-382285084ed3 req-fed3a0c5-b87f-4894-a8f0-d80410c385f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.099 2 DEBUG oslo_concurrency.lockutils [req-30a8e77e-74bd-48fa-b79c-382285084ed3 req-fed3a0c5-b87f-4894-a8f0-d80410c385f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.100 2 DEBUG oslo_concurrency.lockutils [req-30a8e77e-74bd-48fa-b79c-382285084ed3 req-fed3a0c5-b87f-4894-a8f0-d80410c385f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.100 2 DEBUG nova.compute.manager [req-30a8e77e-74bd-48fa-b79c-382285084ed3 req-fed3a0c5-b87f-4894-a8f0-d80410c385f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Processing event network-vif-plugged-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.101 2 DEBUG nova.compute.manager [req-30a8e77e-74bd-48fa-b79c-382285084ed3 req-fed3a0c5-b87f-4894-a8f0-d80410c385f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Received event network-vif-plugged-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.101 2 DEBUG oslo_concurrency.lockutils [req-30a8e77e-74bd-48fa-b79c-382285084ed3 req-fed3a0c5-b87f-4894-a8f0-d80410c385f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.102 2 DEBUG oslo_concurrency.lockutils [req-30a8e77e-74bd-48fa-b79c-382285084ed3 req-fed3a0c5-b87f-4894-a8f0-d80410c385f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.102 2 DEBUG oslo_concurrency.lockutils [req-30a8e77e-74bd-48fa-b79c-382285084ed3 req-fed3a0c5-b87f-4894-a8f0-d80410c385f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.103 2 DEBUG nova.compute.manager [req-30a8e77e-74bd-48fa-b79c-382285084ed3 req-fed3a0c5-b87f-4894-a8f0-d80410c385f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] No waiting events found dispatching network-vif-plugged-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.103 2 WARNING nova.compute.manager [req-30a8e77e-74bd-48fa-b79c-382285084ed3 req-fed3a0c5-b87f-4894-a8f0-d80410c385f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Received unexpected event network-vif-plugged-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 for instance with vm_state building and task_state spawning.
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.106 2 DEBUG nova.compute.manager [req-062ba2f4-7c0c-452b-9369-04253b40ff99 req-dc8bbbe3-960c-4eac-89c5-87313aa78f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-0170ec24-0bde-4eb7-b349-a8f304853e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.106 2 DEBUG oslo_concurrency.lockutils [req-062ba2f4-7c0c-452b-9369-04253b40ff99 req-dc8bbbe3-960c-4eac-89c5-87313aa78f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.107 2 DEBUG oslo_concurrency.lockutils [req-062ba2f4-7c0c-452b-9369-04253b40ff99 req-dc8bbbe3-960c-4eac-89c5-87313aa78f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.107 2 DEBUG oslo_concurrency.lockutils [req-062ba2f4-7c0c-452b-9369-04253b40ff99 req-dc8bbbe3-960c-4eac-89c5-87313aa78f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.108 2 DEBUG nova.compute.manager [req-062ba2f4-7c0c-452b-9369-04253b40ff99 req-dc8bbbe3-960c-4eac-89c5-87313aa78f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-plugged-0170ec24-0bde-4eb7-b349-a8f304853e0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.108 2 WARNING nova.compute.manager [req-062ba2f4-7c0c-452b-9369-04253b40ff99 req-dc8bbbe3-960c-4eac-89c5-87313aa78f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received unexpected event network-vif-plugged-0170ec24-0bde-4eb7-b349-a8f304853e0d for instance with vm_state active and task_state deleting.
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.109 2 DEBUG nova.compute.manager [req-062ba2f4-7c0c-452b-9369-04253b40ff99 req-dc8bbbe3-960c-4eac-89c5-87313aa78f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-unplugged-c5debb4f-7f40-48f2-afb5-efa11af6cc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.109 2 DEBUG oslo_concurrency.lockutils [req-062ba2f4-7c0c-452b-9369-04253b40ff99 req-dc8bbbe3-960c-4eac-89c5-87313aa78f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.110 2 DEBUG oslo_concurrency.lockutils [req-062ba2f4-7c0c-452b-9369-04253b40ff99 req-dc8bbbe3-960c-4eac-89c5-87313aa78f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.111 2 DEBUG oslo_concurrency.lockutils [req-062ba2f4-7c0c-452b-9369-04253b40ff99 req-dc8bbbe3-960c-4eac-89c5-87313aa78f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.111 2 DEBUG nova.compute.manager [req-062ba2f4-7c0c-452b-9369-04253b40ff99 req-dc8bbbe3-960c-4eac-89c5-87313aa78f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-unplugged-c5debb4f-7f40-48f2-afb5-efa11af6cc4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.112 2 DEBUG nova.compute.manager [req-062ba2f4-7c0c-452b-9369-04253b40ff99 req-dc8bbbe3-960c-4eac-89c5-87313aa78f97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-unplugged-c5debb4f-7f40-48f2-afb5-efa11af6cc4c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.113 2 DEBUG nova.compute.manager [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.120 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.121 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407683.1200922, 9c7a04c1-a740-4d58-bb92-b34f14ccff42 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.122 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] VM Resumed (Lifecycle Event)
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.130 2 INFO nova.virt.libvirt.driver [-] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Instance spawned successfully.
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.131 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:21:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:23.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:23.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.313 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.321 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.321 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.322 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.323 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.324 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.325 2 DEBUG nova.virt.libvirt.driver [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.331 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.354 2 DEBUG nova.compute.manager [req-0cfe09d2-e82a-429e-9aab-21085f7502dc req-3da72860-697b-4699-97dd-95d14f422be5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-unplugged-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.355 2 DEBUG oslo_concurrency.lockutils [req-0cfe09d2-e82a-429e-9aab-21085f7502dc req-3da72860-697b-4699-97dd-95d14f422be5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.355 2 DEBUG oslo_concurrency.lockutils [req-0cfe09d2-e82a-429e-9aab-21085f7502dc req-3da72860-697b-4699-97dd-95d14f422be5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.355 2 DEBUG oslo_concurrency.lockutils [req-0cfe09d2-e82a-429e-9aab-21085f7502dc req-3da72860-697b-4699-97dd-95d14f422be5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.356 2 DEBUG nova.compute.manager [req-0cfe09d2-e82a-429e-9aab-21085f7502dc req-3da72860-697b-4699-97dd-95d14f422be5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-unplugged-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.356 2 DEBUG nova.compute.manager [req-0cfe09d2-e82a-429e-9aab-21085f7502dc req-3da72860-697b-4699-97dd-95d14f422be5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-unplugged-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.357 2 DEBUG nova.compute.manager [req-0cfe09d2-e82a-429e-9aab-21085f7502dc req-3da72860-697b-4699-97dd-95d14f422be5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.357 2 DEBUG oslo_concurrency.lockutils [req-0cfe09d2-e82a-429e-9aab-21085f7502dc req-3da72860-697b-4699-97dd-95d14f422be5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.358 2 DEBUG oslo_concurrency.lockutils [req-0cfe09d2-e82a-429e-9aab-21085f7502dc req-3da72860-697b-4699-97dd-95d14f422be5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.358 2 DEBUG oslo_concurrency.lockutils [req-0cfe09d2-e82a-429e-9aab-21085f7502dc req-3da72860-697b-4699-97dd-95d14f422be5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.359 2 DEBUG nova.compute.manager [req-0cfe09d2-e82a-429e-9aab-21085f7502dc req-3da72860-697b-4699-97dd-95d14f422be5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-plugged-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.359 2 WARNING nova.compute.manager [req-0cfe09d2-e82a-429e-9aab-21085f7502dc req-3da72860-697b-4699-97dd-95d14f422be5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received unexpected event network-vif-plugged-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b for instance with vm_state active and task_state deleting.
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.465 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.535 2 INFO nova.compute.manager [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Took 9.52 seconds to spawn the instance on the hypervisor.
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.536 2 DEBUG nova.compute.manager [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.783 2 INFO nova.compute.manager [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Took 10.70 seconds to build instance.#033[00m
Oct  2 08:21:23 np0005465988 nova_compute[236126]: 2025-10-02 12:21:23.868 2 DEBUG oslo_concurrency.lockutils [None req-1ac9220e-6ca7-456f-b7b1-04b72bad5fb1 a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:24 np0005465988 nova_compute[236126]: 2025-10-02 12:21:24.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:25.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:25 np0005465988 nova_compute[236126]: 2025-10-02 12:21:25.215 2 DEBUG nova.compute.manager [req-965466db-c7dd-46d6-b661-cd9e62267f5e req-e6d06b77-0c0b-470b-aa33-ae509a32b116 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-plugged-c5debb4f-7f40-48f2-afb5-efa11af6cc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:25 np0005465988 nova_compute[236126]: 2025-10-02 12:21:25.215 2 DEBUG oslo_concurrency.lockutils [req-965466db-c7dd-46d6-b661-cd9e62267f5e req-e6d06b77-0c0b-470b-aa33-ae509a32b116 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:25 np0005465988 nova_compute[236126]: 2025-10-02 12:21:25.216 2 DEBUG oslo_concurrency.lockutils [req-965466db-c7dd-46d6-b661-cd9e62267f5e req-e6d06b77-0c0b-470b-aa33-ae509a32b116 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:25 np0005465988 nova_compute[236126]: 2025-10-02 12:21:25.216 2 DEBUG oslo_concurrency.lockutils [req-965466db-c7dd-46d6-b661-cd9e62267f5e req-e6d06b77-0c0b-470b-aa33-ae509a32b116 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:25 np0005465988 nova_compute[236126]: 2025-10-02 12:21:25.217 2 DEBUG nova.compute.manager [req-965466db-c7dd-46d6-b661-cd9e62267f5e req-e6d06b77-0c0b-470b-aa33-ae509a32b116 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] No waiting events found dispatching network-vif-plugged-c5debb4f-7f40-48f2-afb5-efa11af6cc4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:21:25 np0005465988 nova_compute[236126]: 2025-10-02 12:21:25.217 2 WARNING nova.compute.manager [req-965466db-c7dd-46d6-b661-cd9e62267f5e req-e6d06b77-0c0b-470b-aa33-ae509a32b116 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received unexpected event network-vif-plugged-c5debb4f-7f40-48f2-afb5-efa11af6cc4c for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:21:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:25.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:25 np0005465988 nova_compute[236126]: 2025-10-02 12:21:25.481 2 DEBUG nova.compute.manager [req-848041e0-07d0-4412-adc4-fd5a25ce00c2 req-c0df714a-f388-4da6-9cc4-0d3cc927090e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-deleted-c5debb4f-7f40-48f2-afb5-efa11af6cc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:25 np0005465988 nova_compute[236126]: 2025-10-02 12:21:25.482 2 INFO nova.compute.manager [req-848041e0-07d0-4412-adc4-fd5a25ce00c2 req-c0df714a-f388-4da6-9cc4-0d3cc927090e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Neutron deleted interface c5debb4f-7f40-48f2-afb5-efa11af6cc4c; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:21:25 np0005465988 nova_compute[236126]: 2025-10-02 12:21:25.482 2 DEBUG nova.network.neutron [req-848041e0-07d0-4412-adc4-fd5a25ce00c2 req-c0df714a-f388-4da6-9cc4-0d3cc927090e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating instance_info_cache with network_info: [{"id": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "address": "fa:16:3e:95:6f:4c", "network": {"id": "082f75aa-3cb3-4aac-903c-8187fdb62a93", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-550185625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51fc487-ee", "ovs_interfaceid": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "address": "fa:16:3e:20:95:32", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0c51ef0-db", "ovs_interfaceid": "f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:25 np0005465988 nova_compute[236126]: 2025-10-02 12:21:25.504 2 DEBUG nova.compute.manager [req-848041e0-07d0-4412-adc4-fd5a25ce00c2 req-c0df714a-f388-4da6-9cc4-0d3cc927090e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Detach interface failed, port_id=c5debb4f-7f40-48f2-afb5-efa11af6cc4c, reason: Instance 969ba235-be4a-44e1-a6f2-7c5922b9661e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:21:27 np0005465988 nova_compute[236126]: 2025-10-02 12:21:27.080 2 DEBUG oslo_concurrency.lockutils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:27 np0005465988 nova_compute[236126]: 2025-10-02 12:21:27.081 2 DEBUG oslo_concurrency.lockutils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:27 np0005465988 nova_compute[236126]: 2025-10-02 12:21:27.082 2 INFO nova.compute.manager [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Shelving#033[00m
Oct  2 08:21:27 np0005465988 nova_compute[236126]: 2025-10-02 12:21:27.166 2 DEBUG nova.virt.libvirt.driver [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:21:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:27.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:21:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:27.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:21:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:27.352 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:21:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:27.358 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:27.360 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:27 np0005465988 nova_compute[236126]: 2025-10-02 12:21:27.607 2 DEBUG nova.compute.manager [req-0f61edaa-bbc0-4689-983d-5e0a4e2beed5 req-2fe4b22c-d63f-43ad-8bff-842406f54f93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-deleted-f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:27 np0005465988 nova_compute[236126]: 2025-10-02 12:21:27.608 2 INFO nova.compute.manager [req-0f61edaa-bbc0-4689-983d-5e0a4e2beed5 req-2fe4b22c-d63f-43ad-8bff-842406f54f93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Neutron deleted interface f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:21:27 np0005465988 nova_compute[236126]: 2025-10-02 12:21:27.608 2 DEBUG nova.network.neutron [req-0f61edaa-bbc0-4689-983d-5e0a4e2beed5 req-2fe4b22c-d63f-43ad-8bff-842406f54f93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating instance_info_cache with network_info: [{"id": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "address": "fa:16:3e:95:6f:4c", "network": {"id": "082f75aa-3cb3-4aac-903c-8187fdb62a93", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-550185625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51fc487-ee", "ovs_interfaceid": "c51fc487-eedd-421d-b8cc-d0a322b4a129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:27 np0005465988 nova_compute[236126]: 2025-10-02 12:21:27.900 2 DEBUG nova.compute.manager [req-0f61edaa-bbc0-4689-983d-5e0a4e2beed5 req-2fe4b22c-d63f-43ad-8bff-842406f54f93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Detach interface failed, port_id=f0c51ef0-db2b-4fd3-9e5e-107a6e3092f6, reason: Instance 969ba235-be4a-44e1-a6f2-7c5922b9661e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:21:27 np0005465988 nova_compute[236126]: 2025-10-02 12:21:27.901 2 DEBUG nova.compute.manager [req-0f61edaa-bbc0-4689-983d-5e0a4e2beed5 req-2fe4b22c-d63f-43ad-8bff-842406f54f93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-deleted-c51fc487-eedd-421d-b8cc-d0a322b4a129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:27 np0005465988 nova_compute[236126]: 2025-10-02 12:21:27.901 2 INFO nova.compute.manager [req-0f61edaa-bbc0-4689-983d-5e0a4e2beed5 req-2fe4b22c-d63f-43ad-8bff-842406f54f93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Neutron deleted interface c51fc487-eedd-421d-b8cc-d0a322b4a129; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:21:27 np0005465988 nova_compute[236126]: 2025-10-02 12:21:27.902 2 DEBUG nova.network.neutron [req-0f61edaa-bbc0-4689-983d-5e0a4e2beed5 req-2fe4b22c-d63f-43ad-8bff-842406f54f93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating instance_info_cache with network_info: [{"id": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "address": "fa:16:3e:0a:bc:e6", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.88", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac2ac5dc-08", "ovs_interfaceid": "ac2ac5dc-08f6-4faa-9427-e87be9c9d933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "address": "fa:16:3e:36:5f:d8", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0170ec24-0b", "ovs_interfaceid": "0170ec24-0bde-4eb7-b349-a8f304853e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "address": "fa:16:3e:f7:ab:4e", "network": {"id": "b73044fd-31f0-4d49-88bb-64109a59249a", "bridge": "br-int", "label": "tempest-device-tagging-net1-1978677398", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3698a6-a6", "ovs_interfaceid": "fd3698a6-a68d-42ed-b217-f5bdc4163195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "address": "fa:16:3e:49:9b:57", "network": {"id": "3f8de4fc-b69f-4cee-bee5-9f9b5275933e", "bridge": "br-int", "label": "tempest-device-tagging-net2-543297517", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "84f71f6076f7425db7653ac203257df0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf89c24d0-f3", "ovs_interfaceid": "f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:27 np0005465988 nova_compute[236126]: 2025-10-02 12:21:27.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:28 np0005465988 nova_compute[236126]: 2025-10-02 12:21:28.189 2 DEBUG nova.compute.manager [req-0f61edaa-bbc0-4689-983d-5e0a4e2beed5 req-2fe4b22c-d63f-43ad-8bff-842406f54f93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Detach interface failed, port_id=c51fc487-eedd-421d-b8cc-d0a322b4a129, reason: Instance 969ba235-be4a-44e1-a6f2-7c5922b9661e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:21:28 np0005465988 nova_compute[236126]: 2025-10-02 12:21:28.455 2 DEBUG nova.network.neutron [-] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:28 np0005465988 nova_compute[236126]: 2025-10-02 12:21:28.472 2 INFO nova.compute.manager [-] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Took 9.08 seconds to deallocate network for instance.#033[00m
Oct  2 08:21:29 np0005465988 nova_compute[236126]: 2025-10-02 12:21:29.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:29.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:29.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:29 np0005465988 nova_compute[236126]: 2025-10-02 12:21:29.453 2 INFO nova.compute.manager [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Took 0.98 seconds to detach 3 volumes for instance.#033[00m
Oct  2 08:21:29 np0005465988 nova_compute[236126]: 2025-10-02 12:21:29.535 2 DEBUG oslo_concurrency.lockutils [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:29 np0005465988 nova_compute[236126]: 2025-10-02 12:21:29.537 2 DEBUG oslo_concurrency.lockutils [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:29 np0005465988 nova_compute[236126]: 2025-10-02 12:21:29.621 2 DEBUG oslo_concurrency.processutils [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:29 np0005465988 nova_compute[236126]: 2025-10-02 12:21:29.685 2 DEBUG nova.compute.manager [req-086a3f4b-1713-49d4-9767-de22efea742a req-c2019cb8-4b59-47f1-b77e-fb3545a1e909 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-deleted-fd3698a6-a68d-42ed-b217-f5bdc4163195 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:29 np0005465988 nova_compute[236126]: 2025-10-02 12:21:29.686 2 DEBUG nova.compute.manager [req-086a3f4b-1713-49d4-9767-de22efea742a req-c2019cb8-4b59-47f1-b77e-fb3545a1e909 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Received event network-vif-deleted-f89c24d0-f3c5-4d5b-8043-6f6f6f8f7c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:21:30 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/296445127' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:21:30 np0005465988 nova_compute[236126]: 2025-10-02 12:21:30.169 2 DEBUG oslo_concurrency.processutils [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.548s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:30 np0005465988 nova_compute[236126]: 2025-10-02 12:21:30.176 2 DEBUG nova.compute.provider_tree [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:21:30 np0005465988 nova_compute[236126]: 2025-10-02 12:21:30.198 2 DEBUG nova.scheduler.client.report [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:21:30 np0005465988 nova_compute[236126]: 2025-10-02 12:21:30.239 2 DEBUG oslo_concurrency.lockutils [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:30 np0005465988 nova_compute[236126]: 2025-10-02 12:21:30.268 2 INFO nova.scheduler.client.report [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Deleted allocations for instance 969ba235-be4a-44e1-a6f2-7c5922b9661e#033[00m
Oct  2 08:21:30 np0005465988 nova_compute[236126]: 2025-10-02 12:21:30.351 2 DEBUG oslo_concurrency.lockutils [None req-e49b3e8c-349b-418a-a0fb-63f95f11573a 2a7f7518ce70488fb4f63af1a3bef131 84f71f6076f7425db7653ac203257df0 - - default default] Lock "969ba235-be4a-44e1-a6f2-7c5922b9661e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 12.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:31.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:31.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:21:32 np0005465988 nova_compute[236126]: 2025-10-02 12:21:32.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:33.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:33.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:33 np0005465988 nova_compute[236126]: 2025-10-02 12:21:33.591 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407678.5858388, 969ba235-be4a-44e1-a6f2-7c5922b9661e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:33 np0005465988 nova_compute[236126]: 2025-10-02 12:21:33.593 2 INFO nova.compute.manager [-] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:21:33 np0005465988 nova_compute[236126]: 2025-10-02 12:21:33.619 2 DEBUG nova.compute.manager [None req-7ed4d870-246b-4501-af94-3092600e6473 - - - - - -] [instance: 969ba235-be4a-44e1-a6f2-7c5922b9661e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:34 np0005465988 nova_compute[236126]: 2025-10-02 12:21:34.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:34 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #49. Immutable memtables: 6.
Oct  2 08:21:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:35.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:35.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:35Z|00343|binding|INFO|Releasing lport b1ce5636-6283-470c-ab5e-aac212c1256d from this chassis (sb_readonly=0)
Oct  2 08:21:35 np0005465988 nova_compute[236126]: 2025-10-02 12:21:35.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:35Z|00344|binding|INFO|Releasing lport b1ce5636-6283-470c-ab5e-aac212c1256d from this chassis (sb_readonly=0)
Oct  2 08:21:35 np0005465988 nova_compute[236126]: 2025-10-02 12:21:35.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:36Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cd:6e:67 10.100.0.6
Oct  2 08:21:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:36Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cd:6e:67 10.100.0.6
Oct  2 08:21:36 np0005465988 podman[272899]: 2025-10-02 12:21:36.576882412 +0000 UTC m=+0.084223972 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:21:36 np0005465988 podman[272898]: 2025-10-02 12:21:36.580194728 +0000 UTC m=+0.093462991 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:21:36 np0005465988 podman[272897]: 2025-10-02 12:21:36.605832075 +0000 UTC m=+0.120461148 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:21:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:21:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:37.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:21:37 np0005465988 nova_compute[236126]: 2025-10-02 12:21:37.247 2 DEBUG nova.virt.libvirt.driver [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:21:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:37.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:21:37 np0005465988 nova_compute[236126]: 2025-10-02 12:21:37.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:39 np0005465988 nova_compute[236126]: 2025-10-02 12:21:39.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:39.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:39.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:41.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:41 np0005465988 nova_compute[236126]: 2025-10-02 12:21:41.271 2 INFO nova.virt.libvirt.driver [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Instance shutdown successfully after 14 seconds.#033[00m
Oct  2 08:21:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:41.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:41 np0005465988 kernel: tap8c8b83a0-49 (unregistering): left promiscuous mode
Oct  2 08:21:41 np0005465988 NetworkManager[45041]: <info>  [1759407701.4122] device (tap8c8b83a0-49): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:21:41 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:41Z|00345|binding|INFO|Releasing lport 8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 from this chassis (sb_readonly=0)
Oct  2 08:21:41 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:41Z|00346|binding|INFO|Setting lport 8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 down in Southbound
Oct  2 08:21:41 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:41Z|00347|binding|INFO|Removing iface tap8c8b83a0-49 ovn-installed in OVS
Oct  2 08:21:41 np0005465988 nova_compute[236126]: 2025-10-02 12:21:41.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:41.434 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:6e:67 10.100.0.6'], port_security=['fa:16:3e:cd:6e:67 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9c7a04c1-a740-4d58-bb92-b34f14ccff42', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7754c79a-cca5-48c7-9169-831eaad23ccc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c2c11ebecb14f3188f35ea473c4ca02', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c0d053f-a096-4f8c-8162-5ef19e29b5d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45b5774e-2213-45dd-ab74-f2a3868d167c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:41.435 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 in datapath 7754c79a-cca5-48c7-9169-831eaad23ccc unbound from our chassis#033[00m
Oct  2 08:21:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:41.437 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7754c79a-cca5-48c7-9169-831eaad23ccc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:21:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:41.442 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[109857ae-ce6b-4b92-bd80-c010dcbbaa90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:41.443 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc namespace which is not needed anymore#033[00m
Oct  2 08:21:41 np0005465988 nova_compute[236126]: 2025-10-02 12:21:41.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:41 np0005465988 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000059.scope: Deactivated successfully.
Oct  2 08:21:41 np0005465988 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000059.scope: Consumed 14.281s CPU time.
Oct  2 08:21:41 np0005465988 systemd-machined[192594]: Machine qemu-33-instance-00000059 terminated.
Oct  2 08:21:41 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[272852]: [NOTICE]   (272856) : haproxy version is 2.8.14-c23fe91
Oct  2 08:21:41 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[272852]: [NOTICE]   (272856) : path to executable is /usr/sbin/haproxy
Oct  2 08:21:41 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[272852]: [WARNING]  (272856) : Exiting Master process...
Oct  2 08:21:41 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[272852]: [ALERT]    (272856) : Current worker (272858) exited with code 143 (Terminated)
Oct  2 08:21:41 np0005465988 neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc[272852]: [WARNING]  (272856) : All workers exited. Exiting... (0)
Oct  2 08:21:41 np0005465988 systemd[1]: libpod-eb3d4fc78edcbbcb23630545688b73548aca13febdc53caa5e578a50ee4d3cce.scope: Deactivated successfully.
Oct  2 08:21:41 np0005465988 podman[273039]: 2025-10-02 12:21:41.63891186 +0000 UTC m=+0.061915454 container died eb3d4fc78edcbbcb23630545688b73548aca13febdc53caa5e578a50ee4d3cce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:21:41 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eb3d4fc78edcbbcb23630545688b73548aca13febdc53caa5e578a50ee4d3cce-userdata-shm.mount: Deactivated successfully.
Oct  2 08:21:41 np0005465988 systemd[1]: var-lib-containers-storage-overlay-263d4dd3b0aa4cf61fc102162b122949cd0f1ed185b51ca2aaf041deb0e45eb0-merged.mount: Deactivated successfully.
Oct  2 08:21:41 np0005465988 podman[273039]: 2025-10-02 12:21:41.688694299 +0000 UTC m=+0.111697893 container cleanup eb3d4fc78edcbbcb23630545688b73548aca13febdc53caa5e578a50ee4d3cce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:21:41 np0005465988 nova_compute[236126]: 2025-10-02 12:21:41.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:41 np0005465988 nova_compute[236126]: 2025-10-02 12:21:41.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:41 np0005465988 systemd[1]: libpod-conmon-eb3d4fc78edcbbcb23630545688b73548aca13febdc53caa5e578a50ee4d3cce.scope: Deactivated successfully.
Oct  2 08:21:41 np0005465988 nova_compute[236126]: 2025-10-02 12:21:41.717 2 INFO nova.virt.libvirt.driver [-] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Instance destroyed successfully.#033[00m
Oct  2 08:21:41 np0005465988 nova_compute[236126]: 2025-10-02 12:21:41.718 2 DEBUG nova.objects.instance [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9c7a04c1-a740-4d58-bb92-b34f14ccff42 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:41 np0005465988 podman[273074]: 2025-10-02 12:21:41.788916026 +0000 UTC m=+0.071265075 container remove eb3d4fc78edcbbcb23630545688b73548aca13febdc53caa5e578a50ee4d3cce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:21:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:41.796 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bd65a36e-053a-429d-ba80-1bb431daf7e9]: (4, ('Thu Oct  2 12:21:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc (eb3d4fc78edcbbcb23630545688b73548aca13febdc53caa5e578a50ee4d3cce)\neb3d4fc78edcbbcb23630545688b73548aca13febdc53caa5e578a50ee4d3cce\nThu Oct  2 12:21:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc (eb3d4fc78edcbbcb23630545688b73548aca13febdc53caa5e578a50ee4d3cce)\neb3d4fc78edcbbcb23630545688b73548aca13febdc53caa5e578a50ee4d3cce\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:41.797 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e6155589-793c-4959-bf28-c66c123bd013]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:41.798 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7754c79a-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:41 np0005465988 nova_compute[236126]: 2025-10-02 12:21:41.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:41 np0005465988 kernel: tap7754c79a-c0: left promiscuous mode
Oct  2 08:21:41 np0005465988 nova_compute[236126]: 2025-10-02 12:21:41.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:41 np0005465988 nova_compute[236126]: 2025-10-02 12:21:41.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:41.823 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0ec95ce0-c207-4d46-a3f2-a480e507bc29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:41.856 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5bfcf419-2c06-4fb9-bd66-332d84a07cab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:41.857 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[77460a3e-ed3d-443e-ab81-40b65795ffd2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:41.881 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6637bc64-98fa-4e16-b219-1f09b62e8160]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574154, 'reachable_time': 29215, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273095, 'error': None, 'target': 'ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:41 np0005465988 systemd[1]: run-netns-ovnmeta\x2d7754c79a\x2dcca5\x2d48c7\x2d9169\x2d831eaad23ccc.mount: Deactivated successfully.
Oct  2 08:21:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:41.888 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7754c79a-cca5-48c7-9169-831eaad23ccc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:21:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:41.888 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[c863df99-738a-4e18-9884-4ec75e823b38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:42 np0005465988 nova_compute[236126]: 2025-10-02 12:21:42.141 2 INFO nova.virt.libvirt.driver [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Beginning cold snapshot process#033[00m
Oct  2 08:21:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:21:42 np0005465988 nova_compute[236126]: 2025-10-02 12:21:42.364 2 DEBUG nova.virt.libvirt.imagebackend [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] No parent info for c2d0c2bc-fe21-4689-86ae-d6728c15874c; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Oct  2 08:21:42 np0005465988 nova_compute[236126]: 2025-10-02 12:21:42.581 2 DEBUG nova.storage.rbd_utils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] creating snapshot(34e2a049837842539ba90cd036733e31) on rbd image(9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:21:42 np0005465988 nova_compute[236126]: 2025-10-02 12:21:42.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e273 e273: 3 total, 3 up, 3 in
Oct  2 08:21:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:21:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:43.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:21:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:43.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:43 np0005465988 nova_compute[236126]: 2025-10-02 12:21:43.412 2 DEBUG nova.compute.manager [req-4a5a4163-e2a4-496b-8180-4b766fd11168 req-745590e2-5abd-4239-8ccb-56e5ceb0acc6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Received event network-vif-unplugged-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:43 np0005465988 nova_compute[236126]: 2025-10-02 12:21:43.412 2 DEBUG oslo_concurrency.lockutils [req-4a5a4163-e2a4-496b-8180-4b766fd11168 req-745590e2-5abd-4239-8ccb-56e5ceb0acc6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:43 np0005465988 nova_compute[236126]: 2025-10-02 12:21:43.413 2 DEBUG oslo_concurrency.lockutils [req-4a5a4163-e2a4-496b-8180-4b766fd11168 req-745590e2-5abd-4239-8ccb-56e5ceb0acc6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:43 np0005465988 nova_compute[236126]: 2025-10-02 12:21:43.413 2 DEBUG oslo_concurrency.lockutils [req-4a5a4163-e2a4-496b-8180-4b766fd11168 req-745590e2-5abd-4239-8ccb-56e5ceb0acc6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:43 np0005465988 nova_compute[236126]: 2025-10-02 12:21:43.414 2 DEBUG nova.compute.manager [req-4a5a4163-e2a4-496b-8180-4b766fd11168 req-745590e2-5abd-4239-8ccb-56e5ceb0acc6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] No waiting events found dispatching network-vif-unplugged-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:21:43 np0005465988 nova_compute[236126]: 2025-10-02 12:21:43.414 2 WARNING nova.compute.manager [req-4a5a4163-e2a4-496b-8180-4b766fd11168 req-745590e2-5abd-4239-8ccb-56e5ceb0acc6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Received unexpected event network-vif-unplugged-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:21:43 np0005465988 nova_compute[236126]: 2025-10-02 12:21:43.444 2 DEBUG nova.storage.rbd_utils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] cloning vms/9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk@34e2a049837842539ba90cd036733e31 to images/7aeb8031-5492-47ae-a269-5cc3e1a66a2a clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:21:43 np0005465988 nova_compute[236126]: 2025-10-02 12:21:43.650 2 DEBUG nova.storage.rbd_utils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] flattening images/7aeb8031-5492-47ae-a269-5cc3e1a66a2a flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:21:44 np0005465988 nova_compute[236126]: 2025-10-02 12:21:44.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:44 np0005465988 nova_compute[236126]: 2025-10-02 12:21:44.695 2 DEBUG nova.storage.rbd_utils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] removing snapshot(34e2a049837842539ba90cd036733e31) on rbd image(9c7a04c1-a740-4d58-bb92-b34f14ccff42_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:21:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.003000088s ======
Oct  2 08:21:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:45.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000088s
Oct  2 08:21:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:45.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e274 e274: 3 total, 3 up, 3 in
Oct  2 08:21:45 np0005465988 nova_compute[236126]: 2025-10-02 12:21:45.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:45 np0005465988 nova_compute[236126]: 2025-10-02 12:21:45.526 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:45 np0005465988 nova_compute[236126]: 2025-10-02 12:21:45.527 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:45 np0005465988 nova_compute[236126]: 2025-10-02 12:21:45.527 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:45 np0005465988 nova_compute[236126]: 2025-10-02 12:21:45.527 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:21:45 np0005465988 nova_compute[236126]: 2025-10-02 12:21:45.528 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:45 np0005465988 nova_compute[236126]: 2025-10-02 12:21:45.584 2 DEBUG nova.compute.manager [req-f4caa040-160f-42ef-b282-5e9929405abe req-91200219-94e9-4246-8fce-fccd173313c7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Received event network-vif-plugged-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:45 np0005465988 nova_compute[236126]: 2025-10-02 12:21:45.585 2 DEBUG oslo_concurrency.lockutils [req-f4caa040-160f-42ef-b282-5e9929405abe req-91200219-94e9-4246-8fce-fccd173313c7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:45 np0005465988 nova_compute[236126]: 2025-10-02 12:21:45.586 2 DEBUG oslo_concurrency.lockutils [req-f4caa040-160f-42ef-b282-5e9929405abe req-91200219-94e9-4246-8fce-fccd173313c7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:45 np0005465988 nova_compute[236126]: 2025-10-02 12:21:45.586 2 DEBUG oslo_concurrency.lockutils [req-f4caa040-160f-42ef-b282-5e9929405abe req-91200219-94e9-4246-8fce-fccd173313c7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:45 np0005465988 nova_compute[236126]: 2025-10-02 12:21:45.587 2 DEBUG nova.compute.manager [req-f4caa040-160f-42ef-b282-5e9929405abe req-91200219-94e9-4246-8fce-fccd173313c7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] No waiting events found dispatching network-vif-plugged-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:21:45 np0005465988 nova_compute[236126]: 2025-10-02 12:21:45.587 2 WARNING nova.compute.manager [req-f4caa040-160f-42ef-b282-5e9929405abe req-91200219-94e9-4246-8fce-fccd173313c7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Received unexpected event network-vif-plugged-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:21:45 np0005465988 nova_compute[236126]: 2025-10-02 12:21:45.910 2 DEBUG nova.storage.rbd_utils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] creating snapshot(snap) on rbd image(7aeb8031-5492-47ae-a269-5cc3e1a66a2a) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:21:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:21:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2895835696' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:21:45 np0005465988 nova_compute[236126]: 2025-10-02 12:21:45.979 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.071 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000059 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.072 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000059 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:46.076 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:46.077 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.258 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.259 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4512MB free_disk=20.852508544921875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.259 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.259 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.384 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 9c7a04c1-a740-4d58-bb92-b34f14ccff42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.384 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.385 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.423 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e275 e275: 3 total, 3 up, 3 in
Oct  2 08:21:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:21:46 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3059860970' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.871 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.878 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.896 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.924 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:21:46 np0005465988 nova_compute[236126]: 2025-10-02 12:21:46.924 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:21:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:47.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:21:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:47.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:21:47 np0005465988 nova_compute[236126]: 2025-10-02 12:21:47.925 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:47 np0005465988 nova_compute[236126]: 2025-10-02 12:21:47.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:48 np0005465988 nova_compute[236126]: 2025-10-02 12:21:48.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:48 np0005465988 podman[273404]: 2025-10-02 12:21:48.53296562 +0000 UTC m=+0.065007313 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:21:49 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:21:49 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:21:49 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.104 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.107 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.136 2 DEBUG nova.compute.manager [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.217 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.218 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.229 2 DEBUG nova.virt.hardware [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.230 2 INFO nova.compute.claims [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:21:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:49.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.293 2 INFO nova.virt.libvirt.driver [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Snapshot image upload complete#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.294 2 DEBUG nova.compute.manager [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:21:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:49.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.365 2 DEBUG oslo_concurrency.processutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.401 2 INFO nova.compute.manager [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Shelve offloading#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.414 2 INFO nova.virt.libvirt.driver [-] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Instance destroyed successfully.#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.415 2 DEBUG nova.compute.manager [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.419 2 DEBUG oslo_concurrency.lockutils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "refresh_cache-9c7a04c1-a740-4d58-bb92-b34f14ccff42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.419 2 DEBUG oslo_concurrency.lockutils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquired lock "refresh_cache-9c7a04c1-a740-4d58-bb92-b34f14ccff42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.420 2 DEBUG nova.network.neutron [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:21:49 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3635761677' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.875 2 DEBUG oslo_concurrency.processutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e276 e276: 3 total, 3 up, 3 in
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.882 2 DEBUG nova.compute.provider_tree [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.904 2 DEBUG nova.scheduler.client.report [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.931 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.713s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.933 2 DEBUG nova.compute.manager [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.977 2 DEBUG nova.compute.manager [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.978 2 DEBUG nova.network.neutron [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:21:49 np0005465988 nova_compute[236126]: 2025-10-02 12:21:49.995 2 INFO nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.017 2 DEBUG nova.compute.manager [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.105 2 DEBUG nova.compute.manager [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.106 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.107 2 INFO nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Creating image(s)#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.129 2 DEBUG nova.storage.rbd_utils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] rbd image 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.163 2 DEBUG nova.storage.rbd_utils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] rbd image 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.199 2 DEBUG nova.storage.rbd_utils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] rbd image 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.204 2 DEBUG oslo_concurrency.processutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.304 2 DEBUG oslo_concurrency.processutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.305 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.305 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.305 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.333 2 DEBUG nova.storage.rbd_utils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] rbd image 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.338 2 DEBUG oslo_concurrency.processutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.379 2 DEBUG nova.policy [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4146a31af09c4e6a8aee251f2fec4f98', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c740a14d1c5c45d1a0959b0e24ac460b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.477 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.478 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.479 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:50 np0005465988 nova_compute[236126]: 2025-10-02 12:21:50.479 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:21:51 np0005465988 nova_compute[236126]: 2025-10-02 12:21:51.038 2 DEBUG oslo_concurrency.processutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.700s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:51 np0005465988 nova_compute[236126]: 2025-10-02 12:21:51.129 2 DEBUG nova.storage.rbd_utils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] resizing rbd image 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:21:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:51 np0005465988 nova_compute[236126]: 2025-10-02 12:21:51.237 2 DEBUG nova.network.neutron [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Successfully created port: 7b92f05d-cce2-48f0-a124-0408773ce275 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:21:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:51.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:51.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:51 np0005465988 nova_compute[236126]: 2025-10-02 12:21:51.349 2 DEBUG nova.objects.instance [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'migration_context' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:51 np0005465988 nova_compute[236126]: 2025-10-02 12:21:51.366 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:21:51 np0005465988 nova_compute[236126]: 2025-10-02 12:21:51.367 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Ensure instance console log exists: /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:21:51 np0005465988 nova_compute[236126]: 2025-10-02 12:21:51.367 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:51 np0005465988 nova_compute[236126]: 2025-10-02 12:21:51.368 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:51 np0005465988 nova_compute[236126]: 2025-10-02 12:21:51.368 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:51 np0005465988 nova_compute[236126]: 2025-10-02 12:21:51.724 2 DEBUG nova.network.neutron [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Updating instance_info_cache with network_info: [{"id": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "address": "fa:16:3e:cd:6e:67", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c8b83a0-49", "ovs_interfaceid": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:51 np0005465988 nova_compute[236126]: 2025-10-02 12:21:51.751 2 DEBUG oslo_concurrency.lockutils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Releasing lock "refresh_cache-9c7a04c1-a740-4d58-bb92-b34f14ccff42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:21:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:21:52 np0005465988 nova_compute[236126]: 2025-10-02 12:21:52.793 2 DEBUG nova.network.neutron [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Successfully updated port: 7b92f05d-cce2-48f0-a124-0408773ce275 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:21:52 np0005465988 nova_compute[236126]: 2025-10-02 12:21:52.808 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:52 np0005465988 nova_compute[236126]: 2025-10-02 12:21:52.809 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquired lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:52 np0005465988 nova_compute[236126]: 2025-10-02 12:21:52.809 2 DEBUG nova.network.neutron [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:21:52 np0005465988 nova_compute[236126]: 2025-10-02 12:21:52.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.047 2 DEBUG nova.network.neutron [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:21:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:53.080 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.130 2 INFO nova.virt.libvirt.driver [-] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Instance destroyed successfully.#033[00m
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.131 2 DEBUG nova.objects.instance [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lazy-loading 'resources' on Instance uuid 9c7a04c1-a740-4d58-bb92-b34f14ccff42 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.149 2 DEBUG nova.virt.libvirt.vif [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-97811332',display_name='tempest-DeleteServersTestJSON-server-97811332',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-97811332',id=89,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:23Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='1c2c11ebecb14f3188f35ea473c4ca02',ramdisk_id='',reservation_id='r-c1yw38tm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-1602490521',owner_user_name='tempest-DeleteServersTestJSON-1602490521-project-member',shelved_at='2025-10-02T12:21:49.294406',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='7aeb8031-5492-47ae-a269-5cc3e1a66a2a'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:21:42Z,user_data=None,user_id='a9f7faffac7240869a0196df1ddda7e5',uuid=9c7a04c1-a740-4d58-bb92-b34f14ccff42,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "address": "fa:16:3e:cd:6e:67", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c8b83a0-49", "ovs_interfaceid": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.150 2 DEBUG nova.network.os_vif_util [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Converting VIF {"id": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "address": "fa:16:3e:cd:6e:67", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c8b83a0-49", "ovs_interfaceid": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.151 2 DEBUG nova.network.os_vif_util [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:6e:67,bridge_name='br-int',has_traffic_filtering=True,id=8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1,network=Network(7754c79a-cca5-48c7-9169-831eaad23ccc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c8b83a0-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.152 2 DEBUG os_vif [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:6e:67,bridge_name='br-int',has_traffic_filtering=True,id=8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1,network=Network(7754c79a-cca5-48c7-9169-831eaad23ccc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c8b83a0-49') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.156 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c8b83a0-49, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.168 2 INFO os_vif [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:6e:67,bridge_name='br-int',has_traffic_filtering=True,id=8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1,network=Network(7754c79a-cca5-48c7-9169-831eaad23ccc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c8b83a0-49')#033[00m
Oct  2 08:21:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:53.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:21:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:53.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.377 2 DEBUG nova.compute.manager [req-7b1da67b-6d55-4ffe-9191-5695f31f75a2 req-4838e325-3a76-4208-b634-a49f8f45cf31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-changed-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.378 2 DEBUG nova.compute.manager [req-7b1da67b-6d55-4ffe-9191-5695f31f75a2 req-4838e325-3a76-4208-b634-a49f8f45cf31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Refreshing instance network info cache due to event network-changed-7b92f05d-cce2-48f0-a124-0408773ce275. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:21:53 np0005465988 nova_compute[236126]: 2025-10-02 12:21:53.378 2 DEBUG oslo_concurrency.lockutils [req-7b1da67b-6d55-4ffe-9191-5695f31f75a2 req-4838e325-3a76-4208-b634-a49f8f45cf31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.093 2 DEBUG nova.network.neutron [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updating instance_info_cache with network_info: [{"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.132 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Releasing lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.133 2 DEBUG nova.compute.manager [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Instance network_info: |[{"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.133 2 DEBUG oslo_concurrency.lockutils [req-7b1da67b-6d55-4ffe-9191-5695f31f75a2 req-4838e325-3a76-4208-b634-a49f8f45cf31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.134 2 DEBUG nova.network.neutron [req-7b1da67b-6d55-4ffe-9191-5695f31f75a2 req-4838e325-3a76-4208-b634-a49f8f45cf31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Refreshing network info cache for port 7b92f05d-cce2-48f0-a124-0408773ce275 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.137 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Start _get_guest_xml network_info=[{"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.155 2 WARNING nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.164 2 DEBUG nova.virt.libvirt.host [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.165 2 DEBUG nova.virt.libvirt.host [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.169 2 DEBUG nova.virt.libvirt.host [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.169 2 DEBUG nova.virt.libvirt.host [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.171 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.171 2 DEBUG nova.virt.hardware [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.171 2 DEBUG nova.virt.hardware [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.172 2 DEBUG nova.virt.hardware [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.172 2 DEBUG nova.virt.hardware [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.172 2 DEBUG nova.virt.hardware [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.172 2 DEBUG nova.virt.hardware [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.172 2 DEBUG nova.virt.hardware [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.173 2 DEBUG nova.virt.hardware [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.173 2 DEBUG nova.virt.hardware [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.173 2 DEBUG nova.virt.hardware [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.173 2 DEBUG nova.virt.hardware [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.176 2 DEBUG oslo_concurrency.processutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e277 e277: 3 total, 3 up, 3 in
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.471 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:21:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/286001587' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.644 2 DEBUG oslo_concurrency.processutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.684 2 DEBUG nova.storage.rbd_utils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] rbd image 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:21:54 np0005465988 nova_compute[236126]: 2025-10-02 12:21:54.689 2 DEBUG oslo_concurrency.processutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:21:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/131269951' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.105 2 DEBUG oslo_concurrency.processutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.107 2 DEBUG nova.virt.libvirt.vif [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:21:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-757604995',display_name='tempest-ServersNegativeTestJSON-server-757604995',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-757604995',id=90,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c740a14d1c5c45d1a0959b0e24ac460b',ramdisk_id='',reservation_id='r-1qzpbgvk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-462972452',owner_user_name='tempest-ServersNegativeTestJSON-462972452-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:21:50Z,user_data=None,user_id='4146a31af09c4e6a8aee251f2fec4f98',uuid=3a4d32fc-bed8-4e11-9033-5b73501128fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.107 2 DEBUG nova.network.os_vif_util [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Converting VIF {"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.108 2 DEBUG nova.network.os_vif_util [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.109 2 DEBUG nova.objects.instance [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'pci_devices' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.126 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  <uuid>3a4d32fc-bed8-4e11-9033-5b73501128fe</uuid>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  <name>instance-0000005a</name>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServersNegativeTestJSON-server-757604995</nova:name>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:21:54</nova:creationTime>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <nova:user uuid="4146a31af09c4e6a8aee251f2fec4f98">tempest-ServersNegativeTestJSON-462972452-project-member</nova:user>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <nova:project uuid="c740a14d1c5c45d1a0959b0e24ac460b">tempest-ServersNegativeTestJSON-462972452</nova:project>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <nova:port uuid="7b92f05d-cce2-48f0-a124-0408773ce275">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <entry name="serial">3a4d32fc-bed8-4e11-9033-5b73501128fe</entry>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <entry name="uuid">3a4d32fc-bed8-4e11-9033-5b73501128fe</entry>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/3a4d32fc-bed8-4e11-9033-5b73501128fe_disk">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/3a4d32fc-bed8-4e11-9033-5b73501128fe_disk.config">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:80:0c:a5"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <target dev="tap7b92f05d-cc"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/console.log" append="off"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:21:55 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:21:55 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:21:55 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:21:55 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.128 2 DEBUG nova.compute.manager [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Preparing to wait for external event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.128 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.129 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.129 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.130 2 DEBUG nova.virt.libvirt.vif [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:21:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-757604995',display_name='tempest-ServersNegativeTestJSON-server-757604995',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-757604995',id=90,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c740a14d1c5c45d1a0959b0e24ac460b',ramdisk_id='',reservation_id='r-1qzpbgvk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-462972452',owner_user_name='tempest-ServersNegativeTestJSON-462972452-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:21:50Z,user_data=None,user_id='4146a31af09c4e6a8aee251f2fec4f98',uuid=3a4d32fc-bed8-4e11-9033-5b73501128fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.130 2 DEBUG nova.network.os_vif_util [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Converting VIF {"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.131 2 DEBUG nova.network.os_vif_util [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.131 2 DEBUG os_vif [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.133 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.133 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.137 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b92f05d-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.137 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b92f05d-cc, col_values=(('external_ids', {'iface-id': '7b92f05d-cce2-48f0-a124-0408773ce275', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:0c:a5', 'vm-uuid': '3a4d32fc-bed8-4e11-9033-5b73501128fe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:55 np0005465988 NetworkManager[45041]: <info>  [1759407715.1399] manager: (tap7b92f05d-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/170)
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.149 2 INFO os_vif [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc')#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.193 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.194 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.194 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] No VIF found with MAC fa:16:3e:80:0c:a5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.195 2 INFO nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Using config drive#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.228 2 DEBUG nova.storage.rbd_utils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] rbd image 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:21:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:21:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:55.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:21:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:55.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.475 2 DEBUG nova.compute.manager [req-8f611a54-719a-4c2e-9114-5c3df0c082af req-f57077cc-1c57-4474-b06e-1d4efe5cb95c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Received event network-changed-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.476 2 DEBUG nova.compute.manager [req-8f611a54-719a-4c2e-9114-5c3df0c082af req-f57077cc-1c57-4474-b06e-1d4efe5cb95c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Refreshing instance network info cache due to event network-changed-8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.477 2 DEBUG oslo_concurrency.lockutils [req-8f611a54-719a-4c2e-9114-5c3df0c082af req-f57077cc-1c57-4474-b06e-1d4efe5cb95c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-9c7a04c1-a740-4d58-bb92-b34f14ccff42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.477 2 DEBUG oslo_concurrency.lockutils [req-8f611a54-719a-4c2e-9114-5c3df0c082af req-f57077cc-1c57-4474-b06e-1d4efe5cb95c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-9c7a04c1-a740-4d58-bb92-b34f14ccff42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.478 2 DEBUG nova.network.neutron [req-8f611a54-719a-4c2e-9114-5c3df0c082af req-f57077cc-1c57-4474-b06e-1d4efe5cb95c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Refreshing network info cache for port 8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.677 2 INFO nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Creating config drive at /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/disk.config#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.688 2 DEBUG oslo_concurrency.processutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp30gbd_k8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.727 2 DEBUG nova.network.neutron [req-7b1da67b-6d55-4ffe-9191-5695f31f75a2 req-4838e325-3a76-4208-b634-a49f8f45cf31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updated VIF entry in instance network info cache for port 7b92f05d-cce2-48f0-a124-0408773ce275. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.729 2 DEBUG nova.network.neutron [req-7b1da67b-6d55-4ffe-9191-5695f31f75a2 req-4838e325-3a76-4208-b634-a49f8f45cf31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updating instance_info_cache with network_info: [{"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.749 2 DEBUG oslo_concurrency.lockutils [req-7b1da67b-6d55-4ffe-9191-5695f31f75a2 req-4838e325-3a76-4208-b634-a49f8f45cf31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.834 2 DEBUG oslo_concurrency.processutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp30gbd_k8" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.866 2 DEBUG nova.storage.rbd_utils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] rbd image 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.872 2 DEBUG oslo_concurrency.processutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/disk.config 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.926 2 INFO nova.virt.libvirt.driver [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Deleting instance files /var/lib/nova/instances/9c7a04c1-a740-4d58-bb92-b34f14ccff42_del#033[00m
Oct  2 08:21:55 np0005465988 nova_compute[236126]: 2025-10-02 12:21:55.927 2 INFO nova.virt.libvirt.driver [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Deletion of /var/lib/nova/instances/9c7a04c1-a740-4d58-bb92-b34f14ccff42_del complete#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.069 2 INFO nova.scheduler.client.report [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Deleted allocations for instance 9c7a04c1-a740-4d58-bb92-b34f14ccff42#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.130 2 DEBUG oslo_concurrency.lockutils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.131 2 DEBUG oslo_concurrency.lockutils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.178 2 DEBUG oslo_concurrency.processutils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.219 2 DEBUG oslo_concurrency.processutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/disk.config 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.347s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.220 2 INFO nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Deleting local config drive /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/disk.config because it was imported into RBD.#033[00m
Oct  2 08:21:56 np0005465988 kernel: tap7b92f05d-cc: entered promiscuous mode
Oct  2 08:21:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:56Z|00348|binding|INFO|Claiming lport 7b92f05d-cce2-48f0-a124-0408773ce275 for this chassis.
Oct  2 08:21:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:56Z|00349|binding|INFO|7b92f05d-cce2-48f0-a124-0408773ce275: Claiming fa:16:3e:80:0c:a5 10.100.0.11
Oct  2 08:21:56 np0005465988 NetworkManager[45041]: <info>  [1759407716.2908] manager: (tap7b92f05d-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/171)
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.299 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:0c:a5 10.100.0.11'], port_security=['fa:16:3e:80:0c:a5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3a4d32fc-bed8-4e11-9033-5b73501128fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5f671ae-bb65-4932-84ce-cef4210e4599', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c740a14d1c5c45d1a0959b0e24ac460b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0cf8cc6d-2482-45e4-b576-7d811b75025a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5feaa29-5f25-4f45-a24f-ce44451fb322, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7b92f05d-cce2-48f0-a124-0408773ce275) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.300 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7b92f05d-cce2-48f0-a124-0408773ce275 in datapath d5f671ae-bb65-4932-84ce-cef4210e4599 bound to our chassis#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.302 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d5f671ae-bb65-4932-84ce-cef4210e4599#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.318 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[36b1d41a-b035-4d50-85c7-6da8c1eb3134]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.319 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd5f671ae-b1 in ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.324 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd5f671ae-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.324 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e262dc-a571-416f-8a0f-17bd11c28a3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.325 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ff09df50-9e66-4732-8c7b-ff71ea1e2f82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.337 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd50452-33d9-4cee-a8a6-f923e8fe55ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 systemd-machined[192594]: New machine qemu-34-instance-0000005a.
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:56 np0005465988 systemd[1]: Started Virtual Machine qemu-34-instance-0000005a.
Oct  2 08:21:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:56Z|00350|binding|INFO|Setting lport 7b92f05d-cce2-48f0-a124-0408773ce275 ovn-installed in OVS
Oct  2 08:21:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:56Z|00351|binding|INFO|Setting lport 7b92f05d-cce2-48f0-a124-0408773ce275 up in Southbound
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:56 np0005465988 systemd-udevd[273805]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.371 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e9847219-b597-4629-8ede-2107b70755b5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 NetworkManager[45041]: <info>  [1759407716.3915] device (tap7b92f05d-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:21:56 np0005465988 NetworkManager[45041]: <info>  [1759407716.3923] device (tap7b92f05d-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.413 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[43828991-f3f8-4245-b3f5-3fccca29d396]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.422 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[029de3e6-d73e-4304-8410-b1d9e57c7132]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 systemd-udevd[273809]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:21:56 np0005465988 NetworkManager[45041]: <info>  [1759407716.4251] manager: (tapd5f671ae-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/172)
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.467 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[3a856a08-aefa-4787-b533-d2baf0bb1371]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.472 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e51740bb-c4ef-4ae6-aa9f-07d9408b3b8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 NetworkManager[45041]: <info>  [1759407716.5072] device (tapd5f671ae-b0): carrier: link connected
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.519 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[719ec87c-2a9c-4395-8095-24bf553af69f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.547 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[981a6b2b-8109-4f8e-84c9-6bc92625a600]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd5f671ae-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:3a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577683, 'reachable_time': 21826, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273835, 'error': None, 'target': 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.570 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[79f661e4-0fbb-4aaf-93eb-83bd96571abc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:3a86'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577683, 'tstamp': 577683}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273836, 'error': None, 'target': 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.592 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1d415017-a611-48cb-8397-de147f63a651]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd5f671ae-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:3a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577683, 'reachable_time': 21826, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 273837, 'error': None, 'target': 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.642 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[43dfe055-beb5-4b2a-9e4f-b203bcfec9d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.661 2 DEBUG nova.network.neutron [req-8f611a54-719a-4c2e-9114-5c3df0c082af req-f57077cc-1c57-4474-b06e-1d4efe5cb95c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Updated VIF entry in instance network info cache for port 8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.662 2 DEBUG nova.network.neutron [req-8f611a54-719a-4c2e-9114-5c3df0c082af req-f57077cc-1c57-4474-b06e-1d4efe5cb95c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Updating instance_info_cache with network_info: [{"id": "8c8b83a0-49eb-44fc-be5b-b24fe0a0d2e1", "address": "fa:16:3e:cd:6e:67", "network": {"id": "7754c79a-cca5-48c7-9169-831eaad23ccc", "bridge": null, "label": "tempest-DeleteServersTestJSON-484493292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c2c11ebecb14f3188f35ea473c4ca02", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap8c8b83a0-49", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:21:56 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2824117336' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:21:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e278 e278: 3 total, 3 up, 3 in
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.686 2 DEBUG oslo_concurrency.lockutils [req-8f611a54-719a-4c2e-9114-5c3df0c082af req-f57077cc-1c57-4474-b06e-1d4efe5cb95c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-9c7a04c1-a740-4d58-bb92-b34f14ccff42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.697 2 DEBUG oslo_concurrency.processutils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.705 2 DEBUG nova.compute.provider_tree [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.715 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407701.7139094, 9c7a04c1-a740-4d58-bb92-b34f14ccff42 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.715 2 INFO nova.compute.manager [-] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.724 2 DEBUG nova.scheduler.client.report [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.734 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[df579c29-40c6-4340-a602-e73a33abe309]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.736 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5f671ae-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.736 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.737 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd5f671ae-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:56 np0005465988 NetworkManager[45041]: <info>  [1759407716.7402] manager: (tapd5f671ae-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/173)
Oct  2 08:21:56 np0005465988 kernel: tapd5f671ae-b0: entered promiscuous mode
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.744 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd5f671ae-b0, col_values=(('external_ids', {'iface-id': '18276c7d-4e7d-4b5c-a013-87c3ea8e7868'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:21:56Z|00352|binding|INFO|Releasing lport 18276c7d-4e7d-4b5c-a013-87c3ea8e7868 from this chassis (sb_readonly=0)
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.765 2 DEBUG nova.compute.manager [None req-983c9be0-6075-4a7a-82cf-980d8b24371c - - - - - -] [instance: 9c7a04c1-a740-4d58-bb92-b34f14ccff42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.770 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5f671ae-bb65-4932-84ce-cef4210e4599.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5f671ae-bb65-4932-84ce-cef4210e4599.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.771 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d3311029-eef2-4a06-8ad5-cd7da77970c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.773 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-d5f671ae-bb65-4932-84ce-cef4210e4599
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/d5f671ae-bb65-4932-84ce-cef4210e4599.pid.haproxy
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID d5f671ae-bb65-4932-84ce-cef4210e4599
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:21:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:21:56.774 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'env', 'PROCESS_TAG=haproxy-d5f671ae-bb65-4932-84ce-cef4210e4599', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d5f671ae-bb65-4932-84ce-cef4210e4599.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.814 2 DEBUG oslo_concurrency.lockutils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:56 np0005465988 nova_compute[236126]: 2025-10-02 12:21:56.986 2 DEBUG oslo_concurrency.lockutils [None req-1bdaa187-fbd7-4634-81be-a209c7ad157d a9f7faffac7240869a0196df1ddda7e5 1c2c11ebecb14f3188f35ea473c4ca02 - - default default] Lock "9c7a04c1-a740-4d58-bb92-b34f14ccff42" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 29.905s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:57 np0005465988 podman[273907]: 2025-10-02 12:21:57.205626991 +0000 UTC m=+0.072967495 container create c8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:21:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:57.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:57 np0005465988 podman[273907]: 2025-10-02 12:21:57.172301331 +0000 UTC m=+0.039641825 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:21:57 np0005465988 systemd[1]: Started libpod-conmon-c8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2.scope.
Oct  2 08:21:57 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:21:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:57.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:57 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ae8a9327a574cf18d50ce8534c5d794212191a572b93dfc73eb571439a05eda/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:21:57 np0005465988 podman[273907]: 2025-10-02 12:21:57.345904244 +0000 UTC m=+0.213244728 container init c8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 08:21:57 np0005465988 podman[273907]: 2025-10-02 12:21:57.351790135 +0000 UTC m=+0.219130599 container start c8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:21:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:21:57 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[273929]: [NOTICE]   (273933) : New worker (273935) forked
Oct  2 08:21:57 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[273929]: [NOTICE]   (273933) : Loading success.
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.692 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407717.6920323, 3a4d32fc-bed8-4e11-9033-5b73501128fe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.693 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] VM Started (Lifecycle Event)#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.699 2 DEBUG nova.compute.manager [req-408c8083-0760-4b3e-948e-c0cfc94399ae req-539d7940-a14b-41ae-8623-a0d7465646d1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.699 2 DEBUG oslo_concurrency.lockutils [req-408c8083-0760-4b3e-948e-c0cfc94399ae req-539d7940-a14b-41ae-8623-a0d7465646d1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.700 2 DEBUG oslo_concurrency.lockutils [req-408c8083-0760-4b3e-948e-c0cfc94399ae req-539d7940-a14b-41ae-8623-a0d7465646d1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.700 2 DEBUG oslo_concurrency.lockutils [req-408c8083-0760-4b3e-948e-c0cfc94399ae req-539d7940-a14b-41ae-8623-a0d7465646d1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.701 2 DEBUG nova.compute.manager [req-408c8083-0760-4b3e-948e-c0cfc94399ae req-539d7940-a14b-41ae-8623-a0d7465646d1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Processing event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.701 2 DEBUG nova.compute.manager [req-408c8083-0760-4b3e-948e-c0cfc94399ae req-539d7940-a14b-41ae-8623-a0d7465646d1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.701 2 DEBUG oslo_concurrency.lockutils [req-408c8083-0760-4b3e-948e-c0cfc94399ae req-539d7940-a14b-41ae-8623-a0d7465646d1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.702 2 DEBUG oslo_concurrency.lockutils [req-408c8083-0760-4b3e-948e-c0cfc94399ae req-539d7940-a14b-41ae-8623-a0d7465646d1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.702 2 DEBUG oslo_concurrency.lockutils [req-408c8083-0760-4b3e-948e-c0cfc94399ae req-539d7940-a14b-41ae-8623-a0d7465646d1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.702 2 DEBUG nova.compute.manager [req-408c8083-0760-4b3e-948e-c0cfc94399ae req-539d7940-a14b-41ae-8623-a0d7465646d1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] No waiting events found dispatching network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.703 2 WARNING nova.compute.manager [req-408c8083-0760-4b3e-948e-c0cfc94399ae req-539d7940-a14b-41ae-8623-a0d7465646d1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received unexpected event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.704 2 DEBUG nova.compute.manager [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.708 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.713 2 INFO nova.virt.libvirt.driver [-] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Instance spawned successfully.#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.713 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.718 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.723 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.735 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.736 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.737 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.737 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.738 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.739 2 DEBUG nova.virt.libvirt.driver [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.749 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.749 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407717.6922317, 3a4d32fc-bed8-4e11-9033-5b73501128fe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.750 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.793 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.798 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407717.7075434, 3a4d32fc-bed8-4e11-9033-5b73501128fe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.799 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.830 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.835 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.864 2 INFO nova.compute.manager [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Took 7.76 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.865 2 DEBUG nova.compute.manager [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.883 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.959 2 INFO nova.compute.manager [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Took 8.76 seconds to build instance.#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:57 np0005465988 nova_compute[236126]: 2025-10-02 12:21:57.989 2 DEBUG oslo_concurrency.lockutils [None req-2ca34764-9e7d-4f2c-85e1-1340a5a8edc8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.881s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:58 np0005465988 nova_compute[236126]: 2025-10-02 12:21:58.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:58 np0005465988 nova_compute[236126]: 2025-10-02 12:21:58.500 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:58 np0005465988 nova_compute[236126]: 2025-10-02 12:21:58.500 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:21:58 np0005465988 nova_compute[236126]: 2025-10-02 12:21:58.501 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:21:58 np0005465988 nova_compute[236126]: 2025-10-02 12:21:58.868 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:58 np0005465988 nova_compute[236126]: 2025-10-02 12:21:58.869 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:58 np0005465988 nova_compute[236126]: 2025-10-02 12:21:58.869 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:21:58 np0005465988 nova_compute[236126]: 2025-10-02 12:21:58.869 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e279 e279: 3 total, 3 up, 3 in
Oct  2 08:21:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:21:59.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:21:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:21:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:21:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:21:59.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:00 np0005465988 nova_compute[236126]: 2025-10-02 12:22:00.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:22:00 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:22:00 np0005465988 nova_compute[236126]: 2025-10-02 12:22:00.782 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updating instance_info_cache with network_info: [{"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:00 np0005465988 nova_compute[236126]: 2025-10-02 12:22:00.799 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:22:00 np0005465988 nova_compute[236126]: 2025-10-02 12:22:00.800 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:22:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:01.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:01.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:22:02 np0005465988 nova_compute[236126]: 2025-10-02 12:22:02.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:03.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:22:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:03.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:22:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e280 e280: 3 total, 3 up, 3 in
Oct  2 08:22:05 np0005465988 nova_compute[236126]: 2025-10-02 12:22:05.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:05.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:05.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:07.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:07.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:22:07 np0005465988 podman[274050]: 2025-10-02 12:22:07.568628489 +0000 UTC m=+0.095857481 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:22:07 np0005465988 podman[274051]: 2025-10-02 12:22:07.589005492 +0000 UTC m=+0.103789452 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd)
Oct  2 08:22:07 np0005465988 podman[274049]: 2025-10-02 12:22:07.613090533 +0000 UTC m=+0.136109972 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct  2 08:22:07 np0005465988 nova_compute[236126]: 2025-10-02 12:22:07.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:09.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:09.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 e281: 3 total, 3 up, 3 in
Oct  2 08:22:10 np0005465988 nova_compute[236126]: 2025-10-02 12:22:10.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:11.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:11.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:11 np0005465988 ovn_controller[132601]: 2025-10-02T12:22:11Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:80:0c:a5 10.100.0.11
Oct  2 08:22:11 np0005465988 ovn_controller[132601]: 2025-10-02T12:22:11Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:80:0c:a5 10.100.0.11
Oct  2 08:22:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:22:12 np0005465988 nova_compute[236126]: 2025-10-02 12:22:12.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:13.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:13.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:15 np0005465988 nova_compute[236126]: 2025-10-02 12:22:15.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:22:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:15.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:22:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:15.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:17.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:17.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:22:17 np0005465988 nova_compute[236126]: 2025-10-02 12:22:17.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:18 np0005465988 nova_compute[236126]: 2025-10-02 12:22:18.131 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Acquiring lock "78949d28-3c77-4033-8f30-5d1a3802169b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:18 np0005465988 nova_compute[236126]: 2025-10-02 12:22:18.132 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:18 np0005465988 nova_compute[236126]: 2025-10-02 12:22:18.168 2 DEBUG nova.compute.manager [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:22:18 np0005465988 nova_compute[236126]: 2025-10-02 12:22:18.285 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:18 np0005465988 nova_compute[236126]: 2025-10-02 12:22:18.286 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:18 np0005465988 nova_compute[236126]: 2025-10-02 12:22:18.294 2 DEBUG nova.virt.hardware [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:22:18 np0005465988 nova_compute[236126]: 2025-10-02 12:22:18.294 2 INFO nova.compute.claims [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:22:18 np0005465988 nova_compute[236126]: 2025-10-02 12:22:18.652 2 DEBUG oslo_concurrency.processutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:22:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1522284553' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.153 2 DEBUG oslo_concurrency.processutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.163 2 DEBUG nova.compute.provider_tree [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.198 2 DEBUG nova.scheduler.client.report [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.234 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.948s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.235 2 DEBUG nova.compute.manager [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:22:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:19.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.332 2 DEBUG nova.compute.manager [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.333 2 DEBUG nova.network.neutron [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:22:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:19.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.362 2 INFO nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.395 2 DEBUG nova.compute.manager [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.535 2 DEBUG nova.compute.manager [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.536 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.536 2 INFO nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Creating image(s)#033[00m
Oct  2 08:22:19 np0005465988 podman[274140]: 2025-10-02 12:22:19.556578845 +0000 UTC m=+0.085630423 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.582 2 DEBUG nova.storage.rbd_utils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] rbd image 78949d28-3c77-4033-8f30-5d1a3802169b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.632 2 DEBUG nova.storage.rbd_utils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] rbd image 78949d28-3c77-4033-8f30-5d1a3802169b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.674 2 DEBUG nova.storage.rbd_utils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] rbd image 78949d28-3c77-4033-8f30-5d1a3802169b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.683 2 DEBUG oslo_concurrency.processutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.718 2 DEBUG nova.policy [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93e805bcb0e047ca9d45c653f5ec913d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '91d108e807094b0fa8e63a923d2269ee', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.767 2 DEBUG oslo_concurrency.processutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.768 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.769 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.770 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.805 2 DEBUG nova.storage.rbd_utils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] rbd image 78949d28-3c77-4033-8f30-5d1a3802169b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:22:19 np0005465988 nova_compute[236126]: 2025-10-02 12:22:19.810 2 DEBUG oslo_concurrency.processutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 78949d28-3c77-4033-8f30-5d1a3802169b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:20 np0005465988 nova_compute[236126]: 2025-10-02 12:22:20.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:20 np0005465988 nova_compute[236126]: 2025-10-02 12:22:20.723 2 DEBUG oslo_concurrency.processutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 78949d28-3c77-4033-8f30-5d1a3802169b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.913s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:20 np0005465988 nova_compute[236126]: 2025-10-02 12:22:20.801 2 DEBUG nova.network.neutron [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Successfully created port: 96d7ab9f-3868-403b-8702-fba5dffc1c3d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:22:20 np0005465988 nova_compute[236126]: 2025-10-02 12:22:20.811 2 DEBUG nova.storage.rbd_utils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] resizing rbd image 78949d28-3c77-4033-8f30-5d1a3802169b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:22:20 np0005465988 nova_compute[236126]: 2025-10-02 12:22:20.948 2 DEBUG nova.objects.instance [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lazy-loading 'migration_context' on Instance uuid 78949d28-3c77-4033-8f30-5d1a3802169b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:20 np0005465988 nova_compute[236126]: 2025-10-02 12:22:20.965 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:22:20 np0005465988 nova_compute[236126]: 2025-10-02 12:22:20.966 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Ensure instance console log exists: /var/lib/nova/instances/78949d28-3c77-4033-8f30-5d1a3802169b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:22:20 np0005465988 nova_compute[236126]: 2025-10-02 12:22:20.966 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:20 np0005465988 nova_compute[236126]: 2025-10-02 12:22:20.967 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:20 np0005465988 nova_compute[236126]: 2025-10-02 12:22:20.967 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:21.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:21.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:21 np0005465988 nova_compute[236126]: 2025-10-02 12:22:21.899 2 DEBUG nova.network.neutron [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Successfully updated port: 96d7ab9f-3868-403b-8702-fba5dffc1c3d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:22:21 np0005465988 nova_compute[236126]: 2025-10-02 12:22:21.945 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Acquiring lock "refresh_cache-78949d28-3c77-4033-8f30-5d1a3802169b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:22:21 np0005465988 nova_compute[236126]: 2025-10-02 12:22:21.946 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Acquired lock "refresh_cache-78949d28-3c77-4033-8f30-5d1a3802169b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:22:21 np0005465988 nova_compute[236126]: 2025-10-02 12:22:21.946 2 DEBUG nova.network.neutron [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:22:22 np0005465988 nova_compute[236126]: 2025-10-02 12:22:22.022 2 DEBUG nova.compute.manager [req-700836ed-975f-48a7-aa0d-3e29e867f4c0 req-4fd9d278-b510-4bda-a2cb-9d6cbb7140bb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Received event network-changed-96d7ab9f-3868-403b-8702-fba5dffc1c3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:22 np0005465988 nova_compute[236126]: 2025-10-02 12:22:22.022 2 DEBUG nova.compute.manager [req-700836ed-975f-48a7-aa0d-3e29e867f4c0 req-4fd9d278-b510-4bda-a2cb-9d6cbb7140bb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Refreshing instance network info cache due to event network-changed-96d7ab9f-3868-403b-8702-fba5dffc1c3d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:22:22 np0005465988 nova_compute[236126]: 2025-10-02 12:22:22.022 2 DEBUG oslo_concurrency.lockutils [req-700836ed-975f-48a7-aa0d-3e29e867f4c0 req-4fd9d278-b510-4bda-a2cb-9d6cbb7140bb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-78949d28-3c77-4033-8f30-5d1a3802169b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:22:22 np0005465988 nova_compute[236126]: 2025-10-02 12:22:22.122 2 DEBUG nova.network.neutron [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:22:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:22:22 np0005465988 nova_compute[236126]: 2025-10-02 12:22:22.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.017 2 DEBUG nova.network.neutron [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Updating instance_info_cache with network_info: [{"id": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "address": "fa:16:3e:2f:ef:21", "network": {"id": "81df0bd4-1de1-409c-8730-5d718bbb9ab0", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-681141919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91d108e807094b0fa8e63a923d2269ee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96d7ab9f-38", "ovs_interfaceid": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.042 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Releasing lock "refresh_cache-78949d28-3c77-4033-8f30-5d1a3802169b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.043 2 DEBUG nova.compute.manager [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Instance network_info: |[{"id": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "address": "fa:16:3e:2f:ef:21", "network": {"id": "81df0bd4-1de1-409c-8730-5d718bbb9ab0", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-681141919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91d108e807094b0fa8e63a923d2269ee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96d7ab9f-38", "ovs_interfaceid": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.044 2 DEBUG oslo_concurrency.lockutils [req-700836ed-975f-48a7-aa0d-3e29e867f4c0 req-4fd9d278-b510-4bda-a2cb-9d6cbb7140bb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-78949d28-3c77-4033-8f30-5d1a3802169b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.044 2 DEBUG nova.network.neutron [req-700836ed-975f-48a7-aa0d-3e29e867f4c0 req-4fd9d278-b510-4bda-a2cb-9d6cbb7140bb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Refreshing network info cache for port 96d7ab9f-3868-403b-8702-fba5dffc1c3d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.050 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Start _get_guest_xml network_info=[{"id": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "address": "fa:16:3e:2f:ef:21", "network": {"id": "81df0bd4-1de1-409c-8730-5d718bbb9ab0", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-681141919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91d108e807094b0fa8e63a923d2269ee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96d7ab9f-38", "ovs_interfaceid": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.060 2 WARNING nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.065 2 DEBUG nova.virt.libvirt.host [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.066 2 DEBUG nova.virt.libvirt.host [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.077 2 DEBUG nova.virt.libvirt.host [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.078 2 DEBUG nova.virt.libvirt.host [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.080 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.081 2 DEBUG nova.virt.hardware [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.082 2 DEBUG nova.virt.hardware [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.082 2 DEBUG nova.virt.hardware [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.083 2 DEBUG nova.virt.hardware [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.083 2 DEBUG nova.virt.hardware [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.084 2 DEBUG nova.virt.hardware [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.084 2 DEBUG nova.virt.hardware [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.085 2 DEBUG nova.virt.hardware [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.085 2 DEBUG nova.virt.hardware [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.086 2 DEBUG nova.virt.hardware [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.086 2 DEBUG nova.virt.hardware [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.091 2 DEBUG oslo_concurrency.processutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:23.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:23.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:22:23 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/376438984' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.583 2 DEBUG oslo_concurrency.processutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.625 2 DEBUG nova.storage.rbd_utils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] rbd image 78949d28-3c77-4033-8f30-5d1a3802169b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:22:23 np0005465988 nova_compute[236126]: 2025-10-02 12:22:23.630 2 DEBUG oslo_concurrency.processutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:22:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3974811453' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.107 2 DEBUG oslo_concurrency.processutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.110 2 DEBUG nova.virt.libvirt.vif [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:22:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-478094513',display_name='tempest-ListServersNegativeTestJSON-server-478094513-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-478094513-2',id=95,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='91d108e807094b0fa8e63a923d2269ee',ramdisk_id='',reservation_id='r-jgpyr0ft',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1913538533',owner_user_name='tempest
-ListServersNegativeTestJSON-1913538533-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:22:19Z,user_data=None,user_id='93e805bcb0e047ca9d45c653f5ec913d',uuid=78949d28-3c77-4033-8f30-5d1a3802169b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "address": "fa:16:3e:2f:ef:21", "network": {"id": "81df0bd4-1de1-409c-8730-5d718bbb9ab0", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-681141919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91d108e807094b0fa8e63a923d2269ee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96d7ab9f-38", "ovs_interfaceid": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.111 2 DEBUG nova.network.os_vif_util [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Converting VIF {"id": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "address": "fa:16:3e:2f:ef:21", "network": {"id": "81df0bd4-1de1-409c-8730-5d718bbb9ab0", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-681141919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91d108e807094b0fa8e63a923d2269ee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96d7ab9f-38", "ovs_interfaceid": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.113 2 DEBUG nova.network.os_vif_util [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:ef:21,bridge_name='br-int',has_traffic_filtering=True,id=96d7ab9f-3868-403b-8702-fba5dffc1c3d,network=Network(81df0bd4-1de1-409c-8730-5d718bbb9ab0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96d7ab9f-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.115 2 DEBUG nova.objects.instance [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lazy-loading 'pci_devices' on Instance uuid 78949d28-3c77-4033-8f30-5d1a3802169b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.141 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  <uuid>78949d28-3c77-4033-8f30-5d1a3802169b</uuid>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  <name>instance-0000005f</name>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <nova:name>tempest-ListServersNegativeTestJSON-server-478094513-2</nova:name>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:22:23</nova:creationTime>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <nova:user uuid="93e805bcb0e047ca9d45c653f5ec913d">tempest-ListServersNegativeTestJSON-1913538533-project-member</nova:user>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <nova:project uuid="91d108e807094b0fa8e63a923d2269ee">tempest-ListServersNegativeTestJSON-1913538533</nova:project>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <nova:port uuid="96d7ab9f-3868-403b-8702-fba5dffc1c3d">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <entry name="serial">78949d28-3c77-4033-8f30-5d1a3802169b</entry>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <entry name="uuid">78949d28-3c77-4033-8f30-5d1a3802169b</entry>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/78949d28-3c77-4033-8f30-5d1a3802169b_disk">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/78949d28-3c77-4033-8f30-5d1a3802169b_disk.config">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:2f:ef:21"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <target dev="tap96d7ab9f-38"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/78949d28-3c77-4033-8f30-5d1a3802169b/console.log" append="off"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:22:24 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:22:24 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:22:24 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:22:24 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.144 2 DEBUG nova.compute.manager [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Preparing to wait for external event network-vif-plugged-96d7ab9f-3868-403b-8702-fba5dffc1c3d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.145 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Acquiring lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.145 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.146 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.147 2 DEBUG nova.virt.libvirt.vif [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:22:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-478094513',display_name='tempest-ListServersNegativeTestJSON-server-478094513-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-478094513-2',id=95,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='91d108e807094b0fa8e63a923d2269ee',ramdisk_id='',reservation_id='r-jgpyr0ft',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1913538533',owner_user_name='tempest-ListServersNegativeTestJSON-1913538533-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:22:19Z,user_data=None,user_id='93e805bcb0e047ca9d45c653f5ec913d',uuid=78949d28-3c77-4033-8f30-5d1a3802169b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "address": "fa:16:3e:2f:ef:21", "network": {"id": "81df0bd4-1de1-409c-8730-5d718bbb9ab0", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-681141919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91d108e807094b0fa8e63a923d2269ee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96d7ab9f-38", "ovs_interfaceid": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.148 2 DEBUG nova.network.os_vif_util [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Converting VIF {"id": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "address": "fa:16:3e:2f:ef:21", "network": {"id": "81df0bd4-1de1-409c-8730-5d718bbb9ab0", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-681141919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91d108e807094b0fa8e63a923d2269ee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96d7ab9f-38", "ovs_interfaceid": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.149 2 DEBUG nova.network.os_vif_util [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:ef:21,bridge_name='br-int',has_traffic_filtering=True,id=96d7ab9f-3868-403b-8702-fba5dffc1c3d,network=Network(81df0bd4-1de1-409c-8730-5d718bbb9ab0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96d7ab9f-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.149 2 DEBUG os_vif [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:ef:21,bridge_name='br-int',has_traffic_filtering=True,id=96d7ab9f-3868-403b-8702-fba5dffc1c3d,network=Network(81df0bd4-1de1-409c-8730-5d718bbb9ab0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96d7ab9f-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.151 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.152 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.157 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap96d7ab9f-38, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.158 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap96d7ab9f-38, col_values=(('external_ids', {'iface-id': '96d7ab9f-3868-403b-8702-fba5dffc1c3d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2f:ef:21', 'vm-uuid': '78949d28-3c77-4033-8f30-5d1a3802169b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:24 np0005465988 NetworkManager[45041]: <info>  [1759407744.2138] manager: (tap96d7ab9f-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.226 2 INFO os_vif [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:ef:21,bridge_name='br-int',has_traffic_filtering=True,id=96d7ab9f-3868-403b-8702-fba5dffc1c3d,network=Network(81df0bd4-1de1-409c-8730-5d718bbb9ab0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96d7ab9f-38')#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.318 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.319 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.319 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] No VIF found with MAC fa:16:3e:2f:ef:21, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.320 2 INFO nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Using config drive#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.362 2 DEBUG nova.storage.rbd_utils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] rbd image 78949d28-3c77-4033-8f30-5d1a3802169b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.949 2 INFO nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Creating config drive at /var/lib/nova/instances/78949d28-3c77-4033-8f30-5d1a3802169b/disk.config#033[00m
Oct  2 08:22:24 np0005465988 nova_compute[236126]: 2025-10-02 12:22:24.958 2 DEBUG oslo_concurrency.processutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/78949d28-3c77-4033-8f30-5d1a3802169b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpedn55u3g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.116 2 DEBUG oslo_concurrency.processutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/78949d28-3c77-4033-8f30-5d1a3802169b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpedn55u3g" returned: 0 in 0.158s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.167 2 DEBUG nova.storage.rbd_utils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] rbd image 78949d28-3c77-4033-8f30-5d1a3802169b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.172 2 DEBUG oslo_concurrency.processutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/78949d28-3c77-4033-8f30-5d1a3802169b/disk.config 78949d28-3c77-4033-8f30-5d1a3802169b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:22:25 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/684442568' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:22:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:25.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.310 2 DEBUG nova.network.neutron [req-700836ed-975f-48a7-aa0d-3e29e867f4c0 req-4fd9d278-b510-4bda-a2cb-9d6cbb7140bb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Updated VIF entry in instance network info cache for port 96d7ab9f-3868-403b-8702-fba5dffc1c3d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.311 2 DEBUG nova.network.neutron [req-700836ed-975f-48a7-aa0d-3e29e867f4c0 req-4fd9d278-b510-4bda-a2cb-9d6cbb7140bb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Updating instance_info_cache with network_info: [{"id": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "address": "fa:16:3e:2f:ef:21", "network": {"id": "81df0bd4-1de1-409c-8730-5d718bbb9ab0", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-681141919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91d108e807094b0fa8e63a923d2269ee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96d7ab9f-38", "ovs_interfaceid": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.340 2 DEBUG oslo_concurrency.lockutils [req-700836ed-975f-48a7-aa0d-3e29e867f4c0 req-4fd9d278-b510-4bda-a2cb-9d6cbb7140bb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-78949d28-3c77-4033-8f30-5d1a3802169b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:22:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:25.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.441 2 DEBUG oslo_concurrency.processutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/78949d28-3c77-4033-8f30-5d1a3802169b/disk.config 78949d28-3c77-4033-8f30-5d1a3802169b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.269s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.442 2 INFO nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Deleting local config drive /var/lib/nova/instances/78949d28-3c77-4033-8f30-5d1a3802169b/disk.config because it was imported into RBD.#033[00m
Oct  2 08:22:25 np0005465988 kernel: tap96d7ab9f-38: entered promiscuous mode
Oct  2 08:22:25 np0005465988 NetworkManager[45041]: <info>  [1759407745.4976] manager: (tap96d7ab9f-38): new Tun device (/org/freedesktop/NetworkManager/Devices/175)
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:22:25Z|00353|binding|INFO|Claiming lport 96d7ab9f-3868-403b-8702-fba5dffc1c3d for this chassis.
Oct  2 08:22:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:22:25Z|00354|binding|INFO|96d7ab9f-3868-403b-8702-fba5dffc1c3d: Claiming fa:16:3e:2f:ef:21 10.100.0.12
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.514 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:ef:21 10.100.0.12'], port_security=['fa:16:3e:2f:ef:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '78949d28-3c77-4033-8f30-5d1a3802169b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81df0bd4-1de1-409c-8730-5d718bbb9ab0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '91d108e807094b0fa8e63a923d2269ee', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1f5f828a-9c59-4be3-90d2-ef448e431573', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=91243388-a708-4071-bc5f-90666534b8e0, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=96d7ab9f-3868-403b-8702-fba5dffc1c3d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.516 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 96d7ab9f-3868-403b-8702-fba5dffc1c3d in datapath 81df0bd4-1de1-409c-8730-5d718bbb9ab0 bound to our chassis#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.519 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81df0bd4-1de1-409c-8730-5d718bbb9ab0#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.537 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b568d31b-7d4e-4f4d-9909-a484a8e6568a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.538 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap81df0bd4-11 in ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:22:25 np0005465988 systemd-udevd[274514]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.542 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap81df0bd4-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.542 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[78539538-a205-484a-b03f-eaf820fe05f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.543 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2d06584f-4893-4e4a-8c95-8ddd2938b45d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 systemd-machined[192594]: New machine qemu-35-instance-0000005f.
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.556 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[5bec8aad-e3d4-453b-af9f-3f1c5baf2bd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 NetworkManager[45041]: <info>  [1759407745.5598] device (tap96d7ab9f-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:22:25 np0005465988 NetworkManager[45041]: <info>  [1759407745.5609] device (tap96d7ab9f-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:22:25 np0005465988 systemd[1]: Started Virtual Machine qemu-35-instance-0000005f.
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:22:25Z|00355|binding|INFO|Setting lport 96d7ab9f-3868-403b-8702-fba5dffc1c3d ovn-installed in OVS
Oct  2 08:22:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:22:25Z|00356|binding|INFO|Setting lport 96d7ab9f-3868-403b-8702-fba5dffc1c3d up in Southbound
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.585 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[56a7f3aa-27ca-44f0-8430-ef71d86223c2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.620 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9702c851-5f3c-47c8-9708-3eca785cd715]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.628 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[892110b2-c88e-483d-930e-c9c9978857c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 NetworkManager[45041]: <info>  [1759407745.6301] manager: (tap81df0bd4-10): new Veth device (/org/freedesktop/NetworkManager/Devices/176)
Oct  2 08:22:25 np0005465988 systemd-udevd[274518]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.663 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[47f301df-bf43-4494-b2a7-c99340d813fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.667 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ab0cf840-b0b2-4b22-a25e-903be3927ff2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 NetworkManager[45041]: <info>  [1759407745.6914] device (tap81df0bd4-10): carrier: link connected
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.696 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9800b41e-a839-4b05-8bbf-d7ac89d9a8f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.723 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8f244c38-c815-425d-bbd4-1ea27e7aaf02]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81df0bd4-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:05:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580602, 'reachable_time': 15774, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274547, 'error': None, 'target': 'ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.744 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[52086bd8-c3ed-49a7-b1d4-4e475500436f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:5cc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580602, 'tstamp': 580602}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274548, 'error': None, 'target': 'ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.773 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a5ca33bc-dbc0-42e4-811a-dde5c2ea33cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81df0bd4-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:05:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580602, 'reachable_time': 15774, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 274549, 'error': None, 'target': 'ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.820 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[96cd7f97-01a7-406c-834e-c461ea42e19c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.845 2 DEBUG nova.compute.manager [req-3ec117e1-ba16-4752-a03f-97407e3bc80f req-b65c1f9e-6d26-4c48-a69f-4008a044acfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Received event network-vif-plugged-96d7ab9f-3868-403b-8702-fba5dffc1c3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.846 2 DEBUG oslo_concurrency.lockutils [req-3ec117e1-ba16-4752-a03f-97407e3bc80f req-b65c1f9e-6d26-4c48-a69f-4008a044acfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.846 2 DEBUG oslo_concurrency.lockutils [req-3ec117e1-ba16-4752-a03f-97407e3bc80f req-b65c1f9e-6d26-4c48-a69f-4008a044acfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.847 2 DEBUG oslo_concurrency.lockutils [req-3ec117e1-ba16-4752-a03f-97407e3bc80f req-b65c1f9e-6d26-4c48-a69f-4008a044acfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.847 2 DEBUG nova.compute.manager [req-3ec117e1-ba16-4752-a03f-97407e3bc80f req-b65c1f9e-6d26-4c48-a69f-4008a044acfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Processing event network-vif-plugged-96d7ab9f-3868-403b-8702-fba5dffc1c3d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.923 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bea07db6-3612-4a35-af23-27fadee0cf61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.925 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81df0bd4-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.925 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.926 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81df0bd4-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:25 np0005465988 kernel: tap81df0bd4-10: entered promiscuous mode
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:25 np0005465988 NetworkManager[45041]: <info>  [1759407745.9289] manager: (tap81df0bd4-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.934 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81df0bd4-10, col_values=(('external_ids', {'iface-id': '158c9d13-b7ad-4d55-8f96-3c408ed5e2d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:22:25Z|00357|binding|INFO|Releasing lport 158c9d13-b7ad-4d55-8f96-3c408ed5e2d5 from this chassis (sb_readonly=0)
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:25 np0005465988 nova_compute[236126]: 2025-10-02 12:22:25.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.970 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/81df0bd4-1de1-409c-8730-5d718bbb9ab0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/81df0bd4-1de1-409c-8730-5d718bbb9ab0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.971 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1eeef69b-e518-4ee1-8de5-2680a11bde20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.972 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-81df0bd4-1de1-409c-8730-5d718bbb9ab0
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/81df0bd4-1de1-409c-8730-5d718bbb9ab0.pid.haproxy
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 81df0bd4-1de1-409c-8730-5d718bbb9ab0
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:22:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:25.973 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0', 'env', 'PROCESS_TAG=haproxy-81df0bd4-1de1-409c-8730-5d718bbb9ab0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/81df0bd4-1de1-409c-8730-5d718bbb9ab0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:26.049 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:22:26 np0005465988 podman[274623]: 2025-10-02 12:22:26.431160689 +0000 UTC m=+0.065535659 container create 06e5773a8533d2bc2df87716c77a2f293c6668f8b90f4918c6c741bbee965d83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:22:26 np0005465988 systemd[1]: Started libpod-conmon-06e5773a8533d2bc2df87716c77a2f293c6668f8b90f4918c6c741bbee965d83.scope.
Oct  2 08:22:26 np0005465988 podman[274623]: 2025-10-02 12:22:26.394548423 +0000 UTC m=+0.028923373 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:22:26 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:22:26 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ba1398d660db8730aa0fa94cdc0771ace0bd919b41b701ff71116880329fac6d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:22:26 np0005465988 podman[274623]: 2025-10-02 12:22:26.544856497 +0000 UTC m=+0.179231527 container init 06e5773a8533d2bc2df87716c77a2f293c6668f8b90f4918c6c741bbee965d83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:22:26 np0005465988 podman[274623]: 2025-10-02 12:22:26.550309616 +0000 UTC m=+0.184684586 container start 06e5773a8533d2bc2df87716c77a2f293c6668f8b90f4918c6c741bbee965d83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 08:22:26 np0005465988 neutron-haproxy-ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0[274639]: [NOTICE]   (274643) : New worker (274645) forked
Oct  2 08:22:26 np0005465988 neutron-haproxy-ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0[274639]: [NOTICE]   (274643) : Loading success.
Oct  2 08:22:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:26.629 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.774 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407746.7738535, 78949d28-3c77-4033-8f30-5d1a3802169b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.775 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] VM Started (Lifecycle Event)
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.778 2 DEBUG nova.compute.manager [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.787 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.791 2 INFO nova.virt.libvirt.driver [-] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Instance spawned successfully.
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.791 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.800 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.805 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.823 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.824 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.825 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.826 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.826 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.827 2 DEBUG nova.virt.libvirt.driver [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.833 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.833 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407746.7772672, 78949d28-3c77-4033-8f30-5d1a3802169b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.833 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] VM Paused (Lifecycle Event)
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.887 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.893 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407746.7867286, 78949d28-3c77-4033-8f30-5d1a3802169b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.893 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] VM Resumed (Lifecycle Event)
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.911 2 INFO nova.compute.manager [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Took 7.38 seconds to spawn the instance on the hypervisor.
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.911 2 DEBUG nova.compute.manager [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.916 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.926 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.956 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:22:26 np0005465988 nova_compute[236126]: 2025-10-02 12:22:26.991 2 INFO nova.compute.manager [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Took 8.74 seconds to build instance.
Oct  2 08:22:27 np0005465988 nova_compute[236126]: 2025-10-02 12:22:27.013 2 DEBUG oslo_concurrency.lockutils [None req-8696be7c-0aea-4749-a57d-fcc63d09ff90 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.881s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:22:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:27.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:27.352 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:22:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:27.353 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:22:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:27.354 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:22:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:22:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:27.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:27.631 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:22:27 np0005465988 nova_compute[236126]: 2025-10-02 12:22:27.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:28 np0005465988 nova_compute[236126]: 2025-10-02 12:22:28.043 2 DEBUG nova.compute.manager [req-52918d57-c8d0-4254-9aff-36d360973fb8 req-05586fa4-2cc3-4406-b169-f9bd54bbdc53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Received event network-vif-plugged-96d7ab9f-3868-403b-8702-fba5dffc1c3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:22:28 np0005465988 nova_compute[236126]: 2025-10-02 12:22:28.044 2 DEBUG oslo_concurrency.lockutils [req-52918d57-c8d0-4254-9aff-36d360973fb8 req-05586fa4-2cc3-4406-b169-f9bd54bbdc53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:22:28 np0005465988 nova_compute[236126]: 2025-10-02 12:22:28.045 2 DEBUG oslo_concurrency.lockutils [req-52918d57-c8d0-4254-9aff-36d360973fb8 req-05586fa4-2cc3-4406-b169-f9bd54bbdc53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:22:28 np0005465988 nova_compute[236126]: 2025-10-02 12:22:28.045 2 DEBUG oslo_concurrency.lockutils [req-52918d57-c8d0-4254-9aff-36d360973fb8 req-05586fa4-2cc3-4406-b169-f9bd54bbdc53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:22:28 np0005465988 nova_compute[236126]: 2025-10-02 12:22:28.045 2 DEBUG nova.compute.manager [req-52918d57-c8d0-4254-9aff-36d360973fb8 req-05586fa4-2cc3-4406-b169-f9bd54bbdc53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] No waiting events found dispatching network-vif-plugged-96d7ab9f-3868-403b-8702-fba5dffc1c3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:22:28 np0005465988 nova_compute[236126]: 2025-10-02 12:22:28.046 2 WARNING nova.compute.manager [req-52918d57-c8d0-4254-9aff-36d360973fb8 req-05586fa4-2cc3-4406-b169-f9bd54bbdc53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Received unexpected event network-vif-plugged-96d7ab9f-3868-403b-8702-fba5dffc1c3d for instance with vm_state active and task_state None.
Oct  2 08:22:29 np0005465988 nova_compute[236126]: 2025-10-02 12:22:29.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:29.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:29.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:31.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:31.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:22:32 np0005465988 nova_compute[236126]: 2025-10-02 12:22:32.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:33.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:33.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:34 np0005465988 nova_compute[236126]: 2025-10-02 12:22:34.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:35.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:35.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:37.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:22:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:37.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:37 np0005465988 nova_compute[236126]: 2025-10-02 12:22:37.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:38 np0005465988 podman[274662]: 2025-10-02 12:22:38.571015074 +0000 UTC m=+0.086673434 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2)
Oct  2 08:22:38 np0005465988 podman[274660]: 2025-10-02 12:22:38.59147751 +0000 UTC m=+0.120320423 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Oct  2 08:22:38 np0005465988 podman[274661]: 2025-10-02 12:22:38.60316056 +0000 UTC m=+0.121607741 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:22:39 np0005465988 nova_compute[236126]: 2025-10-02 12:22:39.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:39.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:39.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:39 np0005465988 ovn_controller[132601]: 2025-10-02T12:22:39Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2f:ef:21 10.100.0.12
Oct  2 08:22:39 np0005465988 ovn_controller[132601]: 2025-10-02T12:22:39Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2f:ef:21 10.100.0.12
Oct  2 08:22:39 np0005465988 nova_compute[236126]: 2025-10-02 12:22:39.876 2 DEBUG oslo_concurrency.lockutils [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Acquiring lock "78949d28-3c77-4033-8f30-5d1a3802169b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:22:39 np0005465988 nova_compute[236126]: 2025-10-02 12:22:39.876 2 DEBUG oslo_concurrency.lockutils [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:22:39 np0005465988 nova_compute[236126]: 2025-10-02 12:22:39.877 2 DEBUG oslo_concurrency.lockutils [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Acquiring lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:22:39 np0005465988 nova_compute[236126]: 2025-10-02 12:22:39.878 2 DEBUG oslo_concurrency.lockutils [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:22:39 np0005465988 nova_compute[236126]: 2025-10-02 12:22:39.878 2 DEBUG oslo_concurrency.lockutils [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:22:39 np0005465988 nova_compute[236126]: 2025-10-02 12:22:39.881 2 INFO nova.compute.manager [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Terminating instance
Oct  2 08:22:39 np0005465988 nova_compute[236126]: 2025-10-02 12:22:39.882 2 DEBUG nova.compute.manager [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:22:39 np0005465988 kernel: tap96d7ab9f-38 (unregistering): left promiscuous mode
Oct  2 08:22:39 np0005465988 NetworkManager[45041]: <info>  [1759407759.9520] device (tap96d7ab9f-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:22:39 np0005465988 ovn_controller[132601]: 2025-10-02T12:22:39Z|00358|binding|INFO|Releasing lport 96d7ab9f-3868-403b-8702-fba5dffc1c3d from this chassis (sb_readonly=0)
Oct  2 08:22:39 np0005465988 ovn_controller[132601]: 2025-10-02T12:22:39Z|00359|binding|INFO|Setting lport 96d7ab9f-3868-403b-8702-fba5dffc1c3d down in Southbound
Oct  2 08:22:39 np0005465988 ovn_controller[132601]: 2025-10-02T12:22:39Z|00360|binding|INFO|Removing iface tap96d7ab9f-38 ovn-installed in OVS
Oct  2 08:22:39 np0005465988 nova_compute[236126]: 2025-10-02 12:22:39.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:39.973 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:ef:21 10.100.0.12'], port_security=['fa:16:3e:2f:ef:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '78949d28-3c77-4033-8f30-5d1a3802169b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81df0bd4-1de1-409c-8730-5d718bbb9ab0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '91d108e807094b0fa8e63a923d2269ee', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1f5f828a-9c59-4be3-90d2-ef448e431573', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=91243388-a708-4071-bc5f-90666534b8e0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=96d7ab9f-3868-403b-8702-fba5dffc1c3d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:22:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:39.975 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 96d7ab9f-3868-403b-8702-fba5dffc1c3d in datapath 81df0bd4-1de1-409c-8730-5d718bbb9ab0 unbound from our chassis#033[00m
Oct  2 08:22:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:39.979 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 81df0bd4-1de1-409c-8730-5d718bbb9ab0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:22:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:39.981 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a6b8a372-89f6-455b-919e-640e9d062502]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:39.985 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0 namespace which is not needed anymore#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:40 np0005465988 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Oct  2 08:22:40 np0005465988 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000005f.scope: Consumed 13.364s CPU time.
Oct  2 08:22:40 np0005465988 systemd-machined[192594]: Machine qemu-35-instance-0000005f terminated.
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.126 2 INFO nova.virt.libvirt.driver [-] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Instance destroyed successfully.#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.127 2 DEBUG nova.objects.instance [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lazy-loading 'resources' on Instance uuid 78949d28-3c77-4033-8f30-5d1a3802169b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.142 2 DEBUG nova.virt.libvirt.vif [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:22:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-478094513',display_name='tempest-ListServersNegativeTestJSON-server-478094513-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-478094513-2',id=95,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-10-02T12:22:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='91d108e807094b0fa8e63a923d2269ee',ramdisk_id='',reservation_id='r-jgpyr0ft',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1913538533',owner_user_name='tempest-ListServersNegativeTestJSON-1913538533-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:22:26Z,user_data=None,user_id='93e805bcb0e047ca9d45c653f5ec913d',uuid=78949d28-3c77-4033-8f30-5d1a3802169b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "address": "fa:16:3e:2f:ef:21", "network": {"id": "81df0bd4-1de1-409c-8730-5d718bbb9ab0", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-681141919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91d108e807094b0fa8e63a923d2269ee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96d7ab9f-38", "ovs_interfaceid": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.143 2 DEBUG nova.network.os_vif_util [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Converting VIF {"id": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "address": "fa:16:3e:2f:ef:21", "network": {"id": "81df0bd4-1de1-409c-8730-5d718bbb9ab0", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-681141919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91d108e807094b0fa8e63a923d2269ee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96d7ab9f-38", "ovs_interfaceid": "96d7ab9f-3868-403b-8702-fba5dffc1c3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.144 2 DEBUG nova.network.os_vif_util [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:ef:21,bridge_name='br-int',has_traffic_filtering=True,id=96d7ab9f-3868-403b-8702-fba5dffc1c3d,network=Network(81df0bd4-1de1-409c-8730-5d718bbb9ab0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96d7ab9f-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.145 2 DEBUG os_vif [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:ef:21,bridge_name='br-int',has_traffic_filtering=True,id=96d7ab9f-3868-403b-8702-fba5dffc1c3d,network=Network(81df0bd4-1de1-409c-8730-5d718bbb9ab0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96d7ab9f-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.152 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap96d7ab9f-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.160 2 INFO os_vif [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:ef:21,bridge_name='br-int',has_traffic_filtering=True,id=96d7ab9f-3868-403b-8702-fba5dffc1c3d,network=Network(81df0bd4-1de1-409c-8730-5d718bbb9ab0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96d7ab9f-38')#033[00m
Oct  2 08:22:40 np0005465988 neutron-haproxy-ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0[274639]: [NOTICE]   (274643) : haproxy version is 2.8.14-c23fe91
Oct  2 08:22:40 np0005465988 neutron-haproxy-ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0[274639]: [NOTICE]   (274643) : path to executable is /usr/sbin/haproxy
Oct  2 08:22:40 np0005465988 neutron-haproxy-ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0[274639]: [ALERT]    (274643) : Current worker (274645) exited with code 143 (Terminated)
Oct  2 08:22:40 np0005465988 neutron-haproxy-ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0[274639]: [WARNING]  (274643) : All workers exited. Exiting... (0)
Oct  2 08:22:40 np0005465988 systemd[1]: libpod-06e5773a8533d2bc2df87716c77a2f293c6668f8b90f4918c6c741bbee965d83.scope: Deactivated successfully.
Oct  2 08:22:40 np0005465988 podman[274755]: 2025-10-02 12:22:40.208356062 +0000 UTC m=+0.078676551 container died 06e5773a8533d2bc2df87716c77a2f293c6668f8b90f4918c6c741bbee965d83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:22:40 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-06e5773a8533d2bc2df87716c77a2f293c6668f8b90f4918c6c741bbee965d83-userdata-shm.mount: Deactivated successfully.
Oct  2 08:22:40 np0005465988 systemd[1]: var-lib-containers-storage-overlay-ba1398d660db8730aa0fa94cdc0771ace0bd919b41b701ff71116880329fac6d-merged.mount: Deactivated successfully.
Oct  2 08:22:40 np0005465988 podman[274755]: 2025-10-02 12:22:40.257376928 +0000 UTC m=+0.127697167 container cleanup 06e5773a8533d2bc2df87716c77a2f293c6668f8b90f4918c6c741bbee965d83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:22:40 np0005465988 systemd[1]: libpod-conmon-06e5773a8533d2bc2df87716c77a2f293c6668f8b90f4918c6c741bbee965d83.scope: Deactivated successfully.
Oct  2 08:22:40 np0005465988 podman[274812]: 2025-10-02 12:22:40.33542017 +0000 UTC m=+0.048785271 container remove 06e5773a8533d2bc2df87716c77a2f293c6668f8b90f4918c6c741bbee965d83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:22:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:40.343 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4c413451-f567-4dda-9516-4d483f136301]: (4, ('Thu Oct  2 12:22:40 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0 (06e5773a8533d2bc2df87716c77a2f293c6668f8b90f4918c6c741bbee965d83)\n06e5773a8533d2bc2df87716c77a2f293c6668f8b90f4918c6c741bbee965d83\nThu Oct  2 12:22:40 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0 (06e5773a8533d2bc2df87716c77a2f293c6668f8b90f4918c6c741bbee965d83)\n06e5773a8533d2bc2df87716c77a2f293c6668f8b90f4918c6c741bbee965d83\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:40.346 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dd26bd3a-f7a2-4c8c-a50e-33bafd3ae9a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:40.347 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81df0bd4-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:40 np0005465988 kernel: tap81df0bd4-10: left promiscuous mode
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:40.375 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[784dc3dc-6904-4bef-8430-9e4f494a035a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:40.413 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[83dfdbba-8197-4995-8dd2-a962a7149c78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:40.415 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ea5f8549-424e-4322-ae89-6df15c77cbec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:40.436 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[01139648-dcf6-435e-9841-b1aeef4a191d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580594, 'reachable_time': 22212, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274829, 'error': None, 'target': 'ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:40.440 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-81df0bd4-1de1-409c-8730-5d718bbb9ab0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:22:40 np0005465988 systemd[1]: run-netns-ovnmeta\x2d81df0bd4\x2d1de1\x2d409c\x2d8730\x2d5d718bbb9ab0.mount: Deactivated successfully.
Oct  2 08:22:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:22:40.440 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[1e2dec48-c6d5-4816-ac95-61115f7b338e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.705 2 INFO nova.virt.libvirt.driver [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Deleting instance files /var/lib/nova/instances/78949d28-3c77-4033-8f30-5d1a3802169b_del#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.706 2 INFO nova.virt.libvirt.driver [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Deletion of /var/lib/nova/instances/78949d28-3c77-4033-8f30-5d1a3802169b_del complete#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.763 2 DEBUG nova.compute.manager [req-07c9d06a-caaf-46db-9b14-921ef32b9336 req-14845b93-d2e2-4055-956a-c939bad1339f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Received event network-vif-unplugged-96d7ab9f-3868-403b-8702-fba5dffc1c3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.764 2 DEBUG oslo_concurrency.lockutils [req-07c9d06a-caaf-46db-9b14-921ef32b9336 req-14845b93-d2e2-4055-956a-c939bad1339f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.764 2 DEBUG oslo_concurrency.lockutils [req-07c9d06a-caaf-46db-9b14-921ef32b9336 req-14845b93-d2e2-4055-956a-c939bad1339f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.765 2 DEBUG oslo_concurrency.lockutils [req-07c9d06a-caaf-46db-9b14-921ef32b9336 req-14845b93-d2e2-4055-956a-c939bad1339f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.765 2 DEBUG nova.compute.manager [req-07c9d06a-caaf-46db-9b14-921ef32b9336 req-14845b93-d2e2-4055-956a-c939bad1339f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] No waiting events found dispatching network-vif-unplugged-96d7ab9f-3868-403b-8702-fba5dffc1c3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.766 2 DEBUG nova.compute.manager [req-07c9d06a-caaf-46db-9b14-921ef32b9336 req-14845b93-d2e2-4055-956a-c939bad1339f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Received event network-vif-unplugged-96d7ab9f-3868-403b-8702-fba5dffc1c3d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.772 2 INFO nova.compute.manager [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Took 0.89 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.773 2 DEBUG oslo.service.loopingcall [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.774 2 DEBUG nova.compute.manager [-] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:22:40 np0005465988 nova_compute[236126]: 2025-10-02 12:22:40.774 2 DEBUG nova.network.neutron [-] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:22:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:41.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:41.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:41 np0005465988 nova_compute[236126]: 2025-10-02 12:22:41.767 2 DEBUG nova.network.neutron [-] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:41 np0005465988 nova_compute[236126]: 2025-10-02 12:22:41.788 2 INFO nova.compute.manager [-] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Took 1.01 seconds to deallocate network for instance.#033[00m
Oct  2 08:22:41 np0005465988 nova_compute[236126]: 2025-10-02 12:22:41.837 2 DEBUG oslo_concurrency.lockutils [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:41 np0005465988 nova_compute[236126]: 2025-10-02 12:22:41.838 2 DEBUG oslo_concurrency.lockutils [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:41 np0005465988 nova_compute[236126]: 2025-10-02 12:22:41.857 2 DEBUG nova.compute.manager [req-c8023b3c-f430-4077-ab99-9b97548b7546 req-614d7d2a-57a3-4a44-8e90-a93ba5607785 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Received event network-vif-deleted-96d7ab9f-3868-403b-8702-fba5dffc1c3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:41 np0005465988 nova_compute[236126]: 2025-10-02 12:22:41.933 2 DEBUG oslo_concurrency.processutils [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:22:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:22:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/514671987' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:22:42 np0005465988 nova_compute[236126]: 2025-10-02 12:22:42.418 2 DEBUG oslo_concurrency.processutils [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:42 np0005465988 nova_compute[236126]: 2025-10-02 12:22:42.426 2 DEBUG nova.compute.provider_tree [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:22:42 np0005465988 nova_compute[236126]: 2025-10-02 12:22:42.452 2 DEBUG nova.scheduler.client.report [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:22:42 np0005465988 nova_compute[236126]: 2025-10-02 12:22:42.473 2 DEBUG oslo_concurrency.lockutils [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:42 np0005465988 nova_compute[236126]: 2025-10-02 12:22:42.499 2 INFO nova.scheduler.client.report [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Deleted allocations for instance 78949d28-3c77-4033-8f30-5d1a3802169b#033[00m
Oct  2 08:22:42 np0005465988 nova_compute[236126]: 2025-10-02 12:22:42.598 2 DEBUG oslo_concurrency.lockutils [None req-7f8b8dfb-c6b1-4e0a-9501-007c58298231 93e805bcb0e047ca9d45c653f5ec913d 91d108e807094b0fa8e63a923d2269ee - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:42 np0005465988 nova_compute[236126]: 2025-10-02 12:22:42.952 2 DEBUG nova.compute.manager [req-e9686be9-b18b-46db-a892-aabfea2112b1 req-bee4a601-f1a3-405a-bf9e-e668bf79c17f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Received event network-vif-plugged-96d7ab9f-3868-403b-8702-fba5dffc1c3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:42 np0005465988 nova_compute[236126]: 2025-10-02 12:22:42.953 2 DEBUG oslo_concurrency.lockutils [req-e9686be9-b18b-46db-a892-aabfea2112b1 req-bee4a601-f1a3-405a-bf9e-e668bf79c17f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:42 np0005465988 nova_compute[236126]: 2025-10-02 12:22:42.953 2 DEBUG oslo_concurrency.lockutils [req-e9686be9-b18b-46db-a892-aabfea2112b1 req-bee4a601-f1a3-405a-bf9e-e668bf79c17f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:42 np0005465988 nova_compute[236126]: 2025-10-02 12:22:42.953 2 DEBUG oslo_concurrency.lockutils [req-e9686be9-b18b-46db-a892-aabfea2112b1 req-bee4a601-f1a3-405a-bf9e-e668bf79c17f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "78949d28-3c77-4033-8f30-5d1a3802169b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:42 np0005465988 nova_compute[236126]: 2025-10-02 12:22:42.954 2 DEBUG nova.compute.manager [req-e9686be9-b18b-46db-a892-aabfea2112b1 req-bee4a601-f1a3-405a-bf9e-e668bf79c17f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] No waiting events found dispatching network-vif-plugged-96d7ab9f-3868-403b-8702-fba5dffc1c3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:42 np0005465988 nova_compute[236126]: 2025-10-02 12:22:42.954 2 WARNING nova.compute.manager [req-e9686be9-b18b-46db-a892-aabfea2112b1 req-bee4a601-f1a3-405a-bf9e-e668bf79c17f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Received unexpected event network-vif-plugged-96d7ab9f-3868-403b-8702-fba5dffc1c3d for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:22:42 np0005465988 nova_compute[236126]: 2025-10-02 12:22:42.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:43.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:43.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:45 np0005465988 nova_compute[236126]: 2025-10-02 12:22:45.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:45.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:45.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:45 np0005465988 nova_compute[236126]: 2025-10-02 12:22:45.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:45 np0005465988 nova_compute[236126]: 2025-10-02 12:22:45.558 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:45 np0005465988 nova_compute[236126]: 2025-10-02 12:22:45.559 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:45 np0005465988 nova_compute[236126]: 2025-10-02 12:22:45.560 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:45 np0005465988 nova_compute[236126]: 2025-10-02 12:22:45.560 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:22:45 np0005465988 nova_compute[236126]: 2025-10-02 12:22:45.561 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:22:46 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4184840037' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:22:46 np0005465988 nova_compute[236126]: 2025-10-02 12:22:46.061 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:46 np0005465988 nova_compute[236126]: 2025-10-02 12:22:46.161 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000005a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:22:46 np0005465988 nova_compute[236126]: 2025-10-02 12:22:46.161 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000005a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:22:46 np0005465988 nova_compute[236126]: 2025-10-02 12:22:46.344 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:22:46 np0005465988 nova_compute[236126]: 2025-10-02 12:22:46.346 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4245MB free_disk=20.7677001953125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:22:46 np0005465988 nova_compute[236126]: 2025-10-02 12:22:46.346 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:46 np0005465988 nova_compute[236126]: 2025-10-02 12:22:46.347 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:46 np0005465988 nova_compute[236126]: 2025-10-02 12:22:46.436 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 3a4d32fc-bed8-4e11-9033-5b73501128fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:22:46 np0005465988 nova_compute[236126]: 2025-10-02 12:22:46.436 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:22:46 np0005465988 nova_compute[236126]: 2025-10-02 12:22:46.436 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:22:46 np0005465988 nova_compute[236126]: 2025-10-02 12:22:46.490 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:22:46 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3969393644' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:22:46 np0005465988 nova_compute[236126]: 2025-10-02 12:22:46.960 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:46 np0005465988 nova_compute[236126]: 2025-10-02 12:22:46.970 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:22:47 np0005465988 nova_compute[236126]: 2025-10-02 12:22:47.004 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:22:47 np0005465988 nova_compute[236126]: 2025-10-02 12:22:47.047 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:22:47 np0005465988 nova_compute[236126]: 2025-10-02 12:22:47.048 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:47.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:22:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:47.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:47 np0005465988 nova_compute[236126]: 2025-10-02 12:22:47.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:49 np0005465988 nova_compute[236126]: 2025-10-02 12:22:49.049 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:49.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:49.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:49 np0005465988 nova_compute[236126]: 2025-10-02 12:22:49.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:49 np0005465988 ovn_controller[132601]: 2025-10-02T12:22:49Z|00361|binding|INFO|Releasing lport 18276c7d-4e7d-4b5c-a013-87c3ea8e7868 from this chassis (sb_readonly=0)
Oct  2 08:22:49 np0005465988 nova_compute[236126]: 2025-10-02 12:22:49.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:50 np0005465988 nova_compute[236126]: 2025-10-02 12:22:50.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:50 np0005465988 nova_compute[236126]: 2025-10-02 12:22:50.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:50 np0005465988 nova_compute[236126]: 2025-10-02 12:22:50.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:50 np0005465988 nova_compute[236126]: 2025-10-02 12:22:50.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:22:50 np0005465988 podman[274953]: 2025-10-02 12:22:50.570921617 +0000 UTC m=+0.096617073 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:22:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:51.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:51.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:51 np0005465988 nova_compute[236126]: 2025-10-02 12:22:51.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:22:52 np0005465988 nova_compute[236126]: 2025-10-02 12:22:52.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:53 np0005465988 nova_compute[236126]: 2025-10-02 12:22:53.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:53.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:53.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:54 np0005465988 nova_compute[236126]: 2025-10-02 12:22:54.470 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:55 np0005465988 nova_compute[236126]: 2025-10-02 12:22:55.120 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407760.1199462, 78949d28-3c77-4033-8f30-5d1a3802169b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:22:55 np0005465988 nova_compute[236126]: 2025-10-02 12:22:55.121 2 INFO nova.compute.manager [-] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:22:55 np0005465988 nova_compute[236126]: 2025-10-02 12:22:55.138 2 DEBUG nova.compute.manager [None req-ef2ed0ab-2a76-4d7b-88e1-c7bf30575cf9 - - - - - -] [instance: 78949d28-3c77-4033-8f30-5d1a3802169b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:55 np0005465988 nova_compute[236126]: 2025-10-02 12:22:55.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:22:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:55.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:22:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:55.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000030s ======
Oct  2 08:22:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:57.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct  2 08:22:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:57.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:22:58 np0005465988 nova_compute[236126]: 2025-10-02 12:22:58.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:59 np0005465988 ceph-mds[84851]: mds.beacon.cephfs.compute-2.gpiyct missed beacon ack from the monitors
Oct  2 08:22:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:22:59.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:22:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:22:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:22:59.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:22:59 np0005465988 nova_compute[236126]: 2025-10-02 12:22:59.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:59 np0005465988 nova_compute[236126]: 2025-10-02 12:22:59.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:22:59 np0005465988 nova_compute[236126]: 2025-10-02 12:22:59.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:22:59 np0005465988 nova_compute[236126]: 2025-10-02 12:22:59.689 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:22:59 np0005465988 nova_compute[236126]: 2025-10-02 12:22:59.689 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:22:59 np0005465988 nova_compute[236126]: 2025-10-02 12:22:59.690 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:22:59 np0005465988 nova_compute[236126]: 2025-10-02 12:22:59.691 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:00 np0005465988 nova_compute[236126]: 2025-10-02 12:23:00.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:01 np0005465988 podman[275202]: 2025-10-02 12:23:01.225284757 +0000 UTC m=+0.117975025 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Oct  2 08:23:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:01.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:01.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:01 np0005465988 podman[275202]: 2025-10-02 12:23:01.442786667 +0000 UTC m=+0.335476895 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:23:01 np0005465988 nova_compute[236126]: 2025-10-02 12:23:01.643 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updating instance_info_cache with network_info: [{"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:01 np0005465988 nova_compute[236126]: 2025-10-02 12:23:01.663 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:01 np0005465988 nova_compute[236126]: 2025-10-02 12:23:01.663 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:23:02 np0005465988 podman[275337]: 2025-10-02 12:23:02.330907206 +0000 UTC m=+0.100898546 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 08:23:02 np0005465988 podman[275337]: 2025-10-02 12:23:02.342920336 +0000 UTC m=+0.112911656 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 08:23:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:23:02 np0005465988 podman[275403]: 2025-10-02 12:23:02.61034162 +0000 UTC m=+0.067665421 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, io.buildah.version=1.28.2, name=keepalived, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:23:02 np0005465988 podman[275403]: 2025-10-02 12:23:02.655769812 +0000 UTC m=+0.113093593 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, version=2.2.4, vcs-type=git, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., name=keepalived, build-date=2023-02-22T09:23:20, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793)
Oct  2 08:23:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:23:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:23:03 np0005465988 nova_compute[236126]: 2025-10-02 12:23:03.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:03.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:03.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:04 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:23:04 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:23:04 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:23:04 np0005465988 nova_compute[236126]: 2025-10-02 12:23:04.090 2 INFO nova.compute.manager [None req-37a3bdfc-f151-491b-879f-0f9b609e333f 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Pausing#033[00m
Oct  2 08:23:04 np0005465988 nova_compute[236126]: 2025-10-02 12:23:04.092 2 DEBUG nova.objects.instance [None req-37a3bdfc-f151-491b-879f-0f9b609e333f 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'flavor' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:04 np0005465988 nova_compute[236126]: 2025-10-02 12:23:04.141 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407784.140699, 3a4d32fc-bed8-4e11-9033-5b73501128fe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:04 np0005465988 nova_compute[236126]: 2025-10-02 12:23:04.141 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:23:04 np0005465988 nova_compute[236126]: 2025-10-02 12:23:04.145 2 DEBUG nova.compute.manager [None req-37a3bdfc-f151-491b-879f-0f9b609e333f 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:04 np0005465988 nova_compute[236126]: 2025-10-02 12:23:04.165 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:04 np0005465988 nova_compute[236126]: 2025-10-02 12:23:04.169 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:23:04 np0005465988 nova_compute[236126]: 2025-10-02 12:23:04.206 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Oct  2 08:23:05 np0005465988 nova_compute[236126]: 2025-10-02 12:23:05.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:05.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:05.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:23:05 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3539772317' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.345 2 INFO nova.compute.manager [None req-72594fe9-70ae-48f8-860f-394f78ee7598 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Unpausing#033[00m
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.346 2 DEBUG nova.objects.instance [None req-72594fe9-70ae-48f8-860f-394f78ee7598 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'flavor' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.386 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407786.3867297, 3a4d32fc-bed8-4e11-9033-5b73501128fe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.387 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:23:06 np0005465988 virtqemud[235689]: argument unsupported: QEMU guest agent is not configured
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.391 2 DEBUG nova.virt.libvirt.guest [None req-72594fe9-70ae-48f8-860f-394f78ee7598 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.392 2 DEBUG nova.compute.manager [None req-72594fe9-70ae-48f8-860f-394f78ee7598 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.411 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.415 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.436 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.553 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Acquiring lock "465fce36-cba2-4b45-b592-eeda70de3c2a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.554 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.572 2 DEBUG nova.compute.manager [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.643 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.643 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.651 2 DEBUG nova.virt.hardware [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:23:06 np0005465988 nova_compute[236126]: 2025-10-02 12:23:06.652 2 INFO nova.compute.claims [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:23:06 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Oct  2 08:23:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:06.994151) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:23:06 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Oct  2 08:23:06 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407786994265, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 1617, "num_deletes": 255, "total_data_size": 3350113, "memory_usage": 3393688, "flush_reason": "Manual Compaction"}
Oct  2 08:23:06 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407787006098, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 1425644, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41791, "largest_seqno": 43402, "table_properties": {"data_size": 1420038, "index_size": 2746, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 15248, "raw_average_key_size": 21, "raw_value_size": 1407664, "raw_average_value_size": 2005, "num_data_blocks": 119, "num_entries": 702, "num_filter_entries": 702, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759407667, "oldest_key_time": 1759407667, "file_creation_time": 1759407786, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 12000 microseconds, and 7554 cpu microseconds.
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.006166) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 1425644 bytes OK
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.006193) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.007514) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.007540) EVENT_LOG_v1 {"time_micros": 1759407787007531, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.007567) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 3342583, prev total WAL file size 3342583, number of live WAL files 2.
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.009258) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323536' seq:72057594037927935, type:22 .. '6D6772737461740031353038' seq:0, type:0; will stop at (end)
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(1392KB)], [78(11MB)]
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407787009304, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 13070078, "oldest_snapshot_seqno": -1}
Oct  2 08:23:07 np0005465988 nova_compute[236126]: 2025-10-02 12:23:07.019 2 DEBUG oslo_concurrency.processutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 6894 keys, 9977025 bytes, temperature: kUnknown
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407787079451, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 9977025, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9931320, "index_size": 27353, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17285, "raw_key_size": 176125, "raw_average_key_size": 25, "raw_value_size": 9808311, "raw_average_value_size": 1422, "num_data_blocks": 1092, "num_entries": 6894, "num_filter_entries": 6894, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759407787, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.079847) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 9977025 bytes
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.081347) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 185.9 rd, 141.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 11.1 +0.0 blob) out(9.5 +0.0 blob), read-write-amplify(16.2) write-amplify(7.0) OK, records in: 7371, records dropped: 477 output_compression: NoCompression
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.081398) EVENT_LOG_v1 {"time_micros": 1759407787081383, "job": 48, "event": "compaction_finished", "compaction_time_micros": 70297, "compaction_time_cpu_micros": 48372, "output_level": 6, "num_output_files": 1, "total_output_size": 9977025, "num_input_records": 7371, "num_output_records": 6894, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407787082063, "job": 48, "event": "table_file_deletion", "file_number": 80}
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407787086113, "job": 48, "event": "table_file_deletion", "file_number": 78}
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.009112) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.086248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.086258) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.086261) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.086263) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:07.086266) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:23:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:07.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:07.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:23:07 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/773209993' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:23:07 np0005465988 nova_compute[236126]: 2025-10-02 12:23:07.484 2 DEBUG oslo_concurrency.processutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:07 np0005465988 nova_compute[236126]: 2025-10-02 12:23:07.491 2 DEBUG nova.compute.provider_tree [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:23:07 np0005465988 nova_compute[236126]: 2025-10-02 12:23:07.693 2 DEBUG nova.scheduler.client.report [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:23:08 np0005465988 nova_compute[236126]: 2025-10-02 12:23:08.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:08 np0005465988 nova_compute[236126]: 2025-10-02 12:23:08.318 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:08 np0005465988 nova_compute[236126]: 2025-10-02 12:23:08.319 2 DEBUG nova.compute.manager [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:23:08 np0005465988 nova_compute[236126]: 2025-10-02 12:23:08.613 2 DEBUG nova.compute.manager [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:23:08 np0005465988 nova_compute[236126]: 2025-10-02 12:23:08.613 2 DEBUG nova.network.neutron [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:23:08 np0005465988 nova_compute[236126]: 2025-10-02 12:23:08.841 2 INFO nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.165 2 DEBUG nova.compute.manager [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.311 2 DEBUG nova.policy [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6e7ac8498cf5493d9eb7fd8747db6b07', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '17f64fa8d6e845999cf42a2e95664585', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:23:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:09.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.407 2 DEBUG nova.compute.manager [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.408 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.409 2 INFO nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Creating image(s)#033[00m
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.433 2 DEBUG nova.storage.rbd_utils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] rbd image 465fce36-cba2-4b45-b592-eeda70de3c2a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:09.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.474 2 DEBUG nova.storage.rbd_utils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] rbd image 465fce36-cba2-4b45-b592-eeda70de3c2a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.513 2 DEBUG nova.storage.rbd_utils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] rbd image 465fce36-cba2-4b45-b592-eeda70de3c2a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:09 np0005465988 podman[275634]: 2025-10-02 12:23:09.524098784 +0000 UTC m=+0.056705122 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:23:09 np0005465988 podman[275633]: 2025-10-02 12:23:09.530435378 +0000 UTC m=+0.066408334 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.531 2 DEBUG oslo_concurrency.processutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:09 np0005465988 podman[275625]: 2025-10-02 12:23:09.558482095 +0000 UTC m=+0.094455261 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.598 2 DEBUG oslo_concurrency.processutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.599 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.600 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.600 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.626 2 DEBUG nova.storage.rbd_utils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] rbd image 465fce36-cba2-4b45-b592-eeda70de3c2a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.632 2 DEBUG oslo_concurrency.processutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 465fce36-cba2-4b45-b592-eeda70de3c2a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:09 np0005465988 nova_compute[236126]: 2025-10-02 12:23:09.948 2 DEBUG oslo_concurrency.processutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 465fce36-cba2-4b45-b592-eeda70de3c2a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.316s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:10 np0005465988 nova_compute[236126]: 2025-10-02 12:23:10.019 2 DEBUG nova.storage.rbd_utils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] resizing rbd image 465fce36-cba2-4b45-b592-eeda70de3c2a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:23:10 np0005465988 nova_compute[236126]: 2025-10-02 12:23:10.141 2 DEBUG nova.network.neutron [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Successfully created port: af1fa88c-fdf1-43be-9ce6-bf553125c03b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:23:10 np0005465988 nova_compute[236126]: 2025-10-02 12:23:10.151 2 DEBUG nova.objects.instance [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lazy-loading 'migration_context' on Instance uuid 465fce36-cba2-4b45-b592-eeda70de3c2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:10 np0005465988 nova_compute[236126]: 2025-10-02 12:23:10.162 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:23:10 np0005465988 nova_compute[236126]: 2025-10-02 12:23:10.163 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Ensure instance console log exists: /var/lib/nova/instances/465fce36-cba2-4b45-b592-eeda70de3c2a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:23:10 np0005465988 nova_compute[236126]: 2025-10-02 12:23:10.163 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:10 np0005465988 nova_compute[236126]: 2025-10-02 12:23:10.164 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:10 np0005465988 nova_compute[236126]: 2025-10-02 12:23:10.164 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:10 np0005465988 nova_compute[236126]: 2025-10-02 12:23:10.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:23:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:23:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:11.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:11.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:11 np0005465988 nova_compute[236126]: 2025-10-02 12:23:11.686 2 DEBUG nova.network.neutron [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Successfully updated port: af1fa88c-fdf1-43be-9ce6-bf553125c03b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:23:11 np0005465988 nova_compute[236126]: 2025-10-02 12:23:11.727 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Acquiring lock "refresh_cache-465fce36-cba2-4b45-b592-eeda70de3c2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:11 np0005465988 nova_compute[236126]: 2025-10-02 12:23:11.728 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Acquired lock "refresh_cache-465fce36-cba2-4b45-b592-eeda70de3c2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:11 np0005465988 nova_compute[236126]: 2025-10-02 12:23:11.728 2 DEBUG nova.network.neutron [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:23:11 np0005465988 nova_compute[236126]: 2025-10-02 12:23:11.794 2 DEBUG nova.compute.manager [req-54db5045-b206-4ee8-b2d6-abc7ae678985 req-674efeec-42c8-4d35-beab-22319ec2fd57 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Received event network-changed-af1fa88c-fdf1-43be-9ce6-bf553125c03b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:11 np0005465988 nova_compute[236126]: 2025-10-02 12:23:11.794 2 DEBUG nova.compute.manager [req-54db5045-b206-4ee8-b2d6-abc7ae678985 req-674efeec-42c8-4d35-beab-22319ec2fd57 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Refreshing instance network info cache due to event network-changed-af1fa88c-fdf1-43be-9ce6-bf553125c03b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:23:11 np0005465988 nova_compute[236126]: 2025-10-02 12:23:11.795 2 DEBUG oslo_concurrency.lockutils [req-54db5045-b206-4ee8-b2d6-abc7ae678985 req-674efeec-42c8-4d35-beab-22319ec2fd57 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-465fce36-cba2-4b45-b592-eeda70de3c2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:12 np0005465988 nova_compute[236126]: 2025-10-02 12:23:12.413 2 DEBUG nova.network.neutron [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:23:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:13.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:13.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.645 2 DEBUG nova.network.neutron [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Updating instance_info_cache with network_info: [{"id": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "address": "fa:16:3e:2b:8a:cf", "network": {"id": "b5696736-5eda-4019-be30-cf82f91c84f3", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1165749578-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17f64fa8d6e845999cf42a2e95664585", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf1fa88c-fd", "ovs_interfaceid": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.671 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Releasing lock "refresh_cache-465fce36-cba2-4b45-b592-eeda70de3c2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.671 2 DEBUG nova.compute.manager [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Instance network_info: |[{"id": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "address": "fa:16:3e:2b:8a:cf", "network": {"id": "b5696736-5eda-4019-be30-cf82f91c84f3", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1165749578-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17f64fa8d6e845999cf42a2e95664585", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf1fa88c-fd", "ovs_interfaceid": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.672 2 DEBUG oslo_concurrency.lockutils [req-54db5045-b206-4ee8-b2d6-abc7ae678985 req-674efeec-42c8-4d35-beab-22319ec2fd57 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-465fce36-cba2-4b45-b592-eeda70de3c2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.672 2 DEBUG nova.network.neutron [req-54db5045-b206-4ee8-b2d6-abc7ae678985 req-674efeec-42c8-4d35-beab-22319ec2fd57 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Refreshing network info cache for port af1fa88c-fdf1-43be-9ce6-bf553125c03b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.675 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Start _get_guest_xml network_info=[{"id": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "address": "fa:16:3e:2b:8a:cf", "network": {"id": "b5696736-5eda-4019-be30-cf82f91c84f3", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1165749578-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17f64fa8d6e845999cf42a2e95664585", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf1fa88c-fd", "ovs_interfaceid": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.683 2 WARNING nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.696 2 DEBUG nova.virt.libvirt.host [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.696 2 DEBUG nova.virt.libvirt.host [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.701 2 DEBUG nova.virt.libvirt.host [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.701 2 DEBUG nova.virt.libvirt.host [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.702 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.702 2 DEBUG nova.virt.hardware [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.703 2 DEBUG nova.virt.hardware [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.703 2 DEBUG nova.virt.hardware [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.703 2 DEBUG nova.virt.hardware [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.703 2 DEBUG nova.virt.hardware [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.703 2 DEBUG nova.virt.hardware [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.704 2 DEBUG nova.virt.hardware [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.704 2 DEBUG nova.virt.hardware [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.704 2 DEBUG nova.virt.hardware [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.704 2 DEBUG nova.virt.hardware [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.704 2 DEBUG nova.virt.hardware [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:23:13 np0005465988 nova_compute[236126]: 2025-10-02 12:23:13.707 2 DEBUG oslo_concurrency.processutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:23:14 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3162333235' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.183 2 DEBUG oslo_concurrency.processutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.220 2 DEBUG nova.storage.rbd_utils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] rbd image 465fce36-cba2-4b45-b592-eeda70de3c2a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.226 2 DEBUG oslo_concurrency.processutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:23:14 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2882623300' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.713 2 DEBUG oslo_concurrency.processutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.715 2 DEBUG nova.virt.libvirt.vif [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:23:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-89446138',display_name='tempest-NoVNCConsoleTestJSON-server-89446138',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-89446138',id=98,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='17f64fa8d6e845999cf42a2e95664585',ramdisk_id='',reservation_id='r-juifw9ye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NoVNCConsoleTestJSON-1478352436',owner_user_name='tempest-NoVNCConsoleTestJSON-1478352436-pr
oject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:23:09Z,user_data=None,user_id='6e7ac8498cf5493d9eb7fd8747db6b07',uuid=465fce36-cba2-4b45-b592-eeda70de3c2a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "address": "fa:16:3e:2b:8a:cf", "network": {"id": "b5696736-5eda-4019-be30-cf82f91c84f3", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1165749578-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17f64fa8d6e845999cf42a2e95664585", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf1fa88c-fd", "ovs_interfaceid": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.716 2 DEBUG nova.network.os_vif_util [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Converting VIF {"id": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "address": "fa:16:3e:2b:8a:cf", "network": {"id": "b5696736-5eda-4019-be30-cf82f91c84f3", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1165749578-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17f64fa8d6e845999cf42a2e95664585", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf1fa88c-fd", "ovs_interfaceid": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.717 2 DEBUG nova.network.os_vif_util [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:8a:cf,bridge_name='br-int',has_traffic_filtering=True,id=af1fa88c-fdf1-43be-9ce6-bf553125c03b,network=Network(b5696736-5eda-4019-be30-cf82f91c84f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf1fa88c-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.718 2 DEBUG nova.objects.instance [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lazy-loading 'pci_devices' on Instance uuid 465fce36-cba2-4b45-b592-eeda70de3c2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.735 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  <uuid>465fce36-cba2-4b45-b592-eeda70de3c2a</uuid>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  <name>instance-00000062</name>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <nova:name>tempest-NoVNCConsoleTestJSON-server-89446138</nova:name>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:23:13</nova:creationTime>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <nova:user uuid="6e7ac8498cf5493d9eb7fd8747db6b07">tempest-NoVNCConsoleTestJSON-1478352436-project-member</nova:user>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <nova:project uuid="17f64fa8d6e845999cf42a2e95664585">tempest-NoVNCConsoleTestJSON-1478352436</nova:project>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <nova:port uuid="af1fa88c-fdf1-43be-9ce6-bf553125c03b">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <entry name="serial">465fce36-cba2-4b45-b592-eeda70de3c2a</entry>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <entry name="uuid">465fce36-cba2-4b45-b592-eeda70de3c2a</entry>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/465fce36-cba2-4b45-b592-eeda70de3c2a_disk">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/465fce36-cba2-4b45-b592-eeda70de3c2a_disk.config">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:2b:8a:cf"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <target dev="tapaf1fa88c-fd"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/465fce36-cba2-4b45-b592-eeda70de3c2a/console.log" append="off"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:23:14 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:23:14 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:23:14 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:23:14 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.737 2 DEBUG nova.compute.manager [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Preparing to wait for external event network-vif-plugged-af1fa88c-fdf1-43be-9ce6-bf553125c03b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.737 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Acquiring lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.737 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.738 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.738 2 DEBUG nova.virt.libvirt.vif [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:23:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-89446138',display_name='tempest-NoVNCConsoleTestJSON-server-89446138',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-89446138',id=98,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='17f64fa8d6e845999cf42a2e95664585',ramdisk_id='',reservation_id='r-juifw9ye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NoVNCConsoleTestJSON-1478352436',owner_user_name='tempest-NoVNCConsoleTestJSON-147
8352436-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:23:09Z,user_data=None,user_id='6e7ac8498cf5493d9eb7fd8747db6b07',uuid=465fce36-cba2-4b45-b592-eeda70de3c2a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "address": "fa:16:3e:2b:8a:cf", "network": {"id": "b5696736-5eda-4019-be30-cf82f91c84f3", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1165749578-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17f64fa8d6e845999cf42a2e95664585", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf1fa88c-fd", "ovs_interfaceid": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.739 2 DEBUG nova.network.os_vif_util [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Converting VIF {"id": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "address": "fa:16:3e:2b:8a:cf", "network": {"id": "b5696736-5eda-4019-be30-cf82f91c84f3", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1165749578-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17f64fa8d6e845999cf42a2e95664585", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf1fa88c-fd", "ovs_interfaceid": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.739 2 DEBUG nova.network.os_vif_util [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:8a:cf,bridge_name='br-int',has_traffic_filtering=True,id=af1fa88c-fdf1-43be-9ce6-bf553125c03b,network=Network(b5696736-5eda-4019-be30-cf82f91c84f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf1fa88c-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.739 2 DEBUG os_vif [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:8a:cf,bridge_name='br-int',has_traffic_filtering=True,id=af1fa88c-fdf1-43be-9ce6-bf553125c03b,network=Network(b5696736-5eda-4019-be30-cf82f91c84f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf1fa88c-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.740 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.741 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.744 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf1fa88c-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.744 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaf1fa88c-fd, col_values=(('external_ids', {'iface-id': 'af1fa88c-fdf1-43be-9ce6-bf553125c03b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:8a:cf', 'vm-uuid': '465fce36-cba2-4b45-b592-eeda70de3c2a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:14 np0005465988 NetworkManager[45041]: <info>  [1759407794.7471] manager: (tapaf1fa88c-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/178)
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.752 2 INFO os_vif [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:8a:cf,bridge_name='br-int',has_traffic_filtering=True,id=af1fa88c-fdf1-43be-9ce6-bf553125c03b,network=Network(b5696736-5eda-4019-be30-cf82f91c84f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf1fa88c-fd')#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.806 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.807 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.807 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] No VIF found with MAC fa:16:3e:2b:8a:cf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.808 2 INFO nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Using config drive#033[00m
Oct  2 08:23:14 np0005465988 nova_compute[236126]: 2025-10-02 12:23:14.879 2 DEBUG nova.storage.rbd_utils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] rbd image 465fce36-cba2-4b45-b592-eeda70de3c2a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:15 np0005465988 nova_compute[236126]: 2025-10-02 12:23:15.289 2 DEBUG nova.network.neutron [req-54db5045-b206-4ee8-b2d6-abc7ae678985 req-674efeec-42c8-4d35-beab-22319ec2fd57 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Updated VIF entry in instance network info cache for port af1fa88c-fdf1-43be-9ce6-bf553125c03b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:23:15 np0005465988 nova_compute[236126]: 2025-10-02 12:23:15.290 2 DEBUG nova.network.neutron [req-54db5045-b206-4ee8-b2d6-abc7ae678985 req-674efeec-42c8-4d35-beab-22319ec2fd57 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Updating instance_info_cache with network_info: [{"id": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "address": "fa:16:3e:2b:8a:cf", "network": {"id": "b5696736-5eda-4019-be30-cf82f91c84f3", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1165749578-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17f64fa8d6e845999cf42a2e95664585", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf1fa88c-fd", "ovs_interfaceid": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:15 np0005465988 nova_compute[236126]: 2025-10-02 12:23:15.305 2 DEBUG oslo_concurrency.lockutils [req-54db5045-b206-4ee8-b2d6-abc7ae678985 req-674efeec-42c8-4d35-beab-22319ec2fd57 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-465fce36-cba2-4b45-b592-eeda70de3c2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:15.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:15 np0005465988 nova_compute[236126]: 2025-10-02 12:23:15.433 2 INFO nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Creating config drive at /var/lib/nova/instances/465fce36-cba2-4b45-b592-eeda70de3c2a/disk.config#033[00m
Oct  2 08:23:15 np0005465988 nova_compute[236126]: 2025-10-02 12:23:15.439 2 DEBUG oslo_concurrency.processutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/465fce36-cba2-4b45-b592-eeda70de3c2a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2z7uhodq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:15.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:15 np0005465988 nova_compute[236126]: 2025-10-02 12:23:15.589 2 DEBUG oslo_concurrency.processutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/465fce36-cba2-4b45-b592-eeda70de3c2a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2z7uhodq" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:15 np0005465988 nova_compute[236126]: 2025-10-02 12:23:15.624 2 DEBUG nova.storage.rbd_utils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] rbd image 465fce36-cba2-4b45-b592-eeda70de3c2a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:15 np0005465988 nova_compute[236126]: 2025-10-02 12:23:15.629 2 DEBUG oslo_concurrency.processutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/465fce36-cba2-4b45-b592-eeda70de3c2a/disk.config 465fce36-cba2-4b45-b592-eeda70de3c2a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:15 np0005465988 nova_compute[236126]: 2025-10-02 12:23:15.827 2 DEBUG oslo_concurrency.processutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/465fce36-cba2-4b45-b592-eeda70de3c2a/disk.config 465fce36-cba2-4b45-b592-eeda70de3c2a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.198s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:15 np0005465988 nova_compute[236126]: 2025-10-02 12:23:15.829 2 INFO nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Deleting local config drive /var/lib/nova/instances/465fce36-cba2-4b45-b592-eeda70de3c2a/disk.config because it was imported into RBD.#033[00m
Oct  2 08:23:15 np0005465988 kernel: tapaf1fa88c-fd: entered promiscuous mode
Oct  2 08:23:15 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:15Z|00362|binding|INFO|Claiming lport af1fa88c-fdf1-43be-9ce6-bf553125c03b for this chassis.
Oct  2 08:23:15 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:15Z|00363|binding|INFO|af1fa88c-fdf1-43be-9ce6-bf553125c03b: Claiming fa:16:3e:2b:8a:cf 10.100.0.4
Oct  2 08:23:15 np0005465988 NetworkManager[45041]: <info>  [1759407795.9021] manager: (tapaf1fa88c-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/179)
Oct  2 08:23:15 np0005465988 nova_compute[236126]: 2025-10-02 12:23:15.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:15.918 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:8a:cf 10.100.0.4'], port_security=['fa:16:3e:2b:8a:cf 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '465fce36-cba2-4b45-b592-eeda70de3c2a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5696736-5eda-4019-be30-cf82f91c84f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '17f64fa8d6e845999cf42a2e95664585', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0429b06b-1d2e-4b01-a44b-a57901df7bb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da4d4ac2-0a58-44dc-9263-1d5627d0b57b, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=af1fa88c-fdf1-43be-9ce6-bf553125c03b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:23:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:15.919 142124 INFO neutron.agent.ovn.metadata.agent [-] Port af1fa88c-fdf1-43be-9ce6-bf553125c03b in datapath b5696736-5eda-4019-be30-cf82f91c84f3 bound to our chassis#033[00m
Oct  2 08:23:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:15.920 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5696736-5eda-4019-be30-cf82f91c84f3#033[00m
Oct  2 08:23:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:15.935 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5d2005a1-a59a-4196-8848-58d0ecbf9dc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:15.936 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb5696736-51 in ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:23:15 np0005465988 systemd-udevd[276021]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:23:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:15.939 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb5696736-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:23:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:15.939 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e15c0063-ce9c-4b67-a62c-b36e1bfa7b11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:15.940 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[228730ec-79d2-4b95-b4a7-dcb3efb3e03d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:15 np0005465988 systemd-machined[192594]: New machine qemu-36-instance-00000062.
Oct  2 08:23:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:15.955 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[ec56388c-9ece-4705-9607-4fe7727e0f8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:15 np0005465988 NetworkManager[45041]: <info>  [1759407795.9591] device (tapaf1fa88c-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:23:15 np0005465988 NetworkManager[45041]: <info>  [1759407795.9600] device (tapaf1fa88c-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:23:15 np0005465988 systemd[1]: Started Virtual Machine qemu-36-instance-00000062.
Oct  2 08:23:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:15.981 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[974ff06b-162f-49e1-ac1d-2f02f57cffda]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:15 np0005465988 nova_compute[236126]: 2025-10-02 12:23:15.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:15 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:15Z|00364|binding|INFO|Setting lport af1fa88c-fdf1-43be-9ce6-bf553125c03b ovn-installed in OVS
Oct  2 08:23:15 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:15Z|00365|binding|INFO|Setting lport af1fa88c-fdf1-43be-9ce6-bf553125c03b up in Southbound
Oct  2 08:23:15 np0005465988 nova_compute[236126]: 2025-10-02 12:23:15.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.010 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ba012f8f-50d4-4429-815f-e404334615e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:16 np0005465988 NetworkManager[45041]: <info>  [1759407796.0155] manager: (tapb5696736-50): new Veth device (/org/freedesktop/NetworkManager/Devices/180)
Oct  2 08:23:16 np0005465988 systemd-udevd[276026]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.016 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[60415cba-3bd4-45d0-99f8-96bd8d66a38a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.054 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9ad7f5ee-267e-4347-9717-30263fa7cd7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.058 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[748546d0-d408-4b00-a54d-4e43133f9df4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:16 np0005465988 NetworkManager[45041]: <info>  [1759407796.0842] device (tapb5696736-50): carrier: link connected
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.090 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc5b80f-3e6c-4e6b-ab39-a7b3998976c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.108 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6b8d566f-ce3a-4cee-8e6a-d68fd25fcb41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5696736-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:b2:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 111], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585641, 'reachable_time': 40478, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276054, 'error': None, 'target': 'ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.122 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d5d673-29d8-4f2d-a320-ac3346b10622]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:b2c3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 585641, 'tstamp': 585641}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 276055, 'error': None, 'target': 'ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.138 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[116f2132-aff9-4858-b00a-b2d3cb008a63]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5696736-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:b2:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 111], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585641, 'reachable_time': 40478, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 276056, 'error': None, 'target': 'ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.169 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fabe4304-d8f5-4c42-a844-e8034b8f0aa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.222 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a7057a-8077-487e-b446-8106c7be97f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.224 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5696736-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.224 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.224 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5696736-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:16 np0005465988 nova_compute[236126]: 2025-10-02 12:23:16.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:16 np0005465988 NetworkManager[45041]: <info>  [1759407796.2272] manager: (tapb5696736-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/181)
Oct  2 08:23:16 np0005465988 kernel: tapb5696736-50: entered promiscuous mode
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.228 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5696736-50, col_values=(('external_ids', {'iface-id': 'cee4f99b-37b4-48d8-ad53-791ca401d63b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:16 np0005465988 nova_compute[236126]: 2025-10-02 12:23:16.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:16Z|00366|binding|INFO|Releasing lport cee4f99b-37b4-48d8-ad53-791ca401d63b from this chassis (sb_readonly=0)
Oct  2 08:23:16 np0005465988 nova_compute[236126]: 2025-10-02 12:23:16.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.246 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5696736-5eda-4019-be30-cf82f91c84f3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5696736-5eda-4019-be30-cf82f91c84f3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.247 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[34090c9f-8450-45c7-a884-50ab6048e959]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.247 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-b5696736-5eda-4019-be30-cf82f91c84f3
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/b5696736-5eda-4019-be30-cf82f91c84f3.pid.haproxy
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID b5696736-5eda-4019-be30-cf82f91c84f3
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:23:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:16.248 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3', 'env', 'PROCESS_TAG=haproxy-b5696736-5eda-4019-be30-cf82f91c84f3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b5696736-5eda-4019-be30-cf82f91c84f3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:23:16 np0005465988 podman[276130]: 2025-10-02 12:23:16.710224365 +0000 UTC m=+0.075768826 container create 36bf4d033f844213a055b223413f0b06923e1a8c8f96ac4696094b5d2bc0c75b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:23:16 np0005465988 systemd[1]: Started libpod-conmon-36bf4d033f844213a055b223413f0b06923e1a8c8f96ac4696094b5d2bc0c75b.scope.
Oct  2 08:23:16 np0005465988 podman[276130]: 2025-10-02 12:23:16.682756616 +0000 UTC m=+0.048301077 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:23:16 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:23:16 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8d394b9a4151e262c5269de516e97bb4428f3fbe376f872c4a9a5b286816be5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:23:16 np0005465988 nova_compute[236126]: 2025-10-02 12:23:16.801 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407796.7998824, 465fce36-cba2-4b45-b592-eeda70de3c2a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:16 np0005465988 nova_compute[236126]: 2025-10-02 12:23:16.802 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] VM Started (Lifecycle Event)#033[00m
Oct  2 08:23:16 np0005465988 podman[276130]: 2025-10-02 12:23:16.808227558 +0000 UTC m=+0.173772099 container init 36bf4d033f844213a055b223413f0b06923e1a8c8f96ac4696094b5d2bc0c75b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:23:16 np0005465988 podman[276130]: 2025-10-02 12:23:16.814543651 +0000 UTC m=+0.180088132 container start 36bf4d033f844213a055b223413f0b06923e1a8c8f96ac4696094b5d2bc0c75b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:23:16 np0005465988 nova_compute[236126]: 2025-10-02 12:23:16.826 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:16 np0005465988 nova_compute[236126]: 2025-10-02 12:23:16.831 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407796.800159, 465fce36-cba2-4b45-b592-eeda70de3c2a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:16 np0005465988 nova_compute[236126]: 2025-10-02 12:23:16.831 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:23:16 np0005465988 neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3[276145]: [NOTICE]   (276149) : New worker (276151) forked
Oct  2 08:23:16 np0005465988 neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3[276145]: [NOTICE]   (276149) : Loading success.
Oct  2 08:23:16 np0005465988 nova_compute[236126]: 2025-10-02 12:23:16.851 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:16 np0005465988 nova_compute[236126]: 2025-10-02 12:23:16.855 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:23:16 np0005465988 nova_compute[236126]: 2025-10-02 12:23:16.879 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:23:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:17.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:23:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:17.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.820 2 DEBUG nova.compute.manager [req-115a234b-dce9-4ee4-88d9-986ce821a676 req-fa5583c8-43da-4a9f-95ab-e2e2dca797f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Received event network-vif-plugged-af1fa88c-fdf1-43be-9ce6-bf553125c03b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.821 2 DEBUG oslo_concurrency.lockutils [req-115a234b-dce9-4ee4-88d9-986ce821a676 req-fa5583c8-43da-4a9f-95ab-e2e2dca797f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.822 2 DEBUG oslo_concurrency.lockutils [req-115a234b-dce9-4ee4-88d9-986ce821a676 req-fa5583c8-43da-4a9f-95ab-e2e2dca797f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.823 2 DEBUG oslo_concurrency.lockutils [req-115a234b-dce9-4ee4-88d9-986ce821a676 req-fa5583c8-43da-4a9f-95ab-e2e2dca797f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.823 2 DEBUG nova.compute.manager [req-115a234b-dce9-4ee4-88d9-986ce821a676 req-fa5583c8-43da-4a9f-95ab-e2e2dca797f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Processing event network-vif-plugged-af1fa88c-fdf1-43be-9ce6-bf553125c03b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.823 2 DEBUG nova.compute.manager [req-115a234b-dce9-4ee4-88d9-986ce821a676 req-fa5583c8-43da-4a9f-95ab-e2e2dca797f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Received event network-vif-plugged-af1fa88c-fdf1-43be-9ce6-bf553125c03b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.824 2 DEBUG oslo_concurrency.lockutils [req-115a234b-dce9-4ee4-88d9-986ce821a676 req-fa5583c8-43da-4a9f-95ab-e2e2dca797f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.824 2 DEBUG oslo_concurrency.lockutils [req-115a234b-dce9-4ee4-88d9-986ce821a676 req-fa5583c8-43da-4a9f-95ab-e2e2dca797f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.824 2 DEBUG oslo_concurrency.lockutils [req-115a234b-dce9-4ee4-88d9-986ce821a676 req-fa5583c8-43da-4a9f-95ab-e2e2dca797f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.824 2 DEBUG nova.compute.manager [req-115a234b-dce9-4ee4-88d9-986ce821a676 req-fa5583c8-43da-4a9f-95ab-e2e2dca797f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] No waiting events found dispatching network-vif-plugged-af1fa88c-fdf1-43be-9ce6-bf553125c03b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.825 2 WARNING nova.compute.manager [req-115a234b-dce9-4ee4-88d9-986ce821a676 req-fa5583c8-43da-4a9f-95ab-e2e2dca797f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Received unexpected event network-vif-plugged-af1fa88c-fdf1-43be-9ce6-bf553125c03b for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.826 2 DEBUG nova.compute.manager [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.830 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407797.8301418, 465fce36-cba2-4b45-b592-eeda70de3c2a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.830 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.832 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.837 2 INFO nova.virt.libvirt.driver [-] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Instance spawned successfully.#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.838 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.872 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.877 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.886 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.887 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.887 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.888 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.888 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.889 2 DEBUG nova.virt.libvirt.driver [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.920 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.954 2 INFO nova.compute.manager [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Took 8.55 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:23:17 np0005465988 nova_compute[236126]: 2025-10-02 12:23:17.955 2 DEBUG nova.compute.manager [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:18 np0005465988 nova_compute[236126]: 2025-10-02 12:23:18.020 2 INFO nova.compute.manager [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Took 11.40 seconds to build instance.#033[00m
Oct  2 08:23:18 np0005465988 nova_compute[236126]: 2025-10-02 12:23:18.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:18 np0005465988 nova_compute[236126]: 2025-10-02 12:23:18.048 2 DEBUG oslo_concurrency.lockutils [None req-f51282b1-01ad-4a3f-ab6f-660abf95680b 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.494s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:18 np0005465988 nova_compute[236126]: 2025-10-02 12:23:18.871 2 DEBUG nova.compute.manager [None req-ac3cc270-5b3b-4e8e-a3ef-af48bce87ff2 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196#033[00m
Oct  2 08:23:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:19.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:19.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:19 np0005465988 nova_compute[236126]: 2025-10-02 12:23:19.523 2 DEBUG nova.compute.manager [None req-4fa13eb9-6257-4b82-842f-42bba2d291a7 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196#033[00m
Oct  2 08:23:19 np0005465988 nova_compute[236126]: 2025-10-02 12:23:19.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:23:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1346144995' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:23:19 np0005465988 nova_compute[236126]: 2025-10-02 12:23:19.961 2 DEBUG oslo_concurrency.lockutils [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Acquiring lock "465fce36-cba2-4b45-b592-eeda70de3c2a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:19 np0005465988 nova_compute[236126]: 2025-10-02 12:23:19.962 2 DEBUG oslo_concurrency.lockutils [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:19 np0005465988 nova_compute[236126]: 2025-10-02 12:23:19.963 2 DEBUG oslo_concurrency.lockutils [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Acquiring lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:19 np0005465988 nova_compute[236126]: 2025-10-02 12:23:19.964 2 DEBUG oslo_concurrency.lockutils [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:19 np0005465988 nova_compute[236126]: 2025-10-02 12:23:19.965 2 DEBUG oslo_concurrency.lockutils [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:19 np0005465988 nova_compute[236126]: 2025-10-02 12:23:19.967 2 INFO nova.compute.manager [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Terminating instance#033[00m
Oct  2 08:23:19 np0005465988 nova_compute[236126]: 2025-10-02 12:23:19.969 2 DEBUG nova.compute.manager [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:23:20 np0005465988 kernel: tapaf1fa88c-fd (unregistering): left promiscuous mode
Oct  2 08:23:20 np0005465988 NetworkManager[45041]: <info>  [1759407800.0501] device (tapaf1fa88c-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:23:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:20Z|00367|binding|INFO|Releasing lport af1fa88c-fdf1-43be-9ce6-bf553125c03b from this chassis (sb_readonly=0)
Oct  2 08:23:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:20Z|00368|binding|INFO|Setting lport af1fa88c-fdf1-43be-9ce6-bf553125c03b down in Southbound
Oct  2 08:23:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:20Z|00369|binding|INFO|Removing iface tapaf1fa88c-fd ovn-installed in OVS
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:20 np0005465988 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000062.scope: Deactivated successfully.
Oct  2 08:23:20 np0005465988 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000062.scope: Consumed 2.994s CPU time.
Oct  2 08:23:20 np0005465988 systemd-machined[192594]: Machine qemu-36-instance-00000062 terminated.
Oct  2 08:23:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:20.172 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:8a:cf 10.100.0.4'], port_security=['fa:16:3e:2b:8a:cf 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '465fce36-cba2-4b45-b592-eeda70de3c2a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5696736-5eda-4019-be30-cf82f91c84f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '17f64fa8d6e845999cf42a2e95664585', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0429b06b-1d2e-4b01-a44b-a57901df7bb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da4d4ac2-0a58-44dc-9263-1d5627d0b57b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=af1fa88c-fdf1-43be-9ce6-bf553125c03b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:23:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:20.174 142124 INFO neutron.agent.ovn.metadata.agent [-] Port af1fa88c-fdf1-43be-9ce6-bf553125c03b in datapath b5696736-5eda-4019-be30-cf82f91c84f3 unbound from our chassis#033[00m
Oct  2 08:23:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:20.178 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5696736-5eda-4019-be30-cf82f91c84f3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:23:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:20.179 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7b9cf147-2c89-4fe6-bef8-48efd7253426]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:20.180 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3 namespace which is not needed anymore#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.214 2 INFO nova.virt.libvirt.driver [-] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Instance destroyed successfully.#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.214 2 DEBUG nova.objects.instance [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lazy-loading 'resources' on Instance uuid 465fce36-cba2-4b45-b592-eeda70de3c2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.231 2 DEBUG nova.virt.libvirt.vif [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-89446138',display_name='tempest-NoVNCConsoleTestJSON-server-89446138',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-89446138',id=98,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:23:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='17f64fa8d6e845999cf42a2e95664585',ramdisk_id='',reservation_id='r-juifw9ye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NoVNCConsoleTestJSON-1478352436',owner_user_name='tempest-NoVNCConsoleTestJSON-1478352436-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:23:17Z,user_data=None,user_id='6e7ac8498cf5493d9eb7fd8747db6b07',uuid=465fce36-cba2-4b45-b592-eeda70de3c2a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "address": "fa:16:3e:2b:8a:cf", "network": {"id": "b5696736-5eda-4019-be30-cf82f91c84f3", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1165749578-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17f64fa8d6e845999cf42a2e95664585", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf1fa88c-fd", "ovs_interfaceid": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.231 2 DEBUG nova.network.os_vif_util [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Converting VIF {"id": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "address": "fa:16:3e:2b:8a:cf", "network": {"id": "b5696736-5eda-4019-be30-cf82f91c84f3", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1165749578-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17f64fa8d6e845999cf42a2e95664585", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf1fa88c-fd", "ovs_interfaceid": "af1fa88c-fdf1-43be-9ce6-bf553125c03b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.233 2 DEBUG nova.network.os_vif_util [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:8a:cf,bridge_name='br-int',has_traffic_filtering=True,id=af1fa88c-fdf1-43be-9ce6-bf553125c03b,network=Network(b5696736-5eda-4019-be30-cf82f91c84f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf1fa88c-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.233 2 DEBUG os_vif [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:8a:cf,bridge_name='br-int',has_traffic_filtering=True,id=af1fa88c-fdf1-43be-9ce6-bf553125c03b,network=Network(b5696736-5eda-4019-be30-cf82f91c84f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf1fa88c-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.236 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf1fa88c-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.248 2 INFO os_vif [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:8a:cf,bridge_name='br-int',has_traffic_filtering=True,id=af1fa88c-fdf1-43be-9ce6-bf553125c03b,network=Network(b5696736-5eda-4019-be30-cf82f91c84f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf1fa88c-fd')#033[00m
Oct  2 08:23:20 np0005465988 neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3[276145]: [NOTICE]   (276149) : haproxy version is 2.8.14-c23fe91
Oct  2 08:23:20 np0005465988 neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3[276145]: [NOTICE]   (276149) : path to executable is /usr/sbin/haproxy
Oct  2 08:23:20 np0005465988 neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3[276145]: [WARNING]  (276149) : Exiting Master process...
Oct  2 08:23:20 np0005465988 neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3[276145]: [ALERT]    (276149) : Current worker (276151) exited with code 143 (Terminated)
Oct  2 08:23:20 np0005465988 neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3[276145]: [WARNING]  (276149) : All workers exited. Exiting... (0)
Oct  2 08:23:20 np0005465988 systemd[1]: libpod-36bf4d033f844213a055b223413f0b06923e1a8c8f96ac4696094b5d2bc0c75b.scope: Deactivated successfully.
Oct  2 08:23:20 np0005465988 podman[276211]: 2025-10-02 12:23:20.426248985 +0000 UTC m=+0.073022567 container died 36bf4d033f844213a055b223413f0b06923e1a8c8f96ac4696094b5d2bc0c75b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:23:20 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36bf4d033f844213a055b223413f0b06923e1a8c8f96ac4696094b5d2bc0c75b-userdata-shm.mount: Deactivated successfully.
Oct  2 08:23:20 np0005465988 systemd[1]: var-lib-containers-storage-overlay-b8d394b9a4151e262c5269de516e97bb4428f3fbe376f872c4a9a5b286816be5-merged.mount: Deactivated successfully.
Oct  2 08:23:20 np0005465988 podman[276211]: 2025-10-02 12:23:20.470725669 +0000 UTC m=+0.117499231 container cleanup 36bf4d033f844213a055b223413f0b06923e1a8c8f96ac4696094b5d2bc0c75b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:23:20 np0005465988 systemd[1]: libpod-conmon-36bf4d033f844213a055b223413f0b06923e1a8c8f96ac4696094b5d2bc0c75b.scope: Deactivated successfully.
Oct  2 08:23:20 np0005465988 podman[276241]: 2025-10-02 12:23:20.579966289 +0000 UTC m=+0.069228076 container remove 36bf4d033f844213a055b223413f0b06923e1a8c8f96ac4696094b5d2bc0c75b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:23:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:20.587 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2b51381b-86c3-4935-a887-daf528cfbfab]: (4, ('Thu Oct  2 12:23:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3 (36bf4d033f844213a055b223413f0b06923e1a8c8f96ac4696094b5d2bc0c75b)\n36bf4d033f844213a055b223413f0b06923e1a8c8f96ac4696094b5d2bc0c75b\nThu Oct  2 12:23:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3 (36bf4d033f844213a055b223413f0b06923e1a8c8f96ac4696094b5d2bc0c75b)\n36bf4d033f844213a055b223413f0b06923e1a8c8f96ac4696094b5d2bc0c75b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:20.590 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c56256a9-8ec6-41ed-9617-efef391a45c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:20.591 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5696736-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:20 np0005465988 kernel: tapb5696736-50: left promiscuous mode
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:20.615 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c4ba165f-f0d8-4091-b7ae-691faee325ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.621 2 DEBUG nova.compute.manager [req-ff136e83-7219-44e9-a73e-2df05367a856 req-4e92a1ec-695f-44cb-93f7-a47615065100 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Received event network-vif-unplugged-af1fa88c-fdf1-43be-9ce6-bf553125c03b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.623 2 DEBUG oslo_concurrency.lockutils [req-ff136e83-7219-44e9-a73e-2df05367a856 req-4e92a1ec-695f-44cb-93f7-a47615065100 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.623 2 DEBUG oslo_concurrency.lockutils [req-ff136e83-7219-44e9-a73e-2df05367a856 req-4e92a1ec-695f-44cb-93f7-a47615065100 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.624 2 DEBUG oslo_concurrency.lockutils [req-ff136e83-7219-44e9-a73e-2df05367a856 req-4e92a1ec-695f-44cb-93f7-a47615065100 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.624 2 DEBUG nova.compute.manager [req-ff136e83-7219-44e9-a73e-2df05367a856 req-4e92a1ec-695f-44cb-93f7-a47615065100 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] No waiting events found dispatching network-vif-unplugged-af1fa88c-fdf1-43be-9ce6-bf553125c03b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:20 np0005465988 nova_compute[236126]: 2025-10-02 12:23:20.625 2 DEBUG nova.compute.manager [req-ff136e83-7219-44e9-a73e-2df05367a856 req-4e92a1ec-695f-44cb-93f7-a47615065100 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Received event network-vif-unplugged-af1fa88c-fdf1-43be-9ce6-bf553125c03b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:23:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:20.645 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e970fdd3-b392-4452-91a5-ad0245b1c8eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:20.647 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ff5a4183-7f65-4dcb-953e-c0c55527a66d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:20.663 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a8b0b3-6520-4c0f-8eab-9312ed7e4398]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585633, 'reachable_time': 40789, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276263, 'error': None, 'target': 'ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:20.666 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b5696736-5eda-4019-be30-cf82f91c84f3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:23:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:20.666 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[90c5cf38-2629-4a4c-bc3c-a70f721137f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:20 np0005465988 systemd[1]: run-netns-ovnmeta\x2db5696736\x2d5eda\x2d4019\x2dbe30\x2dcf82f91c84f3.mount: Deactivated successfully.
Oct  2 08:23:20 np0005465988 podman[276254]: 2025-10-02 12:23:20.717918314 +0000 UTC m=+0.073369046 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:23:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e282 e282: 3 total, 3 up, 3 in
Oct  2 08:23:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:23:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:21.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:23:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:21.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e282 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:23:22 np0005465988 nova_compute[236126]: 2025-10-02 12:23:22.509 2 INFO nova.virt.libvirt.driver [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Deleting instance files /var/lib/nova/instances/465fce36-cba2-4b45-b592-eeda70de3c2a_del#033[00m
Oct  2 08:23:22 np0005465988 nova_compute[236126]: 2025-10-02 12:23:22.510 2 INFO nova.virt.libvirt.driver [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Deletion of /var/lib/nova/instances/465fce36-cba2-4b45-b592-eeda70de3c2a_del complete#033[00m
Oct  2 08:23:22 np0005465988 nova_compute[236126]: 2025-10-02 12:23:22.587 2 INFO nova.compute.manager [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Took 2.62 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:23:22 np0005465988 nova_compute[236126]: 2025-10-02 12:23:22.588 2 DEBUG oslo.service.loopingcall [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:23:22 np0005465988 nova_compute[236126]: 2025-10-02 12:23:22.589 2 DEBUG nova.compute.manager [-] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:23:22 np0005465988 nova_compute[236126]: 2025-10-02 12:23:22.589 2 DEBUG nova.network.neutron [-] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:23:23 np0005465988 nova_compute[236126]: 2025-10-02 12:23:23.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:23.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:23.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:23 np0005465988 nova_compute[236126]: 2025-10-02 12:23:23.606 2 DEBUG nova.compute.manager [req-fc795f18-a97d-473f-b173-3c363a6deae8 req-7a6ed0a5-c052-47ff-87d9-55a3b1c582a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Received event network-vif-plugged-af1fa88c-fdf1-43be-9ce6-bf553125c03b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:23 np0005465988 nova_compute[236126]: 2025-10-02 12:23:23.607 2 DEBUG oslo_concurrency.lockutils [req-fc795f18-a97d-473f-b173-3c363a6deae8 req-7a6ed0a5-c052-47ff-87d9-55a3b1c582a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:23 np0005465988 nova_compute[236126]: 2025-10-02 12:23:23.607 2 DEBUG oslo_concurrency.lockutils [req-fc795f18-a97d-473f-b173-3c363a6deae8 req-7a6ed0a5-c052-47ff-87d9-55a3b1c582a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:23 np0005465988 nova_compute[236126]: 2025-10-02 12:23:23.607 2 DEBUG oslo_concurrency.lockutils [req-fc795f18-a97d-473f-b173-3c363a6deae8 req-7a6ed0a5-c052-47ff-87d9-55a3b1c582a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:23 np0005465988 nova_compute[236126]: 2025-10-02 12:23:23.608 2 DEBUG nova.compute.manager [req-fc795f18-a97d-473f-b173-3c363a6deae8 req-7a6ed0a5-c052-47ff-87d9-55a3b1c582a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] No waiting events found dispatching network-vif-plugged-af1fa88c-fdf1-43be-9ce6-bf553125c03b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:23 np0005465988 nova_compute[236126]: 2025-10-02 12:23:23.608 2 WARNING nova.compute.manager [req-fc795f18-a97d-473f-b173-3c363a6deae8 req-7a6ed0a5-c052-47ff-87d9-55a3b1c582a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Received unexpected event network-vif-plugged-af1fa88c-fdf1-43be-9ce6-bf553125c03b for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:23:24 np0005465988 nova_compute[236126]: 2025-10-02 12:23:24.236 2 DEBUG nova.network.neutron [-] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:24 np0005465988 nova_compute[236126]: 2025-10-02 12:23:24.257 2 INFO nova.compute.manager [-] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Took 1.67 seconds to deallocate network for instance.#033[00m
Oct  2 08:23:24 np0005465988 nova_compute[236126]: 2025-10-02 12:23:24.319 2 DEBUG oslo_concurrency.lockutils [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:24 np0005465988 nova_compute[236126]: 2025-10-02 12:23:24.320 2 DEBUG oslo_concurrency.lockutils [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:24 np0005465988 nova_compute[236126]: 2025-10-02 12:23:24.342 2 DEBUG nova.compute.manager [req-4f248ccb-3cf8-44bb-8aec-1a0f0c074b3a req-21012440-3138-4bb1-a3a3-7d16269dbeed d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Received event network-vif-deleted-af1fa88c-fdf1-43be-9ce6-bf553125c03b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:24 np0005465988 nova_compute[236126]: 2025-10-02 12:23:24.419 2 DEBUG oslo_concurrency.processutils [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:23:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1689512058' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:23:24 np0005465988 nova_compute[236126]: 2025-10-02 12:23:24.902 2 DEBUG oslo_concurrency.processutils [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:24 np0005465988 nova_compute[236126]: 2025-10-02 12:23:24.912 2 DEBUG nova.compute.provider_tree [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:23:24 np0005465988 nova_compute[236126]: 2025-10-02 12:23:24.946 2 DEBUG nova.scheduler.client.report [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:23:24 np0005465988 nova_compute[236126]: 2025-10-02 12:23:24.984 2 DEBUG oslo_concurrency.lockutils [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.664s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:25 np0005465988 nova_compute[236126]: 2025-10-02 12:23:25.017 2 INFO nova.scheduler.client.report [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Deleted allocations for instance 465fce36-cba2-4b45-b592-eeda70de3c2a#033[00m
Oct  2 08:23:25 np0005465988 nova_compute[236126]: 2025-10-02 12:23:25.109 2 DEBUG oslo_concurrency.lockutils [None req-55bb41d7-9229-4588-8edf-7a95e66094e3 6e7ac8498cf5493d9eb7fd8747db6b07 17f64fa8d6e845999cf42a2e95664585 - - default default] Lock "465fce36-cba2-4b45-b592-eeda70de3c2a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:25 np0005465988 nova_compute[236126]: 2025-10-02 12:23:25.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:25.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:25.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:27.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:27.352 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:27.353 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:27.353 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e282 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:23:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:27.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:28 np0005465988 nova_compute[236126]: 2025-10-02 12:23:28.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 08:23:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.0 total, 600.0 interval#012Cumulative writes: 8655 writes, 43K keys, 8655 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.03 MB/s#012Cumulative WAL: 8655 writes, 8655 syncs, 1.00 writes per sync, written: 0.09 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1808 writes, 8813 keys, 1808 commit groups, 1.0 writes per commit group, ingest: 17.27 MB, 0.03 MB/s#012Interval WAL: 1808 writes, 1808 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     87.5      0.60              0.17        24    0.025       0      0       0.0       0.0#012  L6      1/0    9.51 MB   0.0      0.2     0.1      0.2       0.2      0.0       0.0   3.9    140.0    115.7      1.77              0.80        23    0.077    126K    13K       0.0       0.0#012 Sum      1/0    9.51 MB   0.0      0.2     0.1      0.2       0.3      0.1       0.0   4.9    104.5    108.6      2.36              0.97        47    0.050    126K    13K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.3    138.7    138.4      0.50              0.31        12    0.041     41K   3043       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.1      0.2       0.2      0.0       0.0   0.0    140.0    115.7      1.77              0.80        23    0.077    126K    13K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     87.7      0.60              0.17        23    0.026       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3000.0 total, 600.0 interval#012Flush(GB): cumulative 0.051, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.25 GB write, 0.09 MB/s write, 0.24 GB read, 0.08 MB/s read, 2.4 seconds#012Interval compaction: 0.07 GB write, 0.11 MB/s write, 0.07 GB read, 0.11 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b3e97ad1f0#2 capacity: 304.00 MB usage: 27.58 MB table_size: 0 occupancy: 18446744073709551615 collections: 6 last_copies: 0 last_secs: 0.00025 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(1600,26.60 MB,8.74917%) FilterBlock(47,360.80 KB,0.115902%) IndexBlock(47,640.83 KB,0.205858%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct  2 08:23:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:29.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:29.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:30 np0005465988 nova_compute[236126]: 2025-10-02 12:23:30.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:31 np0005465988 nova_compute[236126]: 2025-10-02 12:23:31.289 2 DEBUG oslo_concurrency.lockutils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:31 np0005465988 nova_compute[236126]: 2025-10-02 12:23:31.290 2 DEBUG oslo_concurrency.lockutils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:31 np0005465988 nova_compute[236126]: 2025-10-02 12:23:31.290 2 INFO nova.compute.manager [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Shelving#033[00m
Oct  2 08:23:31 np0005465988 nova_compute[236126]: 2025-10-02 12:23:31.318 2 DEBUG nova.virt.libvirt.driver [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:23:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:23:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:31.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:23:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:31.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:32Z|00370|binding|INFO|Releasing lport 18276c7d-4e7d-4b5c-a013-87c3ea8e7868 from this chassis (sb_readonly=0)
Oct  2 08:23:32 np0005465988 nova_compute[236126]: 2025-10-02 12:23:32.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e282 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:23:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:32.736 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:23:32 np0005465988 nova_compute[236126]: 2025-10-02 12:23:32.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:32.739 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:23:33 np0005465988 nova_compute[236126]: 2025-10-02 12:23:33.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:33.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:33.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:33 np0005465988 kernel: tap7b92f05d-cc (unregistering): left promiscuous mode
Oct  2 08:23:33 np0005465988 NetworkManager[45041]: <info>  [1759407813.6238] device (tap7b92f05d-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:23:33 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:33Z|00371|binding|INFO|Releasing lport 7b92f05d-cce2-48f0-a124-0408773ce275 from this chassis (sb_readonly=0)
Oct  2 08:23:33 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:33Z|00372|binding|INFO|Setting lport 7b92f05d-cce2-48f0-a124-0408773ce275 down in Southbound
Oct  2 08:23:33 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:33Z|00373|binding|INFO|Removing iface tap7b92f05d-cc ovn-installed in OVS
Oct  2 08:23:33 np0005465988 nova_compute[236126]: 2025-10-02 12:23:33.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:33 np0005465988 nova_compute[236126]: 2025-10-02 12:23:33.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:33.649 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:0c:a5 10.100.0.11'], port_security=['fa:16:3e:80:0c:a5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3a4d32fc-bed8-4e11-9033-5b73501128fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5f671ae-bb65-4932-84ce-cef4210e4599', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c740a14d1c5c45d1a0959b0e24ac460b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0cf8cc6d-2482-45e4-b576-7d811b75025a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5feaa29-5f25-4f45-a24f-ce44451fb322, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7b92f05d-cce2-48f0-a124-0408773ce275) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:23:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:33.650 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7b92f05d-cce2-48f0-a124-0408773ce275 in datapath d5f671ae-bb65-4932-84ce-cef4210e4599 unbound from our chassis#033[00m
Oct  2 08:23:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:33.652 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d5f671ae-bb65-4932-84ce-cef4210e4599, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:23:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:33.655 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[25151694-4679-416a-8ea5-6e3ffbfa8320]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:33.656 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 namespace which is not needed anymore#033[00m
Oct  2 08:23:33 np0005465988 nova_compute[236126]: 2025-10-02 12:23:33.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:33 np0005465988 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Oct  2 08:23:33 np0005465988 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000005a.scope: Consumed 18.770s CPU time.
Oct  2 08:23:33 np0005465988 systemd-machined[192594]: Machine qemu-34-instance-0000005a terminated.
Oct  2 08:23:33 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[273929]: [NOTICE]   (273933) : haproxy version is 2.8.14-c23fe91
Oct  2 08:23:33 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[273929]: [NOTICE]   (273933) : path to executable is /usr/sbin/haproxy
Oct  2 08:23:33 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[273929]: [WARNING]  (273933) : Exiting Master process...
Oct  2 08:23:33 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[273929]: [WARNING]  (273933) : Exiting Master process...
Oct  2 08:23:33 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[273929]: [ALERT]    (273933) : Current worker (273935) exited with code 143 (Terminated)
Oct  2 08:23:33 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[273929]: [WARNING]  (273933) : All workers exited. Exiting... (0)
Oct  2 08:23:33 np0005465988 systemd[1]: libpod-c8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2.scope: Deactivated successfully.
Oct  2 08:23:33 np0005465988 conmon[273929]: conmon c8e91c8f7b53f1054d64 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2.scope/container/memory.events
Oct  2 08:23:33 np0005465988 podman[276382]: 2025-10-02 12:23:33.870937621 +0000 UTC m=+0.084826871 container died c8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:23:33 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2-userdata-shm.mount: Deactivated successfully.
Oct  2 08:23:33 np0005465988 systemd[1]: var-lib-containers-storage-overlay-3ae8a9327a574cf18d50ce8534c5d794212191a572b93dfc73eb571439a05eda-merged.mount: Deactivated successfully.
Oct  2 08:23:33 np0005465988 podman[276382]: 2025-10-02 12:23:33.925930793 +0000 UTC m=+0.139819973 container cleanup c8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:23:33 np0005465988 systemd[1]: libpod-conmon-c8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2.scope: Deactivated successfully.
Oct  2 08:23:34 np0005465988 podman[276423]: 2025-10-02 12:23:34.012968777 +0000 UTC m=+0.052669386 container remove c8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:23:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:34.020 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0eb49a35-0a13-4f2c-a663-bb016e9c1f67]: (4, ('Thu Oct  2 12:23:33 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 (c8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2)\nc8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2\nThu Oct  2 12:23:33 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 (c8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2)\nc8e91c8f7b53f1054d645faefe78eb0cf8ee3f2247d617a1d3d527803ea55fa2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:34.023 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[71a5fc4a-53cb-4c67-9c7e-8b227879a950]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:34.024 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5f671ae-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:34 np0005465988 kernel: tapd5f671ae-b0: left promiscuous mode
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:34 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:23:34 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:34.053 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[47505872-b6e1-4a3a-b792-9eb239c36be1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:34.080 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[38fd0904-d7c6-4301-8339-1b0b0c52389f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:34.081 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ef9bb0d0-4d10-4d35-ae71-5ba61628522c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:34.103 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0ee715f4-8aeb-46b9-a939-2ee657eef9cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577673, 'reachable_time': 17815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276442, 'error': None, 'target': 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:34 np0005465988 systemd[1]: run-netns-ovnmeta\x2dd5f671ae\x2dbb65\x2d4932\x2d84ce\x2dcef4210e4599.mount: Deactivated successfully.
Oct  2 08:23:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:34.107 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:23:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:34.107 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[17b8ab23-7cb2-4444-ac62-9e92fbcc7566]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.340 2 INFO nova.virt.libvirt.driver [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.352 2 INFO nova.virt.libvirt.driver [-] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Instance destroyed successfully.#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.353 2 DEBUG nova.objects.instance [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'numa_topology' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.762 2 DEBUG nova.compute.manager [req-dd1c769b-7604-4e40-8bf5-c8954100828e req-a1f2d0de-8c9b-4779-bf43-052f9bc48cbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-unplugged-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.763 2 DEBUG oslo_concurrency.lockutils [req-dd1c769b-7604-4e40-8bf5-c8954100828e req-a1f2d0de-8c9b-4779-bf43-052f9bc48cbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.763 2 DEBUG oslo_concurrency.lockutils [req-dd1c769b-7604-4e40-8bf5-c8954100828e req-a1f2d0de-8c9b-4779-bf43-052f9bc48cbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.763 2 DEBUG oslo_concurrency.lockutils [req-dd1c769b-7604-4e40-8bf5-c8954100828e req-a1f2d0de-8c9b-4779-bf43-052f9bc48cbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.763 2 DEBUG nova.compute.manager [req-dd1c769b-7604-4e40-8bf5-c8954100828e req-a1f2d0de-8c9b-4779-bf43-052f9bc48cbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] No waiting events found dispatching network-vif-unplugged-7b92f05d-cce2-48f0-a124-0408773ce275 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.764 2 WARNING nova.compute.manager [req-dd1c769b-7604-4e40-8bf5-c8954100828e req-a1f2d0de-8c9b-4779-bf43-052f9bc48cbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received unexpected event network-vif-unplugged-7b92f05d-cce2-48f0-a124-0408773ce275 for instance with vm_state active and task_state shelving.#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.764 2 DEBUG nova.compute.manager [req-dd1c769b-7604-4e40-8bf5-c8954100828e req-a1f2d0de-8c9b-4779-bf43-052f9bc48cbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.764 2 DEBUG oslo_concurrency.lockutils [req-dd1c769b-7604-4e40-8bf5-c8954100828e req-a1f2d0de-8c9b-4779-bf43-052f9bc48cbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.765 2 DEBUG oslo_concurrency.lockutils [req-dd1c769b-7604-4e40-8bf5-c8954100828e req-a1f2d0de-8c9b-4779-bf43-052f9bc48cbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.765 2 DEBUG oslo_concurrency.lockutils [req-dd1c769b-7604-4e40-8bf5-c8954100828e req-a1f2d0de-8c9b-4779-bf43-052f9bc48cbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.766 2 DEBUG nova.compute.manager [req-dd1c769b-7604-4e40-8bf5-c8954100828e req-a1f2d0de-8c9b-4779-bf43-052f9bc48cbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] No waiting events found dispatching network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:34 np0005465988 nova_compute[236126]: 2025-10-02 12:23:34.766 2 WARNING nova.compute.manager [req-dd1c769b-7604-4e40-8bf5-c8954100828e req-a1f2d0de-8c9b-4779-bf43-052f9bc48cbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received unexpected event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 for instance with vm_state active and task_state shelving.#033[00m
Oct  2 08:23:35 np0005465988 nova_compute[236126]: 2025-10-02 12:23:35.151 2 INFO nova.virt.libvirt.driver [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Beginning cold snapshot process#033[00m
Oct  2 08:23:35 np0005465988 nova_compute[236126]: 2025-10-02 12:23:35.212 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407800.211258, 465fce36-cba2-4b45-b592-eeda70de3c2a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:35 np0005465988 nova_compute[236126]: 2025-10-02 12:23:35.212 2 INFO nova.compute.manager [-] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:23:35 np0005465988 nova_compute[236126]: 2025-10-02 12:23:35.243 2 DEBUG nova.compute.manager [None req-59870cdf-88ae-4e75-ad12-b0567fd0d128 - - - - - -] [instance: 465fce36-cba2-4b45-b592-eeda70de3c2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:35 np0005465988 nova_compute[236126]: 2025-10-02 12:23:35.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:35 np0005465988 nova_compute[236126]: 2025-10-02 12:23:35.333 2 DEBUG nova.virt.libvirt.imagebackend [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] No parent info for c2d0c2bc-fe21-4689-86ae-d6728c15874c; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Oct  2 08:23:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:35.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:35.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:35 np0005465988 nova_compute[236126]: 2025-10-02 12:23:35.661 2 DEBUG nova.storage.rbd_utils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] creating snapshot(a20c7a3cfe9e42b68bfe4c07e0af77cd) on rbd image(3a4d32fc-bed8-4e11-9033-5b73501128fe_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:23:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e283 e283: 3 total, 3 up, 3 in
Oct  2 08:23:36 np0005465988 nova_compute[236126]: 2025-10-02 12:23:36.400 2 DEBUG nova.storage.rbd_utils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] cloning vms/3a4d32fc-bed8-4e11-9033-5b73501128fe_disk@a20c7a3cfe9e42b68bfe4c07e0af77cd to images/69218608-9c78-4f65-a885-9cb0c1edca97 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:23:36 np0005465988 nova_compute[236126]: 2025-10-02 12:23:36.537 2 DEBUG nova.storage.rbd_utils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] flattening images/69218608-9c78-4f65-a885-9cb0c1edca97 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:23:37 np0005465988 nova_compute[236126]: 2025-10-02 12:23:37.034 2 DEBUG nova.storage.rbd_utils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] removing snapshot(a20c7a3cfe9e42b68bfe4c07e0af77cd) on rbd image(3a4d32fc-bed8-4e11-9033-5b73501128fe_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:23:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:37.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e284 e284: 3 total, 3 up, 3 in
Oct  2 08:23:37 np0005465988 nova_compute[236126]: 2025-10-02 12:23:37.451 2 DEBUG nova.storage.rbd_utils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] creating snapshot(snap) on rbd image(69218608-9c78-4f65-a885-9cb0c1edca97) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:23:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:23:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:37.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:38 np0005465988 nova_compute[236126]: 2025-10-02 12:23:38.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e285 e285: 3 total, 3 up, 3 in
Oct  2 08:23:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:39.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:39.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:40 np0005465988 nova_compute[236126]: 2025-10-02 12:23:40.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:40 np0005465988 podman[276588]: 2025-10-02 12:23:40.534328038 +0000 UTC m=+0.066887746 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:23:40 np0005465988 podman[276589]: 2025-10-02 12:23:40.561899941 +0000 UTC m=+0.083378550 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  2 08:23:40 np0005465988 podman[276587]: 2025-10-02 12:23:40.567130181 +0000 UTC m=+0.102053517 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=ovn_controller)
Oct  2 08:23:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:40.741 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:40 np0005465988 nova_compute[236126]: 2025-10-02 12:23:40.850 2 INFO nova.virt.libvirt.driver [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Snapshot image upload complete#033[00m
Oct  2 08:23:40 np0005465988 nova_compute[236126]: 2025-10-02 12:23:40.851 2 DEBUG nova.compute.manager [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:40 np0005465988 nova_compute[236126]: 2025-10-02 12:23:40.925 2 INFO nova.compute.manager [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Shelve offloading#033[00m
Oct  2 08:23:40 np0005465988 nova_compute[236126]: 2025-10-02 12:23:40.934 2 INFO nova.virt.libvirt.driver [-] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Instance destroyed successfully.#033[00m
Oct  2 08:23:40 np0005465988 nova_compute[236126]: 2025-10-02 12:23:40.934 2 DEBUG nova.compute.manager [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:40 np0005465988 nova_compute[236126]: 2025-10-02 12:23:40.937 2 DEBUG oslo_concurrency.lockutils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:40 np0005465988 nova_compute[236126]: 2025-10-02 12:23:40.937 2 DEBUG oslo_concurrency.lockutils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquired lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:40 np0005465988 nova_compute[236126]: 2025-10-02 12:23:40.938 2 DEBUG nova.network.neutron [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:23:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:41.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:41.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e285 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:23:42 np0005465988 nova_compute[236126]: 2025-10-02 12:23:42.779 2 DEBUG nova.network.neutron [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updating instance_info_cache with network_info: [{"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:42 np0005465988 nova_compute[236126]: 2025-10-02 12:23:42.823 2 DEBUG oslo_concurrency.lockutils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Releasing lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:43 np0005465988 nova_compute[236126]: 2025-10-02 12:23:43.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:43.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:43.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:44 np0005465988 nova_compute[236126]: 2025-10-02 12:23:44.743 2 INFO nova.virt.libvirt.driver [-] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Instance destroyed successfully.#033[00m
Oct  2 08:23:44 np0005465988 nova_compute[236126]: 2025-10-02 12:23:44.743 2 DEBUG nova.objects.instance [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'resources' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:44 np0005465988 nova_compute[236126]: 2025-10-02 12:23:44.768 2 DEBUG nova.virt.libvirt.vif [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-757604995',display_name='tempest-ServersNegativeTestJSON-server-757604995',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-757604995',id=90,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c740a14d1c5c45d1a0959b0e24ac460b',ramdisk_id='',reservation_id='r-1qzpbgvk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-462972452',owner_user_name='tempest-ServersNegativeTestJSON-462972452-project-member',shelved_at='2025-10-02T12:23:40.851232',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='69218608-9c78-4f65-a885-9cb0c1edca97'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:23:35Z,user_data=None,user_id='4146a31af09c4e6a8aee251f2fec4f98',uuid=3a4d32fc-bed8-4e11-9033-5b73501128fe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:23:44 np0005465988 nova_compute[236126]: 2025-10-02 12:23:44.769 2 DEBUG nova.network.os_vif_util [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Converting VIF {"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:44 np0005465988 nova_compute[236126]: 2025-10-02 12:23:44.771 2 DEBUG nova.network.os_vif_util [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:44 np0005465988 nova_compute[236126]: 2025-10-02 12:23:44.772 2 DEBUG os_vif [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:23:44 np0005465988 nova_compute[236126]: 2025-10-02 12:23:44.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:44 np0005465988 nova_compute[236126]: 2025-10-02 12:23:44.775 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b92f05d-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:44 np0005465988 nova_compute[236126]: 2025-10-02 12:23:44.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:44 np0005465988 nova_compute[236126]: 2025-10-02 12:23:44.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:23:44 np0005465988 nova_compute[236126]: 2025-10-02 12:23:44.785 2 INFO os_vif [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc')#033[00m
Oct  2 08:23:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:23:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:45.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:23:45 np0005465988 nova_compute[236126]: 2025-10-02 12:23:45.403 2 INFO nova.virt.libvirt.driver [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Deleting instance files /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe_del#033[00m
Oct  2 08:23:45 np0005465988 nova_compute[236126]: 2025-10-02 12:23:45.404 2 INFO nova.virt.libvirt.driver [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Deletion of /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe_del complete#033[00m
Oct  2 08:23:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:45.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:45 np0005465988 nova_compute[236126]: 2025-10-02 12:23:45.705 2 INFO nova.scheduler.client.report [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Deleted allocations for instance 3a4d32fc-bed8-4e11-9033-5b73501128fe#033[00m
Oct  2 08:23:45 np0005465988 nova_compute[236126]: 2025-10-02 12:23:45.749 2 DEBUG oslo_concurrency.lockutils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:45 np0005465988 nova_compute[236126]: 2025-10-02 12:23:45.750 2 DEBUG oslo_concurrency.lockutils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:45 np0005465988 nova_compute[236126]: 2025-10-02 12:23:45.805 2 DEBUG oslo_concurrency.processutils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:23:46 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3496810732' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:23:46 np0005465988 nova_compute[236126]: 2025-10-02 12:23:46.276 2 DEBUG oslo_concurrency.processutils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:46 np0005465988 nova_compute[236126]: 2025-10-02 12:23:46.285 2 DEBUG nova.compute.provider_tree [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:23:46 np0005465988 nova_compute[236126]: 2025-10-02 12:23:46.304 2 DEBUG nova.scheduler.client.report [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:23:46 np0005465988 nova_compute[236126]: 2025-10-02 12:23:46.331 2 DEBUG oslo_concurrency.lockutils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:46 np0005465988 nova_compute[236126]: 2025-10-02 12:23:46.395 2 DEBUG oslo_concurrency.lockutils [None req-6704bedb-c901-416c-a0ea-a01f95615a94 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 15.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e286 e286: 3 total, 3 up, 3 in
Oct  2 08:23:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:47.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:23:47 np0005465988 nova_compute[236126]: 2025-10-02 12:23:47.472 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:47 np0005465988 nova_compute[236126]: 2025-10-02 12:23:47.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:47 np0005465988 nova_compute[236126]: 2025-10-02 12:23:47.499 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:47 np0005465988 nova_compute[236126]: 2025-10-02 12:23:47.500 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:47 np0005465988 nova_compute[236126]: 2025-10-02 12:23:47.500 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:47 np0005465988 nova_compute[236126]: 2025-10-02 12:23:47.501 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:23:47 np0005465988 nova_compute[236126]: 2025-10-02 12:23:47.501 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:47.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:23:47 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3789223055' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:23:47 np0005465988 nova_compute[236126]: 2025-10-02 12:23:47.946 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.208 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.210 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4461MB free_disk=20.818401336669922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.210 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.211 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.279 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.280 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.295 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:23:48 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/52643392' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.785 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.795 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.815 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.845 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.847 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.891 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407813.8896835, 3a4d32fc-bed8-4e11-9033-5b73501128fe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.892 2 INFO nova.compute.manager [-] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:23:48 np0005465988 nova_compute[236126]: 2025-10-02 12:23:48.920 2 DEBUG nova.compute.manager [None req-cd7073bc-3574-4310-a0d0-ab47950020e4 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:23:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:49.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:23:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:49.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:49 np0005465988 nova_compute[236126]: 2025-10-02 12:23:49.738 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:49 np0005465988 nova_compute[236126]: 2025-10-02 12:23:49.738 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:49 np0005465988 nova_compute[236126]: 2025-10-02 12:23:49.739 2 INFO nova.compute.manager [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Unshelving#033[00m
Oct  2 08:23:49 np0005465988 nova_compute[236126]: 2025-10-02 12:23:49.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:49 np0005465988 nova_compute[236126]: 2025-10-02 12:23:49.915 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:49 np0005465988 nova_compute[236126]: 2025-10-02 12:23:49.916 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:49 np0005465988 nova_compute[236126]: 2025-10-02 12:23:49.924 2 DEBUG nova.objects.instance [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'pci_requests' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:49 np0005465988 nova_compute[236126]: 2025-10-02 12:23:49.941 2 DEBUG nova.objects.instance [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'numa_topology' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:49 np0005465988 nova_compute[236126]: 2025-10-02 12:23:49.956 2 DEBUG nova.virt.hardware [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:23:49 np0005465988 nova_compute[236126]: 2025-10-02 12:23:49.957 2 INFO nova.compute.claims [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.059 2 DEBUG oslo_concurrency.processutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.312 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.314 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.336 2 DEBUG nova.compute.manager [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.401 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:23:50 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3708690614' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.527 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Acquiring lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.528 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.541 2 DEBUG oslo_concurrency.processutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.545 2 DEBUG nova.compute.manager [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.550 2 DEBUG nova.compute.provider_tree [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.568 2 DEBUG nova.scheduler.client.report [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.589 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.593 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.602 2 DEBUG nova.virt.hardware [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.603 2 INFO nova.compute.claims [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.619 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.726 2 DEBUG oslo_concurrency.processutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:50 np0005465988 nova_compute[236126]: 2025-10-02 12:23:50.760 2 INFO nova.network.neutron [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updating port 7b92f05d-cce2-48f0-a124-0408773ce275 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Oct  2 08:23:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:23:51 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1161124527' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.189 2 DEBUG oslo_concurrency.processutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.197 2 DEBUG nova.compute.provider_tree [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.221 2 DEBUG nova.scheduler.client.report [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.244 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.651s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.245 2 DEBUG nova.compute.manager [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.249 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.258 2 DEBUG nova.virt.hardware [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.259 2 INFO nova.compute.claims [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.330 2 DEBUG nova.compute.manager [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.331 2 DEBUG nova.network.neutron [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.359 2 INFO nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:23:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:51.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.395 2 DEBUG nova.compute.manager [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.451 2 DEBUG oslo_concurrency.processutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:51.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.550 2 DEBUG nova.compute.manager [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.553 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.554 2 INFO nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Creating image(s)#033[00m
Oct  2 08:23:51 np0005465988 podman[276838]: 2025-10-02 12:23:51.558696845 +0000 UTC m=+0.085352946 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.594 2 DEBUG nova.storage.rbd_utils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 87ebffd5-69af-414b-be5d-67ba42e8cae1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.628 2 DEBUG nova.storage.rbd_utils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 87ebffd5-69af-414b-be5d-67ba42e8cae1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.657 2 DEBUG nova.storage.rbd_utils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 87ebffd5-69af-414b-be5d-67ba42e8cae1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.661 2 DEBUG oslo_concurrency.processutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.691 2 DEBUG nova.policy [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2bd16d1f5f9d4eb396c474eedee67165', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.695 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.696 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquired lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.696 2 DEBUG nova.network.neutron [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.730 2 DEBUG oslo_concurrency.processutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.731 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.731 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.732 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.760 2 DEBUG nova.storage.rbd_utils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 87ebffd5-69af-414b-be5d-67ba42e8cae1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.763 2 DEBUG oslo_concurrency.processutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 87ebffd5-69af-414b-be5d-67ba42e8cae1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.847 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.848 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.848 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.848 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.848 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:23:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:23:51 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/450895416' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.889 2 DEBUG oslo_concurrency.processutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.896 2 DEBUG nova.compute.provider_tree [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.915 2 DEBUG nova.scheduler.client.report [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.943 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:51 np0005465988 nova_compute[236126]: 2025-10-02 12:23:51.944 2 DEBUG nova.compute.manager [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.034 2 DEBUG nova.compute.manager [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.034 2 DEBUG nova.network.neutron [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.081 2 INFO nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.090 2 DEBUG oslo_concurrency.processutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 87ebffd5-69af-414b-be5d-67ba42e8cae1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.326s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.135 2 DEBUG nova.compute.manager [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.189 2 DEBUG nova.storage.rbd_utils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] resizing rbd image 87ebffd5-69af-414b-be5d-67ba42e8cae1_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.268 2 DEBUG nova.policy [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '25468893d71641a385711fd2982bb00b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '10fff81da7a54740a53a0771ce916329', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.327 2 DEBUG nova.compute.manager [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.329 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.330 2 INFO nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Creating image(s)#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.367 2 DEBUG nova.storage.rbd_utils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] rbd image 7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.406 2 DEBUG nova.storage.rbd_utils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] rbd image 7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.440 2 DEBUG nova.storage.rbd_utils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] rbd image 7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.445 2 DEBUG oslo_concurrency.processutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.536 2 DEBUG nova.objects.instance [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'migration_context' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.539 2 DEBUG oslo_concurrency.processutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.540 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.540 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.540 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.566 2 DEBUG nova.storage.rbd_utils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] rbd image 7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.570 2 DEBUG oslo_concurrency.processutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.602 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.603 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Ensure instance console log exists: /var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.603 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.604 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.604 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.758 2 DEBUG nova.network.neutron [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Successfully created port: 3bdb6970-487f-4313-ab25-aa900f8b084a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.883 2 DEBUG oslo_concurrency.processutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.313s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:52 np0005465988 nova_compute[236126]: 2025-10-02 12:23:52.988 2 DEBUG nova.storage.rbd_utils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] resizing rbd image 7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.118 2 DEBUG nova.objects.instance [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lazy-loading 'migration_context' on Instance uuid 7b4bdbc9-7451-4500-8794-c8edef50d6a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.143 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.144 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Ensure instance console log exists: /var/lib/nova/instances/7b4bdbc9-7451-4500-8794-c8edef50d6a4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.144 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.144 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.145 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.247 2 DEBUG nova.network.neutron [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Successfully created port: 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:23:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:23:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:53.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:53.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.738 2 DEBUG nova.compute.manager [req-4b478065-79a7-47b8-af5d-9e4a31338838 req-23febb75-7fb5-49ca-be95-c443d47b003d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-changed-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.738 2 DEBUG nova.compute.manager [req-4b478065-79a7-47b8-af5d-9e4a31338838 req-23febb75-7fb5-49ca-be95-c443d47b003d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Refreshing instance network info cache due to event network-changed-7b92f05d-cce2-48f0-a124-0408773ce275. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.739 2 DEBUG oslo_concurrency.lockutils [req-4b478065-79a7-47b8-af5d-9e4a31338838 req-23febb75-7fb5-49ca-be95-c443d47b003d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.750 2 DEBUG nova.network.neutron [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updating instance_info_cache with network_info: [{"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.765 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Releasing lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.766 2 DEBUG nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.766 2 INFO nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Creating image(s)#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.789 2 DEBUG nova.storage.rbd_utils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] rbd image 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.793 2 DEBUG nova.objects.instance [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.794 2 DEBUG oslo_concurrency.lockutils [req-4b478065-79a7-47b8-af5d-9e4a31338838 req-23febb75-7fb5-49ca-be95-c443d47b003d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.794 2 DEBUG nova.network.neutron [req-4b478065-79a7-47b8-af5d-9e4a31338838 req-23febb75-7fb5-49ca-be95-c443d47b003d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Refreshing network info cache for port 7b92f05d-cce2-48f0-a124-0408773ce275 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.848 2 DEBUG nova.storage.rbd_utils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] rbd image 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.884 2 DEBUG nova.storage.rbd_utils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] rbd image 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.889 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "6143e917cc9fb22ebd2966ac33bc64dedd5da319" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:53 np0005465988 nova_compute[236126]: 2025-10-02 12:23:53.890 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "6143e917cc9fb22ebd2966ac33bc64dedd5da319" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.277 2 DEBUG nova.virt.libvirt.imagebackend [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Image locations are: [{'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/69218608-9c78-4f65-a885-9cb0c1edca97/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/69218608-9c78-4f65-a885-9cb0c1edca97/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.336 2 DEBUG nova.network.neutron [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Successfully updated port: 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.339 2 DEBUG nova.network.neutron [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Successfully updated port: 3bdb6970-487f-4313-ab25-aa900f8b084a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.343 2 DEBUG nova.virt.libvirt.imagebackend [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Selected location: {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/69218608-9c78-4f65-a885-9cb0c1edca97/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.344 2 DEBUG nova.storage.rbd_utils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] cloning images/69218608-9c78-4f65-a885-9cb0c1edca97@snap to None/3a4d32fc-bed8-4e11-9033-5b73501128fe_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.385 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.385 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquired lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.385 2 DEBUG nova.network.neutron [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.386 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Acquiring lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.387 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Acquired lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.387 2 DEBUG nova.network.neutron [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.438 2 DEBUG nova.compute.manager [req-6667d6e5-c51e-4a95-b653-a126fc69b4bd req-c9625ee8-b049-4f1a-92fe-5eb1728a8083 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Received event network-changed-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.439 2 DEBUG nova.compute.manager [req-6667d6e5-c51e-4a95-b653-a126fc69b4bd req-c9625ee8-b049-4f1a-92fe-5eb1728a8083 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Refreshing instance network info cache due to event network-changed-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.440 2 DEBUG oslo_concurrency.lockutils [req-6667d6e5-c51e-4a95-b653-a126fc69b4bd req-c9625ee8-b049-4f1a-92fe-5eb1728a8083 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.488 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "6143e917cc9fb22ebd2966ac33bc64dedd5da319" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.662 2 DEBUG nova.objects.instance [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'migration_context' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.742 2 DEBUG nova.storage.rbd_utils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] flattening vms/3a4d32fc-bed8-4e11-9033-5b73501128fe_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:23:54 np0005465988 nova_compute[236126]: 2025-10-02 12:23:54.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.084 2 DEBUG nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Image rbd:vms/3a4d32fc-bed8-4e11-9033-5b73501128fe_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.085 2 DEBUG nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.085 2 DEBUG nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Ensure instance console log exists: /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.086 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.086 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.087 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.090 2 DEBUG nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Start _get_guest_xml network_info=[{"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='',container_format='bare',created_at=2025-10-02T12:23:31Z,direct_url=<?>,disk_format='raw',id=69218608-9c78-4f65-a885-9cb0c1edca97,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-757604995-shelved',owner='c740a14d1c5c45d1a0959b0e24ac460b',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-10-02T12:23:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.096 2 WARNING nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.101 2 DEBUG nova.virt.libvirt.host [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.103 2 DEBUG nova.virt.libvirt.host [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.106 2 DEBUG nova.virt.libvirt.host [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.107 2 DEBUG nova.virt.libvirt.host [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.109 2 DEBUG nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.109 2 DEBUG nova.virt.hardware [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2025-10-02T12:23:31Z,direct_url=<?>,disk_format='raw',id=69218608-9c78-4f65-a885-9cb0c1edca97,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-757604995-shelved',owner='c740a14d1c5c45d1a0959b0e24ac460b',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-10-02T12:23:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.110 2 DEBUG nova.virt.hardware [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.110 2 DEBUG nova.virt.hardware [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.110 2 DEBUG nova.virt.hardware [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.111 2 DEBUG nova.virt.hardware [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.111 2 DEBUG nova.virt.hardware [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.111 2 DEBUG nova.virt.hardware [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.112 2 DEBUG nova.virt.hardware [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.112 2 DEBUG nova.virt.hardware [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.112 2 DEBUG nova.virt.hardware [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.113 2 DEBUG nova.virt.hardware [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.113 2 DEBUG nova.objects.instance [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.134 2 DEBUG oslo_concurrency.processutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.222 2 DEBUG nova.network.neutron [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.283 2 DEBUG nova.network.neutron [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:23:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:55.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:23:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:55.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:23:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:23:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2161220600' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.621 2 DEBUG oslo_concurrency.processutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.662 2 DEBUG nova.storage.rbd_utils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] rbd image 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.668 2 DEBUG oslo_concurrency.processutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.965 2 DEBUG nova.compute.manager [req-0e787e3c-b9c4-473b-ace0-5364588aa154 req-3410c0d5-8811-42e4-9364-8f277c692122 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-changed-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.965 2 DEBUG nova.compute.manager [req-0e787e3c-b9c4-473b-ace0-5364588aa154 req-3410c0d5-8811-42e4-9364-8f277c692122 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Refreshing instance network info cache due to event network-changed-3bdb6970-487f-4313-ab25-aa900f8b084a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:23:55 np0005465988 nova_compute[236126]: 2025-10-02 12:23:55.966 2 DEBUG oslo_concurrency.lockutils [req-0e787e3c-b9c4-473b-ace0-5364588aa154 req-3410c0d5-8811-42e4-9364-8f277c692122 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:23:56 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/834154144' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.153 2 DEBUG oslo_concurrency.processutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.156 2 DEBUG nova.virt.libvirt.vif [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:21:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-757604995',display_name='tempest-ServersNegativeTestJSON-server-757604995',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-757604995',id=90,image_ref='69218608-9c78-4f65-a885-9cb0c1edca97',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='c740a14d1c5c45d1a0959b0e24ac460b',ramdisk_id='',reservation_id='r-1qzpbgvk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',i
mage_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-462972452',owner_user_name='tempest-ServersNegativeTestJSON-462972452-project-member',shelved_at='2025-10-02T12:23:40.851232',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='69218608-9c78-4f65-a885-9cb0c1edca97'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:23:49Z,user_data=None,user_id='4146a31af09c4e6a8aee251f2fec4f98',uuid=3a4d32fc-bed8-4e11-9033-5b73501128fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.156 2 DEBUG nova.network.os_vif_util [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Converting VIF {"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.158 2 DEBUG nova.network.os_vif_util [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.160 2 DEBUG nova.objects.instance [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'pci_devices' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.183 2 DEBUG nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  <uuid>3a4d32fc-bed8-4e11-9033-5b73501128fe</uuid>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  <name>instance-0000005a</name>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServersNegativeTestJSON-server-757604995</nova:name>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:23:55</nova:creationTime>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <nova:user uuid="4146a31af09c4e6a8aee251f2fec4f98">tempest-ServersNegativeTestJSON-462972452-project-member</nova:user>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <nova:project uuid="c740a14d1c5c45d1a0959b0e24ac460b">tempest-ServersNegativeTestJSON-462972452</nova:project>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="69218608-9c78-4f65-a885-9cb0c1edca97"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <nova:port uuid="7b92f05d-cce2-48f0-a124-0408773ce275">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <entry name="serial">3a4d32fc-bed8-4e11-9033-5b73501128fe</entry>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <entry name="uuid">3a4d32fc-bed8-4e11-9033-5b73501128fe</entry>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/3a4d32fc-bed8-4e11-9033-5b73501128fe_disk">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/3a4d32fc-bed8-4e11-9033-5b73501128fe_disk.config">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:80:0c:a5"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <target dev="tap7b92f05d-cc"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/console.log" append="off"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <input type="keyboard" bus="usb"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:23:56 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:23:56 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:23:56 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:23:56 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.184 2 DEBUG nova.compute.manager [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Preparing to wait for external event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.185 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.186 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.186 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.188 2 DEBUG nova.virt.libvirt.vif [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:21:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-757604995',display_name='tempest-ServersNegativeTestJSON-server-757604995',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-757604995',id=90,image_ref='69218608-9c78-4f65-a885-9cb0c1edca97',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='c740a14d1c5c45d1a0959b0e24ac460b',ramdisk_id='',reservation_id='r-1qzpbgvk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model=
'virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-462972452',owner_user_name='tempest-ServersNegativeTestJSON-462972452-project-member',shelved_at='2025-10-02T12:23:40.851232',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='69218608-9c78-4f65-a885-9cb0c1edca97'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:23:49Z,user_data=None,user_id='4146a31af09c4e6a8aee251f2fec4f98',uuid=3a4d32fc-bed8-4e11-9033-5b73501128fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.188 2 DEBUG nova.network.os_vif_util [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Converting VIF {"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.189 2 DEBUG nova.network.os_vif_util [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.190 2 DEBUG os_vif [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.192 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.192 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.199 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b92f05d-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.200 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b92f05d-cc, col_values=(('external_ids', {'iface-id': '7b92f05d-cce2-48f0-a124-0408773ce275', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:0c:a5', 'vm-uuid': '3a4d32fc-bed8-4e11-9033-5b73501128fe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:56 np0005465988 NetworkManager[45041]: <info>  [1759407836.2036] manager: (tap7b92f05d-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.212 2 INFO os_vif [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc')#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.269 2 DEBUG nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.270 2 DEBUG nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.270 2 DEBUG nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] No VIF found with MAC fa:16:3e:80:0c:a5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.272 2 INFO nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Using config drive#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.305 2 DEBUG nova.storage.rbd_utils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] rbd image 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.353 2 DEBUG nova.objects.instance [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'ec2_ids' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.423 2 DEBUG nova.objects.instance [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'keypairs' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.492 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.509 2 DEBUG nova.network.neutron [req-4b478065-79a7-47b8-af5d-9e4a31338838 req-23febb75-7fb5-49ca-be95-c443d47b003d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updated VIF entry in instance network info cache for port 7b92f05d-cce2-48f0-a124-0408773ce275. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.509 2 DEBUG nova.network.neutron [req-4b478065-79a7-47b8-af5d-9e4a31338838 req-23febb75-7fb5-49ca-be95-c443d47b003d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updating instance_info_cache with network_info: [{"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:56 np0005465988 nova_compute[236126]: 2025-10-02 12:23:56.532 2 DEBUG oslo_concurrency.lockutils [req-4b478065-79a7-47b8-af5d-9e4a31338838 req-23febb75-7fb5-49ca-be95-c443d47b003d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.005174) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407837005215, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 851, "num_deletes": 257, "total_data_size": 1476239, "memory_usage": 1508080, "flush_reason": "Manual Compaction"}
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407837011436, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 972642, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 43407, "largest_seqno": 44253, "table_properties": {"data_size": 968677, "index_size": 1681, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9264, "raw_average_key_size": 19, "raw_value_size": 960429, "raw_average_value_size": 2017, "num_data_blocks": 73, "num_entries": 476, "num_filter_entries": 476, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759407787, "oldest_key_time": 1759407787, "file_creation_time": 1759407837, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 6292 microseconds, and 3317 cpu microseconds.
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.011466) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 972642 bytes OK
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.011483) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.013407) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.013419) EVENT_LOG_v1 {"time_micros": 1759407837013415, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.013436) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 1471813, prev total WAL file size 1471813, number of live WAL files 2.
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.013990) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031323534' seq:72057594037927935, type:22 .. '6C6F676D0031353036' seq:0, type:0; will stop at (end)
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(949KB)], [81(9743KB)]
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407837014019, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 10949667, "oldest_snapshot_seqno": -1}
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 6840 keys, 10814248 bytes, temperature: kUnknown
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407837058338, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 10814248, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10767626, "index_size": 28398, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 176059, "raw_average_key_size": 25, "raw_value_size": 10644341, "raw_average_value_size": 1556, "num_data_blocks": 1133, "num_entries": 6840, "num_filter_entries": 6840, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759407837, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.058812) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 10814248 bytes
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.060149) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 245.9 rd, 242.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 9.5 +0.0 blob) out(10.3 +0.0 blob), read-write-amplify(22.4) write-amplify(11.1) OK, records in: 7370, records dropped: 530 output_compression: NoCompression
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.060173) EVENT_LOG_v1 {"time_micros": 1759407837060161, "job": 50, "event": "compaction_finished", "compaction_time_micros": 44527, "compaction_time_cpu_micros": 22841, "output_level": 6, "num_output_files": 1, "total_output_size": 10814248, "num_input_records": 7370, "num_output_records": 6840, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407837060570, "job": 50, "event": "table_file_deletion", "file_number": 83}
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407837062998, "job": 50, "event": "table_file_deletion", "file_number": 81}
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.013915) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.063069) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.063075) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.063077) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.063078) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:23:57.063080) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:23:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:23:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:57.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:23:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.529 2 DEBUG nova.network.neutron [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Updating instance_info_cache with network_info: [{"id": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "address": "fa:16:3e:6d:91:e2", "network": {"id": "4035a600-4a5e-41ee-a619-d81e2c993b79", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2080271662-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10fff81da7a54740a53a0771ce916329", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d6e67d8-8c", "ovs_interfaceid": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:57.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.559 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Releasing lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.559 2 DEBUG nova.compute.manager [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Instance network_info: |[{"id": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "address": "fa:16:3e:6d:91:e2", "network": {"id": "4035a600-4a5e-41ee-a619-d81e2c993b79", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2080271662-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10fff81da7a54740a53a0771ce916329", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d6e67d8-8c", "ovs_interfaceid": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.560 2 DEBUG oslo_concurrency.lockutils [req-6667d6e5-c51e-4a95-b653-a126fc69b4bd req-c9625ee8-b049-4f1a-92fe-5eb1728a8083 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.560 2 DEBUG nova.network.neutron [req-6667d6e5-c51e-4a95-b653-a126fc69b4bd req-c9625ee8-b049-4f1a-92fe-5eb1728a8083 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Refreshing network info cache for port 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.563 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Start _get_guest_xml network_info=[{"id": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "address": "fa:16:3e:6d:91:e2", "network": {"id": "4035a600-4a5e-41ee-a619-d81e2c993b79", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2080271662-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10fff81da7a54740a53a0771ce916329", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d6e67d8-8c", "ovs_interfaceid": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.567 2 WARNING nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.571 2 DEBUG nova.virt.libvirt.host [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.572 2 DEBUG nova.virt.libvirt.host [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.575 2 DEBUG nova.virt.libvirt.host [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.575 2 DEBUG nova.virt.libvirt.host [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.576 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.576 2 DEBUG nova.virt.hardware [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.577 2 DEBUG nova.virt.hardware [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.577 2 DEBUG nova.virt.hardware [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.577 2 DEBUG nova.virt.hardware [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.577 2 DEBUG nova.virt.hardware [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.578 2 DEBUG nova.virt.hardware [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.578 2 DEBUG nova.virt.hardware [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.578 2 DEBUG nova.virt.hardware [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.578 2 DEBUG nova.virt.hardware [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.579 2 DEBUG nova.virt.hardware [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.579 2 DEBUG nova.virt.hardware [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:23:57 np0005465988 nova_compute[236126]: 2025-10-02 12:23:57.582 2 DEBUG oslo_concurrency.processutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.037 2 DEBUG oslo_concurrency.processutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.068 2 DEBUG nova.storage.rbd_utils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] rbd image 7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.072 2 DEBUG oslo_concurrency.processutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.103 2 INFO nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Creating config drive at /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/disk.config#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.108 2 DEBUG oslo_concurrency.processutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_3ng4mde execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.137 2 DEBUG nova.network.neutron [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating instance_info_cache with network_info: [{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.184 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Releasing lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.184 2 DEBUG nova.compute.manager [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance network_info: |[{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.185 2 DEBUG oslo_concurrency.lockutils [req-0e787e3c-b9c4-473b-ace0-5364588aa154 req-3410c0d5-8811-42e4-9364-8f277c692122 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.185 2 DEBUG nova.network.neutron [req-0e787e3c-b9c4-473b-ace0-5364588aa154 req-3410c0d5-8811-42e4-9364-8f277c692122 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Refreshing network info cache for port 3bdb6970-487f-4313-ab25-aa900f8b084a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.190 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Start _get_guest_xml network_info=[{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.195 2 WARNING nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.199 2 DEBUG nova.virt.libvirt.host [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.200 2 DEBUG nova.virt.libvirt.host [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.203 2 DEBUG nova.virt.libvirt.host [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.204 2 DEBUG nova.virt.libvirt.host [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.205 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.205 2 DEBUG nova.virt.hardware [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.206 2 DEBUG nova.virt.hardware [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.206 2 DEBUG nova.virt.hardware [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.207 2 DEBUG nova.virt.hardware [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.207 2 DEBUG nova.virt.hardware [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.207 2 DEBUG nova.virt.hardware [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.207 2 DEBUG nova.virt.hardware [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.208 2 DEBUG nova.virt.hardware [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.208 2 DEBUG nova.virt.hardware [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.208 2 DEBUG nova.virt.hardware [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.209 2 DEBUG nova.virt.hardware [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.213 2 DEBUG oslo_concurrency.processutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.243 2 DEBUG oslo_concurrency.processutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_3ng4mde" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.286 2 DEBUG nova.storage.rbd_utils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] rbd image 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.292 2 DEBUG oslo_concurrency.processutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/disk.config 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:23:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1692804411' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.516 2 DEBUG oslo_concurrency.processutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/disk.config 3a4d32fc-bed8-4e11-9033-5b73501128fe_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.224s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.517 2 INFO nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Deleting local config drive /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe/disk.config because it was imported into RBD.#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.521 2 DEBUG oslo_concurrency.processutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.524 2 DEBUG nova.virt.libvirt.vif [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1305802395',display_name='tempest-ServerActionsTestOtherB-server-1305802395',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1305802395',id=101,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGD2jbBFmRg2ZrnheVnZyLwDISk/dFTNtp10+sWyF/q+rC4Q86cvBQSRgacxSPIqXVpmiVTqI66cLDPhvjcnRFXyQqHRS/RWGvUZk+wm1wfft8CveiGko+Vh4vSox2iOrA==',key_name='tempest-keypair-1336245373',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='10fff81da7a54740a53a0771ce916329',ramdisk_id='',reservation_id='r-bt70f33h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1686489955',owner_user_name='tempest-ServerActionsTestOtherB-1686489955-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:23:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='25468893d71641a385711fd2982bb00b',uuid=7b4bdbc9-7451-4500-8794-c8edef50d6a4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "address": "fa:16:3e:6d:91:e2", "network": {"id": "4035a600-4a5e-41ee-a619-d81e2c993b79", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2080271662-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10fff81da7a54740a53a0771ce916329", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d6e67d8-8c", "ovs_interfaceid": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.524 2 DEBUG nova.network.os_vif_util [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Converting VIF {"id": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "address": "fa:16:3e:6d:91:e2", "network": {"id": "4035a600-4a5e-41ee-a619-d81e2c993b79", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2080271662-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10fff81da7a54740a53a0771ce916329", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d6e67d8-8c", "ovs_interfaceid": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.526 2 DEBUG nova.network.os_vif_util [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:91:e2,bridge_name='br-int',has_traffic_filtering=True,id=9d6e67d8-8c6a-4b95-b332-80f8674a0ebb,network=Network(4035a600-4a5e-41ee-a619-d81e2c993b79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d6e67d8-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.528 2 DEBUG nova.objects.instance [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7b4bdbc9-7451-4500-8794-c8edef50d6a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.550 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  <uuid>7b4bdbc9-7451-4500-8794-c8edef50d6a4</uuid>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  <name>instance-00000065</name>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerActionsTestOtherB-server-1305802395</nova:name>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:23:57</nova:creationTime>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <nova:user uuid="25468893d71641a385711fd2982bb00b">tempest-ServerActionsTestOtherB-1686489955-project-member</nova:user>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <nova:project uuid="10fff81da7a54740a53a0771ce916329">tempest-ServerActionsTestOtherB-1686489955</nova:project>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <nova:port uuid="9d6e67d8-8c6a-4b95-b332-80f8674a0ebb">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <entry name="serial">7b4bdbc9-7451-4500-8794-c8edef50d6a4</entry>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <entry name="uuid">7b4bdbc9-7451-4500-8794-c8edef50d6a4</entry>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk.config">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:6d:91:e2"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <target dev="tap9d6e67d8-8c"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/7b4bdbc9-7451-4500-8794-c8edef50d6a4/console.log" append="off"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:23:58 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:23:58 np0005465988 kernel: tap7b92f05d-cc: entered promiscuous mode
Oct  2 08:23:58 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:23:58 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:23:58 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:23:58 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.552 2 DEBUG nova.compute.manager [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Preparing to wait for external event network-vif-plugged-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.552 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Acquiring lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.553 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.553 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.554 2 DEBUG nova.virt.libvirt.vif [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1305802395',display_name='tempest-ServerActionsTestOtherB-server-1305802395',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1305802395',id=101,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGD2jbBFmRg2ZrnheVnZyLwDISk/dFTNtp10+sWyF/q+rC4Q86cvBQSRgacxSPIqXVpmiVTqI66cLDPhvjcnRFXyQqHRS/RWGvUZk+wm1wfft8CveiGko+Vh4vSox2iOrA==',key_name='tempest-keypair-1336245373',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='10fff81da7a54740a53a0771ce916329',ramdisk_id='',reservation_id='r-bt70f33h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1686489955',owner_user_name='tempest-ServerActionsTestOtherB-1686489955-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:23:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='25468893d71641a385711fd2982bb00b',uuid=7b4bdbc9-7451-4500-8794-c8edef50d6a4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "address": "fa:16:3e:6d:91:e2", "network": {"id": "4035a600-4a5e-41ee-a619-d81e2c993b79", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2080271662-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10fff81da7a54740a53a0771ce916329", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d6e67d8-8c", "ovs_interfaceid": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.555 2 DEBUG nova.network.os_vif_util [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Converting VIF {"id": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "address": "fa:16:3e:6d:91:e2", "network": {"id": "4035a600-4a5e-41ee-a619-d81e2c993b79", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2080271662-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10fff81da7a54740a53a0771ce916329", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d6e67d8-8c", "ovs_interfaceid": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.556 2 DEBUG nova.network.os_vif_util [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:91:e2,bridge_name='br-int',has_traffic_filtering=True,id=9d6e67d8-8c6a-4b95-b332-80f8674a0ebb,network=Network(4035a600-4a5e-41ee-a619-d81e2c993b79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d6e67d8-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.557 2 DEBUG os_vif [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:91:e2,bridge_name='br-int',has_traffic_filtering=True,id=9d6e67d8-8c6a-4b95-b332-80f8674a0ebb,network=Network(4035a600-4a5e-41ee-a619-d81e2c993b79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d6e67d8-8c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.559 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.560 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.564 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9d6e67d8-8c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.565 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9d6e67d8-8c, col_values=(('external_ids', {'iface-id': '9d6e67d8-8c6a-4b95-b332-80f8674a0ebb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:91:e2', 'vm-uuid': '7b4bdbc9-7451-4500-8794-c8edef50d6a4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:58 np0005465988 NetworkManager[45041]: <info>  [1759407838.5685] manager: (tap9d6e67d8-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:23:58 np0005465988 NetworkManager[45041]: <info>  [1759407838.5740] manager: (tap7b92f05d-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/184)
Oct  2 08:23:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:58Z|00374|binding|INFO|Claiming lport 7b92f05d-cce2-48f0-a124-0408773ce275 for this chassis.
Oct  2 08:23:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:58Z|00375|binding|INFO|7b92f05d-cce2-48f0-a124-0408773ce275: Claiming fa:16:3e:80:0c:a5 10.100.0.11
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.588 2 INFO os_vif [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:91:e2,bridge_name='br-int',has_traffic_filtering=True,id=9d6e67d8-8c6a-4b95-b332-80f8674a0ebb,network=Network(4035a600-4a5e-41ee-a619-d81e2c993b79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d6e67d8-8c')#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.593 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:0c:a5 10.100.0.11'], port_security=['fa:16:3e:80:0c:a5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3a4d32fc-bed8-4e11-9033-5b73501128fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5f671ae-bb65-4932-84ce-cef4210e4599', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c740a14d1c5c45d1a0959b0e24ac460b', 'neutron:revision_number': '7', 'neutron:security_group_ids': '0cf8cc6d-2482-45e4-b576-7d811b75025a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5feaa29-5f25-4f45-a24f-ce44451fb322, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7b92f05d-cce2-48f0-a124-0408773ce275) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.594 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7b92f05d-cce2-48f0-a124-0408773ce275 in datapath d5f671ae-bb65-4932-84ce-cef4210e4599 bound to our chassis#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.597 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d5f671ae-bb65-4932-84ce-cef4210e4599#033[00m
Oct  2 08:23:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:58Z|00376|binding|INFO|Setting lport 7b92f05d-cce2-48f0-a124-0408773ce275 ovn-installed in OVS
Oct  2 08:23:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:58Z|00377|binding|INFO|Setting lport 7b92f05d-cce2-48f0-a124-0408773ce275 up in Southbound
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.618 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a2bc6dc5-0ae7-4af9-b3df-4e428c89adb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.619 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd5f671ae-b1 in ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.622 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd5f671ae-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.622 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7a83895a-78eb-4b2b-bc27-8f3040233511]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.623 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8274e81b-b738-477b-ab72-9e94be8c40b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 systemd-machined[192594]: New machine qemu-37-instance-0000005a.
Oct  2 08:23:58 np0005465988 systemd-udevd[277649]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:23:58 np0005465988 systemd[1]: Started Virtual Machine qemu-37-instance-0000005a.
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.642 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[6de530e4-2289-4e82-883f-d7c1e6b3a1e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 NetworkManager[45041]: <info>  [1759407838.6512] device (tap7b92f05d-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:23:58 np0005465988 NetworkManager[45041]: <info>  [1759407838.6521] device (tap7b92f05d-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:23:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:23:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3779338660' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.676 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[852d24e0-fcec-4a81-bf54-6825057ecc47]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.685 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.686 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.686 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] No VIF found with MAC fa:16:3e:6d:91:e2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.687 2 INFO nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Using config drive#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.718 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[6b941e18-31bc-4e4f-a0ee-3318b3976989]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.723 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cdba6064-5846-4e44-849f-0e8591e787f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 NetworkManager[45041]: <info>  [1759407838.7240] manager: (tapd5f671ae-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/185)
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.737 2 DEBUG nova.storage.rbd_utils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] rbd image 7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.757 2 DEBUG oslo_concurrency.processutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.761 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[50bc2c74-70c7-4378-8f81-8b306cda44ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.765 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[887c6b6e-3436-4d29-a11d-aba81114bb11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.785 2 DEBUG nova.storage.rbd_utils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 87ebffd5-69af-414b-be5d-67ba42e8cae1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:58 np0005465988 NetworkManager[45041]: <info>  [1759407838.7882] device (tapd5f671ae-b0): carrier: link connected
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.793 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ff62f542-8d93-4cb0-b97e-d6fe94f16643]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.794 2 DEBUG oslo_concurrency.processutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.814 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f96fe3-e2f9-45ca-8493-fc08a09a1b5d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd5f671ae-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:3a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589911, 'reachable_time': 22841, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277722, 'error': None, 'target': 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.834 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[05320ee8-3c62-4b4a-ae63-c2205c7152a4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:3a86'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 589911, 'tstamp': 589911}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 277724, 'error': None, 'target': 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.859 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a6268001-ca0a-46d3-aeb3-5b99da0f2e6b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd5f671ae-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:3a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589911, 'reachable_time': 22841, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 277725, 'error': None, 'target': 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.897 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d00c07bf-2e04-4299-ae87-d60b923ea429]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.969 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[181973c2-8369-4902-aaff-8630c02bbe04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.970 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5f671ae-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.970 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.971 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd5f671ae-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:58 np0005465988 NetworkManager[45041]: <info>  [1759407838.9736] manager: (tapd5f671ae-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:58 np0005465988 kernel: tapd5f671ae-b0: entered promiscuous mode
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:58.979 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd5f671ae-b0, col_values=(('external_ids', {'iface-id': '18276c7d-4e7d-4b5c-a013-87c3ea8e7868'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:58 np0005465988 nova_compute[236126]: 2025-10-02 12:23:58.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:23:58Z|00378|binding|INFO|Releasing lport 18276c7d-4e7d-4b5c-a013-87c3ea8e7868 from this chassis (sb_readonly=0)
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:59.013 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5f671ae-bb65-4932-84ce-cef4210e4599.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5f671ae-bb65-4932-84ce-cef4210e4599.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:59.015 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8d22700c-c7ab-4f26-b755-aab7a39ada7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:59.016 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-d5f671ae-bb65-4932-84ce-cef4210e4599
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/d5f671ae-bb65-4932-84ce-cef4210e4599.pid.haproxy
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID d5f671ae-bb65-4932-84ce-cef4210e4599
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:23:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:23:59.019 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'env', 'PROCESS_TAG=haproxy-d5f671ae-bb65-4932-84ce-cef4210e4599', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d5f671ae-bb65-4932-84ce-cef4210e4599.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:23:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e287 e287: 3 total, 3 up, 3 in
Oct  2 08:23:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:23:59 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2618027857' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.324 2 DEBUG oslo_concurrency.processutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.327 2 DEBUG nova.virt.libvirt.vif [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:23:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.327 2 DEBUG nova.network.os_vif_util [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.328 2 DEBUG nova.network.os_vif_util [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.329 2 DEBUG nova.objects.instance [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.347 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  <uuid>87ebffd5-69af-414b-be5d-67ba42e8cae1</uuid>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  <name>instance-00000064</name>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerActionsTestJSON-server-131502281</nova:name>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:23:58</nova:creationTime>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <nova:user uuid="2bd16d1f5f9d4eb396c474eedee67165">tempest-ServerActionsTestJSON-842270816-project-member</nova:user>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <nova:project uuid="4b8ca48cb5f64ef3b0736b8be82378b8">tempest-ServerActionsTestJSON-842270816</nova:project>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <nova:port uuid="3bdb6970-487f-4313-ab25-aa900f8b084a">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <entry name="serial">87ebffd5-69af-414b-be5d-67ba42e8cae1</entry>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <entry name="uuid">87ebffd5-69af-414b-be5d-67ba42e8cae1</entry>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/87ebffd5-69af-414b-be5d-67ba42e8cae1_disk">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/87ebffd5-69af-414b-be5d-67ba42e8cae1_disk.config">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:22:0e:b9"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <target dev="tap3bdb6970-48"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1/console.log" append="off"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:23:59 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:23:59 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:23:59 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:23:59 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.348 2 DEBUG nova.compute.manager [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Preparing to wait for external event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.348 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.349 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.349 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.351 2 DEBUG nova.virt.libvirt.vif [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:23:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.351 2 DEBUG nova.network.os_vif_util [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.352 2 DEBUG nova.network.os_vif_util [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.353 2 DEBUG os_vif [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.354 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.355 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.359 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3bdb6970-48, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.361 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3bdb6970-48, col_values=(('external_ids', {'iface-id': '3bdb6970-487f-4313-ab25-aa900f8b084a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:0e:b9', 'vm-uuid': '87ebffd5-69af-414b-be5d-67ba42e8cae1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:59 np0005465988 NetworkManager[45041]: <info>  [1759407839.3642] manager: (tap3bdb6970-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.371 2 INFO os_vif [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48')#033[00m
Oct  2 08:23:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:23:59.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.422 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.422 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.422 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No VIF found with MAC fa:16:3e:22:0e:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.423 2 INFO nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Using config drive#033[00m
Oct  2 08:23:59 np0005465988 podman[277826]: 2025-10-02 12:23:59.440842671 +0000 UTC m=+0.068854332 container create 9282b115facb0a81f91ffc0d6621878dc9eedccff1b87012acbd4ceeffa54438 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.466 2 DEBUG nova.storage.rbd_utils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 87ebffd5-69af-414b-be5d-67ba42e8cae1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:23:59 np0005465988 systemd[1]: Started libpod-conmon-9282b115facb0a81f91ffc0d6621878dc9eedccff1b87012acbd4ceeffa54438.scope.
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.492 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:59 np0005465988 podman[277826]: 2025-10-02 12:23:59.403652431 +0000 UTC m=+0.031664172 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:23:59 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:23:59 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fb22b3706838831b1effe3b467fdd1dd8508f4374ae9b309fb302e97bb1a94f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:23:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:23:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:23:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:23:59.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.544 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407839.544202, 3a4d32fc-bed8-4e11-9033-5b73501128fe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.544 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] VM Started (Lifecycle Event)#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.547 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.547 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.547 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:23:59 np0005465988 podman[277826]: 2025-10-02 12:23:59.550231707 +0000 UTC m=+0.178243378 container init 9282b115facb0a81f91ffc0d6621878dc9eedccff1b87012acbd4ceeffa54438 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:23:59 np0005465988 podman[277826]: 2025-10-02 12:23:59.560655737 +0000 UTC m=+0.188667378 container start 9282b115facb0a81f91ffc0d6621878dc9eedccff1b87012acbd4ceeffa54438 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.581 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.581 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.581 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.582 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.582 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.582 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.584 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.588 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407839.5459652, 3a4d32fc-bed8-4e11-9033-5b73501128fe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.589 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:23:59 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[277862]: [NOTICE]   (277866) : New worker (277868) forked
Oct  2 08:23:59 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[277862]: [NOTICE]   (277866) : Loading success.
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.620 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.624 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:23:59 np0005465988 nova_compute[236126]: 2025-10-02 12:23:59.659 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.526 2 INFO nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Creating config drive at /var/lib/nova/instances/7b4bdbc9-7451-4500-8794-c8edef50d6a4/disk.config#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.539 2 DEBUG oslo_concurrency.processutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7b4bdbc9-7451-4500-8794-c8edef50d6a4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7ectwf3q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.702 2 DEBUG oslo_concurrency.processutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7b4bdbc9-7451-4500-8794-c8edef50d6a4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7ectwf3q" returned: 0 in 0.163s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.745 2 DEBUG nova.storage.rbd_utils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] rbd image 7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.752 2 DEBUG oslo_concurrency.processutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7b4bdbc9-7451-4500-8794-c8edef50d6a4/disk.config 7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.795 2 DEBUG nova.compute.manager [req-6c350550-ddd3-4f32-9965-891d4c3c639a req-d4b3b890-d65a-41f3-ab5d-f3abbe59fc5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.796 2 DEBUG oslo_concurrency.lockutils [req-6c350550-ddd3-4f32-9965-891d4c3c639a req-d4b3b890-d65a-41f3-ab5d-f3abbe59fc5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.797 2 DEBUG oslo_concurrency.lockutils [req-6c350550-ddd3-4f32-9965-891d4c3c639a req-d4b3b890-d65a-41f3-ab5d-f3abbe59fc5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.797 2 DEBUG oslo_concurrency.lockutils [req-6c350550-ddd3-4f32-9965-891d4c3c639a req-d4b3b890-d65a-41f3-ab5d-f3abbe59fc5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.797 2 DEBUG nova.compute.manager [req-6c350550-ddd3-4f32-9965-891d4c3c639a req-d4b3b890-d65a-41f3-ab5d-f3abbe59fc5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Processing event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.798 2 DEBUG nova.compute.manager [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.802 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407840.8022668, 3a4d32fc-bed8-4e11-9033-5b73501128fe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.802 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.808 2 DEBUG nova.virt.libvirt.driver [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.813 2 INFO nova.virt.libvirt.driver [-] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Instance spawned successfully.#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.822 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.829 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.861 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.978 2 DEBUG oslo_concurrency.processutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7b4bdbc9-7451-4500-8794-c8edef50d6a4/disk.config 7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.226s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:00 np0005465988 nova_compute[236126]: 2025-10-02 12:24:00.979 2 INFO nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Deleting local config drive /var/lib/nova/instances/7b4bdbc9-7451-4500-8794-c8edef50d6a4/disk.config because it was imported into RBD.#033[00m
Oct  2 08:24:01 np0005465988 kernel: tap9d6e67d8-8c: entered promiscuous mode
Oct  2 08:24:01 np0005465988 NetworkManager[45041]: <info>  [1759407841.0538] manager: (tap9d6e67d8-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/188)
Oct  2 08:24:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:01Z|00379|binding|INFO|Claiming lport 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb for this chassis.
Oct  2 08:24:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:01Z|00380|binding|INFO|9d6e67d8-8c6a-4b95-b332-80f8674a0ebb: Claiming fa:16:3e:6d:91:e2 10.100.0.10
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.074 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:91:e2 10.100.0.10'], port_security=['fa:16:3e:6d:91:e2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '7b4bdbc9-7451-4500-8794-c8edef50d6a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4035a600-4a5e-41ee-a619-d81e2c993b79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '10fff81da7a54740a53a0771ce916329', 'neutron:revision_number': '2', 'neutron:security_group_ids': '32af0a94-4565-470d-9918-1bc97e347f8f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5dc7931-b785-4336-99b8-936a17be87c3, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=9d6e67d8-8c6a-4b95-b332-80f8674a0ebb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.075 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb in datapath 4035a600-4a5e-41ee-a619-d81e2c993b79 bound to our chassis#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.076 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4035a600-4a5e-41ee-a619-d81e2c993b79#033[00m
Oct  2 08:24:01 np0005465988 NetworkManager[45041]: <info>  [1759407841.0883] device (tap9d6e67d8-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:24:01 np0005465988 NetworkManager[45041]: <info>  [1759407841.0892] device (tap9d6e67d8-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.101 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a6dee69b-e479-4d1a-860a-f61d36c8b368]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.102 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4035a600-41 in ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.104 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4035a600-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.104 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[85bed30c-4db7-423a-ad2b-8583e539b843]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.105 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f186ad5c-a8f1-4b0d-b1f0-c10ebed3f69d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 systemd-machined[192594]: New machine qemu-38-instance-00000065.
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.119 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[441ce64f-0ec2-4e26-8398-73ff439e3501]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 systemd[1]: Started Virtual Machine qemu-38-instance-00000065.
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.159 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e967f480-0723-428c-aa4a-4c1c0bb1a5f1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:01Z|00381|binding|INFO|Setting lport 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb ovn-installed in OVS
Oct  2 08:24:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:01Z|00382|binding|INFO|Setting lport 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb up in Southbound
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.212 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[bb253f7b-cb66-4768-96e4-392296dff801]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 NetworkManager[45041]: <info>  [1759407841.2241] manager: (tap4035a600-40): new Veth device (/org/freedesktop/NetworkManager/Devices/189)
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.223 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7513134f-0c47-4eaf-9095-2eb612601220]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.278 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[012b47ab-bfc9-4d9f-9d74-0d4ef00c46a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.281 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[bf4f71a6-6f05-4c63-b59c-887625fdb6a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 NetworkManager[45041]: <info>  [1759407841.3079] device (tap4035a600-40): carrier: link connected
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.312 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[eca464a8-8000-4654-b384-ab9eaa7ca8a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.329 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[55ffe4ec-809f-416f-8fd3-3675f958ebc0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4035a600-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:fb:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590163, 'reachable_time': 18938, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277956, 'error': None, 'target': 'ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.345 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[184543f9-26ba-4afc-9aee-263f3333dc73]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed0:fb3f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590163, 'tstamp': 590163}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 277957, 'error': None, 'target': 'ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.362 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e7dec933-801c-4da7-bbaa-c7166034184b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4035a600-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:fb:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590163, 'reachable_time': 18938, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 277958, 'error': None, 'target': 'ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:01.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.403 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5a5444ab-aadc-42ff-a23f-d611dcebff76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.465 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[db138296-bcca-493e-9208-44573ed7d41b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.467 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4035a600-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.467 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.468 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4035a600-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:01 np0005465988 NetworkManager[45041]: <info>  [1759407841.4716] manager: (tap4035a600-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Oct  2 08:24:01 np0005465988 kernel: tap4035a600-40: entered promiscuous mode
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.478 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4035a600-40, col_values=(('external_ids', {'iface-id': '1befa812-080f-4694-ba8b-9130fe81621d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:01Z|00383|binding|INFO|Releasing lport 1befa812-080f-4694-ba8b-9130fe81621d from this chassis (sb_readonly=0)
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.509 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4035a600-4a5e-41ee-a619-d81e2c993b79.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4035a600-4a5e-41ee-a619-d81e2c993b79.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.511 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5d8b53e5-309d-405c-b894-a2e37d18575e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.512 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-4035a600-4a5e-41ee-a619-d81e2c993b79
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/4035a600-4a5e-41ee-a619-d81e2c993b79.pid.haproxy
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 4035a600-4a5e-41ee-a619-d81e2c993b79
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:24:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:01.514 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79', 'env', 'PROCESS_TAG=haproxy-4035a600-4a5e-41ee-a619-d81e2c993b79', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4035a600-4a5e-41ee-a619-d81e2c993b79.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:24:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:01.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.570 2 INFO nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Creating config drive at /var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1/disk.config#033[00m
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.581 2 DEBUG oslo_concurrency.processutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpttle04cj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.708 2 DEBUG nova.compute.manager [req-3a77b170-778c-49d6-b246-decfd180a70a req-66d6068d-2404-49c9-9332-2548d1f3bc3f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Received event network-vif-plugged-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.709 2 DEBUG oslo_concurrency.lockutils [req-3a77b170-778c-49d6-b246-decfd180a70a req-66d6068d-2404-49c9-9332-2548d1f3bc3f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.710 2 DEBUG oslo_concurrency.lockutils [req-3a77b170-778c-49d6-b246-decfd180a70a req-66d6068d-2404-49c9-9332-2548d1f3bc3f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.710 2 DEBUG oslo_concurrency.lockutils [req-3a77b170-778c-49d6-b246-decfd180a70a req-66d6068d-2404-49c9-9332-2548d1f3bc3f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.710 2 DEBUG nova.compute.manager [req-3a77b170-778c-49d6-b246-decfd180a70a req-66d6068d-2404-49c9-9332-2548d1f3bc3f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Processing event network-vif-plugged-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.722 2 DEBUG oslo_concurrency.processutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpttle04cj" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.781 2 DEBUG nova.storage.rbd_utils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 87ebffd5-69af-414b-be5d-67ba42e8cae1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:24:01 np0005465988 nova_compute[236126]: 2025-10-02 12:24:01.795 2 DEBUG oslo_concurrency.processutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1/disk.config 87ebffd5-69af-414b-be5d-67ba42e8cae1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:01 np0005465988 podman[278110]: 2025-10-02 12:24:01.937572554 +0000 UTC m=+0.076283205 container create a7bf7efcda56a5cb2d1dc9722a1ed6e4f613056815793ba5a8f35b0a712cda78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:24:01 np0005465988 podman[278110]: 2025-10-02 12:24:01.89260609 +0000 UTC m=+0.031316821 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:24:01 np0005465988 systemd[1]: Started libpod-conmon-a7bf7efcda56a5cb2d1dc9722a1ed6e4f613056815793ba5a8f35b0a712cda78.scope.
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.012 2 DEBUG nova.network.neutron [req-6667d6e5-c51e-4a95-b653-a126fc69b4bd req-c9625ee8-b049-4f1a-92fe-5eb1728a8083 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Updated VIF entry in instance network info cache for port 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.014 2 DEBUG nova.network.neutron [req-6667d6e5-c51e-4a95-b653-a126fc69b4bd req-c9625ee8-b049-4f1a-92fe-5eb1728a8083 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Updating instance_info_cache with network_info: [{"id": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "address": "fa:16:3e:6d:91:e2", "network": {"id": "4035a600-4a5e-41ee-a619-d81e2c993b79", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2080271662-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10fff81da7a54740a53a0771ce916329", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d6e67d8-8c", "ovs_interfaceid": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:02 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:24:02 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51944d077d4ed90800edaf890b3d23307635a413602e8c0e8704020086611126/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:24:02 np0005465988 podman[278110]: 2025-10-02 12:24:02.039812175 +0000 UTC m=+0.178522926 container init a7bf7efcda56a5cb2d1dc9722a1ed6e4f613056815793ba5a8f35b0a712cda78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.042 2 DEBUG oslo_concurrency.lockutils [req-6667d6e5-c51e-4a95-b653-a126fc69b4bd req-c9625ee8-b049-4f1a-92fe-5eb1728a8083 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:02 np0005465988 podman[278110]: 2025-10-02 12:24:02.045623952 +0000 UTC m=+0.184334633 container start a7bf7efcda56a5cb2d1dc9722a1ed6e4f613056815793ba5a8f35b0a712cda78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.081 2 DEBUG oslo_concurrency.processutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1/disk.config 87ebffd5-69af-414b-be5d-67ba42e8cae1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.286s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.082 2 INFO nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Deleting local config drive /var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1/disk.config because it was imported into RBD.#033[00m
Oct  2 08:24:02 np0005465988 neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79[278143]: [NOTICE]   (278147) : New worker (278150) forked
Oct  2 08:24:02 np0005465988 neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79[278143]: [NOTICE]   (278147) : Loading success.
Oct  2 08:24:02 np0005465988 kernel: tap3bdb6970-48: entered promiscuous mode
Oct  2 08:24:02 np0005465988 NetworkManager[45041]: <info>  [1759407842.1579] manager: (tap3bdb6970-48): new Tun device (/org/freedesktop/NetworkManager/Devices/191)
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:02 np0005465988 systemd-udevd[278084]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:24:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:02Z|00384|binding|INFO|Claiming lport 3bdb6970-487f-4313-ab25-aa900f8b084a for this chassis.
Oct  2 08:24:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:02Z|00385|binding|INFO|3bdb6970-487f-4313-ab25-aa900f8b084a: Claiming fa:16:3e:22:0e:b9 10.100.0.6
Oct  2 08:24:02 np0005465988 NetworkManager[45041]: <info>  [1759407842.1817] device (tap3bdb6970-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:24:02 np0005465988 NetworkManager[45041]: <info>  [1759407842.1837] device (tap3bdb6970-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.186 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:0e:b9 10.100.0.6'], port_security=['fa:16:3e:22:0e:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87ebffd5-69af-414b-be5d-67ba42e8cae1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3bdb6970-487f-4313-ab25-aa900f8b084a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.188 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdb6970-487f-4313-ab25-aa900f8b084a in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff bound to our chassis#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.191 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.206 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c435659d-b4bd-4c67-b189-69b29442a2bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.207 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2c62a66-f1 in ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:24:02 np0005465988 systemd-machined[192594]: New machine qemu-39-instance-00000064.
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.209 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2c62a66-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.209 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[262ebadf-6ea6-4214-bb99-a975b723a1c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.211 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[775aab20-73c3-4f72-990e-42309e76fbe3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 systemd[1]: Started Virtual Machine qemu-39-instance-00000064.
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.234 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[878eda66-3bda-474f-b0a5-fafd2b798781]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.262 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a9c31ee0-b19d-493e-9a7f-107defeeaa8a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:02Z|00386|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a ovn-installed in OVS
Oct  2 08:24:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:02Z|00387|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a up in Southbound
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e288 e288: 3 total, 3 up, 3 in
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.303 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d245cf-c40f-48cd-a866-d2429bbab9b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 NetworkManager[45041]: <info>  [1759407842.3100] manager: (tapb2c62a66-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/192)
Oct  2 08:24:02 np0005465988 systemd-udevd[278169]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.309 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[02dcc587-1b17-4aa5-b25a-9fa951841f34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.355 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[21fec6e8-709f-4cfe-ab6e-ac86d790ed18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.360 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[0277c5f1-a9b1-4044-b155-1dcf84722c2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 NetworkManager[45041]: <info>  [1759407842.3831] device (tapb2c62a66-f0): carrier: link connected
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.388 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a5e8ee-42d0-4b17-a519-8f821b1dd82f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.411 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[614e74a4-d36b-4243-88c7-091655d489ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 119], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590271, 'reachable_time': 20325, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278203, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.427 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4d47a736-aa1e-4484-ba06-dc36cdd14d77]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:7a7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590271, 'tstamp': 590271}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278204, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.448 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[387f94df-14f9-43cd-ad73-2829ade5445d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 119], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590271, 'reachable_time': 20325, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 278205, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.462 2 DEBUG nova.compute.manager [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.463 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407842.4624445, 7b4bdbc9-7451-4500-8794-c8edef50d6a4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.464 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] VM Started (Lifecycle Event)#033[00m
Oct  2 08:24:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e288 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.485 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.492 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.493 2 INFO nova.virt.libvirt.driver [-] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Instance spawned successfully.#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.494 2 DEBUG nova.network.neutron [req-0e787e3c-b9c4-473b-ace0-5364588aa154 req-3410c0d5-8811-42e4-9364-8f277c692122 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updated VIF entry in instance network info cache for port 3bdb6970-487f-4313-ab25-aa900f8b084a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.495 2 DEBUG nova.network.neutron [req-0e787e3c-b9c4-473b-ace0-5364588aa154 req-3410c0d5-8811-42e4-9364-8f277c692122 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating instance_info_cache with network_info: [{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.497 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.502 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.519 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[40cfa4b3-7fea-4a86-b072-6fa1140c040b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.520 2 DEBUG oslo_concurrency.lockutils [req-0e787e3c-b9c4-473b-ace0-5364588aa154 req-3410c0d5-8811-42e4-9364-8f277c692122 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.531 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.531 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.532 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.533 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.534 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.534 2 DEBUG nova.virt.libvirt.driver [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.559 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.560 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407842.4633322, 7b4bdbc9-7451-4500-8794-c8edef50d6a4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.560 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.584 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updating instance_info_cache with network_info: [{"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.593 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.598 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407842.4658763, 7b4bdbc9-7451-4500-8794-c8edef50d6a4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.598 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.601 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[db8d0875-848b-453b-ad17-eecec5ba9cec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.604 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.604 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.604 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.604 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.604 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2c62a66-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:02 np0005465988 kernel: tapb2c62a66-f0: entered promiscuous mode
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:02 np0005465988 NetworkManager[45041]: <info>  [1759407842.6071] manager: (tapb2c62a66-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/193)
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.610 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2c62a66-f0, col_values=(('external_ids', {'iface-id': '8c1234f6-f595-4979-94c0-98da2211f0e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:02Z|00388|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.613 2 INFO nova.compute.manager [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Took 10.29 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.613 2 DEBUG nova.compute.manager [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.621 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.632 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.633 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c56c441c-b84b-405d-9b43-5acb19e0366a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.634 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.636 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:24:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:02.637 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'env', 'PROCESS_TAG=haproxy-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.668 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.687 2 INFO nova.compute.manager [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Took 12.09 seconds to build instance.#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.702 2 DEBUG oslo_concurrency.lockutils [None req-a7c93e4b-27a3-423a-9a72-193efeb31566 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.814 2 DEBUG nova.compute.manager [req-141973e0-f06c-4234-bac3-ef25730c8382 req-8cb5f2bb-0b1f-44d1-9746-f853f3712da6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.815 2 DEBUG oslo_concurrency.lockutils [req-141973e0-f06c-4234-bac3-ef25730c8382 req-8cb5f2bb-0b1f-44d1-9746-f853f3712da6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.816 2 DEBUG oslo_concurrency.lockutils [req-141973e0-f06c-4234-bac3-ef25730c8382 req-8cb5f2bb-0b1f-44d1-9746-f853f3712da6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.816 2 DEBUG oslo_concurrency.lockutils [req-141973e0-f06c-4234-bac3-ef25730c8382 req-8cb5f2bb-0b1f-44d1-9746-f853f3712da6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.816 2 DEBUG nova.compute.manager [req-141973e0-f06c-4234-bac3-ef25730c8382 req-8cb5f2bb-0b1f-44d1-9746-f853f3712da6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] No waiting events found dispatching network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:02 np0005465988 nova_compute[236126]: 2025-10-02 12:24:02.817 2 WARNING nova.compute.manager [req-141973e0-f06c-4234-bac3-ef25730c8382 req-8cb5f2bb-0b1f-44d1-9746-f853f3712da6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received unexpected event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 for instance with vm_state shelved_offloaded and task_state spawning.#033[00m
Oct  2 08:24:03 np0005465988 nova_compute[236126]: 2025-10-02 12:24:03.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:03 np0005465988 podman[278280]: 2025-10-02 12:24:03.081071679 +0000 UTC m=+0.057028931 container create 5c9cfc1746260fcacd606608e7aa1f37987133d7b6060a2ff0ece2370efd32b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:24:03 np0005465988 systemd[1]: Started libpod-conmon-5c9cfc1746260fcacd606608e7aa1f37987133d7b6060a2ff0ece2370efd32b1.scope.
Oct  2 08:24:03 np0005465988 podman[278280]: 2025-10-02 12:24:03.050743007 +0000 UTC m=+0.026700279 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:24:03 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:24:03 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e99aee037ade3170fd0cd66535ad543672f18b74efc9c407a3d4b4e75b30a7fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:24:03 np0005465988 podman[278280]: 2025-10-02 12:24:03.178216293 +0000 UTC m=+0.154173545 container init 5c9cfc1746260fcacd606608e7aa1f37987133d7b6060a2ff0ece2370efd32b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:24:03 np0005465988 podman[278280]: 2025-10-02 12:24:03.184638027 +0000 UTC m=+0.160595279 container start 5c9cfc1746260fcacd606608e7aa1f37987133d7b6060a2ff0ece2370efd32b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:24:03 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[278295]: [NOTICE]   (278299) : New worker (278301) forked
Oct  2 08:24:03 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[278295]: [NOTICE]   (278299) : Loading success.
Oct  2 08:24:03 np0005465988 nova_compute[236126]: 2025-10-02 12:24:03.374 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407843.3735318, 87ebffd5-69af-414b-be5d-67ba42e8cae1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:03 np0005465988 nova_compute[236126]: 2025-10-02 12:24:03.374 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] VM Started (Lifecycle Event)#033[00m
Oct  2 08:24:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:03 np0005465988 nova_compute[236126]: 2025-10-02 12:24:03.395 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000057s ======
Oct  2 08:24:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:03.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Oct  2 08:24:03 np0005465988 nova_compute[236126]: 2025-10-02 12:24:03.398 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407843.3737702, 87ebffd5-69af-414b-be5d-67ba42e8cae1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:03 np0005465988 nova_compute[236126]: 2025-10-02 12:24:03.398 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:24:03 np0005465988 nova_compute[236126]: 2025-10-02 12:24:03.418 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:03 np0005465988 nova_compute[236126]: 2025-10-02 12:24:03.422 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:03 np0005465988 nova_compute[236126]: 2025-10-02 12:24:03.440 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:24:03 np0005465988 nova_compute[236126]: 2025-10-02 12:24:03.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:03.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:03 np0005465988 nova_compute[236126]: 2025-10-02 12:24:03.689 2 DEBUG nova.compute.manager [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:03 np0005465988 nova_compute[236126]: 2025-10-02 12:24:03.763 2 DEBUG oslo_concurrency.lockutils [None req-b7f848b7-5c9b-45f8-b022-9e1b9eaaf757 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 14.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.170 2 DEBUG nova.compute.manager [req-fb6f30dc-73a4-4f02-b0c6-ae6becd7a5f8 req-8e378c1f-35e4-4c1d-9ea8-17209db55420 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Received event network-vif-plugged-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.170 2 DEBUG oslo_concurrency.lockutils [req-fb6f30dc-73a4-4f02-b0c6-ae6becd7a5f8 req-8e378c1f-35e4-4c1d-9ea8-17209db55420 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.170 2 DEBUG oslo_concurrency.lockutils [req-fb6f30dc-73a4-4f02-b0c6-ae6becd7a5f8 req-8e378c1f-35e4-4c1d-9ea8-17209db55420 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.171 2 DEBUG oslo_concurrency.lockutils [req-fb6f30dc-73a4-4f02-b0c6-ae6becd7a5f8 req-8e378c1f-35e4-4c1d-9ea8-17209db55420 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.171 2 DEBUG nova.compute.manager [req-fb6f30dc-73a4-4f02-b0c6-ae6becd7a5f8 req-8e378c1f-35e4-4c1d-9ea8-17209db55420 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] No waiting events found dispatching network-vif-plugged-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.171 2 WARNING nova.compute.manager [req-fb6f30dc-73a4-4f02-b0c6-ae6becd7a5f8 req-8e378c1f-35e4-4c1d-9ea8-17209db55420 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Received unexpected event network-vif-plugged-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb for instance with vm_state active and task_state None.#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.515 2 DEBUG nova.compute.manager [req-d52f5a9f-1479-484b-b477-3238f5b5dd39 req-31b26d69-095f-475e-af1f-64bf469b3686 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.515 2 DEBUG oslo_concurrency.lockutils [req-d52f5a9f-1479-484b-b477-3238f5b5dd39 req-31b26d69-095f-475e-af1f-64bf469b3686 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.515 2 DEBUG oslo_concurrency.lockutils [req-d52f5a9f-1479-484b-b477-3238f5b5dd39 req-31b26d69-095f-475e-af1f-64bf469b3686 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.515 2 DEBUG oslo_concurrency.lockutils [req-d52f5a9f-1479-484b-b477-3238f5b5dd39 req-31b26d69-095f-475e-af1f-64bf469b3686 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.516 2 DEBUG nova.compute.manager [req-d52f5a9f-1479-484b-b477-3238f5b5dd39 req-31b26d69-095f-475e-af1f-64bf469b3686 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Processing event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.516 2 DEBUG nova.compute.manager [req-d52f5a9f-1479-484b-b477-3238f5b5dd39 req-31b26d69-095f-475e-af1f-64bf469b3686 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.516 2 DEBUG oslo_concurrency.lockutils [req-d52f5a9f-1479-484b-b477-3238f5b5dd39 req-31b26d69-095f-475e-af1f-64bf469b3686 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.516 2 DEBUG oslo_concurrency.lockutils [req-d52f5a9f-1479-484b-b477-3238f5b5dd39 req-31b26d69-095f-475e-af1f-64bf469b3686 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.517 2 DEBUG oslo_concurrency.lockutils [req-d52f5a9f-1479-484b-b477-3238f5b5dd39 req-31b26d69-095f-475e-af1f-64bf469b3686 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.517 2 DEBUG nova.compute.manager [req-d52f5a9f-1479-484b-b477-3238f5b5dd39 req-31b26d69-095f-475e-af1f-64bf469b3686 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.517 2 WARNING nova.compute.manager [req-d52f5a9f-1479-484b-b477-3238f5b5dd39 req-31b26d69-095f-475e-af1f-64bf469b3686 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.518 2 DEBUG nova.compute.manager [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.521 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407844.5217645, 87ebffd5-69af-414b-be5d-67ba42e8cae1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.522 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.523 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.526 2 INFO nova.virt.libvirt.driver [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance spawned successfully.#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.527 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.568 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.575 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.580 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.580 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.581 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.582 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.582 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.583 2 DEBUG nova.virt.libvirt.driver [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.610 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.653 2 INFO nova.compute.manager [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Took 13.10 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.653 2 DEBUG nova.compute.manager [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.743 2 INFO nova.compute.manager [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Took 14.36 seconds to build instance.#033[00m
Oct  2 08:24:04 np0005465988 nova_compute[236126]: 2025-10-02 12:24:04.765 2 DEBUG oslo_concurrency.lockutils [None req-a7bb25be-792a-4981-838b-de6dbafa533e 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.452s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:05 np0005465988 nova_compute[236126]: 2025-10-02 12:24:05.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:05 np0005465988 NetworkManager[45041]: <info>  [1759407845.3371] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Oct  2 08:24:05 np0005465988 NetworkManager[45041]: <info>  [1759407845.3380] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Oct  2 08:24:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:05.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:05 np0005465988 nova_compute[236126]: 2025-10-02 12:24:05.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:05Z|00389|binding|INFO|Releasing lport 18276c7d-4e7d-4b5c-a013-87c3ea8e7868 from this chassis (sb_readonly=0)
Oct  2 08:24:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:05Z|00390|binding|INFO|Releasing lport 1befa812-080f-4694-ba8b-9130fe81621d from this chassis (sb_readonly=0)
Oct  2 08:24:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:05Z|00391|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:24:05 np0005465988 nova_compute[236126]: 2025-10-02 12:24:05.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:05.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:05 np0005465988 nova_compute[236126]: 2025-10-02 12:24:05.575 2 DEBUG nova.compute.manager [req-2b7bd0d0-a667-4406-91af-b7a43fbd28a8 req-5c840be5-1aa4-43bf-a379-3de27b8783b7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Received event network-changed-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:05 np0005465988 nova_compute[236126]: 2025-10-02 12:24:05.575 2 DEBUG nova.compute.manager [req-2b7bd0d0-a667-4406-91af-b7a43fbd28a8 req-5c840be5-1aa4-43bf-a379-3de27b8783b7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Refreshing instance network info cache due to event network-changed-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:24:05 np0005465988 nova_compute[236126]: 2025-10-02 12:24:05.576 2 DEBUG oslo_concurrency.lockutils [req-2b7bd0d0-a667-4406-91af-b7a43fbd28a8 req-5c840be5-1aa4-43bf-a379-3de27b8783b7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:05 np0005465988 nova_compute[236126]: 2025-10-02 12:24:05.576 2 DEBUG oslo_concurrency.lockutils [req-2b7bd0d0-a667-4406-91af-b7a43fbd28a8 req-5c840be5-1aa4-43bf-a379-3de27b8783b7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:05 np0005465988 nova_compute[236126]: 2025-10-02 12:24:05.576 2 DEBUG nova.network.neutron [req-2b7bd0d0-a667-4406-91af-b7a43fbd28a8 req-5c840be5-1aa4-43bf-a379-3de27b8783b7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Refreshing network info cache for port 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:24:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e289 e289: 3 total, 3 up, 3 in
Oct  2 08:24:06 np0005465988 nova_compute[236126]: 2025-10-02 12:24:06.496 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:06 np0005465988 nova_compute[236126]: 2025-10-02 12:24:06.498 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:24:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 08:24:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 3001.0 total, 600.0 interval
Cumulative writes: 37K writes, 152K keys, 37K commit groups, 1.0 writes per commit group, ingest: 0.15 GB, 0.05 MB/s
Cumulative WAL: 37K writes, 13K syncs, 2.76 writes per sync, written: 0.15 GB, 0.05 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 12K writes, 48K keys, 12K commit groups, 1.0 writes per commit group, ingest: 48.72 MB, 0.08 MB/s
Interval WAL: 12K writes, 4836 syncs, 2.49 writes per sync, written: 0.05 GB, 0.08 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct  2 08:24:06 np0005465988 nova_compute[236126]: 2025-10-02 12:24:06.742 2 DEBUG nova.network.neutron [req-2b7bd0d0-a667-4406-91af-b7a43fbd28a8 req-5c840be5-1aa4-43bf-a379-3de27b8783b7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Updated VIF entry in instance network info cache for port 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:24:06 np0005465988 nova_compute[236126]: 2025-10-02 12:24:06.743 2 DEBUG nova.network.neutron [req-2b7bd0d0-a667-4406-91af-b7a43fbd28a8 req-5c840be5-1aa4-43bf-a379-3de27b8783b7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Updating instance_info_cache with network_info: [{"id": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "address": "fa:16:3e:6d:91:e2", "network": {"id": "4035a600-4a5e-41ee-a619-d81e2c993b79", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2080271662-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10fff81da7a54740a53a0771ce916329", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d6e67d8-8c", "ovs_interfaceid": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:06 np0005465988 nova_compute[236126]: 2025-10-02 12:24:06.766 2 DEBUG oslo_concurrency.lockutils [req-2b7bd0d0-a667-4406-91af-b7a43fbd28a8 req-5c840be5-1aa4-43bf-a379-3de27b8783b7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:07.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e289 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:24:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:07.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:07 np0005465988 nova_compute[236126]: 2025-10-02 12:24:07.698 2 DEBUG nova.compute.manager [req-6b329f94-1512-465c-b4a6-9bf81f75e18f req-5a0c6c3f-59da-4527-aa9e-efff6fb4887d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-changed-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:07 np0005465988 nova_compute[236126]: 2025-10-02 12:24:07.699 2 DEBUG nova.compute.manager [req-6b329f94-1512-465c-b4a6-9bf81f75e18f req-5a0c6c3f-59da-4527-aa9e-efff6fb4887d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Refreshing instance network info cache due to event network-changed-3bdb6970-487f-4313-ab25-aa900f8b084a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:24:07 np0005465988 nova_compute[236126]: 2025-10-02 12:24:07.700 2 DEBUG oslo_concurrency.lockutils [req-6b329f94-1512-465c-b4a6-9bf81f75e18f req-5a0c6c3f-59da-4527-aa9e-efff6fb4887d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:07 np0005465988 nova_compute[236126]: 2025-10-02 12:24:07.700 2 DEBUG oslo_concurrency.lockutils [req-6b329f94-1512-465c-b4a6-9bf81f75e18f req-5a0c6c3f-59da-4527-aa9e-efff6fb4887d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:07 np0005465988 nova_compute[236126]: 2025-10-02 12:24:07.700 2 DEBUG nova.network.neutron [req-6b329f94-1512-465c-b4a6-9bf81f75e18f req-5a0c6c3f-59da-4527-aa9e-efff6fb4887d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Refreshing network info cache for port 3bdb6970-487f-4313-ab25-aa900f8b084a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:24:08 np0005465988 nova_compute[236126]: 2025-10-02 12:24:08.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:09 np0005465988 nova_compute[236126]: 2025-10-02 12:24:09.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:09.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:09.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.028 2 DEBUG nova.objects.instance [None req-b5ace6ea-c0d1-472d-8b24-f34f6ead0747 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'pci_devices' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.051 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407850.0514379, 3a4d32fc-bed8-4e11-9033-5b73501128fe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.052 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.070 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.090 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.109 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.526 2 DEBUG nova.network.neutron [req-6b329f94-1512-465c-b4a6-9bf81f75e18f req-5a0c6c3f-59da-4527-aa9e-efff6fb4887d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updated VIF entry in instance network info cache for port 3bdb6970-487f-4313-ab25-aa900f8b084a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.527 2 DEBUG nova.network.neutron [req-6b329f94-1512-465c-b4a6-9bf81f75e18f req-5a0c6c3f-59da-4527-aa9e-efff6fb4887d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating instance_info_cache with network_info: [{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.549 2 DEBUG oslo_concurrency.lockutils [req-6b329f94-1512-465c-b4a6-9bf81f75e18f req-5a0c6c3f-59da-4527-aa9e-efff6fb4887d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:10 np0005465988 kernel: tap7b92f05d-cc (unregistering): left promiscuous mode
Oct  2 08:24:10 np0005465988 NetworkManager[45041]: <info>  [1759407850.5630] device (tap7b92f05d-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:24:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:10Z|00392|binding|INFO|Releasing lport 7b92f05d-cce2-48f0-a124-0408773ce275 from this chassis (sb_readonly=0)
Oct  2 08:24:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:10Z|00393|binding|INFO|Setting lport 7b92f05d-cce2-48f0-a124-0408773ce275 down in Southbound
Oct  2 08:24:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:10Z|00394|binding|INFO|Removing iface tap7b92f05d-cc ovn-installed in OVS
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.597 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:0c:a5 10.100.0.11'], port_security=['fa:16:3e:80:0c:a5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3a4d32fc-bed8-4e11-9033-5b73501128fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5f671ae-bb65-4932-84ce-cef4210e4599', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c740a14d1c5c45d1a0959b0e24ac460b', 'neutron:revision_number': '9', 'neutron:security_group_ids': '0cf8cc6d-2482-45e4-b576-7d811b75025a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5feaa29-5f25-4f45-a24f-ce44451fb322, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7b92f05d-cce2-48f0-a124-0408773ce275) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.599 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7b92f05d-cce2-48f0-a124-0408773ce275 in datapath d5f671ae-bb65-4932-84ce-cef4210e4599 unbound from our chassis#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.601 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d5f671ae-bb65-4932-84ce-cef4210e4599, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.605 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b192bdc0-901d-4b29-960b-b77513eaac60]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.607 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 namespace which is not needed anymore#033[00m
Oct  2 08:24:10 np0005465988 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Oct  2 08:24:10 np0005465988 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000005a.scope: Consumed 10.652s CPU time.
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:10 np0005465988 systemd-machined[192594]: Machine qemu-37-instance-0000005a terminated.
Oct  2 08:24:10 np0005465988 podman[278321]: 2025-10-02 12:24:10.691600932 +0000 UTC m=+0.092031359 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:24:10 np0005465988 kernel: tap7b92f05d-cc: entered promiscuous mode
Oct  2 08:24:10 np0005465988 NetworkManager[45041]: <info>  [1759407850.7188] manager: (tap7b92f05d-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/196)
Oct  2 08:24:10 np0005465988 systemd-udevd[278348]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:24:10 np0005465988 podman[278317]: 2025-10-02 12:24:10.720201354 +0000 UTC m=+0.117845621 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:24:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:10Z|00395|binding|INFO|Claiming lport 7b92f05d-cce2-48f0-a124-0408773ce275 for this chassis.
Oct  2 08:24:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:10Z|00396|binding|INFO|7b92f05d-cce2-48f0-a124-0408773ce275: Claiming fa:16:3e:80:0c:a5 10.100.0.11
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:10 np0005465988 kernel: tap7b92f05d-cc (unregistering): left promiscuous mode
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.733 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:0c:a5 10.100.0.11'], port_security=['fa:16:3e:80:0c:a5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3a4d32fc-bed8-4e11-9033-5b73501128fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5f671ae-bb65-4932-84ce-cef4210e4599', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c740a14d1c5c45d1a0959b0e24ac460b', 'neutron:revision_number': '9', 'neutron:security_group_ids': '0cf8cc6d-2482-45e4-b576-7d811b75025a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5feaa29-5f25-4f45-a24f-ce44451fb322, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7b92f05d-cce2-48f0-a124-0408773ce275) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:10 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[277862]: [NOTICE]   (277866) : haproxy version is 2.8.14-c23fe91
Oct  2 08:24:10 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[277862]: [NOTICE]   (277866) : path to executable is /usr/sbin/haproxy
Oct  2 08:24:10 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[277862]: [WARNING]  (277866) : Exiting Master process...
Oct  2 08:24:10 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[277862]: [ALERT]    (277866) : Current worker (277868) exited with code 143 (Terminated)
Oct  2 08:24:10 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[277862]: [WARNING]  (277866) : All workers exited. Exiting... (0)
Oct  2 08:24:10 np0005465988 systemd[1]: libpod-9282b115facb0a81f91ffc0d6621878dc9eedccff1b87012acbd4ceeffa54438.scope: Deactivated successfully.
Oct  2 08:24:10 np0005465988 podman[278388]: 2025-10-02 12:24:10.753596585 +0000 UTC m=+0.053122949 container died 9282b115facb0a81f91ffc0d6621878dc9eedccff1b87012acbd4ceeffa54438 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.754 2 DEBUG nova.compute.manager [None req-b5ace6ea-c0d1-472d-8b24-f34f6ead0747 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:10Z|00397|binding|INFO|Setting lport 7b92f05d-cce2-48f0-a124-0408773ce275 ovn-installed in OVS
Oct  2 08:24:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:10Z|00398|binding|INFO|Setting lport 7b92f05d-cce2-48f0-a124-0408773ce275 up in Southbound
Oct  2 08:24:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:10Z|00399|binding|INFO|Releasing lport 7b92f05d-cce2-48f0-a124-0408773ce275 from this chassis (sb_readonly=1)
Oct  2 08:24:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:10Z|00400|if_status|INFO|Dropped 2 log messages in last 231 seconds (most recently, 231 seconds ago) due to excessive rate
Oct  2 08:24:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:10Z|00401|if_status|INFO|Not setting lport 7b92f05d-cce2-48f0-a124-0408773ce275 down as sb is readonly
Oct  2 08:24:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:10Z|00402|binding|INFO|Removing iface tap7b92f05d-cc ovn-installed in OVS
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:10Z|00403|binding|INFO|Releasing lport 7b92f05d-cce2-48f0-a124-0408773ce275 from this chassis (sb_readonly=0)
Oct  2 08:24:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:10Z|00404|binding|INFO|Setting lport 7b92f05d-cce2-48f0-a124-0408773ce275 down in Southbound
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.774 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:0c:a5 10.100.0.11'], port_security=['fa:16:3e:80:0c:a5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3a4d32fc-bed8-4e11-9033-5b73501128fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5f671ae-bb65-4932-84ce-cef4210e4599', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c740a14d1c5c45d1a0959b0e24ac460b', 'neutron:revision_number': '9', 'neutron:security_group_ids': '0cf8cc6d-2482-45e4-b576-7d811b75025a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5feaa29-5f25-4f45-a24f-ce44451fb322, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7b92f05d-cce2-48f0-a124-0408773ce275) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:10 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9282b115facb0a81f91ffc0d6621878dc9eedccff1b87012acbd4ceeffa54438-userdata-shm.mount: Deactivated successfully.
Oct  2 08:24:10 np0005465988 systemd[1]: var-lib-containers-storage-overlay-8fb22b3706838831b1effe3b467fdd1dd8508f4374ae9b309fb302e97bb1a94f-merged.mount: Deactivated successfully.
Oct  2 08:24:10 np0005465988 podman[278320]: 2025-10-02 12:24:10.792118833 +0000 UTC m=+0.191208631 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:24:10 np0005465988 podman[278388]: 2025-10-02 12:24:10.802142392 +0000 UTC m=+0.101668746 container cleanup 9282b115facb0a81f91ffc0d6621878dc9eedccff1b87012acbd4ceeffa54438 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:24:10 np0005465988 systemd[1]: libpod-conmon-9282b115facb0a81f91ffc0d6621878dc9eedccff1b87012acbd4ceeffa54438.scope: Deactivated successfully.
Oct  2 08:24:10 np0005465988 podman[278429]: 2025-10-02 12:24:10.862817037 +0000 UTC m=+0.040179737 container remove 9282b115facb0a81f91ffc0d6621878dc9eedccff1b87012acbd4ceeffa54438 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.869 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[58940b8b-059f-4e06-af20-0d104c9843e5]: (4, ('Thu Oct  2 12:24:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 (9282b115facb0a81f91ffc0d6621878dc9eedccff1b87012acbd4ceeffa54438)\n9282b115facb0a81f91ffc0d6621878dc9eedccff1b87012acbd4ceeffa54438\nThu Oct  2 12:24:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 (9282b115facb0a81f91ffc0d6621878dc9eedccff1b87012acbd4ceeffa54438)\n9282b115facb0a81f91ffc0d6621878dc9eedccff1b87012acbd4ceeffa54438\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.871 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[18df46e7-cf53-4162-9380-0587dcbecbaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.872 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5f671ae-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:10 np0005465988 kernel: tapd5f671ae-b0: left promiscuous mode
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.895 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9222d64f-38c0-4291-8320-e36cf6f2c648]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.904 2 DEBUG nova.compute.manager [req-14c21796-c51a-490e-8a1c-7d2c0214ad99 req-cb817c26-b78d-4bd6-bc61-61d4454d3018 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-unplugged-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.904 2 DEBUG oslo_concurrency.lockutils [req-14c21796-c51a-490e-8a1c-7d2c0214ad99 req-cb817c26-b78d-4bd6-bc61-61d4454d3018 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.904 2 DEBUG oslo_concurrency.lockutils [req-14c21796-c51a-490e-8a1c-7d2c0214ad99 req-cb817c26-b78d-4bd6-bc61-61d4454d3018 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.904 2 DEBUG oslo_concurrency.lockutils [req-14c21796-c51a-490e-8a1c-7d2c0214ad99 req-cb817c26-b78d-4bd6-bc61-61d4454d3018 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.905 2 DEBUG nova.compute.manager [req-14c21796-c51a-490e-8a1c-7d2c0214ad99 req-cb817c26-b78d-4bd6-bc61-61d4454d3018 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] No waiting events found dispatching network-vif-unplugged-7b92f05d-cce2-48f0-a124-0408773ce275 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:10 np0005465988 nova_compute[236126]: 2025-10-02 12:24:10.905 2 WARNING nova.compute.manager [req-14c21796-c51a-490e-8a1c-7d2c0214ad99 req-cb817c26-b78d-4bd6-bc61-61d4454d3018 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received unexpected event network-vif-unplugged-7b92f05d-cce2-48f0-a124-0408773ce275 for instance with vm_state suspended and task_state None.#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.917 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[995d2712-7576-4856-a1ad-8abc9ca5f148]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.918 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dc9c6fd8-f3c2-4f73-9135-fb782b5c1227]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.933 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e117971a-6d7c-4c3e-b1fb-61e6d82e5d76]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589904, 'reachable_time': 20291, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278448, 'error': None, 'target': 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:10 np0005465988 systemd[1]: run-netns-ovnmeta\x2dd5f671ae\x2dbb65\x2d4932\x2d84ce\x2dcef4210e4599.mount: Deactivated successfully.
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.935 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.935 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[3be3b5a2-08bb-4a8d-aa5a-4e0a4b6058a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.937 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7b92f05d-cce2-48f0-a124-0408773ce275 in datapath d5f671ae-bb65-4932-84ce-cef4210e4599 unbound from our chassis#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.939 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d5f671ae-bb65-4932-84ce-cef4210e4599, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.940 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6b8be07c-bd9c-434e-adae-078c95f01d7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.940 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7b92f05d-cce2-48f0-a124-0408773ce275 in datapath d5f671ae-bb65-4932-84ce-cef4210e4599 unbound from our chassis#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.942 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d5f671ae-bb65-4932-84ce-cef4210e4599, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:24:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:10.942 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[82024ee9-80f5-40fe-b67f-08af331cec99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:11.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:11.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e290 e290: 3 total, 3 up, 3 in
Oct  2 08:24:12 np0005465988 nova_compute[236126]: 2025-10-02 12:24:12.472 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:24:12 np0005465988 nova_compute[236126]: 2025-10-02 12:24:12.794 2 INFO nova.compute.manager [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Resuming#033[00m
Oct  2 08:24:12 np0005465988 nova_compute[236126]: 2025-10-02 12:24:12.796 2 DEBUG nova.objects.instance [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'flavor' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:12 np0005465988 nova_compute[236126]: 2025-10-02 12:24:12.840 2 DEBUG oslo_concurrency.lockutils [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:12 np0005465988 nova_compute[236126]: 2025-10-02 12:24:12.841 2 DEBUG oslo_concurrency.lockutils [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquired lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:12 np0005465988 nova_compute[236126]: 2025-10-02 12:24:12.841 2 DEBUG nova.network.neutron [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:24:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:24:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:24:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.075 2 DEBUG nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.075 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.076 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.076 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.076 2 DEBUG nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] No waiting events found dispatching network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.077 2 WARNING nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received unexpected event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.077 2 DEBUG nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.078 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.078 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.078 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.079 2 DEBUG nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] No waiting events found dispatching network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.079 2 WARNING nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received unexpected event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.080 2 DEBUG nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.080 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.081 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.081 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.082 2 DEBUG nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] No waiting events found dispatching network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.082 2 WARNING nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received unexpected event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.083 2 DEBUG nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-unplugged-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.083 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.084 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.084 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.085 2 DEBUG nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] No waiting events found dispatching network-vif-unplugged-7b92f05d-cce2-48f0-a124-0408773ce275 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.085 2 WARNING nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received unexpected event network-vif-unplugged-7b92f05d-cce2-48f0-a124-0408773ce275 for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.086 2 DEBUG nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.086 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.087 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.087 2 DEBUG oslo_concurrency.lockutils [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.087 2 DEBUG nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] No waiting events found dispatching network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.088 2 WARNING nova.compute.manager [req-cc9627d1-b5d5-4726-b4d2-5d125bdffe20 req-bf4ecbcf-2ff3-4dcd-ad07-3ca20cf2e3f7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received unexpected event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 08:24:13 np0005465988 nova_compute[236126]: 2025-10-02 12:24:13.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:13.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:13.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.875 2 DEBUG nova.network.neutron [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updating instance_info_cache with network_info: [{"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.922 2 DEBUG oslo_concurrency.lockutils [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Releasing lock "refresh_cache-3a4d32fc-bed8-4e11-9033-5b73501128fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.928 2 DEBUG nova.virt.libvirt.vif [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:21:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-757604995',display_name='tempest-ServersNegativeTestJSON-server-757604995',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-757604995',id=90,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c740a14d1c5c45d1a0959b0e24ac460b',ramdisk_id='',reservation_id='r-1qzpbgvk',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServersNegativeTestJSON-462972452',owner_user_name='tempest-ServersNegativeTestJSON-462972452-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:24:10Z,user_data=None,user_id='4146a31af09c4e6a8aee251f2fec4f98',uuid=3a4d32fc-bed8-4e11-9033-5b73501128fe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.928 2 DEBUG nova.network.os_vif_util [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Converting VIF {"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.929 2 DEBUG nova.network.os_vif_util [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.929 2 DEBUG os_vif [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.930 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.931 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.936 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b92f05d-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.936 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b92f05d-cc, col_values=(('external_ids', {'iface-id': '7b92f05d-cce2-48f0-a124-0408773ce275', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:0c:a5', 'vm-uuid': '3a4d32fc-bed8-4e11-9033-5b73501128fe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.936 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.937 2 INFO os_vif [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc')#033[00m
Oct  2 08:24:14 np0005465988 nova_compute[236126]: 2025-10-02 12:24:14.982 2 DEBUG nova.objects.instance [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'numa_topology' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:15 np0005465988 kernel: tap7b92f05d-cc: entered promiscuous mode
Oct  2 08:24:15 np0005465988 NetworkManager[45041]: <info>  [1759407855.0709] manager: (tap7b92f05d-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/197)
Oct  2 08:24:15 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:15Z|00405|binding|INFO|Claiming lport 7b92f05d-cce2-48f0-a124-0408773ce275 for this chassis.
Oct  2 08:24:15 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:15Z|00406|binding|INFO|7b92f05d-cce2-48f0-a124-0408773ce275: Claiming fa:16:3e:80:0c:a5 10.100.0.11
Oct  2 08:24:15 np0005465988 nova_compute[236126]: 2025-10-02 12:24:15.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.085 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:0c:a5 10.100.0.11'], port_security=['fa:16:3e:80:0c:a5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3a4d32fc-bed8-4e11-9033-5b73501128fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5f671ae-bb65-4932-84ce-cef4210e4599', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c740a14d1c5c45d1a0959b0e24ac460b', 'neutron:revision_number': '12', 'neutron:security_group_ids': '0cf8cc6d-2482-45e4-b576-7d811b75025a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5feaa29-5f25-4f45-a24f-ce44451fb322, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7b92f05d-cce2-48f0-a124-0408773ce275) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.087 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7b92f05d-cce2-48f0-a124-0408773ce275 in datapath d5f671ae-bb65-4932-84ce-cef4210e4599 bound to our chassis#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.091 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d5f671ae-bb65-4932-84ce-cef4210e4599#033[00m
Oct  2 08:24:15 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:15Z|00407|binding|INFO|Setting lport 7b92f05d-cce2-48f0-a124-0408773ce275 ovn-installed in OVS
Oct  2 08:24:15 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:15Z|00408|binding|INFO|Setting lport 7b92f05d-cce2-48f0-a124-0408773ce275 up in Southbound
Oct  2 08:24:15 np0005465988 nova_compute[236126]: 2025-10-02 12:24:15.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:15 np0005465988 nova_compute[236126]: 2025-10-02 12:24:15.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.113 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8e6f2184-92b8-4574-81f5-75f379e86a3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.115 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd5f671ae-b1 in ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.119 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd5f671ae-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.120 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[104fa586-b5ca-4c20-a7bf-19cb67282d19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.121 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bfdfdac1-52c4-409a-a21f-7e7e36a89b53]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 systemd-machined[192594]: New machine qemu-40-instance-0000005a.
Oct  2 08:24:15 np0005465988 systemd[1]: Started Virtual Machine qemu-40-instance-0000005a.
Oct  2 08:24:15 np0005465988 systemd-udevd[278602]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.152 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[08771de6-b4e7-44a0-8314-9808508e58ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 NetworkManager[45041]: <info>  [1759407855.1741] device (tap7b92f05d-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:24:15 np0005465988 NetworkManager[45041]: <info>  [1759407855.1753] device (tap7b92f05d-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.183 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f984b278-ebca-4dae-8568-cbab10b935b8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.222 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2191ee56-3e3e-46e3-b399-6d042540b455]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.229 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5562aede-a7bc-4304-832a-93a11dbe82a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 NetworkManager[45041]: <info>  [1759407855.2301] manager: (tapd5f671ae-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/198)
Oct  2 08:24:15 np0005465988 systemd-udevd[278605]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.276 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[74662050-e2d1-40a8-8eaf-09c7f32b0f2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.280 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[da317c48-904e-46de-99df-12821a941df4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 NetworkManager[45041]: <info>  [1759407855.3175] device (tapd5f671ae-b0): carrier: link connected
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.324 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9808f8f5-6cfb-414b-8563-07ac3f766431]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.346 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d62cc6d7-0a86-47d9-b89e-6555c313cc50]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd5f671ae-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:3a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 122], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591564, 'reachable_time': 44730, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278633, 'error': None, 'target': 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.366 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2b4180df-84e2-4b0a-84e2-1efc14044b52]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:3a86'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591564, 'tstamp': 591564}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278634, 'error': None, 'target': 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.388 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5ce5342e-fcef-4483-afe0-fcca676f3c95]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd5f671ae-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:3a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 122], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591564, 'reachable_time': 44730, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 278635, 'error': None, 'target': 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:15.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.425 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[be24df77-ccf7-47a0-a58a-851c2ae4601b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.498 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[434fb2fc-a801-4c9d-b09a-6f3787a6c755]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.500 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5f671ae-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.500 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.500 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd5f671ae-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:15 np0005465988 nova_compute[236126]: 2025-10-02 12:24:15.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:15 np0005465988 kernel: tapd5f671ae-b0: entered promiscuous mode
Oct  2 08:24:15 np0005465988 NetworkManager[45041]: <info>  [1759407855.5029] manager: (tapd5f671ae-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/199)
Oct  2 08:24:15 np0005465988 nova_compute[236126]: 2025-10-02 12:24:15.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.506 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd5f671ae-b0, col_values=(('external_ids', {'iface-id': '18276c7d-4e7d-4b5c-a013-87c3ea8e7868'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:15 np0005465988 nova_compute[236126]: 2025-10-02 12:24:15.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:15 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:15Z|00409|binding|INFO|Releasing lport 18276c7d-4e7d-4b5c-a013-87c3ea8e7868 from this chassis (sb_readonly=0)
Oct  2 08:24:15 np0005465988 nova_compute[236126]: 2025-10-02 12:24:15.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.525 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5f671ae-bb65-4932-84ce-cef4210e4599.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5f671ae-bb65-4932-84ce-cef4210e4599.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.526 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[066c413e-7459-4e1f-863e-8dfac4fe93c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.527 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-d5f671ae-bb65-4932-84ce-cef4210e4599
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/d5f671ae-bb65-4932-84ce-cef4210e4599.pid.haproxy
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID d5f671ae-bb65-4932-84ce-cef4210e4599
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:24:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:15.529 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'env', 'PROCESS_TAG=haproxy-d5f671ae-bb65-4932-84ce-cef4210e4599', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d5f671ae-bb65-4932-84ce-cef4210e4599.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:24:15 np0005465988 nova_compute[236126]: 2025-10-02 12:24:15.562 2 DEBUG nova.compute.manager [req-1c9d3ee1-2dfa-4747-a4c4-4e6eb45aac48 req-495a740e-1fb6-4e06-87e9-0db020750bd8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:15 np0005465988 nova_compute[236126]: 2025-10-02 12:24:15.562 2 DEBUG oslo_concurrency.lockutils [req-1c9d3ee1-2dfa-4747-a4c4-4e6eb45aac48 req-495a740e-1fb6-4e06-87e9-0db020750bd8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:15 np0005465988 nova_compute[236126]: 2025-10-02 12:24:15.563 2 DEBUG oslo_concurrency.lockutils [req-1c9d3ee1-2dfa-4747-a4c4-4e6eb45aac48 req-495a740e-1fb6-4e06-87e9-0db020750bd8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:15 np0005465988 nova_compute[236126]: 2025-10-02 12:24:15.563 2 DEBUG oslo_concurrency.lockutils [req-1c9d3ee1-2dfa-4747-a4c4-4e6eb45aac48 req-495a740e-1fb6-4e06-87e9-0db020750bd8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:15 np0005465988 nova_compute[236126]: 2025-10-02 12:24:15.563 2 DEBUG nova.compute.manager [req-1c9d3ee1-2dfa-4747-a4c4-4e6eb45aac48 req-495a740e-1fb6-4e06-87e9-0db020750bd8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] No waiting events found dispatching network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:15 np0005465988 nova_compute[236126]: 2025-10-02 12:24:15.563 2 WARNING nova.compute.manager [req-1c9d3ee1-2dfa-4747-a4c4-4e6eb45aac48 req-495a740e-1fb6-4e06-87e9-0db020750bd8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received unexpected event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 08:24:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:15.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:15 np0005465988 podman[278709]: 2025-10-02 12:24:15.94339393 +0000 UTC m=+0.061118589 container create e62c5226c12b8edb0ef56c6fa1122cb9b16d02616626d086006ccb36f1c5e91d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:24:15 np0005465988 systemd[1]: Started libpod-conmon-e62c5226c12b8edb0ef56c6fa1122cb9b16d02616626d086006ccb36f1c5e91d.scope.
Oct  2 08:24:16 np0005465988 podman[278709]: 2025-10-02 12:24:15.910993558 +0000 UTC m=+0.028718237 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:24:16 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:24:16 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e00563669224245f64be1ff0d4e9939d049a1abf12af8ea90acf18abf4f90f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:24:16 np0005465988 podman[278709]: 2025-10-02 12:24:16.037513218 +0000 UTC m=+0.155237887 container init e62c5226c12b8edb0ef56c6fa1122cb9b16d02616626d086006ccb36f1c5e91d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:24:16 np0005465988 podman[278709]: 2025-10-02 12:24:16.04523443 +0000 UTC m=+0.162959079 container start e62c5226c12b8edb0ef56c6fa1122cb9b16d02616626d086006ccb36f1c5e91d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  2 08:24:16 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[278725]: [NOTICE]   (278729) : New worker (278731) forked
Oct  2 08:24:16 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[278725]: [NOTICE]   (278729) : Loading success.
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.256 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 3a4d32fc-bed8-4e11-9033-5b73501128fe due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.257 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407856.2559319, 3a4d32fc-bed8-4e11-9033-5b73501128fe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.257 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] VM Started (Lifecycle Event)#033[00m
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.280 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.285 2 DEBUG nova.compute.manager [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.285 2 DEBUG nova.objects.instance [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'pci_devices' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.288 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.310 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.310 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407856.259967, 3a4d32fc-bed8-4e11-9033-5b73501128fe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.311 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.317 2 INFO nova.virt.libvirt.driver [-] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Instance running successfully.#033[00m
Oct  2 08:24:16 np0005465988 virtqemud[235689]: argument unsupported: QEMU guest agent is not configured
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.322 2 DEBUG nova.virt.libvirt.guest [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.322 2 DEBUG nova.compute.manager [None req-aadad0dc-fa7a-418a-a58f-3827e1ab99b8 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.337 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.343 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:16 np0005465988 nova_compute[236126]: 2025-10-02 12:24:16.372 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Oct  2 08:24:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:16Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6d:91:e2 10.100.0.10
Oct  2 08:24:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:16Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6d:91:e2 10.100.0.10
Oct  2 08:24:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:17.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:24:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:17.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:17 np0005465988 nova_compute[236126]: 2025-10-02 12:24:17.690 2 DEBUG nova.compute.manager [req-39a4fe3e-2b19-4471-a37b-9f25477cfa84 req-58f13228-748e-4f71-b8e2-2ee1c2beb391 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:17 np0005465988 nova_compute[236126]: 2025-10-02 12:24:17.691 2 DEBUG oslo_concurrency.lockutils [req-39a4fe3e-2b19-4471-a37b-9f25477cfa84 req-58f13228-748e-4f71-b8e2-2ee1c2beb391 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:17 np0005465988 nova_compute[236126]: 2025-10-02 12:24:17.693 2 DEBUG oslo_concurrency.lockutils [req-39a4fe3e-2b19-4471-a37b-9f25477cfa84 req-58f13228-748e-4f71-b8e2-2ee1c2beb391 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:17 np0005465988 nova_compute[236126]: 2025-10-02 12:24:17.693 2 DEBUG oslo_concurrency.lockutils [req-39a4fe3e-2b19-4471-a37b-9f25477cfa84 req-58f13228-748e-4f71-b8e2-2ee1c2beb391 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:17 np0005465988 nova_compute[236126]: 2025-10-02 12:24:17.694 2 DEBUG nova.compute.manager [req-39a4fe3e-2b19-4471-a37b-9f25477cfa84 req-58f13228-748e-4f71-b8e2-2ee1c2beb391 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] No waiting events found dispatching network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:17 np0005465988 nova_compute[236126]: 2025-10-02 12:24:17.694 2 WARNING nova.compute.manager [req-39a4fe3e-2b19-4471-a37b-9f25477cfa84 req-58f13228-748e-4f71-b8e2-2ee1c2beb391 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received unexpected event network-vif-plugged-7b92f05d-cce2-48f0-a124-0408773ce275 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:24:18 np0005465988 nova_compute[236126]: 2025-10-02 12:24:18.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:18Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:22:0e:b9 10.100.0.6
Oct  2 08:24:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:18Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:0e:b9 10.100.0.6
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #85. Immutable memtables: 0.
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.268345) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 51] Flushing memtable with next log file: 85
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407859268424, "job": 51, "event": "flush_started", "num_memtables": 1, "num_entries": 560, "num_deletes": 253, "total_data_size": 755924, "memory_usage": 767336, "flush_reason": "Manual Compaction"}
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 51] Level-0 flush table #86: started
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407859274531, "cf_name": "default", "job": 51, "event": "table_file_creation", "file_number": 86, "file_size": 497678, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 44258, "largest_seqno": 44813, "table_properties": {"data_size": 494701, "index_size": 949, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7504, "raw_average_key_size": 19, "raw_value_size": 488510, "raw_average_value_size": 1295, "num_data_blocks": 41, "num_entries": 377, "num_filter_entries": 377, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759407838, "oldest_key_time": 1759407838, "file_creation_time": 1759407859, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 86, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 51] Flush lasted 6214 microseconds, and 3239 cpu microseconds.
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.274581) [db/flush_job.cc:967] [default] [JOB 51] Level-0 flush table #86: 497678 bytes OK
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.274607) [db/memtable_list.cc:519] [default] Level-0 commit table #86 started
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.276706) [db/memtable_list.cc:722] [default] Level-0 commit table #86: memtable #1 done
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.276725) EVENT_LOG_v1 {"time_micros": 1759407859276719, "job": 51, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.276751) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 51] Try to delete WAL files size 752651, prev total WAL file size 752651, number of live WAL files 2.
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000082.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.277686) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033353134' seq:72057594037927935, type:22 .. '7061786F730033373636' seq:0, type:0; will stop at (end)
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 52] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 51 Base level 0, inputs: [86(486KB)], [84(10MB)]
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407859277767, "job": 52, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [86], "files_L6": [84], "score": -1, "input_data_size": 11311926, "oldest_snapshot_seqno": -1}
Oct  2 08:24:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:19Z|00410|binding|INFO|Releasing lport 18276c7d-4e7d-4b5c-a013-87c3ea8e7868 from this chassis (sb_readonly=0)
Oct  2 08:24:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:19Z|00411|binding|INFO|Releasing lport 1befa812-080f-4694-ba8b-9130fe81621d from this chassis (sb_readonly=0)
Oct  2 08:24:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:19Z|00412|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 52] Generated table #87: 6695 keys, 9424538 bytes, temperature: kUnknown
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407859327842, "cf_name": "default", "job": 52, "event": "table_file_creation", "file_number": 87, "file_size": 9424538, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9380259, "index_size": 26439, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16773, "raw_key_size": 173829, "raw_average_key_size": 25, "raw_value_size": 9260762, "raw_average_value_size": 1383, "num_data_blocks": 1042, "num_entries": 6695, "num_filter_entries": 6695, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759407859, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 87, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.328197) [db/compaction/compaction_job.cc:1663] [default] [JOB 52] Compacted 1@0 + 1@6 files to L6 => 9424538 bytes
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.329563) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 225.4 rd, 187.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 10.3 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(41.7) write-amplify(18.9) OK, records in: 7217, records dropped: 522 output_compression: NoCompression
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.329587) EVENT_LOG_v1 {"time_micros": 1759407859329575, "job": 52, "event": "compaction_finished", "compaction_time_micros": 50186, "compaction_time_cpu_micros": 32265, "output_level": 6, "num_output_files": 1, "total_output_size": 9424538, "num_input_records": 7217, "num_output_records": 6695, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000086.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407859329918, "job": 52, "event": "table_file_deletion", "file_number": 86}
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000084.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759407859332538, "job": 52, "event": "table_file_deletion", "file_number": 84}
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.277556) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.332623) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.332662) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.332666) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.332669) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:24:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:24:19.332672) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:24:19 np0005465988 nova_compute[236126]: 2025-10-02 12:24:19.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:19 np0005465988 nova_compute[236126]: 2025-10-02 12:24:19.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:19.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:19.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:20 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:24:20 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:24:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:21.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:21.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:21 np0005465988 podman[278817]: 2025-10-02 12:24:21.710277585 +0000 UTC m=+0.076009987 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:24:21 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:21Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:80:0c:a5 10.100.0.11
Oct  2 08:24:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:24:23 np0005465988 nova_compute[236126]: 2025-10-02 12:24:23.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:23.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:23.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:24 np0005465988 nova_compute[236126]: 2025-10-02 12:24:24.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:25.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:25.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:27.353 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:27.354 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:27.355 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:27 np0005465988 nova_compute[236126]: 2025-10-02 12:24:27.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:27.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:24:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:27.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:24:27 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3511363623' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:24:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:24:27 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3511363623' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:24:28 np0005465988 nova_compute[236126]: 2025-10-02 12:24:28.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:29 np0005465988 nova_compute[236126]: 2025-10-02 12:24:29.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:29.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:29.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:29 np0005465988 nova_compute[236126]: 2025-10-02 12:24:29.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.103 2 DEBUG oslo_concurrency.lockutils [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.103 2 DEBUG oslo_concurrency.lockutils [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.104 2 DEBUG oslo_concurrency.lockutils [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.104 2 DEBUG oslo_concurrency.lockutils [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.105 2 DEBUG oslo_concurrency.lockutils [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.107 2 INFO nova.compute.manager [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Terminating instance#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.109 2 DEBUG nova.compute.manager [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:24:30 np0005465988 kernel: tap7b92f05d-cc (unregistering): left promiscuous mode
Oct  2 08:24:30 np0005465988 NetworkManager[45041]: <info>  [1759407870.1687] device (tap7b92f05d-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:24:30 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:30Z|00413|binding|INFO|Releasing lport 7b92f05d-cce2-48f0-a124-0408773ce275 from this chassis (sb_readonly=0)
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:30 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:30Z|00414|binding|INFO|Setting lport 7b92f05d-cce2-48f0-a124-0408773ce275 down in Southbound
Oct  2 08:24:30 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:30Z|00415|binding|INFO|Removing iface tap7b92f05d-cc ovn-installed in OVS
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:30.195 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:0c:a5 10.100.0.11'], port_security=['fa:16:3e:80:0c:a5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3a4d32fc-bed8-4e11-9033-5b73501128fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5f671ae-bb65-4932-84ce-cef4210e4599', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c740a14d1c5c45d1a0959b0e24ac460b', 'neutron:revision_number': '13', 'neutron:security_group_ids': '0cf8cc6d-2482-45e4-b576-7d811b75025a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5feaa29-5f25-4f45-a24f-ce44451fb322, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7b92f05d-cce2-48f0-a124-0408773ce275) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:30.197 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7b92f05d-cce2-48f0-a124-0408773ce275 in datapath d5f671ae-bb65-4932-84ce-cef4210e4599 unbound from our chassis#033[00m
Oct  2 08:24:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:30.200 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d5f671ae-bb65-4932-84ce-cef4210e4599, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:24:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:30.202 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c0ca19d0-173b-4811-a885-444acc1867ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:30.203 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 namespace which is not needed anymore#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:30 np0005465988 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Oct  2 08:24:30 np0005465988 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000005a.scope: Consumed 5.698s CPU time.
Oct  2 08:24:30 np0005465988 systemd-machined[192594]: Machine qemu-40-instance-0000005a terminated.
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.347 2 INFO nova.virt.libvirt.driver [-] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Instance destroyed successfully.#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.348 2 DEBUG nova.objects.instance [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lazy-loading 'resources' on Instance uuid 3a4d32fc-bed8-4e11-9033-5b73501128fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.363 2 DEBUG nova.virt.libvirt.vif [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:21:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-757604995',display_name='tempest-ServersNegativeTestJSON-server-757604995',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-757604995',id=90,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c740a14d1c5c45d1a0959b0e24ac460b',ramdisk_id='',reservation_id='r-1qzpbgvk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-462972452',owner_user_name='tempest-ServersNegativeTestJSON-462972452-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:24:16Z,user_data=None,user_id='4146a31af09c4e6a8aee251f2fec4f98',uuid=3a4d32fc-bed8-4e11-9033-5b73501128fe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.363 2 DEBUG nova.network.os_vif_util [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Converting VIF {"id": "7b92f05d-cce2-48f0-a124-0408773ce275", "address": "fa:16:3e:80:0c:a5", "network": {"id": "d5f671ae-bb65-4932-84ce-cef4210e4599", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-212680174-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c740a14d1c5c45d1a0959b0e24ac460b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b92f05d-cc", "ovs_interfaceid": "7b92f05d-cce2-48f0-a124-0408773ce275", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.364 2 DEBUG nova.network.os_vif_util [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.364 2 DEBUG os_vif [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.367 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b92f05d-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:30 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[278725]: [NOTICE]   (278729) : haproxy version is 2.8.14-c23fe91
Oct  2 08:24:30 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[278725]: [NOTICE]   (278729) : path to executable is /usr/sbin/haproxy
Oct  2 08:24:30 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[278725]: [WARNING]  (278729) : Exiting Master process...
Oct  2 08:24:30 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[278725]: [WARNING]  (278729) : Exiting Master process...
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.375 2 INFO os_vif [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:0c:a5,bridge_name='br-int',has_traffic_filtering=True,id=7b92f05d-cce2-48f0-a124-0408773ce275,network=Network(d5f671ae-bb65-4932-84ce-cef4210e4599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b92f05d-cc')#033[00m
Oct  2 08:24:30 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[278725]: [ALERT]    (278729) : Current worker (278731) exited with code 143 (Terminated)
Oct  2 08:24:30 np0005465988 neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599[278725]: [WARNING]  (278729) : All workers exited. Exiting... (0)
Oct  2 08:24:30 np0005465988 systemd[1]: libpod-e62c5226c12b8edb0ef56c6fa1122cb9b16d02616626d086006ccb36f1c5e91d.scope: Deactivated successfully.
Oct  2 08:24:30 np0005465988 podman[278887]: 2025-10-02 12:24:30.387586974 +0000 UTC m=+0.070437647 container died e62c5226c12b8edb0ef56c6fa1122cb9b16d02616626d086006ccb36f1c5e91d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:24:30 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e62c5226c12b8edb0ef56c6fa1122cb9b16d02616626d086006ccb36f1c5e91d-userdata-shm.mount: Deactivated successfully.
Oct  2 08:24:30 np0005465988 systemd[1]: var-lib-containers-storage-overlay-3e00563669224245f64be1ff0d4e9939d049a1abf12af8ea90acf18abf4f90f2-merged.mount: Deactivated successfully.
Oct  2 08:24:30 np0005465988 podman[278887]: 2025-10-02 12:24:30.441235257 +0000 UTC m=+0.124085920 container cleanup e62c5226c12b8edb0ef56c6fa1122cb9b16d02616626d086006ccb36f1c5e91d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:24:30 np0005465988 systemd[1]: libpod-conmon-e62c5226c12b8edb0ef56c6fa1122cb9b16d02616626d086006ccb36f1c5e91d.scope: Deactivated successfully.
Oct  2 08:24:30 np0005465988 podman[278947]: 2025-10-02 12:24:30.507015489 +0000 UTC m=+0.042145423 container remove e62c5226c12b8edb0ef56c6fa1122cb9b16d02616626d086006ccb36f1c5e91d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:24:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:30.514 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b0a30315-9701-40b5-aa67-8c51dd1240c7]: (4, ('Thu Oct  2 12:24:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 (e62c5226c12b8edb0ef56c6fa1122cb9b16d02616626d086006ccb36f1c5e91d)\ne62c5226c12b8edb0ef56c6fa1122cb9b16d02616626d086006ccb36f1c5e91d\nThu Oct  2 12:24:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 (e62c5226c12b8edb0ef56c6fa1122cb9b16d02616626d086006ccb36f1c5e91d)\ne62c5226c12b8edb0ef56c6fa1122cb9b16d02616626d086006ccb36f1c5e91d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:30.516 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ec6ee456-7a07-4c8a-9734-d1298348ad78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:30.517 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5f671ae-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:30 np0005465988 kernel: tapd5f671ae-b0: left promiscuous mode
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:30.536 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fe118ab0-f588-4872-9d8e-8cc28f15cc3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:30.564 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d99ab63f-9901-497c-8db0-feda823214c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:30.565 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c264373a-1b6a-4634-999f-ad014255f7d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:30.580 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[47bc68c5-d3f1-4e29-8dde-aeb1ed0e4486]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591554, 'reachable_time': 21069, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278962, 'error': None, 'target': 'ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:30.582 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d5f671ae-bb65-4932-84ce-cef4210e4599 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:24:30 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:30.583 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[29ee5cf4-8228-4dd5-8cd3-407f4e9a0b20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:30 np0005465988 systemd[1]: run-netns-ovnmeta\x2dd5f671ae\x2dbb65\x2d4932\x2d84ce\x2dcef4210e4599.mount: Deactivated successfully.
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.918 2 INFO nova.virt.libvirt.driver [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Deleting instance files /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe_del#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.919 2 INFO nova.virt.libvirt.driver [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Deletion of /var/lib/nova/instances/3a4d32fc-bed8-4e11-9033-5b73501128fe_del complete#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.983 2 INFO nova.compute.manager [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Took 0.87 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.984 2 DEBUG oslo.service.loopingcall [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.984 2 DEBUG nova.compute.manager [-] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:24:30 np0005465988 nova_compute[236126]: 2025-10-02 12:24:30.984 2 DEBUG nova.network.neutron [-] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:24:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:31.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:24:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:31.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:24:31 np0005465988 nova_compute[236126]: 2025-10-02 12:24:31.762 2 DEBUG nova.network.neutron [-] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:31 np0005465988 nova_compute[236126]: 2025-10-02 12:24:31.850 2 INFO nova.compute.manager [-] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Took 0.87 seconds to deallocate network for instance.#033[00m
Oct  2 08:24:31 np0005465988 nova_compute[236126]: 2025-10-02 12:24:31.856 2 DEBUG nova.compute.manager [req-8cc15f82-95e8-43fb-b661-1799a4c3d678 req-e51cfebe-9b8e-4fcc-963d-f3b34529a5c4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Received event network-vif-deleted-7b92f05d-cce2-48f0-a124-0408773ce275 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:31 np0005465988 nova_compute[236126]: 2025-10-02 12:24:31.857 2 INFO nova.compute.manager [req-8cc15f82-95e8-43fb-b661-1799a4c3d678 req-e51cfebe-9b8e-4fcc-963d-f3b34529a5c4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Neutron deleted interface 7b92f05d-cce2-48f0-a124-0408773ce275; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:24:31 np0005465988 nova_compute[236126]: 2025-10-02 12:24:31.857 2 DEBUG nova.network.neutron [req-8cc15f82-95e8-43fb-b661-1799a4c3d678 req-e51cfebe-9b8e-4fcc-963d-f3b34529a5c4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:31 np0005465988 nova_compute[236126]: 2025-10-02 12:24:31.901 2 DEBUG nova.compute.manager [req-8cc15f82-95e8-43fb-b661-1799a4c3d678 req-e51cfebe-9b8e-4fcc-963d-f3b34529a5c4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Detach interface failed, port_id=7b92f05d-cce2-48f0-a124-0408773ce275, reason: Instance 3a4d32fc-bed8-4e11-9033-5b73501128fe could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:24:31 np0005465988 nova_compute[236126]: 2025-10-02 12:24:31.941 2 DEBUG oslo_concurrency.lockutils [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:31 np0005465988 nova_compute[236126]: 2025-10-02 12:24:31.942 2 DEBUG oslo_concurrency.lockutils [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:32 np0005465988 nova_compute[236126]: 2025-10-02 12:24:32.073 2 DEBUG oslo_concurrency.processutils [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:24:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:24:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2167894051' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:24:32 np0005465988 nova_compute[236126]: 2025-10-02 12:24:32.556 2 DEBUG oslo_concurrency.processutils [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:32 np0005465988 nova_compute[236126]: 2025-10-02 12:24:32.565 2 DEBUG nova.compute.provider_tree [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:24:32 np0005465988 nova_compute[236126]: 2025-10-02 12:24:32.582 2 DEBUG nova.scheduler.client.report [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:24:32 np0005465988 nova_compute[236126]: 2025-10-02 12:24:32.603 2 DEBUG oslo_concurrency.lockutils [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.661s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:32 np0005465988 nova_compute[236126]: 2025-10-02 12:24:32.636 2 INFO nova.scheduler.client.report [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Deleted allocations for instance 3a4d32fc-bed8-4e11-9033-5b73501128fe#033[00m
Oct  2 08:24:32 np0005465988 nova_compute[236126]: 2025-10-02 12:24:32.719 2 DEBUG oslo_concurrency.lockutils [None req-a5198259-b27c-46ff-a36d-55d273eb2611 4146a31af09c4e6a8aee251f2fec4f98 c740a14d1c5c45d1a0959b0e24ac460b - - default default] Lock "3a4d32fc-bed8-4e11-9033-5b73501128fe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:33 np0005465988 nova_compute[236126]: 2025-10-02 12:24:33.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:33.164 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:33 np0005465988 nova_compute[236126]: 2025-10-02 12:24:33.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:33.166 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:24:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:33.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:33.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:34 np0005465988 nova_compute[236126]: 2025-10-02 12:24:34.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:35 np0005465988 nova_compute[236126]: 2025-10-02 12:24:35.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:35.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:35.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:36Z|00416|binding|INFO|Releasing lport 1befa812-080f-4694-ba8b-9130fe81621d from this chassis (sb_readonly=0)
Oct  2 08:24:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:36Z|00417|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:24:36 np0005465988 nova_compute[236126]: 2025-10-02 12:24:36.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:37.168 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:37.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:24:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:37.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:37 np0005465988 nova_compute[236126]: 2025-10-02 12:24:37.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:38 np0005465988 nova_compute[236126]: 2025-10-02 12:24:38.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:39.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:39.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:40 np0005465988 nova_compute[236126]: 2025-10-02 12:24:40.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:41.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:41 np0005465988 podman[278993]: 2025-10-02 12:24:41.607526372 +0000 UTC m=+0.123379570 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:24:41 np0005465988 podman[278994]: 2025-10-02 12:24:41.608259283 +0000 UTC m=+0.110919522 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:24:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:41.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:41 np0005465988 podman[278992]: 2025-10-02 12:24:41.646265556 +0000 UTC m=+0.159332594 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:24:42 np0005465988 nova_compute[236126]: 2025-10-02 12:24:42.439 2 DEBUG oslo_concurrency.lockutils [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:42 np0005465988 nova_compute[236126]: 2025-10-02 12:24:42.440 2 DEBUG oslo_concurrency.lockutils [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:42 np0005465988 nova_compute[236126]: 2025-10-02 12:24:42.441 2 INFO nova.compute.manager [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Rebooting instance#033[00m
Oct  2 08:24:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:24:42 np0005465988 nova_compute[236126]: 2025-10-02 12:24:42.570 2 DEBUG oslo_concurrency.lockutils [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:42 np0005465988 nova_compute[236126]: 2025-10-02 12:24:42.571 2 DEBUG oslo_concurrency.lockutils [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquired lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:42 np0005465988 nova_compute[236126]: 2025-10-02 12:24:42.571 2 DEBUG nova.network.neutron [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:24:43 np0005465988 nova_compute[236126]: 2025-10-02 12:24:43.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:43.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:43.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:45 np0005465988 nova_compute[236126]: 2025-10-02 12:24:45.346 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407870.3446894, 3a4d32fc-bed8-4e11-9033-5b73501128fe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:45 np0005465988 nova_compute[236126]: 2025-10-02 12:24:45.347 2 INFO nova.compute.manager [-] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:24:45 np0005465988 nova_compute[236126]: 2025-10-02 12:24:45.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:45 np0005465988 nova_compute[236126]: 2025-10-02 12:24:45.384 2 DEBUG nova.compute.manager [None req-6f041241-9e0e-48b6-b46a-7d13fa308035 - - - - - -] [instance: 3a4d32fc-bed8-4e11-9033-5b73501128fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:45.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:24:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:45.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:24:45 np0005465988 nova_compute[236126]: 2025-10-02 12:24:45.805 2 DEBUG nova.network.neutron [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating instance_info_cache with network_info: [{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:45 np0005465988 nova_compute[236126]: 2025-10-02 12:24:45.846 2 DEBUG oslo_concurrency.lockutils [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Releasing lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:45 np0005465988 nova_compute[236126]: 2025-10-02 12:24:45.849 2 DEBUG nova.compute.manager [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:46 np0005465988 kernel: tap3bdb6970-48 (unregistering): left promiscuous mode
Oct  2 08:24:46 np0005465988 NetworkManager[45041]: <info>  [1759407886.3502] device (tap3bdb6970-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:46Z|00418|binding|INFO|Releasing lport 3bdb6970-487f-4313-ab25-aa900f8b084a from this chassis (sb_readonly=0)
Oct  2 08:24:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:46Z|00419|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a down in Southbound
Oct  2 08:24:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:46Z|00420|binding|INFO|Removing iface tap3bdb6970-48 ovn-installed in OVS
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:46.427 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:0e:b9 10.100.0.6'], port_security=['fa:16:3e:22:0e:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87ebffd5-69af-414b-be5d-67ba42e8cae1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.182'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3bdb6970-487f-4313-ab25-aa900f8b084a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:46.429 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdb6970-487f-4313-ab25-aa900f8b084a in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff unbound from our chassis#033[00m
Oct  2 08:24:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:46.432 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:24:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:46.434 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b67d7e32-a9c4-417c-854d-238133dc2d82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:46.435 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff namespace which is not needed anymore#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:46 np0005465988 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000064.scope: Deactivated successfully.
Oct  2 08:24:46 np0005465988 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000064.scope: Consumed 15.193s CPU time.
Oct  2 08:24:46 np0005465988 systemd-machined[192594]: Machine qemu-39-instance-00000064 terminated.
Oct  2 08:24:46 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[278295]: [NOTICE]   (278299) : haproxy version is 2.8.14-c23fe91
Oct  2 08:24:46 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[278295]: [NOTICE]   (278299) : path to executable is /usr/sbin/haproxy
Oct  2 08:24:46 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[278295]: [WARNING]  (278299) : Exiting Master process...
Oct  2 08:24:46 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[278295]: [ALERT]    (278299) : Current worker (278301) exited with code 143 (Terminated)
Oct  2 08:24:46 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[278295]: [WARNING]  (278299) : All workers exited. Exiting... (0)
Oct  2 08:24:46 np0005465988 systemd[1]: libpod-5c9cfc1746260fcacd606608e7aa1f37987133d7b6060a2ff0ece2370efd32b1.scope: Deactivated successfully.
Oct  2 08:24:46 np0005465988 podman[279127]: 2025-10-02 12:24:46.606845658 +0000 UTC m=+0.058079432 container died 5c9cfc1746260fcacd606608e7aa1f37987133d7b6060a2ff0ece2370efd32b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:24:46 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c9cfc1746260fcacd606608e7aa1f37987133d7b6060a2ff0ece2370efd32b1-userdata-shm.mount: Deactivated successfully.
Oct  2 08:24:46 np0005465988 systemd[1]: var-lib-containers-storage-overlay-e99aee037ade3170fd0cd66535ad543672f18b74efc9c407a3d4b4e75b30a7fc-merged.mount: Deactivated successfully.
Oct  2 08:24:46 np0005465988 podman[279127]: 2025-10-02 12:24:46.648425194 +0000 UTC m=+0.099658928 container cleanup 5c9cfc1746260fcacd606608e7aa1f37987133d7b6060a2ff0ece2370efd32b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.659 2 INFO nova.virt.libvirt.driver [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance destroyed successfully.#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.660 2 DEBUG nova.objects.instance [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'resources' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:46 np0005465988 systemd[1]: libpod-conmon-5c9cfc1746260fcacd606608e7aa1f37987133d7b6060a2ff0ece2370efd32b1.scope: Deactivated successfully.
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.711 2 DEBUG nova.virt.libvirt.vif [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:24:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.712 2 DEBUG nova.network.os_vif_util [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.713 2 DEBUG nova.network.os_vif_util [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.713 2 DEBUG os_vif [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.716 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bdb6970-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.723 2 INFO os_vif [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48')#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.731 2 DEBUG nova.virt.libvirt.driver [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Start _get_guest_xml network_info=[{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.737 2 WARNING nova.virt.libvirt.driver [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.743 2 DEBUG nova.virt.libvirt.host [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.744 2 DEBUG nova.virt.libvirt.host [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.749 2 DEBUG nova.virt.libvirt.host [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.750 2 DEBUG nova.virt.libvirt.host [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:24:46 np0005465988 podman[279164]: 2025-10-02 12:24:46.751084968 +0000 UTC m=+0.055648142 container remove 5c9cfc1746260fcacd606608e7aa1f37987133d7b6060a2ff0ece2370efd32b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.751 2 DEBUG nova.virt.libvirt.driver [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.752 2 DEBUG nova.virt.hardware [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.753 2 DEBUG nova.virt.hardware [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.753 2 DEBUG nova.virt.hardware [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.754 2 DEBUG nova.virt.hardware [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.754 2 DEBUG nova.virt.hardware [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.754 2 DEBUG nova.virt.hardware [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.755 2 DEBUG nova.virt.hardware [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.755 2 DEBUG nova.virt.hardware [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.755 2 DEBUG nova.virt.hardware [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.756 2 DEBUG nova.virt.hardware [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.756 2 DEBUG nova.virt.hardware [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.757 2 DEBUG nova.objects.instance [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:46.762 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c65004-e4ec-4572-8e83-690caaf9e700]: (4, ('Thu Oct  2 12:24:46 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff (5c9cfc1746260fcacd606608e7aa1f37987133d7b6060a2ff0ece2370efd32b1)\n5c9cfc1746260fcacd606608e7aa1f37987133d7b6060a2ff0ece2370efd32b1\nThu Oct  2 12:24:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff (5c9cfc1746260fcacd606608e7aa1f37987133d7b6060a2ff0ece2370efd32b1)\n5c9cfc1746260fcacd606608e7aa1f37987133d7b6060a2ff0ece2370efd32b1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.762 2 DEBUG nova.compute.manager [req-db316f51-53d8-46fc-a203-4703b6032c04 req-70e9bef5-0320-4428-81c7-80cfe251be86 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.763 2 DEBUG oslo_concurrency.lockutils [req-db316f51-53d8-46fc-a203-4703b6032c04 req-70e9bef5-0320-4428-81c7-80cfe251be86 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.764 2 DEBUG oslo_concurrency.lockutils [req-db316f51-53d8-46fc-a203-4703b6032c04 req-70e9bef5-0320-4428-81c7-80cfe251be86 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:46.764 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c076ff32-9185-412b-bd3d-46d35610bd72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.764 2 DEBUG oslo_concurrency.lockutils [req-db316f51-53d8-46fc-a203-4703b6032c04 req-70e9bef5-0320-4428-81c7-80cfe251be86 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.764 2 DEBUG nova.compute.manager [req-db316f51-53d8-46fc-a203-4703b6032c04 req-70e9bef5-0320-4428-81c7-80cfe251be86 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:46.765 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.765 2 WARNING nova.compute.manager [req-db316f51-53d8-46fc-a203-4703b6032c04 req-70e9bef5-0320-4428-81c7-80cfe251be86 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state active and task_state reboot_started_hard.#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.778 2 DEBUG oslo_concurrency.processutils [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:46 np0005465988 kernel: tapb2c62a66-f0: left promiscuous mode
Oct  2 08:24:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:46.791 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[12d59840-a1d8-4881-83a9-a5064071d85c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:46.819 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[55f0e8d5-c687-497a-8a6b-ee752edcc9e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:46.821 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[179e340c-0706-4ed1-9812-af8d420b7eec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:46 np0005465988 nova_compute[236126]: 2025-10-02 12:24:46.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:46.846 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d3551dfc-dd1e-4290-9c7a-8bb481d33171]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590262, 'reachable_time': 27372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279181, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:46 np0005465988 systemd[1]: run-netns-ovnmeta\x2db2c62a66\x2df9bc\x2d4a45\x2da843\x2daef2e12a7fff.mount: Deactivated successfully.
Oct  2 08:24:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:46.854 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:24:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:46.855 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[8062a6ed-321f-49ab-9f83-666206a7de76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:24:47 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3330506762' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.240 2 DEBUG oslo_concurrency.processutils [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.300 2 DEBUG oslo_concurrency.processutils [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:47.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.507 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.508 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.535 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.536 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.537 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.537 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.538 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:47.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:24:47 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1133758493' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.806 2 DEBUG oslo_concurrency.processutils [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.809 2 DEBUG nova.virt.libvirt.vif [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:24:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.810 2 DEBUG nova.network.os_vif_util [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.811 2 DEBUG nova.network.os_vif_util [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.813 2 DEBUG nova.objects.instance [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.841 2 DEBUG nova.virt.libvirt.driver [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  <uuid>87ebffd5-69af-414b-be5d-67ba42e8cae1</uuid>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  <name>instance-00000064</name>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerActionsTestJSON-server-131502281</nova:name>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:24:46</nova:creationTime>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <nova:user uuid="2bd16d1f5f9d4eb396c474eedee67165">tempest-ServerActionsTestJSON-842270816-project-member</nova:user>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <nova:project uuid="4b8ca48cb5f64ef3b0736b8be82378b8">tempest-ServerActionsTestJSON-842270816</nova:project>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <nova:port uuid="3bdb6970-487f-4313-ab25-aa900f8b084a">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <entry name="serial">87ebffd5-69af-414b-be5d-67ba42e8cae1</entry>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <entry name="uuid">87ebffd5-69af-414b-be5d-67ba42e8cae1</entry>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/87ebffd5-69af-414b-be5d-67ba42e8cae1_disk">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/87ebffd5-69af-414b-be5d-67ba42e8cae1_disk.config">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:22:0e:b9"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <target dev="tap3bdb6970-48"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1/console.log" append="off"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <input type="keyboard" bus="usb"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:24:47 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:24:47 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:24:47 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:24:47 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.844 2 DEBUG nova.virt.libvirt.driver [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.845 2 DEBUG nova.virt.libvirt.driver [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.847 2 DEBUG nova.virt.libvirt.vif [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:24:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.848 2 DEBUG nova.network.os_vif_util [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.849 2 DEBUG nova.network.os_vif_util [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.849 2 DEBUG os_vif [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.851 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.852 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.857 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3bdb6970-48, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.858 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3bdb6970-48, col_values=(('external_ids', {'iface-id': '3bdb6970-487f-4313-ab25-aa900f8b084a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:0e:b9', 'vm-uuid': '87ebffd5-69af-414b-be5d-67ba42e8cae1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:47 np0005465988 NetworkManager[45041]: <info>  [1759407887.8626] manager: (tap3bdb6970-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/200)
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.874 2 INFO os_vif [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48')#033[00m
Oct  2 08:24:47 np0005465988 kernel: tap3bdb6970-48: entered promiscuous mode
Oct  2 08:24:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:47Z|00421|binding|INFO|Claiming lport 3bdb6970-487f-4313-ab25-aa900f8b084a for this chassis.
Oct  2 08:24:47 np0005465988 systemd-udevd[279107]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:24:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:47Z|00422|binding|INFO|3bdb6970-487f-4313-ab25-aa900f8b084a: Claiming fa:16:3e:22:0e:b9 10.100.0.6
Oct  2 08:24:47 np0005465988 NetworkManager[45041]: <info>  [1759407887.9837] manager: (tap3bdb6970-48): new Tun device (/org/freedesktop/NetworkManager/Devices/201)
Oct  2 08:24:47 np0005465988 nova_compute[236126]: 2025-10-02 12:24:47.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:47 np0005465988 NetworkManager[45041]: <info>  [1759407887.9990] device (tap3bdb6970-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:24:48 np0005465988 NetworkManager[45041]: <info>  [1759407888.0005] device (tap3bdb6970-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:24:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:24:48 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2262425354' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.003 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:0e:b9 10.100.0.6'], port_security=['fa:16:3e:22:0e:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87ebffd5-69af-414b-be5d-67ba42e8cae1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '5', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3bdb6970-487f-4313-ab25-aa900f8b084a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.004 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdb6970-487f-4313-ab25-aa900f8b084a in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff bound to our chassis#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.006 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff#033[00m
Oct  2 08:24:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:48Z|00423|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a ovn-installed in OVS
Oct  2 08:24:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:48Z|00424|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a up in Southbound
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.025 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6d4775e9-5fd3-460d-aaf5-aca44cdae4f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.026 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2c62a66-f1 in ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.028 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2c62a66-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.029 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[96948d12-36b4-49a5-8dd5-6427dc74c5cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.029 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a829d0a2-fd3c-4022-882e-fe8fb1c00d37]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.036 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:48 np0005465988 systemd-machined[192594]: New machine qemu-41-instance-00000064.
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.052 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[a3222517-43e4-4346-b6fd-5fb2ca53f6d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 systemd[1]: Started Virtual Machine qemu-41-instance-00000064.
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.075 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1d531129-ec67-4c8e-9c01-a51563d3f0c6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.118 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8110713e-0451-4e38-a792-0549b4383636]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 NetworkManager[45041]: <info>  [1759407888.1282] manager: (tapb2c62a66-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/202)
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.127 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3e3e8330-67ab-47a9-a51d-7540d0676c6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.177 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d1625050-dca1-4059-8dad-ee8206a164e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.181 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ce029496-141e-4d47-a89c-0f7401ca2d95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 NetworkManager[45041]: <info>  [1759407888.2138] device (tapb2c62a66-f0): carrier: link connected
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.222 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ef0f4a3e-99d2-4249-a4c9-69427f06ee39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.241 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000065 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.241 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000065 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.245 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.246 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.246 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bab682cf-e67a-4354-b44e-4e12c557cfba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 594854, 'reachable_time': 20202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279312, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.277 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ea4d18db-77c0-4961-b5d7-7b383b40ead5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:7a7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594854, 'tstamp': 594854}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 279313, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.310 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8fb2fec3-8d55-44cf-b820-43e0e2b49700]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 594854, 'reachable_time': 20202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 279314, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.358 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1fd5224f-641e-43a2-8758-a76a2e1dde98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.446 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[62f10fb3-425e-4d7d-bc3f-19a42139f4ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.447 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.448 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.448 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2c62a66-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:48 np0005465988 kernel: tapb2c62a66-f0: entered promiscuous mode
Oct  2 08:24:48 np0005465988 NetworkManager[45041]: <info>  [1759407888.4518] manager: (tapb2c62a66-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.457 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2c62a66-f0, col_values=(('external_ids', {'iface-id': '8c1234f6-f595-4979-94c0-98da2211f0e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:24:48Z|00425|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.478 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.479 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4203MB free_disk=20.851573944091797GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.479 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.480 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.488 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.489 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[518ceca7-a3da-4d01-93ec-8f5f6868edb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.490 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:24:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:24:48.491 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'env', 'PROCESS_TAG=haproxy-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.613 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 87ebffd5-69af-414b-be5d-67ba42e8cae1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.614 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 7b4bdbc9-7451-4500-8794-c8edef50d6a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.614 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.614 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.633 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.738 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.738 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.757 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.774 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:24:48 np0005465988 nova_compute[236126]: 2025-10-02 12:24:48.816 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:48 np0005465988 podman[279382]: 2025-10-02 12:24:48.959771245 +0000 UTC m=+0.074861964 container create 637f0602e46583a8d287a6846c710a2242ab345d9f903948f11995ff49549b91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:24:49 np0005465988 systemd[1]: Started libpod-conmon-637f0602e46583a8d287a6846c710a2242ab345d9f903948f11995ff49549b91.scope.
Oct  2 08:24:49 np0005465988 podman[279382]: 2025-10-02 12:24:48.922178124 +0000 UTC m=+0.037268863 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:24:49 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:24:49 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2efd9f58a7616a3f75728e3989e5b1348449c05afbd040c8d7c636e813e50925/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:24:49 np0005465988 podman[279382]: 2025-10-02 12:24:49.081422025 +0000 UTC m=+0.196512734 container init 637f0602e46583a8d287a6846c710a2242ab345d9f903948f11995ff49549b91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:24:49 np0005465988 podman[279382]: 2025-10-02 12:24:49.09270755 +0000 UTC m=+0.207798259 container start 637f0602e46583a8d287a6846c710a2242ab345d9f903948f11995ff49549b91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:24:49 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[279414]: [NOTICE]   (279427) : New worker (279429) forked
Oct  2 08:24:49 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[279414]: [NOTICE]   (279427) : Loading success.
Oct  2 08:24:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:24:49 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2525089251' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.359 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.367 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.386 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.410 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.411 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.931s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:49.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.515 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 87ebffd5-69af-414b-be5d-67ba42e8cae1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.515 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407889.514247, 87ebffd5-69af-414b-be5d-67ba42e8cae1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.516 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.519 2 DEBUG nova.compute.manager [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.526 2 INFO nova.virt.libvirt.driver [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance rebooted successfully.#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.526 2 DEBUG nova.compute.manager [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.594 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.599 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.615 2 DEBUG oslo_concurrency.lockutils [None req-c81da2d6-ca8d-46fc-9175-87888a78d06f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 7.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.626 2 DEBUG nova.compute.manager [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.627 2 DEBUG oslo_concurrency.lockutils [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.627 2 DEBUG oslo_concurrency.lockutils [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.628 2 DEBUG oslo_concurrency.lockutils [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.628 2 DEBUG nova.compute.manager [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:49.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.629 2 WARNING nova.compute.manager [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state active and task_state None.#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.630 2 DEBUG nova.compute.manager [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.630 2 DEBUG oslo_concurrency.lockutils [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.631 2 DEBUG oslo_concurrency.lockutils [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.631 2 DEBUG oslo_concurrency.lockutils [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.632 2 DEBUG nova.compute.manager [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.632 2 WARNING nova.compute.manager [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state active and task_state None.#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.633 2 DEBUG nova.compute.manager [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.633 2 DEBUG oslo_concurrency.lockutils [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.633 2 DEBUG oslo_concurrency.lockutils [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.634 2 DEBUG oslo_concurrency.lockutils [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.634 2 DEBUG nova.compute.manager [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.635 2 WARNING nova.compute.manager [req-5463c4e0-aa07-47e2-a9cc-783e1025fa50 req-481f14f9-5e04-4b97-9447-4635bf75028b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state active and task_state None.#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.637 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407889.5190032, 87ebffd5-69af-414b-be5d-67ba42e8cae1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.637 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] VM Started (Lifecycle Event)#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.656 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:49 np0005465988 nova_compute[236126]: 2025-10-02 12:24:49.661 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:50 np0005465988 nova_compute[236126]: 2025-10-02 12:24:50.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:51.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:51.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:52 np0005465988 nova_compute[236126]: 2025-10-02 12:24:52.377 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:52 np0005465988 nova_compute[236126]: 2025-10-02 12:24:52.378 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:52 np0005465988 nova_compute[236126]: 2025-10-02 12:24:52.379 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:24:52 np0005465988 nova_compute[236126]: 2025-10-02 12:24:52.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:24:52 np0005465988 podman[279441]: 2025-10-02 12:24:52.55678779 +0000 UTC m=+0.076679226 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:24:52 np0005465988 nova_compute[236126]: 2025-10-02 12:24:52.664 2 INFO nova.compute.manager [None req-96523129-6568-4751-a71e-9a4e92f17bf7 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Get console output#033[00m
Oct  2 08:24:52 np0005465988 nova_compute[236126]: 2025-10-02 12:24:52.674 2 INFO oslo.privsep.daemon [None req-96523129-6568-4751-a71e-9a4e92f17bf7 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpreixioc8/privsep.sock']#033[00m
Oct  2 08:24:52 np0005465988 nova_compute[236126]: 2025-10-02 12:24:52.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:53 np0005465988 nova_compute[236126]: 2025-10-02 12:24:53.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:53 np0005465988 nova_compute[236126]: 2025-10-02 12:24:53.397 2 INFO oslo.privsep.daemon [None req-96523129-6568-4751-a71e-9a4e92f17bf7 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 08:24:53 np0005465988 nova_compute[236126]: 2025-10-02 12:24:53.264 15591 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 08:24:53 np0005465988 nova_compute[236126]: 2025-10-02 12:24:53.270 15591 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 08:24:53 np0005465988 nova_compute[236126]: 2025-10-02 12:24:53.274 15591 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct  2 08:24:53 np0005465988 nova_compute[236126]: 2025-10-02 12:24:53.275 15591 INFO oslo.privsep.daemon [-] privsep daemon running as pid 15591#033[00m
Oct  2 08:24:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:53.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:53 np0005465988 nova_compute[236126]: 2025-10-02 12:24:53.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:53 np0005465988 nova_compute[236126]: 2025-10-02 12:24:53.497 15591 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:24:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:53.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:54 np0005465988 nova_compute[236126]: 2025-10-02 12:24:54.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:54 np0005465988 nova_compute[236126]: 2025-10-02 12:24:54.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:55.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:55.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:57 np0005465988 nova_compute[236126]: 2025-10-02 12:24:57.266 2 DEBUG oslo_concurrency.lockutils [None req-921bee37-cc4c-40cf-8eaf-7ede77a10a84 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:57 np0005465988 nova_compute[236126]: 2025-10-02 12:24:57.268 2 DEBUG oslo_concurrency.lockutils [None req-921bee37-cc4c-40cf-8eaf-7ede77a10a84 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:57 np0005465988 nova_compute[236126]: 2025-10-02 12:24:57.269 2 DEBUG nova.compute.manager [None req-921bee37-cc4c-40cf-8eaf-7ede77a10a84 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:57 np0005465988 nova_compute[236126]: 2025-10-02 12:24:57.274 2 DEBUG nova.compute.manager [None req-921bee37-cc4c-40cf-8eaf-7ede77a10a84 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Oct  2 08:24:57 np0005465988 nova_compute[236126]: 2025-10-02 12:24:57.276 2 DEBUG nova.objects.instance [None req-921bee37-cc4c-40cf-8eaf-7ede77a10a84 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'flavor' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:57 np0005465988 nova_compute[236126]: 2025-10-02 12:24:57.314 2 DEBUG nova.virt.libvirt.driver [None req-921bee37-cc4c-40cf-8eaf-7ede77a10a84 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:24:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:24:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:57.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:24:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:24:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:57.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:57 np0005465988 nova_compute[236126]: 2025-10-02 12:24:57.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:58 np0005465988 nova_compute[236126]: 2025-10-02 12:24:58.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:58 np0005465988 nova_compute[236126]: 2025-10-02 12:24:58.470 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:24:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:24:59.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:24:59 np0005465988 nova_compute[236126]: 2025-10-02 12:24:59.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:59 np0005465988 nova_compute[236126]: 2025-10-02 12:24:59.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:24:59 np0005465988 nova_compute[236126]: 2025-10-02 12:24:59.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:24:59 np0005465988 nova_compute[236126]: 2025-10-02 12:24:59.526 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:59 np0005465988 nova_compute[236126]: 2025-10-02 12:24:59.527 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:59 np0005465988 nova_compute[236126]: 2025-10-02 12:24:59.528 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:24:59 np0005465988 nova_compute[236126]: 2025-10-02 12:24:59.529 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:24:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:24:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:24:59.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:24:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e291 e291: 3 total, 3 up, 3 in
Oct  2 08:25:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e292 e292: 3 total, 3 up, 3 in
Oct  2 08:25:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:01.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:01 np0005465988 nova_compute[236126]: 2025-10-02 12:25:01.573 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating instance_info_cache with network_info: [{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:01 np0005465988 nova_compute[236126]: 2025-10-02 12:25:01.599 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:01 np0005465988 nova_compute[236126]: 2025-10-02 12:25:01.600 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:25:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:01.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e293 e293: 3 total, 3 up, 3 in
Oct  2 08:25:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:01Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:0e:b9 10.100.0.6
Oct  2 08:25:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:25:02 np0005465988 nova_compute[236126]: 2025-10-02 12:25:02.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:02 np0005465988 nova_compute[236126]: 2025-10-02 12:25:02.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:03 np0005465988 nova_compute[236126]: 2025-10-02 12:25:03.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:03.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:03.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:05.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:05.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e294 e294: 3 total, 3 up, 3 in
Oct  2 08:25:07 np0005465988 nova_compute[236126]: 2025-10-02 12:25:07.367 2 DEBUG nova.virt.libvirt.driver [None req-921bee37-cc4c-40cf-8eaf-7ede77a10a84 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:25:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:07.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:25:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:07.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:07 np0005465988 nova_compute[236126]: 2025-10-02 12:25:07.856 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:07 np0005465988 nova_compute[236126]: 2025-10-02 12:25:07.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:07 np0005465988 nova_compute[236126]: 2025-10-02 12:25:07.949 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Triggering sync for uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  2 08:25:07 np0005465988 nova_compute[236126]: 2025-10-02 12:25:07.950 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Triggering sync for uuid 7b4bdbc9-7451-4500-8794-c8edef50d6a4 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  2 08:25:07 np0005465988 nova_compute[236126]: 2025-10-02 12:25:07.950 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:07 np0005465988 nova_compute[236126]: 2025-10-02 12:25:07.950 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:07 np0005465988 nova_compute[236126]: 2025-10-02 12:25:07.951 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:08 np0005465988 nova_compute[236126]: 2025-10-02 12:25:08.052 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:08 np0005465988 nova_compute[236126]: 2025-10-02 12:25:08.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:08 np0005465988 nova_compute[236126]: 2025-10-02 12:25:08.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:09.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:09.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:10 np0005465988 kernel: tap3bdb6970-48 (unregistering): left promiscuous mode
Oct  2 08:25:10 np0005465988 NetworkManager[45041]: <info>  [1759407910.6263] device (tap3bdb6970-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:25:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:10Z|00426|binding|INFO|Releasing lport 3bdb6970-487f-4313-ab25-aa900f8b084a from this chassis (sb_readonly=0)
Oct  2 08:25:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:10Z|00427|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a down in Southbound
Oct  2 08:25:10 np0005465988 nova_compute[236126]: 2025-10-02 12:25:10.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:10Z|00428|binding|INFO|Removing iface tap3bdb6970-48 ovn-installed in OVS
Oct  2 08:25:10 np0005465988 nova_compute[236126]: 2025-10-02 12:25:10.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:10 np0005465988 nova_compute[236126]: 2025-10-02 12:25:10.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:10 np0005465988 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000064.scope: Deactivated successfully.
Oct  2 08:25:10 np0005465988 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000064.scope: Consumed 14.849s CPU time.
Oct  2 08:25:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:10.690 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:0e:b9 10.100.0.6'], port_security=['fa:16:3e:22:0e:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87ebffd5-69af-414b-be5d-67ba42e8cae1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3bdb6970-487f-4313-ab25-aa900f8b084a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:10.693 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdb6970-487f-4313-ab25-aa900f8b084a in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff unbound from our chassis#033[00m
Oct  2 08:25:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:10.695 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:25:10 np0005465988 systemd-machined[192594]: Machine qemu-41-instance-00000064 terminated.
Oct  2 08:25:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:10.697 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[99e3dfd3-2d53-45f7-b8ed-2f6b12f49547]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:10.698 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff namespace which is not needed anymore#033[00m
Oct  2 08:25:10 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[279414]: [NOTICE]   (279427) : haproxy version is 2.8.14-c23fe91
Oct  2 08:25:10 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[279414]: [NOTICE]   (279427) : path to executable is /usr/sbin/haproxy
Oct  2 08:25:10 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[279414]: [WARNING]  (279427) : Exiting Master process...
Oct  2 08:25:10 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[279414]: [ALERT]    (279427) : Current worker (279429) exited with code 143 (Terminated)
Oct  2 08:25:10 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[279414]: [WARNING]  (279427) : All workers exited. Exiting... (0)
Oct  2 08:25:10 np0005465988 systemd[1]: libpod-637f0602e46583a8d287a6846c710a2242ab345d9f903948f11995ff49549b91.scope: Deactivated successfully.
Oct  2 08:25:10 np0005465988 kernel: tap3bdb6970-48: entered promiscuous mode
Oct  2 08:25:10 np0005465988 NetworkManager[45041]: <info>  [1759407910.8631] manager: (tap3bdb6970-48): new Tun device (/org/freedesktop/NetworkManager/Devices/204)
Oct  2 08:25:10 np0005465988 podman[279549]: 2025-10-02 12:25:10.864270251 +0000 UTC m=+0.061280664 container died 637f0602e46583a8d287a6846c710a2242ab345d9f903948f11995ff49549b91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:25:10 np0005465988 nova_compute[236126]: 2025-10-02 12:25:10.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:10Z|00429|binding|INFO|Claiming lport 3bdb6970-487f-4313-ab25-aa900f8b084a for this chassis.
Oct  2 08:25:10 np0005465988 kernel: tap3bdb6970-48 (unregistering): left promiscuous mode
Oct  2 08:25:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:10Z|00430|binding|INFO|3bdb6970-487f-4313-ab25-aa900f8b084a: Claiming fa:16:3e:22:0e:b9 10.100.0.6
Oct  2 08:25:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:10.890 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:0e:b9 10.100.0.6'], port_security=['fa:16:3e:22:0e:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87ebffd5-69af-414b-be5d-67ba42e8cae1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3bdb6970-487f-4313-ab25-aa900f8b084a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:10 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-637f0602e46583a8d287a6846c710a2242ab345d9f903948f11995ff49549b91-userdata-shm.mount: Deactivated successfully.
Oct  2 08:25:10 np0005465988 nova_compute[236126]: 2025-10-02 12:25:10.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:10Z|00431|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a ovn-installed in OVS
Oct  2 08:25:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:10Z|00432|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a up in Southbound
Oct  2 08:25:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:10Z|00433|binding|INFO|Releasing lport 3bdb6970-487f-4313-ab25-aa900f8b084a from this chassis (sb_readonly=1)
Oct  2 08:25:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:10Z|00434|if_status|INFO|Dropped 2 log messages in last 60 seconds (most recently, 60 seconds ago) due to excessive rate
Oct  2 08:25:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:10Z|00435|if_status|INFO|Not setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a down as sb is readonly
Oct  2 08:25:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:10Z|00436|binding|INFO|Removing iface tap3bdb6970-48 ovn-installed in OVS
Oct  2 08:25:10 np0005465988 systemd[1]: var-lib-containers-storage-overlay-2efd9f58a7616a3f75728e3989e5b1348449c05afbd040c8d7c636e813e50925-merged.mount: Deactivated successfully.
Oct  2 08:25:10 np0005465988 podman[279549]: 2025-10-02 12:25:10.916248906 +0000 UTC m=+0.113259339 container cleanup 637f0602e46583a8d287a6846c710a2242ab345d9f903948f11995ff49549b91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:25:10 np0005465988 nova_compute[236126]: 2025-10-02 12:25:10.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:10 np0005465988 systemd[1]: libpod-conmon-637f0602e46583a8d287a6846c710a2242ab345d9f903948f11995ff49549b91.scope: Deactivated successfully.
Oct  2 08:25:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:10Z|00437|binding|INFO|Releasing lport 3bdb6970-487f-4313-ab25-aa900f8b084a from this chassis (sb_readonly=0)
Oct  2 08:25:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:10Z|00438|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a down in Southbound
Oct  2 08:25:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:10.945 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:0e:b9 10.100.0.6'], port_security=['fa:16:3e:22:0e:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87ebffd5-69af-414b-be5d-67ba42e8cae1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3bdb6970-487f-4313-ab25-aa900f8b084a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:10 np0005465988 podman[279583]: 2025-10-02 12:25:10.988855315 +0000 UTC m=+0.050923546 container remove 637f0602e46583a8d287a6846c710a2242ab345d9f903948f11995ff49549b91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:25:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:10.995 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[039f8461-c23a-4c49-8265-c8de6de7e1e4]: (4, ('Thu Oct  2 12:25:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff (637f0602e46583a8d287a6846c710a2242ab345d9f903948f11995ff49549b91)\n637f0602e46583a8d287a6846c710a2242ab345d9f903948f11995ff49549b91\nThu Oct  2 12:25:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff (637f0602e46583a8d287a6846c710a2242ab345d9f903948f11995ff49549b91)\n637f0602e46583a8d287a6846c710a2242ab345d9f903948f11995ff49549b91\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:10.997 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[680b1e11-2908-471b-a75b-04261514727e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:10.998 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:11 np0005465988 kernel: tapb2c62a66-f0: left promiscuous mode
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:11.023 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e736a3c7-fa2d-4d06-91f4-bf15cb25b5d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:11.060 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4d52b446-68d7-46f9-b038-775a5adb7131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:11.062 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[164165fb-6848-4984-a039-2af56ff2832e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:11.089 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a209dc08-42c8-4b49-b3da-b52f687e4e47]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 594844, 'reachable_time': 43948, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279604, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005465988 systemd[1]: run-netns-ovnmeta\x2db2c62a66\x2df9bc\x2d4a45\x2da843\x2daef2e12a7fff.mount: Deactivated successfully.
Oct  2 08:25:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:11.096 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:25:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:11.096 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[70a209f4-8ff6-4676-aec9-c88c5c362795]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:11.097 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdb6970-487f-4313-ab25-aa900f8b084a in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff unbound from our chassis#033[00m
Oct  2 08:25:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:11.100 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:25:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:11.101 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[45d6574f-914c-4626-8064-5bd22455b2cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:11.103 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdb6970-487f-4313-ab25-aa900f8b084a in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff unbound from our chassis#033[00m
Oct  2 08:25:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:11.105 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:25:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:11.106 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dc6c3fff-3c2d-491b-b465-236f8d99243f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.389 2 INFO nova.virt.libvirt.driver [None req-921bee37-cc4c-40cf-8eaf-7ede77a10a84 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance shutdown successfully after 14 seconds.#033[00m
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.394 2 INFO nova.virt.libvirt.driver [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance destroyed successfully.#033[00m
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.394 2 DEBUG nova.objects.instance [None req-921bee37-cc4c-40cf-8eaf-7ede77a10a84 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'numa_topology' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:11.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.585 2 DEBUG nova.compute.manager [None req-921bee37-cc4c-40cf-8eaf-7ede77a10a84 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:11.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.801 2 DEBUG oslo_concurrency.lockutils [None req-921bee37-cc4c-40cf-8eaf-7ede77a10a84 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 14.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.803 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 3.853s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.803 2 INFO nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] During sync_power_state the instance has a pending task (powering-off). Skip.#033[00m
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.803 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.962 2 DEBUG nova.compute.manager [req-706a51de-8425-4460-bc44-7976311623a6 req-4d984d41-7f37-4731-bfb9-9a240d3313f6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.962 2 DEBUG oslo_concurrency.lockutils [req-706a51de-8425-4460-bc44-7976311623a6 req-4d984d41-7f37-4731-bfb9-9a240d3313f6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.963 2 DEBUG oslo_concurrency.lockutils [req-706a51de-8425-4460-bc44-7976311623a6 req-4d984d41-7f37-4731-bfb9-9a240d3313f6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.963 2 DEBUG oslo_concurrency.lockutils [req-706a51de-8425-4460-bc44-7976311623a6 req-4d984d41-7f37-4731-bfb9-9a240d3313f6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.963 2 DEBUG nova.compute.manager [req-706a51de-8425-4460-bc44-7976311623a6 req-4d984d41-7f37-4731-bfb9-9a240d3313f6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:11 np0005465988 nova_compute[236126]: 2025-10-02 12:25:11.963 2 WARNING nova.compute.manager [req-706a51de-8425-4460-bc44-7976311623a6 req-4d984d41-7f37-4731-bfb9-9a240d3313f6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state stopped and task_state None.#033[00m
Oct  2 08:25:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:25:12 np0005465988 podman[279606]: 2025-10-02 12:25:12.553464202 +0000 UTC m=+0.081264048 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  2 08:25:12 np0005465988 podman[279607]: 2025-10-02 12:25:12.618114362 +0000 UTC m=+0.143656943 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd)
Oct  2 08:25:12 np0005465988 podman[279605]: 2025-10-02 12:25:12.670942912 +0000 UTC m=+0.201879518 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:25:12 np0005465988 nova_compute[236126]: 2025-10-02 12:25:12.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:13 np0005465988 nova_compute[236126]: 2025-10-02 12:25:13.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:13.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:13.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:14 np0005465988 nova_compute[236126]: 2025-10-02 12:25:14.303 2 DEBUG nova.compute.manager [req-5ac0d43e-a6d6-49af-9344-19fef5f06779 req-036e4ba4-934d-4ad2-901b-063022b88695 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:14 np0005465988 nova_compute[236126]: 2025-10-02 12:25:14.305 2 DEBUG oslo_concurrency.lockutils [req-5ac0d43e-a6d6-49af-9344-19fef5f06779 req-036e4ba4-934d-4ad2-901b-063022b88695 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:14 np0005465988 nova_compute[236126]: 2025-10-02 12:25:14.305 2 DEBUG oslo_concurrency.lockutils [req-5ac0d43e-a6d6-49af-9344-19fef5f06779 req-036e4ba4-934d-4ad2-901b-063022b88695 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:14 np0005465988 nova_compute[236126]: 2025-10-02 12:25:14.306 2 DEBUG oslo_concurrency.lockutils [req-5ac0d43e-a6d6-49af-9344-19fef5f06779 req-036e4ba4-934d-4ad2-901b-063022b88695 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:14 np0005465988 nova_compute[236126]: 2025-10-02 12:25:14.307 2 DEBUG nova.compute.manager [req-5ac0d43e-a6d6-49af-9344-19fef5f06779 req-036e4ba4-934d-4ad2-901b-063022b88695 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:14 np0005465988 nova_compute[236126]: 2025-10-02 12:25:14.307 2 WARNING nova.compute.manager [req-5ac0d43e-a6d6-49af-9344-19fef5f06779 req-036e4ba4-934d-4ad2-901b-063022b88695 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state stopped and task_state None.#033[00m
Oct  2 08:25:14 np0005465988 nova_compute[236126]: 2025-10-02 12:25:14.308 2 DEBUG nova.compute.manager [req-5ac0d43e-a6d6-49af-9344-19fef5f06779 req-036e4ba4-934d-4ad2-901b-063022b88695 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:14 np0005465988 nova_compute[236126]: 2025-10-02 12:25:14.308 2 DEBUG oslo_concurrency.lockutils [req-5ac0d43e-a6d6-49af-9344-19fef5f06779 req-036e4ba4-934d-4ad2-901b-063022b88695 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:14 np0005465988 nova_compute[236126]: 2025-10-02 12:25:14.309 2 DEBUG oslo_concurrency.lockutils [req-5ac0d43e-a6d6-49af-9344-19fef5f06779 req-036e4ba4-934d-4ad2-901b-063022b88695 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:14 np0005465988 nova_compute[236126]: 2025-10-02 12:25:14.310 2 DEBUG oslo_concurrency.lockutils [req-5ac0d43e-a6d6-49af-9344-19fef5f06779 req-036e4ba4-934d-4ad2-901b-063022b88695 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:14 np0005465988 nova_compute[236126]: 2025-10-02 12:25:14.310 2 DEBUG nova.compute.manager [req-5ac0d43e-a6d6-49af-9344-19fef5f06779 req-036e4ba4-934d-4ad2-901b-063022b88695 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:14 np0005465988 nova_compute[236126]: 2025-10-02 12:25:14.311 2 WARNING nova.compute.manager [req-5ac0d43e-a6d6-49af-9344-19fef5f06779 req-036e4ba4-934d-4ad2-901b-063022b88695 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state stopped and task_state None.#033[00m
Oct  2 08:25:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:15.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:15 np0005465988 nova_compute[236126]: 2025-10-02 12:25:15.548 2 DEBUG nova.objects.instance [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'flavor' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:15.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:15 np0005465988 nova_compute[236126]: 2025-10-02 12:25:15.819 2 DEBUG oslo_concurrency.lockutils [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:15 np0005465988 nova_compute[236126]: 2025-10-02 12:25:15.820 2 DEBUG oslo_concurrency.lockutils [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquired lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:15 np0005465988 nova_compute[236126]: 2025-10-02 12:25:15.820 2 DEBUG nova.network.neutron [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:25:15 np0005465988 nova_compute[236126]: 2025-10-02 12:25:15.821 2 DEBUG nova.objects.instance [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'info_cache' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.581 2 DEBUG nova.compute.manager [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.583 2 DEBUG oslo_concurrency.lockutils [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.583 2 DEBUG oslo_concurrency.lockutils [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.583 2 DEBUG oslo_concurrency.lockutils [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.584 2 DEBUG nova.compute.manager [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.584 2 WARNING nova.compute.manager [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state stopped and task_state powering-on.#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.584 2 DEBUG nova.compute.manager [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.585 2 DEBUG oslo_concurrency.lockutils [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.585 2 DEBUG oslo_concurrency.lockutils [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.585 2 DEBUG oslo_concurrency.lockutils [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.586 2 DEBUG nova.compute.manager [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.586 2 WARNING nova.compute.manager [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state stopped and task_state powering-on.#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.586 2 DEBUG nova.compute.manager [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.587 2 DEBUG oslo_concurrency.lockutils [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.587 2 DEBUG oslo_concurrency.lockutils [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.588 2 DEBUG oslo_concurrency.lockutils [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.588 2 DEBUG nova.compute.manager [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.588 2 WARNING nova.compute.manager [req-58d4124e-f91d-4db7-afa5-6e2a40edce5b req-93f12871-fd89-47f5-b57d-967958cf6468 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state stopped and task_state powering-on.#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.783 2 DEBUG oslo_concurrency.lockutils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Acquiring lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.784 2 DEBUG oslo_concurrency.lockutils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.784 2 INFO nova.compute.manager [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Shelving#033[00m
Oct  2 08:25:16 np0005465988 nova_compute[236126]: 2025-10-02 12:25:16.849 2 DEBUG nova.virt.libvirt.driver [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:25:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:17.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:25:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:17.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:17 np0005465988 nova_compute[236126]: 2025-10-02 12:25:17.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:17 np0005465988 nova_compute[236126]: 2025-10-02 12:25:17.927 2 DEBUG nova.network.neutron [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating instance_info_cache with network_info: [{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.341 2 DEBUG oslo_concurrency.lockutils [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Releasing lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.456 2 INFO nova.virt.libvirt.driver [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance destroyed successfully.#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.457 2 DEBUG nova.objects.instance [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'numa_topology' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.576 2 DEBUG nova.objects.instance [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'resources' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.946 2 DEBUG nova.virt.libvirt.vif [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:25:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.947 2 DEBUG nova.network.os_vif_util [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.948 2 DEBUG nova.network.os_vif_util [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.949 2 DEBUG os_vif [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.953 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bdb6970-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.962 2 INFO os_vif [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48')#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.973 2 DEBUG nova.virt.libvirt.driver [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Start _get_guest_xml network_info=[{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.978 2 WARNING nova.virt.libvirt.driver [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.986 2 DEBUG nova.virt.libvirt.host [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.987 2 DEBUG nova.virt.libvirt.host [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.990 2 DEBUG nova.virt.libvirt.host [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.991 2 DEBUG nova.virt.libvirt.host [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.993 2 DEBUG nova.virt.libvirt.driver [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.993 2 DEBUG nova.virt.hardware [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.994 2 DEBUG nova.virt.hardware [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.995 2 DEBUG nova.virt.hardware [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.995 2 DEBUG nova.virt.hardware [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.996 2 DEBUG nova.virt.hardware [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.996 2 DEBUG nova.virt.hardware [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.997 2 DEBUG nova.virt.hardware [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.997 2 DEBUG nova.virt.hardware [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.998 2 DEBUG nova.virt.hardware [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:25:18 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.998 2 DEBUG nova.virt.hardware [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:25:19 np0005465988 nova_compute[236126]: 2025-10-02 12:25:18.999 2 DEBUG nova.virt.hardware [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:25:19 np0005465988 nova_compute[236126]: 2025-10-02 12:25:19.000 2 DEBUG nova.objects.instance [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:19 np0005465988 nova_compute[236126]: 2025-10-02 12:25:19.078 2 DEBUG oslo_concurrency.processutils [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:19.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:25:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2698005824' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:25:19 np0005465988 nova_compute[236126]: 2025-10-02 12:25:19.570 2 DEBUG oslo_concurrency.processutils [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:19 np0005465988 nova_compute[236126]: 2025-10-02 12:25:19.622 2 DEBUG oslo_concurrency.processutils [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:19.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:25:20 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/25624865' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.162 2 DEBUG oslo_concurrency.processutils [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.539s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.167 2 DEBUG nova.virt.libvirt.vif [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:25:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.169 2 DEBUG nova.network.os_vif_util [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.171 2 DEBUG nova.network.os_vif_util [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.174 2 DEBUG nova.objects.instance [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.207 2 DEBUG nova.virt.libvirt.driver [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  <uuid>87ebffd5-69af-414b-be5d-67ba42e8cae1</uuid>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  <name>instance-00000064</name>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerActionsTestJSON-server-131502281</nova:name>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:25:18</nova:creationTime>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <nova:user uuid="2bd16d1f5f9d4eb396c474eedee67165">tempest-ServerActionsTestJSON-842270816-project-member</nova:user>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <nova:project uuid="4b8ca48cb5f64ef3b0736b8be82378b8">tempest-ServerActionsTestJSON-842270816</nova:project>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <nova:port uuid="3bdb6970-487f-4313-ab25-aa900f8b084a">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <entry name="serial">87ebffd5-69af-414b-be5d-67ba42e8cae1</entry>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <entry name="uuid">87ebffd5-69af-414b-be5d-67ba42e8cae1</entry>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/87ebffd5-69af-414b-be5d-67ba42e8cae1_disk">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/87ebffd5-69af-414b-be5d-67ba42e8cae1_disk.config">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:22:0e:b9"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <target dev="tap3bdb6970-48"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1/console.log" append="off"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <input type="keyboard" bus="usb"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:25:20 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:25:20 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:25:20 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:25:20 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.210 2 DEBUG nova.virt.libvirt.driver [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.211 2 DEBUG nova.virt.libvirt.driver [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.212 2 DEBUG nova.virt.libvirt.vif [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:25:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.212 2 DEBUG nova.network.os_vif_util [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.213 2 DEBUG nova.network.os_vif_util [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.213 2 DEBUG os_vif [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.214 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.214 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.218 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3bdb6970-48, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.219 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3bdb6970-48, col_values=(('external_ids', {'iface-id': '3bdb6970-487f-4313-ab25-aa900f8b084a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:0e:b9', 'vm-uuid': '87ebffd5-69af-414b-be5d-67ba42e8cae1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005465988 NetworkManager[45041]: <info>  [1759407920.2224] manager: (tap3bdb6970-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.231 2 INFO os_vif [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48')#033[00m
Oct  2 08:25:20 np0005465988 kernel: tap3bdb6970-48: entered promiscuous mode
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005465988 NetworkManager[45041]: <info>  [1759407920.3545] manager: (tap3bdb6970-48): new Tun device (/org/freedesktop/NetworkManager/Devices/206)
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:20Z|00439|binding|INFO|Claiming lport 3bdb6970-487f-4313-ab25-aa900f8b084a for this chassis.
Oct  2 08:25:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:20Z|00440|binding|INFO|3bdb6970-487f-4313-ab25-aa900f8b084a: Claiming fa:16:3e:22:0e:b9 10.100.0.6
Oct  2 08:25:20 np0005465988 systemd-udevd[279861]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:25:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:20Z|00441|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a ovn-installed in OVS
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005465988 kernel: tap9d6e67d8-8c (unregistering): left promiscuous mode
Oct  2 08:25:20 np0005465988 NetworkManager[45041]: <info>  [1759407920.4164] device (tap9d6e67d8-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:25:20 np0005465988 systemd-machined[192594]: New machine qemu-42-instance-00000064.
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:20Z|00442|binding|INFO|Releasing lport 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb from this chassis (sb_readonly=1)
Oct  2 08:25:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:20Z|00443|binding|INFO|Removing iface tap9d6e67d8-8c ovn-installed in OVS
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005465988 systemd[1]: Started Virtual Machine qemu-42-instance-00000064.
Oct  2 08:25:20 np0005465988 NetworkManager[45041]: <info>  [1759407920.4339] device (tap3bdb6970-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:25:20 np0005465988 NetworkManager[45041]: <info>  [1759407920.4351] device (tap3bdb6970-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:20Z|00444|binding|INFO|Setting lport 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb down in Southbound
Oct  2 08:25:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:20Z|00445|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a up in Southbound
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.456 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:0e:b9 10.100.0.6'], port_security=['fa:16:3e:22:0e:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87ebffd5-69af-414b-be5d-67ba42e8cae1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '9', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3bdb6970-487f-4313-ab25-aa900f8b084a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.458 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdb6970-487f-4313-ab25-aa900f8b084a in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff bound to our chassis#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.460 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.476 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c16b211e-87fa-4300-9b55-8c1d08809785]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.477 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2c62a66-f1 in ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.479 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2c62a66-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.480 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4a209e47-1b40-4e50-8585-e0967c86b456]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.480 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[182e4193-38aa-41bd-b002-06e3f0b005af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000065.scope: Deactivated successfully.
Oct  2 08:25:20 np0005465988 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000065.scope: Consumed 17.563s CPU time.
Oct  2 08:25:20 np0005465988 systemd-machined[192594]: Machine qemu-38-instance-00000065 terminated.
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.496 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[4a557329-b1e4-45f3-b85d-d691c5b90ac1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.523 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:91:e2 10.100.0.10'], port_security=['fa:16:3e:6d:91:e2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '7b4bdbc9-7451-4500-8794-c8edef50d6a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4035a600-4a5e-41ee-a619-d81e2c993b79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '10fff81da7a54740a53a0771ce916329', 'neutron:revision_number': '4', 'neutron:security_group_ids': '32af0a94-4565-470d-9918-1bc97e347f8f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.188'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5dc7931-b785-4336-99b8-936a17be87c3, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=9d6e67d8-8c6a-4b95-b332-80f8674a0ebb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.530 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e59ad6c0-5a6c-4048-beb5-b089af5e3070]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.570 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5a134c93-2746-4127-bfd0-237db6d4910b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.579 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[da216358-4f84-47aa-bb1f-b42d072fcf29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 NetworkManager[45041]: <info>  [1759407920.5798] manager: (tapb2c62a66-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/207)
Oct  2 08:25:20 np0005465988 systemd-udevd[279867]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.622 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c3be730f-fcc0-42bb-a6d0-52da2298454c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.626 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4e824e90-1561-4b99-8c74-eb286c18cfa3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 NetworkManager[45041]: <info>  [1759407920.6590] device (tapb2c62a66-f0): carrier: link connected
Oct  2 08:25:20 np0005465988 NetworkManager[45041]: <info>  [1759407920.6688] manager: (tap9d6e67d8-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/208)
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.673 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[17d439fa-ac6b-4487-bfcc-13a3b0f87015]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.695 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7191afdc-12d8-4fe9-912a-4de4ac0bc86e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598099, 'reachable_time': 30795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279913, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.721 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7bab2415-43e3-464c-8ddf-05b18d644a85]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:7a7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 598099, 'tstamp': 598099}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 279922, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.744 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ee93034e-3ff5-4a59-9a88-83e97345e484]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598099, 'reachable_time': 30795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 279931, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.786 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[968fd2b7-3f6d-4d91-a55f-deb43bea2d0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.862 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ef7575c7-7e2a-4b9b-9eb5-8685ea55fd08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.865 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.866 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.867 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2c62a66-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005465988 NetworkManager[45041]: <info>  [1759407920.8707] manager: (tapb2c62a66-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/209)
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.873 2 INFO nova.virt.libvirt.driver [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Instance shutdown successfully after 4 seconds.#033[00m
Oct  2 08:25:20 np0005465988 kernel: tapb2c62a66-f0: entered promiscuous mode
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.882 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2c62a66-f0, col_values=(('external_ids', {'iface-id': '8c1234f6-f595-4979-94c0-98da2211f0e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:20Z|00446|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.887 2 INFO nova.virt.libvirt.driver [-] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Instance destroyed successfully.#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.893 2 DEBUG nova.objects.instance [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lazy-loading 'numa_topology' on Instance uuid 7b4bdbc9-7451-4500-8794-c8edef50d6a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.913 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.915 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7deb6b56-e213-4b8b-9ac3-e128d82c3fc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.916 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:25:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:20.917 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'env', 'PROCESS_TAG=haproxy-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.969 2 DEBUG nova.compute.manager [req-b573b693-329c-448b-958c-fc7fe492e488 req-08f85933-7801-4e5a-a42e-b3df42342c94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.969 2 DEBUG oslo_concurrency.lockutils [req-b573b693-329c-448b-958c-fc7fe492e488 req-08f85933-7801-4e5a-a42e-b3df42342c94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.970 2 DEBUG oslo_concurrency.lockutils [req-b573b693-329c-448b-958c-fc7fe492e488 req-08f85933-7801-4e5a-a42e-b3df42342c94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.970 2 DEBUG oslo_concurrency.lockutils [req-b573b693-329c-448b-958c-fc7fe492e488 req-08f85933-7801-4e5a-a42e-b3df42342c94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.970 2 DEBUG nova.compute.manager [req-b573b693-329c-448b-958c-fc7fe492e488 req-08f85933-7801-4e5a-a42e-b3df42342c94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:20 np0005465988 nova_compute[236126]: 2025-10-02 12:25:20.970 2 WARNING nova.compute.manager [req-b573b693-329c-448b-958c-fc7fe492e488 req-08f85933-7801-4e5a-a42e-b3df42342c94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state stopped and task_state powering-on.#033[00m
Oct  2 08:25:21 np0005465988 nova_compute[236126]: 2025-10-02 12:25:21.237 2 INFO nova.virt.libvirt.driver [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Beginning cold snapshot process#033[00m
Oct  2 08:25:21 np0005465988 podman[279968]: 2025-10-02 12:25:21.366418306 +0000 UTC m=+0.055362694 container create 9c766e423ca50286d51a49696ffe40d9a853346559f4510219302061acfdecba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:25:21 np0005465988 systemd[1]: Started libpod-conmon-9c766e423ca50286d51a49696ffe40d9a853346559f4510219302061acfdecba.scope.
Oct  2 08:25:21 np0005465988 podman[279968]: 2025-10-02 12:25:21.3380509 +0000 UTC m=+0.026995288 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:25:21 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:25:21 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/570144e1a970adaae4f4cf9bd9c1fd9ce43161f700c79a92b125d5322a4464cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:25:21 np0005465988 podman[279968]: 2025-10-02 12:25:21.471595572 +0000 UTC m=+0.160539950 container init 9c766e423ca50286d51a49696ffe40d9a853346559f4510219302061acfdecba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:25:21 np0005465988 podman[279968]: 2025-10-02 12:25:21.482976449 +0000 UTC m=+0.171920837 container start 9c766e423ca50286d51a49696ffe40d9a853346559f4510219302061acfdecba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:25:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:21.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:21 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[279984]: [NOTICE]   (279988) : New worker (279990) forked
Oct  2 08:25:21 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[279984]: [NOTICE]   (279988) : Loading success.
Oct  2 08:25:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:21.538 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb in datapath 4035a600-4a5e-41ee-a619-d81e2c993b79 unbound from our chassis#033[00m
Oct  2 08:25:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:21.542 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4035a600-4a5e-41ee-a619-d81e2c993b79, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:25:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:21.543 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3498e53e-cec1-4180-98aa-c1b925a18c6d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:21.544 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79 namespace which is not needed anymore#033[00m
Oct  2 08:25:21 np0005465988 neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79[278143]: [NOTICE]   (278147) : haproxy version is 2.8.14-c23fe91
Oct  2 08:25:21 np0005465988 neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79[278143]: [NOTICE]   (278147) : path to executable is /usr/sbin/haproxy
Oct  2 08:25:21 np0005465988 neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79[278143]: [WARNING]  (278147) : Exiting Master process...
Oct  2 08:25:21 np0005465988 neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79[278143]: [WARNING]  (278147) : Exiting Master process...
Oct  2 08:25:21 np0005465988 neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79[278143]: [ALERT]    (278147) : Current worker (278150) exited with code 143 (Terminated)
Oct  2 08:25:21 np0005465988 neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79[278143]: [WARNING]  (278147) : All workers exited. Exiting... (0)
Oct  2 08:25:21 np0005465988 systemd[1]: libpod-a7bf7efcda56a5cb2d1dc9722a1ed6e4f613056815793ba5a8f35b0a712cda78.scope: Deactivated successfully.
Oct  2 08:25:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:21.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:21 np0005465988 podman[280015]: 2025-10-02 12:25:21.692684552 +0000 UTC m=+0.047648932 container died a7bf7efcda56a5cb2d1dc9722a1ed6e4f613056815793ba5a8f35b0a712cda78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:25:21 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a7bf7efcda56a5cb2d1dc9722a1ed6e4f613056815793ba5a8f35b0a712cda78-userdata-shm.mount: Deactivated successfully.
Oct  2 08:25:21 np0005465988 systemd[1]: var-lib-containers-storage-overlay-51944d077d4ed90800edaf890b3d23307635a413602e8c0e8704020086611126-merged.mount: Deactivated successfully.
Oct  2 08:25:21 np0005465988 podman[280015]: 2025-10-02 12:25:21.744499212 +0000 UTC m=+0.099463592 container cleanup a7bf7efcda56a5cb2d1dc9722a1ed6e4f613056815793ba5a8f35b0a712cda78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:25:21 np0005465988 systemd[1]: libpod-conmon-a7bf7efcda56a5cb2d1dc9722a1ed6e4f613056815793ba5a8f35b0a712cda78.scope: Deactivated successfully.
Oct  2 08:25:21 np0005465988 podman[280045]: 2025-10-02 12:25:21.828714645 +0000 UTC m=+0.055146157 container remove a7bf7efcda56a5cb2d1dc9722a1ed6e4f613056815793ba5a8f35b0a712cda78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:25:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:21.842 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5da88506-767b-47a6-a128-c710c126ae75]: (4, ('Thu Oct  2 12:25:21 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79 (a7bf7efcda56a5cb2d1dc9722a1ed6e4f613056815793ba5a8f35b0a712cda78)\na7bf7efcda56a5cb2d1dc9722a1ed6e4f613056815793ba5a8f35b0a712cda78\nThu Oct  2 12:25:21 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79 (a7bf7efcda56a5cb2d1dc9722a1ed6e4f613056815793ba5a8f35b0a712cda78)\na7bf7efcda56a5cb2d1dc9722a1ed6e4f613056815793ba5a8f35b0a712cda78\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:21.846 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad36193-a976-42ae-ab3d-b41f0722bcc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:21 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:25:21 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:25:21 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:25:21 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:25:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:21.851 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4035a600-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:21 np0005465988 kernel: tap4035a600-40: left promiscuous mode
Oct  2 08:25:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:21.885 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f0721761-40d7-43f5-9c42-0be7172d696c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:21.915 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb245a7-7cfc-441e-8374-ec4da5ed4976]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:21.917 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f65ab0bc-4956-40c3-8c7d-e3961c21aee4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:21 np0005465988 nova_compute[236126]: 2025-10-02 12:25:21.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:21.938 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c4823aab-bd49-43e2-8c02-8c886817e72a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590153, 'reachable_time': 36805, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280094, 'error': None, 'target': 'ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:21 np0005465988 systemd[1]: run-netns-ovnmeta\x2d4035a600\x2d4a5e\x2d41ee\x2da619\x2dd81e2c993b79.mount: Deactivated successfully.
Oct  2 08:25:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:21.945 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4035a600-4a5e-41ee-a619-d81e2c993b79 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:25:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:21.945 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[84ba9e65-a49e-4892-9cac-ad6897befb25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:21 np0005465988 nova_compute[236126]: 2025-10-02 12:25:21.949 2 DEBUG nova.virt.libvirt.imagebackend [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] No parent info for c2d0c2bc-fe21-4689-86ae-d6728c15874c; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Oct  2 08:25:22 np0005465988 nova_compute[236126]: 2025-10-02 12:25:22.179 2 DEBUG nova.storage.rbd_utils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] creating snapshot(90095c29b70e4292bf868a2b1730cb7d) on rbd image(7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:25:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:25:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:25:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:25:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.058 2 DEBUG nova.compute.manager [req-3f9a0f79-eef4-42f5-ab30-09ba9d14c436 req-44658e04-e6b8-4064-bc30-b876d4a2af4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Received event network-vif-unplugged-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.059 2 DEBUG oslo_concurrency.lockutils [req-3f9a0f79-eef4-42f5-ab30-09ba9d14c436 req-44658e04-e6b8-4064-bc30-b876d4a2af4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.060 2 DEBUG oslo_concurrency.lockutils [req-3f9a0f79-eef4-42f5-ab30-09ba9d14c436 req-44658e04-e6b8-4064-bc30-b876d4a2af4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.061 2 DEBUG oslo_concurrency.lockutils [req-3f9a0f79-eef4-42f5-ab30-09ba9d14c436 req-44658e04-e6b8-4064-bc30-b876d4a2af4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.061 2 DEBUG nova.compute.manager [req-3f9a0f79-eef4-42f5-ab30-09ba9d14c436 req-44658e04-e6b8-4064-bc30-b876d4a2af4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] No waiting events found dispatching network-vif-unplugged-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.061 2 WARNING nova.compute.manager [req-3f9a0f79-eef4-42f5-ab30-09ba9d14c436 req-44658e04-e6b8-4064-bc30-b876d4a2af4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Received unexpected event network-vif-unplugged-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.062 2 DEBUG nova.compute.manager [req-3f9a0f79-eef4-42f5-ab30-09ba9d14c436 req-44658e04-e6b8-4064-bc30-b876d4a2af4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Received event network-vif-plugged-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.062 2 DEBUG oslo_concurrency.lockutils [req-3f9a0f79-eef4-42f5-ab30-09ba9d14c436 req-44658e04-e6b8-4064-bc30-b876d4a2af4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.063 2 DEBUG oslo_concurrency.lockutils [req-3f9a0f79-eef4-42f5-ab30-09ba9d14c436 req-44658e04-e6b8-4064-bc30-b876d4a2af4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.063 2 DEBUG oslo_concurrency.lockutils [req-3f9a0f79-eef4-42f5-ab30-09ba9d14c436 req-44658e04-e6b8-4064-bc30-b876d4a2af4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.063 2 DEBUG nova.compute.manager [req-3f9a0f79-eef4-42f5-ab30-09ba9d14c436 req-44658e04-e6b8-4064-bc30-b876d4a2af4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] No waiting events found dispatching network-vif-plugged-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.064 2 WARNING nova.compute.manager [req-3f9a0f79-eef4-42f5-ab30-09ba9d14c436 req-44658e04-e6b8-4064-bc30-b876d4a2af4b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Received unexpected event network-vif-plugged-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.067 2 DEBUG nova.compute.manager [req-d486ca1a-38bf-4429-87c5-2501fc246086 req-c3d7dcc5-4ac2-4c09-a5c7-23ba9fed98ae d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.068 2 DEBUG oslo_concurrency.lockutils [req-d486ca1a-38bf-4429-87c5-2501fc246086 req-c3d7dcc5-4ac2-4c09-a5c7-23ba9fed98ae d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.068 2 DEBUG oslo_concurrency.lockutils [req-d486ca1a-38bf-4429-87c5-2501fc246086 req-c3d7dcc5-4ac2-4c09-a5c7-23ba9fed98ae d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.069 2 DEBUG oslo_concurrency.lockutils [req-d486ca1a-38bf-4429-87c5-2501fc246086 req-c3d7dcc5-4ac2-4c09-a5c7-23ba9fed98ae d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.069 2 DEBUG nova.compute.manager [req-d486ca1a-38bf-4429-87c5-2501fc246086 req-c3d7dcc5-4ac2-4c09-a5c7-23ba9fed98ae d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.070 2 WARNING nova.compute.manager [req-d486ca1a-38bf-4429-87c5-2501fc246086 req-c3d7dcc5-4ac2-4c09-a5c7-23ba9fed98ae d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state stopped and task_state powering-on.#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.464 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 87ebffd5-69af-414b-be5d-67ba42e8cae1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.465 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407923.4642882, 87ebffd5-69af-414b-be5d-67ba42e8cae1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.465 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.468 2 DEBUG nova.compute.manager [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.471 2 INFO nova.virt.libvirt.driver [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance rebooted successfully.#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.471 2 DEBUG nova.compute.manager [None req-0631294d-9ba6-4220-b1e4-c81d32a963c9 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:23.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.516 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.520 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:23 np0005465988 podman[280210]: 2025-10-02 12:25:23.555939022 +0000 UTC m=+0.078037516 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.581 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407923.4691029, 87ebffd5-69af-414b-be5d-67ba42e8cae1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.581 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] VM Started (Lifecycle Event)#033[00m
Oct  2 08:25:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e295 e295: 3 total, 3 up, 3 in
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.600 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:23 np0005465988 nova_compute[236126]: 2025-10-02 12:25:23.605 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:23.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:24 np0005465988 nova_compute[236126]: 2025-10-02 12:25:24.650 2 DEBUG nova.storage.rbd_utils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] cloning vms/7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk@90095c29b70e4292bf868a2b1730cb7d to images/cf693858-d747-42e7-8f75-d6c36d36cc6c clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:25:24 np0005465988 nova_compute[236126]: 2025-10-02 12:25:24.877 2 DEBUG nova.storage.rbd_utils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] flattening images/cf693858-d747-42e7-8f75-d6c36d36cc6c flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:25:25 np0005465988 nova_compute[236126]: 2025-10-02 12:25:25.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:25.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:25.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:26 np0005465988 nova_compute[236126]: 2025-10-02 12:25:26.448 2 DEBUG nova.storage.rbd_utils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] removing snapshot(90095c29b70e4292bf868a2b1730cb7d) on rbd image(7b4bdbc9-7451-4500-8794-c8edef50d6a4_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:25:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:27.354 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:27.355 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:27.356 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e296 e296: 3 total, 3 up, 3 in
Oct  2 08:25:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:25:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:27.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:27.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:27 np0005465988 nova_compute[236126]: 2025-10-02 12:25:27.927 2 DEBUG nova.storage.rbd_utils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] creating snapshot(snap) on rbd image(cf693858-d747-42e7-8f75-d6c36d36cc6c) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:25:28 np0005465988 nova_compute[236126]: 2025-10-02 12:25:28.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e297 e297: 3 total, 3 up, 3 in
Oct  2 08:25:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:29.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:29.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:29 np0005465988 nova_compute[236126]: 2025-10-02 12:25:29.997 2 INFO nova.compute.manager [None req-6faca91a-36e3-4323-a239-3c66f6ad586f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Pausing#033[00m
Oct  2 08:25:30 np0005465988 nova_compute[236126]: 2025-10-02 12:25:30.000 2 DEBUG nova.objects.instance [None req-6faca91a-36e3-4323-a239-3c66f6ad586f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'flavor' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:30 np0005465988 nova_compute[236126]: 2025-10-02 12:25:30.057 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407930.0566874, 87ebffd5-69af-414b-be5d-67ba42e8cae1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:30 np0005465988 nova_compute[236126]: 2025-10-02 12:25:30.057 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:25:30 np0005465988 nova_compute[236126]: 2025-10-02 12:25:30.059 2 DEBUG nova.compute.manager [None req-6faca91a-36e3-4323-a239-3c66f6ad586f 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:30 np0005465988 nova_compute[236126]: 2025-10-02 12:25:30.084 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:30 np0005465988 nova_compute[236126]: 2025-10-02 12:25:30.089 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:30 np0005465988 nova_compute[236126]: 2025-10-02 12:25:30.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:31 np0005465988 nova_compute[236126]: 2025-10-02 12:25:31.322 2 INFO nova.virt.libvirt.driver [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Snapshot image upload complete#033[00m
Oct  2 08:25:31 np0005465988 nova_compute[236126]: 2025-10-02 12:25:31.323 2 DEBUG nova.compute.manager [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:31 np0005465988 nova_compute[236126]: 2025-10-02 12:25:31.461 2 INFO nova.compute.manager [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Shelve offloading#033[00m
Oct  2 08:25:31 np0005465988 nova_compute[236126]: 2025-10-02 12:25:31.471 2 INFO nova.virt.libvirt.driver [-] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Instance destroyed successfully.#033[00m
Oct  2 08:25:31 np0005465988 nova_compute[236126]: 2025-10-02 12:25:31.472 2 DEBUG nova.compute.manager [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:31 np0005465988 nova_compute[236126]: 2025-10-02 12:25:31.475 2 DEBUG oslo_concurrency.lockutils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Acquiring lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:31 np0005465988 nova_compute[236126]: 2025-10-02 12:25:31.476 2 DEBUG oslo_concurrency.lockutils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Acquired lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:31 np0005465988 nova_compute[236126]: 2025-10-02 12:25:31.476 2 DEBUG nova.network.neutron [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:25:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:31.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:25:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:31.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:25:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e297 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:25:33 np0005465988 nova_compute[236126]: 2025-10-02 12:25:33.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:33.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:25:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:33.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:25:34 np0005465988 nova_compute[236126]: 2025-10-02 12:25:34.172 2 INFO nova.compute.manager [None req-10699888-8385-4f36-add1-27e5f673177c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Unpausing#033[00m
Oct  2 08:25:34 np0005465988 nova_compute[236126]: 2025-10-02 12:25:34.174 2 DEBUG nova.objects.instance [None req-10699888-8385-4f36-add1-27e5f673177c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'flavor' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:34 np0005465988 nova_compute[236126]: 2025-10-02 12:25:34.251 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407934.2506757, 87ebffd5-69af-414b-be5d-67ba42e8cae1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:34 np0005465988 nova_compute[236126]: 2025-10-02 12:25:34.252 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:25:34 np0005465988 virtqemud[235689]: argument unsupported: QEMU guest agent is not configured
Oct  2 08:25:34 np0005465988 nova_compute[236126]: 2025-10-02 12:25:34.259 2 DEBUG nova.virt.libvirt.guest [None req-10699888-8385-4f36-add1-27e5f673177c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  2 08:25:34 np0005465988 nova_compute[236126]: 2025-10-02 12:25:34.260 2 DEBUG nova.compute.manager [None req-10699888-8385-4f36-add1-27e5f673177c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:34 np0005465988 nova_compute[236126]: 2025-10-02 12:25:34.292 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:34 np0005465988 nova_compute[236126]: 2025-10-02 12:25:34.299 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:34 np0005465988 nova_compute[236126]: 2025-10-02 12:25:34.369 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Oct  2 08:25:34 np0005465988 nova_compute[236126]: 2025-10-02 12:25:34.621 2 DEBUG nova.network.neutron [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Updating instance_info_cache with network_info: [{"id": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "address": "fa:16:3e:6d:91:e2", "network": {"id": "4035a600-4a5e-41ee-a619-d81e2c993b79", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2080271662-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10fff81da7a54740a53a0771ce916329", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d6e67d8-8c", "ovs_interfaceid": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:34 np0005465988 nova_compute[236126]: 2025-10-02 12:25:34.759 2 DEBUG oslo_concurrency.lockutils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Releasing lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:35 np0005465988 nova_compute[236126]: 2025-10-02 12:25:35.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:25:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:35.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:25:35 np0005465988 nova_compute[236126]: 2025-10-02 12:25:35.682 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407920.6816967, 7b4bdbc9-7451-4500-8794-c8edef50d6a4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:35 np0005465988 nova_compute[236126]: 2025-10-02 12:25:35.683 2 INFO nova.compute.manager [-] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:25:35 np0005465988 nova_compute[236126]: 2025-10-02 12:25:35.709 2 DEBUG nova.compute.manager [None req-580189ad-d62d-4244-a700-c2f5997ddced - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:35 np0005465988 nova_compute[236126]: 2025-10-02 12:25:35.712 2 DEBUG nova.compute.manager [None req-580189ad-d62d-4244-a700-c2f5997ddced - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: shelved, current task_state: shelving_offloading, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:35.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:35 np0005465988 nova_compute[236126]: 2025-10-02 12:25:35.749 2 INFO nova.compute.manager [None req-580189ad-d62d-4244-a700-c2f5997ddced - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] During sync_power_state the instance has a pending task (shelving_offloading). Skip.#033[00m
Oct  2 08:25:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:36.185 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:36.186 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:25:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.233 2 INFO nova.virt.libvirt.driver [-] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Instance destroyed successfully.#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.234 2 DEBUG nova.objects.instance [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lazy-loading 'resources' on Instance uuid 7b4bdbc9-7451-4500-8794-c8edef50d6a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.247 2 DEBUG nova.virt.libvirt.vif [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1305802395',display_name='tempest-ServerActionsTestOtherB-server-1305802395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1305802395',id=101,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGD2jbBFmRg2ZrnheVnZyLwDISk/dFTNtp10+sWyF/q+rC4Q86cvBQSRgacxSPIqXVpmiVTqI66cLDPhvjcnRFXyQqHRS/RWGvUZk+wm1wfft8CveiGko+Vh4vSox2iOrA==',key_name='tempest-keypair-1336245373',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='10fff81da7a54740a53a0771ce916329',ramdisk_id='',reservation_id='r-bt70f33h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1686489955',owner_user_name='tempest-ServerActionsTestOtherB-1686489955-project-member',shelved_at='2025-10-02T12:25:31.322938',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='cf693858-d747-42e7-8f75-d6c36d36cc6c'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:25:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='25468893d71641a385711fd2982bb00b',uuid=7b4bdbc9-7451-4500-8794-c8edef50d6a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "address": "fa:16:3e:6d:91:e2", "network": {"id": "4035a600-4a5e-41ee-a619-d81e2c993b79", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2080271662-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10fff81da7a54740a53a0771ce916329", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d6e67d8-8c", "ovs_interfaceid": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.248 2 DEBUG nova.network.os_vif_util [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Converting VIF {"id": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "address": "fa:16:3e:6d:91:e2", "network": {"id": "4035a600-4a5e-41ee-a619-d81e2c993b79", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2080271662-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10fff81da7a54740a53a0771ce916329", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d6e67d8-8c", "ovs_interfaceid": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.250 2 DEBUG nova.network.os_vif_util [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:91:e2,bridge_name='br-int',has_traffic_filtering=True,id=9d6e67d8-8c6a-4b95-b332-80f8674a0ebb,network=Network(4035a600-4a5e-41ee-a619-d81e2c993b79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d6e67d8-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.250 2 DEBUG os_vif [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:91:e2,bridge_name='br-int',has_traffic_filtering=True,id=9d6e67d8-8c6a-4b95-b332-80f8674a0ebb,network=Network(4035a600-4a5e-41ee-a619-d81e2c993b79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d6e67d8-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.252 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d6e67d8-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.258 2 INFO os_vif [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:91:e2,bridge_name='br-int',has_traffic_filtering=True,id=9d6e67d8-8c6a-4b95-b332-80f8674a0ebb,network=Network(4035a600-4a5e-41ee-a619-d81e2c993b79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d6e67d8-8c')#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.329 2 DEBUG nova.compute.manager [req-bcae5444-2c51-4121-b805-3bcf92524a97 req-bbe18d5d-4d63-42ea-8584-a41158cd7869 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Received event network-changed-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.330 2 DEBUG nova.compute.manager [req-bcae5444-2c51-4121-b805-3bcf92524a97 req-bbe18d5d-4d63-42ea-8584-a41158cd7869 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Refreshing instance network info cache due to event network-changed-9d6e67d8-8c6a-4b95-b332-80f8674a0ebb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.330 2 DEBUG oslo_concurrency.lockutils [req-bcae5444-2c51-4121-b805-3bcf92524a97 req-bbe18d5d-4d63-42ea-8584-a41158cd7869 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.330 2 DEBUG oslo_concurrency.lockutils [req-bcae5444-2c51-4121-b805-3bcf92524a97 req-bbe18d5d-4d63-42ea-8584-a41158cd7869 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:36 np0005465988 nova_compute[236126]: 2025-10-02 12:25:36.330 2 DEBUG nova.network.neutron [req-bcae5444-2c51-4121-b805-3bcf92524a97 req-bbe18d5d-4d63-42ea-8584-a41158cd7869 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Refreshing network info cache for port 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:25:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e298 e298: 3 total, 3 up, 3 in
Oct  2 08:25:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:25:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:37.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:37.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:38 np0005465988 nova_compute[236126]: 2025-10-02 12:25:38.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:38 np0005465988 nova_compute[236126]: 2025-10-02 12:25:38.306 2 INFO nova.virt.libvirt.driver [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Deleting instance files /var/lib/nova/instances/7b4bdbc9-7451-4500-8794-c8edef50d6a4_del#033[00m
Oct  2 08:25:38 np0005465988 nova_compute[236126]: 2025-10-02 12:25:38.307 2 INFO nova.virt.libvirt.driver [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Deletion of /var/lib/nova/instances/7b4bdbc9-7451-4500-8794-c8edef50d6a4_del complete#033[00m
Oct  2 08:25:38 np0005465988 nova_compute[236126]: 2025-10-02 12:25:38.468 2 INFO nova.scheduler.client.report [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Deleted allocations for instance 7b4bdbc9-7451-4500-8794-c8edef50d6a4#033[00m
Oct  2 08:25:38 np0005465988 nova_compute[236126]: 2025-10-02 12:25:38.523 2 DEBUG oslo_concurrency.lockutils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:38 np0005465988 nova_compute[236126]: 2025-10-02 12:25:38.525 2 DEBUG oslo_concurrency.lockutils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:38 np0005465988 nova_compute[236126]: 2025-10-02 12:25:38.601 2 DEBUG oslo_concurrency.processutils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:38 np0005465988 nova_compute[236126]: 2025-10-02 12:25:38.685 2 DEBUG nova.network.neutron [req-bcae5444-2c51-4121-b805-3bcf92524a97 req-bbe18d5d-4d63-42ea-8584-a41158cd7869 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Updated VIF entry in instance network info cache for port 9d6e67d8-8c6a-4b95-b332-80f8674a0ebb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:25:38 np0005465988 nova_compute[236126]: 2025-10-02 12:25:38.686 2 DEBUG nova.network.neutron [req-bcae5444-2c51-4121-b805-3bcf92524a97 req-bbe18d5d-4d63-42ea-8584-a41158cd7869 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Updating instance_info_cache with network_info: [{"id": "9d6e67d8-8c6a-4b95-b332-80f8674a0ebb", "address": "fa:16:3e:6d:91:e2", "network": {"id": "4035a600-4a5e-41ee-a619-d81e2c993b79", "bridge": null, "label": "tempest-ServerActionsTestOtherB-2080271662-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10fff81da7a54740a53a0771ce916329", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap9d6e67d8-8c", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:38 np0005465988 nova_compute[236126]: 2025-10-02 12:25:38.711 2 DEBUG oslo_concurrency.lockutils [req-bcae5444-2c51-4121-b805-3bcf92524a97 req-bbe18d5d-4d63-42ea-8584-a41158cd7869 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-7b4bdbc9-7451-4500-8794-c8edef50d6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:25:39 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/144315044' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:25:39 np0005465988 nova_compute[236126]: 2025-10-02 12:25:39.063 2 DEBUG oslo_concurrency.processutils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:39 np0005465988 nova_compute[236126]: 2025-10-02 12:25:39.070 2 DEBUG nova.compute.provider_tree [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:25:39 np0005465988 nova_compute[236126]: 2025-10-02 12:25:39.096 2 DEBUG nova.scheduler.client.report [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:25:39 np0005465988 nova_compute[236126]: 2025-10-02 12:25:39.187 2 DEBUG oslo_concurrency.lockutils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:39 np0005465988 nova_compute[236126]: 2025-10-02 12:25:39.267 2 DEBUG oslo_concurrency.lockutils [None req-ac3731e0-4710-4fbe-96a9-0d17c1ba97c5 25468893d71641a385711fd2982bb00b 10fff81da7a54740a53a0771ce916329 - - default default] Lock "7b4bdbc9-7451-4500-8794-c8edef50d6a4" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 22.483s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:39.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:39.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:41 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:41Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:0e:b9 10.100.0.6
Oct  2 08:25:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:25:41 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2687855842' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:25:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:41.192 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:41 np0005465988 nova_compute[236126]: 2025-10-02 12:25:41.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:25:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:41.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:25:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:41.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:25:43 np0005465988 nova_compute[236126]: 2025-10-02 12:25:43.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:43.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:43 np0005465988 podman[280474]: 2025-10-02 12:25:43.557295749 +0000 UTC m=+0.079197710 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:25:43 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:43Z|00447|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:25:43 np0005465988 podman[280473]: 2025-10-02 12:25:43.576578913 +0000 UTC m=+0.098403961 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:25:43 np0005465988 nova_compute[236126]: 2025-10-02 12:25:43.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:43 np0005465988 podman[280472]: 2025-10-02 12:25:43.672797261 +0000 UTC m=+0.203601018 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:25:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:43.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:45.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:45.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:46 np0005465988 nova_compute[236126]: 2025-10-02 12:25:46.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:47 np0005465988 nova_compute[236126]: 2025-10-02 12:25:47.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:25:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:47.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:47 np0005465988 nova_compute[236126]: 2025-10-02 12:25:47.553 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:47 np0005465988 nova_compute[236126]: 2025-10-02 12:25:47.554 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:47 np0005465988 nova_compute[236126]: 2025-10-02 12:25:47.554 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:47 np0005465988 nova_compute[236126]: 2025-10-02 12:25:47.555 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:25:47 np0005465988 nova_compute[236126]: 2025-10-02 12:25:47.555 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:47.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:25:48 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2938540743' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.049 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.473 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.474 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.495 2 DEBUG oslo_concurrency.lockutils [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.497 2 DEBUG oslo_concurrency.lockutils [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.497 2 INFO nova.compute.manager [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Rebooting instance#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.535 2 DEBUG oslo_concurrency.lockutils [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.536 2 DEBUG oslo_concurrency.lockutils [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquired lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.536 2 DEBUG nova.network.neutron [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.718 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.720 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4214MB free_disk=20.85152816772461GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.721 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.721 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.870 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 87ebffd5-69af-414b-be5d-67ba42e8cae1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.870 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.871 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:25:48 np0005465988 nova_compute[236126]: 2025-10-02 12:25:48.946 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:25:49 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1143002270' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:25:49 np0005465988 nova_compute[236126]: 2025-10-02 12:25:49.420 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:49 np0005465988 nova_compute[236126]: 2025-10-02 12:25:49.429 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:25:49 np0005465988 nova_compute[236126]: 2025-10-02 12:25:49.450 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:25:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:49.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:49 np0005465988 nova_compute[236126]: 2025-10-02 12:25:49.535 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:25:49 np0005465988 nova_compute[236126]: 2025-10-02 12:25:49.536 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.814s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:49.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:50 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:50Z|00448|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:25:50 np0005465988 nova_compute[236126]: 2025-10-02 12:25:50.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:50 np0005465988 nova_compute[236126]: 2025-10-02 12:25:50.512 2 DEBUG nova.network.neutron [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating instance_info_cache with network_info: [{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:50 np0005465988 nova_compute[236126]: 2025-10-02 12:25:50.537 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:50 np0005465988 nova_compute[236126]: 2025-10-02 12:25:50.550 2 DEBUG oslo_concurrency.lockutils [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Releasing lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:50 np0005465988 nova_compute[236126]: 2025-10-02 12:25:50.551 2 DEBUG nova.compute.manager [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:51 np0005465988 kernel: tap3bdb6970-48 (unregistering): left promiscuous mode
Oct  2 08:25:51 np0005465988 NetworkManager[45041]: <info>  [1759407951.0465] device (tap3bdb6970-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:25:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:51Z|00449|binding|INFO|Releasing lport 3bdb6970-487f-4313-ab25-aa900f8b084a from this chassis (sb_readonly=0)
Oct  2 08:25:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:51Z|00450|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a down in Southbound
Oct  2 08:25:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:51Z|00451|binding|INFO|Removing iface tap3bdb6970-48 ovn-installed in OVS
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:51.090 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:0e:b9 10.100.0.6'], port_security=['fa:16:3e:22:0e:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87ebffd5-69af-414b-be5d-67ba42e8cae1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '10', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3bdb6970-487f-4313-ab25-aa900f8b084a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:51.091 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdb6970-487f-4313-ab25-aa900f8b084a in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff unbound from our chassis#033[00m
Oct  2 08:25:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:51.092 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:25:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:51.094 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[22590389-5314-4bd5-8d78-618a5673e52e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:51.100 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff namespace which is not needed anymore#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:51 np0005465988 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000064.scope: Deactivated successfully.
Oct  2 08:25:51 np0005465988 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000064.scope: Consumed 16.732s CPU time.
Oct  2 08:25:51 np0005465988 systemd-machined[192594]: Machine qemu-42-instance-00000064 terminated.
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.306 2 INFO nova.virt.libvirt.driver [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance destroyed successfully.#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.307 2 DEBUG nova.objects.instance [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'resources' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:51 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[279984]: [NOTICE]   (279988) : haproxy version is 2.8.14-c23fe91
Oct  2 08:25:51 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[279984]: [NOTICE]   (279988) : path to executable is /usr/sbin/haproxy
Oct  2 08:25:51 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[279984]: [WARNING]  (279988) : Exiting Master process...
Oct  2 08:25:51 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[279984]: [ALERT]    (279988) : Current worker (279990) exited with code 143 (Terminated)
Oct  2 08:25:51 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[279984]: [WARNING]  (279988) : All workers exited. Exiting... (0)
Oct  2 08:25:51 np0005465988 systemd[1]: libpod-9c766e423ca50286d51a49696ffe40d9a853346559f4510219302061acfdecba.scope: Deactivated successfully.
Oct  2 08:25:51 np0005465988 podman[280613]: 2025-10-02 12:25:51.338563263 +0000 UTC m=+0.102849500 container died 9c766e423ca50286d51a49696ffe40d9a853346559f4510219302061acfdecba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.347 2 DEBUG nova.virt.libvirt.vif [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:25:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.348 2 DEBUG nova.network.os_vif_util [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.349 2 DEBUG nova.network.os_vif_util [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.349 2 DEBUG os_vif [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.351 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bdb6970-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.358 2 INFO os_vif [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48')#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.366 2 DEBUG nova.virt.libvirt.driver [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Start _get_guest_xml network_info=[{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.371 2 WARNING nova.virt.libvirt.driver [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:25:51 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c766e423ca50286d51a49696ffe40d9a853346559f4510219302061acfdecba-userdata-shm.mount: Deactivated successfully.
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.382 2 DEBUG nova.virt.libvirt.host [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.383 2 DEBUG nova.virt.libvirt.host [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:25:51 np0005465988 systemd[1]: var-lib-containers-storage-overlay-570144e1a970adaae4f4cf9bd9c1fd9ce43161f700c79a92b125d5322a4464cf-merged.mount: Deactivated successfully.
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.390 2 DEBUG nova.virt.libvirt.host [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.391 2 DEBUG nova.virt.libvirt.host [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.392 2 DEBUG nova.virt.libvirt.driver [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.393 2 DEBUG nova.virt.hardware [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.393 2 DEBUG nova.virt.hardware [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.393 2 DEBUG nova.virt.hardware [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.394 2 DEBUG nova.virt.hardware [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.394 2 DEBUG nova.virt.hardware [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.394 2 DEBUG nova.virt.hardware [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.395 2 DEBUG nova.virt.hardware [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.395 2 DEBUG nova.virt.hardware [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.395 2 DEBUG nova.virt.hardware [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.395 2 DEBUG nova.virt.hardware [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.396 2 DEBUG nova.virt.hardware [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.396 2 DEBUG nova.objects.instance [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:51 np0005465988 podman[280613]: 2025-10-02 12:25:51.399909678 +0000 UTC m=+0.164195925 container cleanup 9c766e423ca50286d51a49696ffe40d9a853346559f4510219302061acfdecba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:25:51 np0005465988 systemd[1]: libpod-conmon-9c766e423ca50286d51a49696ffe40d9a853346559f4510219302061acfdecba.scope: Deactivated successfully.
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.445 2 DEBUG oslo_concurrency.processutils [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:51 np0005465988 podman[280649]: 2025-10-02 12:25:51.472232149 +0000 UTC m=+0.045356576 container remove 9c766e423ca50286d51a49696ffe40d9a853346559f4510219302061acfdecba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:25:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:51.480 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b19a065f-d804-4b8f-bcf0-271852733502]: (4, ('Thu Oct  2 12:25:51 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff (9c766e423ca50286d51a49696ffe40d9a853346559f4510219302061acfdecba)\n9c766e423ca50286d51a49696ffe40d9a853346559f4510219302061acfdecba\nThu Oct  2 12:25:51 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff (9c766e423ca50286d51a49696ffe40d9a853346559f4510219302061acfdecba)\n9c766e423ca50286d51a49696ffe40d9a853346559f4510219302061acfdecba\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:51.482 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ae8d3208-a195-4b0c-9a8c-387a070f0f99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:51.483 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:51 np0005465988 kernel: tapb2c62a66-f0: left promiscuous mode
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.488 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.489 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.500 2 DEBUG nova.compute.manager [req-37ef6b36-c35d-4309-be1d-c69945ce38cc req-40534c90-3ccf-4066-82bb-cb90f95004a5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.501 2 DEBUG oslo_concurrency.lockutils [req-37ef6b36-c35d-4309-be1d-c69945ce38cc req-40534c90-3ccf-4066-82bb-cb90f95004a5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.501 2 DEBUG oslo_concurrency.lockutils [req-37ef6b36-c35d-4309-be1d-c69945ce38cc req-40534c90-3ccf-4066-82bb-cb90f95004a5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.501 2 DEBUG oslo_concurrency.lockutils [req-37ef6b36-c35d-4309-be1d-c69945ce38cc req-40534c90-3ccf-4066-82bb-cb90f95004a5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.501 2 DEBUG nova.compute.manager [req-37ef6b36-c35d-4309-be1d-c69945ce38cc req-40534c90-3ccf-4066-82bb-cb90f95004a5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.502 2 WARNING nova.compute.manager [req-37ef6b36-c35d-4309-be1d-c69945ce38cc req-40534c90-3ccf-4066-82bb-cb90f95004a5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state active and task_state reboot_started_hard.#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:51.510 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1b3c8e7e-32fe-4ea6-9f43-899bc346745b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:51.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:51.552 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a813d351-193e-4af6-8f24-6d9d995921f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:51.554 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[eb62c02c-7b2b-497e-8d24-bd6870834078]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:51.574 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f2058a55-349e-4028-a9e2-d7746f2bada8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598089, 'reachable_time': 29704, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280665, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:51 np0005465988 systemd[1]: run-netns-ovnmeta\x2db2c62a66\x2df9bc\x2d4a45\x2da843\x2daef2e12a7fff.mount: Deactivated successfully.
Oct  2 08:25:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:51.581 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:25:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:51.582 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[0304254f-2991-4efc-8be6-82748e1a9ae7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:51.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:25:51 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2681305090' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.939 2 DEBUG oslo_concurrency.processutils [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:51 np0005465988 nova_compute[236126]: 2025-10-02 12:25:51.979 2 DEBUG oslo_concurrency.processutils [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:25:52 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3189314125' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.393 2 DEBUG oslo_concurrency.processutils [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.395 2 DEBUG nova.virt.libvirt.vif [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:25:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.396 2 DEBUG nova.network.os_vif_util [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.397 2 DEBUG nova.network.os_vif_util [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.398 2 DEBUG nova.objects.instance [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.447 2 DEBUG nova.virt.libvirt.driver [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  <uuid>87ebffd5-69af-414b-be5d-67ba42e8cae1</uuid>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  <name>instance-00000064</name>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerActionsTestJSON-server-131502281</nova:name>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:25:51</nova:creationTime>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <nova:user uuid="2bd16d1f5f9d4eb396c474eedee67165">tempest-ServerActionsTestJSON-842270816-project-member</nova:user>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <nova:project uuid="4b8ca48cb5f64ef3b0736b8be82378b8">tempest-ServerActionsTestJSON-842270816</nova:project>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <nova:port uuid="3bdb6970-487f-4313-ab25-aa900f8b084a">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <entry name="serial">87ebffd5-69af-414b-be5d-67ba42e8cae1</entry>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <entry name="uuid">87ebffd5-69af-414b-be5d-67ba42e8cae1</entry>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/87ebffd5-69af-414b-be5d-67ba42e8cae1_disk">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/87ebffd5-69af-414b-be5d-67ba42e8cae1_disk.config">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:22:0e:b9"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <target dev="tap3bdb6970-48"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1/console.log" append="off"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <input type="keyboard" bus="usb"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:25:52 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:25:52 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:25:52 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:25:52 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.450 2 DEBUG nova.virt.libvirt.driver [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.451 2 DEBUG nova.virt.libvirt.driver [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.452 2 DEBUG nova.virt.libvirt.vif [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:25:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.453 2 DEBUG nova.network.os_vif_util [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.454 2 DEBUG nova.network.os_vif_util [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.455 2 DEBUG os_vif [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.457 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.458 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.463 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3bdb6970-48, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.463 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3bdb6970-48, col_values=(('external_ids', {'iface-id': '3bdb6970-487f-4313-ab25-aa900f8b084a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:0e:b9', 'vm-uuid': '87ebffd5-69af-414b-be5d-67ba42e8cae1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.477 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.477 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:52 np0005465988 NetworkManager[45041]: <info>  [1759407952.4876] manager: (tap3bdb6970-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:25:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.497 2 INFO os_vif [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48')#033[00m
Oct  2 08:25:52 np0005465988 kernel: tap3bdb6970-48: entered promiscuous mode
Oct  2 08:25:52 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:52Z|00452|binding|INFO|Claiming lport 3bdb6970-487f-4313-ab25-aa900f8b084a for this chassis.
Oct  2 08:25:52 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:52Z|00453|binding|INFO|3bdb6970-487f-4313-ab25-aa900f8b084a: Claiming fa:16:3e:22:0e:b9 10.100.0.6
Oct  2 08:25:52 np0005465988 systemd-udevd[280592]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:52 np0005465988 NetworkManager[45041]: <info>  [1759407952.6131] manager: (tap3bdb6970-48): new Tun device (/org/freedesktop/NetworkManager/Devices/211)
Oct  2 08:25:52 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:52Z|00454|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a ovn-installed in OVS
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:52 np0005465988 nova_compute[236126]: 2025-10-02 12:25:52.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:52 np0005465988 NetworkManager[45041]: <info>  [1759407952.6335] device (tap3bdb6970-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:25:52 np0005465988 NetworkManager[45041]: <info>  [1759407952.6359] device (tap3bdb6970-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:25:52 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:52Z|00455|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a up in Southbound
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.645 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:0e:b9 10.100.0.6'], port_security=['fa:16:3e:22:0e:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87ebffd5-69af-414b-be5d-67ba42e8cae1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '11', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3bdb6970-487f-4313-ab25-aa900f8b084a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.648 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdb6970-487f-4313-ab25-aa900f8b084a in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff bound to our chassis#033[00m
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.652 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff#033[00m
Oct  2 08:25:52 np0005465988 systemd-machined[192594]: New machine qemu-43-instance-00000064.
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.669 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8987587e-77fc-4871-8719-6ad4e6b05892]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.670 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2c62a66-f1 in ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.673 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2c62a66-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.673 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d20cb56a-e68b-4196-a479-6f455aadc0db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.674 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[03af983a-ab1d-49c4-af47-5a21d452d09e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:52 np0005465988 systemd[1]: Started Virtual Machine qemu-43-instance-00000064.
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.688 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[841becaa-2e17-4a94-8c33-0cc24691f93e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.705 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9f97c75a-ae55-452d-b525-328bfd3d1b75]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.742 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e26ede-8723-4fbd-9b6d-d641d1f024e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.753 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bec5dc1c-8ffc-476b-a460-097c7e86e1b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:52 np0005465988 NetworkManager[45041]: <info>  [1759407952.7557] manager: (tapb2c62a66-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/212)
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.800 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8190b617-167f-4732-a1b0-60be6e4d1b0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.805 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[64cd8b58-2df0-4429-b6e3-79eb8c909845]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:52 np0005465988 NetworkManager[45041]: <info>  [1759407952.8403] device (tapb2c62a66-f0): carrier: link connected
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.854 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[398915c8-9fe8-43eb-acb9-40c425f1bd57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.885 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[791debd7-0a1a-4e48-a075-71e38e83c54c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 601317, 'reachable_time': 26000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280773, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.915 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0ac88cbd-d880-4f70-80e5-cf309b9cd738]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:7a7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 601317, 'tstamp': 601317}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280774, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:52.949 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f9b27d60-335f-461c-97c2-763b7fab01ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 601317, 'reachable_time': 26000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 280776, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:53.001 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[71512468-3144-404f-a9c4-994db1133061]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:53.100 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ef2a7c-7f41-4943-b8a8-1f9ec17d98d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:53.102 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:53.102 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:53.103 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2c62a66-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:53 np0005465988 NetworkManager[45041]: <info>  [1759407953.1058] manager: (tapb2c62a66-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/213)
Oct  2 08:25:53 np0005465988 kernel: tapb2c62a66-f0: entered promiscuous mode
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:53.110 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2c62a66-f0, col_values=(('external_ids', {'iface-id': '8c1234f6-f595-4979-94c0-98da2211f0e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:53 np0005465988 ovn_controller[132601]: 2025-10-02T12:25:53Z|00456|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:53.126 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:53.129 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[877482a9-efda-44cc-b4cd-31e7b61eaf37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:53.130 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:25:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:25:53.131 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'env', 'PROCESS_TAG=haproxy-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:53.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:53 np0005465988 podman[280849]: 2025-10-02 12:25:53.651356297 +0000 UTC m=+0.084344418 container create 1ed888cf00604ca1372c153d13bf96e49b5565d36e25347afc969eb19a330fc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:25:53 np0005465988 systemd[1]: Started libpod-conmon-1ed888cf00604ca1372c153d13bf96e49b5565d36e25347afc969eb19a330fc2.scope.
Oct  2 08:25:53 np0005465988 podman[280849]: 2025-10-02 12:25:53.598140766 +0000 UTC m=+0.031128917 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.695 2 DEBUG nova.compute.manager [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.696 2 DEBUG oslo_concurrency.lockutils [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.696 2 DEBUG oslo_concurrency.lockutils [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.696 2 DEBUG oslo_concurrency.lockutils [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:53 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.696 2 DEBUG nova.compute.manager [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.697 2 WARNING nova.compute.manager [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state active and task_state reboot_started_hard.#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.697 2 DEBUG nova.compute.manager [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.697 2 DEBUG oslo_concurrency.lockutils [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.697 2 DEBUG oslo_concurrency.lockutils [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.698 2 DEBUG oslo_concurrency.lockutils [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.698 2 DEBUG nova.compute.manager [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.698 2 WARNING nova.compute.manager [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state active and task_state reboot_started_hard.#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.698 2 DEBUG nova.compute.manager [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.698 2 DEBUG oslo_concurrency.lockutils [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.699 2 DEBUG oslo_concurrency.lockutils [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.699 2 DEBUG oslo_concurrency.lockutils [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.699 2 DEBUG nova.compute.manager [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.699 2 WARNING nova.compute.manager [req-060e37ec-978a-4cdc-a7e8-5863eb14c0ad req-c1bad5b3-c249-4837-8a35-72735125d020 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state active and task_state reboot_started_hard.#033[00m
Oct  2 08:25:53 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b4c8c5613f7cd0d02446f2d7e03a560ea0266981874cd2ef23f007f5078e168/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:25:53 np0005465988 podman[280849]: 2025-10-02 12:25:53.734237891 +0000 UTC m=+0.167226022 container init 1ed888cf00604ca1372c153d13bf96e49b5565d36e25347afc969eb19a330fc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:25:53 np0005465988 podman[280849]: 2025-10-02 12:25:53.74150286 +0000 UTC m=+0.174490971 container start 1ed888cf00604ca1372c153d13bf96e49b5565d36e25347afc969eb19a330fc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:25:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:53.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:53 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[280865]: [NOTICE]   (280879) : New worker (280881) forked
Oct  2 08:25:53 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[280865]: [NOTICE]   (280879) : Loading success.
Oct  2 08:25:53 np0005465988 podman[280862]: 2025-10-02 12:25:53.806672995 +0000 UTC m=+0.114038672 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.886 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 87ebffd5-69af-414b-be5d-67ba42e8cae1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.887 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407953.8861327, 87ebffd5-69af-414b-be5d-67ba42e8cae1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.887 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.889 2 DEBUG nova.compute.manager [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.893 2 INFO nova.virt.libvirt.driver [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance rebooted successfully.#033[00m
Oct  2 08:25:53 np0005465988 nova_compute[236126]: 2025-10-02 12:25:53.893 2 DEBUG nova.compute.manager [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:54 np0005465988 nova_compute[236126]: 2025-10-02 12:25:54.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:55.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:55.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:57 np0005465988 nova_compute[236126]: 2025-10-02 12:25:57.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:25:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:25:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:57.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:25:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:25:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:57.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:25:58 np0005465988 nova_compute[236126]: 2025-10-02 12:25:58.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:59 np0005465988 nova_compute[236126]: 2025-10-02 12:25:59.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:25:59.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:25:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:25:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:25:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:25:59.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:00 np0005465988 nova_compute[236126]: 2025-10-02 12:26:00.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:00 np0005465988 nova_compute[236126]: 2025-10-02 12:26:00.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:26:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:01.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:01.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:02 np0005465988 nova_compute[236126]: 2025-10-02 12:26:02.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:26:03 np0005465988 nova_compute[236126]: 2025-10-02 12:26:03.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:03.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:03.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:05.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:26:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:05.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:26:06 np0005465988 nova_compute[236126]: 2025-10-02 12:26:06.251 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:06 np0005465988 nova_compute[236126]: 2025-10-02 12:26:06.258 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 7b4bdbc9-7451-4500-8794-c8edef50d6a4] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902#033[00m
Oct  2 08:26:06 np0005465988 nova_compute[236126]: 2025-10-02 12:26:06.259 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:26:06 np0005465988 nova_compute[236126]: 2025-10-02 12:26:06.260 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:26:06 np0005465988 nova_compute[236126]: 2025-10-02 12:26:06.680 2 DEBUG oslo_concurrency.lockutils [None req-9d194be4-4c0b-4e0e-968e-6e0d7dfbc619 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 18.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:06 np0005465988 nova_compute[236126]: 2025-10-02 12:26:06.811 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407953.887065, 87ebffd5-69af-414b-be5d-67ba42e8cae1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:06 np0005465988 nova_compute[236126]: 2025-10-02 12:26:06.812 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] VM Started (Lifecycle Event)#033[00m
Oct  2 08:26:06 np0005465988 nova_compute[236126]: 2025-10-02 12:26:06.993 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:06 np0005465988 nova_compute[236126]: 2025-10-02 12:26:06.996 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:26:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:07Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:0e:b9 10.100.0.6
Oct  2 08:26:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:26:07 np0005465988 nova_compute[236126]: 2025-10-02 12:26:07.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:07.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:07.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:08 np0005465988 nova_compute[236126]: 2025-10-02 12:26:08.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:09 np0005465988 nova_compute[236126]: 2025-10-02 12:26:09.255 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:26:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:09.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:26:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:09.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e299 e299: 3 total, 3 up, 3 in
Oct  2 08:26:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:11.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:11.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:26:12 np0005465988 nova_compute[236126]: 2025-10-02 12:26:12.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:13 np0005465988 nova_compute[236126]: 2025-10-02 12:26:13.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:13.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:13.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:14 np0005465988 podman[280958]: 2025-10-02 12:26:14.567907982 +0000 UTC m=+0.081855205 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:26:14 np0005465988 podman[280957]: 2025-10-02 12:26:14.583004437 +0000 UTC m=+0.096846727 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid)
Oct  2 08:26:14 np0005465988 podman[280956]: 2025-10-02 12:26:14.591150001 +0000 UTC m=+0.115711240 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Oct  2 08:26:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:15.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:15.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e300 e300: 3 total, 3 up, 3 in
Oct  2 08:26:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:26:17 np0005465988 nova_compute[236126]: 2025-10-02 12:26:17.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:17.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:17 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #50. Immutable memtables: 7.
Oct  2 08:26:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:17.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:18 np0005465988 nova_compute[236126]: 2025-10-02 12:26:18.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:26:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:19.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:26:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:19.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:21.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:21.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:21 np0005465988 nova_compute[236126]: 2025-10-02 12:26:21.975 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "9bca5e7a-108e-472a-80ce-ec40358d5475" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:21 np0005465988 nova_compute[236126]: 2025-10-02 12:26:21.975 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.025 2 DEBUG nova.compute.manager [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.149 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.151 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.159 2 DEBUG nova.virt.hardware [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.159 2 INFO nova.compute.claims [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.350 2 DEBUG oslo_concurrency.processutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:26:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e301 e301: 3 total, 3 up, 3 in
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:26:22 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2806093705' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.810 2 DEBUG oslo_concurrency.processutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.817 2 DEBUG nova.compute.provider_tree [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.842 2 DEBUG nova.scheduler.client.report [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.872 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.873 2 DEBUG nova.compute.manager [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.950 2 DEBUG nova.compute.manager [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.950 2 DEBUG nova.network.neutron [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:26:22 np0005465988 nova_compute[236126]: 2025-10-02 12:26:22.979 2 INFO nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.006 2 DEBUG nova.compute.manager [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:26:23 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:23Z|00457|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.118 2 DEBUG nova.compute.manager [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.120 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.120 2 INFO nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Creating image(s)#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.146 2 DEBUG nova.storage.rbd_utils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.178 2 DEBUG nova.storage.rbd_utils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.202 2 DEBUG nova.storage.rbd_utils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.206 2 DEBUG oslo_concurrency.processutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.241 2 DEBUG nova.policy [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2bd16d1f5f9d4eb396c474eedee67165', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.282 2 DEBUG oslo_concurrency.processutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.282 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.283 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.283 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.309 2 DEBUG nova.storage.rbd_utils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:26:23 np0005465988 nova_compute[236126]: 2025-10-02 12:26:23.313 2 DEBUG oslo_concurrency.processutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 9bca5e7a-108e-472a-80ce-ec40358d5475_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:23.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:23.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:24 np0005465988 nova_compute[236126]: 2025-10-02 12:26:24.114 2 DEBUG nova.network.neutron [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Successfully created port: 288a32fc-3d4a-4184-a507-2629d1d19415 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:26:24 np0005465988 nova_compute[236126]: 2025-10-02 12:26:24.142 2 DEBUG oslo_concurrency.processutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 9bca5e7a-108e-472a-80ce-ec40358d5475_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.829s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:24 np0005465988 nova_compute[236126]: 2025-10-02 12:26:24.240 2 DEBUG nova.storage.rbd_utils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] resizing rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:26:24 np0005465988 nova_compute[236126]: 2025-10-02 12:26:24.374 2 DEBUG nova.objects.instance [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'migration_context' on Instance uuid 9bca5e7a-108e-472a-80ce-ec40358d5475 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:24 np0005465988 nova_compute[236126]: 2025-10-02 12:26:24.391 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:26:24 np0005465988 nova_compute[236126]: 2025-10-02 12:26:24.391 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Ensure instance console log exists: /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:26:24 np0005465988 nova_compute[236126]: 2025-10-02 12:26:24.392 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:24 np0005465988 nova_compute[236126]: 2025-10-02 12:26:24.393 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:24 np0005465988 nova_compute[236126]: 2025-10-02 12:26:24.393 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:24 np0005465988 podman[281263]: 2025-10-02 12:26:24.520219662 +0000 UTC m=+0.064011802 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:26:25 np0005465988 nova_compute[236126]: 2025-10-02 12:26:25.060 2 DEBUG nova.network.neutron [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Successfully updated port: 288a32fc-3d4a-4184-a507-2629d1d19415 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:26:25 np0005465988 nova_compute[236126]: 2025-10-02 12:26:25.080 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "refresh_cache-9bca5e7a-108e-472a-80ce-ec40358d5475" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:26:25 np0005465988 nova_compute[236126]: 2025-10-02 12:26:25.081 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquired lock "refresh_cache-9bca5e7a-108e-472a-80ce-ec40358d5475" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:26:25 np0005465988 nova_compute[236126]: 2025-10-02 12:26:25.081 2 DEBUG nova.network.neutron [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:26:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:25.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:25 np0005465988 nova_compute[236126]: 2025-10-02 12:26:25.575 2 DEBUG nova.network.neutron [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:26:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:25.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.365 2 DEBUG nova.compute.manager [req-a18257e5-bd50-4ffa-9eba-b25258babd2d req-c99e2565-956f-4208-a909-d42d22f35ffb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received event network-changed-288a32fc-3d4a-4184-a507-2629d1d19415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.365 2 DEBUG nova.compute.manager [req-a18257e5-bd50-4ffa-9eba-b25258babd2d req-c99e2565-956f-4208-a909-d42d22f35ffb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Refreshing instance network info cache due to event network-changed-288a32fc-3d4a-4184-a507-2629d1d19415. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.366 2 DEBUG oslo_concurrency.lockutils [req-a18257e5-bd50-4ffa-9eba-b25258babd2d req-c99e2565-956f-4208-a909-d42d22f35ffb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-9bca5e7a-108e-472a-80ce-ec40358d5475" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.595 2 DEBUG nova.network.neutron [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Updating instance_info_cache with network_info: [{"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.613 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Releasing lock "refresh_cache-9bca5e7a-108e-472a-80ce-ec40358d5475" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.614 2 DEBUG nova.compute.manager [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Instance network_info: |[{"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.614 2 DEBUG oslo_concurrency.lockutils [req-a18257e5-bd50-4ffa-9eba-b25258babd2d req-c99e2565-956f-4208-a909-d42d22f35ffb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-9bca5e7a-108e-472a-80ce-ec40358d5475" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.614 2 DEBUG nova.network.neutron [req-a18257e5-bd50-4ffa-9eba-b25258babd2d req-c99e2565-956f-4208-a909-d42d22f35ffb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Refreshing network info cache for port 288a32fc-3d4a-4184-a507-2629d1d19415 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.617 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Start _get_guest_xml network_info=[{"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.627 2 WARNING nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.634 2 DEBUG nova.virt.libvirt.host [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.635 2 DEBUG nova.virt.libvirt.host [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.643 2 DEBUG nova.virt.libvirt.host [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.643 2 DEBUG nova.virt.libvirt.host [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.645 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.645 2 DEBUG nova.virt.hardware [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.645 2 DEBUG nova.virt.hardware [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.646 2 DEBUG nova.virt.hardware [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.646 2 DEBUG nova.virt.hardware [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.646 2 DEBUG nova.virt.hardware [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.646 2 DEBUG nova.virt.hardware [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.646 2 DEBUG nova.virt.hardware [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.647 2 DEBUG nova.virt.hardware [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.647 2 DEBUG nova.virt.hardware [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.647 2 DEBUG nova.virt.hardware [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.647 2 DEBUG nova.virt.hardware [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:26:26 np0005465988 nova_compute[236126]: 2025-10-02 12:26:26.650 2 DEBUG oslo_concurrency.processutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:27 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:27Z|00458|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:26:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:26:27 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2256351142' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.180 2 DEBUG oslo_concurrency.processutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.223 2 DEBUG nova.storage.rbd_utils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.228 2 DEBUG oslo_concurrency.processutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 e302: 3 total, 3 up, 3 in
Oct  2 08:26:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:27.355 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:27.356 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:27.357 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:26:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:27.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:26:27 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/49128928' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.681 2 DEBUG oslo_concurrency.processutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.683 2 DEBUG nova.virt.libvirt.vif [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:26:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1542100027',display_name='tempest-tempest.common.compute-instance-1542100027',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1542100027',id=105,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-h6tzp1ed',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:26:23Z,user_data=None,user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=9bca5e7a-108e-472a-80ce-ec40358d5475,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.684 2 DEBUG nova.network.os_vif_util [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.686 2 DEBUG nova.network.os_vif_util [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:01:b6,bridge_name='br-int',has_traffic_filtering=True,id=288a32fc-3d4a-4184-a507-2629d1d19415,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap288a32fc-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.689 2 DEBUG nova.objects.instance [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9bca5e7a-108e-472a-80ce-ec40358d5475 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.770 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  <uuid>9bca5e7a-108e-472a-80ce-ec40358d5475</uuid>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  <name>instance-00000069</name>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <nova:name>tempest-tempest.common.compute-instance-1542100027</nova:name>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:26:26</nova:creationTime>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <nova:user uuid="2bd16d1f5f9d4eb396c474eedee67165">tempest-ServerActionsTestJSON-842270816-project-member</nova:user>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <nova:project uuid="4b8ca48cb5f64ef3b0736b8be82378b8">tempest-ServerActionsTestJSON-842270816</nova:project>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <nova:port uuid="288a32fc-3d4a-4184-a507-2629d1d19415">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <entry name="serial">9bca5e7a-108e-472a-80ce-ec40358d5475</entry>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <entry name="uuid">9bca5e7a-108e-472a-80ce-ec40358d5475</entry>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/9bca5e7a-108e-472a-80ce-ec40358d5475_disk">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/9bca5e7a-108e-472a-80ce-ec40358d5475_disk.config">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:fd:01:b6"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <target dev="tap288a32fc-3d"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/console.log" append="off"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:26:27 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:26:27 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:26:27 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:26:27 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.772 2 DEBUG nova.compute.manager [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Preparing to wait for external event network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.773 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.773 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.774 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.774 2 DEBUG nova.virt.libvirt.vif [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:26:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1542100027',display_name='tempest-tempest.common.compute-instance-1542100027',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1542100027',id=105,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-h6tzp1ed',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:26:23Z,user_data=None,user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=9bca5e7a-108e-472a-80ce-ec40358d5475,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.775 2 DEBUG nova.network.os_vif_util [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.775 2 DEBUG nova.network.os_vif_util [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:01:b6,bridge_name='br-int',has_traffic_filtering=True,id=288a32fc-3d4a-4184-a507-2629d1d19415,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap288a32fc-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.776 2 DEBUG os_vif [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:01:b6,bridge_name='br-int',has_traffic_filtering=True,id=288a32fc-3d4a-4184-a507-2629d1d19415,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap288a32fc-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.777 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.777 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.782 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap288a32fc-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.782 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap288a32fc-3d, col_values=(('external_ids', {'iface-id': '288a32fc-3d4a-4184-a507-2629d1d19415', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:01:b6', 'vm-uuid': '9bca5e7a-108e-472a-80ce-ec40358d5475'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:27 np0005465988 NetworkManager[45041]: <info>  [1759407987.7850] manager: (tap288a32fc-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.800 2 INFO os_vif [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:01:b6,bridge_name='br-int',has_traffic_filtering=True,id=288a32fc-3d4a-4184-a507-2629d1d19415,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap288a32fc-3d')#033[00m
Oct  2 08:26:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:27.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.899 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.900 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.900 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No VIF found with MAC fa:16:3e:fd:01:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.901 2 INFO nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Using config drive#033[00m
Oct  2 08:26:27 np0005465988 nova_compute[236126]: 2025-10-02 12:26:27.937 2 DEBUG nova.storage.rbd_utils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:26:28 np0005465988 nova_compute[236126]: 2025-10-02 12:26:28.188 2 DEBUG nova.network.neutron [req-a18257e5-bd50-4ffa-9eba-b25258babd2d req-c99e2565-956f-4208-a909-d42d22f35ffb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Updated VIF entry in instance network info cache for port 288a32fc-3d4a-4184-a507-2629d1d19415. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:26:28 np0005465988 nova_compute[236126]: 2025-10-02 12:26:28.189 2 DEBUG nova.network.neutron [req-a18257e5-bd50-4ffa-9eba-b25258babd2d req-c99e2565-956f-4208-a909-d42d22f35ffb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Updating instance_info_cache with network_info: [{"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:26:28 np0005465988 nova_compute[236126]: 2025-10-02 12:26:28.207 2 DEBUG oslo_concurrency.lockutils [req-a18257e5-bd50-4ffa-9eba-b25258babd2d req-c99e2565-956f-4208-a909-d42d22f35ffb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-9bca5e7a-108e-472a-80ce-ec40358d5475" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:26:28 np0005465988 nova_compute[236126]: 2025-10-02 12:26:28.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:28 np0005465988 nova_compute[236126]: 2025-10-02 12:26:28.353 2 INFO nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Creating config drive at /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/disk.config#033[00m
Oct  2 08:26:28 np0005465988 nova_compute[236126]: 2025-10-02 12:26:28.362 2 DEBUG oslo_concurrency.processutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmlzl3h7b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:28 np0005465988 nova_compute[236126]: 2025-10-02 12:26:28.513 2 DEBUG oslo_concurrency.processutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmlzl3h7b" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:28 np0005465988 nova_compute[236126]: 2025-10-02 12:26:28.547 2 DEBUG nova.storage.rbd_utils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:26:28 np0005465988 nova_compute[236126]: 2025-10-02 12:26:28.552 2 DEBUG oslo_concurrency.processutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/disk.config 9bca5e7a-108e-472a-80ce-ec40358d5475_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:28 np0005465988 nova_compute[236126]: 2025-10-02 12:26:28.746 2 DEBUG oslo_concurrency.processutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/disk.config 9bca5e7a-108e-472a-80ce-ec40358d5475_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.194s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:28 np0005465988 nova_compute[236126]: 2025-10-02 12:26:28.748 2 INFO nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Deleting local config drive /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/disk.config because it was imported into RBD.#033[00m
Oct  2 08:26:28 np0005465988 kernel: tap288a32fc-3d: entered promiscuous mode
Oct  2 08:26:28 np0005465988 NetworkManager[45041]: <info>  [1759407988.8159] manager: (tap288a32fc-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/215)
Oct  2 08:26:28 np0005465988 nova_compute[236126]: 2025-10-02 12:26:28.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:28Z|00459|binding|INFO|Claiming lport 288a32fc-3d4a-4184-a507-2629d1d19415 for this chassis.
Oct  2 08:26:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:28Z|00460|binding|INFO|288a32fc-3d4a-4184-a507-2629d1d19415: Claiming fa:16:3e:fd:01:b6 10.100.0.10
Oct  2 08:26:28 np0005465988 systemd-udevd[281419]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:26:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:28Z|00461|binding|INFO|Setting lport 288a32fc-3d4a-4184-a507-2629d1d19415 ovn-installed in OVS
Oct  2 08:26:28 np0005465988 nova_compute[236126]: 2025-10-02 12:26:28.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:28 np0005465988 systemd-machined[192594]: New machine qemu-44-instance-00000069.
Oct  2 08:26:28 np0005465988 nova_compute[236126]: 2025-10-02 12:26:28.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:28.873 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:01:b6 10.100.0.10'], port_security=['fa:16:3e:fd:01:b6 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9bca5e7a-108e-472a-80ce-ec40358d5475', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '062aeef7-5182-4ff6-9976-014dcb98df92', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=288a32fc-3d4a-4184-a507-2629d1d19415) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:26:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:28Z|00462|binding|INFO|Setting lport 288a32fc-3d4a-4184-a507-2629d1d19415 up in Southbound
Oct  2 08:26:28 np0005465988 NetworkManager[45041]: <info>  [1759407988.8760] device (tap288a32fc-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:26:28 np0005465988 NetworkManager[45041]: <info>  [1759407988.8779] device (tap288a32fc-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:26:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:28.877 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 288a32fc-3d4a-4184-a507-2629d1d19415 in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff bound to our chassis#033[00m
Oct  2 08:26:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:28.880 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff#033[00m
Oct  2 08:26:28 np0005465988 systemd[1]: Started Virtual Machine qemu-44-instance-00000069.
Oct  2 08:26:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:28.901 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[32cdeef9-46b0-4ba5-be60-9ee7567f7bc2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:28.942 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[12c10bdb-edc5-49e7-acef-46c6de15e855]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:28.945 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c56f9784-8869-4826-ad2a-40f683d709d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:28.986 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2928ddac-0a4a-47ce-b122-716ae7988b54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:29.014 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[78275608-49bf-4ea2-9cfe-ed9e1c2b6cfb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 601317, 'reachable_time': 26000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281435, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:29.039 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[81da00dd-3599-4ae1-80bd-dd7259757bff]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb2c62a66-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 601337, 'tstamp': 601337}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281436, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb2c62a66-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 601342, 'tstamp': 601342}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281436, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:29.041 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:29 np0005465988 nova_compute[236126]: 2025-10-02 12:26:29.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:29 np0005465988 nova_compute[236126]: 2025-10-02 12:26:29.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:29.045 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2c62a66-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:29.046 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:26:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:29.046 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2c62a66-f0, col_values=(('external_ids', {'iface-id': '8c1234f6-f595-4979-94c0-98da2211f0e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:29.047 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:26:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:29.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:29.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:29 np0005465988 nova_compute[236126]: 2025-10-02 12:26:29.949 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407989.9489555, 9bca5e7a-108e-472a-80ce-ec40358d5475 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:29 np0005465988 nova_compute[236126]: 2025-10-02 12:26:29.950 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] VM Started (Lifecycle Event)#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.212 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.220 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407989.9492111, 9bca5e7a-108e-472a-80ce-ec40358d5475 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.221 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.256 2 DEBUG nova.compute.manager [req-b3a179b7-6efb-4f19-9dcb-68d6bad40532 req-9d5b4828-c0ee-4fe7-95e7-9a5aaf41829e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received event network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.256 2 DEBUG oslo_concurrency.lockutils [req-b3a179b7-6efb-4f19-9dcb-68d6bad40532 req-9d5b4828-c0ee-4fe7-95e7-9a5aaf41829e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.257 2 DEBUG oslo_concurrency.lockutils [req-b3a179b7-6efb-4f19-9dcb-68d6bad40532 req-9d5b4828-c0ee-4fe7-95e7-9a5aaf41829e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.257 2 DEBUG oslo_concurrency.lockutils [req-b3a179b7-6efb-4f19-9dcb-68d6bad40532 req-9d5b4828-c0ee-4fe7-95e7-9a5aaf41829e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.257 2 DEBUG nova.compute.manager [req-b3a179b7-6efb-4f19-9dcb-68d6bad40532 req-9d5b4828-c0ee-4fe7-95e7-9a5aaf41829e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Processing event network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.258 2 DEBUG nova.compute.manager [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.262 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.264 2 INFO nova.virt.libvirt.driver [-] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Instance spawned successfully.#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.265 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.702 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.707 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759407990.2614846, 9bca5e7a-108e-472a-80ce-ec40358d5475 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.708 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.710 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.710 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.711 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.711 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.711 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.712 2 DEBUG nova.virt.libvirt.driver [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.805 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.810 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.841 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.875 2 INFO nova.compute.manager [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Took 7.76 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.875 2 DEBUG nova.compute.manager [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:30 np0005465988 nova_compute[236126]: 2025-10-02 12:26:30.966 2 INFO nova.compute.manager [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Took 8.85 seconds to build instance.#033[00m
Oct  2 08:26:31 np0005465988 nova_compute[236126]: 2025-10-02 12:26:31.016 2 DEBUG oslo_concurrency.lockutils [None req-bb91d5e9-fcae-40e7-9cfb-870eb6b4634c 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:31.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:31.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:26:32 np0005465988 nova_compute[236126]: 2025-10-02 12:26:32.688 2 DEBUG nova.compute.manager [req-b4e7f532-acda-4fc4-adf2-60574f3653d9 req-c6e8b51f-7d01-4c49-a6fa-db35b15a1ca9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received event network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:32 np0005465988 nova_compute[236126]: 2025-10-02 12:26:32.689 2 DEBUG oslo_concurrency.lockutils [req-b4e7f532-acda-4fc4-adf2-60574f3653d9 req-c6e8b51f-7d01-4c49-a6fa-db35b15a1ca9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:32 np0005465988 nova_compute[236126]: 2025-10-02 12:26:32.689 2 DEBUG oslo_concurrency.lockutils [req-b4e7f532-acda-4fc4-adf2-60574f3653d9 req-c6e8b51f-7d01-4c49-a6fa-db35b15a1ca9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:32 np0005465988 nova_compute[236126]: 2025-10-02 12:26:32.690 2 DEBUG oslo_concurrency.lockutils [req-b4e7f532-acda-4fc4-adf2-60574f3653d9 req-c6e8b51f-7d01-4c49-a6fa-db35b15a1ca9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:32 np0005465988 nova_compute[236126]: 2025-10-02 12:26:32.690 2 DEBUG nova.compute.manager [req-b4e7f532-acda-4fc4-adf2-60574f3653d9 req-c6e8b51f-7d01-4c49-a6fa-db35b15a1ca9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] No waiting events found dispatching network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:32 np0005465988 nova_compute[236126]: 2025-10-02 12:26:32.691 2 WARNING nova.compute.manager [req-b4e7f532-acda-4fc4-adf2-60574f3653d9 req-c6e8b51f-7d01-4c49-a6fa-db35b15a1ca9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received unexpected event network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:26:32 np0005465988 nova_compute[236126]: 2025-10-02 12:26:32.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:33 np0005465988 nova_compute[236126]: 2025-10-02 12:26:33.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:33.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:33.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:34 np0005465988 nova_compute[236126]: 2025-10-02 12:26:34.601 2 INFO nova.compute.manager [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Rebuilding instance#033[00m
Oct  2 08:26:34 np0005465988 nova_compute[236126]: 2025-10-02 12:26:34.979 2 DEBUG nova.objects.instance [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9bca5e7a-108e-472a-80ce-ec40358d5475 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:35 np0005465988 nova_compute[236126]: 2025-10-02 12:26:35.005 2 DEBUG nova.compute.manager [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:35 np0005465988 nova_compute[236126]: 2025-10-02 12:26:35.055 2 DEBUG nova.objects.instance [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'pci_requests' on Instance uuid 9bca5e7a-108e-472a-80ce-ec40358d5475 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:35 np0005465988 nova_compute[236126]: 2025-10-02 12:26:35.068 2 DEBUG nova.objects.instance [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9bca5e7a-108e-472a-80ce-ec40358d5475 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:35 np0005465988 nova_compute[236126]: 2025-10-02 12:26:35.079 2 DEBUG nova.objects.instance [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'resources' on Instance uuid 9bca5e7a-108e-472a-80ce-ec40358d5475 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:35 np0005465988 nova_compute[236126]: 2025-10-02 12:26:35.093 2 DEBUG nova.objects.instance [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'migration_context' on Instance uuid 9bca5e7a-108e-472a-80ce-ec40358d5475 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:35 np0005465988 nova_compute[236126]: 2025-10-02 12:26:35.106 2 DEBUG nova.objects.instance [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:26:35 np0005465988 nova_compute[236126]: 2025-10-02 12:26:35.110 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:26:35 np0005465988 nova_compute[236126]: 2025-10-02 12:26:35.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:35.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:35.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:36 np0005465988 nova_compute[236126]: 2025-10-02 12:26:36.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:36.435 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:26:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:36.437 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:26:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct  2 08:26:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct  2 08:26:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:26:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:26:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:37.441 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:26:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:37.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:37 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct  2 08:26:37 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:26:37 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:26:37 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:26:37 np0005465988 nova_compute[236126]: 2025-10-02 12:26:37.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:37.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:38 np0005465988 nova_compute[236126]: 2025-10-02 12:26:38.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:38 np0005465988 nova_compute[236126]: 2025-10-02 12:26:38.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:39.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:39.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:40 np0005465988 nova_compute[236126]: 2025-10-02 12:26:40.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:40 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:40Z|00463|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:26:40 np0005465988 nova_compute[236126]: 2025-10-02 12:26:40.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:41.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:41.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:26:42 np0005465988 nova_compute[236126]: 2025-10-02 12:26:42.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:42 np0005465988 nova_compute[236126]: 2025-10-02 12:26:42.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:42 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:42Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fd:01:b6 10.100.0.10
Oct  2 08:26:42 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:42Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fd:01:b6 10.100.0.10
Oct  2 08:26:43 np0005465988 nova_compute[236126]: 2025-10-02 12:26:43.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:43.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:43.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:26:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:26:45 np0005465988 nova_compute[236126]: 2025-10-02 12:26:45.157 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:26:45 np0005465988 podman[281840]: 2025-10-02 12:26:45.535125065 +0000 UTC m=+0.063934850 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:26:45 np0005465988 podman[281839]: 2025-10-02 12:26:45.562179373 +0000 UTC m=+0.089418833 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:26:45 np0005465988 podman[281838]: 2025-10-02 12:26:45.569640858 +0000 UTC m=+0.097852076 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  2 08:26:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:45.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:45.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:26:47 np0005465988 kernel: tap288a32fc-3d (unregistering): left promiscuous mode
Oct  2 08:26:47 np0005465988 NetworkManager[45041]: <info>  [1759408007.5466] device (tap288a32fc-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:26:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:47Z|00464|binding|INFO|Releasing lport 288a32fc-3d4a-4184-a507-2629d1d19415 from this chassis (sb_readonly=0)
Oct  2 08:26:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:47Z|00465|binding|INFO|Setting lport 288a32fc-3d4a-4184-a507-2629d1d19415 down in Southbound
Oct  2 08:26:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:47Z|00466|binding|INFO|Removing iface tap288a32fc-3d ovn-installed in OVS
Oct  2 08:26:47 np0005465988 nova_compute[236126]: 2025-10-02 12:26:47.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:47 np0005465988 nova_compute[236126]: 2025-10-02 12:26:47.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:47.570 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:01:b6 10.100.0.10'], port_security=['fa:16:3e:fd:01:b6 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9bca5e7a-108e-472a-80ce-ec40358d5475', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '062aeef7-5182-4ff6-9976-014dcb98df92', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=288a32fc-3d4a-4184-a507-2629d1d19415) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:26:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:47.571 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 288a32fc-3d4a-4184-a507-2629d1d19415 in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff unbound from our chassis#033[00m
Oct  2 08:26:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:47.572 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff#033[00m
Oct  2 08:26:47 np0005465988 nova_compute[236126]: 2025-10-02 12:26:47.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:47.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:47.588 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c1539b91-a8a1-4ab4-9d64-af6a3ad558e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:47 np0005465988 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000069.scope: Deactivated successfully.
Oct  2 08:26:47 np0005465988 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000069.scope: Consumed 13.729s CPU time.
Oct  2 08:26:47 np0005465988 systemd-machined[192594]: Machine qemu-44-instance-00000069 terminated.
Oct  2 08:26:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:47.624 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2dc00e35-b691-473c-b8f2-a6459e915cad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:47.629 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c74493ae-a06b-4362-b993-7b5c171bb4cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:47.654 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[13dcb956-1a39-4cb5-838f-16d6014c8288]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:47.672 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3d37af54-a9f2-460f-9c8f-7a8fa2f843da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 916, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 916, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 601317, 'reachable_time': 26000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281916, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:47.688 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[66e3024d-2c74-4f39-8e0a-98c13b7f4efe]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb2c62a66-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 601337, 'tstamp': 601337}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281917, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb2c62a66-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 601342, 'tstamp': 601342}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281917, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:47.689 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:47 np0005465988 nova_compute[236126]: 2025-10-02 12:26:47.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:47 np0005465988 nova_compute[236126]: 2025-10-02 12:26:47.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:47.698 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2c62a66-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:47.699 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:26:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:47.699 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2c62a66-f0, col_values=(('external_ids', {'iface-id': '8c1234f6-f595-4979-94c0-98da2211f0e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:47.699 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:26:47 np0005465988 nova_compute[236126]: 2025-10-02 12:26:47.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:47.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:47 np0005465988 nova_compute[236126]: 2025-10-02 12:26:47.964 2 DEBUG nova.compute.manager [req-e72d682e-75c9-4039-a0cb-e714f2c2c34a req-ee4a8cb8-1799-4665-a94c-8ab0eb554323 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received event network-vif-unplugged-288a32fc-3d4a-4184-a507-2629d1d19415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:47 np0005465988 nova_compute[236126]: 2025-10-02 12:26:47.965 2 DEBUG oslo_concurrency.lockutils [req-e72d682e-75c9-4039-a0cb-e714f2c2c34a req-ee4a8cb8-1799-4665-a94c-8ab0eb554323 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:47 np0005465988 nova_compute[236126]: 2025-10-02 12:26:47.965 2 DEBUG oslo_concurrency.lockutils [req-e72d682e-75c9-4039-a0cb-e714f2c2c34a req-ee4a8cb8-1799-4665-a94c-8ab0eb554323 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:47 np0005465988 nova_compute[236126]: 2025-10-02 12:26:47.965 2 DEBUG oslo_concurrency.lockutils [req-e72d682e-75c9-4039-a0cb-e714f2c2c34a req-ee4a8cb8-1799-4665-a94c-8ab0eb554323 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:47 np0005465988 nova_compute[236126]: 2025-10-02 12:26:47.965 2 DEBUG nova.compute.manager [req-e72d682e-75c9-4039-a0cb-e714f2c2c34a req-ee4a8cb8-1799-4665-a94c-8ab0eb554323 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] No waiting events found dispatching network-vif-unplugged-288a32fc-3d4a-4184-a507-2629d1d19415 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:47 np0005465988 nova_compute[236126]: 2025-10-02 12:26:47.966 2 WARNING nova.compute.manager [req-e72d682e-75c9-4039-a0cb-e714f2c2c34a req-ee4a8cb8-1799-4665-a94c-8ab0eb554323 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received unexpected event network-vif-unplugged-288a32fc-3d4a-4184-a507-2629d1d19415 for instance with vm_state active and task_state rebuilding.#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.175 2 INFO nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.180 2 INFO nova.virt.libvirt.driver [-] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Instance destroyed successfully.#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.185 2 INFO nova.virt.libvirt.driver [-] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Instance destroyed successfully.#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.185 2 DEBUG nova.virt.libvirt.vif [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:26:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1542100027',display_name='tempest-ServerActionsTestJSON-server-798919841',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1542100027',id=105,image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:26:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-h6tzp1ed',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-projec
t-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:26:33Z,user_data=None,user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=9bca5e7a-108e-472a-80ce-ec40358d5475,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.186 2 DEBUG nova.network.os_vif_util [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.187 2 DEBUG nova.network.os_vif_util [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:01:b6,bridge_name='br-int',has_traffic_filtering=True,id=288a32fc-3d4a-4184-a507-2629d1d19415,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap288a32fc-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.187 2 DEBUG os_vif [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:01:b6,bridge_name='br-int',has_traffic_filtering=True,id=288a32fc-3d4a-4184-a507-2629d1d19415,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap288a32fc-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.189 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap288a32fc-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.193 2 INFO os_vif [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:01:b6,bridge_name='br-int',has_traffic_filtering=True,id=288a32fc-3d4a-4184-a507-2629d1d19415,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap288a32fc-3d')#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.969 2 INFO nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Deleting instance files /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475_del#033[00m
Oct  2 08:26:48 np0005465988 nova_compute[236126]: 2025-10-02 12:26:48.970 2 INFO nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Deletion of /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475_del complete#033[00m
Oct  2 08:26:49 np0005465988 nova_compute[236126]: 2025-10-02 12:26:49.163 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:26:49 np0005465988 nova_compute[236126]: 2025-10-02 12:26:49.164 2 INFO nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Creating image(s)#033[00m
Oct  2 08:26:49 np0005465988 nova_compute[236126]: 2025-10-02 12:26:49.194 2 DEBUG nova.storage.rbd_utils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:26:49 np0005465988 nova_compute[236126]: 2025-10-02 12:26:49.231 2 DEBUG nova.storage.rbd_utils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:26:49 np0005465988 nova_compute[236126]: 2025-10-02 12:26:49.260 2 DEBUG nova.storage.rbd_utils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:26:49 np0005465988 nova_compute[236126]: 2025-10-02 12:26:49.264 2 DEBUG oslo_concurrency.lockutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "5133c8c7459ce4fa1cf043a638fc1b5c66ed8609" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:49 np0005465988 nova_compute[236126]: 2025-10-02 12:26:49.265 2 DEBUG oslo_concurrency.lockutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "5133c8c7459ce4fa1cf043a638fc1b5c66ed8609" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:49 np0005465988 nova_compute[236126]: 2025-10-02 12:26:49.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:49.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:49 np0005465988 nova_compute[236126]: 2025-10-02 12:26:49.608 2 DEBUG nova.virt.libvirt.imagebackend [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image locations are: [{'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/db05f54c-61f8-42d6-a1e2-da3219a77b12/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/db05f54c-61f8-42d6-a1e2-da3219a77b12/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Oct  2 08:26:49 np0005465988 nova_compute[236126]: 2025-10-02 12:26:49.613 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:49 np0005465988 nova_compute[236126]: 2025-10-02 12:26:49.613 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:49 np0005465988 nova_compute[236126]: 2025-10-02 12:26:49.613 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:49 np0005465988 nova_compute[236126]: 2025-10-02 12:26:49.614 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:26:49 np0005465988 nova_compute[236126]: 2025-10-02 12:26:49.614 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:49.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:26:50 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/351298520' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.120 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.216 2 DEBUG nova.compute.manager [req-6b3f2cc2-872d-41aa-8461-ac9243d23639 req-9c81c5cc-5ea2-4cdc-9f74-be157945b249 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received event network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.216 2 DEBUG oslo_concurrency.lockutils [req-6b3f2cc2-872d-41aa-8461-ac9243d23639 req-9c81c5cc-5ea2-4cdc-9f74-be157945b249 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.217 2 DEBUG oslo_concurrency.lockutils [req-6b3f2cc2-872d-41aa-8461-ac9243d23639 req-9c81c5cc-5ea2-4cdc-9f74-be157945b249 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.217 2 DEBUG oslo_concurrency.lockutils [req-6b3f2cc2-872d-41aa-8461-ac9243d23639 req-9c81c5cc-5ea2-4cdc-9f74-be157945b249 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.217 2 DEBUG nova.compute.manager [req-6b3f2cc2-872d-41aa-8461-ac9243d23639 req-9c81c5cc-5ea2-4cdc-9f74-be157945b249 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] No waiting events found dispatching network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.217 2 WARNING nova.compute.manager [req-6b3f2cc2-872d-41aa-8461-ac9243d23639 req-9c81c5cc-5ea2-4cdc-9f74-be157945b249 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received unexpected event network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.253 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.253 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.610 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.611 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4200MB free_disk=20.901226043701172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.611 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.612 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.752 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.862 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609.part --force-share --output=json" returned: 0 in 0.110s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.864 2 DEBUG nova.virt.images [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] db05f54c-61f8-42d6-a1e2-da3219a77b12 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.884 2 DEBUG nova.privsep.utils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.885 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609.part /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.971 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 87ebffd5-69af-414b-be5d-67ba42e8cae1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.972 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 9bca5e7a-108e-472a-80ce-ec40358d5475 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.972 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:26:50 np0005465988 nova_compute[236126]: 2025-10-02 12:26:50.972 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:26:51 np0005465988 nova_compute[236126]: 2025-10-02 12:26:51.106 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:51 np0005465988 nova_compute[236126]: 2025-10-02 12:26:51.406 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609.part /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609.converted" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:51 np0005465988 nova_compute[236126]: 2025-10-02 12:26:51.413 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:51 np0005465988 nova_compute[236126]: 2025-10-02 12:26:51.528 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609.converted --force-share --output=json" returned: 0 in 0.113s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:51 np0005465988 nova_compute[236126]: 2025-10-02 12:26:51.531 2 DEBUG oslo_concurrency.lockutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "5133c8c7459ce4fa1cf043a638fc1b5c66ed8609" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:51 np0005465988 nova_compute[236126]: 2025-10-02 12:26:51.562 2 DEBUG nova.storage.rbd_utils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:26:51 np0005465988 nova_compute[236126]: 2025-10-02 12:26:51.566 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 9bca5e7a-108e-472a-80ce-ec40358d5475_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:26:51 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/277064801' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:26:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:51.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:51 np0005465988 nova_compute[236126]: 2025-10-02 12:26:51.607 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:51 np0005465988 nova_compute[236126]: 2025-10-02 12:26:51.615 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:26:51 np0005465988 nova_compute[236126]: 2025-10-02 12:26:51.638 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:26:51 np0005465988 nova_compute[236126]: 2025-10-02 12:26:51.680 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:26:51 np0005465988 nova_compute[236126]: 2025-10-02 12:26:51.681 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:51.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.221 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 9bca5e7a-108e-472a-80ce-ec40358d5475_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.654s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.334 2 DEBUG nova.storage.rbd_utils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] resizing rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:26:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:53.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.670 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.671 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Ensure instance console log exists: /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.672 2 DEBUG oslo_concurrency.lockutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.673 2 DEBUG oslo_concurrency.lockutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.673 2 DEBUG oslo_concurrency.lockutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.675 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Start _get_guest_xml network_info=[{"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:47Z,direct_url=<?>,disk_format='qcow2',id=db05f54c-61f8-42d6-a1e2-da3219a77b12,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.680 2 WARNING nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.684 2 DEBUG nova.virt.libvirt.host [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.685 2 DEBUG nova.virt.libvirt.host [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.688 2 DEBUG nova.virt.libvirt.host [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.689 2 DEBUG nova.virt.libvirt.host [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.690 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.691 2 DEBUG nova.virt.hardware [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:47Z,direct_url=<?>,disk_format='qcow2',id=db05f54c-61f8-42d6-a1e2-da3219a77b12,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.691 2 DEBUG nova.virt.hardware [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.691 2 DEBUG nova.virt.hardware [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.692 2 DEBUG nova.virt.hardware [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.692 2 DEBUG nova.virt.hardware [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.692 2 DEBUG nova.virt.hardware [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.692 2 DEBUG nova.virt.hardware [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.692 2 DEBUG nova.virt.hardware [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.693 2 DEBUG nova.virt.hardware [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.693 2 DEBUG nova.virt.hardware [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.693 2 DEBUG nova.virt.hardware [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.693 2 DEBUG nova.objects.instance [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9bca5e7a-108e-472a-80ce-ec40358d5475 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:53 np0005465988 nova_compute[236126]: 2025-10-02 12:26:53.846 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:53.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:26:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3392361648' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.329 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.360 2 DEBUG nova.storage.rbd_utils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.364 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.684 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.686 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.687 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.687 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.688 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:26:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:26:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2361208000' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.818 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.821 2 DEBUG nova.virt.libvirt.vif [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:26:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1542100027',display_name='tempest-ServerActionsTestJSON-server-798919841',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1542100027',id=105,image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:26:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-h6tzp1ed',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='
tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:26:49Z,user_data=None,user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=9bca5e7a-108e-472a-80ce-ec40358d5475,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.822 2 DEBUG nova.network.os_vif_util [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.824 2 DEBUG nova.network.os_vif_util [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:01:b6,bridge_name='br-int',has_traffic_filtering=True,id=288a32fc-3d4a-4184-a507-2629d1d19415,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap288a32fc-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.828 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  <uuid>9bca5e7a-108e-472a-80ce-ec40358d5475</uuid>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  <name>instance-00000069</name>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerActionsTestJSON-server-798919841</nova:name>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:26:53</nova:creationTime>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <nova:user uuid="2bd16d1f5f9d4eb396c474eedee67165">tempest-ServerActionsTestJSON-842270816-project-member</nova:user>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <nova:project uuid="4b8ca48cb5f64ef3b0736b8be82378b8">tempest-ServerActionsTestJSON-842270816</nova:project>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="db05f54c-61f8-42d6-a1e2-da3219a77b12"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <nova:port uuid="288a32fc-3d4a-4184-a507-2629d1d19415">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <entry name="serial">9bca5e7a-108e-472a-80ce-ec40358d5475</entry>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <entry name="uuid">9bca5e7a-108e-472a-80ce-ec40358d5475</entry>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/9bca5e7a-108e-472a-80ce-ec40358d5475_disk">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/9bca5e7a-108e-472a-80ce-ec40358d5475_disk.config">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:fd:01:b6"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <target dev="tap288a32fc-3d"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/console.log" append="off"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:26:54 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:26:54 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:26:54 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:26:54 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.830 2 DEBUG nova.compute.manager [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Preparing to wait for external event network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.831 2 DEBUG oslo_concurrency.lockutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.832 2 DEBUG oslo_concurrency.lockutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.832 2 DEBUG oslo_concurrency.lockutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.833 2 DEBUG nova.virt.libvirt.vif [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:26:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1542100027',display_name='tempest-ServerActionsTestJSON-server-798919841',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1542100027',id=105,image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:26:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-h6tzp1ed',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='
tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:26:49Z,user_data=None,user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=9bca5e7a-108e-472a-80ce-ec40358d5475,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.834 2 DEBUG nova.network.os_vif_util [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.835 2 DEBUG nova.network.os_vif_util [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:01:b6,bridge_name='br-int',has_traffic_filtering=True,id=288a32fc-3d4a-4184-a507-2629d1d19415,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap288a32fc-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.836 2 DEBUG os_vif [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:01:b6,bridge_name='br-int',has_traffic_filtering=True,id=288a32fc-3d4a-4184-a507-2629d1d19415,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap288a32fc-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.838 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.838 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.842 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap288a32fc-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.843 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap288a32fc-3d, col_values=(('external_ids', {'iface-id': '288a32fc-3d4a-4184-a507-2629d1d19415', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:01:b6', 'vm-uuid': '9bca5e7a-108e-472a-80ce-ec40358d5475'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:54 np0005465988 NetworkManager[45041]: <info>  [1759408014.8472] manager: (tap288a32fc-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/216)
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:26:54 np0005465988 nova_compute[236126]: 2025-10-02 12:26:54.856 2 INFO os_vif [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:01:b6,bridge_name='br-int',has_traffic_filtering=True,id=288a32fc-3d4a-4184-a507-2629d1d19415,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap288a32fc-3d')
Oct  2 08:26:55 np0005465988 podman[282241]: 2025-10-02 12:26:55.551983498 +0000 UTC m=+0.079597161 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:26:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:55.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:55.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:56 np0005465988 nova_compute[236126]: 2025-10-02 12:26:56.106 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:26:56 np0005465988 nova_compute[236126]: 2025-10-02 12:26:56.108 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:26:56 np0005465988 nova_compute[236126]: 2025-10-02 12:26:56.108 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No VIF found with MAC fa:16:3e:fd:01:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  2 08:26:56 np0005465988 nova_compute[236126]: 2025-10-02 12:26:56.109 2 INFO nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Using config drive
Oct  2 08:26:56 np0005465988 nova_compute[236126]: 2025-10-02 12:26:56.144 2 DEBUG nova.storage.rbd_utils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:26:56 np0005465988 nova_compute[236126]: 2025-10-02 12:26:56.178 2 DEBUG nova.objects.instance [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 9bca5e7a-108e-472a-80ce-ec40358d5475 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:26:56 np0005465988 nova_compute[236126]: 2025-10-02 12:26:56.226 2 DEBUG nova.objects.instance [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'keypairs' on Instance uuid 9bca5e7a-108e-472a-80ce-ec40358d5475 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:26:56 np0005465988 nova_compute[236126]: 2025-10-02 12:26:56.476 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:26:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:26:57 np0005465988 nova_compute[236126]: 2025-10-02 12:26:57.524 2 INFO nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Creating config drive at /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/disk.config
Oct  2 08:26:57 np0005465988 nova_compute[236126]: 2025-10-02 12:26:57.532 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp32w5cqh5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:26:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:57.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:26:57 np0005465988 nova_compute[236126]: 2025-10-02 12:26:57.693 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp32w5cqh5" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:26:57 np0005465988 nova_compute[236126]: 2025-10-02 12:26:57.740 2 DEBUG nova.storage.rbd_utils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] rbd image 9bca5e7a-108e-472a-80ce-ec40358d5475_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:26:57 np0005465988 nova_compute[236126]: 2025-10-02 12:26:57.745 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/disk.config 9bca5e7a-108e-472a-80ce-ec40358d5475_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:26:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:26:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:57.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:26:57 np0005465988 nova_compute[236126]: 2025-10-02 12:26:57.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:26:57 np0005465988 nova_compute[236126]: 2025-10-02 12:26:57.973 2 DEBUG oslo_concurrency.processutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/disk.config 9bca5e7a-108e-472a-80ce-ec40358d5475_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.227s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:26:57 np0005465988 nova_compute[236126]: 2025-10-02 12:26:57.974 2 INFO nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Deleting local config drive /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475/disk.config because it was imported into RBD.
Oct  2 08:26:58 np0005465988 kernel: tap288a32fc-3d: entered promiscuous mode
Oct  2 08:26:58 np0005465988 NetworkManager[45041]: <info>  [1759408018.0474] manager: (tap288a32fc-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/217)
Oct  2 08:26:58 np0005465988 nova_compute[236126]: 2025-10-02 12:26:58.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:26:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:58Z|00467|binding|INFO|Claiming lport 288a32fc-3d4a-4184-a507-2629d1d19415 for this chassis.
Oct  2 08:26:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:58Z|00468|binding|INFO|288a32fc-3d4a-4184-a507-2629d1d19415: Claiming fa:16:3e:fd:01:b6 10.100.0.10
Oct  2 08:26:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:58Z|00469|binding|INFO|Setting lport 288a32fc-3d4a-4184-a507-2629d1d19415 ovn-installed in OVS
Oct  2 08:26:58 np0005465988 nova_compute[236126]: 2025-10-02 12:26:58.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:26:58 np0005465988 systemd-machined[192594]: New machine qemu-45-instance-00000069.
Oct  2 08:26:58 np0005465988 systemd[1]: Started Virtual Machine qemu-45-instance-00000069.
Oct  2 08:26:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:26:58Z|00470|binding|INFO|Setting lport 288a32fc-3d4a-4184-a507-2629d1d19415 up in Southbound
Oct  2 08:26:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:58.107 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:01:b6 10.100.0.10'], port_security=['fa:16:3e:fd:01:b6 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9bca5e7a-108e-472a-80ce-ec40358d5475', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '5', 'neutron:security_group_ids': '062aeef7-5182-4ff6-9976-014dcb98df92', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=288a32fc-3d4a-4184-a507-2629d1d19415) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:26:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:58.108 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 288a32fc-3d4a-4184-a507-2629d1d19415 in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff bound to our chassis
Oct  2 08:26:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:58.110 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:26:58 np0005465988 systemd-udevd[282335]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:26:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:58.137 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[478a0dc7-0b41-498a-86e2-fb7dbc617c6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:26:58 np0005465988 NetworkManager[45041]: <info>  [1759408018.1476] device (tap288a32fc-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:26:58 np0005465988 NetworkManager[45041]: <info>  [1759408018.1485] device (tap288a32fc-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:26:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:58.181 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a1f082-366b-46c0-838e-ed3cdab3ba1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:26:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:58.186 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[1b9ea12b-da2e-4edb-bac8-37e79951db8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:26:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:58.217 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c046d74d-4fd4-40fd-a30b-3ddc6dac4ecd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:26:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:58.238 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[873e4250-3c97-45cf-9ba2-cae3de573bad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 10, 'rx_bytes': 916, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 10, 'rx_bytes': 916, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 601317, 'reachable_time': 26000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282348, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:26:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:58.256 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[98e5af70-5518-4899-913a-79210b10df4b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb2c62a66-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 601337, 'tstamp': 601337}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 282349, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb2c62a66-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 601342, 'tstamp': 601342}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 282349, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:26:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:58.258 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:26:58 np0005465988 nova_compute[236126]: 2025-10-02 12:26:58.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:26:58 np0005465988 nova_compute[236126]: 2025-10-02 12:26:58.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:26:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:58.264 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2c62a66-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:26:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:58.264 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:26:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:58.265 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2c62a66-f0, col_values=(('external_ids', {'iface-id': '8c1234f6-f595-4979-94c0-98da2211f0e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:26:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:26:58.265 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:26:58 np0005465988 nova_compute[236126]: 2025-10-02 12:26:58.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:26:58 np0005465988 nova_compute[236126]: 2025-10-02 12:26:58.726 2 DEBUG nova.compute.manager [req-d5fd3c5b-8060-452f-8143-baff326d5f31 req-f1a0d5cf-e07a-4a18-a0b4-f7d5f69bf6f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received event network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:26:58 np0005465988 nova_compute[236126]: 2025-10-02 12:26:58.727 2 DEBUG oslo_concurrency.lockutils [req-d5fd3c5b-8060-452f-8143-baff326d5f31 req-f1a0d5cf-e07a-4a18-a0b4-f7d5f69bf6f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:26:58 np0005465988 nova_compute[236126]: 2025-10-02 12:26:58.727 2 DEBUG oslo_concurrency.lockutils [req-d5fd3c5b-8060-452f-8143-baff326d5f31 req-f1a0d5cf-e07a-4a18-a0b4-f7d5f69bf6f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:26:58 np0005465988 nova_compute[236126]: 2025-10-02 12:26:58.727 2 DEBUG oslo_concurrency.lockutils [req-d5fd3c5b-8060-452f-8143-baff326d5f31 req-f1a0d5cf-e07a-4a18-a0b4-f7d5f69bf6f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:26:58 np0005465988 nova_compute[236126]: 2025-10-02 12:26:58.728 2 DEBUG nova.compute.manager [req-d5fd3c5b-8060-452f-8143-baff326d5f31 req-f1a0d5cf-e07a-4a18-a0b4-f7d5f69bf6f3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Processing event network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.232 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 9bca5e7a-108e-472a-80ce-ec40358d5475 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.233 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408019.23182, 9bca5e7a-108e-472a-80ce-ec40358d5475 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.233 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] VM Started (Lifecycle Event)
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.237 2 DEBUG nova.compute.manager [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.241 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.247 2 INFO nova.virt.libvirt.driver [-] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Instance spawned successfully.
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.248 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.283 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.288 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.300 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.301 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.302 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.303 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.304 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.304 2 DEBUG nova.virt.libvirt.driver [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.315 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.316 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408019.2360935, 9bca5e7a-108e-472a-80ce-ec40358d5475 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.316 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.366 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.372 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408019.240133, 9bca5e7a-108e-472a-80ce-ec40358d5475 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.372 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.399 2 DEBUG nova.compute.manager [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.401 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.412 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.452 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.499 2 DEBUG oslo_concurrency.lockutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.499 2 DEBUG oslo_concurrency.lockutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.500 2 DEBUG nova.objects.instance [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.591 2 DEBUG oslo_concurrency.lockutils [None req-9dea0c9f-f1fc-45b5-b680-056443492251 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:26:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:26:59.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:26:59 np0005465988 nova_compute[236126]: 2025-10-02 12:26:59.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:26:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:26:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:26:59.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:00 np0005465988 nova_compute[236126]: 2025-10-02 12:27:00.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:01 np0005465988 nova_compute[236126]: 2025-10-02 12:27:01.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:01 np0005465988 nova_compute[236126]: 2025-10-02 12:27:01.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:27:01 np0005465988 nova_compute[236126]: 2025-10-02 12:27:01.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:27:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:01.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:01.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:01 np0005465988 nova_compute[236126]: 2025-10-02 12:27:01.930 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:27:01 np0005465988 nova_compute[236126]: 2025-10-02 12:27:01.930 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:27:01 np0005465988 nova_compute[236126]: 2025-10-02 12:27:01.930 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:27:01 np0005465988 nova_compute[236126]: 2025-10-02 12:27:01.931 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:27:02 np0005465988 nova_compute[236126]: 2025-10-02 12:27:02.668 2 DEBUG nova.compute.manager [req-f057e5c9-5733-41b5-9e0d-68ea825769d8 req-d8b2a6ac-bce6-4c53-9f83-7e769deb78cb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received event network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:02 np0005465988 nova_compute[236126]: 2025-10-02 12:27:02.670 2 DEBUG oslo_concurrency.lockutils [req-f057e5c9-5733-41b5-9e0d-68ea825769d8 req-d8b2a6ac-bce6-4c53-9f83-7e769deb78cb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:02 np0005465988 nova_compute[236126]: 2025-10-02 12:27:02.670 2 DEBUG oslo_concurrency.lockutils [req-f057e5c9-5733-41b5-9e0d-68ea825769d8 req-d8b2a6ac-bce6-4c53-9f83-7e769deb78cb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:02 np0005465988 nova_compute[236126]: 2025-10-02 12:27:02.671 2 DEBUG oslo_concurrency.lockutils [req-f057e5c9-5733-41b5-9e0d-68ea825769d8 req-d8b2a6ac-bce6-4c53-9f83-7e769deb78cb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:02 np0005465988 nova_compute[236126]: 2025-10-02 12:27:02.672 2 DEBUG nova.compute.manager [req-f057e5c9-5733-41b5-9e0d-68ea825769d8 req-d8b2a6ac-bce6-4c53-9f83-7e769deb78cb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] No waiting events found dispatching network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:27:02 np0005465988 nova_compute[236126]: 2025-10-02 12:27:02.672 2 WARNING nova.compute.manager [req-f057e5c9-5733-41b5-9e0d-68ea825769d8 req-d8b2a6ac-bce6-4c53-9f83-7e769deb78cb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received unexpected event network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:27:03 np0005465988 nova_compute[236126]: 2025-10-02 12:27:03.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:03.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:03.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:04 np0005465988 nova_compute[236126]: 2025-10-02 12:27:04.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:05.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:05 np0005465988 nova_compute[236126]: 2025-10-02 12:27:05.635 2 DEBUG oslo_concurrency.lockutils [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "9bca5e7a-108e-472a-80ce-ec40358d5475" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:05 np0005465988 nova_compute[236126]: 2025-10-02 12:27:05.637 2 DEBUG oslo_concurrency.lockutils [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:05 np0005465988 nova_compute[236126]: 2025-10-02 12:27:05.637 2 DEBUG oslo_concurrency.lockutils [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:05 np0005465988 nova_compute[236126]: 2025-10-02 12:27:05.637 2 DEBUG oslo_concurrency.lockutils [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:05 np0005465988 nova_compute[236126]: 2025-10-02 12:27:05.638 2 DEBUG oslo_concurrency.lockutils [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:05 np0005465988 nova_compute[236126]: 2025-10-02 12:27:05.639 2 INFO nova.compute.manager [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Terminating instance#033[00m
Oct  2 08:27:05 np0005465988 nova_compute[236126]: 2025-10-02 12:27:05.640 2 DEBUG nova.compute.manager [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:27:05 np0005465988 kernel: tap288a32fc-3d (unregistering): left promiscuous mode
Oct  2 08:27:05 np0005465988 NetworkManager[45041]: <info>  [1759408025.7349] device (tap288a32fc-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:27:05 np0005465988 nova_compute[236126]: 2025-10-02 12:27:05.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:27:05Z|00471|binding|INFO|Releasing lport 288a32fc-3d4a-4184-a507-2629d1d19415 from this chassis (sb_readonly=0)
Oct  2 08:27:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:27:05Z|00472|binding|INFO|Setting lport 288a32fc-3d4a-4184-a507-2629d1d19415 down in Southbound
Oct  2 08:27:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:27:05Z|00473|binding|INFO|Removing iface tap288a32fc-3d ovn-installed in OVS
Oct  2 08:27:05 np0005465988 nova_compute[236126]: 2025-10-02 12:27:05.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:05 np0005465988 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000069.scope: Deactivated successfully.
Oct  2 08:27:05 np0005465988 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000069.scope: Consumed 7.729s CPU time.
Oct  2 08:27:05 np0005465988 systemd-machined[192594]: Machine qemu-45-instance-00000069 terminated.
Oct  2 08:27:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:05.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:05 np0005465988 nova_compute[236126]: 2025-10-02 12:27:05.887 2 INFO nova.virt.libvirt.driver [-] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Instance destroyed successfully.#033[00m
Oct  2 08:27:05 np0005465988 nova_compute[236126]: 2025-10-02 12:27:05.888 2 DEBUG nova.objects.instance [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'resources' on Instance uuid 9bca5e7a-108e-472a-80ce-ec40358d5475 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:06.012 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:01:b6 10.100.0.10'], port_security=['fa:16:3e:fd:01:b6 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9bca5e7a-108e-472a-80ce-ec40358d5475', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '062aeef7-5182-4ff6-9976-014dcb98df92', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=288a32fc-3d4a-4184-a507-2629d1d19415) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:27:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:06.014 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 288a32fc-3d4a-4184-a507-2629d1d19415 in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff unbound from our chassis#033[00m
Oct  2 08:27:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:06.016 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff#033[00m
Oct  2 08:27:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:06.040 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0b7e7cc4-d010-43e2-8942-4cf788d608a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:06.090 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[7163f958-2b06-4b8a-a278-7d6c19784c92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:06.094 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[3732e305-f959-4c5f-ade8-6fc5a328db2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:06.138 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9526d1d8-7309-4e70-8693-d0c64908ca4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:06.163 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d7e1d150-1233-49e1-8145-fd4cba050a02]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 12, 'rx_bytes': 916, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 12, 'rx_bytes': 916, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 601317, 'reachable_time': 26000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282468, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:06.181 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a463bf-0e7a-4504-a5a5-bf4d2e6cb49f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb2c62a66-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 601337, 'tstamp': 601337}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 282469, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb2c62a66-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 601342, 'tstamp': 601342}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 282469, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:06.183 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:06 np0005465988 nova_compute[236126]: 2025-10-02 12:27:06.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:06 np0005465988 nova_compute[236126]: 2025-10-02 12:27:06.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:06.193 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2c62a66-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:06.193 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:27:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:06.193 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2c62a66-f0, col_values=(('external_ids', {'iface-id': '8c1234f6-f595-4979-94c0-98da2211f0e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:06.194 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:27:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:27:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:07.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:07.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:07 np0005465988 nova_compute[236126]: 2025-10-02 12:27:07.929 2 DEBUG nova.virt.libvirt.vif [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:26:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1542100027',display_name='tempest-ServerActionsTestJSON-server-798919841',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1542100027',id=105,image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:26:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-h6tzp1ed',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio'
,image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:26:59Z,user_data=None,user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=9bca5e7a-108e-472a-80ce-ec40358d5475,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:27:07 np0005465988 nova_compute[236126]: 2025-10-02 12:27:07.931 2 DEBUG nova.network.os_vif_util [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "288a32fc-3d4a-4184-a507-2629d1d19415", "address": "fa:16:3e:fd:01:b6", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap288a32fc-3d", "ovs_interfaceid": "288a32fc-3d4a-4184-a507-2629d1d19415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:27:07 np0005465988 nova_compute[236126]: 2025-10-02 12:27:07.932 2 DEBUG nova.network.os_vif_util [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:01:b6,bridge_name='br-int',has_traffic_filtering=True,id=288a32fc-3d4a-4184-a507-2629d1d19415,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap288a32fc-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:27:07 np0005465988 nova_compute[236126]: 2025-10-02 12:27:07.933 2 DEBUG os_vif [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:01:b6,bridge_name='br-int',has_traffic_filtering=True,id=288a32fc-3d4a-4184-a507-2629d1d19415,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap288a32fc-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:27:07 np0005465988 nova_compute[236126]: 2025-10-02 12:27:07.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:07 np0005465988 nova_compute[236126]: 2025-10-02 12:27:07.937 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap288a32fc-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:07 np0005465988 nova_compute[236126]: 2025-10-02 12:27:07.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:07 np0005465988 nova_compute[236126]: 2025-10-02 12:27:07.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:07 np0005465988 nova_compute[236126]: 2025-10-02 12:27:07.947 2 INFO os_vif [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:01:b6,bridge_name='br-int',has_traffic_filtering=True,id=288a32fc-3d4a-4184-a507-2629d1d19415,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap288a32fc-3d')#033[00m
Oct  2 08:27:08 np0005465988 nova_compute[236126]: 2025-10-02 12:27:08.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:08 np0005465988 nova_compute[236126]: 2025-10-02 12:27:08.668 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating instance_info_cache with network_info: [{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:27:08 np0005465988 nova_compute[236126]: 2025-10-02 12:27:08.712 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:27:08 np0005465988 nova_compute[236126]: 2025-10-02 12:27:08.713 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:27:08 np0005465988 nova_compute[236126]: 2025-10-02 12:27:08.895 2 DEBUG nova.compute.manager [req-e0a9d359-928c-4520-be31-86b26d97be47 req-6acfb79e-a8f4-41e2-b2ed-b7d374fc19a4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received event network-vif-unplugged-288a32fc-3d4a-4184-a507-2629d1d19415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:08 np0005465988 nova_compute[236126]: 2025-10-02 12:27:08.896 2 DEBUG oslo_concurrency.lockutils [req-e0a9d359-928c-4520-be31-86b26d97be47 req-6acfb79e-a8f4-41e2-b2ed-b7d374fc19a4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:08 np0005465988 nova_compute[236126]: 2025-10-02 12:27:08.896 2 DEBUG oslo_concurrency.lockutils [req-e0a9d359-928c-4520-be31-86b26d97be47 req-6acfb79e-a8f4-41e2-b2ed-b7d374fc19a4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:08 np0005465988 nova_compute[236126]: 2025-10-02 12:27:08.896 2 DEBUG oslo_concurrency.lockutils [req-e0a9d359-928c-4520-be31-86b26d97be47 req-6acfb79e-a8f4-41e2-b2ed-b7d374fc19a4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:08 np0005465988 nova_compute[236126]: 2025-10-02 12:27:08.897 2 DEBUG nova.compute.manager [req-e0a9d359-928c-4520-be31-86b26d97be47 req-6acfb79e-a8f4-41e2-b2ed-b7d374fc19a4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] No waiting events found dispatching network-vif-unplugged-288a32fc-3d4a-4184-a507-2629d1d19415 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:27:08 np0005465988 nova_compute[236126]: 2025-10-02 12:27:08.897 2 DEBUG nova.compute.manager [req-e0a9d359-928c-4520-be31-86b26d97be47 req-6acfb79e-a8f4-41e2-b2ed-b7d374fc19a4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received event network-vif-unplugged-288a32fc-3d4a-4184-a507-2629d1d19415 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:27:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:09.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:09.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:11 np0005465988 nova_compute[236126]: 2025-10-02 12:27:11.320 2 DEBUG nova.compute.manager [req-c809d8ff-6548-43e5-a10b-227fc8b1fada req-f138c931-97ad-42fa-9f72-ad040aee5d76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received event network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:11 np0005465988 nova_compute[236126]: 2025-10-02 12:27:11.321 2 DEBUG oslo_concurrency.lockutils [req-c809d8ff-6548-43e5-a10b-227fc8b1fada req-f138c931-97ad-42fa-9f72-ad040aee5d76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:11 np0005465988 nova_compute[236126]: 2025-10-02 12:27:11.321 2 DEBUG oslo_concurrency.lockutils [req-c809d8ff-6548-43e5-a10b-227fc8b1fada req-f138c931-97ad-42fa-9f72-ad040aee5d76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:11 np0005465988 nova_compute[236126]: 2025-10-02 12:27:11.322 2 DEBUG oslo_concurrency.lockutils [req-c809d8ff-6548-43e5-a10b-227fc8b1fada req-f138c931-97ad-42fa-9f72-ad040aee5d76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:11 np0005465988 nova_compute[236126]: 2025-10-02 12:27:11.322 2 DEBUG nova.compute.manager [req-c809d8ff-6548-43e5-a10b-227fc8b1fada req-f138c931-97ad-42fa-9f72-ad040aee5d76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] No waiting events found dispatching network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:27:11 np0005465988 nova_compute[236126]: 2025-10-02 12:27:11.322 2 WARNING nova.compute.manager [req-c809d8ff-6548-43e5-a10b-227fc8b1fada req-f138c931-97ad-42fa-9f72-ad040aee5d76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received unexpected event network-vif-plugged-288a32fc-3d4a-4184-a507-2629d1d19415 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:27:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:11.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:11 np0005465988 nova_compute[236126]: 2025-10-02 12:27:11.622 2 INFO nova.virt.libvirt.driver [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Deleting instance files /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475_del#033[00m
Oct  2 08:27:11 np0005465988 nova_compute[236126]: 2025-10-02 12:27:11.623 2 INFO nova.virt.libvirt.driver [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Deletion of /var/lib/nova/instances/9bca5e7a-108e-472a-80ce-ec40358d5475_del complete#033[00m
Oct  2 08:27:11 np0005465988 nova_compute[236126]: 2025-10-02 12:27:11.718 2 INFO nova.compute.manager [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Took 6.08 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:27:11 np0005465988 nova_compute[236126]: 2025-10-02 12:27:11.718 2 DEBUG oslo.service.loopingcall [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:27:11 np0005465988 nova_compute[236126]: 2025-10-02 12:27:11.719 2 DEBUG nova.compute.manager [-] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:27:11 np0005465988 nova_compute[236126]: 2025-10-02 12:27:11.719 2 DEBUG nova.network.neutron [-] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:27:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:11.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:27:12 np0005465988 nova_compute[236126]: 2025-10-02 12:27:12.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:13 np0005465988 nova_compute[236126]: 2025-10-02 12:27:13.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:27:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:13.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:27:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:13.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:15 np0005465988 nova_compute[236126]: 2025-10-02 12:27:15.064 2 DEBUG nova.network.neutron [-] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:27:15 np0005465988 nova_compute[236126]: 2025-10-02 12:27:15.350 2 INFO nova.compute.manager [-] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Took 3.63 seconds to deallocate network for instance.#033[00m
Oct  2 08:27:15 np0005465988 nova_compute[236126]: 2025-10-02 12:27:15.451 2 DEBUG oslo_concurrency.lockutils [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:15 np0005465988 nova_compute[236126]: 2025-10-02 12:27:15.452 2 DEBUG oslo_concurrency.lockutils [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:15 np0005465988 nova_compute[236126]: 2025-10-02 12:27:15.573 2 DEBUG oslo_concurrency.processutils [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:15.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:15 np0005465988 nova_compute[236126]: 2025-10-02 12:27:15.810 2 DEBUG nova.compute.manager [req-550406dc-384d-4154-a4bb-5f764abe7c19 req-36907eb1-9ef7-4c4c-973e-7bb4550536ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Received event network-vif-deleted-288a32fc-3d4a-4184-a507-2629d1d19415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:15.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:27:15 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2976274772' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:27:16 np0005465988 nova_compute[236126]: 2025-10-02 12:27:16.014 2 DEBUG oslo_concurrency.processutils [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:16 np0005465988 nova_compute[236126]: 2025-10-02 12:27:16.023 2 DEBUG nova.compute.provider_tree [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:27:16 np0005465988 nova_compute[236126]: 2025-10-02 12:27:16.165 2 DEBUG nova.scheduler.client.report [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:27:16 np0005465988 nova_compute[236126]: 2025-10-02 12:27:16.391 2 DEBUG oslo_concurrency.lockutils [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.939s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:16 np0005465988 nova_compute[236126]: 2025-10-02 12:27:16.461 2 INFO nova.scheduler.client.report [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Deleted allocations for instance 9bca5e7a-108e-472a-80ce-ec40358d5475#033[00m
Oct  2 08:27:16 np0005465988 podman[282519]: 2025-10-02 12:27:16.546834737 +0000 UTC m=+0.071486998 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:27:16 np0005465988 podman[282518]: 2025-10-02 12:27:16.553410446 +0000 UTC m=+0.078614103 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:27:16 np0005465988 podman[282517]: 2025-10-02 12:27:16.573141093 +0000 UTC m=+0.110944172 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:27:16 np0005465988 nova_compute[236126]: 2025-10-02 12:27:16.811 2 DEBUG oslo_concurrency.lockutils [None req-f7971a98-3cb6-43a4-be5a-683bc3035a86 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "9bca5e7a-108e-472a-80ce-ec40358d5475" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 11.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:27:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:17.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:17.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:17 np0005465988 nova_compute[236126]: 2025-10-02 12:27:17.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:18 np0005465988 nova_compute[236126]: 2025-10-02 12:27:18.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:19.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:27:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:19.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:27:20 np0005465988 nova_compute[236126]: 2025-10-02 12:27:20.886 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408025.8851922, 9bca5e7a-108e-472a-80ce-ec40358d5475 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:27:20 np0005465988 nova_compute[236126]: 2025-10-02 12:27:20.887 2 INFO nova.compute.manager [-] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:27:20 np0005465988 nova_compute[236126]: 2025-10-02 12:27:20.937 2 DEBUG nova.compute.manager [None req-8403141d-c97c-4094-aa7e-90cfd72606bf - - - - - -] [instance: 9bca5e7a-108e-472a-80ce-ec40358d5475] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:27:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:21.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:21.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:27:22 np0005465988 nova_compute[236126]: 2025-10-02 12:27:22.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:23 np0005465988 nova_compute[236126]: 2025-10-02 12:27:23.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:23.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:23.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:25.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:25.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:26 np0005465988 podman[282636]: 2025-10-02 12:27:26.536529361 +0000 UTC m=+0.066476944 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:27:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:27.356 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:27.357 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:27.358 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:27:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:27.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:27.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:27 np0005465988 nova_compute[236126]: 2025-10-02 12:27:27.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:28 np0005465988 nova_compute[236126]: 2025-10-02 12:27:28.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:28 np0005465988 nova_compute[236126]: 2025-10-02 12:27:28.880 2 DEBUG nova.compute.manager [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Oct  2 08:27:29 np0005465988 nova_compute[236126]: 2025-10-02 12:27:29.440 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:29 np0005465988 nova_compute[236126]: 2025-10-02 12:27:29.441 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:29 np0005465988 nova_compute[236126]: 2025-10-02 12:27:29.521 2 DEBUG nova.objects.instance [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'pci_requests' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:29 np0005465988 nova_compute[236126]: 2025-10-02 12:27:29.587 2 DEBUG nova.virt.hardware [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:27:29 np0005465988 nova_compute[236126]: 2025-10-02 12:27:29.587 2 INFO nova.compute.claims [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:27:29 np0005465988 nova_compute[236126]: 2025-10-02 12:27:29.588 2 DEBUG nova.objects.instance [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'resources' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:29 np0005465988 nova_compute[236126]: 2025-10-02 12:27:29.627 2 DEBUG nova.objects.instance [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:29.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:29.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:27:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:31.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:27:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:31.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:27:32 np0005465988 nova_compute[236126]: 2025-10-02 12:27:32.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:33 np0005465988 nova_compute[236126]: 2025-10-02 12:27:33.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:33.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:33 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #88. Immutable memtables: 0.
Oct  2 08:27:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:33.904200) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:27:33 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 53] Flushing memtable with next log file: 88
Oct  2 08:27:33 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408053904273, "job": 53, "event": "flush_started", "num_memtables": 1, "num_entries": 2439, "num_deletes": 255, "total_data_size": 5521559, "memory_usage": 5601448, "flush_reason": "Manual Compaction"}
Oct  2 08:27:33 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 53] Level-0 flush table #89: started
Oct  2 08:27:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:33.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408054077348, "cf_name": "default", "job": 53, "event": "table_file_creation", "file_number": 89, "file_size": 3616185, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 44818, "largest_seqno": 47252, "table_properties": {"data_size": 3606419, "index_size": 6132, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 20964, "raw_average_key_size": 20, "raw_value_size": 3586573, "raw_average_value_size": 3554, "num_data_blocks": 265, "num_entries": 1009, "num_filter_entries": 1009, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759407860, "oldest_key_time": 1759407860, "file_creation_time": 1759408053, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 89, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 53] Flush lasted 173252 microseconds, and 15638 cpu microseconds.
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.077455) [db/flush_job.cc:967] [default] [JOB 53] Level-0 flush table #89: 3616185 bytes OK
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.077482) [db/memtable_list.cc:519] [default] Level-0 commit table #89 started
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.107822) [db/memtable_list.cc:722] [default] Level-0 commit table #89: memtable #1 done
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.107861) EVENT_LOG_v1 {"time_micros": 1759408054107851, "job": 53, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.107886) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 53] Try to delete WAL files size 5510867, prev total WAL file size 5510867, number of live WAL files 2.
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000085.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.110199) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033373635' seq:72057594037927935, type:22 .. '7061786F730034303137' seq:0, type:0; will stop at (end)
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 54] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 53 Base level 0, inputs: [89(3531KB)], [87(9203KB)]
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408054110257, "job": 54, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [89], "files_L6": [87], "score": -1, "input_data_size": 13040723, "oldest_snapshot_seqno": -1}
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 54] Generated table #90: 7175 keys, 11105780 bytes, temperature: kUnknown
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408054317235, "cf_name": "default", "job": 54, "event": "table_file_creation", "file_number": 90, "file_size": 11105780, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11056914, "index_size": 29810, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17989, "raw_key_size": 184725, "raw_average_key_size": 25, "raw_value_size": 10927755, "raw_average_value_size": 1523, "num_data_blocks": 1179, "num_entries": 7175, "num_filter_entries": 7175, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759408054, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 90, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.317902) [db/compaction/compaction_job.cc:1663] [default] [JOB 54] Compacted 1@0 + 1@6 files to L6 => 11105780 bytes
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.346923) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 62.9 rd, 53.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 9.0 +0.0 blob) out(10.6 +0.0 blob), read-write-amplify(6.7) write-amplify(3.1) OK, records in: 7704, records dropped: 529 output_compression: NoCompression
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.346973) EVENT_LOG_v1 {"time_micros": 1759408054346948, "job": 54, "event": "compaction_finished", "compaction_time_micros": 207403, "compaction_time_cpu_micros": 49054, "output_level": 6, "num_output_files": 1, "total_output_size": 11105780, "num_input_records": 7704, "num_output_records": 7175, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000089.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408054349313, "job": 54, "event": "table_file_deletion", "file_number": 89}
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000087.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408054353592, "job": 54, "event": "table_file_deletion", "file_number": 87}
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.110034) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.353820) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.353825) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.353827) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.353830) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:27:34 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:27:34.353832) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:27:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:35.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:27:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:35.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:27:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:37.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:27:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:37.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:37 np0005465988 nova_compute[236126]: 2025-10-02 12:27:37.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:38 np0005465988 nova_compute[236126]: 2025-10-02 12:27:38.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:39.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:39.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:41 np0005465988 nova_compute[236126]: 2025-10-02 12:27:41.006 2 INFO nova.compute.resource_tracker [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating resource usage from migration 915ae6b6-0039-4e4a-a1a3-3723d46e5040#033[00m
Oct  2 08:27:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:41.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:41.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:42 np0005465988 nova_compute[236126]: 2025-10-02 12:27:42.439 2 DEBUG oslo_concurrency.processutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:27:42 np0005465988 nova_compute[236126]: 2025-10-02 12:27:42.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:27:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1536085439' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:27:43 np0005465988 nova_compute[236126]: 2025-10-02 12:27:43.002 2 DEBUG oslo_concurrency.processutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.563s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:43 np0005465988 nova_compute[236126]: 2025-10-02 12:27:43.010 2 DEBUG nova.compute.provider_tree [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:27:43 np0005465988 nova_compute[236126]: 2025-10-02 12:27:43.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:43 np0005465988 nova_compute[236126]: 2025-10-02 12:27:43.467 2 DEBUG nova.scheduler.client.report [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:27:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:43.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:43 np0005465988 nova_compute[236126]: 2025-10-02 12:27:43.839 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 14.398s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:43 np0005465988 nova_compute[236126]: 2025-10-02 12:27:43.840 2 INFO nova.compute.manager [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Migrating#033[00m
Oct  2 08:27:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:43.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:27:45Z|00474|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Oct  2 08:27:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:45.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:45.668 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:27:45 np0005465988 nova_compute[236126]: 2025-10-02 12:27:45.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:45.671 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:27:45 np0005465988 nova_compute[236126]: 2025-10-02 12:27:45.832 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:27:45 np0005465988 nova_compute[236126]: 2025-10-02 12:27:45.833 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquired lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:27:45 np0005465988 nova_compute[236126]: 2025-10-02 12:27:45.834 2 DEBUG nova.network.neutron [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:27:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:45.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:46 np0005465988 nova_compute[236126]: 2025-10-02 12:27:46.532 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Acquiring lock "1e47e923-75c9-4c8c-b5f3-86f715462a64" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:46 np0005465988 nova_compute[236126]: 2025-10-02 12:27:46.533 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:27:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:27:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:27:46 np0005465988 nova_compute[236126]: 2025-10-02 12:27:46.876 2 DEBUG nova.compute.manager [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:27:47 np0005465988 podman[282870]: 2025-10-02 12:27:47.033056285 +0000 UTC m=+0.107269966 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid)
Oct  2 08:27:47 np0005465988 podman[282871]: 2025-10-02 12:27:47.03531205 +0000 UTC m=+0.114772162 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:27:47 np0005465988 podman[282869]: 2025-10-02 12:27:47.043238298 +0000 UTC m=+0.123094522 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:27:47 np0005465988 nova_compute[236126]: 2025-10-02 12:27:47.425 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:47 np0005465988 nova_compute[236126]: 2025-10-02 12:27:47.426 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:47 np0005465988 nova_compute[236126]: 2025-10-02 12:27:47.434 2 DEBUG nova.virt.hardware [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:27:47 np0005465988 nova_compute[236126]: 2025-10-02 12:27:47.434 2 INFO nova.compute.claims [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:27:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:47.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:27:47 np0005465988 nova_compute[236126]: 2025-10-02 12:27:47.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:27:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:47.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:27:48 np0005465988 nova_compute[236126]: 2025-10-02 12:27:48.123 2 DEBUG oslo_concurrency.processutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:48 np0005465988 nova_compute[236126]: 2025-10-02 12:27:48.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:48 np0005465988 nova_compute[236126]: 2025-10-02 12:27:48.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:27:48 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1556605539' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:27:48 np0005465988 nova_compute[236126]: 2025-10-02 12:27:48.567 2 DEBUG oslo_concurrency.processutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:48 np0005465988 nova_compute[236126]: 2025-10-02 12:27:48.576 2 DEBUG nova.compute.provider_tree [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:27:48 np0005465988 nova_compute[236126]: 2025-10-02 12:27:48.696 2 DEBUG nova.scheduler.client.report [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:27:48 np0005465988 nova_compute[236126]: 2025-10-02 12:27:48.791 2 DEBUG nova.network.neutron [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating instance_info_cache with network_info: [{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:27:49 np0005465988 nova_compute[236126]: 2025-10-02 12:27:49.161 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Releasing lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:27:49 np0005465988 nova_compute[236126]: 2025-10-02 12:27:49.232 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.807s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:49 np0005465988 nova_compute[236126]: 2025-10-02 12:27:49.233 2 DEBUG nova.compute.manager [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:27:49 np0005465988 nova_compute[236126]: 2025-10-02 12:27:49.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:49.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:49 np0005465988 nova_compute[236126]: 2025-10-02 12:27:49.812 2 DEBUG nova.compute.manager [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:27:49 np0005465988 nova_compute[236126]: 2025-10-02 12:27:49.813 2 DEBUG nova.network.neutron [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:27:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:27:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:49.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:27:50 np0005465988 nova_compute[236126]: 2025-10-02 12:27:50.004 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:50 np0005465988 nova_compute[236126]: 2025-10-02 12:27:50.004 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:50 np0005465988 nova_compute[236126]: 2025-10-02 12:27:50.005 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:50 np0005465988 nova_compute[236126]: 2025-10-02 12:27:50.005 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:27:50 np0005465988 nova_compute[236126]: 2025-10-02 12:27:50.006 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:50 np0005465988 nova_compute[236126]: 2025-10-02 12:27:50.360 2 INFO nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:27:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:27:50 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2369965828' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:27:50 np0005465988 nova_compute[236126]: 2025-10-02 12:27:50.502 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:50 np0005465988 nova_compute[236126]: 2025-10-02 12:27:50.748 2 DEBUG nova.policy [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c6bda06d7e6348bab069e07b21022b60', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7aad01ba5df14c1a9309451d0daaab83', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:27:50 np0005465988 nova_compute[236126]: 2025-10-02 12:27:50.777 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  2 08:27:50 np0005465988 nova_compute[236126]: 2025-10-02 12:27:50.783 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:27:50 np0005465988 nova_compute[236126]: 2025-10-02 12:27:50.854 2 DEBUG nova.compute.manager [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.051 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.052 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.174 2 INFO nova.virt.block_device [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Booting with volume 4c9d8719-7ebc-4171-8171-a0e55d9f5992 at /dev/vda#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.328 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.329 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4160MB free_disk=20.901382446289062GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.330 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.330 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.337 2 DEBUG os_brick.utils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.339 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.353 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.354 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[9ef311a5-8d49-4e2a-aac0-f5882ddffb07]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.355 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.368 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.369 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[e068c4f4-b052-4d6d-b28f-e6c2ed5123b8]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.370 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.381 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.381 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[4875b998-c591-44e2-82af-6daa32989a3c]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.383 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[4358b07b-556d-4d64-bdaa-def180b90959]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.383 2 DEBUG oslo_concurrency.processutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.423 2 DEBUG oslo_concurrency.processutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CMD "nvme version" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.427 2 DEBUG os_brick.initiator.connectors.lightos [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.427 2 DEBUG os_brick.initiator.connectors.lightos [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.428 2 DEBUG os_brick.initiator.connectors.lightos [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.428 2 DEBUG os_brick.utils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] <== get_connector_properties: return (90ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.429 2 DEBUG nova.virt.block_device [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Updating existing volume attachment record: 4ecec6ac-c34a-4cf2-8a97-239fe3a61f1a _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:27:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:51.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.773 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Applying migration context for instance 87ebffd5-69af-414b-be5d-67ba42e8cae1 as it has an incoming, in-progress migration 915ae6b6-0039-4e4a-a1a3-3723d46e5040. Migration status is migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.774 2 INFO nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating resource usage from migration 915ae6b6-0039-4e4a-a1a3-3723d46e5040#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.799 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Migration 915ae6b6-0039-4e4a-a1a3-3723d46e5040 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.799 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 87ebffd5-69af-414b-be5d-67ba42e8cae1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.800 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 1e47e923-75c9-4c8c-b5f3-86f715462a64 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.800 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.801 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:27:51 np0005465988 nova_compute[236126]: 2025-10-02 12:27:51.887 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:51.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:27:52 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4062921269' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:27:52 np0005465988 nova_compute[236126]: 2025-10-02 12:27:52.353 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:52 np0005465988 nova_compute[236126]: 2025-10-02 12:27:52.359 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:27:52 np0005465988 nova_compute[236126]: 2025-10-02 12:27:52.569 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:27:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:27:52 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2321297802' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:27:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:27:52 np0005465988 nova_compute[236126]: 2025-10-02 12:27:52.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:53 np0005465988 nova_compute[236126]: 2025-10-02 12:27:53.089 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:27:53 np0005465988 nova_compute[236126]: 2025-10-02 12:27:53.089 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:53 np0005465988 nova_compute[236126]: 2025-10-02 12:27:53.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:27:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:53.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:27:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:27:53.673 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:53.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:54 np0005465988 nova_compute[236126]: 2025-10-02 12:27:54.122 2 DEBUG nova.network.neutron [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Successfully created port: 4abf20c2-f65e-479c-8fe9-62982b2fa096 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:27:54 np0005465988 nova_compute[236126]: 2025-10-02 12:27:54.640 2 DEBUG nova.compute.manager [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:27:54 np0005465988 nova_compute[236126]: 2025-10-02 12:27:54.642 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:27:54 np0005465988 nova_compute[236126]: 2025-10-02 12:27:54.643 2 INFO nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Creating image(s)#033[00m
Oct  2 08:27:54 np0005465988 nova_compute[236126]: 2025-10-02 12:27:54.643 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  2 08:27:54 np0005465988 nova_compute[236126]: 2025-10-02 12:27:54.643 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Ensure instance console log exists: /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:27:54 np0005465988 nova_compute[236126]: 2025-10-02 12:27:54.645 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:54 np0005465988 nova_compute[236126]: 2025-10-02 12:27:54.646 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:54 np0005465988 nova_compute[236126]: 2025-10-02 12:27:54.646 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:27:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2827393438' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:27:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:27:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2827393438' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:27:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:55.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:55.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:56 np0005465988 nova_compute[236126]: 2025-10-02 12:27:56.081 2 DEBUG nova.network.neutron [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Successfully updated port: 4abf20c2-f65e-479c-8fe9-62982b2fa096 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:27:56 np0005465988 nova_compute[236126]: 2025-10-02 12:27:56.249 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Acquiring lock "refresh_cache-1e47e923-75c9-4c8c-b5f3-86f715462a64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:27:56 np0005465988 nova_compute[236126]: 2025-10-02 12:27:56.249 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Acquired lock "refresh_cache-1e47e923-75c9-4c8c-b5f3-86f715462a64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:27:56 np0005465988 nova_compute[236126]: 2025-10-02 12:27:56.250 2 DEBUG nova.network.neutron [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:27:56 np0005465988 nova_compute[236126]: 2025-10-02 12:27:56.349 2 DEBUG nova.compute.manager [req-103eb1e4-0426-4499-b191-e7559968b788 req-c5fc32d7-d4ee-4f90-94dd-cdecfb85e636 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received event network-changed-4abf20c2-f65e-479c-8fe9-62982b2fa096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:56 np0005465988 nova_compute[236126]: 2025-10-02 12:27:56.350 2 DEBUG nova.compute.manager [req-103eb1e4-0426-4499-b191-e7559968b788 req-c5fc32d7-d4ee-4f90-94dd-cdecfb85e636 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Refreshing instance network info cache due to event network-changed-4abf20c2-f65e-479c-8fe9-62982b2fa096. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:27:56 np0005465988 nova_compute[236126]: 2025-10-02 12:27:56.350 2 DEBUG oslo_concurrency.lockutils [req-103eb1e4-0426-4499-b191-e7559968b788 req-c5fc32d7-d4ee-4f90-94dd-cdecfb85e636 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-1e47e923-75c9-4c8c-b5f3-86f715462a64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:27:56 np0005465988 nova_compute[236126]: 2025-10-02 12:27:56.701 2 DEBUG nova.network.neutron [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.091 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.091 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.092 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.092 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.093 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.094 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.521 2 DEBUG nova.network.neutron [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Updating instance_info_cache with network_info: [{"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:27:57 np0005465988 podman[283019]: 2025-10-02 12:27:57.533856241 +0000 UTC m=+0.070249302 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  2 08:27:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:57.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.732 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Releasing lock "refresh_cache-1e47e923-75c9-4c8c-b5f3-86f715462a64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.733 2 DEBUG nova.compute.manager [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Instance network_info: |[{"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.734 2 DEBUG oslo_concurrency.lockutils [req-103eb1e4-0426-4499-b191-e7559968b788 req-c5fc32d7-d4ee-4f90-94dd-cdecfb85e636 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-1e47e923-75c9-4c8c-b5f3-86f715462a64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.735 2 DEBUG nova.network.neutron [req-103eb1e4-0426-4499-b191-e7559968b788 req-c5fc32d7-d4ee-4f90-94dd-cdecfb85e636 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Refreshing network info cache for port 4abf20c2-f65e-479c-8fe9-62982b2fa096 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:27:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.741 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Start _get_guest_xml network_info=[{"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': 
'/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '4ecec6ac-c34a-4cf2-8a97-239fe3a61f1a', 'disk_bus': 'virtio', 'delete_on_termination': True, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-4c9d8719-7ebc-4171-8171-a0e55d9f5992', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '4c9d8719-7ebc-4171-8171-a0e55d9f5992', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '1e47e923-75c9-4c8c-b5f3-86f715462a64', 'attached_at': '', 'detached_at': '', 'volume_id': '4c9d8719-7ebc-4171-8171-a0e55d9f5992', 'serial': '4c9d8719-7ebc-4171-8171-a0e55d9f5992'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.749 2 WARNING nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.756 2 DEBUG nova.virt.libvirt.host [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.757 2 DEBUG nova.virt.libvirt.host [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.761 2 DEBUG nova.virt.libvirt.host [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.762 2 DEBUG nova.virt.libvirt.host [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.763 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.763 2 DEBUG nova.virt.hardware [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.764 2 DEBUG nova.virt.hardware [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.764 2 DEBUG nova.virt.hardware [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.764 2 DEBUG nova.virt.hardware [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.764 2 DEBUG nova.virt.hardware [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.765 2 DEBUG nova.virt.hardware [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.765 2 DEBUG nova.virt.hardware [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.765 2 DEBUG nova.virt.hardware [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.765 2 DEBUG nova.virt.hardware [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.766 2 DEBUG nova.virt.hardware [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.766 2 DEBUG nova.virt.hardware [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.797 2 DEBUG nova.storage.rbd_utils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] rbd image 1e47e923-75c9-4c8c-b5f3-86f715462a64_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.805 2 DEBUG oslo_concurrency.processutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:57 np0005465988 nova_compute[236126]: 2025-10-02 12:27:57.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:27:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:57.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:27:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:27:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/894747619' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.623 2 DEBUG oslo_concurrency.processutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.817s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.784 2 DEBUG nova.virt.libvirt.vif [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:27:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-1358372400',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-1358372400',id=108,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJsc13H1E1twasivn4l5fIknm6WxTVFzYRGRsGTsk9ZC9W5y41hzk124eOR+2RfREafyMEhaDPvhujygatWcUNbG54vVZEPcxsJp0CaKKZ+PeWEtzqfW2Ozb0JbEn7mctQ==',key_name='tempest-keypair-294989539',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7aad01ba5df14c1a9309451d0daaab83',ramdisk_id='',reservation_id='r-pdakyd3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name
='tempest-ServerActionsV293TestJSON-1846432011',owner_user_name='tempest-ServerActionsV293TestJSON-1846432011-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:27:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c6bda06d7e6348bab069e07b21022b60',uuid=1e47e923-75c9-4c8c-b5f3-86f715462a64,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.785 2 DEBUG nova.network.os_vif_util [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Converting VIF {"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.786 2 DEBUG nova.network.os_vif_util [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:c1:78,bridge_name='br-int',has_traffic_filtering=True,id=4abf20c2-f65e-479c-8fe9-62982b2fa096,network=Network(454203c0-2170-41e3-a903-014f7d235b68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4abf20c2-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.788 2 DEBUG nova.objects.instance [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1e47e923-75c9-4c8c-b5f3-86f715462a64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.833 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  <uuid>1e47e923-75c9-4c8c-b5f3-86f715462a64</uuid>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  <name>instance-0000006c</name>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerActionsV293TestJSON-server-1358372400</nova:name>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:27:57</nova:creationTime>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <nova:user uuid="c6bda06d7e6348bab069e07b21022b60">tempest-ServerActionsV293TestJSON-1846432011-project-member</nova:user>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <nova:project uuid="7aad01ba5df14c1a9309451d0daaab83">tempest-ServerActionsV293TestJSON-1846432011</nova:project>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <nova:port uuid="4abf20c2-f65e-479c-8fe9-62982b2fa096">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <entry name="serial">1e47e923-75c9-4c8c-b5f3-86f715462a64</entry>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <entry name="uuid">1e47e923-75c9-4c8c-b5f3-86f715462a64</entry>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/1e47e923-75c9-4c8c-b5f3-86f715462a64_disk.config">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-4c9d8719-7ebc-4171-8171-a0e55d9f5992">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <serial>4c9d8719-7ebc-4171-8171-a0e55d9f5992</serial>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:a1:c1:78"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <target dev="tap4abf20c2-f6"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/console.log" append="off"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:27:58 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:27:58 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:27:58 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:27:58 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.836 2 DEBUG nova.compute.manager [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Preparing to wait for external event network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.836 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Acquiring lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.837 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.837 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.839 2 DEBUG nova.virt.libvirt.vif [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:27:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-1358372400',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-1358372400',id=108,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJsc13H1E1twasivn4l5fIknm6WxTVFzYRGRsGTsk9ZC9W5y41hzk124eOR+2RfREafyMEhaDPvhujygatWcUNbG54vVZEPcxsJp0CaKKZ+PeWEtzqfW2Ozb0JbEn7mctQ==',key_name='tempest-keypair-294989539',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7aad01ba5df14c1a9309451d0daaab83',ramdisk_id='',reservation_id='r-pdakyd3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-ServerActionsV293TestJSON-1846432011',owner_user_name='tempest-ServerActionsV293TestJSON-1846432011-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:27:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c6bda06d7e6348bab069e07b21022b60',uuid=1e47e923-75c9-4c8c-b5f3-86f715462a64,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.839 2 DEBUG nova.network.os_vif_util [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Converting VIF {"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.840 2 DEBUG nova.network.os_vif_util [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:c1:78,bridge_name='br-int',has_traffic_filtering=True,id=4abf20c2-f65e-479c-8fe9-62982b2fa096,network=Network(454203c0-2170-41e3-a903-014f7d235b68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4abf20c2-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.841 2 DEBUG os_vif [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:c1:78,bridge_name='br-int',has_traffic_filtering=True,id=4abf20c2-f65e-479c-8fe9-62982b2fa096,network=Network(454203c0-2170-41e3-a903-014f7d235b68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4abf20c2-f6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.843 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.843 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.849 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4abf20c2-f6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.850 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4abf20c2-f6, col_values=(('external_ids', {'iface-id': '4abf20c2-f65e-479c-8fe9-62982b2fa096', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a1:c1:78', 'vm-uuid': '1e47e923-75c9-4c8c-b5f3-86f715462a64'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:58 np0005465988 NetworkManager[45041]: <info>  [1759408078.8974] manager: (tap4abf20c2-f6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:27:58 np0005465988 nova_compute[236126]: 2025-10-02 12:27:58.909 2 INFO os_vif [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:c1:78,bridge_name='br-int',has_traffic_filtering=True,id=4abf20c2-f65e-479c-8fe9-62982b2fa096,network=Network(454203c0-2170-41e3-a903-014f7d235b68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4abf20c2-f6')#033[00m
Oct  2 08:27:59 np0005465988 nova_compute[236126]: 2025-10-02 12:27:59.157 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:27:59 np0005465988 nova_compute[236126]: 2025-10-02 12:27:59.158 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:27:59 np0005465988 nova_compute[236126]: 2025-10-02 12:27:59.158 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] No VIF found with MAC fa:16:3e:a1:c1:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:27:59 np0005465988 nova_compute[236126]: 2025-10-02 12:27:59.159 2 INFO nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Using config drive#033[00m
Oct  2 08:27:59 np0005465988 nova_compute[236126]: 2025-10-02 12:27:59.190 2 DEBUG nova.storage.rbd_utils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] rbd image 1e47e923-75c9-4c8c-b5f3-86f715462a64_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:27:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:27:59.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:27:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:27:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:27:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:27:59.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:00 np0005465988 nova_compute[236126]: 2025-10-02 12:28:00.080 2 INFO nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Creating config drive at /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/disk.config#033[00m
Oct  2 08:28:00 np0005465988 nova_compute[236126]: 2025-10-02 12:28:00.093 2 DEBUG oslo_concurrency.processutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptj65dlqi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:00 np0005465988 nova_compute[236126]: 2025-10-02 12:28:00.155 2 DEBUG nova.network.neutron [req-103eb1e4-0426-4499-b191-e7559968b788 req-c5fc32d7-d4ee-4f90-94dd-cdecfb85e636 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Updated VIF entry in instance network info cache for port 4abf20c2-f65e-479c-8fe9-62982b2fa096. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:28:00 np0005465988 nova_compute[236126]: 2025-10-02 12:28:00.156 2 DEBUG nova.network.neutron [req-103eb1e4-0426-4499-b191-e7559968b788 req-c5fc32d7-d4ee-4f90-94dd-cdecfb85e636 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Updating instance_info_cache with network_info: [{"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:28:00 np0005465988 nova_compute[236126]: 2025-10-02 12:28:00.264 2 DEBUG oslo_concurrency.lockutils [req-103eb1e4-0426-4499-b191-e7559968b788 req-c5fc32d7-d4ee-4f90-94dd-cdecfb85e636 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-1e47e923-75c9-4c8c-b5f3-86f715462a64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:28:00 np0005465988 nova_compute[236126]: 2025-10-02 12:28:00.272 2 DEBUG oslo_concurrency.processutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptj65dlqi" returned: 0 in 0.178s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:00 np0005465988 nova_compute[236126]: 2025-10-02 12:28:00.304 2 DEBUG nova.storage.rbd_utils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] rbd image 1e47e923-75c9-4c8c-b5f3-86f715462a64_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:28:00 np0005465988 nova_compute[236126]: 2025-10-02 12:28:00.308 2 DEBUG oslo_concurrency.processutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/disk.config 1e47e923-75c9-4c8c-b5f3-86f715462a64_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:00 np0005465988 nova_compute[236126]: 2025-10-02 12:28:00.925 2 DEBUG oslo_concurrency.processutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/disk.config 1e47e923-75c9-4c8c-b5f3-86f715462a64_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.617s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:00 np0005465988 nova_compute[236126]: 2025-10-02 12:28:00.926 2 INFO nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Deleting local config drive /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/disk.config because it was imported into RBD.#033[00m
Oct  2 08:28:00 np0005465988 nova_compute[236126]: 2025-10-02 12:28:00.931 2 INFO nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance shutdown successfully after 10 seconds.#033[00m
Oct  2 08:28:00 np0005465988 kernel: tap4abf20c2-f6: entered promiscuous mode
Oct  2 08:28:01 np0005465988 NetworkManager[45041]: <info>  [1759408081.0027] manager: (tap4abf20c2-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/219)
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:01Z|00475|binding|INFO|Claiming lport 4abf20c2-f65e-479c-8fe9-62982b2fa096 for this chassis.
Oct  2 08:28:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:01Z|00476|binding|INFO|4abf20c2-f65e-479c-8fe9-62982b2fa096: Claiming fa:16:3e:a1:c1:78 10.100.0.10
Oct  2 08:28:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:01Z|00477|binding|INFO|Setting lport 4abf20c2-f65e-479c-8fe9-62982b2fa096 ovn-installed in OVS
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:01 np0005465988 systemd-udevd[283152]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:28:01 np0005465988 NetworkManager[45041]: <info>  [1759408081.0641] device (tap4abf20c2-f6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:28:01 np0005465988 systemd-machined[192594]: New machine qemu-46-instance-0000006c.
Oct  2 08:28:01 np0005465988 NetworkManager[45041]: <info>  [1759408081.0660] device (tap4abf20c2-f6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.069 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:c1:78 10.100.0.10'], port_security=['fa:16:3e:a1:c1:78 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '1e47e923-75c9-4c8c-b5f3-86f715462a64', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-454203c0-2170-41e3-a903-014f7d235b68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7aad01ba5df14c1a9309451d0daaab83', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6993526b-112b-49f0-b6c4-5993fe406fdf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=825ba480-e228-47f3-8e23-c895bd3a1194, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=4abf20c2-f65e-479c-8fe9-62982b2fa096) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:28:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:01Z|00478|binding|INFO|Setting lport 4abf20c2-f65e-479c-8fe9-62982b2fa096 up in Southbound
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.071 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 4abf20c2-f65e-479c-8fe9-62982b2fa096 in datapath 454203c0-2170-41e3-a903-014f7d235b68 bound to our chassis#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.075 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 454203c0-2170-41e3-a903-014f7d235b68#033[00m
Oct  2 08:28:01 np0005465988 systemd[1]: Started Virtual Machine qemu-46-instance-0000006c.
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.093 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0ebef1b9-8d44-4cac-bf8f-10ac965c877b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.094 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap454203c0-21 in ovnmeta-454203c0-2170-41e3-a903-014f7d235b68 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.096 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap454203c0-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.096 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ce8a5486-b27f-4ab6-8227-f1a27669ea81]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.097 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[50567665-a5b1-40f9-8939-48ad3aadff8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.118 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[692129b3-d17a-4c80-be70-6833c8215e88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.153 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[acf89aba-670b-4e3d-b769-cc49de7c3f2e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.192 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4026e3ed-7303-41c6-9004-77246cf38041]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.199 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[574f0ab3-0093-46dd-9893-3ecd52ffbe44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 NetworkManager[45041]: <info>  [1759408081.2003] manager: (tap454203c0-20): new Veth device (/org/freedesktop/NetworkManager/Devices/220)
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.239 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a56530-b5c1-45f2-800f-6c3d1c183693]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.243 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[38141c6f-beea-4e63-a17c-3d97da9148ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 NetworkManager[45041]: <info>  [1759408081.3267] device (tap454203c0-20): carrier: link connected
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.332 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[1dfd7ba2-0e5d-4bea-b18b-4bf9ff805c64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.350 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[42b3da24-2966-4829-bb83-9eed600d2ec6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap454203c0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:aa:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614165, 'reachable_time': 44641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283188, 'error': None, 'target': 'ovnmeta-454203c0-2170-41e3-a903-014f7d235b68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.369 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bfd1f0f8-c0be-4dc4-896d-c978ac2a8dcf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed6:aa5d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 614165, 'tstamp': 614165}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 283189, 'error': None, 'target': 'ovnmeta-454203c0-2170-41e3-a903-014f7d235b68', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.392 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d8815ddb-cb9b-494d-bd0e-4ccabb7f93b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap454203c0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:aa:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614165, 'reachable_time': 44641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 283190, 'error': None, 'target': 'ovnmeta-454203c0-2170-41e3-a903-014f7d235b68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 kernel: tap3bdb6970-48 (unregistering): left promiscuous mode
Oct  2 08:28:01 np0005465988 NetworkManager[45041]: <info>  [1759408081.4241] device (tap3bdb6970-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:01Z|00479|binding|INFO|Releasing lport 3bdb6970-487f-4313-ab25-aa900f8b084a from this chassis (sb_readonly=0)
Oct  2 08:28:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:01Z|00480|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a down in Southbound
Oct  2 08:28:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:01Z|00481|binding|INFO|Removing iface tap3bdb6970-48 ovn-installed in OVS
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.445 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[caa14440-35d2-46cd-a343-1dee66ba3253]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.480 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:0e:b9 10.100.0.6'], port_security=['fa:16:3e:22:0e:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87ebffd5-69af-414b-be5d-67ba42e8cae1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '12', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3bdb6970-487f-4313-ab25-aa900f8b084a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:28:01 np0005465988 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000064.scope: Deactivated successfully.
Oct  2 08:28:01 np0005465988 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000064.scope: Consumed 19.948s CPU time.
Oct  2 08:28:01 np0005465988 systemd-machined[192594]: Machine qemu-43-instance-00000064 terminated.
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.529 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[51aface0-0232-4801-996e-0b9965059d72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.530 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap454203c0-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.531 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.531 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap454203c0-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:01 np0005465988 NetworkManager[45041]: <info>  [1759408081.5336] manager: (tap454203c0-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Oct  2 08:28:01 np0005465988 kernel: tap454203c0-20: entered promiscuous mode
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.544 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap454203c0-20, col_values=(('external_ids', {'iface-id': '785d24d0-2f0b-4375-86f8-0edd313fa257'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:01Z|00482|binding|INFO|Releasing lport 785d24d0-2f0b-4375-86f8-0edd313fa257 from this chassis (sb_readonly=0)
Oct  2 08:28:01 np0005465988 NetworkManager[45041]: <info>  [1759408081.5528] manager: (tap3bdb6970-48): new Tun device (/org/freedesktop/NetworkManager/Devices/222)
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.569 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/454203c0-2170-41e3-a903-014f7d235b68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/454203c0-2170-41e3-a903-014f7d235b68.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.571 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8767607b-f31b-4049-8447-c14c6ece3c86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.572 2 INFO nova.virt.libvirt.driver [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance destroyed successfully.#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.572 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-454203c0-2170-41e3-a903-014f7d235b68
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/454203c0-2170-41e3-a903-014f7d235b68.pid.haproxy
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 454203c0-2170-41e3-a903-014f7d235b68
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:28:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:01.573 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-454203c0-2170-41e3-a903-014f7d235b68', 'env', 'PROCESS_TAG=haproxy-454203c0-2170-41e3-a903-014f7d235b68', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/454203c0-2170-41e3-a903-014f7d235b68.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.573 2 DEBUG nova.virt.libvirt.vif [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:27:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1352928597-network", "vif_mac": "fa:16:3e:22:0e:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.573 2 DEBUG nova.network.os_vif_util [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1352928597-network", "vif_mac": "fa:16:3e:22:0e:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.574 2 DEBUG nova.network.os_vif_util [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.575 2 DEBUG os_vif [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.577 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bdb6970-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.585 2 INFO os_vif [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48')#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.590 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.590 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:28:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:01.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:01 np0005465988 nova_compute[236126]: 2025-10-02 12:28:01.871 2 DEBUG nova.network.neutron [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Port 3bdb6970-487f-4313-ab25-aa900f8b084a binding to destination host compute-2.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Oct  2 08:28:01 np0005465988 podman[283279]: 2025-10-02 12:28:01.944355127 +0000 UTC m=+0.056888227 container create b51dd6d3a0c7bba6e58ec5224569e5aea36eeb964fd70eaabb3ad4f72886b95e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:28:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:01.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:01 np0005465988 systemd[1]: Started libpod-conmon-b51dd6d3a0c7bba6e58ec5224569e5aea36eeb964fd70eaabb3ad4f72886b95e.scope.
Oct  2 08:28:02 np0005465988 podman[283279]: 2025-10-02 12:28:01.910586826 +0000 UTC m=+0.023119936 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:28:02 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:28:02 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a88f9267b229884fd6a36e2416c5b138a03e422ca24e731d83993c029a098b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:28:02 np0005465988 podman[283279]: 2025-10-02 12:28:02.029763674 +0000 UTC m=+0.142296854 container init b51dd6d3a0c7bba6e58ec5224569e5aea36eeb964fd70eaabb3ad4f72886b95e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:28:02 np0005465988 podman[283279]: 2025-10-02 12:28:02.039180915 +0000 UTC m=+0.151714045 container start b51dd6d3a0c7bba6e58ec5224569e5aea36eeb964fd70eaabb3ad4f72886b95e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:28:02 np0005465988 neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68[283294]: [NOTICE]   (283298) : New worker (283300) forked
Oct  2 08:28:02 np0005465988 neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68[283294]: [NOTICE]   (283298) : Loading success.
Oct  2 08:28:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:02.108 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdb6970-487f-4313-ab25-aa900f8b084a in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff unbound from our chassis#033[00m
Oct  2 08:28:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:02.110 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:28:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:02.111 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bcbb2fdd-66bf-4711-bfd9-210fee5b5000]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:02.112 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff namespace which is not needed anymore#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.193 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408082.192899, 1e47e923-75c9-4c8c-b5f3-86f715462a64 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.194 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] VM Started (Lifecycle Event)#033[00m
Oct  2 08:28:02 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[280865]: [NOTICE]   (280879) : haproxy version is 2.8.14-c23fe91
Oct  2 08:28:02 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[280865]: [NOTICE]   (280879) : path to executable is /usr/sbin/haproxy
Oct  2 08:28:02 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[280865]: [WARNING]  (280879) : Exiting Master process...
Oct  2 08:28:02 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[280865]: [ALERT]    (280879) : Current worker (280881) exited with code 143 (Terminated)
Oct  2 08:28:02 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[280865]: [WARNING]  (280879) : All workers exited. Exiting... (0)
Oct  2 08:28:02 np0005465988 systemd[1]: libpod-1ed888cf00604ca1372c153d13bf96e49b5565d36e25347afc969eb19a330fc2.scope: Deactivated successfully.
Oct  2 08:28:02 np0005465988 podman[283325]: 2025-10-02 12:28:02.254038416 +0000 UTC m=+0.046035325 container died 1ed888cf00604ca1372c153d13bf96e49b5565d36e25347afc969eb19a330fc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.266 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.267 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.267 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:02 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1ed888cf00604ca1372c153d13bf96e49b5565d36e25347afc969eb19a330fc2-userdata-shm.mount: Deactivated successfully.
Oct  2 08:28:02 np0005465988 systemd[1]: var-lib-containers-storage-overlay-8b4c8c5613f7cd0d02446f2d7e03a560ea0266981874cd2ef23f007f5078e168-merged.mount: Deactivated successfully.
Oct  2 08:28:02 np0005465988 podman[283325]: 2025-10-02 12:28:02.295233081 +0000 UTC m=+0.087229950 container cleanup 1ed888cf00604ca1372c153d13bf96e49b5565d36e25347afc969eb19a330fc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:28:02 np0005465988 systemd[1]: libpod-conmon-1ed888cf00604ca1372c153d13bf96e49b5565d36e25347afc969eb19a330fc2.scope: Deactivated successfully.
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.323 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.328 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408082.1930108, 1e47e923-75c9-4c8c-b5f3-86f715462a64 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.328 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:28:02 np0005465988 podman[283355]: 2025-10-02 12:28:02.363354861 +0000 UTC m=+0.047319652 container remove 1ed888cf00604ca1372c153d13bf96e49b5565d36e25347afc969eb19a330fc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:28:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:02.371 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d38bcb99-4f48-4815-8ba4-7e027ac7382f]: (4, ('Thu Oct  2 12:28:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff (1ed888cf00604ca1372c153d13bf96e49b5565d36e25347afc969eb19a330fc2)\n1ed888cf00604ca1372c153d13bf96e49b5565d36e25347afc969eb19a330fc2\nThu Oct  2 12:28:02 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff (1ed888cf00604ca1372c153d13bf96e49b5565d36e25347afc969eb19a330fc2)\n1ed888cf00604ca1372c153d13bf96e49b5565d36e25347afc969eb19a330fc2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:02.373 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[533e09d7-6114-41c2-8045-8b3f200670fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:02.374 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:02 np0005465988 kernel: tapb2c62a66-f0: left promiscuous mode
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:02.439 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b03bb494-b177-4b80-afc0-625628f6d92a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.445 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.454 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.471 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:02.472 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d37d6c-2654-4c10-a819-e696c1beb8da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:28:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:02.474 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[63c12a11-bed6-498a-9b3d-efd1f81f901c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:02.494 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa44529-8b45-4a6e-a72b-2e53555fc322]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 601306, 'reachable_time': 32112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283370, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:02.496 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:28:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:02.497 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[fa217b1c-e853-4ca3-9869-57b9749799df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:02 np0005465988 systemd[1]: run-netns-ovnmeta\x2db2c62a66\x2df9bc\x2d4a45\x2da843\x2daef2e12a7fff.mount: Deactivated successfully.
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.614 2 DEBUG nova.compute.manager [req-c45d7f9b-e720-48cb-9a02-f09b58e526d7 req-860a3f30-6780-478c-8d75-dada5e554e4d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.615 2 DEBUG oslo_concurrency.lockutils [req-c45d7f9b-e720-48cb-9a02-f09b58e526d7 req-860a3f30-6780-478c-8d75-dada5e554e4d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.616 2 DEBUG oslo_concurrency.lockutils [req-c45d7f9b-e720-48cb-9a02-f09b58e526d7 req-860a3f30-6780-478c-8d75-dada5e554e4d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.616 2 DEBUG oslo_concurrency.lockutils [req-c45d7f9b-e720-48cb-9a02-f09b58e526d7 req-860a3f30-6780-478c-8d75-dada5e554e4d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.616 2 DEBUG nova.compute.manager [req-c45d7f9b-e720-48cb-9a02-f09b58e526d7 req-860a3f30-6780-478c-8d75-dada5e554e4d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.616 2 WARNING nova.compute.manager [req-c45d7f9b-e720-48cb-9a02-f09b58e526d7 req-860a3f30-6780-478c-8d75-dada5e554e4d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.655 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.676 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:28:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.851 2 DEBUG nova.compute.manager [req-5448670d-9d52-4320-ab9a-5e0a6f572959 req-1fd642c8-7770-4986-a1b7-d80f2a1aeefe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received event network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.851 2 DEBUG oslo_concurrency.lockutils [req-5448670d-9d52-4320-ab9a-5e0a6f572959 req-1fd642c8-7770-4986-a1b7-d80f2a1aeefe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.851 2 DEBUG oslo_concurrency.lockutils [req-5448670d-9d52-4320-ab9a-5e0a6f572959 req-1fd642c8-7770-4986-a1b7-d80f2a1aeefe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.852 2 DEBUG oslo_concurrency.lockutils [req-5448670d-9d52-4320-ab9a-5e0a6f572959 req-1fd642c8-7770-4986-a1b7-d80f2a1aeefe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.852 2 DEBUG nova.compute.manager [req-5448670d-9d52-4320-ab9a-5e0a6f572959 req-1fd642c8-7770-4986-a1b7-d80f2a1aeefe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Processing event network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.852 2 DEBUG nova.compute.manager [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.856 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408082.8567033, 1e47e923-75c9-4c8c-b5f3-86f715462a64 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.857 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.859 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.865 2 INFO nova.virt.libvirt.driver [-] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Instance spawned successfully.#033[00m
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.866 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.936 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.939 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.939 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.940 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.941 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.942 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.942 2 DEBUG nova.virt.libvirt.driver [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:28:02 np0005465988 nova_compute[236126]: 2025-10-02 12:28:02.952 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:28:03 np0005465988 nova_compute[236126]: 2025-10-02 12:28:03.001 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:28:03 np0005465988 nova_compute[236126]: 2025-10-02 12:28:03.100 2 INFO nova.compute.manager [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Took 8.46 seconds to spawn the instance on the hypervisor.
Oct  2 08:28:03 np0005465988 nova_compute[236126]: 2025-10-02 12:28:03.101 2 DEBUG nova.compute.manager [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:28:03 np0005465988 nova_compute[236126]: 2025-10-02 12:28:03.149 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:28:03 np0005465988 nova_compute[236126]: 2025-10-02 12:28:03.149 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquired lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:28:03 np0005465988 nova_compute[236126]: 2025-10-02 12:28:03.150 2 DEBUG nova.network.neutron [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:28:03 np0005465988 nova_compute[236126]: 2025-10-02 12:28:03.295 2 INFO nova.compute.manager [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Took 15.89 seconds to build instance.
Oct  2 08:28:03 np0005465988 nova_compute[236126]: 2025-10-02 12:28:03.338 2 DEBUG oslo_concurrency.lockutils [None req-1ea9eeb8-6db9-4a7f-974e-0ab9bd753b9c c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:28:03 np0005465988 nova_compute[236126]: 2025-10-02 12:28:03.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:28:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:03.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:28:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:03.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:28:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:05.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:05.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:06 np0005465988 nova_compute[236126]: 2025-10-02 12:28:06.045 2 DEBUG nova.compute.manager [req-cec42c8b-1b33-4c2e-a6ac-b0a225498cb4 req-29bfcea4-2019-4ef9-9f09-8002548e9944 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:28:06 np0005465988 nova_compute[236126]: 2025-10-02 12:28:06.047 2 DEBUG oslo_concurrency.lockutils [req-cec42c8b-1b33-4c2e-a6ac-b0a225498cb4 req-29bfcea4-2019-4ef9-9f09-8002548e9944 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:28:06 np0005465988 nova_compute[236126]: 2025-10-02 12:28:06.048 2 DEBUG oslo_concurrency.lockutils [req-cec42c8b-1b33-4c2e-a6ac-b0a225498cb4 req-29bfcea4-2019-4ef9-9f09-8002548e9944 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:28:06 np0005465988 nova_compute[236126]: 2025-10-02 12:28:06.048 2 DEBUG oslo_concurrency.lockutils [req-cec42c8b-1b33-4c2e-a6ac-b0a225498cb4 req-29bfcea4-2019-4ef9-9f09-8002548e9944 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:28:06 np0005465988 nova_compute[236126]: 2025-10-02 12:28:06.048 2 DEBUG nova.compute.manager [req-cec42c8b-1b33-4c2e-a6ac-b0a225498cb4 req-29bfcea4-2019-4ef9-9f09-8002548e9944 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:28:06 np0005465988 nova_compute[236126]: 2025-10-02 12:28:06.049 2 WARNING nova.compute.manager [req-cec42c8b-1b33-4c2e-a6ac-b0a225498cb4 req-29bfcea4-2019-4ef9-9f09-8002548e9944 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state active and task_state resize_migrated.
Oct  2 08:28:06 np0005465988 nova_compute[236126]: 2025-10-02 12:28:06.253 2 DEBUG nova.compute.manager [req-eb47a346-7d4c-4ea6-989e-9ca37ef7e760 req-9e32e878-577f-4ea7-9001-880262a74127 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received event network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:28:06 np0005465988 nova_compute[236126]: 2025-10-02 12:28:06.253 2 DEBUG oslo_concurrency.lockutils [req-eb47a346-7d4c-4ea6-989e-9ca37ef7e760 req-9e32e878-577f-4ea7-9001-880262a74127 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:28:06 np0005465988 nova_compute[236126]: 2025-10-02 12:28:06.253 2 DEBUG oslo_concurrency.lockutils [req-eb47a346-7d4c-4ea6-989e-9ca37ef7e760 req-9e32e878-577f-4ea7-9001-880262a74127 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:28:06 np0005465988 nova_compute[236126]: 2025-10-02 12:28:06.253 2 DEBUG oslo_concurrency.lockutils [req-eb47a346-7d4c-4ea6-989e-9ca37ef7e760 req-9e32e878-577f-4ea7-9001-880262a74127 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:28:06 np0005465988 nova_compute[236126]: 2025-10-02 12:28:06.254 2 DEBUG nova.compute.manager [req-eb47a346-7d4c-4ea6-989e-9ca37ef7e760 req-9e32e878-577f-4ea7-9001-880262a74127 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] No waiting events found dispatching network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:28:06 np0005465988 nova_compute[236126]: 2025-10-02 12:28:06.254 2 WARNING nova.compute.manager [req-eb47a346-7d4c-4ea6-989e-9ca37ef7e760 req-9e32e878-577f-4ea7-9001-880262a74127 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received unexpected event network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 for instance with vm_state active and task_state None.
Oct  2 08:28:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:28:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:28:06 np0005465988 nova_compute[236126]: 2025-10-02 12:28:06.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:28:06 np0005465988 nova_compute[236126]: 2025-10-02 12:28:06.651 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:28:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:07.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:28:07 np0005465988 nova_compute[236126]: 2025-10-02 12:28:07.742 2 DEBUG nova.network.neutron [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating instance_info_cache with network_info: [{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:28:07 np0005465988 nova_compute[236126]: 2025-10-02 12:28:07.809 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Releasing lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:28:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:07.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:08 np0005465988 nova_compute[236126]: 2025-10-02 12:28:08.134 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Oct  2 08:28:08 np0005465988 nova_compute[236126]: 2025-10-02 12:28:08.136 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Oct  2 08:28:08 np0005465988 nova_compute[236126]: 2025-10-02 12:28:08.136 2 INFO nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Creating image(s)
Oct  2 08:28:08 np0005465988 nova_compute[236126]: 2025-10-02 12:28:08.191 2 DEBUG nova.storage.rbd_utils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] creating snapshot(nova-resize) on rbd image(87ebffd5-69af-414b-be5d-67ba42e8cae1_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Oct  2 08:28:08 np0005465988 nova_compute[236126]: 2025-10-02 12:28:08.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:28:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:09.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:28:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:09.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:28:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e303 e303: 3 total, 3 up, 3 in
Oct  2 08:28:11 np0005465988 nova_compute[236126]: 2025-10-02 12:28:11.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:28:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:11.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:11.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.008 2 DEBUG nova.compute.manager [req-8def1249-e7ac-4921-889d-8e8728a31914 req-afe5f60b-97bc-4427-b9f3-1eb91b464030 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received event network-changed-4abf20c2-f65e-479c-8fe9-62982b2fa096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.008 2 DEBUG nova.compute.manager [req-8def1249-e7ac-4921-889d-8e8728a31914 req-afe5f60b-97bc-4427-b9f3-1eb91b464030 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Refreshing instance network info cache due to event network-changed-4abf20c2-f65e-479c-8fe9-62982b2fa096. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.009 2 DEBUG oslo_concurrency.lockutils [req-8def1249-e7ac-4921-889d-8e8728a31914 req-afe5f60b-97bc-4427-b9f3-1eb91b464030 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-1e47e923-75c9-4c8c-b5f3-86f715462a64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.009 2 DEBUG oslo_concurrency.lockutils [req-8def1249-e7ac-4921-889d-8e8728a31914 req-afe5f60b-97bc-4427-b9f3-1eb91b464030 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-1e47e923-75c9-4c8c-b5f3-86f715462a64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.009 2 DEBUG nova.network.neutron [req-8def1249-e7ac-4921-889d-8e8728a31914 req-afe5f60b-97bc-4427-b9f3-1eb91b464030 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Refreshing network info cache for port 4abf20c2-f65e-479c-8fe9-62982b2fa096 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.055 2 DEBUG nova.objects.instance [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.211 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.211 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Ensure instance console log exists: /var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.212 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.212 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.213 2 DEBUG oslo_concurrency.lockutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.216 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Start _get_guest_xml network_info=[{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1352928597-network", "vif_mac": "fa:16:3e:22:0e:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.223 2 WARNING nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.232 2 DEBUG nova.virt.libvirt.host [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.233 2 DEBUG nova.virt.libvirt.host [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.236 2 DEBUG nova.virt.libvirt.host [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.237 2 DEBUG nova.virt.libvirt.host [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.238 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.238 2 DEBUG nova.virt.hardware [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eb3a53f1-304b-4cb0-acc3-abffce0fb181',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.239 2 DEBUG nova.virt.hardware [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.239 2 DEBUG nova.virt.hardware [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.239 2 DEBUG nova.virt.hardware [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.240 2 DEBUG nova.virt.hardware [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.240 2 DEBUG nova.virt.hardware [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.240 2 DEBUG nova.virt.hardware [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.241 2 DEBUG nova.virt.hardware [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.241 2 DEBUG nova.virt.hardware [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.241 2 DEBUG nova.virt.hardware [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.242 2 DEBUG nova.virt.hardware [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.242 2 DEBUG nova.objects.instance [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.263 2 DEBUG oslo_concurrency.processutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.711 2 DEBUG oslo_concurrency.processutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:28:12 np0005465988 nova_compute[236126]: 2025-10-02 12:28:12.762 2 DEBUG oslo_concurrency.processutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:28:13 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2193085479' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.211 2 DEBUG oslo_concurrency.processutils [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.213 2 DEBUG nova.virt.libvirt.vif [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:28:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1352928597-network", "vif_mac": "fa:16:3e:22:0e:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.215 2 DEBUG nova.network.os_vif_util [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1352928597-network", "vif_mac": "fa:16:3e:22:0e:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.216 2 DEBUG nova.network.os_vif_util [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.219 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  <uuid>87ebffd5-69af-414b-be5d-67ba42e8cae1</uuid>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  <name>instance-00000064</name>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  <memory>196608</memory>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerActionsTestJSON-server-131502281</nova:name>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:28:12</nova:creationTime>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.micro">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <nova:memory>192</nova:memory>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <nova:user uuid="2bd16d1f5f9d4eb396c474eedee67165">tempest-ServerActionsTestJSON-842270816-project-member</nova:user>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <nova:project uuid="4b8ca48cb5f64ef3b0736b8be82378b8">tempest-ServerActionsTestJSON-842270816</nova:project>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <nova:port uuid="3bdb6970-487f-4313-ab25-aa900f8b084a">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <entry name="serial">87ebffd5-69af-414b-be5d-67ba42e8cae1</entry>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <entry name="uuid">87ebffd5-69af-414b-be5d-67ba42e8cae1</entry>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/87ebffd5-69af-414b-be5d-67ba42e8cae1_disk">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/87ebffd5-69af-414b-be5d-67ba42e8cae1_disk.config">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:22:0e:b9"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <target dev="tap3bdb6970-48"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1/console.log" append="off"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:28:13 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:28:13 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:28:13 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:28:13 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.221 2 DEBUG nova.virt.libvirt.vif [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:28:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1352928597-network", "vif_mac": "fa:16:3e:22:0e:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.221 2 DEBUG nova.network.os_vif_util [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1352928597-network", "vif_mac": "fa:16:3e:22:0e:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.221 2 DEBUG nova.network.os_vif_util [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.222 2 DEBUG os_vif [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.223 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.224 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.232 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3bdb6970-48, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.233 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3bdb6970-48, col_values=(('external_ids', {'iface-id': '3bdb6970-487f-4313-ab25-aa900f8b084a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:0e:b9', 'vm-uuid': '87ebffd5-69af-414b-be5d-67ba42e8cae1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005465988 NetworkManager[45041]: <info>  [1759408093.2362] manager: (tap3bdb6970-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.243 2 INFO os_vif [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48')#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.647 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.647 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.648 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No VIF found with MAC fa:16:3e:22:0e:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.649 2 INFO nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Using config drive#033[00m
Oct  2 08:28:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:28:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:13.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:28:13 np0005465988 NetworkManager[45041]: <info>  [1759408093.7786] manager: (tap3bdb6970-48): new Tun device (/org/freedesktop/NetworkManager/Devices/224)
Oct  2 08:28:13 np0005465988 kernel: tap3bdb6970-48: entered promiscuous mode
Oct  2 08:28:13 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:13Z|00483|binding|INFO|Claiming lport 3bdb6970-487f-4313-ab25-aa900f8b084a for this chassis.
Oct  2 08:28:13 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:13Z|00484|binding|INFO|3bdb6970-487f-4313-ab25-aa900f8b084a: Claiming fa:16:3e:22:0e:b9 10.100.0.6
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:13Z|00485|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a ovn-installed in OVS
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.804 2 DEBUG nova.network.neutron [req-8def1249-e7ac-4921-889d-8e8728a31914 req-afe5f60b-97bc-4427-b9f3-1eb91b464030 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Updated VIF entry in instance network info cache for port 4abf20c2-f65e-479c-8fe9-62982b2fa096. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.805 2 DEBUG nova.network.neutron [req-8def1249-e7ac-4921-889d-8e8728a31914 req-afe5f60b-97bc-4427-b9f3-1eb91b464030 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Updating instance_info_cache with network_info: [{"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:28:13 np0005465988 systemd-udevd[283645]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:28:13 np0005465988 systemd-machined[192594]: New machine qemu-47-instance-00000064.
Oct  2 08:28:13 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:13Z|00486|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a up in Southbound
Oct  2 08:28:13 np0005465988 NetworkManager[45041]: <info>  [1759408093.8437] device (tap3bdb6970-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:28:13 np0005465988 systemd[1]: Started Virtual Machine qemu-47-instance-00000064.
Oct  2 08:28:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:13.843 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:0e:b9 10.100.0.6'], port_security=['fa:16:3e:22:0e:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87ebffd5-69af-414b-be5d-67ba42e8cae1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '13', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3bdb6970-487f-4313-ab25-aa900f8b084a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:28:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:13.844 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdb6970-487f-4313-ab25-aa900f8b084a in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff bound to our chassis#033[00m
Oct  2 08:28:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:13.846 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff#033[00m
Oct  2 08:28:13 np0005465988 NetworkManager[45041]: <info>  [1759408093.8501] device (tap3bdb6970-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:28:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:13.860 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5b6a031f-2a13-4a43-8b2d-0bfe9b777791]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:13.861 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2c62a66-f1 in ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:28:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:13.864 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2c62a66-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:28:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:13.864 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[70986625-4762-4781-a47c-0036a4fbfa59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:13.865 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[de05d9af-24d6-4e6f-b334-934fa0a8f92b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:13.883 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[d08c01cc-33a2-484c-8a98-cb6fc8ce8adc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:13.914 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d0fdbe-ec3f-4c0e-9360-e3ef8d30f540]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:13 np0005465988 nova_compute[236126]: 2025-10-02 12:28:13.953 2 DEBUG oslo_concurrency.lockutils [req-8def1249-e7ac-4921-889d-8e8728a31914 req-afe5f60b-97bc-4427-b9f3-1eb91b464030 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-1e47e923-75c9-4c8c-b5f3-86f715462a64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:28:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:13.969 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[cb38114d-59e4-41bf-8cde-ab7c8f39bef7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:13.975 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d98decda-04dc-4dfa-8138-c7664489e111]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:13 np0005465988 NetworkManager[45041]: <info>  [1759408093.9771] manager: (tapb2c62a66-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/225)
Oct  2 08:28:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:14.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.015 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[26a0cd24-bab6-4d19-995f-8fa4a0a3201f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.019 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e8afd5a3-1e81-487e-822a-54d7865b69e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:14 np0005465988 NetworkManager[45041]: <info>  [1759408094.0413] device (tapb2c62a66-f0): carrier: link connected
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.052 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[6ad16a10-2df0-408a-ab69-4fefe468dcff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.079 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8b641f27-d507-43e0-8b22-b5c72654a147]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 142], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 615437, 'reachable_time': 21051, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283679, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.106 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb0f9e4-7fb7-48e2-baab-9f7df73e1801]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:7a7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 615437, 'tstamp': 615437}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 283680, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.136 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[28a1e319-7283-4fe3-89af-5730e4785e92]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 142], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 615437, 'reachable_time': 21051, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 283681, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.193 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[882f7913-01ad-457a-a632-7e00c058552e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.269 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c72ae2fc-3a98-4663-bb7a-ef12c014cddd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.271 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.271 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.271 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2c62a66-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:14 np0005465988 kernel: tapb2c62a66-f0: entered promiscuous mode
Oct  2 08:28:14 np0005465988 nova_compute[236126]: 2025-10-02 12:28:14.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:14 np0005465988 NetworkManager[45041]: <info>  [1759408094.2877] manager: (tapb2c62a66-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Oct  2 08:28:14 np0005465988 nova_compute[236126]: 2025-10-02 12:28:14.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.293 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2c62a66-f0, col_values=(('external_ids', {'iface-id': '8c1234f6-f595-4979-94c0-98da2211f0e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:14 np0005465988 nova_compute[236126]: 2025-10-02 12:28:14.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:14Z|00487|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.296 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:28:14 np0005465988 nova_compute[236126]: 2025-10-02 12:28:14.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.297 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3fe2e04e-9fed-4ca4-b1c5-6ad291343265]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.297 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:28:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:14.298 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'env', 'PROCESS_TAG=haproxy-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:28:14 np0005465988 nova_compute[236126]: 2025-10-02 12:28:14.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:14 np0005465988 nova_compute[236126]: 2025-10-02 12:28:14.726 2 DEBUG nova.compute.manager [req-9a8833ff-0018-498b-9a2f-13c8f7b2f15b req-768177f5-9d26-4688-b22f-d4e22b992fbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:14 np0005465988 nova_compute[236126]: 2025-10-02 12:28:14.727 2 DEBUG oslo_concurrency.lockutils [req-9a8833ff-0018-498b-9a2f-13c8f7b2f15b req-768177f5-9d26-4688-b22f-d4e22b992fbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:14 np0005465988 nova_compute[236126]: 2025-10-02 12:28:14.728 2 DEBUG oslo_concurrency.lockutils [req-9a8833ff-0018-498b-9a2f-13c8f7b2f15b req-768177f5-9d26-4688-b22f-d4e22b992fbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:14 np0005465988 nova_compute[236126]: 2025-10-02 12:28:14.728 2 DEBUG oslo_concurrency.lockutils [req-9a8833ff-0018-498b-9a2f-13c8f7b2f15b req-768177f5-9d26-4688-b22f-d4e22b992fbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:14 np0005465988 nova_compute[236126]: 2025-10-02 12:28:14.729 2 DEBUG nova.compute.manager [req-9a8833ff-0018-498b-9a2f-13c8f7b2f15b req-768177f5-9d26-4688-b22f-d4e22b992fbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:14 np0005465988 nova_compute[236126]: 2025-10-02 12:28:14.729 2 WARNING nova.compute.manager [req-9a8833ff-0018-498b-9a2f-13c8f7b2f15b req-768177f5-9d26-4688-b22f-d4e22b992fbf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state active and task_state resize_finish.#033[00m
Oct  2 08:28:14 np0005465988 podman[283713]: 2025-10-02 12:28:14.781985216 +0000 UTC m=+0.084493482 container create cb554e3d773f6d3b576000bd31c9c7600178d50b27a9288865be800a9259446e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:28:14 np0005465988 podman[283713]: 2025-10-02 12:28:14.744061975 +0000 UTC m=+0.046570231 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:28:14 np0005465988 systemd[1]: Started libpod-conmon-cb554e3d773f6d3b576000bd31c9c7600178d50b27a9288865be800a9259446e.scope.
Oct  2 08:28:14 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:28:14 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d187f0d9f9978bfd3e6adb10c1a9e4c642c815f089abfaf78b96e063d04e472a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:28:14 np0005465988 podman[283713]: 2025-10-02 12:28:14.893149684 +0000 UTC m=+0.195658020 container init cb554e3d773f6d3b576000bd31c9c7600178d50b27a9288865be800a9259446e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0)
Oct  2 08:28:14 np0005465988 podman[283713]: 2025-10-02 12:28:14.903981105 +0000 UTC m=+0.206489391 container start cb554e3d773f6d3b576000bd31c9c7600178d50b27a9288865be800a9259446e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:28:14 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[283747]: [NOTICE]   (283751) : New worker (283754) forked
Oct  2 08:28:14 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[283747]: [NOTICE]   (283751) : Loading success.
Oct  2 08:28:15 np0005465988 nova_compute[236126]: 2025-10-02 12:28:15.594 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 87ebffd5-69af-414b-be5d-67ba42e8cae1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:28:15 np0005465988 nova_compute[236126]: 2025-10-02 12:28:15.595 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408095.5941687, 87ebffd5-69af-414b-be5d-67ba42e8cae1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:15 np0005465988 nova_compute[236126]: 2025-10-02 12:28:15.595 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:28:15 np0005465988 nova_compute[236126]: 2025-10-02 12:28:15.597 2 DEBUG nova.compute.manager [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:28:15 np0005465988 nova_compute[236126]: 2025-10-02 12:28:15.600 2 INFO nova.virt.libvirt.driver [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance running successfully.#033[00m
Oct  2 08:28:15 np0005465988 virtqemud[235689]: argument unsupported: QEMU guest agent is not configured
Oct  2 08:28:15 np0005465988 nova_compute[236126]: 2025-10-02 12:28:15.602 2 DEBUG nova.virt.libvirt.guest [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  2 08:28:15 np0005465988 nova_compute[236126]: 2025-10-02 12:28:15.602 2 DEBUG nova.virt.libvirt.driver [None req-6bfffd0f-0cd4-40d1-b895-df8a537bb5b8 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Oct  2 08:28:15 np0005465988 nova_compute[236126]: 2025-10-02 12:28:15.630 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:15 np0005465988 nova_compute[236126]: 2025-10-02 12:28:15.633 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:28:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:15.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:15 np0005465988 nova_compute[236126]: 2025-10-02 12:28:15.694 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Oct  2 08:28:15 np0005465988 nova_compute[236126]: 2025-10-02 12:28:15.695 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408095.5954008, 87ebffd5-69af-414b-be5d-67ba42e8cae1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:15 np0005465988 nova_compute[236126]: 2025-10-02 12:28:15.695 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] VM Started (Lifecycle Event)#033[00m
Oct  2 08:28:15 np0005465988 nova_compute[236126]: 2025-10-02 12:28:15.725 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:15 np0005465988 nova_compute[236126]: 2025-10-02 12:28:15.730 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:28:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:16.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:16Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a1:c1:78 10.100.0.10
Oct  2 08:28:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:16Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a1:c1:78 10.100.0.10
Oct  2 08:28:17 np0005465988 nova_compute[236126]: 2025-10-02 12:28:17.275 2 DEBUG nova.compute.manager [req-ae8b3484-e825-44db-8d9d-6dda2f394767 req-5ea1a2f7-3be4-49c7-aa13-a783eb89ed42 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:17 np0005465988 nova_compute[236126]: 2025-10-02 12:28:17.276 2 DEBUG oslo_concurrency.lockutils [req-ae8b3484-e825-44db-8d9d-6dda2f394767 req-5ea1a2f7-3be4-49c7-aa13-a783eb89ed42 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:17 np0005465988 nova_compute[236126]: 2025-10-02 12:28:17.276 2 DEBUG oslo_concurrency.lockutils [req-ae8b3484-e825-44db-8d9d-6dda2f394767 req-5ea1a2f7-3be4-49c7-aa13-a783eb89ed42 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:17 np0005465988 nova_compute[236126]: 2025-10-02 12:28:17.276 2 DEBUG oslo_concurrency.lockutils [req-ae8b3484-e825-44db-8d9d-6dda2f394767 req-5ea1a2f7-3be4-49c7-aa13-a783eb89ed42 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:17 np0005465988 nova_compute[236126]: 2025-10-02 12:28:17.277 2 DEBUG nova.compute.manager [req-ae8b3484-e825-44db-8d9d-6dda2f394767 req-5ea1a2f7-3be4-49c7-aa13-a783eb89ed42 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:17 np0005465988 nova_compute[236126]: 2025-10-02 12:28:17.277 2 WARNING nova.compute.manager [req-ae8b3484-e825-44db-8d9d-6dda2f394767 req-5ea1a2f7-3be4-49c7-aa13-a783eb89ed42 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state resized and task_state None.#033[00m
Oct  2 08:28:17 np0005465988 nova_compute[236126]: 2025-10-02 12:28:17.463 2 DEBUG oslo_concurrency.lockutils [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:17 np0005465988 nova_compute[236126]: 2025-10-02 12:28:17.463 2 DEBUG oslo_concurrency.lockutils [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:17 np0005465988 nova_compute[236126]: 2025-10-02 12:28:17.464 2 DEBUG nova.compute.manager [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Going to confirm migration 14 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Oct  2 08:28:17 np0005465988 podman[283789]: 2025-10-02 12:28:17.55615197 +0000 UTC m=+0.078323394 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  2 08:28:17 np0005465988 podman[283790]: 2025-10-02 12:28:17.567994731 +0000 UTC m=+0.084780700 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Oct  2 08:28:17 np0005465988 podman[283788]: 2025-10-02 12:28:17.594239076 +0000 UTC m=+0.128328473 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Oct  2 08:28:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:17.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:28:17 np0005465988 nova_compute[236126]: 2025-10-02 12:28:17.744 2 DEBUG oslo_concurrency.lockutils [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:28:17 np0005465988 nova_compute[236126]: 2025-10-02 12:28:17.745 2 DEBUG oslo_concurrency.lockutils [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquired lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:28:17 np0005465988 nova_compute[236126]: 2025-10-02 12:28:17.745 2 DEBUG nova.network.neutron [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:28:17 np0005465988 nova_compute[236126]: 2025-10-02 12:28:17.745 2 DEBUG nova.objects.instance [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'info_cache' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:28:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:18.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:18 np0005465988 nova_compute[236126]: 2025-10-02 12:28:18.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:18 np0005465988 nova_compute[236126]: 2025-10-02 12:28:18.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:19.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:28:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:20.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:28:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:21.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:21 np0005465988 nova_compute[236126]: 2025-10-02 12:28:21.855 2 DEBUG nova.network.neutron [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating instance_info_cache with network_info: [{"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:28:22 np0005465988 nova_compute[236126]: 2025-10-02 12:28:22.007 2 DEBUG oslo_concurrency.lockutils [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Releasing lock "refresh_cache-87ebffd5-69af-414b-be5d-67ba42e8cae1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:28:22 np0005465988 nova_compute[236126]: 2025-10-02 12:28:22.007 2 DEBUG nova.objects.instance [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'migration_context' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:28:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:22.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:28:23 np0005465988 nova_compute[236126]: 2025-10-02 12:28:23.052 2 DEBUG nova.storage.rbd_utils [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] removing snapshot(nova-resize) on rbd image(87ebffd5-69af-414b-be5d-67ba42e8cae1_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:28:23 np0005465988 nova_compute[236126]: 2025-10-02 12:28:23.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:23 np0005465988 nova_compute[236126]: 2025-10-02 12:28:23.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:28:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:23.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:28:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:24.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e304 e304: 3 total, 3 up, 3 in
Oct  2 08:28:25 np0005465988 nova_compute[236126]: 2025-10-02 12:28:25.527 2 DEBUG oslo_concurrency.lockutils [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:28:25 np0005465988 nova_compute[236126]: 2025-10-02 12:28:25.529 2 DEBUG oslo_concurrency.lockutils [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:28:25 np0005465988 nova_compute[236126]: 2025-10-02 12:28:25.654 2 DEBUG oslo_concurrency.processutils [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:28:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:25.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:26.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:28:26 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2618760061' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:28:26 np0005465988 nova_compute[236126]: 2025-10-02 12:28:26.169 2 DEBUG oslo_concurrency.processutils [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:28:26 np0005465988 nova_compute[236126]: 2025-10-02 12:28:26.175 2 DEBUG nova.compute.provider_tree [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:28:26 np0005465988 nova_compute[236126]: 2025-10-02 12:28:26.278 2 DEBUG nova.scheduler.client.report [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:28:26 np0005465988 nova_compute[236126]: 2025-10-02 12:28:26.864 2 DEBUG oslo_concurrency.lockutils [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 1.335s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:28:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:27.358 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:28:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:27.359 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:28:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:27.360 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:28:27 np0005465988 nova_compute[236126]: 2025-10-02 12:28:27.386 2 INFO nova.scheduler.client.report [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Deleted allocation for migration 915ae6b6-0039-4e4a-a1a3-3723d46e5040
Oct  2 08:28:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:27.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:27 np0005465988 nova_compute[236126]: 2025-10-02 12:28:27.715 2 DEBUG oslo_concurrency.lockutils [None req-538cfb11-f8ca-4819-9c46-aa1dc15de814 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 10.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:28:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e304 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:28:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:28.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:28 np0005465988 nova_compute[236126]: 2025-10-02 12:28:28.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:28:28 np0005465988 nova_compute[236126]: 2025-10-02 12:28:28.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:28:28 np0005465988 podman[283965]: 2025-10-02 12:28:28.545238412 +0000 UTC m=+0.072653391 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:28:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:28Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:0e:b9 10.100.0.6
Oct  2 08:28:29 np0005465988 nova_compute[236126]: 2025-10-02 12:28:29.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:28:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:29.698 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:28:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:29.698 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 08:28:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:29.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:30.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:30 np0005465988 nova_compute[236126]: 2025-10-02 12:28:30.804 2 INFO nova.compute.manager [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Rebuilding instance
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.137 2 DEBUG nova.objects.instance [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1e47e923-75c9-4c8c-b5f3-86f715462a64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.237 2 DEBUG nova.compute.manager [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.342 2 DEBUG nova.objects.instance [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lazy-loading 'pci_requests' on Instance uuid 1e47e923-75c9-4c8c-b5f3-86f715462a64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.374 2 DEBUG nova.objects.instance [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1e47e923-75c9-4c8c-b5f3-86f715462a64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.391 2 DEBUG nova.objects.instance [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lazy-loading 'resources' on Instance uuid 1e47e923-75c9-4c8c-b5f3-86f715462a64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.423 2 DEBUG nova.objects.instance [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lazy-loading 'migration_context' on Instance uuid 1e47e923-75c9-4c8c-b5f3-86f715462a64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.435 2 DEBUG nova.objects.instance [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.440 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.631 2 DEBUG oslo_concurrency.lockutils [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.632 2 DEBUG oslo_concurrency.lockutils [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.633 2 DEBUG oslo_concurrency.lockutils [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.634 2 DEBUG oslo_concurrency.lockutils [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.634 2 DEBUG oslo_concurrency.lockutils [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.636 2 INFO nova.compute.manager [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Terminating instance
Oct  2 08:28:31 np0005465988 nova_compute[236126]: 2025-10-02 12:28:31.638 2 DEBUG nova.compute.manager [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:28:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:31.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:32.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:32 np0005465988 kernel: tap3bdb6970-48 (unregistering): left promiscuous mode
Oct  2 08:28:32 np0005465988 NetworkManager[45041]: <info>  [1759408112.2344] device (tap3bdb6970-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:28:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:32Z|00488|binding|INFO|Releasing lport 3bdb6970-487f-4313-ab25-aa900f8b084a from this chassis (sb_readonly=0)
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:28:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:32Z|00489|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a down in Southbound
Oct  2 08:28:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:32Z|00490|binding|INFO|Removing iface tap3bdb6970-48 ovn-installed in OVS
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:28:32 np0005465988 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000064.scope: Deactivated successfully.
Oct  2 08:28:32 np0005465988 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000064.scope: Consumed 13.932s CPU time.
Oct  2 08:28:32 np0005465988 systemd-machined[192594]: Machine qemu-47-instance-00000064 terminated.
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.319 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:0e:b9 10.100.0.6'], port_security=['fa:16:3e:22:0e:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87ebffd5-69af-414b-be5d-67ba42e8cae1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '14', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3bdb6970-487f-4313-ab25-aa900f8b084a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.320 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdb6970-487f-4313-ab25-aa900f8b084a in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff unbound from our chassis
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.321 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.322 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1eec3222-1672-40d0-9ba6-583d1721ea72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.323 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff namespace which is not needed anymore
Oct  2 08:28:32 np0005465988 kernel: tap3bdb6970-48: entered promiscuous mode
Oct  2 08:28:32 np0005465988 NetworkManager[45041]: <info>  [1759408112.4692] manager: (tap3bdb6970-48): new Tun device (/org/freedesktop/NetworkManager/Devices/227)
Oct  2 08:28:32 np0005465988 kernel: tap3bdb6970-48 (unregistering): left promiscuous mode
Oct  2 08:28:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:32Z|00491|binding|INFO|Claiming lport 3bdb6970-487f-4313-ab25-aa900f8b084a for this chassis.
Oct  2 08:28:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:32Z|00492|binding|INFO|3bdb6970-487f-4313-ab25-aa900f8b084a: Claiming fa:16:3e:22:0e:b9 10.100.0.6
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.514 2 INFO nova.virt.libvirt.driver [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Instance destroyed successfully.
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.515 2 DEBUG nova.objects.instance [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'resources' on Instance uuid 87ebffd5-69af-414b-be5d-67ba42e8cae1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:28:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:32Z|00493|binding|INFO|Setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a ovn-installed in OVS
Oct  2 08:28:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:32Z|00494|if_status|INFO|Dropped 13 log messages in last 201 seconds (most recently, 192 seconds ago) due to excessive rate
Oct  2 08:28:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:32Z|00495|if_status|INFO|Not setting lport 3bdb6970-487f-4313-ab25-aa900f8b084a down as sb is readonly
Oct  2 08:28:32 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[283747]: [NOTICE]   (283751) : haproxy version is 2.8.14-c23fe91
Oct  2 08:28:32 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[283747]: [NOTICE]   (283751) : path to executable is /usr/sbin/haproxy
Oct  2 08:28:32 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[283747]: [WARNING]  (283751) : Exiting Master process...
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:28:32 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[283747]: [ALERT]    (283751) : Current worker (283754) exited with code 143 (Terminated)
Oct  2 08:28:32 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[283747]: [WARNING]  (283751) : All workers exited. Exiting... (0)
Oct  2 08:28:32 np0005465988 systemd[1]: libpod-cb554e3d773f6d3b576000bd31c9c7600178d50b27a9288865be800a9259446e.scope: Deactivated successfully.
Oct  2 08:28:32 np0005465988 podman[284009]: 2025-10-02 12:28:32.531847895 +0000 UTC m=+0.088202628 container died cb554e3d773f6d3b576000bd31c9c7600178d50b27a9288865be800a9259446e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:28:32 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb554e3d773f6d3b576000bd31c9c7600178d50b27a9288865be800a9259446e-userdata-shm.mount: Deactivated successfully.
Oct  2 08:28:32 np0005465988 systemd[1]: var-lib-containers-storage-overlay-d187f0d9f9978bfd3e6adb10c1a9e4c642c815f089abfaf78b96e063d04e472a-merged.mount: Deactivated successfully.
Oct  2 08:28:32 np0005465988 podman[284009]: 2025-10-02 12:28:32.584904391 +0000 UTC m=+0.141259124 container cleanup cb554e3d773f6d3b576000bd31c9c7600178d50b27a9288865be800a9259446e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:28:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:32Z|00496|binding|INFO|Releasing lport 3bdb6970-487f-4313-ab25-aa900f8b084a from this chassis (sb_readonly=0)
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.594 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:0e:b9 10.100.0.6'], port_security=['fa:16:3e:22:0e:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87ebffd5-69af-414b-be5d-67ba42e8cae1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '14', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3bdb6970-487f-4313-ab25-aa900f8b084a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:28:32 np0005465988 systemd[1]: libpod-conmon-cb554e3d773f6d3b576000bd31c9c7600178d50b27a9288865be800a9259446e.scope: Deactivated successfully.
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:28:32 np0005465988 podman[284044]: 2025-10-02 12:28:32.663148522 +0000 UTC m=+0.049459094 container remove cb554e3d773f6d3b576000bd31c9c7600178d50b27a9288865be800a9259446e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.674 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9265a44b-b37e-444a-a37a-5c28819153ac]: (4, ('Thu Oct  2 12:28:32 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff (cb554e3d773f6d3b576000bd31c9c7600178d50b27a9288865be800a9259446e)\ncb554e3d773f6d3b576000bd31c9c7600178d50b27a9288865be800a9259446e\nThu Oct  2 12:28:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff (cb554e3d773f6d3b576000bd31c9c7600178d50b27a9288865be800a9259446e)\ncb554e3d773f6d3b576000bd31c9c7600178d50b27a9288865be800a9259446e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.677 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8b6db9ae-d87d-40d6-8c31-d76203d064a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.678 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:32 np0005465988 kernel: tapb2c62a66-f0: left promiscuous mode
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.704 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9030cdfd-8dba-46b0-b569-fe2eb0e0bb48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e304 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.749 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[74ed29e4-cf35-4deb-b83e-3a06ded1ef04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.751 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5aba1c4f-f62a-4a4c-bf17-5e60b01a124f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.775 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[063fd61f-ed17-40f6-898b-da7b7c74cae3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 615429, 'reachable_time': 29576, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284062, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.778 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.779 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[1747d62a-df24-43b1-aa14-e2bc8858aec5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.780 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdb6970-487f-4313-ab25-aa900f8b084a in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff bound to our chassis#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.781 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff#033[00m
Oct  2 08:28:32 np0005465988 systemd[1]: run-netns-ovnmeta\x2db2c62a66\x2df9bc\x2d4a45\x2da843\x2daef2e12a7fff.mount: Deactivated successfully.
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.799 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a289ab78-fd6c-4d89-adb4-866bc675a2e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.800 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2c62a66-f1 in ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.804 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2c62a66-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.805 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1deab3c5-0ee9-4da2-a85c-7e546e690710]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.805 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e6c22c3e-50b8-4246-9801-9dc34a47296d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.824 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[c44ba07b-366a-4cfb-a4d8-ff692388e320]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.846 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:0e:b9 10.100.0.6'], port_security=['fa:16:3e:22:0e:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87ebffd5-69af-414b-be5d-67ba42e8cae1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '14', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3bdb6970-487f-4313-ab25-aa900f8b084a) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.847 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[05948223-7f94-4498-8b2b-429618adb4fa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.877 2 DEBUG nova.virt.libvirt.vif [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-131502281',display_name='tempest-ServerActionsTestJSON-server-131502281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-131502281',id=100,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:28:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-dkbpwsb7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:28:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=87ebffd5-69af-414b-be5d-67ba42e8cae1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.878 2 DEBUG nova.network.os_vif_util [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "3bdb6970-487f-4313-ab25-aa900f8b084a", "address": "fa:16:3e:22:0e:b9", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdb6970-48", "ovs_interfaceid": "3bdb6970-487f-4313-ab25-aa900f8b084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.880 2 DEBUG nova.network.os_vif_util [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.881 2 DEBUG os_vif [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.884 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bdb6970-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.888 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[1c721424-7d6e-4e5c-86ec-f46622876517]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.893 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[72e0f124-2a48-409e-a2d2-9f7edb8ac3d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 nova_compute[236126]: 2025-10-02 12:28:32.894 2 INFO os_vif [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:0e:b9,bridge_name='br-int',has_traffic_filtering=True,id=3bdb6970-487f-4313-ab25-aa900f8b084a,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdb6970-48')#033[00m
Oct  2 08:28:32 np0005465988 NetworkManager[45041]: <info>  [1759408112.8955] manager: (tapb2c62a66-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/228)
Oct  2 08:28:32 np0005465988 systemd-udevd[283989]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.947 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9cf04256-8872-42d2-8782-caeca8abdf64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.949 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[3d90f34f-ad53-4c80-804a-e6f5fe63e42b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:32 np0005465988 NetworkManager[45041]: <info>  [1759408112.9730] device (tapb2c62a66-f0): carrier: link connected
Oct  2 08:28:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:32.982 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f74f00fc-46ba-414d-bbd8-728f5624d9dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.002 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1c02de25-9002-4d57-9b92-aa72dc8697a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 617330, 'reachable_time': 39675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284105, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.015 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[742cba63-06bf-44d5-9398-fe85abb05a96]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:7a7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 617330, 'tstamp': 617330}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 284106, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.029 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[32cefee7-6562-48cb-a37f-8cb30386e78e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 617330, 'reachable_time': 39675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 284107, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.065 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f450ea41-6b6d-48f8-87a8-44e84683e517]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.123 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d9680a74-a628-4468-8f89-f02f88b3d4a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.125 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.126 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.126 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2c62a66-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:33 np0005465988 nova_compute[236126]: 2025-10-02 12:28:33.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:33 np0005465988 kernel: tapb2c62a66-f0: entered promiscuous mode
Oct  2 08:28:33 np0005465988 NetworkManager[45041]: <info>  [1759408113.1298] manager: (tapb2c62a66-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.133 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2c62a66-f0, col_values=(('external_ids', {'iface-id': '8c1234f6-f595-4979-94c0-98da2211f0e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:33 np0005465988 nova_compute[236126]: 2025-10-02 12:28:33.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:33 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:33Z|00497|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:28:33 np0005465988 nova_compute[236126]: 2025-10-02 12:28:33.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:33 np0005465988 nova_compute[236126]: 2025-10-02 12:28:33.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.169 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.169 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[67220976-3f35-4a6d-b5c1-26482538eed7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.170 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.171 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'env', 'PROCESS_TAG=haproxy-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:28:33 np0005465988 nova_compute[236126]: 2025-10-02 12:28:33.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 e305: 3 total, 3 up, 3 in
Oct  2 08:28:33 np0005465988 podman[284139]: 2025-10-02 12:28:33.60894815 +0000 UTC m=+0.087236710 container create 91adfba8f544f7a368964383e40ae352151a7d70118e51db321e7fa7c02b538b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:28:33 np0005465988 systemd[1]: Started libpod-conmon-91adfba8f544f7a368964383e40ae352151a7d70118e51db321e7fa7c02b538b.scope.
Oct  2 08:28:33 np0005465988 podman[284139]: 2025-10-02 12:28:33.569701441 +0000 UTC m=+0.047990031 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:28:33 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:28:33 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3c7e96b77e225d04c4cf21e8753bf32cdfabfab5ef1ac8f632ff2fc5af7240b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:28:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:33.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:33 np0005465988 podman[284139]: 2025-10-02 12:28:33.731609569 +0000 UTC m=+0.209898179 container init 91adfba8f544f7a368964383e40ae352151a7d70118e51db321e7fa7c02b538b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:28:33 np0005465988 podman[284139]: 2025-10-02 12:28:33.737678753 +0000 UTC m=+0.215967313 container start 91adfba8f544f7a368964383e40ae352151a7d70118e51db321e7fa7c02b538b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:28:33 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[284154]: [NOTICE]   (284158) : New worker (284160) forked
Oct  2 08:28:33 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[284154]: [NOTICE]   (284158) : Loading success.
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.802 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdb6970-487f-4313-ab25-aa900f8b084a in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff unbound from our chassis#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.806 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.808 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[46760172-efa8-4dbf-bd30-d82082811d50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:33.809 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff namespace which is not needed anymore#033[00m
Oct  2 08:28:34 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[284154]: [NOTICE]   (284158) : haproxy version is 2.8.14-c23fe91
Oct  2 08:28:34 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[284154]: [NOTICE]   (284158) : path to executable is /usr/sbin/haproxy
Oct  2 08:28:34 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[284154]: [WARNING]  (284158) : Exiting Master process...
Oct  2 08:28:34 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[284154]: [ALERT]    (284158) : Current worker (284160) exited with code 143 (Terminated)
Oct  2 08:28:34 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[284154]: [WARNING]  (284158) : All workers exited. Exiting... (0)
Oct  2 08:28:34 np0005465988 systemd[1]: libpod-91adfba8f544f7a368964383e40ae352151a7d70118e51db321e7fa7c02b538b.scope: Deactivated successfully.
Oct  2 08:28:34 np0005465988 podman[284187]: 2025-10-02 12:28:34.013664363 +0000 UTC m=+0.070059977 container died 91adfba8f544f7a368964383e40ae352151a7d70118e51db321e7fa7c02b538b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:28:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:34.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:34 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-91adfba8f544f7a368964383e40ae352151a7d70118e51db321e7fa7c02b538b-userdata-shm.mount: Deactivated successfully.
Oct  2 08:28:34 np0005465988 systemd[1]: var-lib-containers-storage-overlay-c3c7e96b77e225d04c4cf21e8753bf32cdfabfab5ef1ac8f632ff2fc5af7240b-merged.mount: Deactivated successfully.
Oct  2 08:28:34 np0005465988 podman[284187]: 2025-10-02 12:28:34.060115439 +0000 UTC m=+0.116511053 container cleanup 91adfba8f544f7a368964383e40ae352151a7d70118e51db321e7fa7c02b538b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:28:34 np0005465988 systemd[1]: libpod-conmon-91adfba8f544f7a368964383e40ae352151a7d70118e51db321e7fa7c02b538b.scope: Deactivated successfully.
Oct  2 08:28:34 np0005465988 podman[284215]: 2025-10-02 12:28:34.132208533 +0000 UTC m=+0.046872510 container remove 91adfba8f544f7a368964383e40ae352151a7d70118e51db321e7fa7c02b538b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:28:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:34.143 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[24166084-f9da-4c08-b99f-2d6cc41fc1cc]: (4, ('Thu Oct  2 12:28:33 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff (91adfba8f544f7a368964383e40ae352151a7d70118e51db321e7fa7c02b538b)\n91adfba8f544f7a368964383e40ae352151a7d70118e51db321e7fa7c02b538b\nThu Oct  2 12:28:34 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff (91adfba8f544f7a368964383e40ae352151a7d70118e51db321e7fa7c02b538b)\n91adfba8f544f7a368964383e40ae352151a7d70118e51db321e7fa7c02b538b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:34.145 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0455da4e-2215-4452-8931-a3fb71cce4bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:34.147 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:34 np0005465988 nova_compute[236126]: 2025-10-02 12:28:34.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:34 np0005465988 kernel: tapb2c62a66-f0: left promiscuous mode
Oct  2 08:28:34 np0005465988 nova_compute[236126]: 2025-10-02 12:28:34.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:34.183 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4e8826fd-c9f9-45c5-b1d8-8718e46f3c46]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:34.199 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d13bfc-c025-4e4e-baac-6aaeb98bba45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:34.201 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ba2043d6-7f24-4fff-a0d4-69b161e8602c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:34.226 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[90076efa-8bba-4158-b13b-b8fa977897f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 617321, 'reachable_time': 30606, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284231, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:34.229 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:28:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:34.229 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[68f48bfb-cf46-433e-8656-f99f7faf09e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:34 np0005465988 systemd[1]: run-netns-ovnmeta\x2db2c62a66\x2df9bc\x2d4a45\x2da843\x2daef2e12a7fff.mount: Deactivated successfully.
Oct  2 08:28:34 np0005465988 nova_compute[236126]: 2025-10-02 12:28:34.385 2 DEBUG nova.compute.manager [req-c0f3008b-ad59-4c94-82bb-2e01f4f8b13e req-82c62287-dc4d-482a-8185-fa96cb1fbd7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:34 np0005465988 nova_compute[236126]: 2025-10-02 12:28:34.385 2 DEBUG oslo_concurrency.lockutils [req-c0f3008b-ad59-4c94-82bb-2e01f4f8b13e req-82c62287-dc4d-482a-8185-fa96cb1fbd7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:34 np0005465988 nova_compute[236126]: 2025-10-02 12:28:34.386 2 DEBUG oslo_concurrency.lockutils [req-c0f3008b-ad59-4c94-82bb-2e01f4f8b13e req-82c62287-dc4d-482a-8185-fa96cb1fbd7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:34 np0005465988 nova_compute[236126]: 2025-10-02 12:28:34.386 2 DEBUG oslo_concurrency.lockutils [req-c0f3008b-ad59-4c94-82bb-2e01f4f8b13e req-82c62287-dc4d-482a-8185-fa96cb1fbd7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:34 np0005465988 nova_compute[236126]: 2025-10-02 12:28:34.386 2 DEBUG nova.compute.manager [req-c0f3008b-ad59-4c94-82bb-2e01f4f8b13e req-82c62287-dc4d-482a-8185-fa96cb1fbd7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:34 np0005465988 nova_compute[236126]: 2025-10-02 12:28:34.387 2 DEBUG nova.compute.manager [req-c0f3008b-ad59-4c94-82bb-2e01f4f8b13e req-82c62287-dc4d-482a-8185-fa96cb1fbd7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-unplugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:28:35 np0005465988 kernel: tap4abf20c2-f6 (unregistering): left promiscuous mode
Oct  2 08:28:35 np0005465988 NetworkManager[45041]: <info>  [1759408115.1300] device (tap4abf20c2-f6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:28:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:35Z|00498|binding|INFO|Releasing lport 4abf20c2-f65e-479c-8fe9-62982b2fa096 from this chassis (sb_readonly=0)
Oct  2 08:28:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:35Z|00499|binding|INFO|Setting lport 4abf20c2-f65e-479c-8fe9-62982b2fa096 down in Southbound
Oct  2 08:28:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:28:35Z|00500|binding|INFO|Removing iface tap4abf20c2-f6 ovn-installed in OVS
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:35 np0005465988 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000006c.scope: Deactivated successfully.
Oct  2 08:28:35 np0005465988 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000006c.scope: Consumed 14.544s CPU time.
Oct  2 08:28:35 np0005465988 systemd-machined[192594]: Machine qemu-46-instance-0000006c terminated.
Oct  2 08:28:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:35.337 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:c1:78 10.100.0.10'], port_security=['fa:16:3e:a1:c1:78 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '1e47e923-75c9-4c8c-b5f3-86f715462a64', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-454203c0-2170-41e3-a903-014f7d235b68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7aad01ba5df14c1a9309451d0daaab83', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6993526b-112b-49f0-b6c4-5993fe406fdf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.183'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=825ba480-e228-47f3-8e23-c895bd3a1194, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=4abf20c2-f65e-479c-8fe9-62982b2fa096) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:28:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:35.339 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 4abf20c2-f65e-479c-8fe9-62982b2fa096 in datapath 454203c0-2170-41e3-a903-014f7d235b68 unbound from our chassis#033[00m
Oct  2 08:28:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:35.343 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 454203c0-2170-41e3-a903-014f7d235b68, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:28:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:35.344 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d02f1e2c-f643-4ab4-acaa-6ff4836eff8b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:35.345 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-454203c0-2170-41e3-a903-014f7d235b68 namespace which is not needed anymore#033[00m
Oct  2 08:28:35 np0005465988 NetworkManager[45041]: <info>  [1759408115.3713] manager: (tap4abf20c2-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/230)
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.473 2 INFO nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Instance shutdown successfully after 4 seconds.#033[00m
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.478 2 INFO nova.virt.libvirt.driver [-] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Instance destroyed successfully.#033[00m
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.483 2 INFO nova.virt.libvirt.driver [-] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Instance destroyed successfully.#033[00m
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.484 2 DEBUG nova.virt.libvirt.vif [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:27:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-2141659460',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-1358372400',id=108,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJsc13H1E1twasivn4l5fIknm6WxTVFzYRGRsGTsk9ZC9W5y41hzk124eOR+2RfREafyMEhaDPvhujygatWcUNbG54vVZEPcxsJp0CaKKZ+PeWEtzqfW2Ozb0JbEn7mctQ==',key_name='tempest-keypair-294989539',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:28:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7aad01ba5df14c1a9309451d0daaab83',ramdisk_id='',reservation_id='r-pdakyd3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-ServerActionsV293TestJSON-1846432011',owner_user_name='tempest-ServerActionsV293TestJSON-1846432011-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:28:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c6bda06d7e6348bab069e07b21022b60',uuid=1e47e923-75c9-4c8c-b5f3-86f715462a64,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.485 2 DEBUG nova.network.os_vif_util [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Converting VIF {"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.485 2 DEBUG nova.network.os_vif_util [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a1:c1:78,bridge_name='br-int',has_traffic_filtering=True,id=4abf20c2-f65e-479c-8fe9-62982b2fa096,network=Network(454203c0-2170-41e3-a903-014f7d235b68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4abf20c2-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.486 2 DEBUG os_vif [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a1:c1:78,bridge_name='br-int',has_traffic_filtering=True,id=4abf20c2-f65e-479c-8fe9-62982b2fa096,network=Network(454203c0-2170-41e3-a903-014f7d235b68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4abf20c2-f6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.488 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4abf20c2-f6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.495 2 INFO os_vif [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a1:c1:78,bridge_name='br-int',has_traffic_filtering=True,id=4abf20c2-f65e-479c-8fe9-62982b2fa096,network=Network(454203c0-2170-41e3-a903-014f7d235b68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4abf20c2-f6')#033[00m
Oct  2 08:28:35 np0005465988 neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68[283294]: [NOTICE]   (283298) : haproxy version is 2.8.14-c23fe91
Oct  2 08:28:35 np0005465988 neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68[283294]: [NOTICE]   (283298) : path to executable is /usr/sbin/haproxy
Oct  2 08:28:35 np0005465988 neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68[283294]: [WARNING]  (283298) : Exiting Master process...
Oct  2 08:28:35 np0005465988 neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68[283294]: [ALERT]    (283298) : Current worker (283300) exited with code 143 (Terminated)
Oct  2 08:28:35 np0005465988 neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68[283294]: [WARNING]  (283298) : All workers exited. Exiting... (0)
Oct  2 08:28:35 np0005465988 systemd[1]: libpod-b51dd6d3a0c7bba6e58ec5224569e5aea36eeb964fd70eaabb3ad4f72886b95e.scope: Deactivated successfully.
Oct  2 08:28:35 np0005465988 podman[284264]: 2025-10-02 12:28:35.549226646 +0000 UTC m=+0.064611970 container died b51dd6d3a0c7bba6e58ec5224569e5aea36eeb964fd70eaabb3ad4f72886b95e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:28:35 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b51dd6d3a0c7bba6e58ec5224569e5aea36eeb964fd70eaabb3ad4f72886b95e-userdata-shm.mount: Deactivated successfully.
Oct  2 08:28:35 np0005465988 systemd[1]: var-lib-containers-storage-overlay-f1a88f9267b229884fd6a36e2416c5b138a03e422ca24e731d83993c029a098b-merged.mount: Deactivated successfully.
Oct  2 08:28:35 np0005465988 podman[284264]: 2025-10-02 12:28:35.61297748 +0000 UTC m=+0.128362794 container cleanup b51dd6d3a0c7bba6e58ec5224569e5aea36eeb964fd70eaabb3ad4f72886b95e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:28:35 np0005465988 systemd[1]: libpod-conmon-b51dd6d3a0c7bba6e58ec5224569e5aea36eeb964fd70eaabb3ad4f72886b95e.scope: Deactivated successfully.
Oct  2 08:28:35 np0005465988 podman[284312]: 2025-10-02 12:28:35.69468546 +0000 UTC m=+0.050443232 container remove b51dd6d3a0c7bba6e58ec5224569e5aea36eeb964fd70eaabb3ad4f72886b95e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:28:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:35.704 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[56b650d5-92c0-411a-b162-63502e1436c0]: (4, ('Thu Oct  2 12:28:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68 (b51dd6d3a0c7bba6e58ec5224569e5aea36eeb964fd70eaabb3ad4f72886b95e)\nb51dd6d3a0c7bba6e58ec5224569e5aea36eeb964fd70eaabb3ad4f72886b95e\nThu Oct  2 12:28:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68 (b51dd6d3a0c7bba6e58ec5224569e5aea36eeb964fd70eaabb3ad4f72886b95e)\nb51dd6d3a0c7bba6e58ec5224569e5aea36eeb964fd70eaabb3ad4f72886b95e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:35.707 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[de3f6974-17b9-4697-a455-b42f6c68421d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:35.708 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap454203c0-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:35 np0005465988 kernel: tap454203c0-20: left promiscuous mode
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:35.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:35.718 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1983d042-cb74-451a-8602-0fe985a0cf18]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:35 np0005465988 nova_compute[236126]: 2025-10-02 12:28:35.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:35.754 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[73736404-66a2-4e95-adfb-0d07aa15b2b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:35.755 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[56a4f2b9-a6da-49b7-899b-6a0bd064e6f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:35.780 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9744ccef-fb84-4220-9484-02907c88b9ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614151, 'reachable_time': 23097, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284327, 'error': None, 'target': 'ovnmeta-454203c0-2170-41e3-a903-014f7d235b68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:35 np0005465988 systemd[1]: run-netns-ovnmeta\x2d454203c0\x2d2170\x2d41e3\x2da903\x2d014f7d235b68.mount: Deactivated successfully.
Oct  2 08:28:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:35.786 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-454203c0-2170-41e3-a903-014f7d235b68 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:28:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:35.787 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[2e671cb4-11d7-4bf7-a51a-19ec38ee22c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:28:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:36.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:28:36 np0005465988 nova_compute[236126]: 2025-10-02 12:28:36.551 2 INFO nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Deleting instance files /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64_del#033[00m
Oct  2 08:28:36 np0005465988 nova_compute[236126]: 2025-10-02 12:28:36.552 2 INFO nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Deletion of /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64_del complete#033[00m
Oct  2 08:28:36 np0005465988 nova_compute[236126]: 2025-10-02 12:28:36.836 2 DEBUG nova.compute.manager [req-5e3cb22d-f4d5-4f19-a060-87ea30203fc8 req-3a77882a-fe1d-43dd-8c1f-5a6914775261 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:36 np0005465988 nova_compute[236126]: 2025-10-02 12:28:36.836 2 DEBUG oslo_concurrency.lockutils [req-5e3cb22d-f4d5-4f19-a060-87ea30203fc8 req-3a77882a-fe1d-43dd-8c1f-5a6914775261 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:36 np0005465988 nova_compute[236126]: 2025-10-02 12:28:36.837 2 DEBUG oslo_concurrency.lockutils [req-5e3cb22d-f4d5-4f19-a060-87ea30203fc8 req-3a77882a-fe1d-43dd-8c1f-5a6914775261 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:36 np0005465988 nova_compute[236126]: 2025-10-02 12:28:36.837 2 DEBUG oslo_concurrency.lockutils [req-5e3cb22d-f4d5-4f19-a060-87ea30203fc8 req-3a77882a-fe1d-43dd-8c1f-5a6914775261 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:36 np0005465988 nova_compute[236126]: 2025-10-02 12:28:36.838 2 DEBUG nova.compute.manager [req-5e3cb22d-f4d5-4f19-a060-87ea30203fc8 req-3a77882a-fe1d-43dd-8c1f-5a6914775261 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] No waiting events found dispatching network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:36 np0005465988 nova_compute[236126]: 2025-10-02 12:28:36.838 2 WARNING nova.compute.manager [req-5e3cb22d-f4d5-4f19-a060-87ea30203fc8 req-3a77882a-fe1d-43dd-8c1f-5a6914775261 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received unexpected event network-vif-plugged-3bdb6970-487f-4313-ab25-aa900f8b084a for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:28:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:37.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:28:37 np0005465988 nova_compute[236126]: 2025-10-02 12:28:37.754 2 INFO nova.virt.libvirt.driver [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Deleting instance files /var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1_del#033[00m
Oct  2 08:28:37 np0005465988 nova_compute[236126]: 2025-10-02 12:28:37.754 2 INFO nova.virt.libvirt.driver [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Deletion of /var/lib/nova/instances/87ebffd5-69af-414b-be5d-67ba42e8cae1_del complete#033[00m
Oct  2 08:28:37 np0005465988 nova_compute[236126]: 2025-10-02 12:28:37.775 2 WARNING nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] During detach_volume, instance disappeared.: nova.exception.InstanceNotFound: Instance 1e47e923-75c9-4c8c-b5f3-86f715462a64 could not be found.#033[00m
Oct  2 08:28:37 np0005465988 nova_compute[236126]: 2025-10-02 12:28:37.907 2 INFO nova.compute.manager [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Took 6.27 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:28:37 np0005465988 nova_compute[236126]: 2025-10-02 12:28:37.908 2 DEBUG oslo.service.loopingcall [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:28:37 np0005465988 nova_compute[236126]: 2025-10-02 12:28:37.908 2 DEBUG nova.compute.manager [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:28:37 np0005465988 nova_compute[236126]: 2025-10-02 12:28:37.909 2 DEBUG nova.network.neutron [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:28:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:38.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:38 np0005465988 nova_compute[236126]: 2025-10-02 12:28:38.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:39 np0005465988 nova_compute[236126]: 2025-10-02 12:28:39.371 2 DEBUG nova.compute.manager [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Preparing to wait for external event volume-reimaged-4c9d8719-7ebc-4171-8171-a0e55d9f5992 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:28:39 np0005465988 nova_compute[236126]: 2025-10-02 12:28:39.372 2 DEBUG oslo_concurrency.lockutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Acquiring lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:39 np0005465988 nova_compute[236126]: 2025-10-02 12:28:39.372 2 DEBUG oslo_concurrency.lockutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:39 np0005465988 nova_compute[236126]: 2025-10-02 12:28:39.372 2 DEBUG oslo_concurrency.lockutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:28:39.702 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:39.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:40.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:40 np0005465988 nova_compute[236126]: 2025-10-02 12:28:40.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:28:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:41.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:28:41 np0005465988 nova_compute[236126]: 2025-10-02 12:28:41.735 2 DEBUG nova.compute.manager [req-0c2a436e-69e6-4267-8b32-a51d75c7ce05 req-9479f5d6-ff38-4842-ab96-d76fb536cfc4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received event network-vif-unplugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:41 np0005465988 nova_compute[236126]: 2025-10-02 12:28:41.735 2 DEBUG oslo_concurrency.lockutils [req-0c2a436e-69e6-4267-8b32-a51d75c7ce05 req-9479f5d6-ff38-4842-ab96-d76fb536cfc4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:41 np0005465988 nova_compute[236126]: 2025-10-02 12:28:41.736 2 DEBUG oslo_concurrency.lockutils [req-0c2a436e-69e6-4267-8b32-a51d75c7ce05 req-9479f5d6-ff38-4842-ab96-d76fb536cfc4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:41 np0005465988 nova_compute[236126]: 2025-10-02 12:28:41.736 2 DEBUG oslo_concurrency.lockutils [req-0c2a436e-69e6-4267-8b32-a51d75c7ce05 req-9479f5d6-ff38-4842-ab96-d76fb536cfc4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:41 np0005465988 nova_compute[236126]: 2025-10-02 12:28:41.736 2 DEBUG nova.compute.manager [req-0c2a436e-69e6-4267-8b32-a51d75c7ce05 req-9479f5d6-ff38-4842-ab96-d76fb536cfc4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] No event matching network-vif-unplugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 in dict_keys([('volume-reimaged', '4c9d8719-7ebc-4171-8171-a0e55d9f5992')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  2 08:28:41 np0005465988 nova_compute[236126]: 2025-10-02 12:28:41.737 2 WARNING nova.compute.manager [req-0c2a436e-69e6-4267-8b32-a51d75c7ce05 req-9479f5d6-ff38-4842-ab96-d76fb536cfc4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received unexpected event network-vif-unplugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 for instance with vm_state active and task_state rebuilding.#033[00m
Oct  2 08:28:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:42.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:28:43 np0005465988 nova_compute[236126]: 2025-10-02 12:28:43.020 2 DEBUG nova.network.neutron [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:28:43 np0005465988 nova_compute[236126]: 2025-10-02 12:28:43.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:43 np0005465988 nova_compute[236126]: 2025-10-02 12:28:43.458 2 INFO nova.compute.manager [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Took 5.55 seconds to deallocate network for instance.#033[00m
Oct  2 08:28:43 np0005465988 nova_compute[236126]: 2025-10-02 12:28:43.469 2 DEBUG nova.compute.manager [req-fd471072-eb25-4ae7-b245-070a05b33bc9 req-770450f6-0b26-4874-a25c-84cd79f4d2f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Received event network-vif-deleted-3bdb6970-487f-4313-ab25-aa900f8b084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:43 np0005465988 nova_compute[236126]: 2025-10-02 12:28:43.469 2 INFO nova.compute.manager [req-fd471072-eb25-4ae7-b245-070a05b33bc9 req-770450f6-0b26-4874-a25c-84cd79f4d2f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Neutron deleted interface 3bdb6970-487f-4313-ab25-aa900f8b084a; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:28:43 np0005465988 nova_compute[236126]: 2025-10-02 12:28:43.469 2 DEBUG nova.network.neutron [req-fd471072-eb25-4ae7-b245-070a05b33bc9 req-770450f6-0b26-4874-a25c-84cd79f4d2f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:28:43 np0005465988 nova_compute[236126]: 2025-10-02 12:28:43.660 2 DEBUG nova.compute.manager [req-fd471072-eb25-4ae7-b245-070a05b33bc9 req-770450f6-0b26-4874-a25c-84cd79f4d2f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Detach interface failed, port_id=3bdb6970-487f-4313-ab25-aa900f8b084a, reason: Instance 87ebffd5-69af-414b-be5d-67ba42e8cae1 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:28:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:43.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:43 np0005465988 nova_compute[236126]: 2025-10-02 12:28:43.889 2 DEBUG oslo_concurrency.lockutils [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:43 np0005465988 nova_compute[236126]: 2025-10-02 12:28:43.889 2 DEBUG oslo_concurrency.lockutils [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:43 np0005465988 nova_compute[236126]: 2025-10-02 12:28:43.895 2 DEBUG oslo_concurrency.lockutils [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:44.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:44 np0005465988 nova_compute[236126]: 2025-10-02 12:28:44.173 2 DEBUG nova.compute.manager [req-0648d468-95c6-4f30-870f-a68a1f677736 req-7b14c1a5-95d7-4730-950b-ec93863ad01c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received event network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:44 np0005465988 nova_compute[236126]: 2025-10-02 12:28:44.173 2 DEBUG oslo_concurrency.lockutils [req-0648d468-95c6-4f30-870f-a68a1f677736 req-7b14c1a5-95d7-4730-950b-ec93863ad01c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:44 np0005465988 nova_compute[236126]: 2025-10-02 12:28:44.173 2 DEBUG oslo_concurrency.lockutils [req-0648d468-95c6-4f30-870f-a68a1f677736 req-7b14c1a5-95d7-4730-950b-ec93863ad01c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:44 np0005465988 nova_compute[236126]: 2025-10-02 12:28:44.174 2 DEBUG oslo_concurrency.lockutils [req-0648d468-95c6-4f30-870f-a68a1f677736 req-7b14c1a5-95d7-4730-950b-ec93863ad01c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:44 np0005465988 nova_compute[236126]: 2025-10-02 12:28:44.174 2 DEBUG nova.compute.manager [req-0648d468-95c6-4f30-870f-a68a1f677736 req-7b14c1a5-95d7-4730-950b-ec93863ad01c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] No event matching network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 in dict_keys([('volume-reimaged', '4c9d8719-7ebc-4171-8171-a0e55d9f5992')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  2 08:28:44 np0005465988 nova_compute[236126]: 2025-10-02 12:28:44.174 2 WARNING nova.compute.manager [req-0648d468-95c6-4f30-870f-a68a1f677736 req-7b14c1a5-95d7-4730-950b-ec93863ad01c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received unexpected event network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 for instance with vm_state active and task_state rebuilding.#033[00m
Oct  2 08:28:44 np0005465988 nova_compute[236126]: 2025-10-02 12:28:44.199 2 INFO nova.scheduler.client.report [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Deleted allocations for instance 87ebffd5-69af-414b-be5d-67ba42e8cae1#033[00m
Oct  2 08:28:44 np0005465988 nova_compute[236126]: 2025-10-02 12:28:44.407 2 DEBUG oslo_concurrency.lockutils [None req-df115e0e-d600-4cb4-871a-fcacf915fe5b 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "87ebffd5-69af-414b-be5d-67ba42e8cae1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 12.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:45 np0005465988 nova_compute[236126]: 2025-10-02 12:28:45.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:45.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:46.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:47 np0005465988 nova_compute[236126]: 2025-10-02 12:28:47.511 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408112.5095859, 87ebffd5-69af-414b-be5d-67ba42e8cae1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:47 np0005465988 nova_compute[236126]: 2025-10-02 12:28:47.511 2 INFO nova.compute.manager [-] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:28:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:47.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:28:47 np0005465988 nova_compute[236126]: 2025-10-02 12:28:47.955 2 DEBUG nova.compute.manager [None req-00707660-38ec-4bc9-bdb5-92c677a4a922 - - - - - -] [instance: 87ebffd5-69af-414b-be5d-67ba42e8cae1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:48.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:48 np0005465988 nova_compute[236126]: 2025-10-02 12:28:48.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:48 np0005465988 podman[284388]: 2025-10-02 12:28:48.592218917 +0000 UTC m=+0.096116686 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001)
Oct  2 08:28:48 np0005465988 podman[284389]: 2025-10-02 12:28:48.59614661 +0000 UTC m=+0.095234321 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, managed_by=edpm_ansible)
Oct  2 08:28:48 np0005465988 podman[284387]: 2025-10-02 12:28:48.625210326 +0000 UTC m=+0.131055181 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  2 08:28:49 np0005465988 nova_compute[236126]: 2025-10-02 12:28:49.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:49 np0005465988 nova_compute[236126]: 2025-10-02 12:28:49.522 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:49 np0005465988 nova_compute[236126]: 2025-10-02 12:28:49.522 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:49 np0005465988 nova_compute[236126]: 2025-10-02 12:28:49.523 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:49 np0005465988 nova_compute[236126]: 2025-10-02 12:28:49.523 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:28:49 np0005465988 nova_compute[236126]: 2025-10-02 12:28:49.523 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:49.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:28:49 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/527686598' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:28:50 np0005465988 nova_compute[236126]: 2025-10-02 12:28:50.017 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:50.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:50 np0005465988 nova_compute[236126]: 2025-10-02 12:28:50.168 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:28:50 np0005465988 nova_compute[236126]: 2025-10-02 12:28:50.169 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4371MB free_disk=20.900894165039062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:28:50 np0005465988 nova_compute[236126]: 2025-10-02 12:28:50.170 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:50 np0005465988 nova_compute[236126]: 2025-10-02 12:28:50.170 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:50 np0005465988 nova_compute[236126]: 2025-10-02 12:28:50.388 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408115.3866177, 1e47e923-75c9-4c8c-b5f3-86f715462a64 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:50 np0005465988 nova_compute[236126]: 2025-10-02 12:28:50.389 2 INFO nova.compute.manager [-] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:28:50 np0005465988 nova_compute[236126]: 2025-10-02 12:28:50.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:51 np0005465988 nova_compute[236126]: 2025-10-02 12:28:51.267 2 DEBUG nova.compute.manager [None req-13078663-5077-45da-a1c1-0684f31117c5 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:51.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:52.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:28:53 np0005465988 nova_compute[236126]: 2025-10-02 12:28:53.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:53 np0005465988 nova_compute[236126]: 2025-10-02 12:28:53.552 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 1e47e923-75c9-4c8c-b5f3-86f715462a64 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:28:53 np0005465988 nova_compute[236126]: 2025-10-02 12:28:53.553 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:28:53 np0005465988 nova_compute[236126]: 2025-10-02 12:28:53.553 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:28:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:53.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:53 np0005465988 nova_compute[236126]: 2025-10-02 12:28:53.793 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:54.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:28:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3347900093' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:28:54 np0005465988 nova_compute[236126]: 2025-10-02 12:28:54.243 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:54 np0005465988 nova_compute[236126]: 2025-10-02 12:28:54.250 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:28:54 np0005465988 nova_compute[236126]: 2025-10-02 12:28:54.284 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:28:54 np0005465988 nova_compute[236126]: 2025-10-02 12:28:54.486 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:28:54 np0005465988 nova_compute[236126]: 2025-10-02 12:28:54.486 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:55 np0005465988 nova_compute[236126]: 2025-10-02 12:28:55.487 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:55 np0005465988 nova_compute[236126]: 2025-10-02 12:28:55.488 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:55 np0005465988 nova_compute[236126]: 2025-10-02 12:28:55.488 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:55 np0005465988 nova_compute[236126]: 2025-10-02 12:28:55.488 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:55 np0005465988 nova_compute[236126]: 2025-10-02 12:28:55.488 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:28:55 np0005465988 nova_compute[236126]: 2025-10-02 12:28:55.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:28:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:55.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:28:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:28:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:56.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:28:57 np0005465988 nova_compute[236126]: 2025-10-02 12:28:57.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:57 np0005465988 nova_compute[236126]: 2025-10-02 12:28:57.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:57 np0005465988 nova_compute[236126]: 2025-10-02 12:28:57.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:28:57 np0005465988 nova_compute[236126]: 2025-10-02 12:28:57.626 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:28:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:57.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:28:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:28:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:28:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:28:58.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:28:58 np0005465988 nova_compute[236126]: 2025-10-02 12:28:58.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:58 np0005465988 nova_compute[236126]: 2025-10-02 12:28:58.625 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:59 np0005465988 podman[284501]: 2025-10-02 12:28:59.550323689 +0000 UTC m=+0.076524762 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent)
Oct  2 08:28:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:28:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:28:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:28:59.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:00.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:00 np0005465988 nova_compute[236126]: 2025-10-02 12:29:00.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:29:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:01.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.069 2 DEBUG nova.compute.manager [req-a287238d-80ed-4643-9861-16930c483535 req-ea061408-6466-468b-8ef0-855337ee3214 f2700629748f4a53afaee0e33dc39c53 aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received event volume-reimaged-4c9d8719-7ebc-4171-8171-a0e55d9f5992 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.069 2 DEBUG oslo_concurrency.lockutils [req-a287238d-80ed-4643-9861-16930c483535 req-ea061408-6466-468b-8ef0-855337ee3214 f2700629748f4a53afaee0e33dc39c53 aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.070 2 DEBUG oslo_concurrency.lockutils [req-a287238d-80ed-4643-9861-16930c483535 req-ea061408-6466-468b-8ef0-855337ee3214 f2700629748f4a53afaee0e33dc39c53 aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.070 2 DEBUG oslo_concurrency.lockutils [req-a287238d-80ed-4643-9861-16930c483535 req-ea061408-6466-468b-8ef0-855337ee3214 f2700629748f4a53afaee0e33dc39c53 aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.070 2 DEBUG nova.compute.manager [req-a287238d-80ed-4643-9861-16930c483535 req-ea061408-6466-468b-8ef0-855337ee3214 f2700629748f4a53afaee0e33dc39c53 aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Processing event volume-reimaged-4c9d8719-7ebc-4171-8171-a0e55d9f5992 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.071 2 DEBUG nova.compute.manager [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Instance event wait completed in 19 seconds for volume-reimaged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:29:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:02.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.192 2 INFO nova.virt.block_device [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Booting with volume 4c9d8719-7ebc-4171-8171-a0e55d9f5992 at /dev/vda#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.615 2 DEBUG os_brick.utils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.616 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.632 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.633 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[34ba0f79-7e91-48c6-9631-9fc2fe9eabd4]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.634 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.646 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.647 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[fd925cc9-0ee8-47d5-a717-4d05e71cf9a2]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.648 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.661 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.662 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[8bbb5b64-b4d1-471d-9269-f157a8486c93]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.663 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[838eaec4-caef-4ef4-9eed-903968668f1c]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.664 2 DEBUG oslo_concurrency.processutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.705 2 DEBUG oslo_concurrency.processutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CMD "nvme version" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.707 2 DEBUG os_brick.initiator.connectors.lightos [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.707 2 DEBUG os_brick.initiator.connectors.lightos [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.708 2 DEBUG os_brick.initiator.connectors.lightos [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.708 2 DEBUG os_brick.utils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] <== get_connector_properties: return (92ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:29:02 np0005465988 nova_compute[236126]: 2025-10-02 12:29:02.708 2 DEBUG nova.virt.block_device [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Updating existing volume attachment record: 2ea8cdb1-71fc-40b9-ab28-f6ea4dc1f141 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:29:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:29:03 np0005465988 nova_compute[236126]: 2025-10-02 12:29:03.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:03 np0005465988 nova_compute[236126]: 2025-10-02 12:29:03.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:03 np0005465988 nova_compute[236126]: 2025-10-02 12:29:03.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:29:03 np0005465988 nova_compute[236126]: 2025-10-02 12:29:03.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:29:03 np0005465988 nova_compute[236126]: 2025-10-02 12:29:03.556 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-1e47e923-75c9-4c8c-b5f3-86f715462a64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:03 np0005465988 nova_compute[236126]: 2025-10-02 12:29:03.557 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-1e47e923-75c9-4c8c-b5f3-86f715462a64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:03 np0005465988 nova_compute[236126]: 2025-10-02 12:29:03.557 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:29:03 np0005465988 nova_compute[236126]: 2025-10-02 12:29:03.558 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1e47e923-75c9-4c8c-b5f3-86f715462a64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:03.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:29:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:04.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:29:05 np0005465988 nova_compute[236126]: 2025-10-02 12:29:05.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:05.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.086 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.087 2 INFO nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Creating image(s)#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.087 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.088 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Ensure instance console log exists: /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.088 2 DEBUG oslo_concurrency.lockutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.088 2 DEBUG oslo_concurrency.lockutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.088 2 DEBUG oslo_concurrency.lockutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:06.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.091 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Start _get_guest_xml network_info=[{"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:47Z,direct_url=<?>,disk_format='qcow2',id=db05f54c-61f8-42d6-a1e2-da3219a77b12,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '2ea8cdb1-71fc-40b9-ab28-f6ea4dc1f141', 'disk_bus': 'virtio', 'delete_on_termination': True, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-4c9d8719-7ebc-4171-8171-a0e55d9f5992', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '4c9d8719-7ebc-4171-8171-a0e55d9f5992', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '1e47e923-75c9-4c8c-b5f3-86f715462a64', 'attached_at': '', 'detached_at': '', 'volume_id': '4c9d8719-7ebc-4171-8171-a0e55d9f5992', 'serial': '4c9d8719-7ebc-4171-8171-a0e55d9f5992'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.095 2 WARNING nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.100 2 DEBUG nova.virt.libvirt.host [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.102 2 DEBUG nova.virt.libvirt.host [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.104 2 DEBUG nova.virt.libvirt.host [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.105 2 DEBUG nova.virt.libvirt.host [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.107 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.107 2 DEBUG nova.virt.hardware [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:47Z,direct_url=<?>,disk_format='qcow2',id=db05f54c-61f8-42d6-a1e2-da3219a77b12,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.108 2 DEBUG nova.virt.hardware [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.108 2 DEBUG nova.virt.hardware [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.109 2 DEBUG nova.virt.hardware [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.109 2 DEBUG nova.virt.hardware [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.109 2 DEBUG nova.virt.hardware [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.109 2 DEBUG nova.virt.hardware [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.110 2 DEBUG nova.virt.hardware [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.110 2 DEBUG nova.virt.hardware [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.110 2 DEBUG nova.virt.hardware [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.111 2 DEBUG nova.virt.hardware [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.111 2 DEBUG nova.objects.instance [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 1e47e923-75c9-4c8c-b5f3-86f715462a64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.361 2 DEBUG nova.storage.rbd_utils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] rbd image 1e47e923-75c9-4c8c-b5f3-86f715462a64_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.370 2 DEBUG oslo_concurrency.processutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:29:06 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1688902075' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:29:06 np0005465988 nova_compute[236126]: 2025-10-02 12:29:06.864 2 DEBUG oslo_concurrency.processutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:07 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:29:07 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:29:07 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:29:07 np0005465988 nova_compute[236126]: 2025-10-02 12:29:07.601 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Updating instance_info_cache with network_info: [{"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:29:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:07.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:29:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:08.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:29:08 np0005465988 nova_compute[236126]: 2025-10-02 12:29:08.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:09.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:10.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.960 2 DEBUG nova.virt.libvirt.vif [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:27:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-2141659460',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-1358372400',id=108,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJsc13H1E1twasivn4l5fIknm6WxTVFzYRGRsGTsk9ZC9W5y41hzk124eOR+2RfREafyMEhaDPvhujygatWcUNbG54vVZEPcxsJp0CaKKZ+PeWEtzqfW2Ozb0JbEn7mctQ==',key_name='tempest-keypair-294989539',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:28:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7aad01ba5df14c1a9309451d0daaab83',ramdisk_id='',reservation_id='r-pdakyd3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q
35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsV293TestJSON-1846432011',owner_user_name='tempest-ServerActionsV293TestJSON-1846432011-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:29:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c6bda06d7e6348bab069e07b21022b60',uuid=1e47e923-75c9-4c8c-b5f3-86f715462a64,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.961 2 DEBUG nova.network.os_vif_util [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Converting VIF {"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.962 2 DEBUG nova.network.os_vif_util [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a1:c1:78,bridge_name='br-int',has_traffic_filtering=True,id=4abf20c2-f65e-479c-8fe9-62982b2fa096,network=Network(454203c0-2170-41e3-a903-014f7d235b68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4abf20c2-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.967 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  <uuid>1e47e923-75c9-4c8c-b5f3-86f715462a64</uuid>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  <name>instance-0000006c</name>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerActionsV293TestJSON-server-2141659460</nova:name>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:29:06</nova:creationTime>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <nova:user uuid="c6bda06d7e6348bab069e07b21022b60">tempest-ServerActionsV293TestJSON-1846432011-project-member</nova:user>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <nova:project uuid="7aad01ba5df14c1a9309451d0daaab83">tempest-ServerActionsV293TestJSON-1846432011</nova:project>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <nova:port uuid="4abf20c2-f65e-479c-8fe9-62982b2fa096">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <entry name="serial">1e47e923-75c9-4c8c-b5f3-86f715462a64</entry>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <entry name="uuid">1e47e923-75c9-4c8c-b5f3-86f715462a64</entry>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/1e47e923-75c9-4c8c-b5f3-86f715462a64_disk.config">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-4c9d8719-7ebc-4171-8171-a0e55d9f5992">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <serial>4c9d8719-7ebc-4171-8171-a0e55d9f5992</serial>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:a1:c1:78"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <target dev="tap4abf20c2-f6"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/console.log" append="off"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:29:10 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:29:10 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:29:10 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:29:10 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.968 2 DEBUG nova.virt.libvirt.vif [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:27:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-2141659460',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-1358372400',id=108,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJsc13H1E1twasivn4l5fIknm6WxTVFzYRGRsGTsk9ZC9W5y41hzk124eOR+2RfREafyMEhaDPvhujygatWcUNbG54vVZEPcxsJp0CaKKZ+PeWEtzqfW2Ozb0JbEn7mctQ==',key_name='tempest-keypair-294989539',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:28:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7aad01ba5df14c1a9309451d0daaab83',ramdisk_id='',reservation_id='r-pdakyd3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsV293TestJSON-1846432011',owner_user_name='tempest-ServerActionsV293TestJSON-1846432011-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:29:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c6bda06d7e6348bab069e07b21022b60',uuid=1e47e923-75c9-4c8c-b5f3-86f715462a64,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.968 2 DEBUG nova.network.os_vif_util [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Converting VIF {"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.969 2 DEBUG nova.network.os_vif_util [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a1:c1:78,bridge_name='br-int',has_traffic_filtering=True,id=4abf20c2-f65e-479c-8fe9-62982b2fa096,network=Network(454203c0-2170-41e3-a903-014f7d235b68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4abf20c2-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.970 2 DEBUG os_vif [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a1:c1:78,bridge_name='br-int',has_traffic_filtering=True,id=4abf20c2-f65e-479c-8fe9-62982b2fa096,network=Network(454203c0-2170-41e3-a903-014f7d235b68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4abf20c2-f6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.971 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-1e47e923-75c9-4c8c-b5f3-86f715462a64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.972 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.973 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.973 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.975 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.975 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.981 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4abf20c2-f6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.982 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4abf20c2-f6, col_values=(('external_ids', {'iface-id': '4abf20c2-f65e-479c-8fe9-62982b2fa096', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a1:c1:78', 'vm-uuid': '1e47e923-75c9-4c8c-b5f3-86f715462a64'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:10 np0005465988 NetworkManager[45041]: <info>  [1759408150.9853] manager: (tap4abf20c2-f6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:10 np0005465988 nova_compute[236126]: 2025-10-02 12:29:10.995 2 INFO os_vif [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a1:c1:78,bridge_name='br-int',has_traffic_filtering=True,id=4abf20c2-f65e-479c-8fe9-62982b2fa096,network=Network(454203c0-2170-41e3-a903-014f7d235b68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4abf20c2-f6')#033[00m
Oct  2 08:29:11 np0005465988 nova_compute[236126]: 2025-10-02 12:29:11.306 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:11.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:11 np0005465988 nova_compute[236126]: 2025-10-02 12:29:11.853 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:29:11 np0005465988 nova_compute[236126]: 2025-10-02 12:29:11.854 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:29:11 np0005465988 nova_compute[236126]: 2025-10-02 12:29:11.854 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] No VIF found with MAC fa:16:3e:a1:c1:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:29:11 np0005465988 nova_compute[236126]: 2025-10-02 12:29:11.855 2 INFO nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Using config drive#033[00m
Oct  2 08:29:11 np0005465988 nova_compute[236126]: 2025-10-02 12:29:11.889 2 DEBUG nova.storage.rbd_utils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] rbd image 1e47e923-75c9-4c8c-b5f3-86f715462a64_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:29:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:12.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:12 np0005465988 nova_compute[236126]: 2025-10-02 12:29:12.441 2 DEBUG nova.objects.instance [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 1e47e923-75c9-4c8c-b5f3-86f715462a64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:12 np0005465988 nova_compute[236126]: 2025-10-02 12:29:12.723 2 DEBUG nova.objects.instance [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lazy-loading 'keypairs' on Instance uuid 1e47e923-75c9-4c8c-b5f3-86f715462a64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:29:13 np0005465988 nova_compute[236126]: 2025-10-02 12:29:13.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:13.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:13 np0005465988 nova_compute[236126]: 2025-10-02 12:29:13.857 2 INFO nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Creating config drive at /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/disk.config#033[00m
Oct  2 08:29:13 np0005465988 nova_compute[236126]: 2025-10-02 12:29:13.869 2 DEBUG oslo_concurrency.processutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2nk_ef57 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:14 np0005465988 nova_compute[236126]: 2025-10-02 12:29:14.039 2 DEBUG oslo_concurrency.processutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2nk_ef57" returned: 0 in 0.170s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:14 np0005465988 nova_compute[236126]: 2025-10-02 12:29:14.085 2 DEBUG nova.storage.rbd_utils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] rbd image 1e47e923-75c9-4c8c-b5f3-86f715462a64_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:29:14 np0005465988 nova_compute[236126]: 2025-10-02 12:29:14.092 2 DEBUG oslo_concurrency.processutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/disk.config 1e47e923-75c9-4c8c-b5f3-86f715462a64_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:14.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:14 np0005465988 nova_compute[236126]: 2025-10-02 12:29:14.297 2 DEBUG oslo_concurrency.processutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/disk.config 1e47e923-75c9-4c8c-b5f3-86f715462a64_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.205s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:14 np0005465988 nova_compute[236126]: 2025-10-02 12:29:14.299 2 INFO nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Deleting local config drive /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64/disk.config because it was imported into RBD.#033[00m
Oct  2 08:29:14 np0005465988 kernel: tap4abf20c2-f6: entered promiscuous mode
Oct  2 08:29:14 np0005465988 NetworkManager[45041]: <info>  [1759408154.3794] manager: (tap4abf20c2-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/232)
Oct  2 08:29:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:14Z|00501|binding|INFO|Claiming lport 4abf20c2-f65e-479c-8fe9-62982b2fa096 for this chassis.
Oct  2 08:29:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:14Z|00502|binding|INFO|4abf20c2-f65e-479c-8fe9-62982b2fa096: Claiming fa:16:3e:a1:c1:78 10.100.0.10
Oct  2 08:29:14 np0005465988 nova_compute[236126]: 2025-10-02 12:29:14.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.389 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:c1:78 10.100.0.10'], port_security=['fa:16:3e:a1:c1:78 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '1e47e923-75c9-4c8c-b5f3-86f715462a64', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-454203c0-2170-41e3-a903-014f7d235b68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7aad01ba5df14c1a9309451d0daaab83', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6993526b-112b-49f0-b6c4-5993fe406fdf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.183'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=825ba480-e228-47f3-8e23-c895bd3a1194, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=4abf20c2-f65e-479c-8fe9-62982b2fa096) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.391 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 4abf20c2-f65e-479c-8fe9-62982b2fa096 in datapath 454203c0-2170-41e3-a903-014f7d235b68 bound to our chassis#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.393 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 454203c0-2170-41e3-a903-014f7d235b68#033[00m
Oct  2 08:29:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:14Z|00503|binding|INFO|Setting lport 4abf20c2-f65e-479c-8fe9-62982b2fa096 ovn-installed in OVS
Oct  2 08:29:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:14Z|00504|binding|INFO|Setting lport 4abf20c2-f65e-479c-8fe9-62982b2fa096 up in Southbound
Oct  2 08:29:14 np0005465988 nova_compute[236126]: 2025-10-02 12:29:14.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:14 np0005465988 nova_compute[236126]: 2025-10-02 12:29:14.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.410 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e7f806-9d37-467b-b348-b1f96ee3a46c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.411 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap454203c0-21 in ovnmeta-454203c0-2170-41e3-a903-014f7d235b68 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:29:14 np0005465988 systemd-udevd[284828]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.413 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap454203c0-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.413 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce7d11f-eafe-40e3-aa96-71b2fec4841f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.414 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f3982ae2-57f3-4a81-9ac0-43a98775dc7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 NetworkManager[45041]: <info>  [1759408154.4264] device (tap4abf20c2-f6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:29:14 np0005465988 NetworkManager[45041]: <info>  [1759408154.4273] device (tap4abf20c2-f6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:29:14 np0005465988 systemd-machined[192594]: New machine qemu-48-instance-0000006c.
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.428 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[50e93cee-2007-49bd-92dc-1207c5cad4f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 systemd[1]: Started Virtual Machine qemu-48-instance-0000006c.
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.456 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cebcfe5c-1dfe-4a8b-8ece-1a27469fc798]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.500 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[3f0a61e8-c36b-419f-834c-fb0704c07549]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 NetworkManager[45041]: <info>  [1759408154.5080] manager: (tap454203c0-20): new Veth device (/org/freedesktop/NetworkManager/Devices/233)
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.509 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[adb97c69-8fc1-4535-8306-2902216e91f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 systemd-udevd[284832]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.566 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c600ea90-401d-431f-b7a5-8f90be6cb931]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.580 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[257d061c-828f-4d08-b33a-22695acbba14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 NetworkManager[45041]: <info>  [1759408154.6159] device (tap454203c0-20): carrier: link connected
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.624 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[47c2b8e4-0762-4e7d-803c-327eb8d4aae0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.644 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[097a5c33-9fc9-4cb7-8fa2-a6a3c4e7cf77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap454203c0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:aa:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 621494, 'reachable_time': 24965, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284861, 'error': None, 'target': 'ovnmeta-454203c0-2170-41e3-a903-014f7d235b68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.676 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[32ff2d88-729d-4e2e-a6dc-5f82bede51d9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed6:aa5d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 621494, 'tstamp': 621494}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 284862, 'error': None, 'target': 'ovnmeta-454203c0-2170-41e3-a903-014f7d235b68', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.698 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[52fecc74-821a-45ea-b7d5-b830339f1a54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap454203c0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:aa:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 621494, 'reachable_time': 24965, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 284863, 'error': None, 'target': 'ovnmeta-454203c0-2170-41e3-a903-014f7d235b68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.738 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[307f6558-e3d6-4402-bee9-cecce698febf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.827 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3431bbcb-c72d-4082-9dad-e91c0f8f4218]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.831 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap454203c0-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.833 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.833 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap454203c0-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:14 np0005465988 nova_compute[236126]: 2025-10-02 12:29:14.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:14 np0005465988 NetworkManager[45041]: <info>  [1759408154.8372] manager: (tap454203c0-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/234)
Oct  2 08:29:14 np0005465988 kernel: tap454203c0-20: entered promiscuous mode
Oct  2 08:29:14 np0005465988 nova_compute[236126]: 2025-10-02 12:29:14.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.849 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap454203c0-20, col_values=(('external_ids', {'iface-id': '785d24d0-2f0b-4375-86f8-0edd313fa257'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:14 np0005465988 nova_compute[236126]: 2025-10-02 12:29:14.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:14Z|00505|binding|INFO|Releasing lport 785d24d0-2f0b-4375-86f8-0edd313fa257 from this chassis (sb_readonly=0)
Oct  2 08:29:14 np0005465988 nova_compute[236126]: 2025-10-02 12:29:14.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.852 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/454203c0-2170-41e3-a903-014f7d235b68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/454203c0-2170-41e3-a903-014f7d235b68.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.853 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8e6cdc9f-279d-463c-9c4a-e0b813822149]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.854 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-454203c0-2170-41e3-a903-014f7d235b68
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/454203c0-2170-41e3-a903-014f7d235b68.pid.haproxy
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 454203c0-2170-41e3-a903-014f7d235b68
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:29:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:14.855 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-454203c0-2170-41e3-a903-014f7d235b68', 'env', 'PROCESS_TAG=haproxy-454203c0-2170-41e3-a903-014f7d235b68', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/454203c0-2170-41e3-a903-014f7d235b68.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:29:14 np0005465988 nova_compute[236126]: 2025-10-02 12:29:14.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:29:14 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1321179616' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.162 2 DEBUG nova.compute.manager [req-2e301b00-8cb3-43bd-bdf7-fcf77ea190e5 req-f77499df-6c00-4637-b663-564a303363d7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received event network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.162 2 DEBUG oslo_concurrency.lockutils [req-2e301b00-8cb3-43bd-bdf7-fcf77ea190e5 req-f77499df-6c00-4637-b663-564a303363d7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.162 2 DEBUG oslo_concurrency.lockutils [req-2e301b00-8cb3-43bd-bdf7-fcf77ea190e5 req-f77499df-6c00-4637-b663-564a303363d7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.163 2 DEBUG oslo_concurrency.lockutils [req-2e301b00-8cb3-43bd-bdf7-fcf77ea190e5 req-f77499df-6c00-4637-b663-564a303363d7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.163 2 DEBUG nova.compute.manager [req-2e301b00-8cb3-43bd-bdf7-fcf77ea190e5 req-f77499df-6c00-4637-b663-564a303363d7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] No waiting events found dispatching network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.163 2 WARNING nova.compute.manager [req-2e301b00-8cb3-43bd-bdf7-fcf77ea190e5 req-f77499df-6c00-4637-b663-564a303363d7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received unexpected event network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Oct  2 08:29:15 np0005465988 podman[284938]: 2025-10-02 12:29:15.309358078 +0000 UTC m=+0.056175917 container create 967bc10cd62f910eacd179d0b3fbcc0a5957ef22011821c93597c237b73aad38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:29:15 np0005465988 podman[284938]: 2025-10-02 12:29:15.282482335 +0000 UTC m=+0.029300194 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:29:15 np0005465988 systemd[1]: Started libpod-conmon-967bc10cd62f910eacd179d0b3fbcc0a5957ef22011821c93597c237b73aad38.scope.
Oct  2 08:29:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:15.421 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:15 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:29:15 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c242537b206603c5c2409d4b7bee59546610202210678b0f64558ed707dc344d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:29:15 np0005465988 podman[284938]: 2025-10-02 12:29:15.450087016 +0000 UTC m=+0.196904885 container init 967bc10cd62f910eacd179d0b3fbcc0a5957ef22011821c93597c237b73aad38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:29:15 np0005465988 podman[284938]: 2025-10-02 12:29:15.459689492 +0000 UTC m=+0.206507331 container start 967bc10cd62f910eacd179d0b3fbcc0a5957ef22011821c93597c237b73aad38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:29:15 np0005465988 neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68[284954]: [NOTICE]   (284958) : New worker (284960) forked
Oct  2 08:29:15 np0005465988 neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68[284954]: [NOTICE]   (284958) : Loading success.
Oct  2 08:29:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:15.538 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.577 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408155.5765767, 1e47e923-75c9-4c8c-b5f3-86f715462a64 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.577 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.581 2 DEBUG nova.compute.manager [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.581 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.585 2 INFO nova.virt.libvirt.driver [-] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Instance spawned successfully.#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.586 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.615 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.620 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.632 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.633 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.634 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.634 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.635 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.635 2 DEBUG nova.virt.libvirt.driver [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.691 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.692 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408155.5790462, 1e47e923-75c9-4c8c-b5f3-86f715462a64 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.692 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] VM Started (Lifecycle Event)#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.733 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.738 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.760 2 DEBUG nova.compute.manager [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000057s ======
Oct  2 08:29:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:15.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.767 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.825 2 DEBUG oslo_concurrency.lockutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.826 2 DEBUG oslo_concurrency.lockutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.826 2 DEBUG nova.objects.instance [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:29:15 np0005465988 nova_compute[236126]: 2025-10-02 12:29:15.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:16 np0005465988 nova_compute[236126]: 2025-10-02 12:29:16.000 2 DEBUG oslo_concurrency.lockutils [None req-a287238d-80ed-4643-9861-16930c483535 c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:29:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:16.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:29:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:29:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:29:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:29:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1564795986' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:29:17 np0005465988 nova_compute[236126]: 2025-10-02 12:29:17.520 2 DEBUG nova.compute.manager [req-0e397c29-515c-4ac2-9859-52a1932e7922 req-ce4e5170-c197-4712-bdea-e176ae9458c3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received event network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:29:17 np0005465988 nova_compute[236126]: 2025-10-02 12:29:17.522 2 DEBUG oslo_concurrency.lockutils [req-0e397c29-515c-4ac2-9859-52a1932e7922 req-ce4e5170-c197-4712-bdea-e176ae9458c3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:29:17 np0005465988 nova_compute[236126]: 2025-10-02 12:29:17.522 2 DEBUG oslo_concurrency.lockutils [req-0e397c29-515c-4ac2-9859-52a1932e7922 req-ce4e5170-c197-4712-bdea-e176ae9458c3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:29:17 np0005465988 nova_compute[236126]: 2025-10-02 12:29:17.522 2 DEBUG oslo_concurrency.lockutils [req-0e397c29-515c-4ac2-9859-52a1932e7922 req-ce4e5170-c197-4712-bdea-e176ae9458c3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:29:17 np0005465988 nova_compute[236126]: 2025-10-02 12:29:17.523 2 DEBUG nova.compute.manager [req-0e397c29-515c-4ac2-9859-52a1932e7922 req-ce4e5170-c197-4712-bdea-e176ae9458c3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] No waiting events found dispatching network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:29:17 np0005465988 nova_compute[236126]: 2025-10-02 12:29:17.523 2 WARNING nova.compute.manager [req-0e397c29-515c-4ac2-9859-52a1932e7922 req-ce4e5170-c197-4712-bdea-e176ae9458c3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received unexpected event network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 for instance with vm_state active and task_state None.
Oct  2 08:29:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:29:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:17.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:18.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:18 np0005465988 nova_compute[236126]: 2025-10-02 12:29:18.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:29:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:18.541 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:29:19 np0005465988 podman[285022]: 2025-10-02 12:29:19.564522927 +0000 UTC m=+0.086121678 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible)
Oct  2 08:29:19 np0005465988 podman[285023]: 2025-10-02 12:29:19.579596681 +0000 UTC m=+0.095473087 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:29:19 np0005465988 podman[285021]: 2025-10-02 12:29:19.595504349 +0000 UTC m=+0.125524612 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:29:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:19.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:20.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:21 np0005465988 nova_compute[236126]: 2025-10-02 12:29:21.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:29:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:21.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:22.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:29:23 np0005465988 nova_compute[236126]: 2025-10-02 12:29:23.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:29:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:23.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:24.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:25.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:26 np0005465988 nova_compute[236126]: 2025-10-02 12:29:26.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:29:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:26.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:27.359 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:29:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:27.360 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:29:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:27.360 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:29:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:29:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:27.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:28.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:28 np0005465988 nova_compute[236126]: 2025-10-02 12:29:28.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:29:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:28Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a1:c1:78 10.100.0.10
Oct  2 08:29:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:28Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a1:c1:78 10.100.0.10
Oct  2 08:29:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:29.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:30.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:30 np0005465988 podman[285140]: 2025-10-02 12:29:30.562047162 +0000 UTC m=+0.084098550 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:29:31 np0005465988 nova_compute[236126]: 2025-10-02 12:29:31.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:29:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:31.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:29:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:32.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:29:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:29:33 np0005465988 nova_compute[236126]: 2025-10-02 12:29:33.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:29:33 np0005465988 nova_compute[236126]: 2025-10-02 12:29:33.709 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:29:33 np0005465988 nova_compute[236126]: 2025-10-02 12:29:33.709 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:29:33 np0005465988 nova_compute[236126]: 2025-10-02 12:29:33.736 2 DEBUG nova.compute.manager [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:29:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:33.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:34 np0005465988 nova_compute[236126]: 2025-10-02 12:29:34.059 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:29:34 np0005465988 nova_compute[236126]: 2025-10-02 12:29:34.059 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:29:34 np0005465988 nova_compute[236126]: 2025-10-02 12:29:34.118 2 DEBUG nova.virt.hardware [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:29:34 np0005465988 nova_compute[236126]: 2025-10-02 12:29:34.119 2 INFO nova.compute.claims [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:29:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:34.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:34 np0005465988 nova_compute[236126]: 2025-10-02 12:29:34.358 2 DEBUG oslo_concurrency.processutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:29:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:29:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3796285016' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:29:34 np0005465988 nova_compute[236126]: 2025-10-02 12:29:34.846 2 DEBUG oslo_concurrency.processutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:29:34 np0005465988 nova_compute[236126]: 2025-10-02 12:29:34.858 2 DEBUG nova.compute.provider_tree [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:29:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:35.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:36 np0005465988 nova_compute[236126]: 2025-10-02 12:29:36.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:29:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:36.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:36 np0005465988 nova_compute[236126]: 2025-10-02 12:29:36.560 2 DEBUG nova.scheduler.client.report [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:29:36 np0005465988 nova_compute[236126]: 2025-10-02 12:29:36.591 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.531s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:29:36 np0005465988 nova_compute[236126]: 2025-10-02 12:29:36.592 2 DEBUG nova.compute.manager [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:29:36 np0005465988 nova_compute[236126]: 2025-10-02 12:29:36.681 2 DEBUG nova.compute.manager [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:29:36 np0005465988 nova_compute[236126]: 2025-10-02 12:29:36.682 2 DEBUG nova.network.neutron [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:29:36 np0005465988 nova_compute[236126]: 2025-10-02 12:29:36.718 2 INFO nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:29:36 np0005465988 nova_compute[236126]: 2025-10-02 12:29:36.757 2 DEBUG nova.compute.manager [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:29:36 np0005465988 nova_compute[236126]: 2025-10-02 12:29:36.927 2 DEBUG nova.compute.manager [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:29:36 np0005465988 nova_compute[236126]: 2025-10-02 12:29:36.928 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:29:36 np0005465988 nova_compute[236126]: 2025-10-02 12:29:36.928 2 INFO nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Creating image(s)
Oct  2 08:29:36 np0005465988 nova_compute[236126]: 2025-10-02 12:29:36.963 2 DEBUG nova.storage.rbd_utils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] rbd image 4297c5cd-77b6-4f80-a746-11b304df8c90_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:29:36 np0005465988 nova_compute[236126]: 2025-10-02 12:29:36.993 2 DEBUG nova.storage.rbd_utils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] rbd image 4297c5cd-77b6-4f80-a746-11b304df8c90_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.019 2 DEBUG nova.storage.rbd_utils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] rbd image 4297c5cd-77b6-4f80-a746-11b304df8c90_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.023 2 DEBUG oslo_concurrency.processutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.095 2 DEBUG oslo_concurrency.processutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.096 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.096 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.097 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.124 2 DEBUG nova.storage.rbd_utils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] rbd image 4297c5cd-77b6-4f80-a746-11b304df8c90_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.127 2 DEBUG oslo_concurrency.processutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 4297c5cd-77b6-4f80-a746-11b304df8c90_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.164 2 DEBUG nova.policy [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '22d56fcd2a4b4851bfd126ae4548ee9b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5533aaac08cd4856af72ef4992bb5e76', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.168 2 DEBUG oslo_concurrency.lockutils [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Acquiring lock "1e47e923-75c9-4c8c-b5f3-86f715462a64" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.168 2 DEBUG oslo_concurrency.lockutils [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.169 2 DEBUG oslo_concurrency.lockutils [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Acquiring lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.169 2 DEBUG oslo_concurrency.lockutils [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.170 2 DEBUG oslo_concurrency.lockutils [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.171 2 INFO nova.compute.manager [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Terminating instance#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.173 2 DEBUG nova.compute.manager [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:29:37 np0005465988 kernel: tap4abf20c2-f6 (unregistering): left promiscuous mode
Oct  2 08:29:37 np0005465988 NetworkManager[45041]: <info>  [1759408177.2352] device (tap4abf20c2-f6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:29:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:37Z|00506|binding|INFO|Releasing lport 4abf20c2-f65e-479c-8fe9-62982b2fa096 from this chassis (sb_readonly=0)
Oct  2 08:29:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:37Z|00507|binding|INFO|Setting lport 4abf20c2-f65e-479c-8fe9-62982b2fa096 down in Southbound
Oct  2 08:29:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:37Z|00508|binding|INFO|Removing iface tap4abf20c2-f6 ovn-installed in OVS
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:37.296 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:c1:78 10.100.0.10'], port_security=['fa:16:3e:a1:c1:78 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '1e47e923-75c9-4c8c-b5f3-86f715462a64', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-454203c0-2170-41e3-a903-014f7d235b68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7aad01ba5df14c1a9309451d0daaab83', 'neutron:revision_number': '6', 'neutron:security_group_ids': '6993526b-112b-49f0-b6c4-5993fe406fdf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.183', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=825ba480-e228-47f3-8e23-c895bd3a1194, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=4abf20c2-f65e-479c-8fe9-62982b2fa096) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:37.297 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 4abf20c2-f65e-479c-8fe9-62982b2fa096 in datapath 454203c0-2170-41e3-a903-014f7d235b68 unbound from our chassis#033[00m
Oct  2 08:29:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:37.298 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 454203c0-2170-41e3-a903-014f7d235b68, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:29:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:37.301 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5644cd4a-676e-4319-ae1e-7b85e37003e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:37.302 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-454203c0-2170-41e3-a903-014f7d235b68 namespace which is not needed anymore#033[00m
Oct  2 08:29:37 np0005465988 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000006c.scope: Deactivated successfully.
Oct  2 08:29:37 np0005465988 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000006c.scope: Consumed 14.283s CPU time.
Oct  2 08:29:37 np0005465988 systemd-machined[192594]: Machine qemu-48-instance-0000006c terminated.
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.421 2 INFO nova.virt.libvirt.driver [-] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Instance destroyed successfully.#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.422 2 DEBUG nova.objects.instance [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lazy-loading 'resources' on Instance uuid 1e47e923-75c9-4c8c-b5f3-86f715462a64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.458 2 DEBUG nova.virt.libvirt.vif [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:27:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-2141659460',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-1358372400',id=108,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJsc13H1E1twasivn4l5fIknm6WxTVFzYRGRsGTsk9ZC9W5y41hzk124eOR+2RfREafyMEhaDPvhujygatWcUNbG54vVZEPcxsJp0CaKKZ+PeWEtzqfW2Ozb0JbEn7mctQ==',key_name='tempest-keypair-294989539',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:29:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7aad01ba5df14c1a9309451d0daaab83',ramdisk_id='',reservation_id='r-pdakyd3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio
',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsV293TestJSON-1846432011',owner_user_name='tempest-ServerActionsV293TestJSON-1846432011-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:29:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c6bda06d7e6348bab069e07b21022b60',uuid=1e47e923-75c9-4c8c-b5f3-86f715462a64,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:29:37 np0005465988 neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68[284954]: [NOTICE]   (284958) : haproxy version is 2.8.14-c23fe91
Oct  2 08:29:37 np0005465988 neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68[284954]: [NOTICE]   (284958) : path to executable is /usr/sbin/haproxy
Oct  2 08:29:37 np0005465988 neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68[284954]: [WARNING]  (284958) : Exiting Master process...
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.459 2 DEBUG nova.network.os_vif_util [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Converting VIF {"id": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "address": "fa:16:3e:a1:c1:78", "network": {"id": "454203c0-2170-41e3-a903-014f7d235b68", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-510689424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7aad01ba5df14c1a9309451d0daaab83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4abf20c2-f6", "ovs_interfaceid": "4abf20c2-f65e-479c-8fe9-62982b2fa096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.462 2 DEBUG nova.network.os_vif_util [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:c1:78,bridge_name='br-int',has_traffic_filtering=True,id=4abf20c2-f65e-479c-8fe9-62982b2fa096,network=Network(454203c0-2170-41e3-a903-014f7d235b68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4abf20c2-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:37 np0005465988 neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68[284954]: [ALERT]    (284958) : Current worker (284960) exited with code 143 (Terminated)
Oct  2 08:29:37 np0005465988 neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68[284954]: [WARNING]  (284958) : All workers exited. Exiting... (0)
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.462 2 DEBUG os_vif [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:c1:78,bridge_name='br-int',has_traffic_filtering=True,id=4abf20c2-f65e-479c-8fe9-62982b2fa096,network=Network(454203c0-2170-41e3-a903-014f7d235b68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4abf20c2-f6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.464 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4abf20c2-f6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:37 np0005465988 systemd[1]: libpod-967bc10cd62f910eacd179d0b3fbcc0a5957ef22011821c93597c237b73aad38.scope: Deactivated successfully.
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:29:37 np0005465988 podman[285305]: 2025-10-02 12:29:37.471776183 +0000 UTC m=+0.055490668 container died 967bc10cd62f910eacd179d0b3fbcc0a5957ef22011821c93597c237b73aad38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.471 2 INFO os_vif [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:c1:78,bridge_name='br-int',has_traffic_filtering=True,id=4abf20c2-f65e-479c-8fe9-62982b2fa096,network=Network(454203c0-2170-41e3-a903-014f7d235b68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4abf20c2-f6')#033[00m
Oct  2 08:29:37 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-967bc10cd62f910eacd179d0b3fbcc0a5957ef22011821c93597c237b73aad38-userdata-shm.mount: Deactivated successfully.
Oct  2 08:29:37 np0005465988 systemd[1]: var-lib-containers-storage-overlay-c242537b206603c5c2409d4b7bee59546610202210678b0f64558ed707dc344d-merged.mount: Deactivated successfully.
Oct  2 08:29:37 np0005465988 podman[285305]: 2025-10-02 12:29:37.550906999 +0000 UTC m=+0.134621454 container cleanup 967bc10cd62f910eacd179d0b3fbcc0a5957ef22011821c93597c237b73aad38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:29:37 np0005465988 systemd[1]: libpod-conmon-967bc10cd62f910eacd179d0b3fbcc0a5957ef22011821c93597c237b73aad38.scope: Deactivated successfully.
Oct  2 08:29:37 np0005465988 podman[285360]: 2025-10-02 12:29:37.624228468 +0000 UTC m=+0.045545511 container remove 967bc10cd62f910eacd179d0b3fbcc0a5957ef22011821c93597c237b73aad38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:29:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:37.630 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[48985ac8-f6c7-4f3d-9c0d-a8541376306d]: (4, ('Thu Oct  2 12:29:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68 (967bc10cd62f910eacd179d0b3fbcc0a5957ef22011821c93597c237b73aad38)\n967bc10cd62f910eacd179d0b3fbcc0a5957ef22011821c93597c237b73aad38\nThu Oct  2 12:29:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-454203c0-2170-41e3-a903-014f7d235b68 (967bc10cd62f910eacd179d0b3fbcc0a5957ef22011821c93597c237b73aad38)\n967bc10cd62f910eacd179d0b3fbcc0a5957ef22011821c93597c237b73aad38\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:37.631 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[71d141b6-9df3-41ee-98e7-db12118e38e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:37.633 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap454203c0-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:37 np0005465988 kernel: tap454203c0-20: left promiscuous mode
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.639 2 DEBUG oslo_concurrency.processutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 4297c5cd-77b6-4f80-a746-11b304df8c90_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:37.655 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9fbed2f5-2f1d-4bca-9824-a9d2b762633b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:37.691 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c3c964e6-8749-4d2b-bec0-d50a2cbd60ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:37.693 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[be0f1d5c-0a98-4d44-b365-05454fd3b4e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:37.710 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[66f125b5-13cf-41dc-ad91-9567fc96ce4c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 621482, 'reachable_time': 32603, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285401, 'error': None, 'target': 'ovnmeta-454203c0-2170-41e3-a903-014f7d235b68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:37 np0005465988 systemd[1]: run-netns-ovnmeta\x2d454203c0\x2d2170\x2d41e3\x2da903\x2d014f7d235b68.mount: Deactivated successfully.
Oct  2 08:29:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:37.712 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-454203c0-2170-41e3-a903-014f7d235b68 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:29:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:37.712 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[53669c32-73bf-4efd-92fa-b01e202a44b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.728 2 DEBUG nova.compute.manager [req-e3c5590d-3df2-491d-a5e5-8e0b7a2a44bb req-428fc944-da11-4479-86ec-2585bb2fcfb9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received event network-vif-unplugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.729 2 DEBUG oslo_concurrency.lockutils [req-e3c5590d-3df2-491d-a5e5-8e0b7a2a44bb req-428fc944-da11-4479-86ec-2585bb2fcfb9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.729 2 DEBUG oslo_concurrency.lockutils [req-e3c5590d-3df2-491d-a5e5-8e0b7a2a44bb req-428fc944-da11-4479-86ec-2585bb2fcfb9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.729 2 DEBUG oslo_concurrency.lockutils [req-e3c5590d-3df2-491d-a5e5-8e0b7a2a44bb req-428fc944-da11-4479-86ec-2585bb2fcfb9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.730 2 DEBUG nova.compute.manager [req-e3c5590d-3df2-491d-a5e5-8e0b7a2a44bb req-428fc944-da11-4479-86ec-2585bb2fcfb9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] No waiting events found dispatching network-vif-unplugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.730 2 DEBUG nova.compute.manager [req-e3c5590d-3df2-491d-a5e5-8e0b7a2a44bb req-428fc944-da11-4479-86ec-2585bb2fcfb9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received event network-vif-unplugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.736 2 DEBUG nova.storage.rbd_utils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] resizing rbd image 4297c5cd-77b6-4f80-a746-11b304df8c90_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:29:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:29:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:37.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.805 2 INFO nova.virt.libvirt.driver [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Deleting instance files /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64_del#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.806 2 INFO nova.virt.libvirt.driver [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Deletion of /var/lib/nova/instances/1e47e923-75c9-4c8c-b5f3-86f715462a64_del complete#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.883 2 DEBUG nova.objects.instance [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'migration_context' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.940 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.941 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Ensure instance console log exists: /var/lib/nova/instances/4297c5cd-77b6-4f80-a746-11b304df8c90/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.941 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.941 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.942 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.980 2 INFO nova.compute.manager [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Took 0.81 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.981 2 DEBUG oslo.service.loopingcall [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.981 2 DEBUG nova.compute.manager [-] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:29:37 np0005465988 nova_compute[236126]: 2025-10-02 12:29:37.981 2 DEBUG nova.network.neutron [-] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:29:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:38.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:38 np0005465988 nova_compute[236126]: 2025-10-02 12:29:38.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:39 np0005465988 nova_compute[236126]: 2025-10-02 12:29:39.368 2 DEBUG nova.network.neutron [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Successfully created port: 7cf26487-91ca-4d15-85f3-bb6a66393796 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:29:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:39.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:39 np0005465988 nova_compute[236126]: 2025-10-02 12:29:39.848 2 DEBUG nova.network.neutron [-] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:39 np0005465988 nova_compute[236126]: 2025-10-02 12:29:39.886 2 INFO nova.compute.manager [-] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Took 1.90 seconds to deallocate network for instance.#033[00m
Oct  2 08:29:39 np0005465988 nova_compute[236126]: 2025-10-02 12:29:39.950 2 DEBUG nova.compute.manager [req-9e1fe4ae-99fe-4d8b-89b4-5cfe285e36c2 req-5b4abe7f-195b-468c-980a-f48d65e0813c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received event network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:39 np0005465988 nova_compute[236126]: 2025-10-02 12:29:39.951 2 DEBUG oslo_concurrency.lockutils [req-9e1fe4ae-99fe-4d8b-89b4-5cfe285e36c2 req-5b4abe7f-195b-468c-980a-f48d65e0813c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:39 np0005465988 nova_compute[236126]: 2025-10-02 12:29:39.951 2 DEBUG oslo_concurrency.lockutils [req-9e1fe4ae-99fe-4d8b-89b4-5cfe285e36c2 req-5b4abe7f-195b-468c-980a-f48d65e0813c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:39 np0005465988 nova_compute[236126]: 2025-10-02 12:29:39.952 2 DEBUG oslo_concurrency.lockutils [req-9e1fe4ae-99fe-4d8b-89b4-5cfe285e36c2 req-5b4abe7f-195b-468c-980a-f48d65e0813c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:39 np0005465988 nova_compute[236126]: 2025-10-02 12:29:39.952 2 DEBUG nova.compute.manager [req-9e1fe4ae-99fe-4d8b-89b4-5cfe285e36c2 req-5b4abe7f-195b-468c-980a-f48d65e0813c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] No waiting events found dispatching network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:39 np0005465988 nova_compute[236126]: 2025-10-02 12:29:39.952 2 WARNING nova.compute.manager [req-9e1fe4ae-99fe-4d8b-89b4-5cfe285e36c2 req-5b4abe7f-195b-468c-980a-f48d65e0813c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received unexpected event network-vif-plugged-4abf20c2-f65e-479c-8fe9-62982b2fa096 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:29:40 np0005465988 nova_compute[236126]: 2025-10-02 12:29:40.067 2 DEBUG nova.compute.manager [req-d34d25bd-9ad7-4a3b-bed3-4844ba5d94b2 req-eb6c22f1-8fcf-4353-ac15-ae0215dce5a3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Received event network-vif-deleted-4abf20c2-f65e-479c-8fe9-62982b2fa096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:40.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:40 np0005465988 nova_compute[236126]: 2025-10-02 12:29:40.560 2 INFO nova.compute.manager [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Took 0.67 seconds to detach 1 volumes for instance.#033[00m
Oct  2 08:29:40 np0005465988 nova_compute[236126]: 2025-10-02 12:29:40.562 2 DEBUG nova.compute.manager [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Deleting volume: 4c9d8719-7ebc-4171-8171-a0e55d9f5992 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Oct  2 08:29:40 np0005465988 nova_compute[236126]: 2025-10-02 12:29:40.889 2 DEBUG oslo_concurrency.lockutils [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:40 np0005465988 nova_compute[236126]: 2025-10-02 12:29:40.890 2 DEBUG oslo_concurrency.lockutils [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:41 np0005465988 nova_compute[236126]: 2025-10-02 12:29:41.045 2 DEBUG oslo_concurrency.processutils [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:41 np0005465988 nova_compute[236126]: 2025-10-02 12:29:41.392 2 DEBUG nova.network.neutron [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Successfully updated port: 7cf26487-91ca-4d15-85f3-bb6a66393796 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:29:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:29:41 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3513244713' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:29:41 np0005465988 nova_compute[236126]: 2025-10-02 12:29:41.534 2 DEBUG oslo_concurrency.processutils [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:41 np0005465988 nova_compute[236126]: 2025-10-02 12:29:41.545 2 DEBUG nova.compute.provider_tree [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:29:41 np0005465988 nova_compute[236126]: 2025-10-02 12:29:41.573 2 DEBUG nova.scheduler.client.report [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:29:41 np0005465988 nova_compute[236126]: 2025-10-02 12:29:41.583 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:41 np0005465988 nova_compute[236126]: 2025-10-02 12:29:41.584 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquired lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:41 np0005465988 nova_compute[236126]: 2025-10-02 12:29:41.584 2 DEBUG nova.network.neutron [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:29:41 np0005465988 nova_compute[236126]: 2025-10-02 12:29:41.617 2 DEBUG oslo_concurrency.lockutils [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.727s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:41 np0005465988 nova_compute[236126]: 2025-10-02 12:29:41.650 2 INFO nova.scheduler.client.report [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Deleted allocations for instance 1e47e923-75c9-4c8c-b5f3-86f715462a64#033[00m
Oct  2 08:29:41 np0005465988 nova_compute[236126]: 2025-10-02 12:29:41.773 2 DEBUG oslo_concurrency.lockutils [None req-62e582fa-d044-4f75-bf3b-882b6ef4143a c6bda06d7e6348bab069e07b21022b60 7aad01ba5df14c1a9309451d0daaab83 - - default default] Lock "1e47e923-75c9-4c8c-b5f3-86f715462a64" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:41.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:41 np0005465988 nova_compute[236126]: 2025-10-02 12:29:41.849 2 DEBUG nova.network.neutron [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:29:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:42.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:42 np0005465988 nova_compute[236126]: 2025-10-02 12:29:42.322 2 DEBUG nova.compute.manager [req-5f844c16-1b46-4374-a781-62718682c2e6 req-0e80a67f-7727-4c41-8f28-2591c6608dcb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-changed-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:42 np0005465988 nova_compute[236126]: 2025-10-02 12:29:42.323 2 DEBUG nova.compute.manager [req-5f844c16-1b46-4374-a781-62718682c2e6 req-0e80a67f-7727-4c41-8f28-2591c6608dcb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Refreshing instance network info cache due to event network-changed-7cf26487-91ca-4d15-85f3-bb6a66393796. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:29:42 np0005465988 nova_compute[236126]: 2025-10-02 12:29:42.323 2 DEBUG oslo_concurrency.lockutils [req-5f844c16-1b46-4374-a781-62718682c2e6 req-0e80a67f-7727-4c41-8f28-2591c6608dcb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:42 np0005465988 nova_compute[236126]: 2025-10-02 12:29:42.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:29:43 np0005465988 nova_compute[236126]: 2025-10-02 12:29:43.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:43.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.024 2 DEBUG nova.network.neutron [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating instance_info_cache with network_info: [{"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.069 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Releasing lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.069 2 DEBUG nova.compute.manager [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Instance network_info: |[{"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.070 2 DEBUG oslo_concurrency.lockutils [req-5f844c16-1b46-4374-a781-62718682c2e6 req-0e80a67f-7727-4c41-8f28-2591c6608dcb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.071 2 DEBUG nova.network.neutron [req-5f844c16-1b46-4374-a781-62718682c2e6 req-0e80a67f-7727-4c41-8f28-2591c6608dcb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Refreshing network info cache for port 7cf26487-91ca-4d15-85f3-bb6a66393796 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.076 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Start _get_guest_xml network_info=[{"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.090 2 WARNING nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.098 2 DEBUG nova.virt.libvirt.host [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.099 2 DEBUG nova.virt.libvirt.host [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.109 2 DEBUG nova.virt.libvirt.host [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.109 2 DEBUG nova.virt.libvirt.host [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.111 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.112 2 DEBUG nova.virt.hardware [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.112 2 DEBUG nova.virt.hardware [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.113 2 DEBUG nova.virt.hardware [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.113 2 DEBUG nova.virt.hardware [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.114 2 DEBUG nova.virt.hardware [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.114 2 DEBUG nova.virt.hardware [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.115 2 DEBUG nova.virt.hardware [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.115 2 DEBUG nova.virt.hardware [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.116 2 DEBUG nova.virt.hardware [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.116 2 DEBUG nova.virt.hardware [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.117 2 DEBUG nova.virt.hardware [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.122 2 DEBUG oslo_concurrency.processutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:29:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:44.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:29:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:29:44 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3141947561' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.631 2 DEBUG oslo_concurrency.processutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.672 2 DEBUG nova.storage.rbd_utils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] rbd image 4297c5cd-77b6-4f80-a746-11b304df8c90_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:29:44 np0005465988 nova_compute[236126]: 2025-10-02 12:29:44.679 2 DEBUG oslo_concurrency.processutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:29:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2274156463' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.157 2 DEBUG oslo_concurrency.processutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.160 2 DEBUG nova.virt.libvirt.vif [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:29:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-0',id=114,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBEmvr2XQXDnaV0WQDbbXt57cEK6okdC4PHEYdjpQBx2HU9OQgvgRTm3sGWmsa/AInUTPV9ABsCq2lJ9PCqfb1WP51XCZeB9QBIxafEy8h788huF0550ajkopZIwmSLpiA==',key_name='tempest-keypair-425033456',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5533aaac08cd4856af72ef4992bb5e76',ramdisk_id='',reservation_id='r-w0tlxvyd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_a
llocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-1564585024',owner_user_name='tempest-AttachVolumeMultiAttachTest-1564585024-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:29:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22d56fcd2a4b4851bfd126ae4548ee9b',uuid=4297c5cd-77b6-4f80-a746-11b304df8c90,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.161 2 DEBUG nova.network.os_vif_util [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converting VIF {"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.162 2 DEBUG nova.network.os_vif_util [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:9d:7c,bridge_name='br-int',has_traffic_filtering=True,id=7cf26487-91ca-4d15-85f3-bb6a66393796,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cf26487-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.164 2 DEBUG nova.objects.instance [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.193 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  <uuid>4297c5cd-77b6-4f80-a746-11b304df8c90</uuid>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  <name>instance-00000072</name>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <nova:name>multiattach-server-0</nova:name>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:29:44</nova:creationTime>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <nova:user uuid="22d56fcd2a4b4851bfd126ae4548ee9b">tempest-AttachVolumeMultiAttachTest-1564585024-project-member</nova:user>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <nova:project uuid="5533aaac08cd4856af72ef4992bb5e76">tempest-AttachVolumeMultiAttachTest-1564585024</nova:project>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <nova:port uuid="7cf26487-91ca-4d15-85f3-bb6a66393796">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <entry name="serial">4297c5cd-77b6-4f80-a746-11b304df8c90</entry>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <entry name="uuid">4297c5cd-77b6-4f80-a746-11b304df8c90</entry>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/4297c5cd-77b6-4f80-a746-11b304df8c90_disk">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/4297c5cd-77b6-4f80-a746-11b304df8c90_disk.config">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:60:9d:7c"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <target dev="tap7cf26487-91"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/4297c5cd-77b6-4f80-a746-11b304df8c90/console.log" append="off"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:29:45 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:29:45 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:29:45 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:29:45 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.195 2 DEBUG nova.compute.manager [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Preparing to wait for external event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.196 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.196 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.197 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.198 2 DEBUG nova.virt.libvirt.vif [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:29:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-0',id=114,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBEmvr2XQXDnaV0WQDbbXt57cEK6okdC4PHEYdjpQBx2HU9OQgvgRTm3sGWmsa/AInUTPV9ABsCq2lJ9PCqfb1WP51XCZeB9QBIxafEy8h788huF0550ajkopZIwmSLpiA==',key_name='tempest-keypair-425033456',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5533aaac08cd4856af72ef4992bb5e76',ramdisk_id='',reservation_id='r-w0tlxvyd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0'
,network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-1564585024',owner_user_name='tempest-AttachVolumeMultiAttachTest-1564585024-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:29:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22d56fcd2a4b4851bfd126ae4548ee9b',uuid=4297c5cd-77b6-4f80-a746-11b304df8c90,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.199 2 DEBUG nova.network.os_vif_util [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converting VIF {"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.201 2 DEBUG nova.network.os_vif_util [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:9d:7c,bridge_name='br-int',has_traffic_filtering=True,id=7cf26487-91ca-4d15-85f3-bb6a66393796,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cf26487-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.201 2 DEBUG os_vif [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:9d:7c,bridge_name='br-int',has_traffic_filtering=True,id=7cf26487-91ca-4d15-85f3-bb6a66393796,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cf26487-91') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.204 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.204 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.211 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7cf26487-91, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.212 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7cf26487-91, col_values=(('external_ids', {'iface-id': '7cf26487-91ca-4d15-85f3-bb6a66393796', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:9d:7c', 'vm-uuid': '4297c5cd-77b6-4f80-a746-11b304df8c90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:45 np0005465988 NetworkManager[45041]: <info>  [1759408185.2162] manager: (tap7cf26487-91): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.223 2 INFO os_vif [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:9d:7c,bridge_name='br-int',has_traffic_filtering=True,id=7cf26487-91ca-4d15-85f3-bb6a66393796,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cf26487-91')#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.302 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.303 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.303 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No VIF found with MAC fa:16:3e:60:9d:7c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.304 2 INFO nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Using config drive#033[00m
Oct  2 08:29:45 np0005465988 nova_compute[236126]: 2025-10-02 12:29:45.344 2 DEBUG nova.storage.rbd_utils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] rbd image 4297c5cd-77b6-4f80-a746-11b304df8c90_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:29:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:45.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:46 np0005465988 nova_compute[236126]: 2025-10-02 12:29:46.001 2 INFO nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Creating config drive at /var/lib/nova/instances/4297c5cd-77b6-4f80-a746-11b304df8c90/disk.config#033[00m
Oct  2 08:29:46 np0005465988 nova_compute[236126]: 2025-10-02 12:29:46.009 2 DEBUG oslo_concurrency.processutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4297c5cd-77b6-4f80-a746-11b304df8c90/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa0llqru3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:46.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:46 np0005465988 nova_compute[236126]: 2025-10-02 12:29:46.173 2 DEBUG oslo_concurrency.processutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4297c5cd-77b6-4f80-a746-11b304df8c90/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa0llqru3" returned: 0 in 0.164s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:46 np0005465988 nova_compute[236126]: 2025-10-02 12:29:46.222 2 DEBUG nova.storage.rbd_utils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] rbd image 4297c5cd-77b6-4f80-a746-11b304df8c90_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:29:46 np0005465988 nova_compute[236126]: 2025-10-02 12:29:46.228 2 DEBUG oslo_concurrency.processutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4297c5cd-77b6-4f80-a746-11b304df8c90/disk.config 4297c5cd-77b6-4f80-a746-11b304df8c90_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:46 np0005465988 nova_compute[236126]: 2025-10-02 12:29:46.432 2 DEBUG nova.network.neutron [req-5f844c16-1b46-4374-a781-62718682c2e6 req-0e80a67f-7727-4c41-8f28-2591c6608dcb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updated VIF entry in instance network info cache for port 7cf26487-91ca-4d15-85f3-bb6a66393796. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:29:46 np0005465988 nova_compute[236126]: 2025-10-02 12:29:46.433 2 DEBUG nova.network.neutron [req-5f844c16-1b46-4374-a781-62718682c2e6 req-0e80a67f-7727-4c41-8f28-2591c6608dcb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating instance_info_cache with network_info: [{"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:46 np0005465988 nova_compute[236126]: 2025-10-02 12:29:46.472 2 DEBUG oslo_concurrency.lockutils [req-5f844c16-1b46-4374-a781-62718682c2e6 req-0e80a67f-7727-4c41-8f28-2591c6608dcb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:46 np0005465988 nova_compute[236126]: 2025-10-02 12:29:46.647 2 DEBUG oslo_concurrency.processutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4297c5cd-77b6-4f80-a746-11b304df8c90/disk.config 4297c5cd-77b6-4f80-a746-11b304df8c90_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:46 np0005465988 nova_compute[236126]: 2025-10-02 12:29:46.649 2 INFO nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Deleting local config drive /var/lib/nova/instances/4297c5cd-77b6-4f80-a746-11b304df8c90/disk.config because it was imported into RBD.#033[00m
Oct  2 08:29:46 np0005465988 kernel: tap7cf26487-91: entered promiscuous mode
Oct  2 08:29:46 np0005465988 NetworkManager[45041]: <info>  [1759408186.7303] manager: (tap7cf26487-91): new Tun device (/org/freedesktop/NetworkManager/Devices/236)
Oct  2 08:29:46 np0005465988 nova_compute[236126]: 2025-10-02 12:29:46.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:46Z|00509|binding|INFO|Claiming lport 7cf26487-91ca-4d15-85f3-bb6a66393796 for this chassis.
Oct  2 08:29:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:46Z|00510|binding|INFO|7cf26487-91ca-4d15-85f3-bb6a66393796: Claiming fa:16:3e:60:9d:7c 10.100.0.5
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.743 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:9d:7c 10.100.0.5'], port_security=['fa:16:3e:60:9d:7c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4297c5cd-77b6-4f80-a746-11b304df8c90', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-585473f8-52e4-4e55-96df-8a236d361126', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5533aaac08cd4856af72ef4992bb5e76', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0a7e36b3-799e-47d8-a152-7f7146431afe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec297f04-3bda-490f-87d3-1f684caf96fd, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7cf26487-91ca-4d15-85f3-bb6a66393796) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.744 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7cf26487-91ca-4d15-85f3-bb6a66393796 in datapath 585473f8-52e4-4e55-96df-8a236d361126 bound to our chassis#033[00m
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.746 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 585473f8-52e4-4e55-96df-8a236d361126#033[00m
Oct  2 08:29:46 np0005465988 systemd-udevd[285659]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.762 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d6589147-d2d6-4ca1-9e31-a3daf325da31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.764 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap585473f8-51 in ovnmeta-585473f8-52e4-4e55-96df-8a236d361126 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:29:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:46Z|00511|binding|INFO|Setting lport 7cf26487-91ca-4d15-85f3-bb6a66393796 ovn-installed in OVS
Oct  2 08:29:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:46Z|00512|binding|INFO|Setting lport 7cf26487-91ca-4d15-85f3-bb6a66393796 up in Southbound
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.768 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap585473f8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.768 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5d503699-3056-4d68-9746-4f52cee7844b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:46 np0005465988 nova_compute[236126]: 2025-10-02 12:29:46.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.771 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[17fb646c-ce7b-40f5-bcc7-b7915f5eaa40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:46 np0005465988 NetworkManager[45041]: <info>  [1759408186.7770] device (tap7cf26487-91): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:29:46 np0005465988 NetworkManager[45041]: <info>  [1759408186.7788] device (tap7cf26487-91): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:29:46 np0005465988 nova_compute[236126]: 2025-10-02 12:29:46.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:46 np0005465988 systemd-machined[192594]: New machine qemu-49-instance-00000072.
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.794 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d0c069-603a-464b-8e78-5e19ea4eee46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:46 np0005465988 systemd[1]: Started Virtual Machine qemu-49-instance-00000072.
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.817 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[52d39159-a9c9-4a73-b248-40646b1db2dd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.859 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[aaff5316-26e4-4637-9836-861dfdf7f4ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.866 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[db5bde49-7b0e-4ee9-ad87-5f01719371d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:46 np0005465988 systemd-udevd[285664]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:29:46 np0005465988 NetworkManager[45041]: <info>  [1759408186.8690] manager: (tap585473f8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/237)
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.910 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[cb75cbd1-52a0-453b-a671-555e23af5d35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.914 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[96f42eff-19ee-4664-a41b-cfe41f4e4172]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:46 np0005465988 NetworkManager[45041]: <info>  [1759408186.9402] device (tap585473f8-50): carrier: link connected
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.950 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d784ae38-aa8f-4232-bcce-496c5a09fd18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.977 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c66c97fc-17ec-4264-9fd2-60ccb63a624a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap585473f8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:8e:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624727, 'reachable_time': 36955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285695, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:46.997 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[11462daf-b0cd-48e5-a22b-77161481399c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef3:8e12'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624727, 'tstamp': 624727}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285696, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:47.021 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[884510ec-ee5d-459e-ae0e-6312c31a41c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap585473f8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:8e:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624727, 'reachable_time': 36955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 285697, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:47.057 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d341ec9f-ae7c-4154-bb9f-3e1ea66bc845]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:47.154 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8936fcce-4a8e-40b5-853f-f26270f80453]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:47.156 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap585473f8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:47.157 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:47.157 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap585473f8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:47 np0005465988 kernel: tap585473f8-50: entered promiscuous mode
Oct  2 08:29:47 np0005465988 NetworkManager[45041]: <info>  [1759408187.1614] manager: (tap585473f8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/238)
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:47.166 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap585473f8-50, col_values=(('external_ids', {'iface-id': '02b7597d-2fc1-4c56-8603-4dcb0c716c82'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:47Z|00513|binding|INFO|Releasing lport 02b7597d-2fc1-4c56-8603-4dcb0c716c82 from this chassis (sb_readonly=0)
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:47.196 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/585473f8-52e4-4e55-96df-8a236d361126.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/585473f8-52e4-4e55-96df-8a236d361126.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:47.198 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[24c403cd-0ee8-4457-9c0a-8fdd8429fb25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:47.198 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-585473f8-52e4-4e55-96df-8a236d361126
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/585473f8-52e4-4e55-96df-8a236d361126.pid.haproxy
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 585473f8-52e4-4e55-96df-8a236d361126
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:29:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:29:47.201 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'env', 'PROCESS_TAG=haproxy-585473f8-52e4-4e55-96df-8a236d361126', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/585473f8-52e4-4e55-96df-8a236d361126.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.301 2 DEBUG nova.compute.manager [req-7cb6958b-ac47-4b90-b377-98b80700d94c req-91a94f10-3976-40e6-8b5d-6d2240e8940c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.302 2 DEBUG oslo_concurrency.lockutils [req-7cb6958b-ac47-4b90-b377-98b80700d94c req-91a94f10-3976-40e6-8b5d-6d2240e8940c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.302 2 DEBUG oslo_concurrency.lockutils [req-7cb6958b-ac47-4b90-b377-98b80700d94c req-91a94f10-3976-40e6-8b5d-6d2240e8940c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.302 2 DEBUG oslo_concurrency.lockutils [req-7cb6958b-ac47-4b90-b377-98b80700d94c req-91a94f10-3976-40e6-8b5d-6d2240e8940c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.303 2 DEBUG nova.compute.manager [req-7cb6958b-ac47-4b90-b377-98b80700d94c req-91a94f10-3976-40e6-8b5d-6d2240e8940c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Processing event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:29:47 np0005465988 podman[285771]: 2025-10-02 12:29:47.702645522 +0000 UTC m=+0.121074244 container create 653d0ed8ee9c5dee144307e24200b614da90a0e4d3f5260a4a2f743d09f5d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:29:47 np0005465988 podman[285771]: 2025-10-02 12:29:47.61532334 +0000 UTC m=+0.033752052 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:29:47 np0005465988 systemd[1]: Started libpod-conmon-653d0ed8ee9c5dee144307e24200b614da90a0e4d3f5260a4a2f743d09f5d330.scope.
Oct  2 08:29:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:29:47 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:29:47 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c82c8be227a724991a81492b38ae49b66b42ac396b2712168bfc34953b7a2aa9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:29:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:47.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:47 np0005465988 podman[285771]: 2025-10-02 12:29:47.815947831 +0000 UTC m=+0.234376563 container init 653d0ed8ee9c5dee144307e24200b614da90a0e4d3f5260a4a2f743d09f5d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:29:47 np0005465988 podman[285771]: 2025-10-02 12:29:47.822566092 +0000 UTC m=+0.240994794 container start 653d0ed8ee9c5dee144307e24200b614da90a0e4d3f5260a4a2f743d09f5d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:29:47 np0005465988 neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126[285786]: [NOTICE]   (285790) : New worker (285792) forked
Oct  2 08:29:47 np0005465988 neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126[285786]: [NOTICE]   (285790) : Loading success.
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.940 2 DEBUG nova.compute.manager [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.941 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408187.9392128, 4297c5cd-77b6-4f80-a746-11b304df8c90 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.942 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] VM Started (Lifecycle Event)#033[00m
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.946 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.950 2 INFO nova.virt.libvirt.driver [-] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Instance spawned successfully.#033[00m
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.951 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:29:47 np0005465988 nova_compute[236126]: 2025-10-02 12:29:47.995 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.000 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.001 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.001 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.002 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.002 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.003 2 DEBUG nova.virt.libvirt.driver [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.008 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.070 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.070 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408187.9407487, 4297c5cd-77b6-4f80-a746-11b304df8c90 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.071 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.114 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.119 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408187.9451118, 4297c5cd-77b6-4f80-a746-11b304df8c90 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.119 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.130 2 INFO nova.compute.manager [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Took 11.20 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.130 2 DEBUG nova.compute.manager [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:48.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.151 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.155 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.204 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.279 2 INFO nova.compute.manager [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Took 14.46 seconds to build instance.#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.308 2 DEBUG oslo_concurrency.lockutils [None req-5fc9522e-c5cb-4507-bac3-514d2b70361a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:48 np0005465988 nova_compute[236126]: 2025-10-02 12:29:48.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:49 np0005465988 nova_compute[236126]: 2025-10-02 12:29:49.450 2 DEBUG nova.compute.manager [req-8ad840bb-81ef-49da-8443-d5f7b3cc144b req-83b28c24-d2f6-4118-8b14-26ae2cafd54b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:49 np0005465988 nova_compute[236126]: 2025-10-02 12:29:49.451 2 DEBUG oslo_concurrency.lockutils [req-8ad840bb-81ef-49da-8443-d5f7b3cc144b req-83b28c24-d2f6-4118-8b14-26ae2cafd54b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:49 np0005465988 nova_compute[236126]: 2025-10-02 12:29:49.451 2 DEBUG oslo_concurrency.lockutils [req-8ad840bb-81ef-49da-8443-d5f7b3cc144b req-83b28c24-d2f6-4118-8b14-26ae2cafd54b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:49 np0005465988 nova_compute[236126]: 2025-10-02 12:29:49.452 2 DEBUG oslo_concurrency.lockutils [req-8ad840bb-81ef-49da-8443-d5f7b3cc144b req-83b28c24-d2f6-4118-8b14-26ae2cafd54b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:49 np0005465988 nova_compute[236126]: 2025-10-02 12:29:49.452 2 DEBUG nova.compute.manager [req-8ad840bb-81ef-49da-8443-d5f7b3cc144b req-83b28c24-d2f6-4118-8b14-26ae2cafd54b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] No waiting events found dispatching network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:49 np0005465988 nova_compute[236126]: 2025-10-02 12:29:49.453 2 WARNING nova.compute.manager [req-8ad840bb-81ef-49da-8443-d5f7b3cc144b req-83b28c24-d2f6-4118-8b14-26ae2cafd54b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received unexpected event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:29:49 np0005465988 ovn_controller[132601]: 2025-10-02T12:29:49Z|00514|binding|INFO|Releasing lport 02b7597d-2fc1-4c56-8603-4dcb0c716c82 from this chassis (sb_readonly=0)
Oct  2 08:29:49 np0005465988 nova_compute[236126]: 2025-10-02 12:29:49.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:49.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:50.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:50 np0005465988 nova_compute[236126]: 2025-10-02 12:29:50.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:50 np0005465988 podman[285803]: 2025-10-02 12:29:50.532348544 +0000 UTC m=+0.063052315 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:29:50 np0005465988 podman[285804]: 2025-10-02 12:29:50.546429809 +0000 UTC m=+0.068720848 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:29:50 np0005465988 podman[285802]: 2025-10-02 12:29:50.557075775 +0000 UTC m=+0.092768020 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller)
Oct  2 08:29:51 np0005465988 nova_compute[236126]: 2025-10-02 12:29:51.347 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:51 np0005465988 nova_compute[236126]: 2025-10-02 12:29:51.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:51 np0005465988 nova_compute[236126]: 2025-10-02 12:29:51.518 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:51 np0005465988 nova_compute[236126]: 2025-10-02 12:29:51.519 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:51 np0005465988 nova_compute[236126]: 2025-10-02 12:29:51.520 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:51 np0005465988 nova_compute[236126]: 2025-10-02 12:29:51.520 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:29:51 np0005465988 nova_compute[236126]: 2025-10-02 12:29:51.521 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:51.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:29:51 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3006823484' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:29:51 np0005465988 nova_compute[236126]: 2025-10-02 12:29:51.966 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.092 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.093 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:29:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:52.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.261 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.262 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4189MB free_disk=20.784996032714844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.263 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.263 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.417 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408177.4157405, 1e47e923-75c9-4c8c-b5f3-86f715462a64 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.418 2 INFO nova.compute.manager [-] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.463 2 DEBUG nova.compute.manager [None req-324bf673-a143-469b-a604-18718f85da8e - - - - - -] [instance: 1e47e923-75c9-4c8c-b5f3-86f715462a64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.729 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 4297c5cd-77b6-4f80-a746-11b304df8c90 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.730 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.730 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.752 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:29:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.776 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.777 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.813 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.877 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:29:52 np0005465988 nova_compute[236126]: 2025-10-02 12:29:52.982 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:29:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2162660339' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:29:53 np0005465988 nova_compute[236126]: 2025-10-02 12:29:53.414 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:53 np0005465988 nova_compute[236126]: 2025-10-02 12:29:53.420 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:29:53 np0005465988 nova_compute[236126]: 2025-10-02 12:29:53.443 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:29:53 np0005465988 nova_compute[236126]: 2025-10-02 12:29:53.491 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:29:53 np0005465988 nova_compute[236126]: 2025-10-02 12:29:53.492 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:53 np0005465988 nova_compute[236126]: 2025-10-02 12:29:53.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:53.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:29:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:54.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:29:55 np0005465988 nova_compute[236126]: 2025-10-02 12:29:55.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:55.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:29:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:56.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:29:56 np0005465988 nova_compute[236126]: 2025-10-02 12:29:56.493 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:56 np0005465988 nova_compute[236126]: 2025-10-02 12:29:56.493 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:56 np0005465988 nova_compute[236126]: 2025-10-02 12:29:56.493 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:56 np0005465988 nova_compute[236126]: 2025-10-02 12:29:56.494 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:29:57 np0005465988 nova_compute[236126]: 2025-10-02 12:29:57.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:29:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:57.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.029 2 DEBUG nova.compute.manager [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Oct  2 08:29:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:29:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:29:58.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.188 2 DEBUG oslo_concurrency.lockutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.189 2 DEBUG oslo_concurrency.lockutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.261 2 DEBUG nova.objects.instance [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'pci_requests' on Instance uuid 634c38a6-caab-410d-8748-3ec1fd6f9cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.301 2 DEBUG nova.virt.hardware [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.302 2 INFO nova.compute.claims [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.302 2 DEBUG nova.objects.instance [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'resources' on Instance uuid 634c38a6-caab-410d-8748-3ec1fd6f9cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.321 2 DEBUG nova.objects.instance [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 634c38a6-caab-410d-8748-3ec1fd6f9cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.382 2 INFO nova.compute.resource_tracker [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Updating resource usage from migration 86470150-13cf-4780-a5e8-7915418b1cd9#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.383 2 DEBUG nova.compute.resource_tracker [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Starting to track incoming migration 86470150-13cf-4780-a5e8-7915418b1cd9 with flavor eb3a53f1-304b-4cb0-acc3-abffce0fb181 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.513 2 DEBUG oslo_concurrency.processutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.649 2 DEBUG nova.compute.manager [req-9265e47f-905a-46b3-ae77-60315e591157 req-401a276b-cf17-431f-b63c-6e902b522c2e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-changed-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.650 2 DEBUG nova.compute.manager [req-9265e47f-905a-46b3-ae77-60315e591157 req-401a276b-cf17-431f-b63c-6e902b522c2e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Refreshing instance network info cache due to event network-changed-7cf26487-91ca-4d15-85f3-bb6a66393796. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.650 2 DEBUG oslo_concurrency.lockutils [req-9265e47f-905a-46b3-ae77-60315e591157 req-401a276b-cf17-431f-b63c-6e902b522c2e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.651 2 DEBUG oslo_concurrency.lockutils [req-9265e47f-905a-46b3-ae77-60315e591157 req-401a276b-cf17-431f-b63c-6e902b522c2e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.651 2 DEBUG nova.network.neutron [req-9265e47f-905a-46b3-ae77-60315e591157 req-401a276b-cf17-431f-b63c-6e902b522c2e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Refreshing network info cache for port 7cf26487-91ca-4d15-85f3-bb6a66393796 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:29:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:29:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2680950879' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.965 2 DEBUG oslo_concurrency.processutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.970 2 DEBUG nova.compute.provider_tree [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:29:58 np0005465988 nova_compute[236126]: 2025-10-02 12:29:58.995 2 DEBUG nova.scheduler.client.report [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:29:59 np0005465988 nova_compute[236126]: 2025-10-02 12:29:59.055 2 DEBUG oslo_concurrency.lockutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.866s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:59 np0005465988 nova_compute[236126]: 2025-10-02 12:29:59.055 2 INFO nova.compute.manager [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Migrating#033[00m
Oct  2 08:29:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e306 e306: 3 total, 3 up, 3 in
Oct  2 08:29:59 np0005465988 nova_compute[236126]: 2025-10-02 12:29:59.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:29:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:29:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:29:59.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:00.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:00 np0005465988 nova_compute[236126]: 2025-10-02 12:30:00.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:00 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 08:30:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:00Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:60:9d:7c 10.100.0.5
Oct  2 08:30:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:00Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:60:9d:7c 10.100.0.5
Oct  2 08:30:01 np0005465988 podman[285940]: 2025-10-02 12:30:01.578166629 +0000 UTC m=+0.094318154 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Oct  2 08:30:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:01.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:01 np0005465988 nova_compute[236126]: 2025-10-02 12:30:01.910 2 DEBUG nova.network.neutron [req-9265e47f-905a-46b3-ae77-60315e591157 req-401a276b-cf17-431f-b63c-6e902b522c2e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updated VIF entry in instance network info cache for port 7cf26487-91ca-4d15-85f3-bb6a66393796. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:30:01 np0005465988 nova_compute[236126]: 2025-10-02 12:30:01.912 2 DEBUG nova.network.neutron [req-9265e47f-905a-46b3-ae77-60315e591157 req-401a276b-cf17-431f-b63c-6e902b522c2e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating instance_info_cache with network_info: [{"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:01 np0005465988 nova_compute[236126]: 2025-10-02 12:30:01.969 2 DEBUG oslo_concurrency.lockutils [req-9265e47f-905a-46b3-ae77-60315e591157 req-401a276b-cf17-431f-b63c-6e902b522c2e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:02.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:30:03 np0005465988 systemd[1]: Created slice User Slice of UID 42436.
Oct  2 08:30:03 np0005465988 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  2 08:30:03 np0005465988 systemd-logind[827]: New session 53 of user nova.
Oct  2 08:30:03 np0005465988 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  2 08:30:03 np0005465988 systemd[1]: Starting User Manager for UID 42436...
Oct  2 08:30:03 np0005465988 nova_compute[236126]: 2025-10-02 12:30:03.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:03 np0005465988 systemd[285965]: Queued start job for default target Main User Target.
Oct  2 08:30:03 np0005465988 systemd[285965]: Created slice User Application Slice.
Oct  2 08:30:03 np0005465988 systemd[285965]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:30:03 np0005465988 systemd[285965]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 08:30:03 np0005465988 systemd[285965]: Reached target Paths.
Oct  2 08:30:03 np0005465988 systemd[285965]: Reached target Timers.
Oct  2 08:30:03 np0005465988 systemd[285965]: Starting D-Bus User Message Bus Socket...
Oct  2 08:30:03 np0005465988 systemd[285965]: Starting Create User's Volatile Files and Directories...
Oct  2 08:30:03 np0005465988 systemd[285965]: Listening on D-Bus User Message Bus Socket.
Oct  2 08:30:03 np0005465988 systemd[285965]: Reached target Sockets.
Oct  2 08:30:03 np0005465988 systemd[285965]: Finished Create User's Volatile Files and Directories.
Oct  2 08:30:03 np0005465988 systemd[285965]: Reached target Basic System.
Oct  2 08:30:03 np0005465988 systemd[285965]: Reached target Main User Target.
Oct  2 08:30:03 np0005465988 systemd[285965]: Startup finished in 148ms.
Oct  2 08:30:03 np0005465988 systemd[1]: Started User Manager for UID 42436.
Oct  2 08:30:03 np0005465988 systemd[1]: Started Session 53 of User nova.
Oct  2 08:30:03 np0005465988 nova_compute[236126]: 2025-10-02 12:30:03.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:03 np0005465988 systemd[1]: session-53.scope: Deactivated successfully.
Oct  2 08:30:03 np0005465988 systemd-logind[827]: Session 53 logged out. Waiting for processes to exit.
Oct  2 08:30:03 np0005465988 systemd-logind[827]: Removed session 53.
Oct  2 08:30:03 np0005465988 systemd-logind[827]: New session 55 of user nova.
Oct  2 08:30:03 np0005465988 systemd[1]: Started Session 55 of User nova.
Oct  2 08:30:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:03.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:30:03 np0005465988 systemd[1]: session-55.scope: Deactivated successfully.
Oct  2 08:30:03 np0005465988 systemd-logind[827]: Session 55 logged out. Waiting for processes to exit.
Oct  2 08:30:03 np0005465988 systemd-logind[827]: Removed session 55.
Oct  2 08:30:03 np0005465988 nova_compute[236126]: 2025-10-02 12:30:03.880 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "c8b713f4-4f41-4153-928c-164f2ed108ed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:03 np0005465988 nova_compute[236126]: 2025-10-02 12:30:03.881 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:03 np0005465988 nova_compute[236126]: 2025-10-02 12:30:03.992 2 DEBUG nova.compute.manager [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:30:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:04.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:04 np0005465988 nova_compute[236126]: 2025-10-02 12:30:04.282 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:04 np0005465988 nova_compute[236126]: 2025-10-02 12:30:04.283 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:04 np0005465988 nova_compute[236126]: 2025-10-02 12:30:04.291 2 DEBUG nova.virt.hardware [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:30:04 np0005465988 nova_compute[236126]: 2025-10-02 12:30:04.292 2 INFO nova.compute.claims [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:30:04 np0005465988 nova_compute[236126]: 2025-10-02 12:30:04.655 2 DEBUG oslo_concurrency.processutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:30:05 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2233653918' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.138 2 DEBUG oslo_concurrency.processutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.149 2 DEBUG nova.compute.provider_tree [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.185 2 DEBUG nova.scheduler.client.report [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.260 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.977s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.261 2 DEBUG nova.compute.manager [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.464 2 DEBUG nova.compute.manager [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.465 2 DEBUG nova.network.neutron [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.538 2 INFO nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.544 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.574 2 DEBUG nova.compute.manager [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:30:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:05.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.891 2 DEBUG nova.policy [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '22d56fcd2a4b4851bfd126ae4548ee9b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5533aaac08cd4856af72ef4992bb5e76', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.899 2 DEBUG nova.compute.manager [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.901 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.902 2 INFO nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Creating image(s)#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.950 2 DEBUG nova.storage.rbd_utils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] rbd image c8b713f4-4f41-4153-928c-164f2ed108ed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:30:05 np0005465988 nova_compute[236126]: 2025-10-02 12:30:05.987 2 DEBUG nova.storage.rbd_utils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] rbd image c8b713f4-4f41-4153-928c-164f2ed108ed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.023 2 DEBUG nova.storage.rbd_utils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] rbd image c8b713f4-4f41-4153-928c-164f2ed108ed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.029 2 DEBUG oslo_concurrency.processutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.069 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.070 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.071 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.072 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.114 2 DEBUG oslo_concurrency.processutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.115 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.116 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.117 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.159 2 DEBUG nova.storage.rbd_utils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] rbd image c8b713f4-4f41-4153-928c-164f2ed108ed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.164 2 DEBUG oslo_concurrency.processutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 c8b713f4-4f41-4153-928c-164f2ed108ed_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:06.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.611 2 DEBUG oslo_concurrency.processutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 c8b713f4-4f41-4153-928c-164f2ed108ed_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.714 2 DEBUG nova.storage.rbd_utils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] resizing rbd image c8b713f4-4f41-4153-928c-164f2ed108ed_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.781 2 DEBUG nova.compute.manager [req-11a21f8d-96ba-4e1b-8d3f-a6fdc67e64c9 req-a38df8cf-7e0a-49c0-a4b3-e6c6cf74be9f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received event network-vif-unplugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.782 2 DEBUG oslo_concurrency.lockutils [req-11a21f8d-96ba-4e1b-8d3f-a6fdc67e64c9 req-a38df8cf-7e0a-49c0-a4b3-e6c6cf74be9f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.782 2 DEBUG oslo_concurrency.lockutils [req-11a21f8d-96ba-4e1b-8d3f-a6fdc67e64c9 req-a38df8cf-7e0a-49c0-a4b3-e6c6cf74be9f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.782 2 DEBUG oslo_concurrency.lockutils [req-11a21f8d-96ba-4e1b-8d3f-a6fdc67e64c9 req-a38df8cf-7e0a-49c0-a4b3-e6c6cf74be9f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.783 2 DEBUG nova.compute.manager [req-11a21f8d-96ba-4e1b-8d3f-a6fdc67e64c9 req-a38df8cf-7e0a-49c0-a4b3-e6c6cf74be9f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] No waiting events found dispatching network-vif-unplugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.783 2 WARNING nova.compute.manager [req-11a21f8d-96ba-4e1b-8d3f-a6fdc67e64c9 req-a38df8cf-7e0a-49c0-a4b3-e6c6cf74be9f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received unexpected event network-vif-unplugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.851 2 DEBUG nova.objects.instance [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'migration_context' on Instance uuid c8b713f4-4f41-4153-928c-164f2ed108ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.874 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.874 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Ensure instance console log exists: /var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.875 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.875 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:06 np0005465988 nova_compute[236126]: 2025-10-02 12:30:06.875 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:07 np0005465988 nova_compute[236126]: 2025-10-02 12:30:07.715 2 INFO nova.network.neutron [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Updating port b76b92a4-1882-4f89-94f4-3a4700f9c379 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Oct  2 08:30:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:30:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:07.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:30:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:08.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:30:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e307 e307: 3 total, 3 up, 3 in
Oct  2 08:30:08 np0005465988 nova_compute[236126]: 2025-10-02 12:30:08.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:08 np0005465988 nova_compute[236126]: 2025-10-02 12:30:08.817 2 DEBUG nova.network.neutron [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Successfully created port: 386c73f3-c5a1-4edb-894f-841beabaecbd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:30:09 np0005465988 nova_compute[236126]: 2025-10-02 12:30:09.288 2 DEBUG nova.compute.manager [req-8912030d-91f0-4c0c-9307-7266960a0cfa req-0bc881fc-76ca-4fe4-97d8-0cd839c73242 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received event network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:09 np0005465988 nova_compute[236126]: 2025-10-02 12:30:09.289 2 DEBUG oslo_concurrency.lockutils [req-8912030d-91f0-4c0c-9307-7266960a0cfa req-0bc881fc-76ca-4fe4-97d8-0cd839c73242 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:09 np0005465988 nova_compute[236126]: 2025-10-02 12:30:09.290 2 DEBUG oslo_concurrency.lockutils [req-8912030d-91f0-4c0c-9307-7266960a0cfa req-0bc881fc-76ca-4fe4-97d8-0cd839c73242 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:09 np0005465988 nova_compute[236126]: 2025-10-02 12:30:09.291 2 DEBUG oslo_concurrency.lockutils [req-8912030d-91f0-4c0c-9307-7266960a0cfa req-0bc881fc-76ca-4fe4-97d8-0cd839c73242 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:09 np0005465988 nova_compute[236126]: 2025-10-02 12:30:09.291 2 DEBUG nova.compute.manager [req-8912030d-91f0-4c0c-9307-7266960a0cfa req-0bc881fc-76ca-4fe4-97d8-0cd839c73242 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] No waiting events found dispatching network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:09 np0005465988 nova_compute[236126]: 2025-10-02 12:30:09.292 2 WARNING nova.compute.manager [req-8912030d-91f0-4c0c-9307-7266960a0cfa req-0bc881fc-76ca-4fe4-97d8-0cd839c73242 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received unexpected event network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 08:30:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:30:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:09.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:30:10 np0005465988 nova_compute[236126]: 2025-10-02 12:30:10.068 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating instance_info_cache with network_info: [{"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:10 np0005465988 nova_compute[236126]: 2025-10-02 12:30:10.093 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:10 np0005465988 nova_compute[236126]: 2025-10-02 12:30:10.094 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:30:10 np0005465988 nova_compute[236126]: 2025-10-02 12:30:10.097 2 DEBUG oslo_concurrency.lockutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "refresh_cache-634c38a6-caab-410d-8748-3ec1fd6f9cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:10 np0005465988 nova_compute[236126]: 2025-10-02 12:30:10.097 2 DEBUG oslo_concurrency.lockutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquired lock "refresh_cache-634c38a6-caab-410d-8748-3ec1fd6f9cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:10 np0005465988 nova_compute[236126]: 2025-10-02 12:30:10.098 2 DEBUG nova.network.neutron [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:30:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:10.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:10 np0005465988 nova_compute[236126]: 2025-10-02 12:30:10.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:10 np0005465988 nova_compute[236126]: 2025-10-02 12:30:10.438 2 DEBUG nova.compute.manager [req-a202b232-8055-4a6f-9570-2ff6d243aca0 req-b1ed5947-aa41-4990-8ef9-e6e21337d232 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received event network-changed-b76b92a4-1882-4f89-94f4-3a4700f9c379 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:10 np0005465988 nova_compute[236126]: 2025-10-02 12:30:10.438 2 DEBUG nova.compute.manager [req-a202b232-8055-4a6f-9570-2ff6d243aca0 req-b1ed5947-aa41-4990-8ef9-e6e21337d232 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Refreshing instance network info cache due to event network-changed-b76b92a4-1882-4f89-94f4-3a4700f9c379. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:30:10 np0005465988 nova_compute[236126]: 2025-10-02 12:30:10.439 2 DEBUG oslo_concurrency.lockutils [req-a202b232-8055-4a6f-9570-2ff6d243aca0 req-b1ed5947-aa41-4990-8ef9-e6e21337d232 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-634c38a6-caab-410d-8748-3ec1fd6f9cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:11 np0005465988 nova_compute[236126]: 2025-10-02 12:30:11.792 2 DEBUG nova.network.neutron [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Updating instance_info_cache with network_info: [{"id": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "address": "fa:16:3e:23:4b:2a", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb76b92a4-18", "ovs_interfaceid": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:11.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:30:11 np0005465988 nova_compute[236126]: 2025-10-02 12:30:11.953 2 DEBUG oslo_concurrency.lockutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Releasing lock "refresh_cache-634c38a6-caab-410d-8748-3ec1fd6f9cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:11 np0005465988 nova_compute[236126]: 2025-10-02 12:30:11.960 2 DEBUG oslo_concurrency.lockutils [req-a202b232-8055-4a6f-9570-2ff6d243aca0 req-b1ed5947-aa41-4990-8ef9-e6e21337d232 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-634c38a6-caab-410d-8748-3ec1fd6f9cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:11 np0005465988 nova_compute[236126]: 2025-10-02 12:30:11.960 2 DEBUG nova.network.neutron [req-a202b232-8055-4a6f-9570-2ff6d243aca0 req-b1ed5947-aa41-4990-8ef9-e6e21337d232 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Refreshing network info cache for port b76b92a4-1882-4f89-94f4-3a4700f9c379 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:30:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:12.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:30:12 np0005465988 nova_compute[236126]: 2025-10-02 12:30:12.338 2 DEBUG nova.virt.libvirt.driver [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Oct  2 08:30:12 np0005465988 nova_compute[236126]: 2025-10-02 12:30:12.341 2 DEBUG nova.virt.libvirt.driver [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:30:12 np0005465988 nova_compute[236126]: 2025-10-02 12:30:12.341 2 INFO nova.virt.libvirt.driver [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Creating image(s)#033[00m
Oct  2 08:30:12 np0005465988 nova_compute[236126]: 2025-10-02 12:30:12.395 2 DEBUG nova.storage.rbd_utils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] creating snapshot(nova-resize) on rbd image(634c38a6-caab-410d-8748-3ec1fd6f9cdc_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:30:12 np0005465988 nova_compute[236126]: 2025-10-02 12:30:12.532 2 DEBUG nova.network.neutron [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Successfully updated port: 386c73f3-c5a1-4edb-894f-841beabaecbd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:30:12 np0005465988 nova_compute[236126]: 2025-10-02 12:30:12.638 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:12 np0005465988 nova_compute[236126]: 2025-10-02 12:30:12.638 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquired lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:12 np0005465988 nova_compute[236126]: 2025-10-02 12:30:12.639 2 DEBUG nova.network.neutron [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:30:12 np0005465988 nova_compute[236126]: 2025-10-02 12:30:12.719 2 DEBUG nova.compute.manager [req-f9835275-3c99-4603-bde3-5476d6b3da5f req-4ef6adef-548e-4077-bbc2-8bcf44c0939f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received event network-changed-386c73f3-c5a1-4edb-894f-841beabaecbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:12 np0005465988 nova_compute[236126]: 2025-10-02 12:30:12.720 2 DEBUG nova.compute.manager [req-f9835275-3c99-4603-bde3-5476d6b3da5f req-4ef6adef-548e-4077-bbc2-8bcf44c0939f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Refreshing instance network info cache due to event network-changed-386c73f3-c5a1-4edb-894f-841beabaecbd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:30:12 np0005465988 nova_compute[236126]: 2025-10-02 12:30:12.721 2 DEBUG oslo_concurrency.lockutils [req-f9835275-3c99-4603-bde3-5476d6b3da5f req-4ef6adef-548e-4077-bbc2-8bcf44c0939f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e307 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:30:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e308 e308: 3 total, 3 up, 3 in
Oct  2 08:30:12 np0005465988 nova_compute[236126]: 2025-10-02 12:30:12.906 2 DEBUG nova.objects.instance [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 634c38a6-caab-410d-8748-3ec1fd6f9cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:12 np0005465988 nova_compute[236126]: 2025-10-02 12:30:12.999 2 DEBUG nova.network.neutron [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.141 2 DEBUG nova.virt.libvirt.driver [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.142 2 DEBUG nova.virt.libvirt.driver [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Ensure instance console log exists: /var/lib/nova/instances/634c38a6-caab-410d-8748-3ec1fd6f9cdc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.143 2 DEBUG oslo_concurrency.lockutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.143 2 DEBUG oslo_concurrency.lockutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.144 2 DEBUG oslo_concurrency.lockutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.148 2 DEBUG nova.virt.libvirt.driver [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Start _get_guest_xml network_info=[{"id": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "address": "fa:16:3e:23:4b:2a", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1352928597-network", "vif_mac": "fa:16:3e:23:4b:2a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb76b92a4-18", "ovs_interfaceid": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.155 2 WARNING nova.virt.libvirt.driver [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.160 2 DEBUG nova.virt.libvirt.host [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.161 2 DEBUG nova.virt.libvirt.host [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.166 2 DEBUG nova.virt.libvirt.host [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.167 2 DEBUG nova.virt.libvirt.host [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.168 2 DEBUG nova.virt.libvirt.driver [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.169 2 DEBUG nova.virt.hardware [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eb3a53f1-304b-4cb0-acc3-abffce0fb181',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.170 2 DEBUG nova.virt.hardware [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.170 2 DEBUG nova.virt.hardware [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.170 2 DEBUG nova.virt.hardware [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.171 2 DEBUG nova.virt.hardware [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.171 2 DEBUG nova.virt.hardware [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.172 2 DEBUG nova.virt.hardware [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.172 2 DEBUG nova.virt.hardware [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.172 2 DEBUG nova.virt.hardware [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.173 2 DEBUG nova.virt.hardware [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.173 2 DEBUG nova.virt.hardware [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.174 2 DEBUG nova.objects.instance [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 634c38a6-caab-410d-8748-3ec1fd6f9cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.273 2 DEBUG oslo_concurrency.processutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #91. Immutable memtables: 0.
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.569926) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 55] Flushing memtable with next log file: 91
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408213569977, "job": 55, "event": "flush_started", "num_memtables": 1, "num_entries": 1934, "num_deletes": 257, "total_data_size": 4283650, "memory_usage": 4343336, "flush_reason": "Manual Compaction"}
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 55] Level-0 flush table #92: started
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408213590603, "cf_name": "default", "job": 55, "event": "table_file_creation", "file_number": 92, "file_size": 2812386, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 47257, "largest_seqno": 49186, "table_properties": {"data_size": 2804580, "index_size": 4620, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 17011, "raw_average_key_size": 20, "raw_value_size": 2788564, "raw_average_value_size": 3311, "num_data_blocks": 202, "num_entries": 842, "num_filter_entries": 842, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759408054, "oldest_key_time": 1759408054, "file_creation_time": 1759408213, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 92, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 55] Flush lasted 20732 microseconds, and 10662 cpu microseconds.
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.590657) [db/flush_job.cc:967] [default] [JOB 55] Level-0 flush table #92: 2812386 bytes OK
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.590682) [db/memtable_list.cc:519] [default] Level-0 commit table #92 started
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.593212) [db/memtable_list.cc:722] [default] Level-0 commit table #92: memtable #1 done
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.593237) EVENT_LOG_v1 {"time_micros": 1759408213593230, "job": 55, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.593262) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 55] Try to delete WAL files size 4274955, prev total WAL file size 4274955, number of live WAL files 2.
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000088.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.595246) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031353035' seq:72057594037927935, type:22 .. '6C6F676D0031373537' seq:0, type:0; will stop at (end)
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 56] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 55 Base level 0, inputs: [92(2746KB)], [90(10MB)]
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408213595301, "job": 56, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [92], "files_L6": [90], "score": -1, "input_data_size": 13918166, "oldest_snapshot_seqno": -1}
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 56] Generated table #93: 7486 keys, 13779139 bytes, temperature: kUnknown
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408213716713, "cf_name": "default", "job": 56, "event": "table_file_creation", "file_number": 93, "file_size": 13779139, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13725410, "index_size": 33905, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18757, "raw_key_size": 192355, "raw_average_key_size": 25, "raw_value_size": 13588051, "raw_average_value_size": 1815, "num_data_blocks": 1350, "num_entries": 7486, "num_filter_entries": 7486, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759408213, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 93, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.717051) [db/compaction/compaction_job.cc:1663] [default] [JOB 56] Compacted 1@0 + 1@6 files to L6 => 13779139 bytes
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.721207) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 114.5 rd, 113.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 10.6 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(9.8) write-amplify(4.9) OK, records in: 8017, records dropped: 531 output_compression: NoCompression
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.721229) EVENT_LOG_v1 {"time_micros": 1759408213721217, "job": 56, "event": "compaction_finished", "compaction_time_micros": 121526, "compaction_time_cpu_micros": 52073, "output_level": 6, "num_output_files": 1, "total_output_size": 13779139, "num_input_records": 8017, "num_output_records": 7486, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000092.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408213721747, "job": 56, "event": "table_file_deletion", "file_number": 92}
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000090.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408213723567, "job": 56, "event": "table_file_deletion", "file_number": 90}
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.595120) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.723647) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.723654) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.723656) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.723657) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:13.723659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:30:13 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3925013349' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.790 2 DEBUG oslo_concurrency.processutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:30:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:13.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:30:13 np0005465988 nova_compute[236126]: 2025-10-02 12:30:13.833 2 DEBUG oslo_concurrency.processutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:13 np0005465988 systemd[1]: Stopping User Manager for UID 42436...
Oct  2 08:30:13 np0005465988 systemd[285965]: Activating special unit Exit the Session...
Oct  2 08:30:13 np0005465988 systemd[285965]: Stopped target Main User Target.
Oct  2 08:30:13 np0005465988 systemd[285965]: Stopped target Basic System.
Oct  2 08:30:13 np0005465988 systemd[285965]: Stopped target Paths.
Oct  2 08:30:13 np0005465988 systemd[285965]: Stopped target Sockets.
Oct  2 08:30:13 np0005465988 systemd[285965]: Stopped target Timers.
Oct  2 08:30:13 np0005465988 systemd[285965]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:30:13 np0005465988 systemd[285965]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 08:30:13 np0005465988 systemd[285965]: Closed D-Bus User Message Bus Socket.
Oct  2 08:30:13 np0005465988 systemd[285965]: Stopped Create User's Volatile Files and Directories.
Oct  2 08:30:13 np0005465988 systemd[285965]: Removed slice User Application Slice.
Oct  2 08:30:13 np0005465988 systemd[285965]: Reached target Shutdown.
Oct  2 08:30:13 np0005465988 systemd[285965]: Finished Exit the Session.
Oct  2 08:30:13 np0005465988 systemd[285965]: Reached target Exit the Session.
Oct  2 08:30:13 np0005465988 systemd[1]: user@42436.service: Deactivated successfully.
Oct  2 08:30:13 np0005465988 systemd[1]: Stopped User Manager for UID 42436.
Oct  2 08:30:13 np0005465988 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  2 08:30:13 np0005465988 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  2 08:30:13 np0005465988 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  2 08:30:13 np0005465988 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  2 08:30:13 np0005465988 systemd[1]: Removed slice User Slice of UID 42436.
Oct  2 08:30:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:30:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:14.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:30:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:30:14 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2936122552' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.308 2 DEBUG oslo_concurrency.processutils [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.311 2 DEBUG nova.virt.libvirt.vif [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:28:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1187258695',display_name='tempest-ServerActionsTestJSON-server-1187258695',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1187258695',id=112,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:29:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-8nz9nuro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:30:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=634c38a6-caab-410d-8748-3ec1fd6f9cdc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "address": "fa:16:3e:23:4b:2a", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1352928597-network", "vif_mac": "fa:16:3e:23:4b:2a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb76b92a4-18", "ovs_interfaceid": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.311 2 DEBUG nova.network.os_vif_util [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "address": "fa:16:3e:23:4b:2a", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1352928597-network", "vif_mac": "fa:16:3e:23:4b:2a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb76b92a4-18", "ovs_interfaceid": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.312 2 DEBUG nova.network.os_vif_util [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:4b:2a,bridge_name='br-int',has_traffic_filtering=True,id=b76b92a4-1882-4f89-94f4-3a4700f9c379,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb76b92a4-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.318 2 DEBUG nova.virt.libvirt.driver [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  <uuid>634c38a6-caab-410d-8748-3ec1fd6f9cdc</uuid>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  <name>instance-00000070</name>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  <memory>196608</memory>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerActionsTestJSON-server-1187258695</nova:name>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:30:13</nova:creationTime>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.micro">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <nova:memory>192</nova:memory>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <nova:user uuid="2bd16d1f5f9d4eb396c474eedee67165">tempest-ServerActionsTestJSON-842270816-project-member</nova:user>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <nova:project uuid="4b8ca48cb5f64ef3b0736b8be82378b8">tempest-ServerActionsTestJSON-842270816</nova:project>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <nova:port uuid="b76b92a4-1882-4f89-94f4-3a4700f9c379">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <entry name="serial">634c38a6-caab-410d-8748-3ec1fd6f9cdc</entry>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <entry name="uuid">634c38a6-caab-410d-8748-3ec1fd6f9cdc</entry>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/634c38a6-caab-410d-8748-3ec1fd6f9cdc_disk">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/634c38a6-caab-410d-8748-3ec1fd6f9cdc_disk.config">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:23:4b:2a"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <target dev="tapb76b92a4-18"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/634c38a6-caab-410d-8748-3ec1fd6f9cdc/console.log" append="off"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:30:14 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:30:14 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:30:14 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:30:14 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.320 2 DEBUG nova.virt.libvirt.vif [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:28:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1187258695',display_name='tempest-ServerActionsTestJSON-server-1187258695',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1187258695',id=112,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:29:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-8nz9nuro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:30:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=634c38a6-caab-410d-8748-3ec1fd6f9cdc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "address": "fa:16:3e:23:4b:2a", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1352928597-network", "vif_mac": "fa:16:3e:23:4b:2a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb76b92a4-18", "ovs_interfaceid": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.321 2 DEBUG nova.network.os_vif_util [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "address": "fa:16:3e:23:4b:2a", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1352928597-network", "vif_mac": "fa:16:3e:23:4b:2a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb76b92a4-18", "ovs_interfaceid": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.322 2 DEBUG nova.network.os_vif_util [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:4b:2a,bridge_name='br-int',has_traffic_filtering=True,id=b76b92a4-1882-4f89-94f4-3a4700f9c379,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb76b92a4-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.323 2 DEBUG os_vif [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:4b:2a,bridge_name='br-int',has_traffic_filtering=True,id=b76b92a4-1882-4f89-94f4-3a4700f9c379,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb76b92a4-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.325 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.326 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.330 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb76b92a4-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.331 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb76b92a4-18, col_values=(('external_ids', {'iface-id': 'b76b92a4-1882-4f89-94f4-3a4700f9c379', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:23:4b:2a', 'vm-uuid': '634c38a6-caab-410d-8748-3ec1fd6f9cdc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:14 np0005465988 NetworkManager[45041]: <info>  [1759408214.3351] manager: (tapb76b92a4-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/239)
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.346 2 INFO os_vif [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:4b:2a,bridge_name='br-int',has_traffic_filtering=True,id=b76b92a4-1882-4f89-94f4-3a4700f9c379,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb76b92a4-18')#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.420 2 DEBUG nova.virt.libvirt.driver [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.420 2 DEBUG nova.virt.libvirt.driver [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.421 2 DEBUG nova.virt.libvirt.driver [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] No VIF found with MAC fa:16:3e:23:4b:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.422 2 INFO nova.virt.libvirt.driver [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Using config drive#033[00m
Oct  2 08:30:14 np0005465988 kernel: tapb76b92a4-18: entered promiscuous mode
Oct  2 08:30:14 np0005465988 NetworkManager[45041]: <info>  [1759408214.5574] manager: (tapb76b92a4-18): new Tun device (/org/freedesktop/NetworkManager/Devices/240)
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:14Z|00515|binding|INFO|Claiming lport b76b92a4-1882-4f89-94f4-3a4700f9c379 for this chassis.
Oct  2 08:30:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:14Z|00516|binding|INFO|b76b92a4-1882-4f89-94f4-3a4700f9c379: Claiming fa:16:3e:23:4b:2a 10.100.0.10
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.620 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:4b:2a 10.100.0.10'], port_security=['fa:16:3e:23:4b:2a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '634c38a6-caab-410d-8748-3ec1fd6f9cdc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=b76b92a4-1882-4f89-94f4-3a4700f9c379) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.622 142124 INFO neutron.agent.ovn.metadata.agent [-] Port b76b92a4-1882-4f89-94f4-3a4700f9c379 in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff bound to our chassis#033[00m
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.624 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff#033[00m
Oct  2 08:30:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:14Z|00517|binding|INFO|Setting lport b76b92a4-1882-4f89-94f4-3a4700f9c379 ovn-installed in OVS
Oct  2 08:30:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:14Z|00518|binding|INFO|Setting lport b76b92a4-1882-4f89-94f4-3a4700f9c379 up in Southbound
Oct  2 08:30:14 np0005465988 nova_compute[236126]: 2025-10-02 12:30:14.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:14 np0005465988 systemd-udevd[286395]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.637 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[eda5394b-7596-49df-8636-20668912de1c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.639 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2c62a66-f1 in ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.642 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2c62a66-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.642 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9c192887-bb17-40c9-9239-c90832d7f1b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.643 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dd09d901-69c1-4dba-b597-48f4a5729e45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.656 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[6ae6c35c-8d02-49f1-8d06-9c0f7f597b9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:14 np0005465988 NetworkManager[45041]: <info>  [1759408214.6582] device (tapb76b92a4-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:30:14 np0005465988 NetworkManager[45041]: <info>  [1759408214.6599] device (tapb76b92a4-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:30:14 np0005465988 systemd-machined[192594]: New machine qemu-50-instance-00000070.
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.683 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0c1c547c-c5ee-46ff-b705-5c01112818eb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:14 np0005465988 systemd[1]: Started Virtual Machine qemu-50-instance-00000070.
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.718 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[a75ff7d5-3466-47d1-8a7f-762b66cc3afc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.724 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dce0760d-560d-45c8-b18a-479f03947cd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:14 np0005465988 NetworkManager[45041]: <info>  [1759408214.7303] manager: (tapb2c62a66-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/241)
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.765 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[b8b6146d-b3c1-41b6-bccc-ed62da5af4f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.769 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d6cd2c52-532d-4866-8512-f447b3deacc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:14 np0005465988 NetworkManager[45041]: <info>  [1759408214.8010] device (tapb2c62a66-f0): carrier: link connected
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.810 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[3daeb8b2-ad92-4c80-a0c6-a778fdb4c021]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.835 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[57d323b5-c49a-478f-8f41-30715725f938]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627513, 'reachable_time': 21293, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286430, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.858 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[eb9c4341-f713-4b7c-83f9-ff6fb8fb5a0b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:7a7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627513, 'tstamp': 627513}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286431, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.882 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fa8a0cd7-aae7-4bf9-bdeb-006af9a71f71]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2c62a66-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:07:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627513, 'reachable_time': 21293, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 286432, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:14.928 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dabca40b-6f75-40af-8a4c-d58cab6048b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:15.009 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7ca89273-559e-407b-ac13-b345178a479f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:15.010 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:15.011 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:15.011 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2c62a66-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:15 np0005465988 kernel: tapb2c62a66-f0: entered promiscuous mode
Oct  2 08:30:15 np0005465988 nova_compute[236126]: 2025-10-02 12:30:15.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:15 np0005465988 NetworkManager[45041]: <info>  [1759408215.0150] manager: (tapb2c62a66-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/242)
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:15.020 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2c62a66-f0, col_values=(('external_ids', {'iface-id': '8c1234f6-f595-4979-94c0-98da2211f0e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:15 np0005465988 nova_compute[236126]: 2025-10-02 12:30:15.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:15 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:15Z|00519|binding|INFO|Releasing lport 8c1234f6-f595-4979-94c0-98da2211f0e1 from this chassis (sb_readonly=0)
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:15.025 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:15.026 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e5356755-db61-4ec1-bad9-3f14e337953c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:15.028 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.pid.haproxy
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID b2c62a66-f9bc-4a45-a843-aef2e12a7fff
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:15.031 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'env', 'PROCESS_TAG=haproxy-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2c62a66-f9bc-4a45-a843-aef2e12a7fff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:30:15 np0005465988 nova_compute[236126]: 2025-10-02 12:30:15.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:15 np0005465988 podman[286465]: 2025-10-02 12:30:15.476844174 +0000 UTC m=+0.068702008 container create c9190390f775e2d672b95eda03f1fa5d061e447f26404a706a6927cdd61b0f0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:30:15 np0005465988 nova_compute[236126]: 2025-10-02 12:30:15.522 2 DEBUG nova.compute.manager [req-22fe4d62-9d91-4440-94fb-76662242cfd7 req-f09f155f-de97-4e5b-818e-81f4cca1d071 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received event network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:15 np0005465988 nova_compute[236126]: 2025-10-02 12:30:15.523 2 DEBUG oslo_concurrency.lockutils [req-22fe4d62-9d91-4440-94fb-76662242cfd7 req-f09f155f-de97-4e5b-818e-81f4cca1d071 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:15 np0005465988 nova_compute[236126]: 2025-10-02 12:30:15.524 2 DEBUG oslo_concurrency.lockutils [req-22fe4d62-9d91-4440-94fb-76662242cfd7 req-f09f155f-de97-4e5b-818e-81f4cca1d071 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:15 np0005465988 nova_compute[236126]: 2025-10-02 12:30:15.524 2 DEBUG oslo_concurrency.lockutils [req-22fe4d62-9d91-4440-94fb-76662242cfd7 req-f09f155f-de97-4e5b-818e-81f4cca1d071 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:15 np0005465988 nova_compute[236126]: 2025-10-02 12:30:15.524 2 DEBUG nova.compute.manager [req-22fe4d62-9d91-4440-94fb-76662242cfd7 req-f09f155f-de97-4e5b-818e-81f4cca1d071 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] No waiting events found dispatching network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:15 np0005465988 nova_compute[236126]: 2025-10-02 12:30:15.524 2 WARNING nova.compute.manager [req-22fe4d62-9d91-4440-94fb-76662242cfd7 req-f09f155f-de97-4e5b-818e-81f4cca1d071 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received unexpected event network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 for instance with vm_state active and task_state resize_finish.#033[00m
Oct  2 08:30:15 np0005465988 podman[286465]: 2025-10-02 12:30:15.435849524 +0000 UTC m=+0.027707348 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:30:15 np0005465988 systemd[1]: Started libpod-conmon-c9190390f775e2d672b95eda03f1fa5d061e447f26404a706a6927cdd61b0f0a.scope.
Oct  2 08:30:15 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:30:15 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/661a27526f82d521eec7df9ccf96b35bdd58585f6f3370129c12ba143cdfef77/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:30:15 np0005465988 podman[286465]: 2025-10-02 12:30:15.57577325 +0000 UTC m=+0.167631054 container init c9190390f775e2d672b95eda03f1fa5d061e447f26404a706a6927cdd61b0f0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:30:15 np0005465988 podman[286465]: 2025-10-02 12:30:15.581018921 +0000 UTC m=+0.172876715 container start c9190390f775e2d672b95eda03f1fa5d061e447f26404a706a6927cdd61b0f0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS)
Oct  2 08:30:15 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[286481]: [NOTICE]   (286485) : New worker (286487) forked
Oct  2 08:30:15 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[286481]: [NOTICE]   (286485) : Loading success.
Oct  2 08:30:15 np0005465988 nova_compute[236126]: 2025-10-02 12:30:15.625 2 DEBUG nova.network.neutron [req-a202b232-8055-4a6f-9570-2ff6d243aca0 req-b1ed5947-aa41-4990-8ef9-e6e21337d232 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Updated VIF entry in instance network info cache for port b76b92a4-1882-4f89-94f4-3a4700f9c379. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:30:15 np0005465988 nova_compute[236126]: 2025-10-02 12:30:15.626 2 DEBUG nova.network.neutron [req-a202b232-8055-4a6f-9570-2ff6d243aca0 req-b1ed5947-aa41-4990-8ef9-e6e21337d232 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Updating instance_info_cache with network_info: [{"id": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "address": "fa:16:3e:23:4b:2a", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb76b92a4-18", "ovs_interfaceid": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:15 np0005465988 nova_compute[236126]: 2025-10-02 12:30:15.643 2 DEBUG oslo_concurrency.lockutils [req-a202b232-8055-4a6f-9570-2ff6d243aca0 req-b1ed5947-aa41-4990-8ef9-e6e21337d232 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-634c38a6-caab-410d-8748-3ec1fd6f9cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:15.739 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:15.740 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:30:15 np0005465988 nova_compute[236126]: 2025-10-02 12:30:15.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:15.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:16 np0005465988 nova_compute[236126]: 2025-10-02 12:30:16.090 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:16.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:30:16 np0005465988 nova_compute[236126]: 2025-10-02 12:30:16.259 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408216.2587109, 634c38a6-caab-410d-8748-3ec1fd6f9cdc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:16 np0005465988 nova_compute[236126]: 2025-10-02 12:30:16.260 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:30:16 np0005465988 nova_compute[236126]: 2025-10-02 12:30:16.263 2 DEBUG nova.compute.manager [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:30:16 np0005465988 nova_compute[236126]: 2025-10-02 12:30:16.268 2 INFO nova.virt.libvirt.driver [-] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Instance running successfully.#033[00m
Oct  2 08:30:16 np0005465988 virtqemud[235689]: argument unsupported: QEMU guest agent is not configured
Oct  2 08:30:16 np0005465988 nova_compute[236126]: 2025-10-02 12:30:16.271 2 DEBUG nova.virt.libvirt.guest [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  2 08:30:16 np0005465988 nova_compute[236126]: 2025-10-02 12:30:16.272 2 DEBUG nova.virt.libvirt.driver [None req-3a7465cc-47da-4e29-aa69-ae3bd8d529f0 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Oct  2 08:30:16 np0005465988 nova_compute[236126]: 2025-10-02 12:30:16.290 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:16 np0005465988 nova_compute[236126]: 2025-10-02 12:30:16.296 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:30:16 np0005465988 nova_compute[236126]: 2025-10-02 12:30:16.321 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Oct  2 08:30:16 np0005465988 nova_compute[236126]: 2025-10-02 12:30:16.322 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408216.2622654, 634c38a6-caab-410d-8748-3ec1fd6f9cdc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:16 np0005465988 nova_compute[236126]: 2025-10-02 12:30:16.322 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] VM Started (Lifecycle Event)#033[00m
Oct  2 08:30:16 np0005465988 nova_compute[236126]: 2025-10-02 12:30:16.394 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:16 np0005465988 nova_compute[236126]: 2025-10-02 12:30:16.400 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.055 2 DEBUG nova.network.neutron [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Updating instance_info_cache with network_info: [{"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.087 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Releasing lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.087 2 DEBUG nova.compute.manager [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Instance network_info: |[{"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.087 2 DEBUG oslo_concurrency.lockutils [req-f9835275-3c99-4603-bde3-5476d6b3da5f req-4ef6adef-548e-4077-bbc2-8bcf44c0939f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.087 2 DEBUG nova.network.neutron [req-f9835275-3c99-4603-bde3-5476d6b3da5f req-4ef6adef-548e-4077-bbc2-8bcf44c0939f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Refreshing network info cache for port 386c73f3-c5a1-4edb-894f-841beabaecbd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.090 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Start _get_guest_xml network_info=[{"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.093 2 WARNING nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.099 2 DEBUG nova.virt.libvirt.host [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.099 2 DEBUG nova.virt.libvirt.host [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.103 2 DEBUG nova.virt.libvirt.host [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.104 2 DEBUG nova.virt.libvirt.host [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.105 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.105 2 DEBUG nova.virt.hardware [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.105 2 DEBUG nova.virt.hardware [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.105 2 DEBUG nova.virt.hardware [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.106 2 DEBUG nova.virt.hardware [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.106 2 DEBUG nova.virt.hardware [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.106 2 DEBUG nova.virt.hardware [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.106 2 DEBUG nova.virt.hardware [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.106 2 DEBUG nova.virt.hardware [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.106 2 DEBUG nova.virt.hardware [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.106 2 DEBUG nova.virt.hardware [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.107 2 DEBUG nova.virt.hardware [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.109 2 DEBUG oslo_concurrency.processutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:30:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1757451483' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.578 2 DEBUG oslo_concurrency.processutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.617 2 DEBUG nova.storage.rbd_utils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] rbd image c8b713f4-4f41-4153-928c-164f2ed108ed_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.623 2 DEBUG oslo_concurrency.processutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:30:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:17.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.867 2 DEBUG nova.compute.manager [req-8545b034-06b4-4609-9463-98a03ba04bfc req-893cc4ae-da0f-4bd4-9e46-4a3f3dde442d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received event network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.867 2 DEBUG oslo_concurrency.lockutils [req-8545b034-06b4-4609-9463-98a03ba04bfc req-893cc4ae-da0f-4bd4-9e46-4a3f3dde442d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.868 2 DEBUG oslo_concurrency.lockutils [req-8545b034-06b4-4609-9463-98a03ba04bfc req-893cc4ae-da0f-4bd4-9e46-4a3f3dde442d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.868 2 DEBUG oslo_concurrency.lockutils [req-8545b034-06b4-4609-9463-98a03ba04bfc req-893cc4ae-da0f-4bd4-9e46-4a3f3dde442d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.868 2 DEBUG nova.compute.manager [req-8545b034-06b4-4609-9463-98a03ba04bfc req-893cc4ae-da0f-4bd4-9e46-4a3f3dde442d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] No waiting events found dispatching network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:17 np0005465988 nova_compute[236126]: 2025-10-02 12:30:17.868 2 WARNING nova.compute.manager [req-8545b034-06b4-4609-9463-98a03ba04bfc req-893cc4ae-da0f-4bd4-9e46-4a3f3dde442d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received unexpected event network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 for instance with vm_state resized and task_state resize_reverting.#033[00m
Oct  2 08:30:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:30:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3727581969' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.096 2 DEBUG oslo_concurrency.processutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.098 2 DEBUG nova.virt.libvirt.vif [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:30:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=115,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBEmvr2XQXDnaV0WQDbbXt57cEK6okdC4PHEYdjpQBx2HU9OQgvgRTm3sGWmsa/AInUTPV9ABsCq2lJ9PCqfb1WP51XCZeB9QBIxafEy8h788huF0550ajkopZIwmSLpiA==',key_name='tempest-keypair-425033456',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5533aaac08cd4856af72ef4992bb5e76',ramdisk_id='',reservation_id='r-1m21sn7g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_a
llocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-1564585024',owner_user_name='tempest-AttachVolumeMultiAttachTest-1564585024-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:30:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22d56fcd2a4b4851bfd126ae4548ee9b',uuid=c8b713f4-4f41-4153-928c-164f2ed108ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.098 2 DEBUG nova.network.os_vif_util [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converting VIF {"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.099 2 DEBUG nova.network.os_vif_util [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:65:0d,bridge_name='br-int',has_traffic_filtering=True,id=386c73f3-c5a1-4edb-894f-841beabaecbd,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap386c73f3-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.102 2 DEBUG nova.objects.instance [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'pci_devices' on Instance uuid c8b713f4-4f41-4153-928c-164f2ed108ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:30:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.171 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  <uuid>c8b713f4-4f41-4153-928c-164f2ed108ed</uuid>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  <name>instance-00000073</name>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <nova:name>multiattach-server-1</nova:name>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:30:17</nova:creationTime>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <nova:user uuid="22d56fcd2a4b4851bfd126ae4548ee9b">tempest-AttachVolumeMultiAttachTest-1564585024-project-member</nova:user>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <nova:project uuid="5533aaac08cd4856af72ef4992bb5e76">tempest-AttachVolumeMultiAttachTest-1564585024</nova:project>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <nova:port uuid="386c73f3-c5a1-4edb-894f-841beabaecbd">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <entry name="serial">c8b713f4-4f41-4153-928c-164f2ed108ed</entry>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <entry name="uuid">c8b713f4-4f41-4153-928c-164f2ed108ed</entry>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/c8b713f4-4f41-4153-928c-164f2ed108ed_disk">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/c8b713f4-4f41-4153-928c-164f2ed108ed_disk.config">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:94:65:0d"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <target dev="tap386c73f3-c5"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed/console.log" append="off"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:30:18 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:30:18 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:30:18 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:30:18 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.172 2 DEBUG nova.compute.manager [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Preparing to wait for external event network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.173 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.173 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.174 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.175 2 DEBUG nova.virt.libvirt.vif [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:30:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=115,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBEmvr2XQXDnaV0WQDbbXt57cEK6okdC4PHEYdjpQBx2HU9OQgvgRTm3sGWmsa/AInUTPV9ABsCq2lJ9PCqfb1WP51XCZeB9QBIxafEy8h788huF0550ajkopZIwmSLpiA==',key_name='tempest-keypair-425033456',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5533aaac08cd4856af72ef4992bb5e76',ramdisk_id='',reservation_id='r-1m21sn7g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0'
,network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-1564585024',owner_user_name='tempest-AttachVolumeMultiAttachTest-1564585024-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:30:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22d56fcd2a4b4851bfd126ae4548ee9b',uuid=c8b713f4-4f41-4153-928c-164f2ed108ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.175 2 DEBUG nova.network.os_vif_util [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converting VIF {"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.176 2 DEBUG nova.network.os_vif_util [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:65:0d,bridge_name='br-int',has_traffic_filtering=True,id=386c73f3-c5a1-4edb-894f-841beabaecbd,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap386c73f3-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.177 2 DEBUG os_vif [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:65:0d,bridge_name='br-int',has_traffic_filtering=True,id=386c73f3-c5a1-4edb-894f-841beabaecbd,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap386c73f3-c5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.179 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.179 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:18.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.186 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap386c73f3-c5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.187 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap386c73f3-c5, col_values=(('external_ids', {'iface-id': '386c73f3-c5a1-4edb-894f-841beabaecbd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:94:65:0d', 'vm-uuid': 'c8b713f4-4f41-4153-928c-164f2ed108ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:18 np0005465988 NetworkManager[45041]: <info>  [1759408218.1908] manager: (tap386c73f3-c5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/243)
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.209 2 INFO os_vif [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:65:0d,bridge_name='br-int',has_traffic_filtering=True,id=386c73f3-c5a1-4edb-894f-841beabaecbd,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap386c73f3-c5')#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.431 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.432 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.432 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No VIF found with MAC fa:16:3e:94:65:0d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.433 2 INFO nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Using config drive#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.476 2 DEBUG nova.storage.rbd_utils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] rbd image c8b713f4-4f41-4153-928c-164f2ed108ed_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e309 e309: 3 total, 3 up, 3 in
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.846 2 DEBUG nova.network.neutron [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Port b76b92a4-1882-4f89-94f4-3a4700f9c379 binding to destination host compute-2.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.847 2 DEBUG oslo_concurrency.lockutils [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "refresh_cache-634c38a6-caab-410d-8748-3ec1fd6f9cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.848 2 DEBUG oslo_concurrency.lockutils [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquired lock "refresh_cache-634c38a6-caab-410d-8748-3ec1fd6f9cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:18 np0005465988 nova_compute[236126]: 2025-10-02 12:30:18.848 2 DEBUG nova.network.neutron [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:30:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:30:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:30:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:19.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:20 np0005465988 nova_compute[236126]: 2025-10-02 12:30:20.056 2 INFO nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Creating config drive at /var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed/disk.config#033[00m
Oct  2 08:30:20 np0005465988 nova_compute[236126]: 2025-10-02 12:30:20.066 2 DEBUG oslo_concurrency.processutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpse188l37 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:20 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:30:20 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:30:20 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:30:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:20.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:20 np0005465988 nova_compute[236126]: 2025-10-02 12:30:20.216 2 DEBUG oslo_concurrency.processutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpse188l37" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:20 np0005465988 nova_compute[236126]: 2025-10-02 12:30:20.253 2 DEBUG nova.storage.rbd_utils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] rbd image c8b713f4-4f41-4153-928c-164f2ed108ed_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:30:20 np0005465988 nova_compute[236126]: 2025-10-02 12:30:20.257 2 DEBUG oslo_concurrency.processutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed/disk.config c8b713f4-4f41-4153-928c-164f2ed108ed_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:20 np0005465988 nova_compute[236126]: 2025-10-02 12:30:20.459 2 DEBUG oslo_concurrency.processutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed/disk.config c8b713f4-4f41-4153-928c-164f2ed108ed_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.202s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:20 np0005465988 nova_compute[236126]: 2025-10-02 12:30:20.461 2 INFO nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Deleting local config drive /var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed/disk.config because it was imported into RBD.#033[00m
Oct  2 08:30:20 np0005465988 kernel: tap386c73f3-c5: entered promiscuous mode
Oct  2 08:30:20 np0005465988 nova_compute[236126]: 2025-10-02 12:30:20.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:20Z|00520|binding|INFO|Claiming lport 386c73f3-c5a1-4edb-894f-841beabaecbd for this chassis.
Oct  2 08:30:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:20Z|00521|binding|INFO|386c73f3-c5a1-4edb-894f-841beabaecbd: Claiming fa:16:3e:94:65:0d 10.100.0.4
Oct  2 08:30:20 np0005465988 NetworkManager[45041]: <info>  [1759408220.5442] manager: (tap386c73f3-c5): new Tun device (/org/freedesktop/NetworkManager/Devices/244)
Oct  2 08:30:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:20.568 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:65:0d 10.100.0.4'], port_security=['fa:16:3e:94:65:0d 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c8b713f4-4f41-4153-928c-164f2ed108ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-585473f8-52e4-4e55-96df-8a236d361126', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5533aaac08cd4856af72ef4992bb5e76', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0a7e36b3-799e-47d8-a152-7f7146431afe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec297f04-3bda-490f-87d3-1f684caf96fd, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=386c73f3-c5a1-4edb-894f-841beabaecbd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:20.571 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 386c73f3-c5a1-4edb-894f-841beabaecbd in datapath 585473f8-52e4-4e55-96df-8a236d361126 bound to our chassis#033[00m
Oct  2 08:30:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:20.576 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 585473f8-52e4-4e55-96df-8a236d361126#033[00m
Oct  2 08:30:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:20Z|00522|binding|INFO|Setting lport 386c73f3-c5a1-4edb-894f-841beabaecbd ovn-installed in OVS
Oct  2 08:30:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:20Z|00523|binding|INFO|Setting lport 386c73f3-c5a1-4edb-894f-841beabaecbd up in Southbound
Oct  2 08:30:20 np0005465988 nova_compute[236126]: 2025-10-02 12:30:20.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:20 np0005465988 nova_compute[236126]: 2025-10-02 12:30:20.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:20.604 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[16c34ac5-2628-4478-b756-364c8216284c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:20 np0005465988 systemd-machined[192594]: New machine qemu-51-instance-00000073.
Oct  2 08:30:20 np0005465988 systemd-udevd[286834]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:30:20 np0005465988 systemd[1]: Started Virtual Machine qemu-51-instance-00000073.
Oct  2 08:30:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:20.644 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8e9ee843-7b70-42b2-8926-725fb81ac6c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:20.648 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d61d16ce-58e8-49ca-96a2-ccb24351673e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:20 np0005465988 NetworkManager[45041]: <info>  [1759408220.6508] device (tap386c73f3-c5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:30:20 np0005465988 NetworkManager[45041]: <info>  [1759408220.6518] device (tap386c73f3-c5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:30:20 np0005465988 podman[286805]: 2025-10-02 12:30:20.691346889 +0000 UTC m=+0.112249330 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:30:20 np0005465988 podman[286806]: 2025-10-02 12:30:20.691696679 +0000 UTC m=+0.106986418 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2)
Oct  2 08:30:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:20.693 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d4921be1-62f0-470c-b0db-93ccf0cdaef4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:20 np0005465988 podman[286804]: 2025-10-02 12:30:20.701639916 +0000 UTC m=+0.124731900 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001)
Oct  2 08:30:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:20.719 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a68036c9-83e9-4a5b-b467-2d8534bbd1c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap585473f8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:8e:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624727, 'reachable_time': 36955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286880, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:20.742 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0325436b-fd2c-41b4-b7ce-210ea4118f67]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap585473f8-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624742, 'tstamp': 624742}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286882, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap585473f8-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624747, 'tstamp': 624747}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286882, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:20.743 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap585473f8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:20 np0005465988 nova_compute[236126]: 2025-10-02 12:30:20.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:20 np0005465988 nova_compute[236126]: 2025-10-02 12:30:20.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:20.747 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap585473f8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:20.748 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:20.748 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap585473f8-50, col_values=(('external_ids', {'iface-id': '02b7597d-2fc1-4c56-8603-4dcb0c716c82'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:20.748 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.581 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408221.5809653, c8b713f4-4f41-4153-928c-164f2ed108ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.582 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] VM Started (Lifecycle Event)#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.615 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.621 2 DEBUG nova.network.neutron [req-f9835275-3c99-4603-bde3-5476d6b3da5f req-4ef6adef-548e-4077-bbc2-8bcf44c0939f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Updated VIF entry in instance network info cache for port 386c73f3-c5a1-4edb-894f-841beabaecbd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.621 2 DEBUG nova.network.neutron [req-f9835275-3c99-4603-bde3-5476d6b3da5f req-4ef6adef-548e-4077-bbc2-8bcf44c0939f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Updating instance_info_cache with network_info: [{"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.624 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408221.5811787, c8b713f4-4f41-4153-928c-164f2ed108ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.625 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.671 2 DEBUG oslo_concurrency.lockutils [req-f9835275-3c99-4603-bde3-5476d6b3da5f req-4ef6adef-548e-4077-bbc2-8bcf44c0939f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.677 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.682 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.705 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:30:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:21.743 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.752 2 DEBUG nova.compute.manager [req-fcf1b6d4-3254-475c-8b07-48c2aef239bf req-e17aca43-d147-4c46-9256-5f3118719c83 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received event network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.753 2 DEBUG oslo_concurrency.lockutils [req-fcf1b6d4-3254-475c-8b07-48c2aef239bf req-e17aca43-d147-4c46-9256-5f3118719c83 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.753 2 DEBUG oslo_concurrency.lockutils [req-fcf1b6d4-3254-475c-8b07-48c2aef239bf req-e17aca43-d147-4c46-9256-5f3118719c83 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.755 2 DEBUG oslo_concurrency.lockutils [req-fcf1b6d4-3254-475c-8b07-48c2aef239bf req-e17aca43-d147-4c46-9256-5f3118719c83 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.755 2 DEBUG nova.compute.manager [req-fcf1b6d4-3254-475c-8b07-48c2aef239bf req-e17aca43-d147-4c46-9256-5f3118719c83 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Processing event network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.757 2 DEBUG nova.compute.manager [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.770 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408221.7705271, c8b713f4-4f41-4153-928c-164f2ed108ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.771 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.776 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.780 2 INFO nova.virt.libvirt.driver [-] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Instance spawned successfully.#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.781 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.800 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.809 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.812 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.813 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.813 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.814 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.814 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.815 2 DEBUG nova.virt.libvirt.driver [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:21.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.851 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.938 2 INFO nova.compute.manager [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Took 16.04 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:30:21 np0005465988 nova_compute[236126]: 2025-10-02 12:30:21.938 2 DEBUG nova.compute.manager [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:22 np0005465988 nova_compute[236126]: 2025-10-02 12:30:22.061 2 INFO nova.compute.manager [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Took 17.83 seconds to build instance.#033[00m
Oct  2 08:30:22 np0005465988 nova_compute[236126]: 2025-10-02 12:30:22.085 2 DEBUG oslo_concurrency.lockutils [None req-4cb3cde2-0fd0-4232-afce-522c762331c1 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:22.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:30:23 np0005465988 nova_compute[236126]: 2025-10-02 12:30:23.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:23 np0005465988 nova_compute[236126]: 2025-10-02 12:30:23.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:23.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:24.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.261 2 DEBUG nova.compute.manager [req-2696dc0f-6bb0-4ef0-bb6f-ebc8c728ccd8 req-31a4f502-b097-4c69-8073-247bd5ae146f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received event network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.262 2 DEBUG oslo_concurrency.lockutils [req-2696dc0f-6bb0-4ef0-bb6f-ebc8c728ccd8 req-31a4f502-b097-4c69-8073-247bd5ae146f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.262 2 DEBUG oslo_concurrency.lockutils [req-2696dc0f-6bb0-4ef0-bb6f-ebc8c728ccd8 req-31a4f502-b097-4c69-8073-247bd5ae146f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.262 2 DEBUG oslo_concurrency.lockutils [req-2696dc0f-6bb0-4ef0-bb6f-ebc8c728ccd8 req-31a4f502-b097-4c69-8073-247bd5ae146f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.262 2 DEBUG nova.compute.manager [req-2696dc0f-6bb0-4ef0-bb6f-ebc8c728ccd8 req-31a4f502-b097-4c69-8073-247bd5ae146f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] No waiting events found dispatching network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.263 2 WARNING nova.compute.manager [req-2696dc0f-6bb0-4ef0-bb6f-ebc8c728ccd8 req-31a4f502-b097-4c69-8073-247bd5ae146f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received unexpected event network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd for instance with vm_state active and task_state None.#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.426 2 DEBUG nova.network.neutron [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Updating instance_info_cache with network_info: [{"id": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "address": "fa:16:3e:23:4b:2a", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb76b92a4-18", "ovs_interfaceid": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.462 2 DEBUG oslo_concurrency.lockutils [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Releasing lock "refresh_cache-634c38a6-caab-410d-8748-3ec1fd6f9cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:24 np0005465988 kernel: tapb76b92a4-18 (unregistering): left promiscuous mode
Oct  2 08:30:24 np0005465988 NetworkManager[45041]: <info>  [1759408224.5737] device (tapb76b92a4-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:30:24 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:24Z|00524|binding|INFO|Releasing lport b76b92a4-1882-4f89-94f4-3a4700f9c379 from this chassis (sb_readonly=0)
Oct  2 08:30:24 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:24Z|00525|binding|INFO|Setting lport b76b92a4-1882-4f89-94f4-3a4700f9c379 down in Southbound
Oct  2 08:30:24 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:24Z|00526|binding|INFO|Removing iface tapb76b92a4-18 ovn-installed in OVS
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:24.604 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:4b:2a 10.100.0.10'], port_security=['fa:16:3e:23:4b:2a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '634c38a6-caab-410d-8748-3ec1fd6f9cdc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8ca48cb5f64ef3b0736b8be82378b8', 'neutron:revision_number': '8', 'neutron:security_group_ids': '16cf92c4-b852-4373-864a-75cf05995c6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ff60d5-33e2-45e9-a563-ce0081b7cc04, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=b76b92a4-1882-4f89-94f4-3a4700f9c379) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:24.605 142124 INFO neutron.agent.ovn.metadata.agent [-] Port b76b92a4-1882-4f89-94f4-3a4700f9c379 in datapath b2c62a66-f9bc-4a45-a843-aef2e12a7fff unbound from our chassis#033[00m
Oct  2 08:30:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:24.606 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:30:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:24.607 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e6e861e5-2e07-419a-ad47-523f434c0f05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:24.608 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff namespace which is not needed anymore#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:24 np0005465988 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000070.scope: Deactivated successfully.
Oct  2 08:30:24 np0005465988 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000070.scope: Consumed 9.905s CPU time.
Oct  2 08:30:24 np0005465988 systemd-machined[192594]: Machine qemu-50-instance-00000070 terminated.
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.754 2 INFO nova.virt.libvirt.driver [-] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Instance destroyed successfully.#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.755 2 DEBUG nova.objects.instance [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'resources' on Instance uuid 634c38a6-caab-410d-8748-3ec1fd6f9cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.777 2 DEBUG nova.virt.libvirt.vif [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:28:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1187258695',display_name='tempest-ServerActionsTestJSON-server-1187258695',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1187258695',id=112,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGPR3K1CRfy8NuVFf5q7pocTRVWdkUGpXAwygQtld+YJHetWc29OTANCyHNFo7YQv+XOnhZt50IrR5VORh36EWFuQsqglHMqAdAjWfcmv1078iyeLOwVFkKMjcTINNdhYA==',key_name='tempest-keypair-579122016',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:30:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4b8ca48cb5f64ef3b0736b8be82378b8',ramdisk_id='',reservation_id='r-8nz9nuro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-842270816',owner_user_name='tempest-ServerActionsTestJSON-842270816-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:30:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2bd16d1f5f9d4eb396c474eedee67165',uuid=634c38a6-caab-410d-8748-3ec1fd6f9cdc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "address": "fa:16:3e:23:4b:2a", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb76b92a4-18", "ovs_interfaceid": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.778 2 DEBUG nova.network.os_vif_util [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converting VIF {"id": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "address": "fa:16:3e:23:4b:2a", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb76b92a4-18", "ovs_interfaceid": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.778 2 DEBUG nova.network.os_vif_util [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:23:4b:2a,bridge_name='br-int',has_traffic_filtering=True,id=b76b92a4-1882-4f89-94f4-3a4700f9c379,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb76b92a4-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.779 2 DEBUG os_vif [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:4b:2a,bridge_name='br-int',has_traffic_filtering=True,id=b76b92a4-1882-4f89-94f4-3a4700f9c379,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb76b92a4-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.782 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb76b92a4-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.787 2 INFO os_vif [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:4b:2a,bridge_name='br-int',has_traffic_filtering=True,id=b76b92a4-1882-4f89-94f4-3a4700f9c379,network=Network(b2c62a66-f9bc-4a45-a843-aef2e12a7fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb76b92a4-18')#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.792 2 DEBUG oslo_concurrency.lockutils [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.793 2 DEBUG oslo_concurrency.lockutils [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:24 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[286481]: [NOTICE]   (286485) : haproxy version is 2.8.14-c23fe91
Oct  2 08:30:24 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[286481]: [NOTICE]   (286485) : path to executable is /usr/sbin/haproxy
Oct  2 08:30:24 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[286481]: [WARNING]  (286485) : Exiting Master process...
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.833 2 DEBUG nova.objects.instance [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lazy-loading 'migration_context' on Instance uuid 634c38a6-caab-410d-8748-3ec1fd6f9cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:24 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[286481]: [WARNING]  (286485) : Exiting Master process...
Oct  2 08:30:24 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[286481]: [ALERT]    (286485) : Current worker (286487) exited with code 143 (Terminated)
Oct  2 08:30:24 np0005465988 neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff[286481]: [WARNING]  (286485) : All workers exited. Exiting... (0)
Oct  2 08:30:24 np0005465988 systemd[1]: libpod-c9190390f775e2d672b95eda03f1fa5d061e447f26404a706a6927cdd61b0f0a.scope: Deactivated successfully.
Oct  2 08:30:24 np0005465988 podman[286952]: 2025-10-02 12:30:24.844638667 +0000 UTC m=+0.088805086 container died c9190390f775e2d672b95eda03f1fa5d061e447f26404a706a6927cdd61b0f0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:30:24 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c9190390f775e2d672b95eda03f1fa5d061e447f26404a706a6927cdd61b0f0a-userdata-shm.mount: Deactivated successfully.
Oct  2 08:30:24 np0005465988 systemd[1]: var-lib-containers-storage-overlay-661a27526f82d521eec7df9ccf96b35bdd58585f6f3370129c12ba143cdfef77-merged.mount: Deactivated successfully.
Oct  2 08:30:24 np0005465988 podman[286952]: 2025-10-02 12:30:24.8916787 +0000 UTC m=+0.135845129 container cleanup c9190390f775e2d672b95eda03f1fa5d061e447f26404a706a6927cdd61b0f0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:30:24 np0005465988 systemd[1]: libpod-conmon-c9190390f775e2d672b95eda03f1fa5d061e447f26404a706a6927cdd61b0f0a.scope: Deactivated successfully.
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.920 2 DEBUG nova.compute.manager [req-637a97d0-7a6d-4b3d-be0e-d975b447d771 req-6896983c-5f55-465d-95fd-545ba491d101 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received event network-vif-unplugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.920 2 DEBUG oslo_concurrency.lockutils [req-637a97d0-7a6d-4b3d-be0e-d975b447d771 req-6896983c-5f55-465d-95fd-545ba491d101 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.920 2 DEBUG oslo_concurrency.lockutils [req-637a97d0-7a6d-4b3d-be0e-d975b447d771 req-6896983c-5f55-465d-95fd-545ba491d101 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.920 2 DEBUG oslo_concurrency.lockutils [req-637a97d0-7a6d-4b3d-be0e-d975b447d771 req-6896983c-5f55-465d-95fd-545ba491d101 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.921 2 DEBUG nova.compute.manager [req-637a97d0-7a6d-4b3d-be0e-d975b447d771 req-6896983c-5f55-465d-95fd-545ba491d101 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] No waiting events found dispatching network-vif-unplugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.921 2 WARNING nova.compute.manager [req-637a97d0-7a6d-4b3d-be0e-d975b447d771 req-6896983c-5f55-465d-95fd-545ba491d101 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received unexpected event network-vif-unplugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 for instance with vm_state resized and task_state resize_reverting.#033[00m
Oct  2 08:30:24 np0005465988 podman[286986]: 2025-10-02 12:30:24.955544008 +0000 UTC m=+0.040706092 container remove c9190390f775e2d672b95eda03f1fa5d061e447f26404a706a6927cdd61b0f0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:30:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:24.962 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4d17af7c-d009-4657-96d1-bf4ff95a3f80]: (4, ('Thu Oct  2 12:30:24 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff (c9190390f775e2d672b95eda03f1fa5d061e447f26404a706a6927cdd61b0f0a)\nc9190390f775e2d672b95eda03f1fa5d061e447f26404a706a6927cdd61b0f0a\nThu Oct  2 12:30:24 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff (c9190390f775e2d672b95eda03f1fa5d061e447f26404a706a6927cdd61b0f0a)\nc9190390f775e2d672b95eda03f1fa5d061e447f26404a706a6927cdd61b0f0a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:24.964 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c4df7960-25f4-4d96-9194-f2ef5a9233d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:24.965 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c62a66-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:24 np0005465988 nova_compute[236126]: 2025-10-02 12:30:24.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:24 np0005465988 kernel: tapb2c62a66-f0: left promiscuous mode
Oct  2 08:30:25 np0005465988 nova_compute[236126]: 2025-10-02 12:30:25.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:25.048 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4631a3-4b4b-434a-bd78-4baabaf179b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:25.076 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb153b8-1185-4a64-a855-65c4196601af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:25.078 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9804be84-88b7-48a1-8720-606e2dd3172a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:25.100 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0f1dd4a3-aa6a-4d15-8f72-ce0e526a7934]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627504, 'reachable_time': 41750, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287047, 'error': None, 'target': 'ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:25 np0005465988 systemd[1]: run-netns-ovnmeta\x2db2c62a66\x2df9bc\x2d4a45\x2da843\x2daef2e12a7fff.mount: Deactivated successfully.
Oct  2 08:30:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:25.108 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2c62a66-f9bc-4a45-a843-aef2e12a7fff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:30:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:25.109 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[3953d9a8-8c42-4e5d-89d6-0236c744fe8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:25 np0005465988 nova_compute[236126]: 2025-10-02 12:30:25.125 2 DEBUG oslo_concurrency.processutils [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:30:25 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/359606026' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:30:25 np0005465988 nova_compute[236126]: 2025-10-02 12:30:25.600 2 DEBUG oslo_concurrency.processutils [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:25 np0005465988 nova_compute[236126]: 2025-10-02 12:30:25.612 2 DEBUG nova.compute.provider_tree [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:30:25 np0005465988 nova_compute[236126]: 2025-10-02 12:30:25.643 2 DEBUG nova.scheduler.client.report [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:30:25 np0005465988 nova_compute[236126]: 2025-10-02 12:30:25.724 2 DEBUG oslo_concurrency.lockutils [None req-be993e59-fc1b-4b7d-a999-a9d5501b7284 2bd16d1f5f9d4eb396c474eedee67165 4b8ca48cb5f64ef3b0736b8be82378b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.932s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:25.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:26.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:26 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:30:26 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:30:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:27.360 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:27.360 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:27.360 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:30:27 np0005465988 nova_compute[236126]: 2025-10-02 12:30:27.823 2 DEBUG nova.compute.manager [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-changed-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:27 np0005465988 nova_compute[236126]: 2025-10-02 12:30:27.824 2 DEBUG nova.compute.manager [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Refreshing instance network info cache due to event network-changed-7cf26487-91ca-4d15-85f3-bb6a66393796. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:30:27 np0005465988 nova_compute[236126]: 2025-10-02 12:30:27.824 2 DEBUG oslo_concurrency.lockutils [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:27 np0005465988 nova_compute[236126]: 2025-10-02 12:30:27.824 2 DEBUG oslo_concurrency.lockutils [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:27 np0005465988 nova_compute[236126]: 2025-10-02 12:30:27.825 2 DEBUG nova.network.neutron [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Refreshing network info cache for port 7cf26487-91ca-4d15-85f3-bb6a66393796 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:30:27 np0005465988 nova_compute[236126]: 2025-10-02 12:30:27.830 2 DEBUG nova.compute.manager [req-ca5de053-7c68-4521-9378-1dbe1a3adda6 req-b0e4fd2d-60fa-49f9-9f72-e3f0dc90236a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received event network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:27 np0005465988 nova_compute[236126]: 2025-10-02 12:30:27.831 2 DEBUG oslo_concurrency.lockutils [req-ca5de053-7c68-4521-9378-1dbe1a3adda6 req-b0e4fd2d-60fa-49f9-9f72-e3f0dc90236a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:27 np0005465988 nova_compute[236126]: 2025-10-02 12:30:27.832 2 DEBUG oslo_concurrency.lockutils [req-ca5de053-7c68-4521-9378-1dbe1a3adda6 req-b0e4fd2d-60fa-49f9-9f72-e3f0dc90236a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:27 np0005465988 nova_compute[236126]: 2025-10-02 12:30:27.832 2 DEBUG oslo_concurrency.lockutils [req-ca5de053-7c68-4521-9378-1dbe1a3adda6 req-b0e4fd2d-60fa-49f9-9f72-e3f0dc90236a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:27 np0005465988 nova_compute[236126]: 2025-10-02 12:30:27.832 2 DEBUG nova.compute.manager [req-ca5de053-7c68-4521-9378-1dbe1a3adda6 req-b0e4fd2d-60fa-49f9-9f72-e3f0dc90236a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] No waiting events found dispatching network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:27 np0005465988 nova_compute[236126]: 2025-10-02 12:30:27.833 2 WARNING nova.compute.manager [req-ca5de053-7c68-4521-9378-1dbe1a3adda6 req-b0e4fd2d-60fa-49f9-9f72-e3f0dc90236a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received unexpected event network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 for instance with vm_state resized and task_state resize_reverting.#033[00m
Oct  2 08:30:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:30:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:27.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:30:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:28.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:28 np0005465988 nova_compute[236126]: 2025-10-02 12:30:28.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:29 np0005465988 nova_compute[236126]: 2025-10-02 12:30:29.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:29.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:30 np0005465988 nova_compute[236126]: 2025-10-02 12:30:30.076 2 DEBUG nova.compute.manager [req-de962bcc-0700-4057-97ff-a8e1ed82dcbb req-27a64f69-1414-4ac1-92fc-1b0495829966 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received event network-changed-b76b92a4-1882-4f89-94f4-3a4700f9c379 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:30 np0005465988 nova_compute[236126]: 2025-10-02 12:30:30.077 2 DEBUG nova.compute.manager [req-de962bcc-0700-4057-97ff-a8e1ed82dcbb req-27a64f69-1414-4ac1-92fc-1b0495829966 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Refreshing instance network info cache due to event network-changed-b76b92a4-1882-4f89-94f4-3a4700f9c379. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:30:30 np0005465988 nova_compute[236126]: 2025-10-02 12:30:30.079 2 DEBUG oslo_concurrency.lockutils [req-de962bcc-0700-4057-97ff-a8e1ed82dcbb req-27a64f69-1414-4ac1-92fc-1b0495829966 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-634c38a6-caab-410d-8748-3ec1fd6f9cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:30 np0005465988 nova_compute[236126]: 2025-10-02 12:30:30.080 2 DEBUG oslo_concurrency.lockutils [req-de962bcc-0700-4057-97ff-a8e1ed82dcbb req-27a64f69-1414-4ac1-92fc-1b0495829966 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-634c38a6-caab-410d-8748-3ec1fd6f9cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:30 np0005465988 nova_compute[236126]: 2025-10-02 12:30:30.080 2 DEBUG nova.network.neutron [req-de962bcc-0700-4057-97ff-a8e1ed82dcbb req-27a64f69-1414-4ac1-92fc-1b0495829966 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Refreshing network info cache for port b76b92a4-1882-4f89-94f4-3a4700f9c379 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:30:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:30.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:31 np0005465988 nova_compute[236126]: 2025-10-02 12:30:31.023 2 DEBUG nova.network.neutron [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updated VIF entry in instance network info cache for port 7cf26487-91ca-4d15-85f3-bb6a66393796. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:30:31 np0005465988 nova_compute[236126]: 2025-10-02 12:30:31.023 2 DEBUG nova.network.neutron [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating instance_info_cache with network_info: [{"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:31 np0005465988 nova_compute[236126]: 2025-10-02 12:30:31.068 2 DEBUG oslo_concurrency.lockutils [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:31 np0005465988 nova_compute[236126]: 2025-10-02 12:30:31.069 2 DEBUG nova.compute.manager [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received event network-changed-386c73f3-c5a1-4edb-894f-841beabaecbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:31 np0005465988 nova_compute[236126]: 2025-10-02 12:30:31.069 2 DEBUG nova.compute.manager [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Refreshing instance network info cache due to event network-changed-386c73f3-c5a1-4edb-894f-841beabaecbd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:30:31 np0005465988 nova_compute[236126]: 2025-10-02 12:30:31.070 2 DEBUG oslo_concurrency.lockutils [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:31 np0005465988 nova_compute[236126]: 2025-10-02 12:30:31.070 2 DEBUG oslo_concurrency.lockutils [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:31 np0005465988 nova_compute[236126]: 2025-10-02 12:30:31.070 2 DEBUG nova.network.neutron [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Refreshing network info cache for port 386c73f3-c5a1-4edb-894f-841beabaecbd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:30:31 np0005465988 nova_compute[236126]: 2025-10-02 12:30:31.737 2 DEBUG oslo_concurrency.lockutils [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:31 np0005465988 nova_compute[236126]: 2025-10-02 12:30:31.737 2 DEBUG oslo_concurrency.lockutils [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:31 np0005465988 nova_compute[236126]: 2025-10-02 12:30:31.824 2 DEBUG nova.objects.instance [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'flavor' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:31.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:30:31 np0005465988 nova_compute[236126]: 2025-10-02 12:30:31.912 2 DEBUG oslo_concurrency.lockutils [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:32.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:32 np0005465988 podman[287128]: 2025-10-02 12:30:32.541228975 +0000 UTC m=+0.071055205 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:30:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:30:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e310 e310: 3 total, 3 up, 3 in
Oct  2 08:30:33 np0005465988 nova_compute[236126]: 2025-10-02 12:30:33.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:33 np0005465988 nova_compute[236126]: 2025-10-02 12:30:33.729 2 DEBUG oslo_concurrency.lockutils [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:33 np0005465988 nova_compute[236126]: 2025-10-02 12:30:33.730 2 DEBUG oslo_concurrency.lockutils [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:33 np0005465988 nova_compute[236126]: 2025-10-02 12:30:33.731 2 INFO nova.compute.manager [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Attaching volume 1f1fe097-f4b6-4748-bf18-8e487e0f3ba6 to /dev/vdb#033[00m
Oct  2 08:30:33 np0005465988 nova_compute[236126]: 2025-10-02 12:30:33.753 2 DEBUG nova.network.neutron [req-de962bcc-0700-4057-97ff-a8e1ed82dcbb req-27a64f69-1414-4ac1-92fc-1b0495829966 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Updated VIF entry in instance network info cache for port b76b92a4-1882-4f89-94f4-3a4700f9c379. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:30:33 np0005465988 nova_compute[236126]: 2025-10-02 12:30:33.753 2 DEBUG nova.network.neutron [req-de962bcc-0700-4057-97ff-a8e1ed82dcbb req-27a64f69-1414-4ac1-92fc-1b0495829966 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Updating instance_info_cache with network_info: [{"id": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "address": "fa:16:3e:23:4b:2a", "network": {"id": "b2c62a66-f9bc-4a45-a843-aef2e12a7fff", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1352928597-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8ca48cb5f64ef3b0736b8be82378b8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb76b92a4-18", "ovs_interfaceid": "b76b92a4-1882-4f89-94f4-3a4700f9c379", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:33 np0005465988 nova_compute[236126]: 2025-10-02 12:30:33.776 2 DEBUG oslo_concurrency.lockutils [req-de962bcc-0700-4057-97ff-a8e1ed82dcbb req-27a64f69-1414-4ac1-92fc-1b0495829966 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-634c38a6-caab-410d-8748-3ec1fd6f9cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:33.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.023 2 DEBUG os_brick.utils [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.027 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.046 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.046 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[333f48a7-bb3f-4a64-8055-99664e77646d]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.048 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.054 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.055 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[c87cf37a-aaf4-4ecc-baa1-da7d61667c8f]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.056 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.066 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.066 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[339558d7-b084-4692-aa6e-96fcb0faabe9]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.067 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[29760578-876c-42c6-8f30-2e905c67738c]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.067 2 DEBUG oslo_concurrency.processutils [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.104 2 DEBUG oslo_concurrency.processutils [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "nvme version" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.106 2 DEBUG os_brick.initiator.connectors.lightos [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.107 2 DEBUG os_brick.initiator.connectors.lightos [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.107 2 DEBUG os_brick.initiator.connectors.lightos [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.107 2 DEBUG os_brick.utils [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] <== get_connector_properties: return (83ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
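Editor's note: the get_connector_properties return value above is the connector dict os_brick hands to the volume driver; different transports consume different fields (RBD needs only host/ip, iSCSI the initiator IQN, NVMe-oF the host NQN). The mapping function below is a hypothetical sketch for illustration, with values copied from the log line:

```python
# Subset of the connector properties returned in the log above.
connector = {
    "platform": "x86_64",
    "os_type": "linux",
    "ip": "192.168.122.102",
    "host": "compute-2.ctlplane.example.com",
    "multipath": True,
    "initiator": "iqn.1994-05.com.redhat:7daf2c659dfe",
    "nqn": "nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0",
    "nvme_native_multipath": True,
}

def required_keys(protocol: str) -> set:
    # Hypothetical mapping: which connector fields each transport consumes.
    return {
        "rbd": {"host", "ip"},
        "iscsi": {"initiator", "host", "ip"},
        "nvmeof": {"nqn", "host", "ip"},
    }[protocol]

# The Ceph RBD attach that follows in the log needs only host/ip from this dict.
print(required_keys("rbd") <= connector.keys())  # True
```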
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.108 2 DEBUG nova.virt.block_device [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating existing volume attachment record: 8c319c65-041c-4a29-b9df-3c90bdb8a438 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:30:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:34.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
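Editor's note: the radosgw "beast" lines are access-log records (client addr, user, timestamp, request line, status, bytes, latency); the HEAD / probes every second or two are load-balancer health checks. A sketch parser for this line shape (the regex is inferred from the samples above, not an official format spec):

```python
import re

# Inferred beast access-log shape:
# beast: <ptr>: <addr> - <user> [<ts>] "<request>" <status> <bytes> - - - latency=<sec>s
BEAST = re.compile(
    r'beast: \S+: (?P<addr>\S+) - (?P<user>\S+) \[(?P<ts>[^\]]+)\] '
    r'"(?P<req>[^"]+)" (?P<status>\d+) (?P<bytes>\d+).*latency=(?P<latency>[\d.]+)s')

line = ('beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous '
        '[02/Oct/2025:12:30:34.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - '
        'latency=0.001000029s')

m = BEAST.search(line)
print(m.group("addr"), m.group("status"), m.group("latency"))
# → 192.168.122.100 200 0.001000029
```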
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.754 2 DEBUG nova.network.neutron [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Updated VIF entry in instance network info cache for port 386c73f3-c5a1-4edb-894f-841beabaecbd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.755 2 DEBUG nova.network.neutron [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Updating instance_info_cache with network_info: [{"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:34 np0005465988 nova_compute[236126]: 2025-10-02 12:30:34.962 2 DEBUG oslo_concurrency.lockutils [req-7a0fab10-7e9f-4040-9bd9-72348eafd3d3 req-ca4fab89-9e62-46f7-9a4b-6835bc79f4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:30:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:35.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:30:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:36.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:30:36 np0005465988 nova_compute[236126]: 2025-10-02 12:30:36.644 2 DEBUG nova.objects.instance [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'flavor' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:36 np0005465988 nova_compute[236126]: 2025-10-02 12:30:36.748 2 DEBUG nova.compute.manager [req-cf78ef90-3bba-4ba9-83a0-352452511938 req-cb3b7ac3-258e-4841-980a-f20dc0aa49db d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received event network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:36 np0005465988 nova_compute[236126]: 2025-10-02 12:30:36.748 2 DEBUG oslo_concurrency.lockutils [req-cf78ef90-3bba-4ba9-83a0-352452511938 req-cb3b7ac3-258e-4841-980a-f20dc0aa49db d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:36 np0005465988 nova_compute[236126]: 2025-10-02 12:30:36.749 2 DEBUG oslo_concurrency.lockutils [req-cf78ef90-3bba-4ba9-83a0-352452511938 req-cb3b7ac3-258e-4841-980a-f20dc0aa49db d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:36 np0005465988 nova_compute[236126]: 2025-10-02 12:30:36.749 2 DEBUG oslo_concurrency.lockutils [req-cf78ef90-3bba-4ba9-83a0-352452511938 req-cb3b7ac3-258e-4841-980a-f20dc0aa49db d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "634c38a6-caab-410d-8748-3ec1fd6f9cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:36 np0005465988 nova_compute[236126]: 2025-10-02 12:30:36.749 2 DEBUG nova.compute.manager [req-cf78ef90-3bba-4ba9-83a0-352452511938 req-cb3b7ac3-258e-4841-980a-f20dc0aa49db d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] No waiting events found dispatching network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:36 np0005465988 nova_compute[236126]: 2025-10-02 12:30:36.750 2 WARNING nova.compute.manager [req-cf78ef90-3bba-4ba9-83a0-352452511938 req-cb3b7ac3-258e-4841-980a-f20dc0aa49db d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Received unexpected event network-vif-plugged-b76b92a4-1882-4f89-94f4-3a4700f9c379 for instance with vm_state resized and task_state resize_reverting.#033[00m
Oct  2 08:30:36 np0005465988 nova_compute[236126]: 2025-10-02 12:30:36.780 2 DEBUG nova.virt.libvirt.driver [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Attempting to attach volume 1f1fe097-f4b6-4748-bf18-8e487e0f3ba6 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Oct  2 08:30:36 np0005465988 nova_compute[236126]: 2025-10-02 12:30:36.783 2 DEBUG nova.virt.libvirt.guest [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] attach device xml: <disk type="network" device="disk">
Oct  2 08:30:36 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:30:36 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-1f1fe097-f4b6-4748-bf18-8e487e0f3ba6">
Oct  2 08:30:36 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:30:36 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:30:36 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:30:36 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:30:36 np0005465988 nova_compute[236126]:  <auth username="openstack">
Oct  2 08:30:36 np0005465988 nova_compute[236126]:    <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:30:36 np0005465988 nova_compute[236126]:  </auth>
Oct  2 08:30:36 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:30:36 np0005465988 nova_compute[236126]:  <serial>1f1fe097-f4b6-4748-bf18-8e487e0f3ba6</serial>
Oct  2 08:30:36 np0005465988 nova_compute[236126]:  <shareable/>
Oct  2 08:30:36 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:30:36 np0005465988 nova_compute[236126]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
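Editor's note: the attach-device XML logged above is a libvirt network disk backed by Ceph RBD, listing the three monitor endpoints and the cephx auth secret. Nova builds it through its own LibvirtConfigGuestDisk objects; the stdlib ElementTree sketch below only reconstructs the same structure for illustration, using the values from the log:

```python
import xml.etree.ElementTree as ET

# Rebuild the <disk> element from the log with stdlib ElementTree (illustrative only).
disk = ET.Element("disk", type="network", device="disk")
ET.SubElement(disk, "driver", name="qemu", type="raw", cache="none", discard="unmap")
src = ET.SubElement(disk, "source", protocol="rbd",
                    name="volumes/volume-1f1fe097-f4b6-4748-bf18-8e487e0f3ba6")
for mon in ("192.168.122.100", "192.168.122.102", "192.168.122.101"):
    ET.SubElement(src, "host", name=mon, port="6789")  # Ceph monitor endpoints
auth = ET.SubElement(disk, "auth", username="openstack")
ET.SubElement(auth, "secret", type="ceph", uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2")
ET.SubElement(disk, "target", dev="vdb", bus="virtio")
ET.SubElement(disk, "serial").text = "1f1fe097-f4b6-4748-bf18-8e487e0f3ba6"
ET.SubElement(disk, "shareable")  # multi-attach volume, as in the log

xml = ET.tostring(disk, encoding="unicode")
print(xml)
```

The resulting string is what nova passes to libvirt's attachDevice call for the guest.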
Oct  2 08:30:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:36Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:94:65:0d 10.100.0.4
Oct  2 08:30:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:36Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:94:65:0d 10.100.0.4
Oct  2 08:30:37 np0005465988 nova_compute[236126]: 2025-10-02 12:30:37.049 2 DEBUG nova.virt.libvirt.driver [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:37 np0005465988 nova_compute[236126]: 2025-10-02 12:30:37.050 2 DEBUG nova.virt.libvirt.driver [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:37 np0005465988 nova_compute[236126]: 2025-10-02 12:30:37.050 2 DEBUG nova.virt.libvirt.driver [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:37 np0005465988 nova_compute[236126]: 2025-10-02 12:30:37.051 2 DEBUG nova.virt.libvirt.driver [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No VIF found with MAC fa:16:3e:60:9d:7c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:30:37 np0005465988 nova_compute[236126]: 2025-10-02 12:30:37.386 2 DEBUG oslo_concurrency.lockutils [None req-9ef8f0dc-e361-4c41-b979-be8421890a15 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 3.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:30:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:37.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.114 2 DEBUG oslo_concurrency.lockutils [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "c8b713f4-4f41-4153-928c-164f2ed108ed" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.115 2 DEBUG oslo_concurrency.lockutils [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.137 2 DEBUG nova.objects.instance [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'flavor' on Instance uuid c8b713f4-4f41-4153-928c-164f2ed108ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.178 2 DEBUG oslo_concurrency.lockutils [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:38.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.480 2 DEBUG oslo_concurrency.lockutils [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "c8b713f4-4f41-4153-928c-164f2ed108ed" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.481 2 DEBUG oslo_concurrency.lockutils [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.481 2 INFO nova.compute.manager [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Attaching volume 1f1fe097-f4b6-4748-bf18-8e487e0f3ba6 to /dev/vdb#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.782 2 DEBUG os_brick.utils [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.784 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.821 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.821 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[2a0fa6ca-d6cc-44b8-9579-6ef7c5421055]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.823 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.831 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.832 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[035a3f04-6181-409f-ae4e-a2961d38c497]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.833 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.842 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.843 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[b00e07e9-2193-4fab-a37a-05f1ce82b954]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.844 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[804c039e-9ef3-4615-8ead-4ad72847eafd]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.845 2 DEBUG oslo_concurrency.processutils [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:38 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [P] New memtable created with log file: #51. Immutable memtables: 0.
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.892 2 DEBUG oslo_concurrency.processutils [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "nvme version" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.897 2 DEBUG os_brick.initiator.connectors.lightos [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.897 2 DEBUG os_brick.initiator.connectors.lightos [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.898 2 DEBUG os_brick.initiator.connectors.lightos [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.899 2 DEBUG os_brick.utils [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] <== get_connector_properties: return (115ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:30:38 np0005465988 nova_compute[236126]: 2025-10-02 12:30:38.899 2 DEBUG nova.virt.block_device [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Updating existing volume attachment record: b36803f7-f694-48ab-a490-981509461ace _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:30:39 np0005465988 nova_compute[236126]: 2025-10-02 12:30:39.752 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408224.750724, 634c38a6-caab-410d-8748-3ec1fd6f9cdc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:39 np0005465988 nova_compute[236126]: 2025-10-02 12:30:39.753 2 INFO nova.compute.manager [-] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:30:39 np0005465988 nova_compute[236126]: 2025-10-02 12:30:39.783 2 DEBUG nova.compute.manager [None req-2d3c1d81-5778-4082-bec2-a873db9ef15a - - - - - -] [instance: 634c38a6-caab-410d-8748-3ec1fd6f9cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:39 np0005465988 nova_compute[236126]: 2025-10-02 12:30:39.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:39.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:40 np0005465988 nova_compute[236126]: 2025-10-02 12:30:39.999 2 DEBUG nova.objects.instance [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'flavor' on Instance uuid c8b713f4-4f41-4153-928c-164f2ed108ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:40 np0005465988 nova_compute[236126]: 2025-10-02 12:30:40.070 2 DEBUG nova.virt.libvirt.driver [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Attempting to attach volume 1f1fe097-f4b6-4748-bf18-8e487e0f3ba6 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Oct  2 08:30:40 np0005465988 nova_compute[236126]: 2025-10-02 12:30:40.074 2 DEBUG nova.virt.libvirt.guest [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] attach device xml: <disk type="network" device="disk">
Oct  2 08:30:40 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:30:40 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-1f1fe097-f4b6-4748-bf18-8e487e0f3ba6">
Oct  2 08:30:40 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:30:40 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:30:40 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:30:40 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:30:40 np0005465988 nova_compute[236126]:  <auth username="openstack">
Oct  2 08:30:40 np0005465988 nova_compute[236126]:    <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:30:40 np0005465988 nova_compute[236126]:  </auth>
Oct  2 08:30:40 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:30:40 np0005465988 nova_compute[236126]:  <serial>1f1fe097-f4b6-4748-bf18-8e487e0f3ba6</serial>
Oct  2 08:30:40 np0005465988 nova_compute[236126]:  <shareable/>
Oct  2 08:30:40 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:30:40 np0005465988 nova_compute[236126]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:30:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:40.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:40 np0005465988 nova_compute[236126]: 2025-10-02 12:30:40.402 2 DEBUG nova.virt.libvirt.driver [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:40 np0005465988 nova_compute[236126]: 2025-10-02 12:30:40.402 2 DEBUG nova.virt.libvirt.driver [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:40 np0005465988 nova_compute[236126]: 2025-10-02 12:30:40.402 2 DEBUG nova.virt.libvirt.driver [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:40 np0005465988 nova_compute[236126]: 2025-10-02 12:30:40.402 2 DEBUG nova.virt.libvirt.driver [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No VIF found with MAC fa:16:3e:94:65:0d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:30:40 np0005465988 nova_compute[236126]: 2025-10-02 12:30:40.684 2 DEBUG oslo_concurrency.lockutils [None req-1ece1c58-f650-427b-b6e9-a4306ba7bc50 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:41.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:42.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:30:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:30:43 np0005465988 nova_compute[236126]: 2025-10-02 12:30:43.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e311 e311: 3 total, 3 up, 3 in
Oct  2 08:30:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:43.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:30:44 np0005465988 nova_compute[236126]: 2025-10-02 12:30:44.213 2 DEBUG nova.compute.manager [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Oct  2 08:30:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:44.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:30:44 np0005465988 nova_compute[236126]: 2025-10-02 12:30:44.330 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:44 np0005465988 nova_compute[236126]: 2025-10-02 12:30:44.331 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:44 np0005465988 nova_compute[236126]: 2025-10-02 12:30:44.360 2 DEBUG nova.objects.instance [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'pci_requests' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:44 np0005465988 nova_compute[236126]: 2025-10-02 12:30:44.406 2 DEBUG nova.virt.hardware [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:30:44 np0005465988 nova_compute[236126]: 2025-10-02 12:30:44.407 2 INFO nova.compute.claims [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:30:44 np0005465988 nova_compute[236126]: 2025-10-02 12:30:44.407 2 DEBUG nova.objects.instance [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'resources' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:44 np0005465988 nova_compute[236126]: 2025-10-02 12:30:44.422 2 DEBUG nova.objects.instance [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:44 np0005465988 nova_compute[236126]: 2025-10-02 12:30:44.490 2 INFO nova.compute.resource_tracker [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating resource usage from migration d51a423a-06fd-4b2f-bb2c-4aafdb99dc0f#033[00m
Oct  2 08:30:44 np0005465988 nova_compute[236126]: 2025-10-02 12:30:44.603 2 DEBUG oslo_concurrency.processutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:44 np0005465988 nova_compute[236126]: 2025-10-02 12:30:44.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:30:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3030540415' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:30:45 np0005465988 nova_compute[236126]: 2025-10-02 12:30:45.072 2 DEBUG oslo_concurrency.processutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:45 np0005465988 nova_compute[236126]: 2025-10-02 12:30:45.078 2 DEBUG nova.compute.provider_tree [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:30:45 np0005465988 nova_compute[236126]: 2025-10-02 12:30:45.109 2 DEBUG nova.scheduler.client.report [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:30:45 np0005465988 nova_compute[236126]: 2025-10-02 12:30:45.137 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:45 np0005465988 nova_compute[236126]: 2025-10-02 12:30:45.138 2 INFO nova.compute.manager [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Migrating#033[00m
Oct  2 08:30:45 np0005465988 nova_compute[236126]: 2025-10-02 12:30:45.185 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:45 np0005465988 nova_compute[236126]: 2025-10-02 12:30:45.185 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquired lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:45 np0005465988 nova_compute[236126]: 2025-10-02 12:30:45.186 2 DEBUG nova.network.neutron [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:30:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:45.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:30:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:46.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:30:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:30:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:47.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:48 np0005465988 nova_compute[236126]: 2025-10-02 12:30:48.093 2 DEBUG nova.network.neutron [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating instance_info_cache with network_info: [{"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:48 np0005465988 nova_compute[236126]: 2025-10-02 12:30:48.115 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Releasing lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:48.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:48 np0005465988 nova_compute[236126]: 2025-10-02 12:30:48.237 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  2 08:30:48 np0005465988 nova_compute[236126]: 2025-10-02 12:30:48.243 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:30:48 np0005465988 nova_compute[236126]: 2025-10-02 12:30:48.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:49 np0005465988 nova_compute[236126]: 2025-10-02 12:30:49.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:49.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #94. Immutable memtables: 0.
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.071117) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 57] Flushing memtable with next log file: 94
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408250071147, "job": 57, "event": "flush_started", "num_memtables": 1, "num_entries": 755, "num_deletes": 253, "total_data_size": 1259619, "memory_usage": 1276288, "flush_reason": "Manual Compaction"}
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 57] Level-0 flush table #95: started
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408250076799, "cf_name": "default", "job": 57, "event": "table_file_creation", "file_number": 95, "file_size": 818839, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 49191, "largest_seqno": 49941, "table_properties": {"data_size": 815163, "index_size": 1456, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8914, "raw_average_key_size": 20, "raw_value_size": 807593, "raw_average_value_size": 1823, "num_data_blocks": 63, "num_entries": 443, "num_filter_entries": 443, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759408213, "oldest_key_time": 1759408213, "file_creation_time": 1759408250, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 95, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 57] Flush lasted 5720 microseconds, and 2749 cpu microseconds.
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.076835) [db/flush_job.cc:967] [default] [JOB 57] Level-0 flush table #95: 818839 bytes OK
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.076856) [db/memtable_list.cc:519] [default] Level-0 commit table #95 started
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.080290) [db/memtable_list.cc:722] [default] Level-0 commit table #95: memtable #1 done
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.080304) EVENT_LOG_v1 {"time_micros": 1759408250080300, "job": 57, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.080320) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 57] Try to delete WAL files size 1255581, prev total WAL file size 1255581, number of live WAL files 2.
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000091.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.080982) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034303136' seq:72057594037927935, type:22 .. '7061786F730034323638' seq:0, type:0; will stop at (end)
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 58] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 57 Base level 0, inputs: [95(799KB)], [93(13MB)]
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408250081052, "job": 58, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [95], "files_L6": [93], "score": -1, "input_data_size": 14597978, "oldest_snapshot_seqno": -1}
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 58] Generated table #96: 7405 keys, 12708573 bytes, temperature: kUnknown
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408250183582, "cf_name": "default", "job": 58, "event": "table_file_creation", "file_number": 96, "file_size": 12708573, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12656407, "index_size": 32534, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18565, "raw_key_size": 191427, "raw_average_key_size": 25, "raw_value_size": 12521456, "raw_average_value_size": 1690, "num_data_blocks": 1286, "num_entries": 7405, "num_filter_entries": 7405, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759408250, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 96, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.183930) [db/compaction/compaction_job.cc:1663] [default] [JOB 58] Compacted 1@0 + 1@6 files to L6 => 12708573 bytes
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.185408) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.2 rd, 123.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 13.1 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(33.3) write-amplify(15.5) OK, records in: 7929, records dropped: 524 output_compression: NoCompression
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.185441) EVENT_LOG_v1 {"time_micros": 1759408250185428, "job": 58, "event": "compaction_finished", "compaction_time_micros": 102633, "compaction_time_cpu_micros": 51336, "output_level": 6, "num_output_files": 1, "total_output_size": 12708573, "num_input_records": 7929, "num_output_records": 7405, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000095.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408250186255, "job": 58, "event": "table_file_deletion", "file_number": 95}
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000093.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408250191128, "job": 58, "event": "table_file_deletion", "file_number": 93}
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.080822) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.191260) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.191265) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.191267) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.191269) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:30:50 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:30:50.191270) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:30:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:50.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:30:50 np0005465988 nova_compute[236126]: 2025-10-02 12:30:50.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:50 np0005465988 kernel: tap7cf26487-91 (unregistering): left promiscuous mode
Oct  2 08:30:50 np0005465988 NetworkManager[45041]: <info>  [1759408250.7655] device (tap7cf26487-91): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:30:50 np0005465988 nova_compute[236126]: 2025-10-02 12:30:50.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:50 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:50Z|00527|binding|INFO|Releasing lport 7cf26487-91ca-4d15-85f3-bb6a66393796 from this chassis (sb_readonly=0)
Oct  2 08:30:50 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:50Z|00528|binding|INFO|Setting lport 7cf26487-91ca-4d15-85f3-bb6a66393796 down in Southbound
Oct  2 08:30:50 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:50Z|00529|binding|INFO|Removing iface tap7cf26487-91 ovn-installed in OVS
Oct  2 08:30:50 np0005465988 nova_compute[236126]: 2025-10-02 12:30:50.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:50.788 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:9d:7c 10.100.0.5'], port_security=['fa:16:3e:60:9d:7c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4297c5cd-77b6-4f80-a746-11b304df8c90', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-585473f8-52e4-4e55-96df-8a236d361126', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5533aaac08cd4856af72ef4992bb5e76', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a7e36b3-799e-47d8-a152-7f7146431afe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec297f04-3bda-490f-87d3-1f684caf96fd, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7cf26487-91ca-4d15-85f3-bb6a66393796) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:50 np0005465988 nova_compute[236126]: 2025-10-02 12:30:50.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:50.790 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7cf26487-91ca-4d15-85f3-bb6a66393796 in datapath 585473f8-52e4-4e55-96df-8a236d361126 unbound from our chassis#033[00m
Oct  2 08:30:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:50.793 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 585473f8-52e4-4e55-96df-8a236d361126#033[00m
Oct  2 08:30:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:50.828 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ae28a923-84bb-4425-bbc7-9340c30f7b60]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:50 np0005465988 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000072.scope: Deactivated successfully.
Oct  2 08:30:50 np0005465988 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000072.scope: Consumed 16.200s CPU time.
Oct  2 08:30:50 np0005465988 systemd-machined[192594]: Machine qemu-49-instance-00000072 terminated.
Oct  2 08:30:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:50.860 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[14fd0446-8cba-421c-a382-58e95fceaca8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:50.863 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[700b7af3-b504-4a7f-ad65-9451e78525ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:50 np0005465988 podman[287288]: 2025-10-02 12:30:50.876281041 +0000 UTC m=+0.071825208 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:30:50 np0005465988 podman[287287]: 2025-10-02 12:30:50.888109241 +0000 UTC m=+0.092285416 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3)
Oct  2 08:30:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:50.895 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[03457756-a9ee-48d2-9a97-4a22d46b7e6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:50.912 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[119faf7b-4a74-43db-a3c5-a9544074c7b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap585473f8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:8e:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624727, 'reachable_time': 36955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287357, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:50 np0005465988 podman[287284]: 2025-10-02 12:30:50.927512995 +0000 UTC m=+0.135572282 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct  2 08:30:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:50.930 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a89d77-9a4e-47a7-98ef-2ee3624ff490]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap585473f8-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624742, 'tstamp': 624742}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287358, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap585473f8-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624747, 'tstamp': 624747}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287358, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:50.932 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap585473f8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:50 np0005465988 nova_compute[236126]: 2025-10-02 12:30:50.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:50 np0005465988 nova_compute[236126]: 2025-10-02 12:30:50.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:50.937 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap585473f8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:50.938 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:50.938 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap585473f8-50, col_values=(('external_ids', {'iface-id': '02b7597d-2fc1-4c56-8603-4dcb0c716c82'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:50.938 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:51 np0005465988 kernel: tap7cf26487-91: entered promiscuous mode
Oct  2 08:30:51 np0005465988 kernel: tap7cf26487-91 (unregistering): left promiscuous mode
Oct  2 08:30:51 np0005465988 systemd-udevd[287332]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:30:51 np0005465988 NetworkManager[45041]: <info>  [1759408251.0034] manager: (tap7cf26487-91): new Tun device (/org/freedesktop/NetworkManager/Devices/245)
Oct  2 08:30:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:51Z|00530|binding|INFO|Claiming lport 7cf26487-91ca-4d15-85f3-bb6a66393796 for this chassis.
Oct  2 08:30:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:51Z|00531|binding|INFO|7cf26487-91ca-4d15-85f3-bb6a66393796: Claiming fa:16:3e:60:9d:7c 10.100.0.5
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.011 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:9d:7c 10.100.0.5'], port_security=['fa:16:3e:60:9d:7c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4297c5cd-77b6-4f80-a746-11b304df8c90', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-585473f8-52e4-4e55-96df-8a236d361126', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5533aaac08cd4856af72ef4992bb5e76', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a7e36b3-799e-47d8-a152-7f7146431afe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec297f04-3bda-490f-87d3-1f684caf96fd, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7cf26487-91ca-4d15-85f3-bb6a66393796) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.012 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7cf26487-91ca-4d15-85f3-bb6a66393796 in datapath 585473f8-52e4-4e55-96df-8a236d361126 bound to our chassis#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.015 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 585473f8-52e4-4e55-96df-8a236d361126#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:51Z|00532|binding|INFO|Setting lport 7cf26487-91ca-4d15-85f3-bb6a66393796 ovn-installed in OVS
Oct  2 08:30:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:51Z|00533|binding|INFO|Setting lport 7cf26487-91ca-4d15-85f3-bb6a66393796 up in Southbound
Oct  2 08:30:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:51Z|00534|binding|INFO|Releasing lport 7cf26487-91ca-4d15-85f3-bb6a66393796 from this chassis (sb_readonly=1)
Oct  2 08:30:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:51Z|00535|if_status|INFO|Dropped 2 log messages in last 139 seconds (most recently, 139 seconds ago) due to excessive rate
Oct  2 08:30:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:51Z|00536|if_status|INFO|Not setting lport 7cf26487-91ca-4d15-85f3-bb6a66393796 down as sb is readonly
Oct  2 08:30:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:51Z|00537|binding|INFO|Removing iface tap7cf26487-91 ovn-installed in OVS
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.036 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cb0408e3-8f59-4cf7-8428-7a391f1e8d2d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:51Z|00538|binding|INFO|Releasing lport 7cf26487-91ca-4d15-85f3-bb6a66393796 from this chassis (sb_readonly=0)
Oct  2 08:30:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:51Z|00539|binding|INFO|Setting lport 7cf26487-91ca-4d15-85f3-bb6a66393796 down in Southbound
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.048 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:9d:7c 10.100.0.5'], port_security=['fa:16:3e:60:9d:7c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4297c5cd-77b6-4f80-a746-11b304df8c90', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-585473f8-52e4-4e55-96df-8a236d361126', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5533aaac08cd4856af72ef4992bb5e76', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a7e36b3-799e-47d8-a152-7f7146431afe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec297f04-3bda-490f-87d3-1f684caf96fd, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7cf26487-91ca-4d15-85f3-bb6a66393796) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.074 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[40db0048-4f2b-417d-8f4b-434e8fb2ee82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.077 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[67e6c4a8-9c65-4ec5-805e-8762c266f87e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.111 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e1031489-2d69-41c0-955c-e730a7ce3b9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.130 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[62673b91-fbe3-4a66-a5ef-b4a431cd4b59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap585473f8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:8e:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 10, 'rx_bytes': 1000, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 10, 'rx_bytes': 1000, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624727, 'reachable_time': 36955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287370, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.151 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d964fe9f-a068-46c7-b4d3-182e64255a10]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap585473f8-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624742, 'tstamp': 624742}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287371, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap585473f8-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624747, 'tstamp': 624747}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287371, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.153 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap585473f8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.162 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap585473f8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.162 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.163 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap585473f8-50, col_values=(('external_ids', {'iface-id': '02b7597d-2fc1-4c56-8603-4dcb0c716c82'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.164 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.165 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7cf26487-91ca-4d15-85f3-bb6a66393796 in datapath 585473f8-52e4-4e55-96df-8a236d361126 unbound from our chassis#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.168 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 585473f8-52e4-4e55-96df-8a236d361126#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.187 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c89434a9-86c7-441d-88ee-a35ed18a9582]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.199 2 DEBUG nova.compute.manager [req-e88c56eb-fcda-4b98-b5d8-fe477e15ca69 req-5da37015-6526-461d-97be-383ef4ff2de3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-vif-unplugged-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.199 2 DEBUG oslo_concurrency.lockutils [req-e88c56eb-fcda-4b98-b5d8-fe477e15ca69 req-5da37015-6526-461d-97be-383ef4ff2de3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.201 2 DEBUG oslo_concurrency.lockutils [req-e88c56eb-fcda-4b98-b5d8-fe477e15ca69 req-5da37015-6526-461d-97be-383ef4ff2de3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.201 2 DEBUG oslo_concurrency.lockutils [req-e88c56eb-fcda-4b98-b5d8-fe477e15ca69 req-5da37015-6526-461d-97be-383ef4ff2de3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.202 2 DEBUG nova.compute.manager [req-e88c56eb-fcda-4b98-b5d8-fe477e15ca69 req-5da37015-6526-461d-97be-383ef4ff2de3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] No waiting events found dispatching network-vif-unplugged-7cf26487-91ca-4d15-85f3-bb6a66393796 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.203 2 WARNING nova.compute.manager [req-e88c56eb-fcda-4b98-b5d8-fe477e15ca69 req-5da37015-6526-461d-97be-383ef4ff2de3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received unexpected event network-vif-unplugged-7cf26487-91ca-4d15-85f3-bb6a66393796 for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.226 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a54051-44ac-4861-9ceb-6e5721886e80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.230 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[6869677a-1e4a-4221-acc4-0b0a86cb40f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.264 2 INFO nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.271 2 INFO nova.virt.libvirt.driver [-] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Instance destroyed successfully.#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.273 2 DEBUG nova.virt.libvirt.vif [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:29:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-0',id=114,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBEmvr2XQXDnaV0WQDbbXt57cEK6okdC4PHEYdjpQBx2HU9OQgvgRTm3sGWmsa/AInUTPV9ABsCq2lJ9PCqfb1WP51XCZeB9QBIxafEy8h788huF0550ajkopZIwmSLpiA==',key_name='tempest-keypair-425033456',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:29:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5533aaac08cd4856af72ef4992bb5e76',ramdisk_id='',reservation_id='r-w0tlxvyd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_
hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-1564585024',owner_user_name='tempest-AttachVolumeMultiAttachTest-1564585024-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:30:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22d56fcd2a4b4851bfd126ae4548ee9b',uuid=4297c5cd-77b6-4f80-a746-11b304df8c90,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "vif_mac": "fa:16:3e:60:9d:7c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.274 2 DEBUG nova.network.os_vif_util [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converting VIF {"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "vif_mac": "fa:16:3e:60:9d:7c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.275 2 DEBUG nova.network.os_vif_util [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:60:9d:7c,bridge_name='br-int',has_traffic_filtering=True,id=7cf26487-91ca-4d15-85f3-bb6a66393796,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cf26487-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.275 2 DEBUG os_vif [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:9d:7c,bridge_name='br-int',has_traffic_filtering=True,id=7cf26487-91ca-4d15-85f3-bb6a66393796,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cf26487-91') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.277 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[eadce70d-514a-46d9-b078-0b27333bdda0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.278 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7cf26487-91, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.284 2 INFO os_vif [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:9d:7c,bridge_name='br-int',has_traffic_filtering=True,id=7cf26487-91ca-4d15-85f3-bb6a66393796,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cf26487-91')#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.295 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[aa903ba2-a17d-4f10-810b-8ae647c05445]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap585473f8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:8e:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 12, 'rx_bytes': 1000, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 12, 'rx_bytes': 1000, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624727, 'reachable_time': 36955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287378, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.312 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b3cd5338-90a4-4224-bd8f-c98e870bd5dc]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap585473f8-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624742, 'tstamp': 624742}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287379, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap585473f8-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624747, 'tstamp': 624747}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287379, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.313 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap585473f8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.316 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap585473f8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.317 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.317 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap585473f8-50, col_values=(('external_ids', {'iface-id': '02b7597d-2fc1-4c56-8603-4dcb0c716c82'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:51.317 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.592 2 INFO nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Detected multiple connections on this host for volume: 1f1fe097-f4b6-4748-bf18-8e487e0f3ba6, skipping target disconnect.#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.596 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.597 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:30:51 np0005465988 nova_compute[236126]: 2025-10-02 12:30:51.597 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:30:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:51.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:51Z|00540|binding|INFO|Releasing lport 02b7597d-2fc1-4c56-8603-4dcb0c716c82 from this chassis (sb_readonly=0)
Oct  2 08:30:52 np0005465988 nova_compute[236126]: 2025-10-02 12:30:52.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:52.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:52 np0005465988 nova_compute[236126]: 2025-10-02 12:30:52.637 2 DEBUG nova.network.neutron [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Port 7cf26487-91ca-4d15-85f3-bb6a66393796 binding to destination host compute-2.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Oct  2 08:30:52 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #52. Immutable memtables: 8.
Oct  2 08:30:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:30:52 np0005465988 nova_compute[236126]: 2025-10-02 12:30:52.782 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:52 np0005465988 nova_compute[236126]: 2025-10-02 12:30:52.783 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:52 np0005465988 nova_compute[236126]: 2025-10-02 12:30:52.783 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.091 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.092 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquired lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.092 2 DEBUG nova.network.neutron [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.311 2 DEBUG nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.312 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.312 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.313 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.313 2 DEBUG nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] No waiting events found dispatching network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.313 2 WARNING nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received unexpected event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.314 2 DEBUG nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.314 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.314 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.314 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.315 2 DEBUG nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] No waiting events found dispatching network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.315 2 WARNING nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received unexpected event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.315 2 DEBUG nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.316 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.316 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.316 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.317 2 DEBUG nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] No waiting events found dispatching network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.317 2 WARNING nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received unexpected event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.317 2 DEBUG nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-vif-unplugged-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.317 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.318 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.318 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.318 2 DEBUG nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] No waiting events found dispatching network-vif-unplugged-7cf26487-91ca-4d15-85f3-bb6a66393796 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.319 2 WARNING nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received unexpected event network-vif-unplugged-7cf26487-91ca-4d15-85f3-bb6a66393796 for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.319 2 DEBUG nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.319 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.320 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.320 2 DEBUG oslo_concurrency.lockutils [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.320 2 DEBUG nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] No waiting events found dispatching network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.321 2 WARNING nova.compute.manager [req-531c97d1-1983-496a-8c1b-e1e9dcccd188 req-f11e74b6-eaf3-4580-9228-0edb0e02ccb8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received unexpected event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.517 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.518 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.519 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.519 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.520 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:53 np0005465988 nova_compute[236126]: 2025-10-02 12:30:53.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:53.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:30:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4067525008' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.005 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.129 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.129 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.130 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.133 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000073 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.133 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000073 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.133 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000073 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:30:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:30:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:54.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.327 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.328 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4128MB free_disk=20.78494644165039GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.329 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.329 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.406 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Applying migration context for instance 4297c5cd-77b6-4f80-a746-11b304df8c90 as it has an incoming, in-progress migration d51a423a-06fd-4b2f-bb2c-4aafdb99dc0f. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.407 2 INFO nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating resource usage from migration d51a423a-06fd-4b2f-bb2c-4aafdb99dc0f#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.465 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance c8b713f4-4f41-4153-928c-164f2ed108ed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.466 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Migration d51a423a-06fd-4b2f-bb2c-4aafdb99dc0f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.466 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 4297c5cd-77b6-4f80-a746-11b304df8c90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.466 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.467 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:30:54 np0005465988 nova_compute[236126]: 2025-10-02 12:30:54.598 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:30:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2766251714' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.059 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.066 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.093 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.124 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.125 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.225 2 DEBUG nova.network.neutron [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating instance_info_cache with network_info: [{"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.251 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Releasing lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.339 2 DEBUG os_brick.utils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.341 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.353 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.353 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[22ad2d7a-266f-4d2a-977e-1bab4cab0e09]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.355 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.363 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.364 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[6a80b54e-b5ba-4a34-8690-54b4dbe6171c]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.365 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.375 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.376 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[5999d576-e967-4c9b-8fc4-7c2044882d5a]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.377 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[a532b6bd-7c4f-4a9d-bb40-b61b903dd088]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.378 2 DEBUG oslo_concurrency.processutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.417 2 DEBUG oslo_concurrency.processutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "nvme version" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.421 2 DEBUG os_brick.initiator.connectors.lightos [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.422 2 DEBUG os_brick.initiator.connectors.lightos [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.422 2 DEBUG os_brick.initiator.connectors.lightos [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:30:55 np0005465988 nova_compute[236126]: 2025-10-02 12:30:55.423 2 DEBUG os_brick.utils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] <== get_connector_properties: return (82ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:30:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:55.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:56.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:56 np0005465988 nova_compute[236126]: 2025-10-02 12:30:56.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.097 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.100 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.100 2 INFO nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Creating image(s)#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.151 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.153 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.158 2 DEBUG nova.storage.rbd_utils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] creating snapshot(nova-resize) on rbd image(4297c5cd-77b6-4f80-a746-11b304df8c90_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:30:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e312 e312: 3 total, 3 up, 3 in
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.479 2 DEBUG nova.objects.instance [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.628 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.629 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Ensure instance console log exists: /var/lib/nova/instances/4297c5cd-77b6-4f80-a746-11b304df8c90/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.629 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.629 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.630 2 DEBUG oslo_concurrency.lockutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.632 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Start _get_guest_xml network_info=[{"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "vif_mac": "fa:16:3e:60:9d:7c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': None, 'mount_device': '/dev/vdb', 'attachment_id': 'b51fbf87-59e9-4776-8492-33d39184200e', 'disk_bus': 'virtio', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-1f1fe097-f4b6-4748-bf18-8e487e0f3ba6', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '1f1fe097-f4b6-4748-bf18-8e487e0f3ba6', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'attaching', 'instance': '4297c5cd-77b6-4f80-a746-11b304df8c90', 'attached_at': '2025-10-02T12:30:56.000000', 'detached_at': '', 'volume_id': '1f1fe097-f4b6-4748-bf18-8e487e0f3ba6', 'multiattach': True, 'serial': '1f1fe097-f4b6-4748-bf18-8e487e0f3ba6'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.636 2 WARNING nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.641 2 DEBUG nova.virt.libvirt.host [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.641 2 DEBUG nova.virt.libvirt.host [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.650 2 DEBUG nova.virt.libvirt.host [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.651 2 DEBUG nova.virt.libvirt.host [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.652 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.653 2 DEBUG nova.virt.hardware [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eb3a53f1-304b-4cb0-acc3-abffce0fb181',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.653 2 DEBUG nova.virt.hardware [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.653 2 DEBUG nova.virt.hardware [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.653 2 DEBUG nova.virt.hardware [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.654 2 DEBUG nova.virt.hardware [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.654 2 DEBUG nova.virt.hardware [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.654 2 DEBUG nova.virt.hardware [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.654 2 DEBUG nova.virt.hardware [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.655 2 DEBUG nova.virt.hardware [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.655 2 DEBUG nova.virt.hardware [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.655 2 DEBUG nova.virt.hardware [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.655 2 DEBUG nova.objects.instance [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:57 np0005465988 nova_compute[236126]: 2025-10-02 12:30:57.678 2 DEBUG oslo_concurrency.processutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:30:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:57.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:30:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/755243422' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.129 2 DEBUG oslo_concurrency.processutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.170 2 DEBUG oslo_concurrency.processutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:30:58.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:30:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:30:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2031110600' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.640 2 DEBUG oslo_concurrency.processutils [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.686 2 DEBUG nova.virt.libvirt.vif [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:29:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-0',id=114,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBEmvr2XQXDnaV0WQDbbXt57cEK6okdC4PHEYdjpQBx2HU9OQgvgRTm3sGWmsa/AInUTPV9ABsCq2lJ9PCqfb1WP51XCZeB9QBIxafEy8h788huF0550ajkopZIwmSLpiA==',key_name='tempest-keypair-425033456',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:29:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5533aaac08cd4856af72ef4992bb5e76',ramdisk_id='',reservation_id='r-w0tlxvyd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-1564585024',owner_user_name='tempest-AttachVolumeMultiAttachTest-1564585024-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:30:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22d56fcd2a4b4851bfd126ae4548ee9b',uuid=4297c5cd-77b6-4f80-a746-11b304df8c90,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "vif_mac": "fa:16:3e:60:9d:7c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.687 2 DEBUG nova.network.os_vif_util [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converting VIF {"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "vif_mac": "fa:16:3e:60:9d:7c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.689 2 DEBUG nova.network.os_vif_util [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:9d:7c,bridge_name='br-int',has_traffic_filtering=True,id=7cf26487-91ca-4d15-85f3-bb6a66393796,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cf26487-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.693 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  <uuid>4297c5cd-77b6-4f80-a746-11b304df8c90</uuid>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  <name>instance-00000072</name>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  <memory>196608</memory>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <nova:name>multiattach-server-0</nova:name>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:30:57</nova:creationTime>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.micro">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <nova:memory>192</nova:memory>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <nova:user uuid="22d56fcd2a4b4851bfd126ae4548ee9b">tempest-AttachVolumeMultiAttachTest-1564585024-project-member</nova:user>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <nova:project uuid="5533aaac08cd4856af72ef4992bb5e76">tempest-AttachVolumeMultiAttachTest-1564585024</nova:project>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <nova:port uuid="7cf26487-91ca-4d15-85f3-bb6a66393796">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <entry name="serial">4297c5cd-77b6-4f80-a746-11b304df8c90</entry>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <entry name="uuid">4297c5cd-77b6-4f80-a746-11b304df8c90</entry>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/4297c5cd-77b6-4f80-a746-11b304df8c90_disk">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/4297c5cd-77b6-4f80-a746-11b304df8c90_disk.config">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-1f1fe097-f4b6-4748-bf18-8e487e0f3ba6">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <target dev="vdb" bus="virtio"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <serial>1f1fe097-f4b6-4748-bf18-8e487e0f3ba6</serial>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <shareable/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:60:9d:7c"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <target dev="tap7cf26487-91"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/4297c5cd-77b6-4f80-a746-11b304df8c90/console.log" append="off"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:30:58 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:30:58 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:30:58 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:30:58 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.697 2 DEBUG nova.virt.libvirt.vif [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:29:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-0',id=114,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBEmvr2XQXDnaV0WQDbbXt57cEK6okdC4PHEYdjpQBx2HU9OQgvgRTm3sGWmsa/AInUTPV9ABsCq2lJ9PCqfb1WP51XCZeB9QBIxafEy8h788huF0550ajkopZIwmSLpiA==',key_name='tempest-keypair-425033456',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:29:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5533aaac08cd4856af72ef4992bb5e76',ramdisk_id='',reservation_id='r-w0tlxvyd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',i
mage_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-1564585024',owner_user_name='tempest-AttachVolumeMultiAttachTest-1564585024-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:30:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22d56fcd2a4b4851bfd126ae4548ee9b',uuid=4297c5cd-77b6-4f80-a746-11b304df8c90,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "vif_mac": "fa:16:3e:60:9d:7c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.697 2 DEBUG nova.network.os_vif_util [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converting VIF {"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "vif_mac": "fa:16:3e:60:9d:7c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.698 2 DEBUG nova.network.os_vif_util [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:9d:7c,bridge_name='br-int',has_traffic_filtering=True,id=7cf26487-91ca-4d15-85f3-bb6a66393796,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cf26487-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.699 2 DEBUG os_vif [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:9d:7c,bridge_name='br-int',has_traffic_filtering=True,id=7cf26487-91ca-4d15-85f3-bb6a66393796,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cf26487-91') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.704 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.704 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.710 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7cf26487-91, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.710 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7cf26487-91, col_values=(('external_ids', {'iface-id': '7cf26487-91ca-4d15-85f3-bb6a66393796', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:9d:7c', 'vm-uuid': '4297c5cd-77b6-4f80-a746-11b304df8c90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:58 np0005465988 NetworkManager[45041]: <info>  [1759408258.7141] manager: (tap7cf26487-91): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.719 2 INFO os_vif [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:9d:7c,bridge_name='br-int',has_traffic_filtering=True,id=7cf26487-91ca-4d15-85f3-bb6a66393796,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cf26487-91')#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.773 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.774 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.774 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.774 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] No VIF found with MAC fa:16:3e:60:9d:7c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.775 2 INFO nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Using config drive#033[00m
Oct  2 08:30:58 np0005465988 kernel: tap7cf26487-91: entered promiscuous mode
Oct  2 08:30:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:58Z|00541|binding|INFO|Claiming lport 7cf26487-91ca-4d15-85f3-bb6a66393796 for this chassis.
Oct  2 08:30:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:58Z|00542|binding|INFO|7cf26487-91ca-4d15-85f3-bb6a66393796: Claiming fa:16:3e:60:9d:7c 10.100.0.5
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:58 np0005465988 NetworkManager[45041]: <info>  [1759408258.8791] manager: (tap7cf26487-91): new Tun device (/org/freedesktop/NetworkManager/Devices/247)
Oct  2 08:30:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:58.887 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:9d:7c 10.100.0.5'], port_security=['fa:16:3e:60:9d:7c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4297c5cd-77b6-4f80-a746-11b304df8c90', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-585473f8-52e4-4e55-96df-8a236d361126', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5533aaac08cd4856af72ef4992bb5e76', 'neutron:revision_number': '7', 'neutron:security_group_ids': '0a7e36b3-799e-47d8-a152-7f7146431afe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec297f04-3bda-490f-87d3-1f684caf96fd, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7cf26487-91ca-4d15-85f3-bb6a66393796) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:58.889 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7cf26487-91ca-4d15-85f3-bb6a66393796 in datapath 585473f8-52e4-4e55-96df-8a236d361126 bound to our chassis#033[00m
Oct  2 08:30:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:58Z|00543|binding|INFO|Setting lport 7cf26487-91ca-4d15-85f3-bb6a66393796 ovn-installed in OVS
Oct  2 08:30:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:30:58Z|00544|binding|INFO|Setting lport 7cf26487-91ca-4d15-85f3-bb6a66393796 up in Southbound
Oct  2 08:30:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:58.892 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 585473f8-52e4-4e55-96df-8a236d361126#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:58 np0005465988 nova_compute[236126]: 2025-10-02 12:30:58.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:58 np0005465988 systemd-udevd[287600]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:30:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:58.912 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[25443715-9a83-466f-8907-7083d8f6ede2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:58 np0005465988 systemd-machined[192594]: New machine qemu-52-instance-00000072.
Oct  2 08:30:58 np0005465988 NetworkManager[45041]: <info>  [1759408258.9261] device (tap7cf26487-91): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:30:58 np0005465988 NetworkManager[45041]: <info>  [1759408258.9267] device (tap7cf26487-91): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:30:58 np0005465988 systemd[1]: Started Virtual Machine qemu-52-instance-00000072.
Oct  2 08:30:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:58.954 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8211d9cf-a88a-4f3c-9466-0497b0674778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:58.958 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3f51c5-843f-40fa-9870-f8f0c1bce1fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:58.993 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e51e6788-640f-4736-aecf-49b89b6f0643]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:59.021 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[88442e67-2d1c-4e5d-9983-938b335843a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap585473f8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:8e:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 14, 'rx_bytes': 1000, 'tx_bytes': 776, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 14, 'rx_bytes': 1000, 'tx_bytes': 776, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624727, 'reachable_time': 36955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287615, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:59.046 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d639d1c9-f468-4512-a18e-22db38177c04]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap585473f8-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624742, 'tstamp': 624742}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287617, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap585473f8-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624747, 'tstamp': 624747}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287617, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:59.049 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap585473f8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:59 np0005465988 nova_compute[236126]: 2025-10-02 12:30:59.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:59 np0005465988 nova_compute[236126]: 2025-10-02 12:30:59.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:59.052 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap585473f8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:59.052 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:59.053 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap585473f8-50, col_values=(('external_ids', {'iface-id': '02b7597d-2fc1-4c56-8603-4dcb0c716c82'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:30:59.053 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:59 np0005465988 nova_compute[236126]: 2025-10-02 12:30:59.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:30:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:30:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:30:59.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.041 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 4297c5cd-77b6-4f80-a746-11b304df8c90 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.042 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408260.0407205, 4297c5cd-77b6-4f80-a746-11b304df8c90 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.042 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.045 2 DEBUG nova.compute.manager [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.051 2 INFO nova.virt.libvirt.driver [-] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Instance running successfully.#033[00m
Oct  2 08:31:00 np0005465988 virtqemud[235689]: argument unsupported: QEMU guest agent is not configured
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.055 2 DEBUG nova.virt.libvirt.guest [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.056 2 DEBUG nova.virt.libvirt.driver [None req-4751fad9-f7cd-40be-bec8-58ad276c7349 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.068 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.072 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.095 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.095 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408260.0461817, 4297c5cd-77b6-4f80-a746-11b304df8c90 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.096 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] VM Started (Lifecycle Event)#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.117 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.121 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.144 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Oct  2 08:31:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:00.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.558 2 DEBUG nova.compute.manager [req-06d903fa-5102-4a8f-9eed-99294b838d71 req-0ac66899-2dfc-4df5-8871-b6a217044e6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.558 2 DEBUG oslo_concurrency.lockutils [req-06d903fa-5102-4a8f-9eed-99294b838d71 req-0ac66899-2dfc-4df5-8871-b6a217044e6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.559 2 DEBUG oslo_concurrency.lockutils [req-06d903fa-5102-4a8f-9eed-99294b838d71 req-0ac66899-2dfc-4df5-8871-b6a217044e6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.560 2 DEBUG oslo_concurrency.lockutils [req-06d903fa-5102-4a8f-9eed-99294b838d71 req-0ac66899-2dfc-4df5-8871-b6a217044e6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.560 2 DEBUG nova.compute.manager [req-06d903fa-5102-4a8f-9eed-99294b838d71 req-0ac66899-2dfc-4df5-8871-b6a217044e6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] No waiting events found dispatching network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:00 np0005465988 nova_compute[236126]: 2025-10-02 12:31:00.561 2 WARNING nova.compute.manager [req-06d903fa-5102-4a8f-9eed-99294b838d71 req-0ac66899-2dfc-4df5-8871-b6a217044e6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received unexpected event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 for instance with vm_state resized and task_state None.#033[00m
Oct  2 08:31:01 np0005465988 nova_compute[236126]: 2025-10-02 12:31:01.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:01.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:01 np0005465988 nova_compute[236126]: 2025-10-02 12:31:01.995 2 DEBUG oslo_concurrency.lockutils [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:01 np0005465988 nova_compute[236126]: 2025-10-02 12:31:01.996 2 DEBUG oslo_concurrency.lockutils [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:01 np0005465988 nova_compute[236126]: 2025-10-02 12:31:01.996 2 DEBUG nova.compute.manager [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Going to confirm migration 17 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Oct  2 08:31:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:02.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:02 np0005465988 nova_compute[236126]: 2025-10-02 12:31:02.723 2 DEBUG nova.compute.manager [req-846ac920-3acf-4958-844f-5ab188c70655 req-57009920-be49-4930-99e4-49d46a9766fe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:02 np0005465988 nova_compute[236126]: 2025-10-02 12:31:02.724 2 DEBUG oslo_concurrency.lockutils [req-846ac920-3acf-4958-844f-5ab188c70655 req-57009920-be49-4930-99e4-49d46a9766fe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:02 np0005465988 nova_compute[236126]: 2025-10-02 12:31:02.724 2 DEBUG oslo_concurrency.lockutils [req-846ac920-3acf-4958-844f-5ab188c70655 req-57009920-be49-4930-99e4-49d46a9766fe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:02 np0005465988 nova_compute[236126]: 2025-10-02 12:31:02.725 2 DEBUG oslo_concurrency.lockutils [req-846ac920-3acf-4958-844f-5ab188c70655 req-57009920-be49-4930-99e4-49d46a9766fe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:02 np0005465988 nova_compute[236126]: 2025-10-02 12:31:02.725 2 DEBUG nova.compute.manager [req-846ac920-3acf-4958-844f-5ab188c70655 req-57009920-be49-4930-99e4-49d46a9766fe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] No waiting events found dispatching network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:02 np0005465988 nova_compute[236126]: 2025-10-02 12:31:02.725 2 WARNING nova.compute.manager [req-846ac920-3acf-4958-844f-5ab188c70655 req-57009920-be49-4930-99e4-49d46a9766fe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received unexpected event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 for instance with vm_state resized and task_state None.#033[00m
Oct  2 08:31:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:31:03 np0005465988 nova_compute[236126]: 2025-10-02 12:31:03.155 2 DEBUG oslo_concurrency.lockutils [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:31:03 np0005465988 nova_compute[236126]: 2025-10-02 12:31:03.156 2 DEBUG oslo_concurrency.lockutils [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquired lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:31:03 np0005465988 nova_compute[236126]: 2025-10-02 12:31:03.156 2 DEBUG nova.network.neutron [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:31:03 np0005465988 nova_compute[236126]: 2025-10-02 12:31:03.156 2 DEBUG nova.objects.instance [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'info_cache' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:03 np0005465988 nova_compute[236126]: 2025-10-02 12:31:03.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:03 np0005465988 podman[287680]: 2025-10-02 12:31:03.537262999 +0000 UTC m=+0.067804691 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:31:03 np0005465988 nova_compute[236126]: 2025-10-02 12:31:03.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:03 np0005465988 nova_compute[236126]: 2025-10-02 12:31:03.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:03.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:04.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:05.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:06.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:06 np0005465988 nova_compute[236126]: 2025-10-02 12:31:06.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:06 np0005465988 nova_compute[236126]: 2025-10-02 12:31:06.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:06 np0005465988 nova_compute[236126]: 2025-10-02 12:31:06.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:31:06 np0005465988 nova_compute[236126]: 2025-10-02 12:31:06.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:31:06 np0005465988 nova_compute[236126]: 2025-10-02 12:31:06.794 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:31:06 np0005465988 nova_compute[236126]: 2025-10-02 12:31:06.846 2 DEBUG nova.network.neutron [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating instance_info_cache with network_info: [{"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:06 np0005465988 nova_compute[236126]: 2025-10-02 12:31:06.941 2 DEBUG oslo_concurrency.lockutils [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Releasing lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:06 np0005465988 nova_compute[236126]: 2025-10-02 12:31:06.942 2 DEBUG nova.objects.instance [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'migration_context' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:06 np0005465988 nova_compute[236126]: 2025-10-02 12:31:06.944 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:31:06 np0005465988 nova_compute[236126]: 2025-10-02 12:31:06.944 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:31:06 np0005465988 nova_compute[236126]: 2025-10-02 12:31:06.945 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:07 np0005465988 nova_compute[236126]: 2025-10-02 12:31:07.110 2 DEBUG nova.storage.rbd_utils [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] removing snapshot(nova-resize) on rbd image(4297c5cd-77b6-4f80-a746-11b304df8c90_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:31:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:31:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:07.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:08.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e313 e313: 3 total, 3 up, 3 in
Oct  2 08:31:08 np0005465988 nova_compute[236126]: 2025-10-02 12:31:08.374 2 DEBUG oslo_concurrency.lockutils [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:08 np0005465988 nova_compute[236126]: 2025-10-02 12:31:08.375 2 DEBUG oslo_concurrency.lockutils [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:08 np0005465988 nova_compute[236126]: 2025-10-02 12:31:08.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:08 np0005465988 nova_compute[236126]: 2025-10-02 12:31:08.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:09 np0005465988 nova_compute[236126]: 2025-10-02 12:31:09.031 2 DEBUG oslo_concurrency.processutils [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:09 np0005465988 nova_compute[236126]: 2025-10-02 12:31:09.403 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating instance_info_cache with network_info: [{"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:31:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3832861014' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:31:09 np0005465988 nova_compute[236126]: 2025-10-02 12:31:09.870 2 DEBUG oslo_concurrency.processutils [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.839s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:09 np0005465988 nova_compute[236126]: 2025-10-02 12:31:09.877 2 DEBUG nova.compute.provider_tree [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:31:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:09.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:09 np0005465988 nova_compute[236126]: 2025-10-02 12:31:09.957 2 DEBUG nova.scheduler.client.report [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:31:09 np0005465988 nova_compute[236126]: 2025-10-02 12:31:09.962 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:09 np0005465988 nova_compute[236126]: 2025-10-02 12:31:09.962 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:31:10 np0005465988 nova_compute[236126]: 2025-10-02 12:31:10.250 2 DEBUG oslo_concurrency.lockutils [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 1.875s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:10.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:10 np0005465988 nova_compute[236126]: 2025-10-02 12:31:10.549 2 INFO nova.scheduler.client.report [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Deleted allocation for migration d51a423a-06fd-4b2f-bb2c-4aafdb99dc0f#033[00m
Oct  2 08:31:10 np0005465988 nova_compute[236126]: 2025-10-02 12:31:10.713 2 DEBUG oslo_concurrency.lockutils [None req-5027235b-01c2-4c68-b8ea-59b75016eee8 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 8.717s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:11.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:12.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e314 e314: 3 total, 3 up, 3 in
Oct  2 08:31:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e314 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:31:13 np0005465988 ovn_controller[132601]: 2025-10-02T12:31:13Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:60:9d:7c 10.100.0.5
Oct  2 08:31:13 np0005465988 nova_compute[236126]: 2025-10-02 12:31:13.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:13 np0005465988 nova_compute[236126]: 2025-10-02 12:31:13.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:13.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:14.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:15 np0005465988 nova_compute[236126]: 2025-10-02 12:31:15.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:15.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:16.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e314 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:31:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:17.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:18.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e315 e315: 3 total, 3 up, 3 in
Oct  2 08:31:18 np0005465988 nova_compute[236126]: 2025-10-02 12:31:18.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:18 np0005465988 nova_compute[236126]: 2025-10-02 12:31:18.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:19.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:20.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:21 np0005465988 podman[287815]: 2025-10-02 12:31:21.562159052 +0000 UTC m=+0.077506560 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:31:21 np0005465988 podman[287816]: 2025-10-02 12:31:21.562152352 +0000 UTC m=+0.072082134 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct  2 08:31:21 np0005465988 podman[287814]: 2025-10-02 12:31:21.593318539 +0000 UTC m=+0.108629106 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:31:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:21.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:22.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:31:23 np0005465988 nova_compute[236126]: 2025-10-02 12:31:23.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:23 np0005465988 nova_compute[236126]: 2025-10-02 12:31:23.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:23.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:24.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:24 np0005465988 nova_compute[236126]: 2025-10-02 12:31:24.334 2 DEBUG nova.compute.manager [req-c64670d9-c9d3-438b-9d32-06ba965ab631 req-a2258b61-1539-4c90-85a6-b5b8c7911a9a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received event network-changed-386c73f3-c5a1-4edb-894f-841beabaecbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:24 np0005465988 nova_compute[236126]: 2025-10-02 12:31:24.334 2 DEBUG nova.compute.manager [req-c64670d9-c9d3-438b-9d32-06ba965ab631 req-a2258b61-1539-4c90-85a6-b5b8c7911a9a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Refreshing instance network info cache due to event network-changed-386c73f3-c5a1-4edb-894f-841beabaecbd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:31:24 np0005465988 nova_compute[236126]: 2025-10-02 12:31:24.335 2 DEBUG oslo_concurrency.lockutils [req-c64670d9-c9d3-438b-9d32-06ba965ab631 req-a2258b61-1539-4c90-85a6-b5b8c7911a9a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:31:24 np0005465988 nova_compute[236126]: 2025-10-02 12:31:24.335 2 DEBUG oslo_concurrency.lockutils [req-c64670d9-c9d3-438b-9d32-06ba965ab631 req-a2258b61-1539-4c90-85a6-b5b8c7911a9a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:31:24 np0005465988 nova_compute[236126]: 2025-10-02 12:31:24.335 2 DEBUG nova.network.neutron [req-c64670d9-c9d3-438b-9d32-06ba965ab631 req-a2258b61-1539-4c90-85a6-b5b8c7911a9a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Refreshing network info cache for port 386c73f3-c5a1-4edb-894f-841beabaecbd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:31:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:25.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:26.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:26 np0005465988 nova_compute[236126]: 2025-10-02 12:31:26.655 2 DEBUG nova.compute.manager [req-5e4c2816-8e42-42c5-919b-b8c890866cd5 req-43ffd3ec-3def-46f0-b212-8525d17bef97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-changed-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:26 np0005465988 nova_compute[236126]: 2025-10-02 12:31:26.655 2 DEBUG nova.compute.manager [req-5e4c2816-8e42-42c5-919b-b8c890866cd5 req-43ffd3ec-3def-46f0-b212-8525d17bef97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Refreshing instance network info cache due to event network-changed-7cf26487-91ca-4d15-85f3-bb6a66393796. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:31:26 np0005465988 nova_compute[236126]: 2025-10-02 12:31:26.656 2 DEBUG oslo_concurrency.lockutils [req-5e4c2816-8e42-42c5-919b-b8c890866cd5 req-43ffd3ec-3def-46f0-b212-8525d17bef97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:31:26 np0005465988 nova_compute[236126]: 2025-10-02 12:31:26.656 2 DEBUG oslo_concurrency.lockutils [req-5e4c2816-8e42-42c5-919b-b8c890866cd5 req-43ffd3ec-3def-46f0-b212-8525d17bef97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:31:26 np0005465988 nova_compute[236126]: 2025-10-02 12:31:26.656 2 DEBUG nova.network.neutron [req-5e4c2816-8e42-42c5-919b-b8c890866cd5 req-43ffd3ec-3def-46f0-b212-8525d17bef97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Refreshing network info cache for port 7cf26487-91ca-4d15-85f3-bb6a66393796 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:31:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:27.361 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:27.361 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:27.362 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:31:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:31:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:31:27 np0005465988 nova_compute[236126]: 2025-10-02 12:31:27.724 2 DEBUG nova.network.neutron [req-c64670d9-c9d3-438b-9d32-06ba965ab631 req-a2258b61-1539-4c90-85a6-b5b8c7911a9a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Updated VIF entry in instance network info cache for port 386c73f3-c5a1-4edb-894f-841beabaecbd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:31:27 np0005465988 nova_compute[236126]: 2025-10-02 12:31:27.725 2 DEBUG nova.network.neutron [req-c64670d9-c9d3-438b-9d32-06ba965ab631 req-a2258b61-1539-4c90-85a6-b5b8c7911a9a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Updating instance_info_cache with network_info: [{"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:31:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:27.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:28 np0005465988 nova_compute[236126]: 2025-10-02 12:31:28.194 2 DEBUG oslo_concurrency.lockutils [req-c64670d9-c9d3-438b-9d32-06ba965ab631 req-a2258b61-1539-4c90-85a6-b5b8c7911a9a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:28.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:28 np0005465988 nova_compute[236126]: 2025-10-02 12:31:28.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:28 np0005465988 nova_compute[236126]: 2025-10-02 12:31:28.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:29 np0005465988 nova_compute[236126]: 2025-10-02 12:31:29.448 2 DEBUG nova.network.neutron [req-5e4c2816-8e42-42c5-919b-b8c890866cd5 req-43ffd3ec-3def-46f0-b212-8525d17bef97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updated VIF entry in instance network info cache for port 7cf26487-91ca-4d15-85f3-bb6a66393796. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:31:29 np0005465988 nova_compute[236126]: 2025-10-02 12:31:29.449 2 DEBUG nova.network.neutron [req-5e4c2816-8e42-42c5-919b-b8c890866cd5 req-43ffd3ec-3def-46f0-b212-8525d17bef97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating instance_info_cache with network_info: [{"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:29 np0005465988 nova_compute[236126]: 2025-10-02 12:31:29.754 2 DEBUG oslo_concurrency.lockutils [req-5e4c2816-8e42-42c5-919b-b8c890866cd5 req-43ffd3ec-3def-46f0-b212-8525d17bef97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:29.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:30.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e316 e316: 3 total, 3 up, 3 in
Oct  2 08:31:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:31.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:32 np0005465988 nova_compute[236126]: 2025-10-02 12:31:32.078 2 DEBUG oslo_concurrency.lockutils [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:31:32 np0005465988 nova_compute[236126]: 2025-10-02 12:31:32.079 2 DEBUG oslo_concurrency.lockutils [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquired lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:31:32 np0005465988 nova_compute[236126]: 2025-10-02 12:31:32.079 2 DEBUG nova.network.neutron [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:31:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:32.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e316 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #97. Immutable memtables: 0.
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.602897) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 59] Flushing memtable with next log file: 97
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408293602956, "job": 59, "event": "flush_started", "num_memtables": 1, "num_entries": 762, "num_deletes": 253, "total_data_size": 1214044, "memory_usage": 1235304, "flush_reason": "Manual Compaction"}
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 59] Level-0 flush table #98: started
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408293610188, "cf_name": "default", "job": 59, "event": "table_file_creation", "file_number": 98, "file_size": 569462, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 49946, "largest_seqno": 50703, "table_properties": {"data_size": 566249, "index_size": 1057, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 9029, "raw_average_key_size": 21, "raw_value_size": 559267, "raw_average_value_size": 1303, "num_data_blocks": 47, "num_entries": 429, "num_filter_entries": 429, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759408251, "oldest_key_time": 1759408251, "file_creation_time": 1759408293, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 98, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 59] Flush lasted 7354 microseconds, and 4216 cpu microseconds.
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.610252) [db/flush_job.cc:967] [default] [JOB 59] Level-0 flush table #98: 569462 bytes OK
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.610283) [db/memtable_list.cc:519] [default] Level-0 commit table #98 started
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.611530) [db/memtable_list.cc:722] [default] Level-0 commit table #98: memtable #1 done
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.611553) EVENT_LOG_v1 {"time_micros": 1759408293611546, "job": 59, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.611574) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 59] Try to delete WAL files size 1209981, prev total WAL file size 1209981, number of live WAL files 2.
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000094.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.612761) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031353037' seq:72057594037927935, type:22 .. '6D6772737461740031373630' seq:0, type:0; will stop at (end)
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 60] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 59 Base level 0, inputs: [98(556KB)], [96(12MB)]
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408293612878, "job": 60, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [98], "files_L6": [96], "score": -1, "input_data_size": 13278035, "oldest_snapshot_seqno": -1}
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 60] Generated table #99: 7328 keys, 9648714 bytes, temperature: kUnknown
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408293684914, "cf_name": "default", "job": 60, "event": "table_file_creation", "file_number": 99, "file_size": 9648714, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9601303, "index_size": 27971, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18373, "raw_key_size": 190154, "raw_average_key_size": 25, "raw_value_size": 9472039, "raw_average_value_size": 1292, "num_data_blocks": 1096, "num_entries": 7328, "num_filter_entries": 7328, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759408293, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 99, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.685275) [db/compaction/compaction_job.cc:1663] [default] [JOB 60] Compacted 1@0 + 1@6 files to L6 => 9648714 bytes
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.688805) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 184.1 rd, 133.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 12.1 +0.0 blob) out(9.2 +0.0 blob), read-write-amplify(40.3) write-amplify(16.9) OK, records in: 7834, records dropped: 506 output_compression: NoCompression
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.688868) EVENT_LOG_v1 {"time_micros": 1759408293688848, "job": 60, "event": "compaction_finished", "compaction_time_micros": 72129, "compaction_time_cpu_micros": 49090, "output_level": 6, "num_output_files": 1, "total_output_size": 9648714, "num_input_records": 7834, "num_output_records": 7328, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000098.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408293689294, "job": 60, "event": "table_file_deletion", "file_number": 98}
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000096.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408293693941, "job": 60, "event": "table_file_deletion", "file_number": 96}
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.612543) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.694036) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.694044) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.694046) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.694048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:31:33 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:31:33.694049) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:31:33 np0005465988 nova_compute[236126]: 2025-10-02 12:31:33.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:33 np0005465988 nova_compute[236126]: 2025-10-02 12:31:33.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:33.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:34 np0005465988 nova_compute[236126]: 2025-10-02 12:31:34.169 2 DEBUG nova.network.neutron [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Updating instance_info_cache with network_info: [{"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:34 np0005465988 nova_compute[236126]: 2025-10-02 12:31:34.216 2 DEBUG oslo_concurrency.lockutils [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Releasing lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:34 np0005465988 podman[288087]: 2025-10-02 12:31:34.269832422 +0000 UTC m=+0.080572649 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:31:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:34.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:34 np0005465988 nova_compute[236126]: 2025-10-02 12:31:34.656 2 DEBUG nova.virt.libvirt.driver [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  2 08:31:34 np0005465988 nova_compute[236126]: 2025-10-02 12:31:34.656 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Creating file /var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed/91d5f90b41cc4cab972cd4b8248ce8b6.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Oct  2 08:31:34 np0005465988 nova_compute[236126]: 2025-10-02 12:31:34.657 2 DEBUG oslo_concurrency.processutils [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed/91d5f90b41cc4cab972cd4b8248ce8b6.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:34 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:31:34 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:31:35 np0005465988 nova_compute[236126]: 2025-10-02 12:31:35.164 2 DEBUG oslo_concurrency.processutils [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed/91d5f90b41cc4cab972cd4b8248ce8b6.tmp" returned: 1 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:35 np0005465988 nova_compute[236126]: 2025-10-02 12:31:35.165 2 DEBUG oslo_concurrency.processutils [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed/91d5f90b41cc4cab972cd4b8248ce8b6.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Oct  2 08:31:35 np0005465988 nova_compute[236126]: 2025-10-02 12:31:35.166 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Creating directory /var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Oct  2 08:31:35 np0005465988 nova_compute[236126]: 2025-10-02 12:31:35.167 2 DEBUG oslo_concurrency.processutils [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:35 np0005465988 nova_compute[236126]: 2025-10-02 12:31:35.397 2 DEBUG oslo_concurrency.processutils [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/c8b713f4-4f41-4153-928c-164f2ed108ed" returned: 0 in 0.231s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:35 np0005465988 nova_compute[236126]: 2025-10-02 12:31:35.406 2 DEBUG nova.virt.libvirt.driver [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:31:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:35.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:36.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e316 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:31:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:37.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:38.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:38 np0005465988 kernel: tap386c73f3-c5 (unregistering): left promiscuous mode
Oct  2 08:31:38 np0005465988 NetworkManager[45041]: <info>  [1759408298.4256] device (tap386c73f3-c5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.429 2 INFO nova.virt.libvirt.driver [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:31:38Z|00545|binding|INFO|Releasing lport 386c73f3-c5a1-4edb-894f-841beabaecbd from this chassis (sb_readonly=0)
Oct  2 08:31:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:31:38Z|00546|binding|INFO|Setting lport 386c73f3-c5a1-4edb-894f-841beabaecbd down in Southbound
Oct  2 08:31:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:31:38Z|00547|binding|INFO|Removing iface tap386c73f3-c5 ovn-installed in OVS
Oct  2 08:31:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:38.447 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:65:0d 10.100.0.4'], port_security=['fa:16:3e:94:65:0d 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c8b713f4-4f41-4153-928c-164f2ed108ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-585473f8-52e4-4e55-96df-8a236d361126', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5533aaac08cd4856af72ef4992bb5e76', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a7e36b3-799e-47d8-a152-7f7146431afe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec297f04-3bda-490f-87d3-1f684caf96fd, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=386c73f3-c5a1-4edb-894f-841beabaecbd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:31:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:38.448 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 386c73f3-c5a1-4edb-894f-841beabaecbd in datapath 585473f8-52e4-4e55-96df-8a236d361126 unbound from our chassis#033[00m
Oct  2 08:31:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:38.450 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 585473f8-52e4-4e55-96df-8a236d361126#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:38.479 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[332fe4d8-f9cb-4373-a7bb-896f5fcdd765]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:38.519 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[7911712f-33d7-44c1-8305-43788cc2a920]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:38 np0005465988 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000073.scope: Deactivated successfully.
Oct  2 08:31:38 np0005465988 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000073.scope: Consumed 17.982s CPU time.
Oct  2 08:31:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:38.522 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[462c71bb-c263-4c90-aa22-c8a5f82e8a60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:38 np0005465988 systemd-machined[192594]: Machine qemu-51-instance-00000073 terminated.
Oct  2 08:31:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:38.555 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[01dbcb6d-b207-46f8-9997-3ed2e1f52f32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:38.575 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b74b3684-8d21-4c92-a5e3-740603afaf79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap585473f8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:8e:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 16, 'rx_bytes': 1084, 'tx_bytes': 860, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 16, 'rx_bytes': 1084, 'tx_bytes': 860, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624727, 'reachable_time': 36955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288148, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:38.592 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8bbe806e-886a-44f5-bafa-34221543a487]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap585473f8-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624742, 'tstamp': 624742}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288149, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap585473f8-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624747, 'tstamp': 624747}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288149, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:38.593 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap585473f8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:38.600 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap585473f8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:38.600 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:31:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:38.601 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap585473f8-50, col_values=(('external_ids', {'iface-id': '02b7597d-2fc1-4c56-8603-4dcb0c716c82'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:38.601 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.677 2 INFO nova.virt.libvirt.driver [-] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Instance destroyed successfully.#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.678 2 DEBUG nova.virt.libvirt.vif [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:30:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=115,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBEmvr2XQXDnaV0WQDbbXt57cEK6okdC4PHEYdjpQBx2HU9OQgvgRTm3sGWmsa/AInUTPV9ABsCq2lJ9PCqfb1WP51XCZeB9QBIxafEy8h788huF0550ajkopZIwmSLpiA==',key_name='tempest-keypair-425033456',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:30:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5533aaac08cd4856af72ef4992bb5e76',ramdisk_id='',reservation_id='r-1m21sn7g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-1564585024',owner_user_name='tempest-AttachVolumeMultiAttachTest-1564585024-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:31:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22d56fcd2a4b4851bfd126ae4548ee9b',uuid=c8b713f4-4f41-4153-928c-164f2ed108ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "vif_mac": "fa:16:3e:94:65:0d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.679 2 DEBUG nova.network.os_vif_util [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converting VIF {"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "vif_mac": "fa:16:3e:94:65:0d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.680 2 DEBUG nova.network.os_vif_util [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:94:65:0d,bridge_name='br-int',has_traffic_filtering=True,id=386c73f3-c5a1-4edb-894f-841beabaecbd,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap386c73f3-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.680 2 DEBUG os_vif [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:94:65:0d,bridge_name='br-int',has_traffic_filtering=True,id=386c73f3-c5a1-4edb-894f-841beabaecbd,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap386c73f3-c5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.682 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap386c73f3-c5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.689 2 INFO os_vif [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:94:65:0d,bridge_name='br-int',has_traffic_filtering=True,id=386c73f3-c5a1-4edb-894f-841beabaecbd,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap386c73f3-c5')#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e317 e317: 3 total, 3 up, 3 in
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.954 2 INFO nova.virt.libvirt.driver [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Detected multiple connections on this host for volume: 1f1fe097-f4b6-4748-bf18-8e487e0f3ba6, skipping target disconnect.#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.960 2 DEBUG nova.virt.libvirt.driver [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] skipping disk for instance-00000073 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.960 2 DEBUG nova.virt.libvirt.driver [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] skipping disk for instance-00000073 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:31:38 np0005465988 nova_compute[236126]: 2025-10-02 12:31:38.960 2 DEBUG nova.virt.libvirt.driver [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] skipping disk for instance-00000073 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:31:39 np0005465988 nova_compute[236126]: 2025-10-02 12:31:39.758 2 DEBUG nova.compute.manager [req-4ee1c75a-8fff-4407-94c5-a4dd6bdf635f req-7c1a4234-8e38-45c3-ba9e-b5f4dd8e18c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received event network-vif-unplugged-386c73f3-c5a1-4edb-894f-841beabaecbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:39 np0005465988 nova_compute[236126]: 2025-10-02 12:31:39.759 2 DEBUG oslo_concurrency.lockutils [req-4ee1c75a-8fff-4407-94c5-a4dd6bdf635f req-7c1a4234-8e38-45c3-ba9e-b5f4dd8e18c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:39 np0005465988 nova_compute[236126]: 2025-10-02 12:31:39.759 2 DEBUG oslo_concurrency.lockutils [req-4ee1c75a-8fff-4407-94c5-a4dd6bdf635f req-7c1a4234-8e38-45c3-ba9e-b5f4dd8e18c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:39 np0005465988 nova_compute[236126]: 2025-10-02 12:31:39.760 2 DEBUG oslo_concurrency.lockutils [req-4ee1c75a-8fff-4407-94c5-a4dd6bdf635f req-7c1a4234-8e38-45c3-ba9e-b5f4dd8e18c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:39 np0005465988 nova_compute[236126]: 2025-10-02 12:31:39.760 2 DEBUG nova.compute.manager [req-4ee1c75a-8fff-4407-94c5-a4dd6bdf635f req-7c1a4234-8e38-45c3-ba9e-b5f4dd8e18c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] No waiting events found dispatching network-vif-unplugged-386c73f3-c5a1-4edb-894f-841beabaecbd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:39 np0005465988 nova_compute[236126]: 2025-10-02 12:31:39.761 2 WARNING nova.compute.manager [req-4ee1c75a-8fff-4407-94c5-a4dd6bdf635f req-7c1a4234-8e38-45c3-ba9e-b5f4dd8e18c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received unexpected event network-vif-unplugged-386c73f3-c5a1-4edb-894f-841beabaecbd for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  2 08:31:39 np0005465988 nova_compute[236126]: 2025-10-02 12:31:39.861 2 DEBUG neutronclient.v2_0.client [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 386c73f3-c5a1-4edb-894f-841beabaecbd for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  2 08:31:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:39.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:40 np0005465988 nova_compute[236126]: 2025-10-02 12:31:40.084 2 DEBUG oslo_concurrency.lockutils [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:40 np0005465988 nova_compute[236126]: 2025-10-02 12:31:40.085 2 DEBUG oslo_concurrency.lockutils [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:40 np0005465988 nova_compute[236126]: 2025-10-02 12:31:40.085 2 DEBUG oslo_concurrency.lockutils [None req-bbfe8bb3-4f9f-443e-a7a1-89d804bda2d7 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:40.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:41.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:42.033 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.034 2 DEBUG nova.compute.manager [req-7b2c8041-3f9a-4f8d-b264-88de9512b43a req-bc436fcc-6c04-4315-874e-e8054022ce55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received event network-changed-386c73f3-c5a1-4edb-894f-841beabaecbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.035 2 DEBUG nova.compute.manager [req-7b2c8041-3f9a-4f8d-b264-88de9512b43a req-bc436fcc-6c04-4315-874e-e8054022ce55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Refreshing instance network info cache due to event network-changed-386c73f3-c5a1-4edb-894f-841beabaecbd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:31:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:42.035 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.036 2 DEBUG oslo_concurrency.lockutils [req-7b2c8041-3f9a-4f8d-b264-88de9512b43a req-bc436fcc-6c04-4315-874e-e8054022ce55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.036 2 DEBUG oslo_concurrency.lockutils [req-7b2c8041-3f9a-4f8d-b264-88de9512b43a req-bc436fcc-6c04-4315-874e-e8054022ce55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.036 2 DEBUG nova.network.neutron [req-7b2c8041-3f9a-4f8d-b264-88de9512b43a req-bc436fcc-6c04-4315-874e-e8054022ce55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Refreshing network info cache for port 386c73f3-c5a1-4edb-894f-841beabaecbd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.069 2 DEBUG nova.compute.manager [req-1a1bef82-45d0-49f3-92f7-1c2cec8353c0 req-98ab2c51-3289-493c-99ca-3094bdfc058d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received event network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.069 2 DEBUG oslo_concurrency.lockutils [req-1a1bef82-45d0-49f3-92f7-1c2cec8353c0 req-98ab2c51-3289-493c-99ca-3094bdfc058d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.070 2 DEBUG oslo_concurrency.lockutils [req-1a1bef82-45d0-49f3-92f7-1c2cec8353c0 req-98ab2c51-3289-493c-99ca-3094bdfc058d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.070 2 DEBUG oslo_concurrency.lockutils [req-1a1bef82-45d0-49f3-92f7-1c2cec8353c0 req-98ab2c51-3289-493c-99ca-3094bdfc058d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.071 2 DEBUG nova.compute.manager [req-1a1bef82-45d0-49f3-92f7-1c2cec8353c0 req-98ab2c51-3289-493c-99ca-3094bdfc058d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] No waiting events found dispatching network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.071 2 WARNING nova.compute.manager [req-1a1bef82-45d0-49f3-92f7-1c2cec8353c0 req-98ab2c51-3289-493c-99ca-3094bdfc058d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received unexpected event network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd for instance with vm_state active and task_state resize_migrated.
Oct  2 08:31:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:42.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.532 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Acquiring lock "d949b168-1d5a-4487-8380-e99f5847c0fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.533 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.588 2 DEBUG nova.compute.manager [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.743 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.744 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.752 2 DEBUG nova.virt.hardware [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:31:42 np0005465988 nova_compute[236126]: 2025-10-02 12:31:42.752 2 INFO nova.compute.claims [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:31:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e317 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.048 2 DEBUG oslo_concurrency.processutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:31:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:31:43 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/74631981' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.560 2 DEBUG oslo_concurrency.processutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.569 2 DEBUG nova.compute.provider_tree [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.608 2 DEBUG nova.scheduler.client.report [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.636 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.892s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.637 2 DEBUG nova.compute.manager [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.698 2 DEBUG nova.compute.manager [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.698 2 DEBUG nova.network.neutron [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.721 2 INFO nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.742 2 DEBUG nova.compute.manager [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.846 2 DEBUG nova.compute.manager [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.848 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.849 2 INFO nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Creating image(s)
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.885 2 DEBUG nova.storage.rbd_utils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] rbd image d949b168-1d5a-4487-8380-e99f5847c0fd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.919 2 DEBUG nova.storage.rbd_utils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] rbd image d949b168-1d5a-4487-8380-e99f5847c0fd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:31:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:43.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.955 2 DEBUG nova.storage.rbd_utils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] rbd image d949b168-1d5a-4487-8380-e99f5847c0fd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:31:43 np0005465988 nova_compute[236126]: 2025-10-02 12:31:43.960 2 DEBUG oslo_concurrency.processutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:31:44 np0005465988 nova_compute[236126]: 2025-10-02 12:31:44.079 2 DEBUG oslo_concurrency.processutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:31:44 np0005465988 nova_compute[236126]: 2025-10-02 12:31:44.081 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:31:44 np0005465988 nova_compute[236126]: 2025-10-02 12:31:44.083 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:31:44 np0005465988 nova_compute[236126]: 2025-10-02 12:31:44.083 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:31:44 np0005465988 nova_compute[236126]: 2025-10-02 12:31:44.117 2 DEBUG nova.storage.rbd_utils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] rbd image d949b168-1d5a-4487-8380-e99f5847c0fd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:31:44 np0005465988 nova_compute[236126]: 2025-10-02 12:31:44.122 2 DEBUG oslo_concurrency.processutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 d949b168-1d5a-4487-8380-e99f5847c0fd_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:31:44 np0005465988 nova_compute[236126]: 2025-10-02 12:31:44.209 2 DEBUG nova.policy [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4a89b71e2513413e922ee6d5d06362b1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '27a1729bf10548219b90df46839849f5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:31:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:44.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:44 np0005465988 nova_compute[236126]: 2025-10-02 12:31:44.439 2 DEBUG nova.network.neutron [req-7b2c8041-3f9a-4f8d-b264-88de9512b43a req-bc436fcc-6c04-4315-874e-e8054022ce55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Updated VIF entry in instance network info cache for port 386c73f3-c5a1-4edb-894f-841beabaecbd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:31:44 np0005465988 nova_compute[236126]: 2025-10-02 12:31:44.440 2 DEBUG nova.network.neutron [req-7b2c8041-3f9a-4f8d-b264-88de9512b43a req-bc436fcc-6c04-4315-874e-e8054022ce55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Updating instance_info_cache with network_info: [{"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:31:44 np0005465988 nova_compute[236126]: 2025-10-02 12:31:44.469 2 DEBUG oslo_concurrency.lockutils [req-7b2c8041-3f9a-4f8d-b264-88de9512b43a req-bc436fcc-6c04-4315-874e-e8054022ce55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:31:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:31:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1365739230' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:31:45 np0005465988 nova_compute[236126]: 2025-10-02 12:31:45.386 2 DEBUG nova.network.neutron [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Successfully created port: 3e2b3141-d578-4c23-bf29-c676fefd32ec _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:31:45 np0005465988 nova_compute[236126]: 2025-10-02 12:31:45.735 2 DEBUG oslo_concurrency.processutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 d949b168-1d5a-4487-8380-e99f5847c0fd_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.613s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:31:45 np0005465988 nova_compute[236126]: 2025-10-02 12:31:45.816 2 DEBUG nova.storage.rbd_utils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] resizing rbd image d949b168-1d5a-4487-8380-e99f5847c0fd_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct  2 08:31:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:45.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:46.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:46 np0005465988 nova_compute[236126]: 2025-10-02 12:31:46.316 2 DEBUG nova.objects.instance [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lazy-loading 'migration_context' on Instance uuid d949b168-1d5a-4487-8380-e99f5847c0fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:31:46 np0005465988 nova_compute[236126]: 2025-10-02 12:31:46.353 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:31:46 np0005465988 nova_compute[236126]: 2025-10-02 12:31:46.354 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Ensure instance console log exists: /var/lib/nova/instances/d949b168-1d5a-4487-8380-e99f5847c0fd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:31:46 np0005465988 nova_compute[236126]: 2025-10-02 12:31:46.355 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:31:46 np0005465988 nova_compute[236126]: 2025-10-02 12:31:46.356 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:31:46 np0005465988 nova_compute[236126]: 2025-10-02 12:31:46.357 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:31:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e318 e318: 3 total, 3 up, 3 in
Oct  2 08:31:47 np0005465988 nova_compute[236126]: 2025-10-02 12:31:47.038 2 DEBUG nova.network.neutron [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Successfully updated port: 3e2b3141-d578-4c23-bf29-c676fefd32ec _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:31:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:47.038 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:31:47 np0005465988 nova_compute[236126]: 2025-10-02 12:31:47.068 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Acquiring lock "refresh_cache-d949b168-1d5a-4487-8380-e99f5847c0fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:31:47 np0005465988 nova_compute[236126]: 2025-10-02 12:31:47.068 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Acquired lock "refresh_cache-d949b168-1d5a-4487-8380-e99f5847c0fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:31:47 np0005465988 nova_compute[236126]: 2025-10-02 12:31:47.069 2 DEBUG nova.network.neutron [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:31:47 np0005465988 nova_compute[236126]: 2025-10-02 12:31:47.131 2 DEBUG nova.compute.manager [req-79a37cd7-9cb0-4482-97eb-deac986ad4e1 req-e44cb91c-cb28-490c-a219-a961a7973de8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Received event network-changed-3e2b3141-d578-4c23-bf29-c676fefd32ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:31:47 np0005465988 nova_compute[236126]: 2025-10-02 12:31:47.132 2 DEBUG nova.compute.manager [req-79a37cd7-9cb0-4482-97eb-deac986ad4e1 req-e44cb91c-cb28-490c-a219-a961a7973de8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Refreshing instance network info cache due to event network-changed-3e2b3141-d578-4c23-bf29-c676fefd32ec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:31:47 np0005465988 nova_compute[236126]: 2025-10-02 12:31:47.132 2 DEBUG oslo_concurrency.lockutils [req-79a37cd7-9cb0-4482-97eb-deac986ad4e1 req-e44cb91c-cb28-490c-a219-a961a7973de8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-d949b168-1d5a-4487-8380-e99f5847c0fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:31:47 np0005465988 nova_compute[236126]: 2025-10-02 12:31:47.276 2 DEBUG nova.network.neutron [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:31:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:31:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:47.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.277 2 DEBUG nova.network.neutron [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Updating instance_info_cache with network_info: [{"id": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "address": "fa:16:3e:66:16:87", "network": {"id": "247d774d-0cc8-4ef2-a9b8-c756adae0874", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1434934627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27a1729bf10548219b90df46839849f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e2b3141-d5", "ovs_interfaceid": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.304 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Releasing lock "refresh_cache-d949b168-1d5a-4487-8380-e99f5847c0fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.305 2 DEBUG nova.compute.manager [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Instance network_info: |[{"id": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "address": "fa:16:3e:66:16:87", "network": {"id": "247d774d-0cc8-4ef2-a9b8-c756adae0874", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1434934627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27a1729bf10548219b90df46839849f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e2b3141-d5", "ovs_interfaceid": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.306 2 DEBUG oslo_concurrency.lockutils [req-79a37cd7-9cb0-4482-97eb-deac986ad4e1 req-e44cb91c-cb28-490c-a219-a961a7973de8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-d949b168-1d5a-4487-8380-e99f5847c0fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.306 2 DEBUG nova.network.neutron [req-79a37cd7-9cb0-4482-97eb-deac986ad4e1 req-e44cb91c-cb28-490c-a219-a961a7973de8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Refreshing network info cache for port 3e2b3141-d578-4c23-bf29-c676fefd32ec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.310 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Start _get_guest_xml network_info=[{"id": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "address": "fa:16:3e:66:16:87", "network": {"id": "247d774d-0cc8-4ef2-a9b8-c756adae0874", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1434934627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27a1729bf10548219b90df46839849f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e2b3141-d5", "ovs_interfaceid": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:31:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:48.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.317 2 WARNING nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.324 2 DEBUG nova.virt.libvirt.host [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.325 2 DEBUG nova.virt.libvirt.host [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.329 2 DEBUG nova.virt.libvirt.host [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.330 2 DEBUG nova.virt.libvirt.host [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.331 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.331 2 DEBUG nova.virt.hardware [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.332 2 DEBUG nova.virt.hardware [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.332 2 DEBUG nova.virt.hardware [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.332 2 DEBUG nova.virt.hardware [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.333 2 DEBUG nova.virt.hardware [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.333 2 DEBUG nova.virt.hardware [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.333 2 DEBUG nova.virt.hardware [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.334 2 DEBUG nova.virt.hardware [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.334 2 DEBUG nova.virt.hardware [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.334 2 DEBUG nova.virt.hardware [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.335 2 DEBUG nova.virt.hardware [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.338 2 DEBUG oslo_concurrency.processutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:31:48 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/250638847' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.811 2 DEBUG oslo_concurrency.processutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.847 2 DEBUG nova.storage.rbd_utils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] rbd image d949b168-1d5a-4487-8380-e99f5847c0fd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:31:48 np0005465988 nova_compute[236126]: 2025-10-02 12:31:48.853 2 DEBUG oslo_concurrency.processutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.044 2 DEBUG nova.compute.manager [req-6d7460c6-8c4e-4524-a2dc-013a535677e5 req-a968322e-0b00-4386-97e6-58f07cf3d31e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received event network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.044 2 DEBUG oslo_concurrency.lockutils [req-6d7460c6-8c4e-4524-a2dc-013a535677e5 req-a968322e-0b00-4386-97e6-58f07cf3d31e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.045 2 DEBUG oslo_concurrency.lockutils [req-6d7460c6-8c4e-4524-a2dc-013a535677e5 req-a968322e-0b00-4386-97e6-58f07cf3d31e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.045 2 DEBUG oslo_concurrency.lockutils [req-6d7460c6-8c4e-4524-a2dc-013a535677e5 req-a968322e-0b00-4386-97e6-58f07cf3d31e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.045 2 DEBUG nova.compute.manager [req-6d7460c6-8c4e-4524-a2dc-013a535677e5 req-a968322e-0b00-4386-97e6-58f07cf3d31e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] No waiting events found dispatching network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.045 2 WARNING nova.compute.manager [req-6d7460c6-8c4e-4524-a2dc-013a535677e5 req-a968322e-0b00-4386-97e6-58f07cf3d31e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received unexpected event network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd for instance with vm_state active and task_state resize_finish.#033[00m
Oct  2 08:31:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:31:49 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3523354863' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.301 2 DEBUG oslo_concurrency.processutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.303 2 DEBUG nova.virt.libvirt.vif [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:31:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-870650247',display_name='tempest-ServerDiskConfigTestJSON-server-870650247',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-870650247',id=120,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='27a1729bf10548219b90df46839849f5',ramdisk_id='',reservation_id='r-qovnmdul',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1123059068',owner_user_name='tempest-ServerDiskConf
igTestJSON-1123059068-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:31:43Z,user_data=None,user_id='4a89b71e2513413e922ee6d5d06362b1',uuid=d949b168-1d5a-4487-8380-e99f5847c0fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "address": "fa:16:3e:66:16:87", "network": {"id": "247d774d-0cc8-4ef2-a9b8-c756adae0874", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1434934627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27a1729bf10548219b90df46839849f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e2b3141-d5", "ovs_interfaceid": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.303 2 DEBUG nova.network.os_vif_util [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Converting VIF {"id": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "address": "fa:16:3e:66:16:87", "network": {"id": "247d774d-0cc8-4ef2-a9b8-c756adae0874", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1434934627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27a1729bf10548219b90df46839849f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e2b3141-d5", "ovs_interfaceid": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.304 2 DEBUG nova.network.os_vif_util [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:16:87,bridge_name='br-int',has_traffic_filtering=True,id=3e2b3141-d578-4c23-bf29-c676fefd32ec,network=Network(247d774d-0cc8-4ef2-a9b8-c756adae0874),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e2b3141-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.306 2 DEBUG nova.objects.instance [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lazy-loading 'pci_devices' on Instance uuid d949b168-1d5a-4487-8380-e99f5847c0fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.423 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  <uuid>d949b168-1d5a-4487-8380-e99f5847c0fd</uuid>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  <name>instance-00000078</name>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-870650247</nova:name>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:31:48</nova:creationTime>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <nova:user uuid="4a89b71e2513413e922ee6d5d06362b1">tempest-ServerDiskConfigTestJSON-1123059068-project-member</nova:user>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <nova:project uuid="27a1729bf10548219b90df46839849f5">tempest-ServerDiskConfigTestJSON-1123059068</nova:project>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <nova:port uuid="3e2b3141-d578-4c23-bf29-c676fefd32ec">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <entry name="serial">d949b168-1d5a-4487-8380-e99f5847c0fd</entry>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <entry name="uuid">d949b168-1d5a-4487-8380-e99f5847c0fd</entry>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/d949b168-1d5a-4487-8380-e99f5847c0fd_disk">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/d949b168-1d5a-4487-8380-e99f5847c0fd_disk.config">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:66:16:87"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <target dev="tap3e2b3141-d5"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/d949b168-1d5a-4487-8380-e99f5847c0fd/console.log" append="off"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:31:49 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:31:49 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:31:49 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:31:49 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.423 2 DEBUG nova.compute.manager [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Preparing to wait for external event network-vif-plugged-3e2b3141-d578-4c23-bf29-c676fefd32ec prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.424 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Acquiring lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.424 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.424 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.425 2 DEBUG nova.virt.libvirt.vif [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:31:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-870650247',display_name='tempest-ServerDiskConfigTestJSON-server-870650247',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-870650247',id=120,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='27a1729bf10548219b90df46839849f5',ramdisk_id='',reservation_id='r-qovnmdul',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1123059068',owner_user_name='tempest-ServerDiskConfigTestJSON-1123059068-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:31:43Z,user_data=None,user_id='4a89b71e2513413e922ee6d5d06362b1',uuid=d949b168-1d5a-4487-8380-e99f5847c0fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "address": "fa:16:3e:66:16:87", "network": {"id": "247d774d-0cc8-4ef2-a9b8-c756adae0874", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1434934627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27a1729bf10548219b90df46839849f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e2b3141-d5", "ovs_interfaceid": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.425 2 DEBUG nova.network.os_vif_util [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Converting VIF {"id": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "address": "fa:16:3e:66:16:87", "network": {"id": "247d774d-0cc8-4ef2-a9b8-c756adae0874", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1434934627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27a1729bf10548219b90df46839849f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e2b3141-d5", "ovs_interfaceid": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.426 2 DEBUG nova.network.os_vif_util [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:16:87,bridge_name='br-int',has_traffic_filtering=True,id=3e2b3141-d578-4c23-bf29-c676fefd32ec,network=Network(247d774d-0cc8-4ef2-a9b8-c756adae0874),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e2b3141-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.427 2 DEBUG os_vif [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:16:87,bridge_name='br-int',has_traffic_filtering=True,id=3e2b3141-d578-4c23-bf29-c676fefd32ec,network=Network(247d774d-0cc8-4ef2-a9b8-c756adae0874),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e2b3141-d5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.428 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.428 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.432 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e2b3141-d5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.432 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3e2b3141-d5, col_values=(('external_ids', {'iface-id': '3e2b3141-d578-4c23-bf29-c676fefd32ec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:66:16:87', 'vm-uuid': 'd949b168-1d5a-4487-8380-e99f5847c0fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:49 np0005465988 NetworkManager[45041]: <info>  [1759408309.4355] manager: (tap3e2b3141-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.441 2 INFO os_vif [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:16:87,bridge_name='br-int',has_traffic_filtering=True,id=3e2b3141-d578-4c23-bf29-c676fefd32ec,network=Network(247d774d-0cc8-4ef2-a9b8-c756adae0874),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e2b3141-d5')#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.705 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.706 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.706 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] No VIF found with MAC fa:16:3e:66:16:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.708 2 INFO nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Using config drive#033[00m
Oct  2 08:31:49 np0005465988 nova_compute[236126]: 2025-10-02 12:31:49.749 2 DEBUG nova.storage.rbd_utils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] rbd image d949b168-1d5a-4487-8380-e99f5847c0fd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:31:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:49.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:50.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:50 np0005465988 nova_compute[236126]: 2025-10-02 12:31:50.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:50 np0005465988 nova_compute[236126]: 2025-10-02 12:31:50.602 2 INFO nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Creating config drive at /var/lib/nova/instances/d949b168-1d5a-4487-8380-e99f5847c0fd/disk.config#033[00m
Oct  2 08:31:50 np0005465988 nova_compute[236126]: 2025-10-02 12:31:50.612 2 DEBUG oslo_concurrency.processutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d949b168-1d5a-4487-8380-e99f5847c0fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi778y469 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:50 np0005465988 nova_compute[236126]: 2025-10-02 12:31:50.780 2 DEBUG oslo_concurrency.processutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d949b168-1d5a-4487-8380-e99f5847c0fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi778y469" returned: 0 in 0.168s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:50 np0005465988 nova_compute[236126]: 2025-10-02 12:31:50.864 2 DEBUG nova.storage.rbd_utils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] rbd image d949b168-1d5a-4487-8380-e99f5847c0fd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:31:50 np0005465988 nova_compute[236126]: 2025-10-02 12:31:50.876 2 DEBUG oslo_concurrency.processutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d949b168-1d5a-4487-8380-e99f5847c0fd/disk.config d949b168-1d5a-4487-8380-e99f5847c0fd_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.065 2 DEBUG oslo_concurrency.processutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d949b168-1d5a-4487-8380-e99f5847c0fd/disk.config d949b168-1d5a-4487-8380-e99f5847c0fd_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.189s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.066 2 INFO nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Deleting local config drive /var/lib/nova/instances/d949b168-1d5a-4487-8380-e99f5847c0fd/disk.config because it was imported into RBD.#033[00m
Oct  2 08:31:51 np0005465988 kernel: tap3e2b3141-d5: entered promiscuous mode
Oct  2 08:31:51 np0005465988 NetworkManager[45041]: <info>  [1759408311.1401] manager: (tap3e2b3141-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/249)
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:31:51Z|00548|binding|INFO|Claiming lport 3e2b3141-d578-4c23-bf29-c676fefd32ec for this chassis.
Oct  2 08:31:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:31:51Z|00549|binding|INFO|3e2b3141-d578-4c23-bf29-c676fefd32ec: Claiming fa:16:3e:66:16:87 10.100.0.10
Oct  2 08:31:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:31:51Z|00550|binding|INFO|Setting lport 3e2b3141-d578-4c23-bf29-c676fefd32ec ovn-installed in OVS
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:31:51Z|00551|binding|INFO|Setting lport 3e2b3141-d578-4c23-bf29-c676fefd32ec up in Southbound
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.170 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:16:87 10.100.0.10'], port_security=['fa:16:3e:66:16:87 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd949b168-1d5a-4487-8380-e99f5847c0fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-247d774d-0cc8-4ef2-a9b8-c756adae0874', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27a1729bf10548219b90df46839849f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '19f6d4f0-1655-4062-a124-10140844bfae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f7e0b23-d51b-4498-9dd8-e3096f69c99c, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3e2b3141-d578-4c23-bf29-c676fefd32ec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.172 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3e2b3141-d578-4c23-bf29-c676fefd32ec in datapath 247d774d-0cc8-4ef2-a9b8-c756adae0874 bound to our chassis#033[00m
Oct  2 08:31:51 np0005465988 systemd-udevd[288543]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.175 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 247d774d-0cc8-4ef2-a9b8-c756adae0874#033[00m
Oct  2 08:31:51 np0005465988 systemd-machined[192594]: New machine qemu-53-instance-00000078.
Oct  2 08:31:51 np0005465988 NetworkManager[45041]: <info>  [1759408311.1892] device (tap3e2b3141-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:31:51 np0005465988 NetworkManager[45041]: <info>  [1759408311.1908] device (tap3e2b3141-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.192 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ced3fde9-7788-4235-96f9-1c90227dbfea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.194 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap247d774d-01 in ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:31:51 np0005465988 systemd[1]: Started Virtual Machine qemu-53-instance-00000078.
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.197 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap247d774d-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.197 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bf9bcccd-4ba8-429e-ad90-4c80bd492f1f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.199 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[34efc2b9-1d92-48a8-b161-084801b55666]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.215 2 DEBUG nova.compute.manager [req-2e4aa462-ef48-49ae-9fcc-8a2bd97a3379 req-37ef9860-985d-405e-999c-50fa4f7c611b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received event network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.215 2 DEBUG oslo_concurrency.lockutils [req-2e4aa462-ef48-49ae-9fcc-8a2bd97a3379 req-37ef9860-985d-405e-999c-50fa4f7c611b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.215 2 DEBUG oslo_concurrency.lockutils [req-2e4aa462-ef48-49ae-9fcc-8a2bd97a3379 req-37ef9860-985d-405e-999c-50fa4f7c611b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.216 2 DEBUG oslo_concurrency.lockutils [req-2e4aa462-ef48-49ae-9fcc-8a2bd97a3379 req-37ef9860-985d-405e-999c-50fa4f7c611b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.216 2 DEBUG nova.compute.manager [req-2e4aa462-ef48-49ae-9fcc-8a2bd97a3379 req-37ef9860-985d-405e-999c-50fa4f7c611b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] No waiting events found dispatching network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.216 2 WARNING nova.compute.manager [req-2e4aa462-ef48-49ae-9fcc-8a2bd97a3379 req-37ef9860-985d-405e-999c-50fa4f7c611b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Received unexpected event network-vif-plugged-386c73f3-c5a1-4edb-894f-841beabaecbd for instance with vm_state resized and task_state None.#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.219 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[20f66349-ee4a-4db4-891e-fdac4ddc9ca8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.250 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8c2c727a-7419-41f4-ad8c-9d74c8d40105]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.289 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[aa399bf0-e31c-4dea-9aa7-34edc26749d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.296 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8ea15740-e967-43c3-be6f-543dbb2545f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 NetworkManager[45041]: <info>  [1759408311.2993] manager: (tap247d774d-00): new Veth device (/org/freedesktop/NetworkManager/Devices/250)
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.340 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f910141d-aa12-4b18-a5b8-6bc6613568ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.345 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[b9fc9688-d117-4873-8392-376981a8c536]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 NetworkManager[45041]: <info>  [1759408311.3747] device (tap247d774d-00): carrier: link connected
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.382 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[533e9f20-63b9-4ab7-8a62-4668d3e9dda5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.405 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5e33d49e-9f41-4ce0-9952-e1aefa1d48d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap247d774d-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:ab:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637170, 'reachable_time': 28470, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288577, 'error': None, 'target': 'ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.432 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[129960ec-be1e-47b8-9571-2c888014bf07]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:ab18'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 637170, 'tstamp': 637170}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288578, 'error': None, 'target': 'ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.455 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ffa77477-d54d-4ac9-8a9e-ac8b8b30792b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap247d774d-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:ab:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 220, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 220, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637170, 'reachable_time': 28470, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 288579, 'error': None, 'target': 'ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.502 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[803cee61-8d18-4542-a382-c3c96135745c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.591 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7c41fe4b-feb0-4ff1-8a98-de3770629ca3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.593 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap247d774d-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.594 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.595 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap247d774d-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:51 np0005465988 NetworkManager[45041]: <info>  [1759408311.6012] manager: (tap247d774d-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Oct  2 08:31:51 np0005465988 kernel: tap247d774d-00: entered promiscuous mode
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.607 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap247d774d-00, col_values=(('external_ids', {'iface-id': '04584168-a51c-41f9-9206-d39db8a81566'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:31:51Z|00552|binding|INFO|Releasing lport 04584168-a51c-41f9-9206-d39db8a81566 from this chassis (sb_readonly=0)
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.613 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/247d774d-0cc8-4ef2-a9b8-c756adae0874.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/247d774d-0cc8-4ef2-a9b8-c756adae0874.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.615 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d8bedbe5-4518-47ce-a864-e5fc78aa0f96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.616 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-247d774d-0cc8-4ef2-a9b8-c756adae0874
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/247d774d-0cc8-4ef2-a9b8-c756adae0874.pid.haproxy
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 247d774d-0cc8-4ef2-a9b8-c756adae0874
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:31:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:51.618 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874', 'env', 'PROCESS_TAG=haproxy-247d774d-0cc8-4ef2-a9b8-c756adae0874', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/247d774d-0cc8-4ef2-a9b8-c756adae0874.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.777 2 DEBUG nova.network.neutron [req-79a37cd7-9cb0-4482-97eb-deac986ad4e1 req-e44cb91c-cb28-490c-a219-a961a7973de8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Updated VIF entry in instance network info cache for port 3e2b3141-d578-4c23-bf29-c676fefd32ec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.777 2 DEBUG nova.network.neutron [req-79a37cd7-9cb0-4482-97eb-deac986ad4e1 req-e44cb91c-cb28-490c-a219-a961a7973de8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Updating instance_info_cache with network_info: [{"id": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "address": "fa:16:3e:66:16:87", "network": {"id": "247d774d-0cc8-4ef2-a9b8-c756adae0874", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1434934627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27a1729bf10548219b90df46839849f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e2b3141-d5", "ovs_interfaceid": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:51 np0005465988 nova_compute[236126]: 2025-10-02 12:31:51.830 2 DEBUG oslo_concurrency.lockutils [req-79a37cd7-9cb0-4482-97eb-deac986ad4e1 req-e44cb91c-cb28-490c-a219-a961a7973de8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-d949b168-1d5a-4487-8380-e99f5847c0fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:51.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:52 np0005465988 podman[288636]: 2025-10-02 12:31:51.946551932 +0000 UTC m=+0.022299503 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:31:52 np0005465988 podman[288636]: 2025-10-02 12:31:52.050507981 +0000 UTC m=+0.126255542 container create 10f87d4f04220a0a68f2bce2a1b5041139530ac1cb1601e5221bd4ea07f5683e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.098 2 DEBUG nova.compute.manager [req-bed9afb8-25e7-41a9-b1e3-e6cbbda8825e req-89ed0ab8-4ff9-4c84-aa33-83a9c6e2231a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Received event network-vif-plugged-3e2b3141-d578-4c23-bf29-c676fefd32ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.098 2 DEBUG oslo_concurrency.lockutils [req-bed9afb8-25e7-41a9-b1e3-e6cbbda8825e req-89ed0ab8-4ff9-4c84-aa33-83a9c6e2231a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.099 2 DEBUG oslo_concurrency.lockutils [req-bed9afb8-25e7-41a9-b1e3-e6cbbda8825e req-89ed0ab8-4ff9-4c84-aa33-83a9c6e2231a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.099 2 DEBUG oslo_concurrency.lockutils [req-bed9afb8-25e7-41a9-b1e3-e6cbbda8825e req-89ed0ab8-4ff9-4c84-aa33-83a9c6e2231a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.099 2 DEBUG nova.compute.manager [req-bed9afb8-25e7-41a9-b1e3-e6cbbda8825e req-89ed0ab8-4ff9-4c84-aa33-83a9c6e2231a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Processing event network-vif-plugged-3e2b3141-d578-4c23-bf29-c676fefd32ec _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:31:52 np0005465988 systemd[1]: Started libpod-conmon-10f87d4f04220a0a68f2bce2a1b5041139530ac1cb1601e5221bd4ea07f5683e.scope.
Oct  2 08:31:52 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:31:52 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0a547790d27c2101f1d1fc87a79f111402d5eda1d60660bb9704834330453ea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:31:52 np0005465988 podman[288636]: 2025-10-02 12:31:52.189993754 +0000 UTC m=+0.265741315 container init 10f87d4f04220a0a68f2bce2a1b5041139530ac1cb1601e5221bd4ea07f5683e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:31:52 np0005465988 podman[288636]: 2025-10-02 12:31:52.198578081 +0000 UTC m=+0.274325652 container start 10f87d4f04220a0a68f2bce2a1b5041139530ac1cb1601e5221bd4ea07f5683e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:31:52 np0005465988 podman[288667]: 2025-10-02 12:31:52.222435607 +0000 UTC m=+0.082092062 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:31:52 np0005465988 neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874[288686]: [NOTICE]   (288723) : New worker (288738) forked
Oct  2 08:31:52 np0005465988 neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874[288686]: [NOTICE]   (288723) : Loading success.
Oct  2 08:31:52 np0005465988 podman[288662]: 2025-10-02 12:31:52.248057104 +0000 UTC m=+0.144605041 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:31:52 np0005465988 podman[288663]: 2025-10-02 12:31:52.258602438 +0000 UTC m=+0.137612030 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct  2 08:31:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:52.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.678 2 DEBUG oslo_concurrency.lockutils [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "c8b713f4-4f41-4153-928c-164f2ed108ed" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.679 2 DEBUG oslo_concurrency.lockutils [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.679 2 DEBUG nova.compute.manager [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Going to confirm migration 19 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.743 2 DEBUG nova.compute.manager [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.745 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408312.7447376, d949b168-1d5a-4487-8380-e99f5847c0fd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.745 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] VM Started (Lifecycle Event)#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.750 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.754 2 INFO nova.virt.libvirt.driver [-] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Instance spawned successfully.#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.755 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:31:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.804 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.811 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.811 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.812 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.812 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.813 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.813 2 DEBUG nova.virt.libvirt.driver [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.820 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.869 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.869 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408312.7473345, d949b168-1d5a-4487-8380-e99f5847c0fd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.869 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.959 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.962 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408312.750075, d949b168-1d5a-4487-8380-e99f5847c0fd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.962 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:31:52 np0005465988 nova_compute[236126]: 2025-10-02 12:31:52.999 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.003 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.062 2 INFO nova.compute.manager [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Took 9.21 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.062 2 DEBUG nova.compute.manager [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.075 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.195 2 INFO nova.compute.manager [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Took 10.52 seconds to build instance.#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.226 2 DEBUG oslo_concurrency.lockutils [None req-49b8c0f7-4708-4bcc-a756-43a3774b952e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.693s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.341 2 DEBUG neutronclient.v2_0.client [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 386c73f3-c5a1-4edb-894f-841beabaecbd for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.342 2 DEBUG oslo_concurrency.lockutils [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.343 2 DEBUG oslo_concurrency.lockutils [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquired lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.343 2 DEBUG nova.network.neutron [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.343 2 DEBUG nova.objects.instance [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'info_cache' on Instance uuid c8b713f4-4f41-4153-928c-164f2ed108ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.515 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.516 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.516 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.517 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.517 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.675 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408298.6742842, c8b713f4-4f41-4153-928c-164f2ed108ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.676 2 INFO nova.compute.manager [-] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.721 2 DEBUG nova.compute.manager [None req-a46d82c6-e32d-4316-bbfc-3a34e4f53d25 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.725 2 DEBUG nova.compute.manager [None req-a46d82c6-e32d-4316-bbfc-3a34e4f53d25 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.750 2 INFO nova.compute.manager [None req-a46d82c6-e32d-4316-bbfc-3a34e4f53d25 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Oct  2 08:31:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:31:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/12138334' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:31:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:53.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:53 np0005465988 nova_compute[236126]: 2025-10-02 12:31:53.961 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.059 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.059 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.060 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.066 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000078 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.066 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000078 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.072 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000073 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.073 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000073 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.073 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000073 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.276 2 DEBUG nova.compute.manager [req-c44d1493-7da6-416a-a86f-3c7a6b78abd7 req-a7927337-d9f7-4880-bcdb-28b5ff58e24b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Received event network-vif-plugged-3e2b3141-d578-4c23-bf29-c676fefd32ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.277 2 DEBUG oslo_concurrency.lockutils [req-c44d1493-7da6-416a-a86f-3c7a6b78abd7 req-a7927337-d9f7-4880-bcdb-28b5ff58e24b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.277 2 DEBUG oslo_concurrency.lockutils [req-c44d1493-7da6-416a-a86f-3c7a6b78abd7 req-a7927337-d9f7-4880-bcdb-28b5ff58e24b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.277 2 DEBUG oslo_concurrency.lockutils [req-c44d1493-7da6-416a-a86f-3c7a6b78abd7 req-a7927337-d9f7-4880-bcdb-28b5ff58e24b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.278 2 DEBUG nova.compute.manager [req-c44d1493-7da6-416a-a86f-3c7a6b78abd7 req-a7927337-d9f7-4880-bcdb-28b5ff58e24b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] No waiting events found dispatching network-vif-plugged-3e2b3141-d578-4c23-bf29-c676fefd32ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.278 2 WARNING nova.compute.manager [req-c44d1493-7da6-416a-a86f-3c7a6b78abd7 req-a7927337-d9f7-4880-bcdb-28b5ff58e24b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Received unexpected event network-vif-plugged-3e2b3141-d578-4c23-bf29-c676fefd32ec for instance with vm_state active and task_state None.#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.298 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.300 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4001MB free_disk=20.71839141845703GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.300 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.300 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:31:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:54.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.449 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Migration for instance c8b713f4-4f41-4153-928c-164f2ed108ed refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.511 2 INFO nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Updating resource usage from migration 5b161cce-b4d4-4d69-a865-c0e36b836911#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.512 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Starting to track outgoing migration 5b161cce-b4d4-4d69-a865-c0e36b836911 with flavor cef129e5-cce4-4465-9674-03d3559e8a14 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.573 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 4297c5cd-77b6-4f80-a746-11b304df8c90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.573 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Migration 5b161cce-b4d4-4d69-a865-c0e36b836911 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.574 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance d949b168-1d5a-4487-8380-e99f5847c0fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.574 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.574 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:31:54 np0005465988 nova_compute[236126]: 2025-10-02 12:31:54.687 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:31:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2174074340' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:31:55 np0005465988 nova_compute[236126]: 2025-10-02 12:31:55.255 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.569s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:55 np0005465988 nova_compute[236126]: 2025-10-02 12:31:55.264 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:31:55 np0005465988 nova_compute[236126]: 2025-10-02 12:31:55.323 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:31:55 np0005465988 nova_compute[236126]: 2025-10-02 12:31:55.429 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:31:55 np0005465988 nova_compute[236126]: 2025-10-02 12:31:55.430 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:55 np0005465988 nova_compute[236126]: 2025-10-02 12:31:55.512 2 DEBUG nova.network.neutron [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Updating instance_info_cache with network_info: [{"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:55 np0005465988 nova_compute[236126]: 2025-10-02 12:31:55.574 2 DEBUG oslo_concurrency.lockutils [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Releasing lock "refresh_cache-c8b713f4-4f41-4153-928c-164f2ed108ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:55 np0005465988 nova_compute[236126]: 2025-10-02 12:31:55.575 2 DEBUG nova.objects.instance [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'migration_context' on Instance uuid c8b713f4-4f41-4153-928c-164f2ed108ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:55 np0005465988 nova_compute[236126]: 2025-10-02 12:31:55.770 2 DEBUG nova.storage.rbd_utils [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] removing snapshot(nova-resize) on rbd image(c8b713f4-4f41-4153-928c-164f2ed108ed_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:31:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:55.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e319 e319: 3 total, 3 up, 3 in
Oct  2 08:31:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:56.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:56 np0005465988 nova_compute[236126]: 2025-10-02 12:31:56.365 2 DEBUG nova.virt.libvirt.vif [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:30:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='multiattach-server-1',id=115,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBEmvr2XQXDnaV0WQDbbXt57cEK6okdC4PHEYdjpQBx2HU9OQgvgRTm3sGWmsa/AInUTPV9ABsCq2lJ9PCqfb1WP51XCZeB9QBIxafEy8h788huF0550ajkopZIwmSLpiA==',key_name='tempest-keypair-425033456',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:31:49Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5533aaac08cd4856af72ef4992bb5e76',ramdisk_id='',reservation_id='r-1m21sn7g',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35
',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-1564585024',owner_user_name='tempest-AttachVolumeMultiAttachTest-1564585024-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:31:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22d56fcd2a4b4851bfd126ae4548ee9b',uuid=c8b713f4-4f41-4153-928c-164f2ed108ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:31:56 np0005465988 nova_compute[236126]: 2025-10-02 12:31:56.366 2 DEBUG nova.network.os_vif_util [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converting VIF {"id": "386c73f3-c5a1-4edb-894f-841beabaecbd", "address": "fa:16:3e:94:65:0d", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386c73f3-c5", "ovs_interfaceid": "386c73f3-c5a1-4edb-894f-841beabaecbd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:56 np0005465988 nova_compute[236126]: 2025-10-02 12:31:56.366 2 DEBUG nova.network.os_vif_util [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:94:65:0d,bridge_name='br-int',has_traffic_filtering=True,id=386c73f3-c5a1-4edb-894f-841beabaecbd,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap386c73f3-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:56 np0005465988 nova_compute[236126]: 2025-10-02 12:31:56.367 2 DEBUG os_vif [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:94:65:0d,bridge_name='br-int',has_traffic_filtering=True,id=386c73f3-c5a1-4edb-894f-841beabaecbd,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap386c73f3-c5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:31:56 np0005465988 nova_compute[236126]: 2025-10-02 12:31:56.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:56 np0005465988 nova_compute[236126]: 2025-10-02 12:31:56.369 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap386c73f3-c5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:56 np0005465988 nova_compute[236126]: 2025-10-02 12:31:56.369 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:31:56 np0005465988 nova_compute[236126]: 2025-10-02 12:31:56.371 2 INFO os_vif [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:94:65:0d,bridge_name='br-int',has_traffic_filtering=True,id=386c73f3-c5a1-4edb-894f-841beabaecbd,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap386c73f3-c5')#033[00m
Oct  2 08:31:56 np0005465988 nova_compute[236126]: 2025-10-02 12:31:56.372 2 DEBUG oslo_concurrency.lockutils [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:56 np0005465988 nova_compute[236126]: 2025-10-02 12:31:56.372 2 DEBUG oslo_concurrency.lockutils [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:56 np0005465988 nova_compute[236126]: 2025-10-02 12:31:56.536 2 DEBUG oslo_concurrency.processutils [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:31:56 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4239914508' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:31:57 np0005465988 nova_compute[236126]: 2025-10-02 12:31:57.001 2 DEBUG oslo_concurrency.processutils [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:57 np0005465988 nova_compute[236126]: 2025-10-02 12:31:57.010 2 DEBUG nova.compute.provider_tree [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:31:57 np0005465988 nova_compute[236126]: 2025-10-02 12:31:57.056 2 DEBUG nova.scheduler.client.report [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:31:57 np0005465988 nova_compute[236126]: 2025-10-02 12:31:57.173 2 DEBUG oslo_concurrency.lockutils [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:57 np0005465988 nova_compute[236126]: 2025-10-02 12:31:57.431 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:57 np0005465988 nova_compute[236126]: 2025-10-02 12:31:57.432 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:57 np0005465988 nova_compute[236126]: 2025-10-02 12:31:57.461 2 INFO nova.scheduler.client.report [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Deleted allocation for migration 5b161cce-b4d4-4d69-a865-c0e36b836911#033[00m
Oct  2 08:31:57 np0005465988 nova_compute[236126]: 2025-10-02 12:31:57.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:57 np0005465988 nova_compute[236126]: 2025-10-02 12:31:57.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:31:57 np0005465988 nova_compute[236126]: 2025-10-02 12:31:57.755 2 DEBUG oslo_concurrency.lockutils [None req-1fb3d3e9-97f8-4183-842f-88f101788d5a 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "c8b713f4-4f41-4153-928c-164f2ed108ed" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 5.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e319 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:31:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:57.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:31:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:31:58.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:31:58 np0005465988 nova_compute[236126]: 2025-10-02 12:31:58.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.452 2 DEBUG oslo_concurrency.lockutils [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Acquiring lock "d949b168-1d5a-4487-8380-e99f5847c0fd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.452 2 DEBUG oslo_concurrency.lockutils [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.453 2 DEBUG oslo_concurrency.lockutils [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Acquiring lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.453 2 DEBUG oslo_concurrency.lockutils [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.453 2 DEBUG oslo_concurrency.lockutils [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.455 2 INFO nova.compute.manager [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Terminating instance#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.456 2 DEBUG nova.compute.manager [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:59 np0005465988 kernel: tap3e2b3141-d5 (unregistering): left promiscuous mode
Oct  2 08:31:59 np0005465988 NetworkManager[45041]: <info>  [1759408319.5617] device (tap3e2b3141-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:59 np0005465988 ovn_controller[132601]: 2025-10-02T12:31:59Z|00553|binding|INFO|Releasing lport 3e2b3141-d578-4c23-bf29-c676fefd32ec from this chassis (sb_readonly=0)
Oct  2 08:31:59 np0005465988 ovn_controller[132601]: 2025-10-02T12:31:59Z|00554|binding|INFO|Setting lport 3e2b3141-d578-4c23-bf29-c676fefd32ec down in Southbound
Oct  2 08:31:59 np0005465988 ovn_controller[132601]: 2025-10-02T12:31:59Z|00555|binding|INFO|Removing iface tap3e2b3141-d5 ovn-installed in OVS
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:59 np0005465988 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000078.scope: Deactivated successfully.
Oct  2 08:31:59 np0005465988 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000078.scope: Consumed 7.722s CPU time.
Oct  2 08:31:59 np0005465988 systemd-machined[192594]: Machine qemu-53-instance-00000078 terminated.
Oct  2 08:31:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:59.662 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:16:87 10.100.0.10'], port_security=['fa:16:3e:66:16:87 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd949b168-1d5a-4487-8380-e99f5847c0fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-247d774d-0cc8-4ef2-a9b8-c756adae0874', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27a1729bf10548219b90df46839849f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '19f6d4f0-1655-4062-a124-10140844bfae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f7e0b23-d51b-4498-9dd8-e3096f69c99c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3e2b3141-d578-4c23-bf29-c676fefd32ec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:31:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:59.665 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3e2b3141-d578-4c23-bf29-c676fefd32ec in datapath 247d774d-0cc8-4ef2-a9b8-c756adae0874 unbound from our chassis#033[00m
Oct  2 08:31:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:59.668 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 247d774d-0cc8-4ef2-a9b8-c756adae0874, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:31:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:59.670 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e4f2eebc-c1cc-4ec1-a6fb-03663e4afb09]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:59.670 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874 namespace which is not needed anymore#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.695 2 INFO nova.virt.libvirt.driver [-] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Instance destroyed successfully.#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.695 2 DEBUG nova.objects.instance [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lazy-loading 'resources' on Instance uuid d949b168-1d5a-4487-8380-e99f5847c0fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:59 np0005465988 neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874[288686]: [NOTICE]   (288723) : haproxy version is 2.8.14-c23fe91
Oct  2 08:31:59 np0005465988 neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874[288686]: [NOTICE]   (288723) : path to executable is /usr/sbin/haproxy
Oct  2 08:31:59 np0005465988 neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874[288686]: [WARNING]  (288723) : Exiting Master process...
Oct  2 08:31:59 np0005465988 neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874[288686]: [ALERT]    (288723) : Current worker (288738) exited with code 143 (Terminated)
Oct  2 08:31:59 np0005465988 neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874[288686]: [WARNING]  (288723) : All workers exited. Exiting... (0)
Oct  2 08:31:59 np0005465988 systemd[1]: libpod-10f87d4f04220a0a68f2bce2a1b5041139530ac1cb1601e5221bd4ea07f5683e.scope: Deactivated successfully.
Oct  2 08:31:59 np0005465988 podman[288890]: 2025-10-02 12:31:59.836430734 +0000 UTC m=+0.064907858 container died 10f87d4f04220a0a68f2bce2a1b5041139530ac1cb1601e5221bd4ea07f5683e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.848 2 DEBUG nova.virt.libvirt.vif [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:31:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-870650247',display_name='tempest-ServerDiskConfigTestJSON-server-870650247',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-870650247',id=120,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:31:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='27a1729bf10548219b90df46839849f5',ramdisk_id='',reservation_id='r-qovnmdul',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1123059068',owner_user_name='tempest-ServerDiskConfigTestJSON-1123059068-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:31:57Z,user_data=None,user_id='4a89b71e2513413e922ee6d5d06362b1',uuid=d949b168-1d5a-4487-8380-e99f5847c0fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "address": "fa:16:3e:66:16:87", "network": {"id": "247d774d-0cc8-4ef2-a9b8-c756adae0874", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1434934627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27a1729bf10548219b90df46839849f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e2b3141-d5", "ovs_interfaceid": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.848 2 DEBUG nova.network.os_vif_util [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Converting VIF {"id": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "address": "fa:16:3e:66:16:87", "network": {"id": "247d774d-0cc8-4ef2-a9b8-c756adae0874", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1434934627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27a1729bf10548219b90df46839849f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e2b3141-d5", "ovs_interfaceid": "3e2b3141-d578-4c23-bf29-c676fefd32ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.849 2 DEBUG nova.network.os_vif_util [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:16:87,bridge_name='br-int',has_traffic_filtering=True,id=3e2b3141-d578-4c23-bf29-c676fefd32ec,network=Network(247d774d-0cc8-4ef2-a9b8-c756adae0874),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e2b3141-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.849 2 DEBUG os_vif [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:16:87,bridge_name='br-int',has_traffic_filtering=True,id=3e2b3141-d578-4c23-bf29-c676fefd32ec,network=Network(247d774d-0cc8-4ef2-a9b8-c756adae0874),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e2b3141-d5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.851 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e2b3141-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:59 np0005465988 nova_compute[236126]: 2025-10-02 12:31:59.856 2 INFO os_vif [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:16:87,bridge_name='br-int',has_traffic_filtering=True,id=3e2b3141-d578-4c23-bf29-c676fefd32ec,network=Network(247d774d-0cc8-4ef2-a9b8-c756adae0874),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e2b3141-d5')#033[00m
Oct  2 08:31:59 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10f87d4f04220a0a68f2bce2a1b5041139530ac1cb1601e5221bd4ea07f5683e-userdata-shm.mount: Deactivated successfully.
Oct  2 08:31:59 np0005465988 systemd[1]: var-lib-containers-storage-overlay-b0a547790d27c2101f1d1fc87a79f111402d5eda1d60660bb9704834330453ea-merged.mount: Deactivated successfully.
Oct  2 08:31:59 np0005465988 podman[288890]: 2025-10-02 12:31:59.881317275 +0000 UTC m=+0.109794399 container cleanup 10f87d4f04220a0a68f2bce2a1b5041139530ac1cb1601e5221bd4ea07f5683e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:31:59 np0005465988 systemd[1]: libpod-conmon-10f87d4f04220a0a68f2bce2a1b5041139530ac1cb1601e5221bd4ea07f5683e.scope: Deactivated successfully.
Oct  2 08:31:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:31:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:31:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:31:59.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:31:59 np0005465988 podman[288937]: 2025-10-02 12:31:59.950988169 +0000 UTC m=+0.041722241 container remove 10f87d4f04220a0a68f2bce2a1b5041139530ac1cb1601e5221bd4ea07f5683e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:31:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:59.961 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[53e1d835-1fa9-4aab-aeed-82bfa09f5e67]: (4, ('Thu Oct  2 12:31:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874 (10f87d4f04220a0a68f2bce2a1b5041139530ac1cb1601e5221bd4ea07f5683e)\n10f87d4f04220a0a68f2bce2a1b5041139530ac1cb1601e5221bd4ea07f5683e\nThu Oct  2 12:31:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874 (10f87d4f04220a0a68f2bce2a1b5041139530ac1cb1601e5221bd4ea07f5683e)\n10f87d4f04220a0a68f2bce2a1b5041139530ac1cb1601e5221bd4ea07f5683e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:59.963 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[facb2e6d-ead1-437b-8325-0307fa4854f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:31:59.964 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap247d774d-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:00 np0005465988 kernel: tap247d774d-00: left promiscuous mode
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:00.047 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[74a352d2-7833-41b9-aae4-c2c36bcfdbb8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:00.094 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ca8b9028-fe03-4d18-bb67-a7bd69de3eee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:00.096 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ab7a3e50-0fd5-4762-9c3f-1ab0e64200bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:00.124 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cad247c0-2d17-4d99-a0d9-a4ea4852d179]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637160, 'reachable_time': 40114, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288955, 'error': None, 'target': 'ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:00 np0005465988 systemd[1]: run-netns-ovnmeta\x2d247d774d\x2d0cc8\x2d4ef2\x2da9b8\x2dc756adae0874.mount: Deactivated successfully.
Oct  2 08:32:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:00.130 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-247d774d-0cc8-4ef2-a9b8-c756adae0874 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:32:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:00.130 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[bcd980ee-8a32-413e-90e9-d50ec017e2b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:00Z|00556|binding|INFO|Releasing lport 02b7597d-2fc1-4c56-8603-4dcb0c716c82 from this chassis (sb_readonly=0)
Oct  2 08:32:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:00.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.339 2 DEBUG nova.compute.manager [req-6a4badb4-288e-49ee-a3cb-199553abc048 req-906f7b4a-12e9-4543-ba5c-41a0436a2733 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Received event network-vif-unplugged-3e2b3141-d578-4c23-bf29-c676fefd32ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.340 2 DEBUG oslo_concurrency.lockutils [req-6a4badb4-288e-49ee-a3cb-199553abc048 req-906f7b4a-12e9-4543-ba5c-41a0436a2733 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.340 2 DEBUG oslo_concurrency.lockutils [req-6a4badb4-288e-49ee-a3cb-199553abc048 req-906f7b4a-12e9-4543-ba5c-41a0436a2733 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.341 2 DEBUG oslo_concurrency.lockutils [req-6a4badb4-288e-49ee-a3cb-199553abc048 req-906f7b4a-12e9-4543-ba5c-41a0436a2733 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.341 2 DEBUG nova.compute.manager [req-6a4badb4-288e-49ee-a3cb-199553abc048 req-906f7b4a-12e9-4543-ba5c-41a0436a2733 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] No waiting events found dispatching network-vif-unplugged-3e2b3141-d578-4c23-bf29-c676fefd32ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.342 2 DEBUG nova.compute.manager [req-6a4badb4-288e-49ee-a3cb-199553abc048 req-906f7b4a-12e9-4543-ba5c-41a0436a2733 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Received event network-vif-unplugged-3e2b3141-d578-4c23-bf29-c676fefd32ec for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.793 2 INFO nova.virt.libvirt.driver [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Deleting instance files /var/lib/nova/instances/d949b168-1d5a-4487-8380-e99f5847c0fd_del#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.794 2 INFO nova.virt.libvirt.driver [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Deletion of /var/lib/nova/instances/d949b168-1d5a-4487-8380-e99f5847c0fd_del complete#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.848 2 DEBUG nova.compute.manager [req-e31eee24-e9bc-476d-ad24-5c96498d4b76 req-f314e1d3-177d-49c9-835c-c2b206db1565 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-changed-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.848 2 DEBUG nova.compute.manager [req-e31eee24-e9bc-476d-ad24-5c96498d4b76 req-f314e1d3-177d-49c9-835c-c2b206db1565 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Refreshing instance network info cache due to event network-changed-7cf26487-91ca-4d15-85f3-bb6a66393796. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.849 2 DEBUG oslo_concurrency.lockutils [req-e31eee24-e9bc-476d-ad24-5c96498d4b76 req-f314e1d3-177d-49c9-835c-c2b206db1565 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.850 2 DEBUG oslo_concurrency.lockutils [req-e31eee24-e9bc-476d-ad24-5c96498d4b76 req-f314e1d3-177d-49c9-835c-c2b206db1565 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.850 2 DEBUG nova.network.neutron [req-e31eee24-e9bc-476d-ad24-5c96498d4b76 req-f314e1d3-177d-49c9-835c-c2b206db1565 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Refreshing network info cache for port 7cf26487-91ca-4d15-85f3-bb6a66393796 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.888 2 INFO nova.compute.manager [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Took 1.43 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.889 2 DEBUG oslo.service.loopingcall [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.890 2 DEBUG nova.compute.manager [-] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:32:00 np0005465988 nova_compute[236126]: 2025-10-02 12:32:00.890 2 DEBUG nova.network.neutron [-] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:32:01 np0005465988 nova_compute[236126]: 2025-10-02 12:32:01.806 2 DEBUG nova.network.neutron [-] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:01 np0005465988 nova_compute[236126]: 2025-10-02 12:32:01.835 2 INFO nova.compute.manager [-] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Took 0.94 seconds to deallocate network for instance.#033[00m
Oct  2 08:32:01 np0005465988 nova_compute[236126]: 2025-10-02 12:32:01.908 2 DEBUG oslo_concurrency.lockutils [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:01 np0005465988 nova_compute[236126]: 2025-10-02 12:32:01.910 2 DEBUG oslo_concurrency.lockutils [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:01.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.009 2 DEBUG oslo_concurrency.processutils [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.061 2 DEBUG nova.compute.manager [req-3e4b15cd-25ee-4170-bef8-095f99f96664 req-25a38507-bc87-419f-82e1-8e7112966dad d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Received event network-vif-deleted-3e2b3141-d578-4c23-bf29-c676fefd32ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:32:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:32:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:02.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:32:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:32:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3781105023' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.499 2 DEBUG oslo_concurrency.processutils [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.506 2 DEBUG nova.compute.provider_tree [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.519 2 DEBUG nova.compute.manager [req-52b9c34d-611c-4387-95df-d6d7bf285d61 req-40d455d6-08e4-487c-a036-dcda0ec8e8e4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Received event network-vif-plugged-3e2b3141-d578-4c23-bf29-c676fefd32ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.520 2 DEBUG oslo_concurrency.lockutils [req-52b9c34d-611c-4387-95df-d6d7bf285d61 req-40d455d6-08e4-487c-a036-dcda0ec8e8e4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.520 2 DEBUG oslo_concurrency.lockutils [req-52b9c34d-611c-4387-95df-d6d7bf285d61 req-40d455d6-08e4-487c-a036-dcda0ec8e8e4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.520 2 DEBUG oslo_concurrency.lockutils [req-52b9c34d-611c-4387-95df-d6d7bf285d61 req-40d455d6-08e4-487c-a036-dcda0ec8e8e4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.520 2 DEBUG nova.compute.manager [req-52b9c34d-611c-4387-95df-d6d7bf285d61 req-40d455d6-08e4-487c-a036-dcda0ec8e8e4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] No waiting events found dispatching network-vif-plugged-3e2b3141-d578-4c23-bf29-c676fefd32ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.520 2 WARNING nova.compute.manager [req-52b9c34d-611c-4387-95df-d6d7bf285d61 req-40d455d6-08e4-487c-a036-dcda0ec8e8e4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Received unexpected event network-vif-plugged-3e2b3141-d578-4c23-bf29-c676fefd32ec for instance with vm_state deleted and task_state None.
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.529 2 DEBUG nova.scheduler.client.report [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.584 2 DEBUG oslo_concurrency.lockutils [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.618 2 INFO nova.scheduler.client.report [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Deleted allocations for instance d949b168-1d5a-4487-8380-e99f5847c0fd
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.712 2 DEBUG oslo_concurrency.lockutils [None req-a98078ef-8233-4706-a7dc-1708625f1b3e 4a89b71e2513413e922ee6d5d06362b1 27a1729bf10548219b90df46839849f5 - - default default] Lock "d949b168-1d5a-4487-8380-e99f5847c0fd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.727 2 DEBUG nova.network.neutron [req-e31eee24-e9bc-476d-ad24-5c96498d4b76 req-f314e1d3-177d-49c9-835c-c2b206db1565 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updated VIF entry in instance network info cache for port 7cf26487-91ca-4d15-85f3-bb6a66393796. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.727 2 DEBUG nova.network.neutron [req-e31eee24-e9bc-476d-ad24-5c96498d4b76 req-f314e1d3-177d-49c9-835c-c2b206db1565 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating instance_info_cache with network_info: [{"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:32:02 np0005465988 nova_compute[236126]: 2025-10-02 12:32:02.750 2 DEBUG oslo_concurrency.lockutils [req-e31eee24-e9bc-476d-ad24-5c96498d4b76 req-f314e1d3-177d-49c9-835c-c2b206db1565 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-4297c5cd-77b6-4f80-a746-11b304df8c90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:32:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e319 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:32:03 np0005465988 nova_compute[236126]: 2025-10-02 12:32:03.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:32:03 np0005465988 nova_compute[236126]: 2025-10-02 12:32:03.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:03.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:04.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:04 np0005465988 podman[288981]: 2025-10-02 12:32:04.531991663 +0000 UTC m=+0.069358586 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 08:32:04 np0005465988 nova_compute[236126]: 2025-10-02 12:32:04.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:05.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:32:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:06.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:32:06 np0005465988 nova_compute[236126]: 2025-10-02 12:32:06.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:32:06 np0005465988 nova_compute[236126]: 2025-10-02 12:32:06.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 08:32:06 np0005465988 nova_compute[236126]: 2025-10-02 12:32:06.524 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: c8b713f4-4f41-4153-928c-164f2ed108ed] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902
Oct  2 08:32:06 np0005465988 nova_compute[236126]: 2025-10-02 12:32:06.524 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  2 08:32:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e319 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:32:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:32:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:07.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:32:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:08.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:08 np0005465988 nova_compute[236126]: 2025-10-02 12:32:08.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e320 e320: 3 total, 3 up, 3 in
Oct  2 08:32:09 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.129 2 DEBUG oslo_concurrency.lockutils [None req-0018349b-a42f-4e2b-96ae-f3ce27b8bbc6 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:32:09 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.130 2 DEBUG oslo_concurrency.lockutils [None req-0018349b-a42f-4e2b-96ae-f3ce27b8bbc6 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:32:09 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.146 2 INFO nova.compute.manager [None req-0018349b-a42f-4e2b-96ae-f3ce27b8bbc6 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Detaching volume 1f1fe097-f4b6-4748-bf18-8e487e0f3ba6
Oct  2 08:32:09 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.323 2 INFO nova.virt.block_device [None req-0018349b-a42f-4e2b-96ae-f3ce27b8bbc6 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Attempting to driver detach volume 1f1fe097-f4b6-4748-bf18-8e487e0f3ba6 from mountpoint /dev/vdb
Oct  2 08:32:09 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.337 2 DEBUG nova.virt.libvirt.driver [None req-0018349b-a42f-4e2b-96ae-f3ce27b8bbc6 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Attempting to detach device vdb from instance 4297c5cd-77b6-4f80-a746-11b304df8c90 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Oct  2 08:32:09 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.338 2 DEBUG nova.virt.libvirt.guest [None req-0018349b-a42f-4e2b-96ae-f3ce27b8bbc6 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:32:09 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-1f1fe097-f4b6-4748-bf18-8e487e0f3ba6">
Oct  2 08:32:09 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:  <serial>1f1fe097-f4b6-4748-bf18-8e487e0f3ba6</serial>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:  <shareable/>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct  2 08:32:09 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:32:09 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct  2 08:32:09 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.350 2 INFO nova.virt.libvirt.driver [None req-0018349b-a42f-4e2b-96ae-f3ce27b8bbc6 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Successfully detached device vdb from instance 4297c5cd-77b6-4f80-a746-11b304df8c90 from the persistent domain config.
Oct  2 08:32:09 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.350 2 DEBUG nova.virt.libvirt.driver [None req-0018349b-a42f-4e2b-96ae-f3ce27b8bbc6 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 4297c5cd-77b6-4f80-a746-11b304df8c90 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Oct  2 08:32:09 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.350 2 DEBUG nova.virt.libvirt.guest [None req-0018349b-a42f-4e2b-96ae-f3ce27b8bbc6 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:32:09 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-1f1fe097-f4b6-4748-bf18-8e487e0f3ba6">
Oct  2 08:32:09 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:  <serial>1f1fe097-f4b6-4748-bf18-8e487e0f3ba6</serial>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:  <shareable/>
Oct  2 08:32:09 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct  2 08:32:09 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:32:09 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct  2 08:32:09 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.482 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Received event <DeviceRemovedEvent: 1759408329.4820297, 4297c5cd-77b6-4f80-a746-11b304df8c90 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Oct  2 08:32:09 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.484 2 DEBUG nova.virt.libvirt.driver [None req-0018349b-a42f-4e2b-96ae-f3ce27b8bbc6 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 4297c5cd-77b6-4f80-a746-11b304df8c90 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Oct  2 08:32:09 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.488 2 INFO nova.virt.libvirt.driver [None req-0018349b-a42f-4e2b-96ae-f3ce27b8bbc6 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Successfully detached device vdb from instance 4297c5cd-77b6-4f80-a746-11b304df8c90 from the live domain config.
Oct  2 08:32:09 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:09 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:09.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:10 np0005465988 nova_compute[236126]: 2025-10-02 12:32:09.998 2 DEBUG nova.objects.instance [None req-0018349b-a42f-4e2b-96ae-f3ce27b8bbc6 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'flavor' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:32:10 np0005465988 nova_compute[236126]: 2025-10-02 12:32:10.038 2 DEBUG oslo_concurrency.lockutils [None req-0018349b-a42f-4e2b-96ae-f3ce27b8bbc6 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.908s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:10.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:10Z|00557|binding|INFO|Releasing lport 02b7597d-2fc1-4c56-8603-4dcb0c716c82 from this chassis (sb_readonly=0)
Oct  2 08:32:10 np0005465988 nova_compute[236126]: 2025-10-02 12:32:10.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:32:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:11.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:32:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:12.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e320 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:32:13 np0005465988 nova_compute[236126]: 2025-10-02 12:32:13.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:13.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:14 np0005465988 nova_compute[236126]: 2025-10-02 12:32:14.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:32:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:14.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:32:14 np0005465988 nova_compute[236126]: 2025-10-02 12:32:14.692 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408319.69113, d949b168-1d5a-4487-8380-e99f5847c0fd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:32:14 np0005465988 nova_compute[236126]: 2025-10-02 12:32:14.693 2 INFO nova.compute.manager [-] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] VM Stopped (Lifecycle Event)
Oct  2 08:32:14 np0005465988 nova_compute[236126]: 2025-10-02 12:32:14.751 2 DEBUG nova.compute.manager [None req-122c11a0-4305-451a-9a7a-ddd29e1de92a - - - - - -] [instance: d949b168-1d5a-4487-8380-e99f5847c0fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:32:14 np0005465988 nova_compute[236126]: 2025-10-02 12:32:14.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:15 np0005465988 nova_compute[236126]: 2025-10-02 12:32:15.519 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:32:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:15.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:32:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:16.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:32:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e320 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:32:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:32:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:17.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:32:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:18Z|00558|binding|INFO|Releasing lport 02b7597d-2fc1-4c56-8603-4dcb0c716c82 from this chassis (sb_readonly=0)
Oct  2 08:32:18 np0005465988 nova_compute[236126]: 2025-10-02 12:32:18.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:32:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:18.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:32:18 np0005465988 nova_compute[236126]: 2025-10-02 12:32:18.375 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Acquiring lock "a94a24b4-e399-49e0-a52b-1383be9a816f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:18 np0005465988 nova_compute[236126]: 2025-10-02 12:32:18.376 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:18 np0005465988 nova_compute[236126]: 2025-10-02 12:32:18.405 2 DEBUG nova.compute.manager [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:32:18 np0005465988 nova_compute[236126]: 2025-10-02 12:32:18.502 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:18 np0005465988 nova_compute[236126]: 2025-10-02 12:32:18.503 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:18 np0005465988 nova_compute[236126]: 2025-10-02 12:32:18.512 2 DEBUG nova.virt.hardware [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:32:18 np0005465988 nova_compute[236126]: 2025-10-02 12:32:18.512 2 INFO nova.compute.claims [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:32:18 np0005465988 nova_compute[236126]: 2025-10-02 12:32:18.649 2 DEBUG oslo_concurrency.processutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:18 np0005465988 nova_compute[236126]: 2025-10-02 12:32:18.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:32:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3362849676' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.159 2 DEBUG oslo_concurrency.processutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.166 2 DEBUG nova.compute.provider_tree [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.196 2 DEBUG nova.scheduler.client.report [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.222 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.719s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.223 2 DEBUG nova.compute.manager [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.289 2 DEBUG nova.compute.manager [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.289 2 DEBUG nova.network.neutron [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.323 2 INFO nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.347 2 DEBUG nova.compute.manager [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.650 2 DEBUG nova.compute.manager [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.651 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.652 2 INFO nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Creating image(s)#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.688 2 DEBUG nova.storage.rbd_utils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] rbd image a94a24b4-e399-49e0-a52b-1383be9a816f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.726 2 DEBUG nova.storage.rbd_utils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] rbd image a94a24b4-e399-49e0-a52b-1383be9a816f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.762 2 DEBUG nova.storage.rbd_utils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] rbd image a94a24b4-e399-49e0-a52b-1383be9a816f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.766 2 DEBUG oslo_concurrency.processutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.810 2 DEBUG nova.policy [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48a75d0d93424c10a37b179785fd1b2e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e758f5e629284ecf89b1c87f76580d61', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.867 2 DEBUG oslo_concurrency.processutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.868 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.869 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.870 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.909 2 DEBUG nova.storage.rbd_utils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] rbd image a94a24b4-e399-49e0-a52b-1383be9a816f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.913 2 DEBUG oslo_concurrency.processutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 a94a24b4-e399-49e0-a52b-1383be9a816f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:19 np0005465988 nova_compute[236126]: 2025-10-02 12:32:19.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:19.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:20 np0005465988 nova_compute[236126]: 2025-10-02 12:32:20.238 2 DEBUG oslo_concurrency.processutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 a94a24b4-e399-49e0-a52b-1383be9a816f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.325s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:20 np0005465988 nova_compute[236126]: 2025-10-02 12:32:20.337 2 DEBUG nova.storage.rbd_utils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] resizing rbd image a94a24b4-e399-49e0-a52b-1383be9a816f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:32:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:32:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:20.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:32:20 np0005465988 nova_compute[236126]: 2025-10-02 12:32:20.589 2 DEBUG nova.objects.instance [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lazy-loading 'migration_context' on Instance uuid a94a24b4-e399-49e0-a52b-1383be9a816f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:20 np0005465988 nova_compute[236126]: 2025-10-02 12:32:20.751 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:32:20 np0005465988 nova_compute[236126]: 2025-10-02 12:32:20.752 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Ensure instance console log exists: /var/lib/nova/instances/a94a24b4-e399-49e0-a52b-1383be9a816f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:32:20 np0005465988 nova_compute[236126]: 2025-10-02 12:32:20.753 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:20 np0005465988 nova_compute[236126]: 2025-10-02 12:32:20.753 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:20 np0005465988 nova_compute[236126]: 2025-10-02 12:32:20.754 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:20 np0005465988 nova_compute[236126]: 2025-10-02 12:32:20.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:20 np0005465988 nova_compute[236126]: 2025-10-02 12:32:20.986 2 DEBUG nova.network.neutron [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Successfully created port: 76e727c9-9862-4ac4-9b61-705ca2018126 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:32:21 np0005465988 nova_compute[236126]: 2025-10-02 12:32:21.779 2 DEBUG nova.network.neutron [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Successfully updated port: 76e727c9-9862-4ac4-9b61-705ca2018126 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:32:21 np0005465988 nova_compute[236126]: 2025-10-02 12:32:21.804 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Acquiring lock "refresh_cache-a94a24b4-e399-49e0-a52b-1383be9a816f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:21 np0005465988 nova_compute[236126]: 2025-10-02 12:32:21.805 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Acquired lock "refresh_cache-a94a24b4-e399-49e0-a52b-1383be9a816f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:21 np0005465988 nova_compute[236126]: 2025-10-02 12:32:21.805 2 DEBUG nova.network.neutron [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:32:21 np0005465988 nova_compute[236126]: 2025-10-02 12:32:21.948 2 DEBUG nova.compute.manager [req-7800a57a-8d05-4a8f-874b-e324342afd47 req-a2ee7d77-c6a6-4344-998a-61a02dd877cb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Received event network-changed-76e727c9-9862-4ac4-9b61-705ca2018126 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:21 np0005465988 nova_compute[236126]: 2025-10-02 12:32:21.949 2 DEBUG nova.compute.manager [req-7800a57a-8d05-4a8f-874b-e324342afd47 req-a2ee7d77-c6a6-4344-998a-61a02dd877cb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Refreshing instance network info cache due to event network-changed-76e727c9-9862-4ac4-9b61-705ca2018126. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:32:21 np0005465988 nova_compute[236126]: 2025-10-02 12:32:21.950 2 DEBUG oslo_concurrency.lockutils [req-7800a57a-8d05-4a8f-874b-e324342afd47 req-a2ee7d77-c6a6-4344-998a-61a02dd877cb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-a94a24b4-e399-49e0-a52b-1383be9a816f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:21.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:22 np0005465988 nova_compute[236126]: 2025-10-02 12:32:22.019 2 DEBUG nova.network.neutron [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:32:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:22.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:22 np0005465988 podman[289251]: 2025-10-02 12:32:22.571025869 +0000 UTC m=+0.092676527 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:32:22 np0005465988 podman[289250]: 2025-10-02 12:32:22.580229014 +0000 UTC m=+0.097879307 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:32:22 np0005465988 podman[289249]: 2025-10-02 12:32:22.666381222 +0000 UTC m=+0.186369062 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:32:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e320 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.176 2 DEBUG nova.network.neutron [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Updating instance_info_cache with network_info: [{"id": "76e727c9-9862-4ac4-9b61-705ca2018126", "address": "fa:16:3e:6b:69:f8", "network": {"id": "bc383e10-03c6-43e5-a1dc-b686999621f5", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-293120118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e758f5e629284ecf89b1c87f76580d61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76e727c9-98", "ovs_interfaceid": "76e727c9-9862-4ac4-9b61-705ca2018126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.199 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Releasing lock "refresh_cache-a94a24b4-e399-49e0-a52b-1383be9a816f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.199 2 DEBUG nova.compute.manager [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Instance network_info: |[{"id": "76e727c9-9862-4ac4-9b61-705ca2018126", "address": "fa:16:3e:6b:69:f8", "network": {"id": "bc383e10-03c6-43e5-a1dc-b686999621f5", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-293120118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e758f5e629284ecf89b1c87f76580d61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76e727c9-98", "ovs_interfaceid": "76e727c9-9862-4ac4-9b61-705ca2018126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.199 2 DEBUG oslo_concurrency.lockutils [req-7800a57a-8d05-4a8f-874b-e324342afd47 req-a2ee7d77-c6a6-4344-998a-61a02dd877cb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-a94a24b4-e399-49e0-a52b-1383be9a816f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.200 2 DEBUG nova.network.neutron [req-7800a57a-8d05-4a8f-874b-e324342afd47 req-a2ee7d77-c6a6-4344-998a-61a02dd877cb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Refreshing network info cache for port 76e727c9-9862-4ac4-9b61-705ca2018126 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.203 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Start _get_guest_xml network_info=[{"id": "76e727c9-9862-4ac4-9b61-705ca2018126", "address": "fa:16:3e:6b:69:f8", "network": {"id": "bc383e10-03c6-43e5-a1dc-b686999621f5", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-293120118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e758f5e629284ecf89b1c87f76580d61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76e727c9-98", "ovs_interfaceid": "76e727c9-9862-4ac4-9b61-705ca2018126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.207 2 WARNING nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.212 2 DEBUG nova.virt.libvirt.host [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.213 2 DEBUG nova.virt.libvirt.host [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.221 2 DEBUG nova.virt.libvirt.host [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.221 2 DEBUG nova.virt.libvirt.host [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.223 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.223 2 DEBUG nova.virt.hardware [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.224 2 DEBUG nova.virt.hardware [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.224 2 DEBUG nova.virt.hardware [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.224 2 DEBUG nova.virt.hardware [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.225 2 DEBUG nova.virt.hardware [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.225 2 DEBUG nova.virt.hardware [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.225 2 DEBUG nova.virt.hardware [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.226 2 DEBUG nova.virt.hardware [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.226 2 DEBUG nova.virt.hardware [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.226 2 DEBUG nova.virt.hardware [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.227 2 DEBUG nova.virt.hardware [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.230 2 DEBUG oslo_concurrency.processutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:32:23 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1614874640' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.694 2 DEBUG oslo_concurrency.processutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.738 2 DEBUG nova.storage.rbd_utils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] rbd image a94a24b4-e399-49e0-a52b-1383be9a816f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.744 2 DEBUG oslo_concurrency.processutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:23 np0005465988 nova_compute[236126]: 2025-10-02 12:32:23.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:23.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:32:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2322270525' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.236 2 DEBUG oslo_concurrency.processutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.239 2 DEBUG nova.virt.libvirt.vif [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-532644467',display_name='tempest-ServerPasswordTestJSON-server-532644467',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-532644467',id=121,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e758f5e629284ecf89b1c87f76580d61',ramdisk_id='',reservation_id='r-in0me9hf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-833666318',owner_user_name='tempest-ServerPasswordTestJSON-
833666318-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:19Z,user_data=None,user_id='48a75d0d93424c10a37b179785fd1b2e',uuid=a94a24b4-e399-49e0-a52b-1383be9a816f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "76e727c9-9862-4ac4-9b61-705ca2018126", "address": "fa:16:3e:6b:69:f8", "network": {"id": "bc383e10-03c6-43e5-a1dc-b686999621f5", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-293120118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e758f5e629284ecf89b1c87f76580d61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76e727c9-98", "ovs_interfaceid": "76e727c9-9862-4ac4-9b61-705ca2018126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.240 2 DEBUG nova.network.os_vif_util [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Converting VIF {"id": "76e727c9-9862-4ac4-9b61-705ca2018126", "address": "fa:16:3e:6b:69:f8", "network": {"id": "bc383e10-03c6-43e5-a1dc-b686999621f5", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-293120118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e758f5e629284ecf89b1c87f76580d61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76e727c9-98", "ovs_interfaceid": "76e727c9-9862-4ac4-9b61-705ca2018126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.242 2 DEBUG nova.network.os_vif_util [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:69:f8,bridge_name='br-int',has_traffic_filtering=True,id=76e727c9-9862-4ac4-9b61-705ca2018126,network=Network(bc383e10-03c6-43e5-a1dc-b686999621f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76e727c9-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.244 2 DEBUG nova.objects.instance [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lazy-loading 'pci_devices' on Instance uuid a94a24b4-e399-49e0-a52b-1383be9a816f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:24.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.511 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  <uuid>a94a24b4-e399-49e0-a52b-1383be9a816f</uuid>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  <name>instance-00000079</name>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerPasswordTestJSON-server-532644467</nova:name>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:32:23</nova:creationTime>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <nova:user uuid="48a75d0d93424c10a37b179785fd1b2e">tempest-ServerPasswordTestJSON-833666318-project-member</nova:user>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <nova:project uuid="e758f5e629284ecf89b1c87f76580d61">tempest-ServerPasswordTestJSON-833666318</nova:project>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <nova:port uuid="76e727c9-9862-4ac4-9b61-705ca2018126">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <entry name="serial">a94a24b4-e399-49e0-a52b-1383be9a816f</entry>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <entry name="uuid">a94a24b4-e399-49e0-a52b-1383be9a816f</entry>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/a94a24b4-e399-49e0-a52b-1383be9a816f_disk">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/a94a24b4-e399-49e0-a52b-1383be9a816f_disk.config">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:6b:69:f8"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <target dev="tap76e727c9-98"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/a94a24b4-e399-49e0-a52b-1383be9a816f/console.log" append="off"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:32:24 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:32:24 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:32:24 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:32:24 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.513 2 DEBUG nova.compute.manager [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Preparing to wait for external event network-vif-plugged-76e727c9-9862-4ac4-9b61-705ca2018126 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.513 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Acquiring lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.514 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.514 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.515 2 DEBUG nova.virt.libvirt.vif [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-532644467',display_name='tempest-ServerPasswordTestJSON-server-532644467',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-532644467',id=121,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e758f5e629284ecf89b1c87f76580d61',ramdisk_id='',reservation_id='r-in0me9hf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-833666318',owner_user_name='tempest-ServerPasswor
dTestJSON-833666318-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:19Z,user_data=None,user_id='48a75d0d93424c10a37b179785fd1b2e',uuid=a94a24b4-e399-49e0-a52b-1383be9a816f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "76e727c9-9862-4ac4-9b61-705ca2018126", "address": "fa:16:3e:6b:69:f8", "network": {"id": "bc383e10-03c6-43e5-a1dc-b686999621f5", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-293120118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e758f5e629284ecf89b1c87f76580d61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76e727c9-98", "ovs_interfaceid": "76e727c9-9862-4ac4-9b61-705ca2018126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.515 2 DEBUG nova.network.os_vif_util [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Converting VIF {"id": "76e727c9-9862-4ac4-9b61-705ca2018126", "address": "fa:16:3e:6b:69:f8", "network": {"id": "bc383e10-03c6-43e5-a1dc-b686999621f5", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-293120118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e758f5e629284ecf89b1c87f76580d61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76e727c9-98", "ovs_interfaceid": "76e727c9-9862-4ac4-9b61-705ca2018126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.516 2 DEBUG nova.network.os_vif_util [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:69:f8,bridge_name='br-int',has_traffic_filtering=True,id=76e727c9-9862-4ac4-9b61-705ca2018126,network=Network(bc383e10-03c6-43e5-a1dc-b686999621f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76e727c9-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.516 2 DEBUG os_vif [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:69:f8,bridge_name='br-int',has_traffic_filtering=True,id=76e727c9-9862-4ac4-9b61-705ca2018126,network=Network(bc383e10-03c6-43e5-a1dc-b686999621f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76e727c9-98') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.517 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.518 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.524 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap76e727c9-98, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.525 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap76e727c9-98, col_values=(('external_ids', {'iface-id': '76e727c9-9862-4ac4-9b61-705ca2018126', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6b:69:f8', 'vm-uuid': 'a94a24b4-e399-49e0-a52b-1383be9a816f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:24 np0005465988 NetworkManager[45041]: <info>  [1759408344.5284] manager: (tap76e727c9-98): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.542 2 INFO os_vif [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:69:f8,bridge_name='br-int',has_traffic_filtering=True,id=76e727c9-9862-4ac4-9b61-705ca2018126,network=Network(bc383e10-03c6-43e5-a1dc-b686999621f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76e727c9-98')#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.615 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.616 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.616 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] No VIF found with MAC fa:16:3e:6b:69:f8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.616 2 INFO nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Using config drive#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.645 2 DEBUG nova.storage.rbd_utils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] rbd image a94a24b4-e399-49e0-a52b-1383be9a816f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:32:24 np0005465988 nova_compute[236126]: 2025-10-02 12:32:24.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:25 np0005465988 nova_compute[236126]: 2025-10-02 12:32:25.172 2 INFO nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Creating config drive at /var/lib/nova/instances/a94a24b4-e399-49e0-a52b-1383be9a816f/disk.config#033[00m
Oct  2 08:32:25 np0005465988 nova_compute[236126]: 2025-10-02 12:32:25.177 2 DEBUG oslo_concurrency.processutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a94a24b4-e399-49e0-a52b-1383be9a816f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuxwwe2xg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:25 np0005465988 nova_compute[236126]: 2025-10-02 12:32:25.333 2 DEBUG oslo_concurrency.processutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a94a24b4-e399-49e0-a52b-1383be9a816f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuxwwe2xg" returned: 0 in 0.156s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:25 np0005465988 nova_compute[236126]: 2025-10-02 12:32:25.378 2 DEBUG nova.storage.rbd_utils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] rbd image a94a24b4-e399-49e0-a52b-1383be9a816f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:32:25 np0005465988 nova_compute[236126]: 2025-10-02 12:32:25.384 2 DEBUG oslo_concurrency.processutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a94a24b4-e399-49e0-a52b-1383be9a816f/disk.config a94a24b4-e399-49e0-a52b-1383be9a816f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:25 np0005465988 nova_compute[236126]: 2025-10-02 12:32:25.435 2 DEBUG nova.network.neutron [req-7800a57a-8d05-4a8f-874b-e324342afd47 req-a2ee7d77-c6a6-4344-998a-61a02dd877cb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Updated VIF entry in instance network info cache for port 76e727c9-9862-4ac4-9b61-705ca2018126. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:32:25 np0005465988 nova_compute[236126]: 2025-10-02 12:32:25.436 2 DEBUG nova.network.neutron [req-7800a57a-8d05-4a8f-874b-e324342afd47 req-a2ee7d77-c6a6-4344-998a-61a02dd877cb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Updating instance_info_cache with network_info: [{"id": "76e727c9-9862-4ac4-9b61-705ca2018126", "address": "fa:16:3e:6b:69:f8", "network": {"id": "bc383e10-03c6-43e5-a1dc-b686999621f5", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-293120118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e758f5e629284ecf89b1c87f76580d61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76e727c9-98", "ovs_interfaceid": "76e727c9-9862-4ac4-9b61-705ca2018126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:25 np0005465988 nova_compute[236126]: 2025-10-02 12:32:25.463 2 DEBUG oslo_concurrency.lockutils [req-7800a57a-8d05-4a8f-874b-e324342afd47 req-a2ee7d77-c6a6-4344-998a-61a02dd877cb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-a94a24b4-e399-49e0-a52b-1383be9a816f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:25 np0005465988 nova_compute[236126]: 2025-10-02 12:32:25.611 2 DEBUG oslo_concurrency.processutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a94a24b4-e399-49e0-a52b-1383be9a816f/disk.config a94a24b4-e399-49e0-a52b-1383be9a816f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.228s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:25 np0005465988 nova_compute[236126]: 2025-10-02 12:32:25.612 2 INFO nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Deleting local config drive /var/lib/nova/instances/a94a24b4-e399-49e0-a52b-1383be9a816f/disk.config because it was imported into RBD.#033[00m
Oct  2 08:32:25 np0005465988 kernel: tap76e727c9-98: entered promiscuous mode
Oct  2 08:32:25 np0005465988 NetworkManager[45041]: <info>  [1759408345.6649] manager: (tap76e727c9-98): new Tun device (/org/freedesktop/NetworkManager/Devices/253)
Oct  2 08:32:25 np0005465988 nova_compute[236126]: 2025-10-02 12:32:25.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:25Z|00559|binding|INFO|Claiming lport 76e727c9-9862-4ac4-9b61-705ca2018126 for this chassis.
Oct  2 08:32:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:25Z|00560|binding|INFO|76e727c9-9862-4ac4-9b61-705ca2018126: Claiming fa:16:3e:6b:69:f8 10.100.0.10
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.679 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:69:f8 10.100.0.10'], port_security=['fa:16:3e:6b:69:f8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a94a24b4-e399-49e0-a52b-1383be9a816f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc383e10-03c6-43e5-a1dc-b686999621f5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e758f5e629284ecf89b1c87f76580d61', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f5cd8566-d43b-4210-a7b8-5a9ea35db7c3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6275ec20-52db-42fc-9773-a542a88bfcdd, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=76e727c9-9862-4ac4-9b61-705ca2018126) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.680 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 76e727c9-9862-4ac4-9b61-705ca2018126 in datapath bc383e10-03c6-43e5-a1dc-b686999621f5 bound to our chassis#033[00m
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.682 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bc383e10-03c6-43e5-a1dc-b686999621f5#033[00m
Oct  2 08:32:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:25Z|00561|binding|INFO|Setting lport 76e727c9-9862-4ac4-9b61-705ca2018126 ovn-installed in OVS
Oct  2 08:32:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:25Z|00562|binding|INFO|Setting lport 76e727c9-9862-4ac4-9b61-705ca2018126 up in Southbound
Oct  2 08:32:25 np0005465988 nova_compute[236126]: 2025-10-02 12:32:25.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:25 np0005465988 nova_compute[236126]: 2025-10-02 12:32:25.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.696 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd79ecc-f00e-4c51-938f-7856fc9cbf35]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.697 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbc383e10-01 in ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.699 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbc383e10-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.699 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b84bf91b-3527-4072-9f8c-bafdeab263c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.700 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6b6a2767-6cf9-4ebf-965a-7546ea742cb2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:25 np0005465988 systemd-udevd[289450]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.712 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[0d8f9208-a2f9-4785-b940-df7b2d78edaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:25 np0005465988 systemd-machined[192594]: New machine qemu-54-instance-00000079.
Oct  2 08:32:25 np0005465988 NetworkManager[45041]: <info>  [1759408345.7180] device (tap76e727c9-98): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:32:25 np0005465988 NetworkManager[45041]: <info>  [1759408345.7191] device (tap76e727c9-98): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:32:25 np0005465988 systemd[1]: Started Virtual Machine qemu-54-instance-00000079.
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.727 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[43280ffa-2e09-4eeb-87d2-9e23c913086d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.762 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f1f09667-2528-47f1-b7f9-f4c771f16945]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.770 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5a614a6d-b385-4464-9e0c-7590a77ad68c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:25 np0005465988 NetworkManager[45041]: <info>  [1759408345.7718] manager: (tapbc383e10-00): new Veth device (/org/freedesktop/NetworkManager/Devices/254)
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.805 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[74bc95f7-3e83-4d4d-a2b0-016ea34c9c7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.808 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc69467-f4da-4ea8-948c-4f2ac2ddce1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:25 np0005465988 NetworkManager[45041]: <info>  [1759408345.8412] device (tapbc383e10-00): carrier: link connected
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.849 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[916420b2-dc66-4755-9355-f78cff527914]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.868 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d38a4cdc-8886-4155-b3d7-8ee27cc92916]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbc383e10-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:27:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640617, 'reachable_time': 22528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289483, 'error': None, 'target': 'ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.886 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9c7b80a4-fc31-479c-9496-36b5946c540c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:27e3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 640617, 'tstamp': 640617}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289484, 'error': None, 'target': 'ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.908 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[544e78b4-87b7-40f8-ba8f-12658888d43c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbc383e10-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:27:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640617, 'reachable_time': 22528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 289485, 'error': None, 'target': 'ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:25.945 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a50fe6e7-fbb4-41b8-9f89-7ee915c41013]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:25.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:26.002 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5e37d5b1-5811-47ef-b9f3-cf02453b236a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:26.003 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc383e10-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:26.003 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:26.003 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbc383e10-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:26 np0005465988 kernel: tapbc383e10-00: entered promiscuous mode
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:26 np0005465988 NetworkManager[45041]: <info>  [1759408346.0062] manager: (tapbc383e10-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:26.009 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbc383e10-00, col_values=(('external_ids', {'iface-id': '4881a572-2b0b-4694-b435-7abbc48204f9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:26 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:26Z|00563|binding|INFO|Releasing lport 4881a572-2b0b-4694-b435-7abbc48204f9 from this chassis (sb_readonly=0)
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.027 2 DEBUG nova.compute.manager [req-76596086-d75d-4e6b-b6c4-c17ba6062b22 req-413657b9-9174-4e5d-8002-5b1900fbf1c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Received event network-vif-plugged-76e727c9-9862-4ac4-9b61-705ca2018126 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.027 2 DEBUG oslo_concurrency.lockutils [req-76596086-d75d-4e6b-b6c4-c17ba6062b22 req-413657b9-9174-4e5d-8002-5b1900fbf1c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.028 2 DEBUG oslo_concurrency.lockutils [req-76596086-d75d-4e6b-b6c4-c17ba6062b22 req-413657b9-9174-4e5d-8002-5b1900fbf1c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.028 2 DEBUG oslo_concurrency.lockutils [req-76596086-d75d-4e6b-b6c4-c17ba6062b22 req-413657b9-9174-4e5d-8002-5b1900fbf1c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.028 2 DEBUG nova.compute.manager [req-76596086-d75d-4e6b-b6c4-c17ba6062b22 req-413657b9-9174-4e5d-8002-5b1900fbf1c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Processing event network-vif-plugged-76e727c9-9862-4ac4-9b61-705ca2018126 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:26.032 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bc383e10-03c6-43e5-a1dc-b686999621f5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bc383e10-03c6-43e5-a1dc-b686999621f5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:26.033 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[826a2bda-7c5d-4afb-8c5e-71ca070b4b63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:26.034 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-bc383e10-03c6-43e5-a1dc-b686999621f5
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/bc383e10-03c6-43e5-a1dc-b686999621f5.pid.haproxy
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID bc383e10-03c6-43e5-a1dc-b686999621f5
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:32:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:26.035 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5', 'env', 'PROCESS_TAG=haproxy-bc383e10-03c6-43e5-a1dc-b686999621f5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bc383e10-03c6-43e5-a1dc-b686999621f5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:32:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:26.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:26 np0005465988 podman[289609]: 2025-10-02 12:32:26.450787822 +0000 UTC m=+0.025483374 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.786 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408346.786236, a94a24b4-e399-49e0-a52b-1383be9a816f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.787 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] VM Started (Lifecycle Event)#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.789 2 DEBUG nova.compute.manager [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.794 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.799 2 INFO nova.virt.libvirt.driver [-] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Instance spawned successfully.#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.800 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:32:26 np0005465988 podman[289609]: 2025-10-02 12:32:26.811336002 +0000 UTC m=+0.386031534 container create 9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.820 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.831 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.837 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.837 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.838 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.840 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.840 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.841 2 DEBUG nova.virt.libvirt.driver [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.853 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.854 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408346.7864988, a94a24b4-e399-49e0-a52b-1383be9a816f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.855 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:32:26 np0005465988 systemd[1]: Started libpod-conmon-9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f.scope.
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.876 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.880 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408346.7924461, a94a24b4-e399-49e0-a52b-1383be9a816f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.880 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:32:26 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:32:26 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90ff9b0b5cfb7149830bdbbfb23d96d89bc5814f0506f8b4ebb2aac676336601/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.915 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.919 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:32:26 np0005465988 podman[289609]: 2025-10-02 12:32:26.925944079 +0000 UTC m=+0.500639631 container init 9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:32:26 np0005465988 podman[289609]: 2025-10-02 12:32:26.933553718 +0000 UTC m=+0.508249240 container start 9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.951 2 INFO nova.compute.manager [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Took 7.30 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:32:26 np0005465988 nova_compute[236126]: 2025-10-02 12:32:26.951 2 DEBUG nova.compute.manager [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:26 np0005465988 neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5[289624]: [NOTICE]   (289628) : New worker (289631) forked
Oct  2 08:32:26 np0005465988 neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5[289624]: [NOTICE]   (289628) : Loading success.
Oct  2 08:32:27 np0005465988 nova_compute[236126]: 2025-10-02 12:32:27.006 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:32:27 np0005465988 nova_compute[236126]: 2025-10-02 12:32:27.055 2 INFO nova.compute.manager [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Took 8.59 seconds to build instance.#033[00m
Oct  2 08:32:27 np0005465988 nova_compute[236126]: 2025-10-02 12:32:27.072 2 DEBUG oslo_concurrency.lockutils [None req-4b66631f-a82d-418f-8dc4-8a12adf14a56 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:27.362 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:27.363 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:27.365 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e320 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:32:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:27.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.215 2 DEBUG nova.compute.manager [req-c5214fae-0482-46bd-8218-711b70d457d1 req-26b4470b-06a0-4050-9465-612a907f3d8b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Received event network-vif-plugged-76e727c9-9862-4ac4-9b61-705ca2018126 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.215 2 DEBUG oslo_concurrency.lockutils [req-c5214fae-0482-46bd-8218-711b70d457d1 req-26b4470b-06a0-4050-9465-612a907f3d8b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.216 2 DEBUG oslo_concurrency.lockutils [req-c5214fae-0482-46bd-8218-711b70d457d1 req-26b4470b-06a0-4050-9465-612a907f3d8b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.216 2 DEBUG oslo_concurrency.lockutils [req-c5214fae-0482-46bd-8218-711b70d457d1 req-26b4470b-06a0-4050-9465-612a907f3d8b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.216 2 DEBUG nova.compute.manager [req-c5214fae-0482-46bd-8218-711b70d457d1 req-26b4470b-06a0-4050-9465-612a907f3d8b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] No waiting events found dispatching network-vif-plugged-76e727c9-9862-4ac4-9b61-705ca2018126 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.216 2 WARNING nova.compute.manager [req-c5214fae-0482-46bd-8218-711b70d457d1 req-26b4470b-06a0-4050-9465-612a907f3d8b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Received unexpected event network-vif-plugged-76e727c9-9862-4ac4-9b61-705ca2018126 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:32:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:28.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.502 2 DEBUG oslo_concurrency.lockutils [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Acquiring lock "a94a24b4-e399-49e0-a52b-1383be9a816f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.502 2 DEBUG oslo_concurrency.lockutils [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.503 2 DEBUG oslo_concurrency.lockutils [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Acquiring lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.503 2 DEBUG oslo_concurrency.lockutils [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.503 2 DEBUG oslo_concurrency.lockutils [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.504 2 INFO nova.compute.manager [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Terminating instance#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.505 2 DEBUG nova.compute.manager [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:32:28 np0005465988 kernel: tap76e727c9-98 (unregistering): left promiscuous mode
Oct  2 08:32:28 np0005465988 NetworkManager[45041]: <info>  [1759408348.5773] device (tap76e727c9-98): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:32:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:28Z|00564|binding|INFO|Releasing lport 76e727c9-9862-4ac4-9b61-705ca2018126 from this chassis (sb_readonly=0)
Oct  2 08:32:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:28Z|00565|binding|INFO|Setting lport 76e727c9-9862-4ac4-9b61-705ca2018126 down in Southbound
Oct  2 08:32:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:28Z|00566|binding|INFO|Removing iface tap76e727c9-98 ovn-installed in OVS
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:28.608 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:69:f8 10.100.0.10'], port_security=['fa:16:3e:6b:69:f8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a94a24b4-e399-49e0-a52b-1383be9a816f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc383e10-03c6-43e5-a1dc-b686999621f5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e758f5e629284ecf89b1c87f76580d61', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f5cd8566-d43b-4210-a7b8-5a9ea35db7c3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6275ec20-52db-42fc-9773-a542a88bfcdd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=76e727c9-9862-4ac4-9b61-705ca2018126) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:32:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:28.610 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 76e727c9-9862-4ac4-9b61-705ca2018126 in datapath bc383e10-03c6-43e5-a1dc-b686999621f5 unbound from our chassis#033[00m
Oct  2 08:32:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:28.613 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc383e10-03c6-43e5-a1dc-b686999621f5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:32:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:28.615 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8cdbfc7f-702f-46e2-b49e-af14bc8a36d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:28.615 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5 namespace which is not needed anymore#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:28 np0005465988 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000079.scope: Deactivated successfully.
Oct  2 08:32:28 np0005465988 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000079.scope: Consumed 2.802s CPU time.
Oct  2 08:32:28 np0005465988 systemd-machined[192594]: Machine qemu-54-instance-00000079 terminated.
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.750 2 INFO nova.virt.libvirt.driver [-] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Instance destroyed successfully.#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.750 2 DEBUG nova.objects.instance [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lazy-loading 'resources' on Instance uuid a94a24b4-e399-49e0-a52b-1383be9a816f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.773 2 DEBUG nova.virt.libvirt.vif [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-532644467',display_name='tempest-ServerPasswordTestJSON-server-532644467',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-532644467',id=121,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:32:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e758f5e629284ecf89b1c87f76580d61',ramdisk_id='',reservation_id='r-in0me9hf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-ServerPasswordTestJSON-833666318',owner_user_name='tempest-ServerPasswordTestJSON-833666318-project-member',password_0='',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:32:28Z,user_data=None,user_id='48a75d0d93424c10a37b179785fd1b2e',uuid=a94a24b4-e399-49e0-a52b-1383be9a816f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "76e727c9-9862-4ac4-9b61-705ca2018126", "address": "fa:16:3e:6b:69:f8", "network": {"id": "bc383e10-03c6-43e5-a1dc-b686999621f5", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-293120118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e758f5e629284ecf89b1c87f76580d61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76e727c9-98", "ovs_interfaceid": "76e727c9-9862-4ac4-9b61-705ca2018126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.773 2 DEBUG nova.network.os_vif_util [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Converting VIF {"id": "76e727c9-9862-4ac4-9b61-705ca2018126", "address": "fa:16:3e:6b:69:f8", "network": {"id": "bc383e10-03c6-43e5-a1dc-b686999621f5", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-293120118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e758f5e629284ecf89b1c87f76580d61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76e727c9-98", "ovs_interfaceid": "76e727c9-9862-4ac4-9b61-705ca2018126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.774 2 DEBUG nova.network.os_vif_util [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:69:f8,bridge_name='br-int',has_traffic_filtering=True,id=76e727c9-9862-4ac4-9b61-705ca2018126,network=Network(bc383e10-03c6-43e5-a1dc-b686999621f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76e727c9-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.774 2 DEBUG os_vif [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:69:f8,bridge_name='br-int',has_traffic_filtering=True,id=76e727c9-9862-4ac4-9b61-705ca2018126,network=Network(bc383e10-03c6-43e5-a1dc-b686999621f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76e727c9-98') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.776 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76e727c9-98, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:28 np0005465988 nova_compute[236126]: 2025-10-02 12:32:28.781 2 INFO os_vif [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:69:f8,bridge_name='br-int',has_traffic_filtering=True,id=76e727c9-9862-4ac4-9b61-705ca2018126,network=Network(bc383e10-03c6-43e5-a1dc-b686999621f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76e727c9-98')#033[00m
Oct  2 08:32:28 np0005465988 neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5[289624]: [NOTICE]   (289628) : haproxy version is 2.8.14-c23fe91
Oct  2 08:32:28 np0005465988 neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5[289624]: [NOTICE]   (289628) : path to executable is /usr/sbin/haproxy
Oct  2 08:32:28 np0005465988 neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5[289624]: [WARNING]  (289628) : Exiting Master process...
Oct  2 08:32:28 np0005465988 neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5[289624]: [WARNING]  (289628) : Exiting Master process...
Oct  2 08:32:28 np0005465988 neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5[289624]: [ALERT]    (289628) : Current worker (289631) exited with code 143 (Terminated)
Oct  2 08:32:28 np0005465988 neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5[289624]: [WARNING]  (289628) : All workers exited. Exiting... (0)
Oct  2 08:32:28 np0005465988 systemd[1]: libpod-9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f.scope: Deactivated successfully.
Oct  2 08:32:28 np0005465988 conmon[289624]: conmon 9c1e48a4e01621f98417 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f.scope/container/memory.events
Oct  2 08:32:28 np0005465988 podman[289664]: 2025-10-02 12:32:28.807842167 +0000 UTC m=+0.054885380 container died 9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:32:28 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f-userdata-shm.mount: Deactivated successfully.
Oct  2 08:32:28 np0005465988 systemd[1]: var-lib-containers-storage-overlay-90ff9b0b5cfb7149830bdbbfb23d96d89bc5814f0506f8b4ebb2aac676336601-merged.mount: Deactivated successfully.
Oct  2 08:32:28 np0005465988 podman[289664]: 2025-10-02 12:32:28.874158164 +0000 UTC m=+0.121201377 container cleanup 9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:32:28 np0005465988 systemd[1]: libpod-conmon-9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f.scope: Deactivated successfully.
Oct  2 08:32:29 np0005465988 podman[289720]: 2025-10-02 12:32:29.006980135 +0000 UTC m=+0.103286662 container remove 9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:32:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:29.013 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c21994c6-f7ba-4701-b62c-7e25341b031d]: (4, ('Thu Oct  2 12:32:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5 (9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f)\n9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f\nThu Oct  2 12:32:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5 (9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f)\n9c1e48a4e01621f984177c73d7d5e942c3e8bbcc89f29f1b818a255d8d89365f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:29.016 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0dbc7684-fc52-4ec3-9de6-a7c547c10a3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:29.018 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc383e10-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:29 np0005465988 nova_compute[236126]: 2025-10-02 12:32:29.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:29 np0005465988 kernel: tapbc383e10-00: left promiscuous mode
Oct  2 08:32:29 np0005465988 nova_compute[236126]: 2025-10-02 12:32:29.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:29.038 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6ff0e997-7123-4143-a73c-202838547b3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:29.074 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f1baa5ef-1c38-4d13-9b7c-34b62945bc40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:29.076 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b712841f-6431-4b76-8c99-88593e0e27bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:29.092 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[93795751-3053-40c5-b0ae-1eab20ccd4b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640608, 'reachable_time': 44111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289736, 'error': None, 'target': 'ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005465988 systemd[1]: run-netns-ovnmeta\x2dbc383e10\x2d03c6\x2d43e5\x2da1dc\x2db686999621f5.mount: Deactivated successfully.
Oct  2 08:32:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:29.094 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bc383e10-03c6-43e5-a1dc-b686999621f5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:32:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:29.095 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[fcfa4a2f-7b69-48c4-8ad5-5d5c9ef8e091]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005465988 nova_compute[236126]: 2025-10-02 12:32:29.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:32:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:29.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.314 2 DEBUG nova.compute.manager [req-59c76d74-3a9f-43ab-9f30-3e1593daef7d req-9fb67843-8a87-444b-bafe-3b581ec3b86b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Received event network-vif-unplugged-76e727c9-9862-4ac4-9b61-705ca2018126 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.314 2 DEBUG oslo_concurrency.lockutils [req-59c76d74-3a9f-43ab-9f30-3e1593daef7d req-9fb67843-8a87-444b-bafe-3b581ec3b86b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.315 2 DEBUG oslo_concurrency.lockutils [req-59c76d74-3a9f-43ab-9f30-3e1593daef7d req-9fb67843-8a87-444b-bafe-3b581ec3b86b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.315 2 DEBUG oslo_concurrency.lockutils [req-59c76d74-3a9f-43ab-9f30-3e1593daef7d req-9fb67843-8a87-444b-bafe-3b581ec3b86b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.316 2 DEBUG nova.compute.manager [req-59c76d74-3a9f-43ab-9f30-3e1593daef7d req-9fb67843-8a87-444b-bafe-3b581ec3b86b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] No waiting events found dispatching network-vif-unplugged-76e727c9-9862-4ac4-9b61-705ca2018126 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.316 2 DEBUG nova.compute.manager [req-59c76d74-3a9f-43ab-9f30-3e1593daef7d req-9fb67843-8a87-444b-bafe-3b581ec3b86b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Received event network-vif-unplugged-76e727c9-9862-4ac4-9b61-705ca2018126 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.316 2 DEBUG nova.compute.manager [req-59c76d74-3a9f-43ab-9f30-3e1593daef7d req-9fb67843-8a87-444b-bafe-3b581ec3b86b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Received event network-vif-plugged-76e727c9-9862-4ac4-9b61-705ca2018126 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.317 2 DEBUG oslo_concurrency.lockutils [req-59c76d74-3a9f-43ab-9f30-3e1593daef7d req-9fb67843-8a87-444b-bafe-3b581ec3b86b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.317 2 DEBUG oslo_concurrency.lockutils [req-59c76d74-3a9f-43ab-9f30-3e1593daef7d req-9fb67843-8a87-444b-bafe-3b581ec3b86b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.317 2 DEBUG oslo_concurrency.lockutils [req-59c76d74-3a9f-43ab-9f30-3e1593daef7d req-9fb67843-8a87-444b-bafe-3b581ec3b86b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.318 2 DEBUG nova.compute.manager [req-59c76d74-3a9f-43ab-9f30-3e1593daef7d req-9fb67843-8a87-444b-bafe-3b581ec3b86b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] No waiting events found dispatching network-vif-plugged-76e727c9-9862-4ac4-9b61-705ca2018126 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.318 2 WARNING nova.compute.manager [req-59c76d74-3a9f-43ab-9f30-3e1593daef7d req-9fb67843-8a87-444b-bafe-3b581ec3b86b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Received unexpected event network-vif-plugged-76e727c9-9862-4ac4-9b61-705ca2018126 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:32:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:30.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.860 2 INFO nova.virt.libvirt.driver [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Deleting instance files /var/lib/nova/instances/a94a24b4-e399-49e0-a52b-1383be9a816f_del#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.861 2 INFO nova.virt.libvirt.driver [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Deletion of /var/lib/nova/instances/a94a24b4-e399-49e0-a52b-1383be9a816f_del complete#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.966 2 INFO nova.compute.manager [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Took 2.46 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.967 2 DEBUG oslo.service.loopingcall [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.968 2 DEBUG nova.compute.manager [-] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:32:30 np0005465988 nova_compute[236126]: 2025-10-02 12:32:30.968 2 DEBUG nova.network.neutron [-] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:32:31 np0005465988 nova_compute[236126]: 2025-10-02 12:32:31.813 2 DEBUG nova.network.neutron [-] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:31 np0005465988 nova_compute[236126]: 2025-10-02 12:32:31.838 2 INFO nova.compute.manager [-] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Took 0.87 seconds to deallocate network for instance.#033[00m
Oct  2 08:32:31 np0005465988 nova_compute[236126]: 2025-10-02 12:32:31.878 2 DEBUG oslo_concurrency.lockutils [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:31 np0005465988 nova_compute[236126]: 2025-10-02 12:32:31.878 2 DEBUG oslo_concurrency.lockutils [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:32:31 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1631283109' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:32:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:31.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:31 np0005465988 nova_compute[236126]: 2025-10-02 12:32:31.999 2 DEBUG oslo_concurrency.processutils [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:32 np0005465988 nova_compute[236126]: 2025-10-02 12:32:32.042 2 DEBUG nova.compute.manager [req-1034b6fc-2dc1-46e5-9d0c-7cf1a7f07bb7 req-9403d98c-c689-4ccd-b3a4-337f7114a4f9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Received event network-vif-deleted-76e727c9-9862-4ac4-9b61-705ca2018126 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:32:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:32.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:32:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:32:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4142552725' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:32:32 np0005465988 nova_compute[236126]: 2025-10-02 12:32:32.493 2 DEBUG oslo_concurrency.processutils [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:32 np0005465988 nova_compute[236126]: 2025-10-02 12:32:32.502 2 DEBUG nova.compute.provider_tree [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:32:32 np0005465988 nova_compute[236126]: 2025-10-02 12:32:32.566 2 DEBUG nova.scheduler.client.report [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:32:32 np0005465988 nova_compute[236126]: 2025-10-02 12:32:32.614 2 DEBUG oslo_concurrency.lockutils [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.736s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:32 np0005465988 nova_compute[236126]: 2025-10-02 12:32:32.639 2 INFO nova.scheduler.client.report [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Deleted allocations for instance a94a24b4-e399-49e0-a52b-1383be9a816f#033[00m
Oct  2 08:32:32 np0005465988 nova_compute[236126]: 2025-10-02 12:32:32.693 2 DEBUG oslo_concurrency.lockutils [None req-09befaa6-e95d-4033-a944-f804689d6d81 48a75d0d93424c10a37b179785fd1b2e e758f5e629284ecf89b1c87f76580d61 - - default default] Lock "a94a24b4-e399-49e0-a52b-1383be9a816f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e320 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:32:32 np0005465988 nova_compute[236126]: 2025-10-02 12:32:32.874 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Acquiring lock "11c595fe-756a-4f19-8c39-0c834af96d6a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:32 np0005465988 nova_compute[236126]: 2025-10-02 12:32:32.874 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:32 np0005465988 nova_compute[236126]: 2025-10-02 12:32:32.914 2 DEBUG nova.compute.manager [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:32:32 np0005465988 nova_compute[236126]: 2025-10-02 12:32:32.988 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:32 np0005465988 nova_compute[236126]: 2025-10-02 12:32:32.989 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:32 np0005465988 nova_compute[236126]: 2025-10-02 12:32:32.995 2 DEBUG nova.virt.hardware [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:32:32 np0005465988 nova_compute[236126]: 2025-10-02 12:32:32.996 2 INFO nova.compute.claims [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.117 2 DEBUG oslo_concurrency.processutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:32:33 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2458578435' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.574 2 DEBUG oslo_concurrency.processutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.580 2 DEBUG nova.compute.provider_tree [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.612 2 DEBUG nova.scheduler.client.report [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.632 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.633 2 DEBUG nova.compute.manager [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.684 2 DEBUG nova.compute.manager [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.684 2 DEBUG nova.network.neutron [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.706 2 INFO nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.733 2 DEBUG nova.compute.manager [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.856 2 DEBUG nova.compute.manager [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.857 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.858 2 INFO nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Creating image(s)#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.889 2 DEBUG nova.storage.rbd_utils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] rbd image 11c595fe-756a-4f19-8c39-0c834af96d6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.923 2 DEBUG nova.storage.rbd_utils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] rbd image 11c595fe-756a-4f19-8c39-0c834af96d6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.957 2 DEBUG nova.storage.rbd_utils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] rbd image 11c595fe-756a-4f19-8c39-0c834af96d6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:32:33 np0005465988 nova_compute[236126]: 2025-10-02 12:32:33.963 2 DEBUG oslo_concurrency.processutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:33.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:34 np0005465988 nova_compute[236126]: 2025-10-02 12:32:34.004 2 DEBUG nova.policy [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '95612007183445418f12dc53405b3e7b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f2fdda5532bd4487b413e696cfbf1197', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:32:34 np0005465988 nova_compute[236126]: 2025-10-02 12:32:34.044 2 DEBUG oslo_concurrency.processutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:34 np0005465988 nova_compute[236126]: 2025-10-02 12:32:34.045 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:34 np0005465988 nova_compute[236126]: 2025-10-02 12:32:34.046 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:34 np0005465988 nova_compute[236126]: 2025-10-02 12:32:34.046 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:34 np0005465988 nova_compute[236126]: 2025-10-02 12:32:34.076 2 DEBUG nova.storage.rbd_utils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] rbd image 11c595fe-756a-4f19-8c39-0c834af96d6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:32:34 np0005465988 nova_compute[236126]: 2025-10-02 12:32:34.081 2 DEBUG oslo_concurrency.processutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 11c595fe-756a-4f19-8c39-0c834af96d6a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:32:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:34.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:32:34 np0005465988 nova_compute[236126]: 2025-10-02 12:32:34.618 2 DEBUG oslo_concurrency.processutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 11c595fe-756a-4f19-8c39-0c834af96d6a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.536s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:34 np0005465988 nova_compute[236126]: 2025-10-02 12:32:34.714 2 DEBUG nova.network.neutron [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Successfully created port: 26fd7867-5e43-40ee-bb0a-95d52010310c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:32:34 np0005465988 nova_compute[236126]: 2025-10-02 12:32:34.727 2 DEBUG nova.storage.rbd_utils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] resizing rbd image 11c595fe-756a-4f19-8c39-0c834af96d6a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:32:34 np0005465988 podman[289984]: 2025-10-02 12:32:34.740423565 +0000 UTC m=+0.080277200 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:32:35 np0005465988 nova_compute[236126]: 2025-10-02 12:32:35.032 2 DEBUG nova.objects.instance [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lazy-loading 'migration_context' on Instance uuid 11c595fe-756a-4f19-8c39-0c834af96d6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:35 np0005465988 nova_compute[236126]: 2025-10-02 12:32:35.056 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:32:35 np0005465988 nova_compute[236126]: 2025-10-02 12:32:35.057 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Ensure instance console log exists: /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:32:35 np0005465988 nova_compute[236126]: 2025-10-02 12:32:35.057 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:35 np0005465988 nova_compute[236126]: 2025-10-02 12:32:35.058 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:35 np0005465988 nova_compute[236126]: 2025-10-02 12:32:35.058 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:32:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:32:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:32:35 np0005465988 nova_compute[236126]: 2025-10-02 12:32:35.843 2 DEBUG nova.network.neutron [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Successfully updated port: 26fd7867-5e43-40ee-bb0a-95d52010310c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:32:35 np0005465988 nova_compute[236126]: 2025-10-02 12:32:35.862 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Acquiring lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:35 np0005465988 nova_compute[236126]: 2025-10-02 12:32:35.863 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Acquired lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:35 np0005465988 nova_compute[236126]: 2025-10-02 12:32:35.863 2 DEBUG nova.network.neutron [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:32:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:32:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:35.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:32:36 np0005465988 nova_compute[236126]: 2025-10-02 12:32:36.051 2 DEBUG nova.network.neutron [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:32:36 np0005465988 nova_compute[236126]: 2025-10-02 12:32:36.220 2 DEBUG nova.compute.manager [req-bd870276-e5e5-4185-806b-1cba1f9dfe35 req-2a92968d-64ab-4a2d-b2cb-5f9445699e25 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-changed-26fd7867-5e43-40ee-bb0a-95d52010310c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:36 np0005465988 nova_compute[236126]: 2025-10-02 12:32:36.221 2 DEBUG nova.compute.manager [req-bd870276-e5e5-4185-806b-1cba1f9dfe35 req-2a92968d-64ab-4a2d-b2cb-5f9445699e25 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Refreshing instance network info cache due to event network-changed-26fd7867-5e43-40ee-bb0a-95d52010310c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:32:36 np0005465988 nova_compute[236126]: 2025-10-02 12:32:36.222 2 DEBUG oslo_concurrency.lockutils [req-bd870276-e5e5-4185-806b-1cba1f9dfe35 req-2a92968d-64ab-4a2d-b2cb-5f9445699e25 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:36.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:36Z|00567|binding|INFO|Releasing lport 02b7597d-2fc1-4c56-8603-4dcb0c716c82 from this chassis (sb_readonly=0)
Oct  2 08:32:36 np0005465988 nova_compute[236126]: 2025-10-02 12:32:36.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.046 2 DEBUG nova.network.neutron [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updating instance_info_cache with network_info: [{"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.191 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Releasing lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.192 2 DEBUG nova.compute.manager [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Instance network_info: |[{"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.193 2 DEBUG oslo_concurrency.lockutils [req-bd870276-e5e5-4185-806b-1cba1f9dfe35 req-2a92968d-64ab-4a2d-b2cb-5f9445699e25 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.194 2 DEBUG nova.network.neutron [req-bd870276-e5e5-4185-806b-1cba1f9dfe35 req-2a92968d-64ab-4a2d-b2cb-5f9445699e25 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Refreshing network info cache for port 26fd7867-5e43-40ee-bb0a-95d52010310c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.199 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Start _get_guest_xml network_info=[{"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.206 2 WARNING nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.212 2 DEBUG nova.virt.libvirt.host [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.213 2 DEBUG nova.virt.libvirt.host [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.218 2 DEBUG nova.virt.libvirt.host [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.218 2 DEBUG nova.virt.libvirt.host [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.220 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.221 2 DEBUG nova.virt.hardware [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.222 2 DEBUG nova.virt.hardware [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.222 2 DEBUG nova.virt.hardware [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.223 2 DEBUG nova.virt.hardware [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.223 2 DEBUG nova.virt.hardware [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.224 2 DEBUG nova.virt.hardware [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.224 2 DEBUG nova.virt.hardware [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.225 2 DEBUG nova.virt.hardware [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.225 2 DEBUG nova.virt.hardware [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.226 2 DEBUG nova.virt.hardware [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.226 2 DEBUG nova.virt.hardware [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.232 2 DEBUG oslo_concurrency.processutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:32:37 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2536513026' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.730 2 DEBUG oslo_concurrency.processutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e320 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.836 2 DEBUG nova.storage.rbd_utils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] rbd image 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:32:37 np0005465988 nova_compute[236126]: 2025-10-02 12:32:37.843 2 DEBUG oslo_concurrency.processutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:37.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:32:38 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1278154967' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.333 2 DEBUG oslo_concurrency.processutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.335 2 DEBUG nova.virt.libvirt.vif [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-530066287',display_name='tempest-ServerRescueTestJSONUnderV235-server-530066287',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-530066287',id=124,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f2fdda5532bd4487b413e696cfbf1197',ramdisk_id='',reservation_id='r-icfttuz3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1167302845',owner_user_name='te
mpest-ServerRescueTestJSONUnderV235-1167302845-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:33Z,user_data=None,user_id='95612007183445418f12dc53405b3e7b',uuid=11c595fe-756a-4f19-8c39-0c834af96d6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.336 2 DEBUG nova.network.os_vif_util [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Converting VIF {"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.337 2 DEBUG nova.network.os_vif_util [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:14:b0,bridge_name='br-int',has_traffic_filtering=True,id=26fd7867-5e43-40ee-bb0a-95d52010310c,network=Network(4c4ff335-9221-4a73-8694-cb9e35a2f586),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26fd7867-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.338 2 DEBUG nova.objects.instance [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lazy-loading 'pci_devices' on Instance uuid 11c595fe-756a-4f19-8c39-0c834af96d6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.360 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  <uuid>11c595fe-756a-4f19-8c39-0c834af96d6a</uuid>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  <name>instance-0000007c</name>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-530066287</nova:name>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:32:37</nova:creationTime>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <nova:user uuid="95612007183445418f12dc53405b3e7b">tempest-ServerRescueTestJSONUnderV235-1167302845-project-member</nova:user>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <nova:project uuid="f2fdda5532bd4487b413e696cfbf1197">tempest-ServerRescueTestJSONUnderV235-1167302845</nova:project>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <nova:port uuid="26fd7867-5e43-40ee-bb0a-95d52010310c">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <entry name="serial">11c595fe-756a-4f19-8c39-0c834af96d6a</entry>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <entry name="uuid">11c595fe-756a-4f19-8c39-0c834af96d6a</entry>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/11c595fe-756a-4f19-8c39-0c834af96d6a_disk">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/11c595fe-756a-4f19-8c39-0c834af96d6a_disk.config">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:80:14:b0"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <target dev="tap26fd7867-5e"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/console.log" append="off"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:32:38 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:32:38 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:32:38 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:32:38 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.361 2 DEBUG nova.compute.manager [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Preparing to wait for external event network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.362 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Acquiring lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.362 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.362 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.363 2 DEBUG nova.virt.libvirt.vif [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-530066287',display_name='tempest-ServerRescueTestJSONUnderV235-server-530066287',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-530066287',id=124,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f2fdda5532bd4487b413e696cfbf1197',ramdisk_id='',reservation_id='r-icfttuz3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1167302845',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1167302845-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:33Z,user_data=None,user_id='95612007183445418f12dc53405b3e7b',uuid=11c595fe-756a-4f19-8c39-0c834af96d6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.363 2 DEBUG nova.network.os_vif_util [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Converting VIF {"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.364 2 DEBUG nova.network.os_vif_util [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:14:b0,bridge_name='br-int',has_traffic_filtering=True,id=26fd7867-5e43-40ee-bb0a-95d52010310c,network=Network(4c4ff335-9221-4a73-8694-cb9e35a2f586),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26fd7867-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.364 2 DEBUG os_vif [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:14:b0,bridge_name='br-int',has_traffic_filtering=True,id=26fd7867-5e43-40ee-bb0a-95d52010310c,network=Network(4c4ff335-9221-4a73-8694-cb9e35a2f586),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26fd7867-5e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.365 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.366 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.370 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap26fd7867-5e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.370 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap26fd7867-5e, col_values=(('external_ids', {'iface-id': '26fd7867-5e43-40ee-bb0a-95d52010310c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:14:b0', 'vm-uuid': '11c595fe-756a-4f19-8c39-0c834af96d6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:38 np0005465988 NetworkManager[45041]: <info>  [1759408358.3738] manager: (tap26fd7867-5e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.378 2 INFO os_vif [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:14:b0,bridge_name='br-int',has_traffic_filtering=True,id=26fd7867-5e43-40ee-bb0a-95d52010310c,network=Network(4c4ff335-9221-4a73-8694-cb9e35a2f586),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26fd7867-5e')#033[00m
Oct  2 08:32:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:38.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.437 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.438 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.439 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] No VIF found with MAC fa:16:3e:80:14:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.439 2 INFO nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Using config drive#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.473 2 DEBUG nova.storage.rbd_utils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] rbd image 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:32:38 np0005465988 nova_compute[236126]: 2025-10-02 12:32:38.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:39 np0005465988 nova_compute[236126]: 2025-10-02 12:32:39.768 2 INFO nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Creating config drive at /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/disk.config#033[00m
Oct  2 08:32:39 np0005465988 nova_compute[236126]: 2025-10-02 12:32:39.775 2 DEBUG oslo_concurrency.processutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3bweiiyk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e321 e321: 3 total, 3 up, 3 in
Oct  2 08:32:39 np0005465988 nova_compute[236126]: 2025-10-02 12:32:39.929 2 DEBUG oslo_concurrency.processutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3bweiiyk" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:39.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:40 np0005465988 nova_compute[236126]: 2025-10-02 12:32:40.096 2 DEBUG nova.storage.rbd_utils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] rbd image 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:32:40 np0005465988 nova_compute[236126]: 2025-10-02 12:32:40.102 2 DEBUG oslo_concurrency.processutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/disk.config 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:40.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:40 np0005465988 nova_compute[236126]: 2025-10-02 12:32:40.398 2 DEBUG oslo_concurrency.processutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/disk.config 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.296s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:32:40 np0005465988 nova_compute[236126]: 2025-10-02 12:32:40.399 2 INFO nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Deleting local config drive /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/disk.config because it was imported into RBD.
Oct  2 08:32:40 np0005465988 kernel: tap26fd7867-5e: entered promiscuous mode
Oct  2 08:32:40 np0005465988 nova_compute[236126]: 2025-10-02 12:32:40.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:40 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:40Z|00568|binding|INFO|Claiming lport 26fd7867-5e43-40ee-bb0a-95d52010310c for this chassis.
Oct  2 08:32:40 np0005465988 NetworkManager[45041]: <info>  [1759408360.4787] manager: (tap26fd7867-5e): new Tun device (/org/freedesktop/NetworkManager/Devices/257)
Oct  2 08:32:40 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:40Z|00569|binding|INFO|26fd7867-5e43-40ee-bb0a-95d52010310c: Claiming fa:16:3e:80:14:b0 10.100.0.11
Oct  2 08:32:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:40.493 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:14:b0 10.100.0.11'], port_security=['fa:16:3e:80:14:b0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '11c595fe-756a-4f19-8c39-0c834af96d6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c4ff335-9221-4a73-8694-cb9e35a2f586', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f2fdda5532bd4487b413e696cfbf1197', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5f941b67-d201-4a3f-bc1a-38e632bfe938', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b7aba6e6-d0b4-4668-8db9-bd9393b1e55e, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=26fd7867-5e43-40ee-bb0a-95d52010310c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:32:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:40.495 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 26fd7867-5e43-40ee-bb0a-95d52010310c in datapath 4c4ff335-9221-4a73-8694-cb9e35a2f586 bound to our chassis
Oct  2 08:32:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:40.498 142124 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4c4ff335-9221-4a73-8694-cb9e35a2f586 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct  2 08:32:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:40.500 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5acc4e13-8c6e-4394-9c28-a6a8947d230e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:32:40 np0005465988 systemd-udevd[290239]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:32:40 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:40Z|00570|binding|INFO|Setting lport 26fd7867-5e43-40ee-bb0a-95d52010310c up in Southbound
Oct  2 08:32:40 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:40Z|00571|binding|INFO|Setting lport 26fd7867-5e43-40ee-bb0a-95d52010310c ovn-installed in OVS
Oct  2 08:32:40 np0005465988 nova_compute[236126]: 2025-10-02 12:32:40.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:40 np0005465988 nova_compute[236126]: 2025-10-02 12:32:40.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:40 np0005465988 systemd-machined[192594]: New machine qemu-55-instance-0000007c.
Oct  2 08:32:40 np0005465988 NetworkManager[45041]: <info>  [1759408360.5299] device (tap26fd7867-5e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:32:40 np0005465988 NetworkManager[45041]: <info>  [1759408360.5309] device (tap26fd7867-5e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:32:40 np0005465988 systemd[1]: Started Virtual Machine qemu-55-instance-0000007c.
Oct  2 08:32:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e322 e322: 3 total, 3 up, 3 in
Oct  2 08:32:41 np0005465988 nova_compute[236126]: 2025-10-02 12:32:41.085 2 DEBUG nova.network.neutron [req-bd870276-e5e5-4185-806b-1cba1f9dfe35 req-2a92968d-64ab-4a2d-b2cb-5f9445699e25 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updated VIF entry in instance network info cache for port 26fd7867-5e43-40ee-bb0a-95d52010310c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:32:41 np0005465988 nova_compute[236126]: 2025-10-02 12:32:41.086 2 DEBUG nova.network.neutron [req-bd870276-e5e5-4185-806b-1cba1f9dfe35 req-2a92968d-64ab-4a2d-b2cb-5f9445699e25 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updating instance_info_cache with network_info: [{"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:32:41 np0005465988 nova_compute[236126]: 2025-10-02 12:32:41.111 2 DEBUG oslo_concurrency.lockutils [req-bd870276-e5e5-4185-806b-1cba1f9dfe35 req-2a92968d-64ab-4a2d-b2cb-5f9445699e25 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:32:41 np0005465988 nova_compute[236126]: 2025-10-02 12:32:41.717 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408361.7167697, 11c595fe-756a-4f19-8c39-0c834af96d6a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:32:41 np0005465988 nova_compute[236126]: 2025-10-02 12:32:41.718 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] VM Started (Lifecycle Event)
Oct  2 08:32:41 np0005465988 nova_compute[236126]: 2025-10-02 12:32:41.743 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:32:41 np0005465988 nova_compute[236126]: 2025-10-02 12:32:41.750 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408361.7169042, 11c595fe-756a-4f19-8c39-0c834af96d6a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:32:41 np0005465988 nova_compute[236126]: 2025-10-02 12:32:41.751 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] VM Paused (Lifecycle Event)
Oct  2 08:32:41 np0005465988 nova_compute[236126]: 2025-10-02 12:32:41.772 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:32:41 np0005465988 nova_compute[236126]: 2025-10-02 12:32:41.777 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:32:41 np0005465988 nova_compute[236126]: 2025-10-02 12:32:41.796 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:32:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:41.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.055 2 DEBUG nova.compute.manager [req-c67e2e96-0c23-43af-96c4-98585faeb378 req-40f2208a-d89f-42d6-bb8e-9e4625e60baf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.055 2 DEBUG oslo_concurrency.lockutils [req-c67e2e96-0c23-43af-96c4-98585faeb378 req-40f2208a-d89f-42d6-bb8e-9e4625e60baf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.055 2 DEBUG oslo_concurrency.lockutils [req-c67e2e96-0c23-43af-96c4-98585faeb378 req-40f2208a-d89f-42d6-bb8e-9e4625e60baf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.055 2 DEBUG oslo_concurrency.lockutils [req-c67e2e96-0c23-43af-96c4-98585faeb378 req-40f2208a-d89f-42d6-bb8e-9e4625e60baf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.056 2 DEBUG nova.compute.manager [req-c67e2e96-0c23-43af-96c4-98585faeb378 req-40f2208a-d89f-42d6-bb8e-9e4625e60baf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Processing event network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.056 2 DEBUG nova.compute.manager [req-c67e2e96-0c23-43af-96c4-98585faeb378 req-40f2208a-d89f-42d6-bb8e-9e4625e60baf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.056 2 DEBUG oslo_concurrency.lockutils [req-c67e2e96-0c23-43af-96c4-98585faeb378 req-40f2208a-d89f-42d6-bb8e-9e4625e60baf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.056 2 DEBUG oslo_concurrency.lockutils [req-c67e2e96-0c23-43af-96c4-98585faeb378 req-40f2208a-d89f-42d6-bb8e-9e4625e60baf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.056 2 DEBUG oslo_concurrency.lockutils [req-c67e2e96-0c23-43af-96c4-98585faeb378 req-40f2208a-d89f-42d6-bb8e-9e4625e60baf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.056 2 DEBUG nova.compute.manager [req-c67e2e96-0c23-43af-96c4-98585faeb378 req-40f2208a-d89f-42d6-bb8e-9e4625e60baf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] No waiting events found dispatching network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.056 2 WARNING nova.compute.manager [req-c67e2e96-0c23-43af-96c4-98585faeb378 req-40f2208a-d89f-42d6-bb8e-9e4625e60baf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received unexpected event network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c for instance with vm_state building and task_state spawning.
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.057 2 DEBUG nova.compute.manager [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.060 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408362.0600457, 11c595fe-756a-4f19-8c39-0c834af96d6a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.060 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] VM Resumed (Lifecycle Event)
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.062 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.068 2 INFO nova.virt.libvirt.driver [-] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Instance spawned successfully.
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.069 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.077 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.090 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:32:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:32:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.097 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.098 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.099 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.099 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.100 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.100 2 DEBUG nova.virt.libvirt.driver [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.112 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.153 2 INFO nova.compute.manager [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Took 8.30 seconds to spawn the instance on the hypervisor.
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.154 2 DEBUG nova.compute.manager [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.261 2 INFO nova.compute.manager [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Took 9.29 seconds to build instance.
Oct  2 08:32:42 np0005465988 nova_compute[236126]: 2025-10-02 12:32:42.286 2 DEBUG oslo_concurrency.lockutils [None req-2cf1088b-4db5-4cfc-b8a3-7a59b2f33c45 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.412s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:32:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:42.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:32:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:32:43 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:43.123 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:32:43 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:43.125 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 08:32:43 np0005465988 nova_compute[236126]: 2025-10-02 12:32:43.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:43 np0005465988 nova_compute[236126]: 2025-10-02 12:32:43.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:43 np0005465988 nova_compute[236126]: 2025-10-02 12:32:43.748 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408348.7468152, a94a24b4-e399-49e0-a52b-1383be9a816f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:32:43 np0005465988 nova_compute[236126]: 2025-10-02 12:32:43.748 2 INFO nova.compute.manager [-] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] VM Stopped (Lifecycle Event)
Oct  2 08:32:43 np0005465988 nova_compute[236126]: 2025-10-02 12:32:43.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:43 np0005465988 nova_compute[236126]: 2025-10-02 12:32:43.790 2 DEBUG nova.compute.manager [None req-6be89a7f-39a8-4ad9-995d-0d0f48fb46dc - - - - - -] [instance: a94a24b4-e399-49e0-a52b-1383be9a816f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:32:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:43.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:44 np0005465988 nova_compute[236126]: 2025-10-02 12:32:44.002 2 INFO nova.compute.manager [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Rescuing
Oct  2 08:32:44 np0005465988 nova_compute[236126]: 2025-10-02 12:32:44.002 2 DEBUG oslo_concurrency.lockutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Acquiring lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:32:44 np0005465988 nova_compute[236126]: 2025-10-02 12:32:44.002 2 DEBUG oslo_concurrency.lockutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Acquired lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:32:44 np0005465988 nova_compute[236126]: 2025-10-02 12:32:44.003 2 DEBUG nova.network.neutron [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:32:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:44.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:45 np0005465988 nova_compute[236126]: 2025-10-02 12:32:45.441 2 DEBUG nova.network.neutron [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updating instance_info_cache with network_info: [{"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:32:45 np0005465988 nova_compute[236126]: 2025-10-02 12:32:45.474 2 DEBUG oslo_concurrency.lockutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Releasing lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:32:45 np0005465988 nova_compute[236126]: 2025-10-02 12:32:45.743 2 DEBUG nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Oct  2 08:32:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:32:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:45.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:32:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:46.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:46 np0005465988 nova_compute[236126]: 2025-10-02 12:32:46.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:32:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:47.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:32:48.128 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:48 np0005465988 nova_compute[236126]: 2025-10-02 12:32:48.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:32:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:48.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:32:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e323 e323: 3 total, 3 up, 3 in
Oct  2 08:32:48 np0005465988 nova_compute[236126]: 2025-10-02 12:32:48.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:48 np0005465988 nova_compute[236126]: 2025-10-02 12:32:48.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:49.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:32:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:50.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:32:50 np0005465988 nova_compute[236126]: 2025-10-02 12:32:50.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:50 np0005465988 ovn_controller[132601]: 2025-10-02T12:32:50Z|00572|binding|INFO|Releasing lport 02b7597d-2fc1-4c56-8603-4dcb0c716c82 from this chassis (sb_readonly=0)
Oct  2 08:32:50 np0005465988 nova_compute[236126]: 2025-10-02 12:32:50.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e324 e324: 3 total, 3 up, 3 in
Oct  2 08:32:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:51.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:52.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:32:53 np0005465988 nova_compute[236126]: 2025-10-02 12:32:53.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:53 np0005465988 podman[290401]: 2025-10-02 12:32:53.567672439 +0000 UTC m=+0.081404743 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:32:53 np0005465988 podman[290400]: 2025-10-02 12:32:53.570528191 +0000 UTC m=+0.091515044 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:32:53 np0005465988 podman[290399]: 2025-10-02 12:32:53.625888263 +0000 UTC m=+0.147140383 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  2 08:32:53 np0005465988 nova_compute[236126]: 2025-10-02 12:32:53.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:54.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:54.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:54 np0005465988 nova_compute[236126]: 2025-10-02 12:32:54.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:54 np0005465988 nova_compute[236126]: 2025-10-02 12:32:54.508 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:54 np0005465988 nova_compute[236126]: 2025-10-02 12:32:54.509 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:54 np0005465988 nova_compute[236126]: 2025-10-02 12:32:54.510 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:54 np0005465988 nova_compute[236126]: 2025-10-02 12:32:54.510 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:32:54 np0005465988 nova_compute[236126]: 2025-10-02 12:32:54.510 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:32:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3040535' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.017 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.147 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.148 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.153 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000007c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.154 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000007c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.355 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.357 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3894MB free_disk=20.78473663330078GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.357 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.358 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.441 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 4297c5cd-77b6-4f80-a746-11b304df8c90 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.442 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 11c595fe-756a-4f19-8c39-0c834af96d6a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.442 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.443 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.496 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:55 np0005465988 nova_compute[236126]: 2025-10-02 12:32:55.799 2 DEBUG nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:32:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:56.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:32:56 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1348520795' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:32:56 np0005465988 nova_compute[236126]: 2025-10-02 12:32:56.055 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.560s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:56 np0005465988 nova_compute[236126]: 2025-10-02 12:32:56.063 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:32:56 np0005465988 nova_compute[236126]: 2025-10-02 12:32:56.079 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:32:56 np0005465988 nova_compute[236126]: 2025-10-02 12:32:56.124 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:32:56 np0005465988 nova_compute[236126]: 2025-10-02 12:32:56.125 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.767s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:56.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:32:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:32:58.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:58 np0005465988 nova_compute[236126]: 2025-10-02 12:32:58.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:32:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:32:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:32:58.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:32:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 e325: 3 total, 3 up, 3 in
Oct  2 08:32:58 np0005465988 nova_compute[236126]: 2025-10-02 12:32:58.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:59 np0005465988 nova_compute[236126]: 2025-10-02 12:32:59.124 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:59 np0005465988 nova_compute[236126]: 2025-10-02 12:32:59.125 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:59 np0005465988 nova_compute[236126]: 2025-10-02 12:32:59.126 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:59 np0005465988 nova_compute[236126]: 2025-10-02 12:32:59.126 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:33:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:00.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:00.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:00 np0005465988 nova_compute[236126]: 2025-10-02 12:33:00.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:02.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:02.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.194 2 DEBUG oslo_concurrency.lockutils [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.195 2 DEBUG oslo_concurrency.lockutils [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.196 2 DEBUG oslo_concurrency.lockutils [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.196 2 DEBUG oslo_concurrency.lockutils [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.196 2 DEBUG oslo_concurrency.lockutils [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.199 2 INFO nova.compute.manager [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Terminating instance#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.200 2 DEBUG nova.compute.manager [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:33:03 np0005465988 kernel: tap7cf26487-91 (unregistering): left promiscuous mode
Oct  2 08:33:03 np0005465988 NetworkManager[45041]: <info>  [1759408383.3546] device (tap7cf26487-91): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:33:03 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:03Z|00573|binding|INFO|Releasing lport 7cf26487-91ca-4d15-85f3-bb6a66393796 from this chassis (sb_readonly=0)
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:03 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:03Z|00574|binding|INFO|Setting lport 7cf26487-91ca-4d15-85f3-bb6a66393796 down in Southbound
Oct  2 08:33:03 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:03Z|00575|binding|INFO|Removing iface tap7cf26487-91 ovn-installed in OVS
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:03.373 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:9d:7c 10.100.0.5'], port_security=['fa:16:3e:60:9d:7c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4297c5cd-77b6-4f80-a746-11b304df8c90', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-585473f8-52e4-4e55-96df-8a236d361126', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5533aaac08cd4856af72ef4992bb5e76', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0a7e36b3-799e-47d8-a152-7f7146431afe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec297f04-3bda-490f-87d3-1f684caf96fd, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7cf26487-91ca-4d15-85f3-bb6a66393796) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:03.377 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7cf26487-91ca-4d15-85f3-bb6a66393796 in datapath 585473f8-52e4-4e55-96df-8a236d361126 unbound from our chassis#033[00m
Oct  2 08:33:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:03.381 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 585473f8-52e4-4e55-96df-8a236d361126, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:03.382 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3e05c0-0506-4efc-b026-a0d1b66ce83f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:03.383 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-585473f8-52e4-4e55-96df-8a236d361126 namespace which is not needed anymore#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:03 np0005465988 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000072.scope: Deactivated successfully.
Oct  2 08:33:03 np0005465988 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000072.scope: Consumed 18.831s CPU time.
Oct  2 08:33:03 np0005465988 systemd-machined[192594]: Machine qemu-52-instance-00000072 terminated.
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:03 np0005465988 neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126[285786]: [NOTICE]   (285790) : haproxy version is 2.8.14-c23fe91
Oct  2 08:33:03 np0005465988 neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126[285786]: [NOTICE]   (285790) : path to executable is /usr/sbin/haproxy
Oct  2 08:33:03 np0005465988 neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126[285786]: [WARNING]  (285790) : Exiting Master process...
Oct  2 08:33:03 np0005465988 neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126[285786]: [WARNING]  (285790) : Exiting Master process...
Oct  2 08:33:03 np0005465988 neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126[285786]: [ALERT]    (285790) : Current worker (285792) exited with code 143 (Terminated)
Oct  2 08:33:03 np0005465988 neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126[285786]: [WARNING]  (285790) : All workers exited. Exiting... (0)
Oct  2 08:33:03 np0005465988 systemd[1]: libpod-653d0ed8ee9c5dee144307e24200b614da90a0e4d3f5260a4a2f743d09f5d330.scope: Deactivated successfully.
Oct  2 08:33:03 np0005465988 podman[290536]: 2025-10-02 12:33:03.574140264 +0000 UTC m=+0.053100707 container died 653d0ed8ee9c5dee144307e24200b614da90a0e4d3f5260a4a2f743d09f5d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:33:03 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-653d0ed8ee9c5dee144307e24200b614da90a0e4d3f5260a4a2f743d09f5d330-userdata-shm.mount: Deactivated successfully.
Oct  2 08:33:03 np0005465988 systemd[1]: var-lib-containers-storage-overlay-c82c8be227a724991a81492b38ae49b66b42ac396b2712168bfc34953b7a2aa9-merged.mount: Deactivated successfully.
Oct  2 08:33:03 np0005465988 podman[290536]: 2025-10-02 12:33:03.618994675 +0000 UTC m=+0.097955098 container cleanup 653d0ed8ee9c5dee144307e24200b614da90a0e4d3f5260a4a2f743d09f5d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:33:03 np0005465988 systemd[1]: libpod-conmon-653d0ed8ee9c5dee144307e24200b614da90a0e4d3f5260a4a2f743d09f5d330.scope: Deactivated successfully.
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.646 2 INFO nova.virt.libvirt.driver [-] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Instance destroyed successfully.#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.648 2 DEBUG nova.objects.instance [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lazy-loading 'resources' on Instance uuid 4297c5cd-77b6-4f80-a746-11b304df8c90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.694 2 DEBUG nova.virt.libvirt.vif [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:29:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-0',id=114,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBEmvr2XQXDnaV0WQDbbXt57cEK6okdC4PHEYdjpQBx2HU9OQgvgRTm3sGWmsa/AInUTPV9ABsCq2lJ9PCqfb1WP51XCZeB9QBIxafEy8h788huF0550ajkopZIwmSLpiA==',key_name='tempest-keypair-425033456',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:31:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5533aaac08cd4856af72ef4992bb5e76',ramdisk_id='',reservation_id='r-w0tlxvyd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image
_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeMultiAttachTest-1564585024',owner_user_name='tempest-AttachVolumeMultiAttachTest-1564585024-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:31:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22d56fcd2a4b4851bfd126ae4548ee9b',uuid=4297c5cd-77b6-4f80-a746-11b304df8c90,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.695 2 DEBUG nova.network.os_vif_util [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converting VIF {"id": "7cf26487-91ca-4d15-85f3-bb6a66393796", "address": "fa:16:3e:60:9d:7c", "network": {"id": "585473f8-52e4-4e55-96df-8a236d361126", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1197534465-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5533aaac08cd4856af72ef4992bb5e76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cf26487-91", "ovs_interfaceid": "7cf26487-91ca-4d15-85f3-bb6a66393796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.696 2 DEBUG nova.network.os_vif_util [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:60:9d:7c,bridge_name='br-int',has_traffic_filtering=True,id=7cf26487-91ca-4d15-85f3-bb6a66393796,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cf26487-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.696 2 DEBUG os_vif [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:9d:7c,bridge_name='br-int',has_traffic_filtering=True,id=7cf26487-91ca-4d15-85f3-bb6a66393796,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cf26487-91') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.698 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7cf26487-91, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.703 2 INFO os_vif [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:9d:7c,bridge_name='br-int',has_traffic_filtering=True,id=7cf26487-91ca-4d15-85f3-bb6a66393796,network=Network(585473f8-52e4-4e55-96df-8a236d361126),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cf26487-91')#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.732 2 DEBUG nova.compute.manager [req-b271cf89-7780-49f3-92c5-5818b7dee33a req-fb280160-a057-4236-870c-2baa6ff592fd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-vif-unplugged-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.733 2 DEBUG oslo_concurrency.lockutils [req-b271cf89-7780-49f3-92c5-5818b7dee33a req-fb280160-a057-4236-870c-2baa6ff592fd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.734 2 DEBUG oslo_concurrency.lockutils [req-b271cf89-7780-49f3-92c5-5818b7dee33a req-fb280160-a057-4236-870c-2baa6ff592fd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.735 2 DEBUG oslo_concurrency.lockutils [req-b271cf89-7780-49f3-92c5-5818b7dee33a req-fb280160-a057-4236-870c-2baa6ff592fd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.736 2 DEBUG nova.compute.manager [req-b271cf89-7780-49f3-92c5-5818b7dee33a req-fb280160-a057-4236-870c-2baa6ff592fd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] No waiting events found dispatching network-vif-unplugged-7cf26487-91ca-4d15-85f3-bb6a66393796 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.736 2 DEBUG nova.compute.manager [req-b271cf89-7780-49f3-92c5-5818b7dee33a req-fb280160-a057-4236-870c-2baa6ff592fd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-vif-unplugged-7cf26487-91ca-4d15-85f3-bb6a66393796 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:33:03 np0005465988 podman[290575]: 2025-10-02 12:33:03.738725398 +0000 UTC m=+0.074521034 container remove 653d0ed8ee9c5dee144307e24200b614da90a0e4d3f5260a4a2f743d09f5d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:33:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:03.748 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[48267ec7-1b10-4d20-ad60-185ba9e03991]: (4, ('Thu Oct  2 12:33:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126 (653d0ed8ee9c5dee144307e24200b614da90a0e4d3f5260a4a2f743d09f5d330)\n653d0ed8ee9c5dee144307e24200b614da90a0e4d3f5260a4a2f743d09f5d330\nThu Oct  2 12:33:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-585473f8-52e4-4e55-96df-8a236d361126 (653d0ed8ee9c5dee144307e24200b614da90a0e4d3f5260a4a2f743d09f5d330)\n653d0ed8ee9c5dee144307e24200b614da90a0e4d3f5260a4a2f743d09f5d330\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:03.749 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3a8d976a-818f-4ebe-baa5-2b3e2f0bc7a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:03.750 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap585473f8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:03 np0005465988 kernel: tap585473f8-50: left promiscuous mode
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:03.769 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a3df8d3a-352a-4acf-94f1-9b478a546a4f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:03 np0005465988 nova_compute[236126]: 2025-10-02 12:33:03.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:03.809 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[11c36ac4-203a-48ab-b231-c77c99f8447b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:03.810 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a18023c0-5c28-49e9-ba72-10eea84deca7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:03.830 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[daa23b5f-7aba-4b89-b9d0-0522f7b8ee70]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624718, 'reachable_time': 33683, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290608, 'error': None, 'target': 'ovnmeta-585473f8-52e4-4e55-96df-8a236d361126', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:03 np0005465988 systemd[1]: run-netns-ovnmeta\x2d585473f8\x2d52e4\x2d4e55\x2d96df\x2d8a236d361126.mount: Deactivated successfully.
Oct  2 08:33:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:03.835 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-585473f8-52e4-4e55-96df-8a236d361126 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:33:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:03.836 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[3f326d7e-b047-4afc-b96b-55d74d245f77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:33:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:04.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:33:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:04.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:04 np0005465988 nova_compute[236126]: 2025-10-02 12:33:04.470 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:05 np0005465988 podman[290610]: 2025-10-02 12:33:05.546598309 +0000 UTC m=+0.075259106 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:33:05 np0005465988 nova_compute[236126]: 2025-10-02 12:33:05.725 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "5c61d077-c345-4b28-9942-624c141fc0a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:05 np0005465988 nova_compute[236126]: 2025-10-02 12:33:05.726 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "5c61d077-c345-4b28-9942-624c141fc0a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:05 np0005465988 nova_compute[236126]: 2025-10-02 12:33:05.976 2 DEBUG nova.compute.manager [req-8b28866d-bbd1-4a84-9aa7-c13c37787bcb req-9b7bab42-93c9-4141-807a-59460614fb48 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:05 np0005465988 nova_compute[236126]: 2025-10-02 12:33:05.977 2 DEBUG oslo_concurrency.lockutils [req-8b28866d-bbd1-4a84-9aa7-c13c37787bcb req-9b7bab42-93c9-4141-807a-59460614fb48 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:05 np0005465988 nova_compute[236126]: 2025-10-02 12:33:05.977 2 DEBUG oslo_concurrency.lockutils [req-8b28866d-bbd1-4a84-9aa7-c13c37787bcb req-9b7bab42-93c9-4141-807a-59460614fb48 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:05 np0005465988 nova_compute[236126]: 2025-10-02 12:33:05.977 2 DEBUG oslo_concurrency.lockutils [req-8b28866d-bbd1-4a84-9aa7-c13c37787bcb req-9b7bab42-93c9-4141-807a-59460614fb48 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:05 np0005465988 nova_compute[236126]: 2025-10-02 12:33:05.977 2 DEBUG nova.compute.manager [req-8b28866d-bbd1-4a84-9aa7-c13c37787bcb req-9b7bab42-93c9-4141-807a-59460614fb48 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] No waiting events found dispatching network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:05 np0005465988 nova_compute[236126]: 2025-10-02 12:33:05.978 2 WARNING nova.compute.manager [req-8b28866d-bbd1-4a84-9aa7-c13c37787bcb req-9b7bab42-93c9-4141-807a-59460614fb48 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received unexpected event network-vif-plugged-7cf26487-91ca-4d15-85f3-bb6a66393796 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:33:05 np0005465988 nova_compute[236126]: 2025-10-02 12:33:05.983 2 DEBUG nova.compute.manager [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:33:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:06.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:06 np0005465988 nova_compute[236126]: 2025-10-02 12:33:06.324 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:06 np0005465988 nova_compute[236126]: 2025-10-02 12:33:06.325 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:06 np0005465988 nova_compute[236126]: 2025-10-02 12:33:06.335 2 DEBUG nova.virt.hardware [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:33:06 np0005465988 nova_compute[236126]: 2025-10-02 12:33:06.336 2 INFO nova.compute.claims [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:33:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:06.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:33:06 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1162337237' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:33:06 np0005465988 nova_compute[236126]: 2025-10-02 12:33:06.854 2 DEBUG nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:33:06 np0005465988 nova_compute[236126]: 2025-10-02 12:33:06.921 2 DEBUG oslo_concurrency.processutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:33:07 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3341860301' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:33:07 np0005465988 nova_compute[236126]: 2025-10-02 12:33:07.442 2 DEBUG oslo_concurrency.processutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:07 np0005465988 nova_compute[236126]: 2025-10-02 12:33:07.451 2 DEBUG nova.compute.provider_tree [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:33:07 np0005465988 nova_compute[236126]: 2025-10-02 12:33:07.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:07 np0005465988 nova_compute[236126]: 2025-10-02 12:33:07.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:33:07 np0005465988 nova_compute[236126]: 2025-10-02 12:33:07.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:33:07 np0005465988 nova_compute[236126]: 2025-10-02 12:33:07.658 2 DEBUG nova.scheduler.client.report [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:33:07 np0005465988 nova_compute[236126]: 2025-10-02 12:33:07.664 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Oct  2 08:33:07 np0005465988 nova_compute[236126]: 2025-10-02 12:33:07.664 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:33:07 np0005465988 nova_compute[236126]: 2025-10-02 12:33:07.665 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:07 np0005465988 nova_compute[236126]: 2025-10-02 12:33:07.665 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:07 np0005465988 nova_compute[236126]: 2025-10-02 12:33:07.665 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:33:07 np0005465988 nova_compute[236126]: 2025-10-02 12:33:07.665 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 11c595fe-756a-4f19-8c39-0c834af96d6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:07 np0005465988 nova_compute[236126]: 2025-10-02 12:33:07.828 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:07 np0005465988 nova_compute[236126]: 2025-10-02 12:33:07.829 2 DEBUG nova.compute.manager [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:33:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:33:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:08.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.043 2 DEBUG nova.compute.manager [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.044 2 DEBUG nova.network.neutron [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.175 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "c5460257-b47c-4a1b-8e44-96ae657d6266" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.175 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.204 2 INFO nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.239 2 DEBUG nova.policy [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1c2fbed9aaf84b4e864db97bec4c797c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '385766b9209941f3ab805e8d5e2af163', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.278 2 DEBUG nova.compute.manager [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:33:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:08.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.490 2 DEBUG nova.compute.manager [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.687 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.688 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.696 2 DEBUG nova.virt.hardware [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.696 2 INFO nova.compute.claims [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.892 2 DEBUG nova.compute.manager [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.893 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.894 2 INFO nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Creating image(s)#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.921 2 DEBUG nova.storage.rbd_utils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] rbd image 5c61d077-c345-4b28-9942-624c141fc0a2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.957 2 DEBUG nova.storage.rbd_utils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] rbd image 5c61d077-c345-4b28-9942-624c141fc0a2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:08 np0005465988 nova_compute[236126]: 2025-10-02 12:33:08.996 2 DEBUG nova.storage.rbd_utils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] rbd image 5c61d077-c345-4b28-9942-624c141fc0a2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.000 2 DEBUG oslo_concurrency.processutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.084 2 DEBUG oslo_concurrency.processutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.086 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "5133c8c7459ce4fa1cf043a638fc1b5c66ed8609" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.087 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "5133c8c7459ce4fa1cf043a638fc1b5c66ed8609" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.088 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "5133c8c7459ce4fa1cf043a638fc1b5c66ed8609" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.135 2 DEBUG nova.storage.rbd_utils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] rbd image 5c61d077-c345-4b28-9942-624c141fc0a2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.141 2 DEBUG oslo_concurrency.processutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 5c61d077-c345-4b28-9942-624c141fc0a2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.203 2 INFO nova.virt.libvirt.driver [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Deleting instance files /var/lib/nova/instances/4297c5cd-77b6-4f80-a746-11b304df8c90_del#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.205 2 INFO nova.virt.libvirt.driver [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Deletion of /var/lib/nova/instances/4297c5cd-77b6-4f80-a746-11b304df8c90_del complete#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.263 2 DEBUG oslo_concurrency.processutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.302 2 DEBUG nova.network.neutron [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Successfully created port: 61ec6de3-7b6b-4f24-bf93-ce21a666d398 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.390 2 INFO nova.compute.manager [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Took 6.19 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.391 2 DEBUG oslo.service.loopingcall [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.392 2 DEBUG nova.compute.manager [-] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.392 2 DEBUG nova.network.neutron [-] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.667 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updating instance_info_cache with network_info: [{"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.675 2 DEBUG oslo_concurrency.processutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 5c61d077-c345-4b28-9942-624c141fc0a2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:33:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2147367564' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.765 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.766 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.773 2 DEBUG nova.storage.rbd_utils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] resizing rbd image 5c61d077-c345-4b28-9942-624c141fc0a2_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.821 2 DEBUG oslo_concurrency.processutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.558s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.829 2 DEBUG nova.compute.provider_tree [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:33:09 np0005465988 nova_compute[236126]: 2025-10-02 12:33:09.880 2 DEBUG nova.scheduler.client.report [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:33:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:10.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.027 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.028 2 DEBUG nova.compute.manager [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.036 2 DEBUG nova.objects.instance [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lazy-loading 'migration_context' on Instance uuid 5c61d077-c345-4b28-9942-624c141fc0a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:10 np0005465988 kernel: tap26fd7867-5e (unregistering): left promiscuous mode
Oct  2 08:33:10 np0005465988 NetworkManager[45041]: <info>  [1759408390.0422] device (tap26fd7867-5e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:10Z|00576|binding|INFO|Releasing lport 26fd7867-5e43-40ee-bb0a-95d52010310c from this chassis (sb_readonly=0)
Oct  2 08:33:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:10Z|00577|binding|INFO|Setting lport 26fd7867-5e43-40ee-bb0a-95d52010310c down in Southbound
Oct  2 08:33:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:10Z|00578|binding|INFO|Removing iface tap26fd7867-5e ovn-installed in OVS
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:10.115 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:14:b0 10.100.0.11'], port_security=['fa:16:3e:80:14:b0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '11c595fe-756a-4f19-8c39-0c834af96d6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c4ff335-9221-4a73-8694-cb9e35a2f586', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f2fdda5532bd4487b413e696cfbf1197', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5f941b67-d201-4a3f-bc1a-38e632bfe938', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b7aba6e6-d0b4-4668-8db9-bd9393b1e55e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=26fd7867-5e43-40ee-bb0a-95d52010310c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:10.117 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 26fd7867-5e43-40ee-bb0a-95d52010310c in datapath 4c4ff335-9221-4a73-8694-cb9e35a2f586 unbound from our chassis#033[00m
Oct  2 08:33:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:10.118 142124 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4c4ff335-9221-4a73-8694-cb9e35a2f586 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:33:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:10.120 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f9ff350b-5948-42b1-b2ba-c218d9a2a506]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.124 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.125 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Ensure instance console log exists: /var/lib/nova/instances/5c61d077-c345-4b28-9942-624c141fc0a2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.126 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.126 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.126 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:10 np0005465988 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Oct  2 08:33:10 np0005465988 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d0000007c.scope: Consumed 14.849s CPU time.
Oct  2 08:33:10 np0005465988 systemd-machined[192594]: Machine qemu-55-instance-0000007c terminated.
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.210 2 DEBUG nova.compute.manager [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.210 2 DEBUG nova.network.neutron [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.265 2 INFO nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.320 2 DEBUG nova.compute.manager [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.368 2 DEBUG nova.policy [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1c2fbed9aaf84b4e864db97bec4c797c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '385766b9209941f3ab805e8d5e2af163', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.397 2 DEBUG nova.compute.manager [req-affe37fb-1fca-4f0a-9afc-bb63bb127760 req-5af01060-fb0b-4d61-a325-5a5e2106ea9e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-vif-unplugged-26fd7867-5e43-40ee-bb0a-95d52010310c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.398 2 DEBUG oslo_concurrency.lockutils [req-affe37fb-1fca-4f0a-9afc-bb63bb127760 req-5af01060-fb0b-4d61-a325-5a5e2106ea9e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.398 2 DEBUG oslo_concurrency.lockutils [req-affe37fb-1fca-4f0a-9afc-bb63bb127760 req-5af01060-fb0b-4d61-a325-5a5e2106ea9e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.398 2 DEBUG oslo_concurrency.lockutils [req-affe37fb-1fca-4f0a-9afc-bb63bb127760 req-5af01060-fb0b-4d61-a325-5a5e2106ea9e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.398 2 DEBUG nova.compute.manager [req-affe37fb-1fca-4f0a-9afc-bb63bb127760 req-5af01060-fb0b-4d61-a325-5a5e2106ea9e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] No waiting events found dispatching network-vif-unplugged-26fd7867-5e43-40ee-bb0a-95d52010310c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.399 2 WARNING nova.compute.manager [req-affe37fb-1fca-4f0a-9afc-bb63bb127760 req-5af01060-fb0b-4d61-a325-5a5e2106ea9e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received unexpected event network-vif-unplugged-26fd7867-5e43-40ee-bb0a-95d52010310c for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:33:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:10.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.492 2 DEBUG nova.compute.manager [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.494 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.495 2 INFO nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Creating image(s)#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.524 2 DEBUG nova.storage.rbd_utils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] rbd image c5460257-b47c-4a1b-8e44-96ae657d6266_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.554 2 DEBUG nova.storage.rbd_utils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] rbd image c5460257-b47c-4a1b-8e44-96ae657d6266_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.592 2 DEBUG nova.storage.rbd_utils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] rbd image c5460257-b47c-4a1b-8e44-96ae657d6266_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.598 2 DEBUG oslo_concurrency.processutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.648 2 DEBUG nova.network.neutron [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Successfully updated port: 61ec6de3-7b6b-4f24-bf93-ce21a666d398 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.669 2 DEBUG nova.compute.manager [req-44748b49-4c12-4ff3-b043-08a36ddb8ab9 req-f1f41c08-c6eb-4748-9a91-c77804ed4d08 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Received event network-vif-deleted-7cf26487-91ca-4d15-85f3-bb6a66393796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.669 2 INFO nova.compute.manager [req-44748b49-4c12-4ff3-b043-08a36ddb8ab9 req-f1f41c08-c6eb-4748-9a91-c77804ed4d08 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Neutron deleted interface 7cf26487-91ca-4d15-85f3-bb6a66393796; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.670 2 DEBUG nova.network.neutron [req-44748b49-4c12-4ff3-b043-08a36ddb8ab9 req-f1f41c08-c6eb-4748-9a91-c77804ed4d08 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.690 2 DEBUG oslo_concurrency.processutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.691 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.691 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.692 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.720 2 DEBUG nova.storage.rbd_utils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] rbd image c5460257-b47c-4a1b-8e44-96ae657d6266_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.725 2 DEBUG oslo_concurrency.processutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 c5460257-b47c-4a1b-8e44-96ae657d6266_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.767 2 DEBUG nova.network.neutron [-] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.770 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "refresh_cache-5c61d077-c345-4b28-9942-624c141fc0a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.771 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquired lock "refresh_cache-5c61d077-c345-4b28-9942-624c141fc0a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.771 2 DEBUG nova.network.neutron [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.778 2 DEBUG nova.compute.manager [req-44748b49-4c12-4ff3-b043-08a36ddb8ab9 req-f1f41c08-c6eb-4748-9a91-c77804ed4d08 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Detach interface failed, port_id=7cf26487-91ca-4d15-85f3-bb6a66393796, reason: Instance 4297c5cd-77b6-4f80-a746-11b304df8c90 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.796 2 INFO nova.compute.manager [-] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Took 1.40 seconds to deallocate network for instance.#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.880 2 INFO nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Instance shutdown successfully after 25 seconds.#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.889 2 INFO nova.virt.libvirt.driver [-] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Instance destroyed successfully.#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.890 2 DEBUG nova.objects.instance [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lazy-loading 'numa_topology' on Instance uuid 11c595fe-756a-4f19-8c39-0c834af96d6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.966 2 INFO nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Attempting rescue#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.967 2 DEBUG nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.971 2 DEBUG nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.971 2 INFO nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Creating image(s)#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.994 2 DEBUG nova.storage.rbd_utils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] rbd image 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:10 np0005465988 nova_compute[236126]: 2025-10-02 12:33:10.997 2 DEBUG nova.objects.instance [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 11c595fe-756a-4f19-8c39-0c834af96d6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:11 np0005465988 nova_compute[236126]: 2025-10-02 12:33:11.046 2 DEBUG oslo_concurrency.lockutils [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:11 np0005465988 nova_compute[236126]: 2025-10-02 12:33:11.047 2 DEBUG oslo_concurrency.lockutils [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:11 np0005465988 nova_compute[236126]: 2025-10-02 12:33:11.076 2 DEBUG nova.storage.rbd_utils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] rbd image 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:11 np0005465988 nova_compute[236126]: 2025-10-02 12:33:11.117 2 DEBUG nova.storage.rbd_utils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] rbd image 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:11 np0005465988 nova_compute[236126]: 2025-10-02 12:33:11.127 2 DEBUG oslo_concurrency.processutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:11 np0005465988 nova_compute[236126]: 2025-10-02 12:33:11.176 2 DEBUG nova.network.neutron [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:33:11 np0005465988 nova_compute[236126]: 2025-10-02 12:33:11.232 2 DEBUG oslo_concurrency.processutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:11 np0005465988 nova_compute[236126]: 2025-10-02 12:33:11.233 2 DEBUG oslo_concurrency.lockutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:11 np0005465988 nova_compute[236126]: 2025-10-02 12:33:11.234 2 DEBUG oslo_concurrency.lockutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:11 np0005465988 nova_compute[236126]: 2025-10-02 12:33:11.234 2 DEBUG oslo_concurrency.lockutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:11 np0005465988 nova_compute[236126]: 2025-10-02 12:33:11.265 2 DEBUG nova.storage.rbd_utils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] rbd image 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:11 np0005465988 nova_compute[236126]: 2025-10-02 12:33:11.271 2 DEBUG oslo_concurrency.processutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:11 np0005465988 nova_compute[236126]: 2025-10-02 12:33:11.795 2 DEBUG oslo_concurrency.processutils [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:12.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.196 2 DEBUG nova.network.neutron [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Successfully created port: 717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.232 2 DEBUG oslo_concurrency.processutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 c5460257-b47c-4a1b-8e44-96ae657d6266_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:33:12 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1281428022' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.317 2 DEBUG oslo_concurrency.processutils [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.324 2 DEBUG nova.storage.rbd_utils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] resizing rbd image c5460257-b47c-4a1b-8e44-96ae657d6266_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.363 2 DEBUG nova.compute.provider_tree [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.379 2 DEBUG nova.scheduler.client.report [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:33:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:12.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.437 2 DEBUG oslo_concurrency.lockutils [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.444 2 DEBUG nova.objects.instance [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lazy-loading 'migration_context' on Instance uuid c5460257-b47c-4a1b-8e44-96ae657d6266 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.460 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.460 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Ensure instance console log exists: /var/lib/nova/instances/c5460257-b47c-4a1b-8e44-96ae657d6266/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.461 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.461 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.461 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.489 2 INFO nova.scheduler.client.report [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Deleted allocations for instance 4297c5cd-77b6-4f80-a746-11b304df8c90#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.513 2 DEBUG nova.compute.manager [req-02da10fa-1362-4eb4-a53c-89d0790cb51e req-d3f1e5db-1f59-46b5-be7f-2a9915860f7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.514 2 DEBUG oslo_concurrency.lockutils [req-02da10fa-1362-4eb4-a53c-89d0790cb51e req-d3f1e5db-1f59-46b5-be7f-2a9915860f7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.514 2 DEBUG oslo_concurrency.lockutils [req-02da10fa-1362-4eb4-a53c-89d0790cb51e req-d3f1e5db-1f59-46b5-be7f-2a9915860f7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.514 2 DEBUG oslo_concurrency.lockutils [req-02da10fa-1362-4eb4-a53c-89d0790cb51e req-d3f1e5db-1f59-46b5-be7f-2a9915860f7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.515 2 DEBUG nova.compute.manager [req-02da10fa-1362-4eb4-a53c-89d0790cb51e req-d3f1e5db-1f59-46b5-be7f-2a9915860f7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] No waiting events found dispatching network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.515 2 WARNING nova.compute.manager [req-02da10fa-1362-4eb4-a53c-89d0790cb51e req-d3f1e5db-1f59-46b5-be7f-2a9915860f7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received unexpected event network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.554 2 DEBUG oslo_concurrency.processutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.283s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.554 2 DEBUG nova.objects.instance [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lazy-loading 'migration_context' on Instance uuid 11c595fe-756a-4f19-8c39-0c834af96d6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.570 2 DEBUG nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.571 2 DEBUG nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Start _get_guest_xml network_info=[{"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "vif_mac": "fa:16:3e:80:14:b0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.571 2 DEBUG nova.objects.instance [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lazy-loading 'resources' on Instance uuid 11c595fe-756a-4f19-8c39-0c834af96d6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.582 2 DEBUG oslo_concurrency.lockutils [None req-9a6ccab4-a5e2-402d-a52c-2e31e227d5d4 22d56fcd2a4b4851bfd126ae4548ee9b 5533aaac08cd4856af72ef4992bb5e76 - - default default] Lock "4297c5cd-77b6-4f80-a746-11b304df8c90" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.589 2 WARNING nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.597 2 DEBUG nova.virt.libvirt.host [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.597 2 DEBUG nova.virt.libvirt.host [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.601 2 DEBUG nova.virt.libvirt.host [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.601 2 DEBUG nova.virt.libvirt.host [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.603 2 DEBUG nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.603 2 DEBUG nova.virt.hardware [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.604 2 DEBUG nova.virt.hardware [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.604 2 DEBUG nova.virt.hardware [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.604 2 DEBUG nova.virt.hardware [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.605 2 DEBUG nova.virt.hardware [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.605 2 DEBUG nova.virt.hardware [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.605 2 DEBUG nova.virt.hardware [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.605 2 DEBUG nova.virt.hardware [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.606 2 DEBUG nova.virt.hardware [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.606 2 DEBUG nova.virt.hardware [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.606 2 DEBUG nova.virt.hardware [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.606 2 DEBUG nova.objects.instance [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 11c595fe-756a-4f19-8c39-0c834af96d6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.619 2 DEBUG oslo_concurrency.processutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.765 2 DEBUG nova.compute.manager [req-e771a085-8bd7-4877-895d-7521655dce53 req-0e8084f4-29f0-41ce-9f22-90eb694e9999 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Received event network-changed-61ec6de3-7b6b-4f24-bf93-ce21a666d398 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.767 2 DEBUG nova.compute.manager [req-e771a085-8bd7-4877-895d-7521655dce53 req-0e8084f4-29f0-41ce-9f22-90eb694e9999 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Refreshing instance network info cache due to event network-changed-61ec6de3-7b6b-4f24-bf93-ce21a666d398. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:33:12 np0005465988 nova_compute[236126]: 2025-10-02 12:33:12.769 2 DEBUG oslo_concurrency.lockutils [req-e771a085-8bd7-4877-895d-7521655dce53 req-0e8084f4-29f0-41ce-9f22-90eb694e9999 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-5c61d077-c345-4b28-9942-624c141fc0a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:33:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:33:13 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3074780554' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.117 2 DEBUG oslo_concurrency.processutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.122 2 DEBUG oslo_concurrency.processutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.189 2 DEBUG nova.network.neutron [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Updating instance_info_cache with network_info: [{"id": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "address": "fa:16:3e:d2:5a:15", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61ec6de3-7b", "ovs_interfaceid": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.208 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Releasing lock "refresh_cache-5c61d077-c345-4b28-9942-624c141fc0a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.209 2 DEBUG nova.compute.manager [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Instance network_info: |[{"id": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "address": "fa:16:3e:d2:5a:15", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61ec6de3-7b", "ovs_interfaceid": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.209 2 DEBUG oslo_concurrency.lockutils [req-e771a085-8bd7-4877-895d-7521655dce53 req-0e8084f4-29f0-41ce-9f22-90eb694e9999 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-5c61d077-c345-4b28-9942-624c141fc0a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.210 2 DEBUG nova.network.neutron [req-e771a085-8bd7-4877-895d-7521655dce53 req-0e8084f4-29f0-41ce-9f22-90eb694e9999 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Refreshing network info cache for port 61ec6de3-7b6b-4f24-bf93-ce21a666d398 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.212 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Start _get_guest_xml network_info=[{"id": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "address": "fa:16:3e:d2:5a:15", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61ec6de3-7b", "ovs_interfaceid": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:47Z,direct_url=<?>,disk_format='qcow2',id=db05f54c-61f8-42d6-a1e2-da3219a77b12,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'db05f54c-61f8-42d6-a1e2-da3219a77b12'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.219 2 WARNING nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.227 2 DEBUG nova.virt.libvirt.host [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.227 2 DEBUG nova.virt.libvirt.host [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.230 2 DEBUG nova.virt.libvirt.host [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.231 2 DEBUG nova.virt.libvirt.host [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.232 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.232 2 DEBUG nova.virt.hardware [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:47Z,direct_url=<?>,disk_format='qcow2',id=db05f54c-61f8-42d6-a1e2-da3219a77b12,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.233 2 DEBUG nova.virt.hardware [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.233 2 DEBUG nova.virt.hardware [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.233 2 DEBUG nova.virt.hardware [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.233 2 DEBUG nova.virt.hardware [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.234 2 DEBUG nova.virt.hardware [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.234 2 DEBUG nova.virt.hardware [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.234 2 DEBUG nova.virt.hardware [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.234 2 DEBUG nova.virt.hardware [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.234 2 DEBUG nova.virt.hardware [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.235 2 DEBUG nova.virt.hardware [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.239 2 DEBUG oslo_concurrency.processutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.350 2 DEBUG nova.network.neutron [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Successfully updated port: 717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.363 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "refresh_cache-c5460257-b47c-4a1b-8e44-96ae657d6266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.363 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquired lock "refresh_cache-c5460257-b47c-4a1b-8e44-96ae657d6266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.363 2 DEBUG nova.network.neutron [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.448 2 DEBUG nova.compute.manager [req-8502a0b5-1a84-4221-9e0a-9206a7ee32bb req-3a22e4a3-50c1-41c1-8f14-ebff3596b0a6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Received event network-changed-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.449 2 DEBUG nova.compute.manager [req-8502a0b5-1a84-4221-9e0a-9206a7ee32bb req-3a22e4a3-50c1-41c1-8f14-ebff3596b0a6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Refreshing instance network info cache due to event network-changed-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.449 2 DEBUG oslo_concurrency.lockutils [req-8502a0b5-1a84-4221-9e0a-9206a7ee32bb req-3a22e4a3-50c1-41c1-8f14-ebff3596b0a6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-c5460257-b47c-4a1b-8e44-96ae657d6266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.591 2 DEBUG nova.network.neutron [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:33:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:33:13 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2872398020' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.619 2 DEBUG oslo_concurrency.processutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.626 2 DEBUG oslo_concurrency.processutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:33:13 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1295974800' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.702 2 DEBUG oslo_concurrency.processutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.740 2 DEBUG nova.storage.rbd_utils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] rbd image 5c61d077-c345-4b28-9942-624c141fc0a2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.745 2 DEBUG oslo_concurrency.processutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:13 np0005465988 nova_compute[236126]: 2025-10-02 12:33:13.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:33:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:14.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:33:14 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1855908522' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.202 2 DEBUG oslo_concurrency.processutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.577s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.204 2 DEBUG nova.virt.libvirt.vif [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-530066287',display_name='tempest-ServerRescueTestJSONUnderV235-server-530066287',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-530066287',id=124,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:32:42Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f2fdda5532bd4487b413e696cfbf1197',ramdisk_id='',reservation_id='r-icfttuz3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1167302845',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1167302845-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:42Z,user_data=None,user_id='95612007183445418f12dc53405b3e7b',uuid=11c595fe-756a-4f19-8c39-0c834af96d6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "vif_mac": "fa:16:3e:80:14:b0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.205 2 DEBUG nova.network.os_vif_util [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Converting VIF {"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "vif_mac": "fa:16:3e:80:14:b0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.205 2 DEBUG nova.network.os_vif_util [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:80:14:b0,bridge_name='br-int',has_traffic_filtering=True,id=26fd7867-5e43-40ee-bb0a-95d52010310c,network=Network(4c4ff335-9221-4a73-8694-cb9e35a2f586),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26fd7867-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.207 2 DEBUG nova.objects.instance [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lazy-loading 'pci_devices' on Instance uuid 11c595fe-756a-4f19-8c39-0c834af96d6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.225 2 DEBUG nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <uuid>11c595fe-756a-4f19-8c39-0c834af96d6a</uuid>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <name>instance-0000007c</name>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-530066287</nova:name>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:33:12</nova:creationTime>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:user uuid="95612007183445418f12dc53405b3e7b">tempest-ServerRescueTestJSONUnderV235-1167302845-project-member</nova:user>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:project uuid="f2fdda5532bd4487b413e696cfbf1197">tempest-ServerRescueTestJSONUnderV235-1167302845</nova:project>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:port uuid="26fd7867-5e43-40ee-bb0a-95d52010310c">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <entry name="serial">11c595fe-756a-4f19-8c39-0c834af96d6a</entry>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <entry name="uuid">11c595fe-756a-4f19-8c39-0c834af96d6a</entry>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/11c595fe-756a-4f19-8c39-0c834af96d6a_disk.rescue">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/11c595fe-756a-4f19-8c39-0c834af96d6a_disk">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <target dev="vdb" bus="virtio"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/11c595fe-756a-4f19-8c39-0c834af96d6a_disk.config.rescue">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:80:14:b0"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <target dev="tap26fd7867-5e"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/console.log" append="off"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:33:14 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:33:14 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.238 2 INFO nova.virt.libvirt.driver [-] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Instance destroyed successfully.#033[00m
Oct  2 08:33:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:33:14 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3273742260' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.268 2 DEBUG oslo_concurrency.processutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.269 2 DEBUG nova.virt.libvirt.vif [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:33:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1879466512',display_name='tempest-ListServerFiltersTestJSON-instance-1879466512',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1879466512',id=128,image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='385766b9209941f3ab805e8d5e2af163',ramdisk_id='',reservation_id='r-a5ez8lhs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-542915701',owner_user_name='tempest-ListServerFiltersTestJSON-542915701-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:33:08Z,user_data=None,user_id='1c2fbed9aaf84b4e864db97bec4c797c',uuid=5c61d077-c345-4b28-9942-624c141fc0a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "address": "fa:16:3e:d2:5a:15", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61ec6de3-7b", "ovs_interfaceid": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.270 2 DEBUG nova.network.os_vif_util [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Converting VIF {"id": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "address": "fa:16:3e:d2:5a:15", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61ec6de3-7b", "ovs_interfaceid": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.270 2 DEBUG nova.network.os_vif_util [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:5a:15,bridge_name='br-int',has_traffic_filtering=True,id=61ec6de3-7b6b-4f24-bf93-ce21a666d398,network=Network(9f3d344f-7e5f-4676-877b-da313e338dc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61ec6de3-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.271 2 DEBUG nova.objects.instance [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5c61d077-c345-4b28-9942-624c141fc0a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.292 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <uuid>5c61d077-c345-4b28-9942-624c141fc0a2</uuid>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <name>instance-00000080</name>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1879466512</nova:name>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:33:13</nova:creationTime>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:user uuid="1c2fbed9aaf84b4e864db97bec4c797c">tempest-ListServerFiltersTestJSON-542915701-project-member</nova:user>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:project uuid="385766b9209941f3ab805e8d5e2af163">tempest-ListServerFiltersTestJSON-542915701</nova:project>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="db05f54c-61f8-42d6-a1e2-da3219a77b12"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <nova:port uuid="61ec6de3-7b6b-4f24-bf93-ce21a666d398">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <entry name="serial">5c61d077-c345-4b28-9942-624c141fc0a2</entry>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <entry name="uuid">5c61d077-c345-4b28-9942-624c141fc0a2</entry>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/5c61d077-c345-4b28-9942-624c141fc0a2_disk">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/5c61d077-c345-4b28-9942-624c141fc0a2_disk.config">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:d2:5a:15"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <target dev="tap61ec6de3-7b"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/5c61d077-c345-4b28-9942-624c141fc0a2/console.log" append="off"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:33:14 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:33:14 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:33:14 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:33:14 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.293 2 DEBUG nova.compute.manager [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Preparing to wait for external event network-vif-plugged-61ec6de3-7b6b-4f24-bf93-ce21a666d398 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.293 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "5c61d077-c345-4b28-9942-624c141fc0a2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.294 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "5c61d077-c345-4b28-9942-624c141fc0a2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.294 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "5c61d077-c345-4b28-9942-624c141fc0a2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.294 2 DEBUG nova.virt.libvirt.vif [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:33:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1879466512',display_name='tempest-ListServerFiltersTestJSON-instance-1879466512',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1879466512',id=128,image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='385766b9209941f3ab805e8d5e2af163',ramdisk_id='',reservation_id='r-a5ez8lhs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-542915701',owner_user_name='tempest-ListServerFiltersTestJSON-542915701-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:33:08Z,user_data=None,user_id='1c2fbed9aaf84b4e864db97bec4c797c',uuid=5c61d077-c345-4b28-9942-624c141fc0a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "address": "fa:16:3e:d2:5a:15", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61ec6de3-7b", "ovs_interfaceid": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.295 2 DEBUG nova.network.os_vif_util [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Converting VIF {"id": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "address": "fa:16:3e:d2:5a:15", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61ec6de3-7b", "ovs_interfaceid": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.296 2 DEBUG nova.network.os_vif_util [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:5a:15,bridge_name='br-int',has_traffic_filtering=True,id=61ec6de3-7b6b-4f24-bf93-ce21a666d398,network=Network(9f3d344f-7e5f-4676-877b-da313e338dc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61ec6de3-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.296 2 DEBUG os_vif [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:5a:15,bridge_name='br-int',has_traffic_filtering=True,id=61ec6de3-7b6b-4f24-bf93-ce21a666d398,network=Network(9f3d344f-7e5f-4676-877b-da313e338dc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61ec6de3-7b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.297 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.297 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.301 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61ec6de3-7b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.301 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap61ec6de3-7b, col_values=(('external_ids', {'iface-id': '61ec6de3-7b6b-4f24-bf93-ce21a666d398', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d2:5a:15', 'vm-uuid': '5c61d077-c345-4b28-9942-624c141fc0a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:14 np0005465988 NetworkManager[45041]: <info>  [1759408394.3358] manager: (tap61ec6de3-7b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.345 2 INFO os_vif [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:5a:15,bridge_name='br-int',has_traffic_filtering=True,id=61ec6de3-7b6b-4f24-bf93-ce21a666d398,network=Network(9f3d344f-7e5f-4676-877b-da313e338dc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61ec6de3-7b')#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.370 2 DEBUG nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.371 2 DEBUG nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.371 2 DEBUG nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.371 2 DEBUG nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] No VIF found with MAC fa:16:3e:80:14:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.372 2 INFO nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Using config drive#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.404 2 DEBUG nova.storage.rbd_utils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] rbd image 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:14.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.432 2 DEBUG nova.objects.instance [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 11c595fe-756a-4f19-8c39-0c834af96d6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.485 2 DEBUG nova.objects.instance [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lazy-loading 'keypairs' on Instance uuid 11c595fe-756a-4f19-8c39-0c834af96d6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.489 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.489 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.489 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] No VIF found with MAC fa:16:3e:d2:5a:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.490 2 INFO nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Using config drive#033[00m
Oct  2 08:33:14 np0005465988 nova_compute[236126]: 2025-10-02 12:33:14.615 2 DEBUG nova.storage.rbd_utils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] rbd image 5c61d077-c345-4b28-9942-624c141fc0a2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.010 2 DEBUG nova.network.neutron [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Updating instance_info_cache with network_info: [{"id": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "address": "fa:16:3e:67:e4:be", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap717a51c1-3d", "ovs_interfaceid": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.047 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Releasing lock "refresh_cache-c5460257-b47c-4a1b-8e44-96ae657d6266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.048 2 DEBUG nova.compute.manager [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Instance network_info: |[{"id": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "address": "fa:16:3e:67:e4:be", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap717a51c1-3d", "ovs_interfaceid": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.049 2 DEBUG oslo_concurrency.lockutils [req-8502a0b5-1a84-4221-9e0a-9206a7ee32bb req-3a22e4a3-50c1-41c1-8f14-ebff3596b0a6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-c5460257-b47c-4a1b-8e44-96ae657d6266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.050 2 DEBUG nova.network.neutron [req-8502a0b5-1a84-4221-9e0a-9206a7ee32bb req-3a22e4a3-50c1-41c1-8f14-ebff3596b0a6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Refreshing network info cache for port 717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.055 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Start _get_guest_xml network_info=[{"id": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "address": "fa:16:3e:67:e4:be", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap717a51c1-3d", "ovs_interfaceid": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.062 2 WARNING nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.070 2 DEBUG nova.virt.libvirt.host [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.071 2 DEBUG nova.virt.libvirt.host [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.074 2 DEBUG nova.virt.libvirt.host [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.074 2 DEBUG nova.virt.libvirt.host [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.075 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.075 2 DEBUG nova.virt.hardware [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eb3a53f1-304b-4cb0-acc3-abffce0fb181',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.075 2 DEBUG nova.virt.hardware [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.076 2 DEBUG nova.virt.hardware [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.076 2 DEBUG nova.virt.hardware [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.076 2 DEBUG nova.virt.hardware [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.076 2 DEBUG nova.virt.hardware [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.077 2 DEBUG nova.virt.hardware [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.077 2 DEBUG nova.virt.hardware [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.077 2 DEBUG nova.virt.hardware [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.077 2 DEBUG nova.virt.hardware [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.077 2 DEBUG nova.virt.hardware [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.080 2 DEBUG oslo_concurrency.processutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.133 2 INFO nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Creating config drive at /var/lib/nova/instances/5c61d077-c345-4b28-9942-624c141fc0a2/disk.config#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.139 2 DEBUG oslo_concurrency.processutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5c61d077-c345-4b28-9942-624c141fc0a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1mlpz9j5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.201 2 INFO nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Creating config drive at /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/disk.config.rescue#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.208 2 DEBUG oslo_concurrency.processutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf0lekd2u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.294 2 DEBUG oslo_concurrency.processutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5c61d077-c345-4b28-9942-624c141fc0a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1mlpz9j5" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.329 2 DEBUG nova.storage.rbd_utils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] rbd image 5c61d077-c345-4b28-9942-624c141fc0a2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.332 2 DEBUG oslo_concurrency.processutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5c61d077-c345-4b28-9942-624c141fc0a2/disk.config 5c61d077-c345-4b28-9942-624c141fc0a2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.379 2 DEBUG oslo_concurrency.processutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf0lekd2u" returned: 0 in 0.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.480 2 DEBUG nova.storage.rbd_utils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] rbd image 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.486 2 DEBUG oslo_concurrency.processutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/disk.config.rescue 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.537 2 DEBUG nova.network.neutron [req-e771a085-8bd7-4877-895d-7521655dce53 req-0e8084f4-29f0-41ce-9f22-90eb694e9999 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Updated VIF entry in instance network info cache for port 61ec6de3-7b6b-4f24-bf93-ce21a666d398. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.539 2 DEBUG nova.network.neutron [req-e771a085-8bd7-4877-895d-7521655dce53 req-0e8084f4-29f0-41ce-9f22-90eb694e9999 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Updating instance_info_cache with network_info: [{"id": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "address": "fa:16:3e:d2:5a:15", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61ec6de3-7b", "ovs_interfaceid": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:33:15 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/469280999' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.567 2 DEBUG oslo_concurrency.lockutils [req-e771a085-8bd7-4877-895d-7521655dce53 req-0e8084f4-29f0-41ce-9f22-90eb694e9999 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-5c61d077-c345-4b28-9942-624c141fc0a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.809 2 DEBUG oslo_concurrency.processutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.729s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.840 2 DEBUG nova.storage.rbd_utils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] rbd image c5460257-b47c-4a1b-8e44-96ae657d6266_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:15 np0005465988 nova_compute[236126]: 2025-10-02 12:33:15.846 2 DEBUG oslo_concurrency.processutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:16.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:33:16 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2781084812' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.335 2 DEBUG oslo_concurrency.processutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.340 2 DEBUG nova.virt.libvirt.vif [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:33:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1285518133',display_name='tempest-ListServerFiltersTestJSON-instance-1285518133',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1285518133',id=129,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='385766b9209941f3ab805e8d5e2af163',ramdisk_id='',reservation_id='r-hl6uzcrq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-542915701',owner_user_name='tempest-ListServerFiltersTestJSON-542915701-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:33:10Z,user_data=None,user_id='1c2fbed9aaf84b4e864db97bec4c797c',uuid=c5460257-b47c-4a1b-8e44-96ae657d6266,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "address": "fa:16:3e:67:e4:be", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap717a51c1-3d", "ovs_interfaceid": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.341 2 DEBUG nova.network.os_vif_util [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Converting VIF {"id": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "address": "fa:16:3e:67:e4:be", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap717a51c1-3d", "ovs_interfaceid": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.343 2 DEBUG nova.network.os_vif_util [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:e4:be,bridge_name='br-int',has_traffic_filtering=True,id=717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2,network=Network(9f3d344f-7e5f-4676-877b-da313e338dc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap717a51c1-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.346 2 DEBUG nova.objects.instance [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lazy-loading 'pci_devices' on Instance uuid c5460257-b47c-4a1b-8e44-96ae657d6266 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.364 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  <uuid>c5460257-b47c-4a1b-8e44-96ae657d6266</uuid>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  <name>instance-00000081</name>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  <memory>196608</memory>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1285518133</nova:name>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:33:15</nova:creationTime>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.micro">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <nova:memory>192</nova:memory>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <nova:user uuid="1c2fbed9aaf84b4e864db97bec4c797c">tempest-ListServerFiltersTestJSON-542915701-project-member</nova:user>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <nova:project uuid="385766b9209941f3ab805e8d5e2af163">tempest-ListServerFiltersTestJSON-542915701</nova:project>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <nova:port uuid="717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <entry name="serial">c5460257-b47c-4a1b-8e44-96ae657d6266</entry>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <entry name="uuid">c5460257-b47c-4a1b-8e44-96ae657d6266</entry>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/c5460257-b47c-4a1b-8e44-96ae657d6266_disk">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/c5460257-b47c-4a1b-8e44-96ae657d6266_disk.config">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:67:e4:be"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <target dev="tap717a51c1-3d"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/c5460257-b47c-4a1b-8e44-96ae657d6266/console.log" append="off"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:33:16 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:33:16 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:33:16 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:33:16 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:33:16 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:33:16 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.367 2 DEBUG nova.compute.manager [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Preparing to wait for external event network-vif-plugged-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.368 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.368 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.368 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.369 2 DEBUG nova.virt.libvirt.vif [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:33:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1285518133',display_name='tempest-ListServerFiltersTestJSON-instance-1285518133',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1285518133',id=129,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='385766b9209941f3ab805e8d5e2af163',ramdisk_id='',reservation_id='r-hl6uzcrq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-542915701',owner_user_name='tempest-ListServerFiltersTestJSON-542915701-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:33:10Z,user_data=None,user_id='1c2fbed9aaf84b4e864db97bec4c797c',uuid=c5460257-b47c-4a1b-8e44-96ae657d6266,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "address": "fa:16:3e:67:e4:be", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap717a51c1-3d", "ovs_interfaceid": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.370 2 DEBUG nova.network.os_vif_util [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Converting VIF {"id": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "address": "fa:16:3e:67:e4:be", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap717a51c1-3d", "ovs_interfaceid": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.371 2 DEBUG nova.network.os_vif_util [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:e4:be,bridge_name='br-int',has_traffic_filtering=True,id=717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2,network=Network(9f3d344f-7e5f-4676-877b-da313e338dc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap717a51c1-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.371 2 DEBUG os_vif [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:e4:be,bridge_name='br-int',has_traffic_filtering=True,id=717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2,network=Network(9f3d344f-7e5f-4676-877b-da313e338dc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap717a51c1-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.372 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.373 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.378 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap717a51c1-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.379 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap717a51c1-3d, col_values=(('external_ids', {'iface-id': '717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:67:e4:be', 'vm-uuid': 'c5460257-b47c-4a1b-8e44-96ae657d6266'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:16 np0005465988 NetworkManager[45041]: <info>  [1759408396.3821] manager: (tap717a51c1-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/259)
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.395 2 INFO os_vif [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:e4:be,bridge_name='br-int',has_traffic_filtering=True,id=717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2,network=Network(9f3d344f-7e5f-4676-877b-da313e338dc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap717a51c1-3d')#033[00m
Oct  2 08:33:16 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:33:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:16.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.478 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.478 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.479 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] No VIF found with MAC fa:16:3e:67:e4:be, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.479 2 INFO nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Using config drive#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.511 2 DEBUG nova.storage.rbd_utils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] rbd image c5460257-b47c-4a1b-8e44-96ae657d6266_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.592 2 DEBUG oslo_concurrency.processutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5c61d077-c345-4b28-9942-624c141fc0a2/disk.config 5c61d077-c345-4b28-9942-624c141fc0a2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.260s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.593 2 INFO nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Deleting local config drive /var/lib/nova/instances/5c61d077-c345-4b28-9942-624c141fc0a2/disk.config because it was imported into RBD.#033[00m
Oct  2 08:33:16 np0005465988 kernel: tap61ec6de3-7b: entered promiscuous mode
Oct  2 08:33:16 np0005465988 NetworkManager[45041]: <info>  [1759408396.6463] manager: (tap61ec6de3-7b): new Tun device (/org/freedesktop/NetworkManager/Devices/260)
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:16Z|00579|binding|INFO|Claiming lport 61ec6de3-7b6b-4f24-bf93-ce21a666d398 for this chassis.
Oct  2 08:33:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:16Z|00580|binding|INFO|61ec6de3-7b6b-4f24-bf93-ce21a666d398: Claiming fa:16:3e:d2:5a:15 10.100.0.6
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.670 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:5a:15 10.100.0.6'], port_security=['fa:16:3e:d2:5a:15 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5c61d077-c345-4b28-9942-624c141fc0a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f3d344f-7e5f-4676-877b-da313e338dc0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '385766b9209941f3ab805e8d5e2af163', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1389a46f-eb3b-49c0-bee4-ea4be4a55967', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d6cf2f8e-38d5-4acc-9afc-6fc6835becad, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=61ec6de3-7b6b-4f24-bf93-ce21a666d398) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.671 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 61ec6de3-7b6b-4f24-bf93-ce21a666d398 in datapath 9f3d344f-7e5f-4676-877b-da313e338dc0 bound to our chassis#033[00m
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.673 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f3d344f-7e5f-4676-877b-da313e338dc0#033[00m
Oct  2 08:33:16 np0005465988 systemd-udevd[291541]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.685 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e3fb151a-2ff8-42cb-820f-b1db6a9a715a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.686 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9f3d344f-71 in ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:33:16 np0005465988 NetworkManager[45041]: <info>  [1759408396.6890] device (tap61ec6de3-7b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:33:16 np0005465988 NetworkManager[45041]: <info>  [1759408396.6911] device (tap61ec6de3-7b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.689 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9f3d344f-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.689 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f14ed3fe-901a-42bd-9b12-48e0c2f3a3b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.691 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[00105aee-a604-4785-9107-9a4bec8c5398]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.704 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[91f58bc5-2b4f-4ebf-9033-ce778fe21ea6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.709 2 DEBUG nova.network.neutron [req-8502a0b5-1a84-4221-9e0a-9206a7ee32bb req-3a22e4a3-50c1-41c1-8f14-ebff3596b0a6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Updated VIF entry in instance network info cache for port 717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.709 2 DEBUG nova.network.neutron [req-8502a0b5-1a84-4221-9e0a-9206a7ee32bb req-3a22e4a3-50c1-41c1-8f14-ebff3596b0a6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Updating instance_info_cache with network_info: [{"id": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "address": "fa:16:3e:67:e4:be", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap717a51c1-3d", "ovs_interfaceid": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:16 np0005465988 systemd-machined[192594]: New machine qemu-56-instance-00000080.
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.728 2 DEBUG oslo_concurrency.lockutils [req-8502a0b5-1a84-4221-9e0a-9206a7ee32bb req-3a22e4a3-50c1-41c1-8f14-ebff3596b0a6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-c5460257-b47c-4a1b-8e44-96ae657d6266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:16 np0005465988 systemd[1]: Started Virtual Machine qemu-56-instance-00000080.
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.741 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[06963e16-3bb6-4e8e-ad16-e92c8e51c3bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:16Z|00581|binding|INFO|Setting lport 61ec6de3-7b6b-4f24-bf93-ce21a666d398 ovn-installed in OVS
Oct  2 08:33:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:16Z|00582|binding|INFO|Setting lport 61ec6de3-7b6b-4f24-bf93-ce21a666d398 up in Southbound
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.781 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[b868a53e-5eab-4241-9207-a519087cc9ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:16 np0005465988 NetworkManager[45041]: <info>  [1759408396.7894] manager: (tap9f3d344f-70): new Veth device (/org/freedesktop/NetworkManager/Devices/261)
Oct  2 08:33:16 np0005465988 systemd-udevd[291545]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.788 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[832aa89e-ab1c-40fd-b711-3ea4f4dd1fb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.833 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[47d75ec5-0593-41ac-b751-dd41134fac6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.834 2 DEBUG oslo_concurrency.processutils [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/disk.config.rescue 11c595fe-756a-4f19-8c39-0c834af96d6a_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.349s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.835 2 INFO nova.virt.libvirt.driver [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Deleting local config drive /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a/disk.config.rescue because it was imported into RBD.#033[00m
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.837 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[695d2f4c-f001-4d12-a081-4f9073bc9ed0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:16 np0005465988 NetworkManager[45041]: <info>  [1759408396.8655] device (tap9f3d344f-70): carrier: link connected
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.876 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9e93ee31-3972-4e82-81dc-446704e3f594]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:16 np0005465988 kernel: tap26fd7867-5e: entered promiscuous mode
Oct  2 08:33:16 np0005465988 NetworkManager[45041]: <info>  [1759408396.8945] manager: (tap26fd7867-5e): new Tun device (/org/freedesktop/NetworkManager/Devices/262)
Oct  2 08:33:16 np0005465988 systemd-udevd[291579]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.895 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5386937a-2c5e-420a-ad16-8b15aee2f78a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f3d344f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:00:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 645719, 'reachable_time': 19506, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 291593, 'error': None, 'target': 'ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:16Z|00583|binding|INFO|Claiming lport 26fd7867-5e43-40ee-bb0a-95d52010310c for this chassis.
Oct  2 08:33:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:16Z|00584|binding|INFO|26fd7867-5e43-40ee-bb0a-95d52010310c: Claiming fa:16:3e:80:14:b0 10.100.0.11
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.912 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:14:b0 10.100.0.11'], port_security=['fa:16:3e:80:14:b0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '11c595fe-756a-4f19-8c39-0c834af96d6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c4ff335-9221-4a73-8694-cb9e35a2f586', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f2fdda5532bd4487b413e696cfbf1197', 'neutron:revision_number': '5', 'neutron:security_group_ids': '5f941b67-d201-4a3f-bc1a-38e632bfe938', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b7aba6e6-d0b4-4668-8db9-bd9393b1e55e, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=26fd7867-5e43-40ee-bb0a-95d52010310c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:16 np0005465988 NetworkManager[45041]: <info>  [1759408396.9152] device (tap26fd7867-5e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:33:16 np0005465988 NetworkManager[45041]: <info>  [1759408396.9174] device (tap26fd7867-5e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.920 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[37b46e03-e69c-48b0-a59e-4fdf2b36e626]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feea:c3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 645719, 'tstamp': 645719}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291599, 'error': None, 'target': 'ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:16Z|00585|binding|INFO|Setting lport 26fd7867-5e43-40ee-bb0a-95d52010310c ovn-installed in OVS
Oct  2 08:33:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:16Z|00586|binding|INFO|Setting lport 26fd7867-5e43-40ee-bb0a-95d52010310c up in Southbound
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:16 np0005465988 nova_compute[236126]: 2025-10-02 12:33:16.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:16 np0005465988 systemd-machined[192594]: New machine qemu-57-instance-0000007c.
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.944 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8843041b-c187-42a4-948d-6d54ce10752b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f3d344f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:00:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 645719, 'reachable_time': 19506, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 291619, 'error': None, 'target': 'ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:16 np0005465988 systemd[1]: Started Virtual Machine qemu-57-instance-0000007c.
Oct  2 08:33:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:16.978 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5a899052-b4ed-4d3a-9b2e-ccea676b88de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:17.048 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[44c9d4e0-d3cf-4dff-a543-27f653fe1e16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:17.050 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f3d344f-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:17.050 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:17.051 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f3d344f-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:17 np0005465988 NetworkManager[45041]: <info>  [1759408397.0540] manager: (tap9f3d344f-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/263)
Oct  2 08:33:17 np0005465988 kernel: tap9f3d344f-70: entered promiscuous mode
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:17.060 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f3d344f-70, col_values=(('external_ids', {'iface-id': '93989b20-c703-4abe-88be-5f6a3f1c5cdc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:17 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:17Z|00587|binding|INFO|Releasing lport 93989b20-c703-4abe-88be-5f6a3f1c5cdc from this chassis (sb_readonly=0)
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:17.086 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9f3d344f-7e5f-4676-877b-da313e338dc0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9f3d344f-7e5f-4676-877b-da313e338dc0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:17.098 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ec1f342e-514d-488e-aec0-7d1c4082ee54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:17.099 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-9f3d344f-7e5f-4676-877b-da313e338dc0
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/9f3d344f-7e5f-4676-877b-da313e338dc0.pid.haproxy
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 9f3d344f-7e5f-4676-877b-da313e338dc0
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:33:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:17.100 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0', 'env', 'PROCESS_TAG=haproxy-9f3d344f-7e5f-4676-877b-da313e338dc0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9f3d344f-7e5f-4676-877b-da313e338dc0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.151 2 INFO nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Creating config drive at /var/lib/nova/instances/c5460257-b47c-4a1b-8e44-96ae657d6266/disk.config#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.161 2 DEBUG oslo_concurrency.processutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c5460257-b47c-4a1b-8e44-96ae657d6266/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8x4s1xyu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.311 2 DEBUG oslo_concurrency.processutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c5460257-b47c-4a1b-8e44-96ae657d6266/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8x4s1xyu" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:33:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/155040870' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:33:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:33:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/155040870' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.551 2 DEBUG nova.storage.rbd_utils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] rbd image c5460257-b47c-4a1b-8e44-96ae657d6266_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.557 2 DEBUG oslo_concurrency.processutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c5460257-b47c-4a1b-8e44-96ae657d6266/disk.config c5460257-b47c-4a1b-8e44-96ae657d6266_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:17 np0005465988 podman[291722]: 2025-10-02 12:33:17.474455366 +0000 UTC m=+0.025894986 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.602 2 DEBUG nova.compute.manager [req-4c0a9238-cb49-4851-8335-b943f05de51f req-8f9fb6c3-395a-462a-9dbc-5c868b09768a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Received event network-vif-plugged-61ec6de3-7b6b-4f24-bf93-ce21a666d398 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.603 2 DEBUG oslo_concurrency.lockutils [req-4c0a9238-cb49-4851-8335-b943f05de51f req-8f9fb6c3-395a-462a-9dbc-5c868b09768a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "5c61d077-c345-4b28-9942-624c141fc0a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.604 2 DEBUG oslo_concurrency.lockutils [req-4c0a9238-cb49-4851-8335-b943f05de51f req-8f9fb6c3-395a-462a-9dbc-5c868b09768a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5c61d077-c345-4b28-9942-624c141fc0a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.604 2 DEBUG oslo_concurrency.lockutils [req-4c0a9238-cb49-4851-8335-b943f05de51f req-8f9fb6c3-395a-462a-9dbc-5c868b09768a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5c61d077-c345-4b28-9942-624c141fc0a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.604 2 DEBUG nova.compute.manager [req-4c0a9238-cb49-4851-8335-b943f05de51f req-8f9fb6c3-395a-462a-9dbc-5c868b09768a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Processing event network-vif-plugged-61ec6de3-7b6b-4f24-bf93-ce21a666d398 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.731 2 DEBUG nova.compute.manager [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.733 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408397.7309544, 5c61d077-c345-4b28-9942-624c141fc0a2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.734 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] VM Started (Lifecycle Event)#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.739 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.746 2 INFO nova.virt.libvirt.driver [-] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Instance spawned successfully.#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.747 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.764 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.774 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.780 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.781 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.782 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.783 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.784 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.785 2 DEBUG nova.virt.libvirt.driver [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.794 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.795 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408397.7326035, 5c61d077-c345-4b28-9942-624c141fc0a2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.795 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.832 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.837 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408397.737491, 5c61d077-c345-4b28-9942-624c141fc0a2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.837 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.861 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.868 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.874 2 INFO nova.compute.manager [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Took 8.98 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.875 2 DEBUG nova.compute.manager [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.887 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:33:17 np0005465988 podman[291722]: 2025-10-02 12:33:17.906502262 +0000 UTC m=+0.457941882 container create 946b97132c68cc1489e1ba10af5c8d85f0acba6a96fb80e30c00ea350cbd80c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.955 2 INFO nova.compute.manager [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Took 11.66 seconds to build instance.#033[00m
Oct  2 08:33:17 np0005465988 nova_compute[236126]: 2025-10-02 12:33:17.971 2 DEBUG oslo_concurrency.lockutils [None req-3df00e8f-d720-4e26-b00c-0324ec862f44 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "5c61d077-c345-4b28-9942-624c141fc0a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:33:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:18.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.051 2 DEBUG oslo_concurrency.processutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c5460257-b47c-4a1b-8e44-96ae657d6266/disk.config c5460257-b47c-4a1b-8e44-96ae657d6266_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.052 2 INFO nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Deleting local config drive /var/lib/nova/instances/c5460257-b47c-4a1b-8e44-96ae657d6266/disk.config because it was imported into RBD.#033[00m
Oct  2 08:33:18 np0005465988 systemd[1]: Started libpod-conmon-946b97132c68cc1489e1ba10af5c8d85f0acba6a96fb80e30c00ea350cbd80c9.scope.
Oct  2 08:33:18 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:33:18 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bfe66de165e6c52ee2152b2e186d42f032af1b601ee4ba169de08b7de572689/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:33:18 np0005465988 NetworkManager[45041]: <info>  [1759408398.1330] manager: (tap717a51c1-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/264)
Oct  2 08:33:18 np0005465988 kernel: tap717a51c1-3d: entered promiscuous mode
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:18Z|00588|binding|INFO|Claiming lport 717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 for this chassis.
Oct  2 08:33:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:18Z|00589|binding|INFO|717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2: Claiming fa:16:3e:67:e4:be 10.100.0.11
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.146 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:67:e4:be 10.100.0.11'], port_security=['fa:16:3e:67:e4:be 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c5460257-b47c-4a1b-8e44-96ae657d6266', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f3d344f-7e5f-4676-877b-da313e338dc0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '385766b9209941f3ab805e8d5e2af163', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1389a46f-eb3b-49c0-bee4-ea4be4a55967', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d6cf2f8e-38d5-4acc-9afc-6fc6835becad, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:18 np0005465988 NetworkManager[45041]: <info>  [1759408398.1582] device (tap717a51c1-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:33:18 np0005465988 NetworkManager[45041]: <info>  [1759408398.1594] device (tap717a51c1-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:33:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:18Z|00590|binding|INFO|Setting lport 717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 ovn-installed in OVS
Oct  2 08:33:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:18Z|00591|binding|INFO|Setting lport 717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 up in Southbound
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:18 np0005465988 systemd-machined[192594]: New machine qemu-58-instance-00000081.
Oct  2 08:33:18 np0005465988 systemd[1]: Started Virtual Machine qemu-58-instance-00000081.
Oct  2 08:33:18 np0005465988 podman[291722]: 2025-10-02 12:33:18.32677436 +0000 UTC m=+0.878213970 container init 946b97132c68cc1489e1ba10af5c8d85f0acba6a96fb80e30c00ea350cbd80c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:33:18 np0005465988 podman[291722]: 2025-10-02 12:33:18.338461777 +0000 UTC m=+0.889901367 container start 946b97132c68cc1489e1ba10af5c8d85f0acba6a96fb80e30c00ea350cbd80c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:33:18 np0005465988 neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0[291811]: [NOTICE]   (291835) : New worker (291837) forked
Oct  2 08:33:18 np0005465988 neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0[291811]: [NOTICE]   (291835) : Loading success.
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.431 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 26fd7867-5e43-40ee-bb0a-95d52010310c in datapath 4c4ff335-9221-4a73-8694-cb9e35a2f586 unbound from our chassis#033[00m
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.432 142124 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4c4ff335-9221-4a73-8694-cb9e35a2f586 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.433 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3dae2175-ebb8-44b0-9498-82c719d9f6bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.433 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 in datapath 9f3d344f-7e5f-4676-877b-da313e338dc0 unbound from our chassis#033[00m
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.435 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f3d344f-7e5f-4676-877b-da313e338dc0#033[00m
Oct  2 08:33:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:18.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.450 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[453df036-7a55-4f5a-9a3d-6285061f234b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.492 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d9412e58-d134-4f88-b3fa-3de3a133a509]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.498 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[758e6ee5-0699-418f-826f-0ddbb7b246f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.575 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e3bd8bf7-4c16-4c8b-a93a-e66455d02972]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.612 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[26b45c39-de2c-4548-a000-636079145eee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f3d344f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:00:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 306, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 306, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 645719, 'reachable_time': 19506, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 264, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 264, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 291887, 'error': None, 'target': 'ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.635 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8169fbb5-20b1-48e0-b590-5cd71558f19c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9f3d344f-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 645733, 'tstamp': 645733}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291893, 'error': None, 'target': 'ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9f3d344f-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 645736, 'tstamp': 645736}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291893, 'error': None, 'target': 'ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.638 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f3d344f-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.642 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f3d344f-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.642 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408383.6394393, 4297c5cd-77b6-4f80-a746-11b304df8c90 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.642 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.642 2 INFO nova.compute.manager [-] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.642 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f3d344f-70, col_values=(('external_ids', {'iface-id': '93989b20-c703-4abe-88be-5f6a3f1c5cdc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:18.643 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.663 2 DEBUG nova.compute.manager [None req-dca27d93-6431-4253-898f-1354628f9c0c - - - - - -] [instance: 4297c5cd-77b6-4f80-a746-11b304df8c90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.701 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 11c595fe-756a-4f19-8c39-0c834af96d6a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.702 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408398.7011833, 11c595fe-756a-4f19-8c39-0c834af96d6a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.702 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.706 2 DEBUG nova.compute.manager [None req-5721822b-cfb0-4a32-addc-83abef25a1f8 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.742 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.746 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.780 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.781 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408398.703927, 11c595fe-756a-4f19-8c39-0c834af96d6a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.781 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] VM Started (Lifecycle Event)#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.806 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:18 np0005465988 nova_compute[236126]: 2025-10-02 12:33:18.810 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.208 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408399.2074986, c5460257-b47c-4a1b-8e44-96ae657d6266 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.208 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] VM Started (Lifecycle Event)#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.231 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.235 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408399.2102349, c5460257-b47c-4a1b-8e44-96ae657d6266 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.236 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.266 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.272 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.297 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.508 2 DEBUG nova.compute.manager [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Received event network-vif-plugged-61ec6de3-7b6b-4f24-bf93-ce21a666d398 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.509 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "5c61d077-c345-4b28-9942-624c141fc0a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.509 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5c61d077-c345-4b28-9942-624c141fc0a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.509 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5c61d077-c345-4b28-9942-624c141fc0a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.509 2 DEBUG nova.compute.manager [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] No waiting events found dispatching network-vif-plugged-61ec6de3-7b6b-4f24-bf93-ce21a666d398 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.510 2 WARNING nova.compute.manager [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Received unexpected event network-vif-plugged-61ec6de3-7b6b-4f24-bf93-ce21a666d398 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.510 2 DEBUG nova.compute.manager [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.510 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.511 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.511 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.511 2 DEBUG nova.compute.manager [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] No waiting events found dispatching network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.512 2 WARNING nova.compute.manager [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received unexpected event network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.512 2 DEBUG nova.compute.manager [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.512 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.512 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.512 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.513 2 DEBUG nova.compute.manager [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] No waiting events found dispatching network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.513 2 WARNING nova.compute.manager [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received unexpected event network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.513 2 DEBUG nova.compute.manager [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Received event network-vif-plugged-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.514 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.514 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.514 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.514 2 DEBUG nova.compute.manager [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Processing event network-vif-plugged-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.514 2 DEBUG nova.compute.manager [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Received event network-vif-plugged-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.515 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.515 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.515 2 DEBUG oslo_concurrency.lockutils [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.516 2 DEBUG nova.compute.manager [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] No waiting events found dispatching network-vif-plugged-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.517 2 WARNING nova.compute.manager [req-df1e287d-e0b7-4273-a9b1-8ec7598c5ee8 req-d37933ca-c70d-417b-8c31-fc12f5e77e96 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Received unexpected event network-vif-plugged-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.518 2 DEBUG nova.compute.manager [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.522 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408399.5217662, c5460257-b47c-4a1b-8e44-96ae657d6266 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.522 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.524 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.539 2 INFO nova.virt.libvirt.driver [-] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Instance spawned successfully.#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.540 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.563 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.570 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.577 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.578 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.579 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.580 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.580 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.581 2 DEBUG nova.virt.libvirt.driver [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.592 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.765 2 INFO nova.compute.manager [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Took 9.27 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.766 2 DEBUG nova.compute.manager [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.851 2 INFO nova.compute.manager [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Took 11.18 seconds to build instance.#033[00m
Oct  2 08:33:19 np0005465988 nova_compute[236126]: 2025-10-02 12:33:19.870 2 DEBUG oslo_concurrency.lockutils [None req-29b0fa60-bf15-4627-90ee-e0cfda7e2c55 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:20.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:33:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:20.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:33:21 np0005465988 nova_compute[236126]: 2025-10-02 12:33:21.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:22.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:22 np0005465988 nova_compute[236126]: 2025-10-02 12:33:22.384 2 DEBUG nova.compute.manager [req-c6c46557-67cf-472c-bc7d-16af89e24615 req-4513b8c3-ebdf-4314-b1f4-8f369283385d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-changed-26fd7867-5e43-40ee-bb0a-95d52010310c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:22 np0005465988 nova_compute[236126]: 2025-10-02 12:33:22.385 2 DEBUG nova.compute.manager [req-c6c46557-67cf-472c-bc7d-16af89e24615 req-4513b8c3-ebdf-4314-b1f4-8f369283385d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Refreshing instance network info cache due to event network-changed-26fd7867-5e43-40ee-bb0a-95d52010310c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:33:22 np0005465988 nova_compute[236126]: 2025-10-02 12:33:22.385 2 DEBUG oslo_concurrency.lockutils [req-c6c46557-67cf-472c-bc7d-16af89e24615 req-4513b8c3-ebdf-4314-b1f4-8f369283385d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:22 np0005465988 nova_compute[236126]: 2025-10-02 12:33:22.385 2 DEBUG oslo_concurrency.lockutils [req-c6c46557-67cf-472c-bc7d-16af89e24615 req-4513b8c3-ebdf-4314-b1f4-8f369283385d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:22 np0005465988 nova_compute[236126]: 2025-10-02 12:33:22.386 2 DEBUG nova.network.neutron [req-c6c46557-67cf-472c-bc7d-16af89e24615 req-4513b8c3-ebdf-4314-b1f4-8f369283385d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Refreshing network info cache for port 26fd7867-5e43-40ee-bb0a-95d52010310c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:33:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:33:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:22.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:33:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.474 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.475 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.475 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.476 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.476 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.476 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.481 2 DEBUG nova.compute.manager [req-ba6ccd56-d635-44e6-ae84-00e7acddae09 req-c229a7c8-7938-4236-81e3-95e2e471ca27 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-changed-26fd7867-5e43-40ee-bb0a-95d52010310c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.481 2 DEBUG nova.compute.manager [req-ba6ccd56-d635-44e6-ae84-00e7acddae09 req-c229a7c8-7938-4236-81e3-95e2e471ca27 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Refreshing instance network info cache due to event network-changed-26fd7867-5e43-40ee-bb0a-95d52010310c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.482 2 DEBUG oslo_concurrency.lockutils [req-ba6ccd56-d635-44e6-ae84-00e7acddae09 req-c229a7c8-7938-4236-81e3-95e2e471ca27 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.527 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.527 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Image id c2d0c2bc-fe21-4689-86ae-d6728c15874c yields fingerprint 50c3d0e01c5fd68886c717f1fdd053015a0fe968 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.528 2 INFO nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] image c2d0c2bc-fe21-4689-86ae-d6728c15874c at (/var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968): checking#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.528 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] image c2d0c2bc-fe21-4689-86ae-d6728c15874c at (/var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.533 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.533 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Image id db05f54c-61f8-42d6-a1e2-da3219a77b12 yields fingerprint 5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.533 2 INFO nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] image db05f54c-61f8-42d6-a1e2-da3219a77b12 at (/var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609): checking#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.535 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] image db05f54c-61f8-42d6-a1e2-da3219a77b12 at (/var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.536 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] 11c595fe-756a-4f19-8c39-0c834af96d6a is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.536 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] 5c61d077-c345-4b28-9942-624c141fc0a2 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.536 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] c5460257-b47c-4a1b-8e44-96ae657d6266 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.536 2 WARNING nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.537 2 INFO nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Active base files: /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.537 2 INFO nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Removable base files: /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.538 2 INFO nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.538 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.538 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.538 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.968 2 DEBUG nova.network.neutron [req-c6c46557-67cf-472c-bc7d-16af89e24615 req-4513b8c3-ebdf-4314-b1f4-8f369283385d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updated VIF entry in instance network info cache for port 26fd7867-5e43-40ee-bb0a-95d52010310c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.968 2 DEBUG nova.network.neutron [req-c6c46557-67cf-472c-bc7d-16af89e24615 req-4513b8c3-ebdf-4314-b1f4-8f369283385d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updating instance_info_cache with network_info: [{"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.998 2 DEBUG oslo_concurrency.lockutils [req-c6c46557-67cf-472c-bc7d-16af89e24615 req-4513b8c3-ebdf-4314-b1f4-8f369283385d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:23 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.999 2 DEBUG oslo_concurrency.lockutils [req-ba6ccd56-d635-44e6-ae84-00e7acddae09 req-c229a7c8-7938-4236-81e3-95e2e471ca27 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:24 np0005465988 nova_compute[236126]: 2025-10-02 12:33:23.999 2 DEBUG nova.network.neutron [req-ba6ccd56-d635-44e6-ae84-00e7acddae09 req-c229a7c8-7938-4236-81e3-95e2e471ca27 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Refreshing network info cache for port 26fd7867-5e43-40ee-bb0a-95d52010310c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:33:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:24.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:24.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:24 np0005465988 podman[291899]: 2025-10-02 12:33:24.570953341 +0000 UTC m=+0.104654201 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:33:24 np0005465988 podman[291898]: 2025-10-02 12:33:24.589614578 +0000 UTC m=+0.119761196 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Oct  2 08:33:24 np0005465988 podman[291900]: 2025-10-02 12:33:24.614175374 +0000 UTC m=+0.136200898 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:33:25 np0005465988 nova_compute[236126]: 2025-10-02 12:33:25.276 2 DEBUG nova.network.neutron [req-ba6ccd56-d635-44e6-ae84-00e7acddae09 req-c229a7c8-7938-4236-81e3-95e2e471ca27 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updated VIF entry in instance network info cache for port 26fd7867-5e43-40ee-bb0a-95d52010310c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:33:25 np0005465988 nova_compute[236126]: 2025-10-02 12:33:25.277 2 DEBUG nova.network.neutron [req-ba6ccd56-d635-44e6-ae84-00e7acddae09 req-c229a7c8-7938-4236-81e3-95e2e471ca27 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updating instance_info_cache with network_info: [{"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:25 np0005465988 nova_compute[236126]: 2025-10-02 12:33:25.298 2 DEBUG oslo_concurrency.lockutils [req-ba6ccd56-d635-44e6-ae84-00e7acddae09 req-c229a7c8-7938-4236-81e3-95e2e471ca27 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:26.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:26 np0005465988 nova_compute[236126]: 2025-10-02 12:33:26.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:26.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:27.362 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:27.363 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:27.364 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:33:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:28.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:28.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:28 np0005465988 nova_compute[236126]: 2025-10-02 12:33:28.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 08:33:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.0 total, 600.0 interval#012Cumulative writes: 10K writes, 52K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.03 MB/s#012Cumulative WAL: 10K writes, 10K syncs, 1.00 writes per sync, written: 0.10 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1718 writes, 8384 keys, 1718 commit groups, 1.0 writes per commit group, ingest: 16.35 MB, 0.03 MB/s#012Interval WAL: 1718 writes, 1718 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     74.8      0.82              0.21        30    0.027       0      0       0.0       0.0#012  L6      1/0    9.20 MB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   4.4    135.7    113.7      2.36              1.05        29    0.082    172K    16K       0.0       0.0#012 Sum      1/0    9.20 MB   0.0      0.3     0.1      0.3       0.3      0.1       0.0   5.4    100.8    103.7      3.18              1.27        59    0.054    172K    16K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.3     89.9     89.5      0.82              0.30        12    0.068     46K   3142       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   0.0    135.7    113.7      2.36              1.05        29    0.082    172K    16K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     75.0      0.82              0.21        29    0.028       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3600.0 total, 600.0 interval#012Flush(GB): cumulative 0.060, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.32 GB write, 0.09 MB/s write, 0.31 GB read, 0.09 MB/s read, 3.2 seconds#012Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.8 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b3e97ad1f0#2 capacity: 304.00 MB usage: 35.55 MB table_size: 0 occupancy: 18446744073709551615 collections: 7 last_copies: 0 last_secs: 0.000279 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2076,34.24 MB,11.2634%) FilterBlock(59,490.80 KB,0.157662%) IndexBlock(59,852.14 KB,0.27374%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct  2 08:33:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:30.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:30 np0005465988 nova_compute[236126]: 2025-10-02 12:33:30.351 2 DEBUG nova.compute.manager [req-ad30f1ec-5623-4856-8300-4c7d3ff9ff5f req-04c1f980-ba6f-4f2d-9b98-5f334d928558 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-changed-26fd7867-5e43-40ee-bb0a-95d52010310c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:30 np0005465988 nova_compute[236126]: 2025-10-02 12:33:30.353 2 DEBUG nova.compute.manager [req-ad30f1ec-5623-4856-8300-4c7d3ff9ff5f req-04c1f980-ba6f-4f2d-9b98-5f334d928558 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Refreshing instance network info cache due to event network-changed-26fd7867-5e43-40ee-bb0a-95d52010310c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:33:30 np0005465988 nova_compute[236126]: 2025-10-02 12:33:30.353 2 DEBUG oslo_concurrency.lockutils [req-ad30f1ec-5623-4856-8300-4c7d3ff9ff5f req-04c1f980-ba6f-4f2d-9b98-5f334d928558 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:30 np0005465988 nova_compute[236126]: 2025-10-02 12:33:30.353 2 DEBUG oslo_concurrency.lockutils [req-ad30f1ec-5623-4856-8300-4c7d3ff9ff5f req-04c1f980-ba6f-4f2d-9b98-5f334d928558 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:30 np0005465988 nova_compute[236126]: 2025-10-02 12:33:30.354 2 DEBUG nova.network.neutron [req-ad30f1ec-5623-4856-8300-4c7d3ff9ff5f req-04c1f980-ba6f-4f2d-9b98-5f334d928558 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Refreshing network info cache for port 26fd7867-5e43-40ee-bb0a-95d52010310c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:33:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:30.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:31 np0005465988 nova_compute[236126]: 2025-10-02 12:33:31.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:32.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:32.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:32Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d2:5a:15 10.100.0.6
Oct  2 08:33:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:32Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d2:5a:15 10.100.0.6
Oct  2 08:33:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:33:33 np0005465988 nova_compute[236126]: 2025-10-02 12:33:33.153 2 DEBUG nova.network.neutron [req-ad30f1ec-5623-4856-8300-4c7d3ff9ff5f req-04c1f980-ba6f-4f2d-9b98-5f334d928558 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updated VIF entry in instance network info cache for port 26fd7867-5e43-40ee-bb0a-95d52010310c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:33:33 np0005465988 nova_compute[236126]: 2025-10-02 12:33:33.154 2 DEBUG nova.network.neutron [req-ad30f1ec-5623-4856-8300-4c7d3ff9ff5f req-04c1f980-ba6f-4f2d-9b98-5f334d928558 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updating instance_info_cache with network_info: [{"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:33 np0005465988 nova_compute[236126]: 2025-10-02 12:33:33.172 2 DEBUG oslo_concurrency.lockutils [req-ad30f1ec-5623-4856-8300-4c7d3ff9ff5f req-04c1f980-ba6f-4f2d-9b98-5f334d928558 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:33 np0005465988 nova_compute[236126]: 2025-10-02 12:33:33.228 2 DEBUG nova.compute.manager [req-6c5ecfb2-0224-45e0-a339-214cab03baa5 req-7e47f2c7-e7b5-49e2-bde9-5b67eb73b22e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-changed-26fd7867-5e43-40ee-bb0a-95d52010310c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:33 np0005465988 nova_compute[236126]: 2025-10-02 12:33:33.228 2 DEBUG nova.compute.manager [req-6c5ecfb2-0224-45e0-a339-214cab03baa5 req-7e47f2c7-e7b5-49e2-bde9-5b67eb73b22e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Refreshing instance network info cache due to event network-changed-26fd7867-5e43-40ee-bb0a-95d52010310c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:33:33 np0005465988 nova_compute[236126]: 2025-10-02 12:33:33.229 2 DEBUG oslo_concurrency.lockutils [req-6c5ecfb2-0224-45e0-a339-214cab03baa5 req-7e47f2c7-e7b5-49e2-bde9-5b67eb73b22e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:33 np0005465988 nova_compute[236126]: 2025-10-02 12:33:33.229 2 DEBUG oslo_concurrency.lockutils [req-6c5ecfb2-0224-45e0-a339-214cab03baa5 req-7e47f2c7-e7b5-49e2-bde9-5b67eb73b22e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:33 np0005465988 nova_compute[236126]: 2025-10-02 12:33:33.230 2 DEBUG nova.network.neutron [req-6c5ecfb2-0224-45e0-a339-214cab03baa5 req-7e47f2c7-e7b5-49e2-bde9-5b67eb73b22e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Refreshing network info cache for port 26fd7867-5e43-40ee-bb0a-95d52010310c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:33:33 np0005465988 nova_compute[236126]: 2025-10-02 12:33:33.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:34.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:33:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:34.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:33:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:35Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:67:e4:be 10.100.0.11
Oct  2 08:33:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:35Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:67:e4:be 10.100.0.11
Oct  2 08:33:36 np0005465988 nova_compute[236126]: 2025-10-02 12:33:36.018 2 DEBUG nova.network.neutron [req-6c5ecfb2-0224-45e0-a339-214cab03baa5 req-7e47f2c7-e7b5-49e2-bde9-5b67eb73b22e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updated VIF entry in instance network info cache for port 26fd7867-5e43-40ee-bb0a-95d52010310c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:33:36 np0005465988 nova_compute[236126]: 2025-10-02 12:33:36.018 2 DEBUG nova.network.neutron [req-6c5ecfb2-0224-45e0-a339-214cab03baa5 req-7e47f2c7-e7b5-49e2-bde9-5b67eb73b22e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updating instance_info_cache with network_info: [{"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:36 np0005465988 nova_compute[236126]: 2025-10-02 12:33:36.042 2 DEBUG oslo_concurrency.lockutils [req-6c5ecfb2-0224-45e0-a339-214cab03baa5 req-7e47f2c7-e7b5-49e2-bde9-5b67eb73b22e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-11c595fe-756a-4f19-8c39-0c834af96d6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:36.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:36 np0005465988 nova_compute[236126]: 2025-10-02 12:33:36.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:36.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:36 np0005465988 podman[292018]: 2025-10-02 12:33:36.52701975 +0000 UTC m=+0.060213403 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  2 08:33:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:33:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:38.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.274 2 DEBUG oslo_concurrency.lockutils [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Acquiring lock "11c595fe-756a-4f19-8c39-0c834af96d6a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.275 2 DEBUG oslo_concurrency.lockutils [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.275 2 DEBUG oslo_concurrency.lockutils [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Acquiring lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.276 2 DEBUG oslo_concurrency.lockutils [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.276 2 DEBUG oslo_concurrency.lockutils [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.277 2 INFO nova.compute.manager [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Terminating instance#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.278 2 DEBUG nova.compute.manager [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:33:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:38.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:38 np0005465988 kernel: tap26fd7867-5e (unregistering): left promiscuous mode
Oct  2 08:33:38 np0005465988 NetworkManager[45041]: <info>  [1759408418.4915] device (tap26fd7867-5e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:33:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:38Z|00592|binding|INFO|Releasing lport 26fd7867-5e43-40ee-bb0a-95d52010310c from this chassis (sb_readonly=0)
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:38Z|00593|binding|INFO|Setting lport 26fd7867-5e43-40ee-bb0a-95d52010310c down in Southbound
Oct  2 08:33:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:38Z|00594|binding|INFO|Removing iface tap26fd7867-5e ovn-installed in OVS
Oct  2 08:33:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:38.517 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:14:b0 10.100.0.11'], port_security=['fa:16:3e:80:14:b0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '11c595fe-756a-4f19-8c39-0c834af96d6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c4ff335-9221-4a73-8694-cb9e35a2f586', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f2fdda5532bd4487b413e696cfbf1197', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5f941b67-d201-4a3f-bc1a-38e632bfe938', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b7aba6e6-d0b4-4668-8db9-bd9393b1e55e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=26fd7867-5e43-40ee-bb0a-95d52010310c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:38.518 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 26fd7867-5e43-40ee-bb0a-95d52010310c in datapath 4c4ff335-9221-4a73-8694-cb9e35a2f586 unbound from our chassis#033[00m
Oct  2 08:33:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:38.524 142124 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4c4ff335-9221-4a73-8694-cb9e35a2f586 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:33:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:38.526 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2e1e29d1-6b24-430c-b352-c0218a159d12]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:38 np0005465988 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Oct  2 08:33:38 np0005465988 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007c.scope: Consumed 14.860s CPU time.
Oct  2 08:33:38 np0005465988 systemd-machined[192594]: Machine qemu-57-instance-0000007c terminated.
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.725 2 INFO nova.virt.libvirt.driver [-] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Instance destroyed successfully.#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.726 2 DEBUG nova.objects.instance [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lazy-loading 'resources' on Instance uuid 11c595fe-756a-4f19-8c39-0c834af96d6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.747 2 DEBUG nova.virt.libvirt.vif [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-530066287',display_name='tempest-ServerRescueTestJSONUnderV235-server-530066287',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-530066287',id=124,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:33:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f2fdda5532bd4487b413e696cfbf1197',ramdisk_id='',reservation_id='r-icfttuz3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1167302845',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1167302845-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:33:18Z,user_data=None,user_id='95612007183445418f12dc53405b3e7b',uuid=11c595fe-756a-4f19-8c39-0c834af96d6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.747 2 DEBUG nova.network.os_vif_util [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Converting VIF {"id": "26fd7867-5e43-40ee-bb0a-95d52010310c", "address": "fa:16:3e:80:14:b0", "network": {"id": "4c4ff335-9221-4a73-8694-cb9e35a2f586", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-236379424-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f2fdda5532bd4487b413e696cfbf1197", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26fd7867-5e", "ovs_interfaceid": "26fd7867-5e43-40ee-bb0a-95d52010310c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.749 2 DEBUG nova.network.os_vif_util [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:80:14:b0,bridge_name='br-int',has_traffic_filtering=True,id=26fd7867-5e43-40ee-bb0a-95d52010310c,network=Network(4c4ff335-9221-4a73-8694-cb9e35a2f586),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26fd7867-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.749 2 DEBUG os_vif [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:80:14:b0,bridge_name='br-int',has_traffic_filtering=True,id=26fd7867-5e43-40ee-bb0a-95d52010310c,network=Network(4c4ff335-9221-4a73-8694-cb9e35a2f586),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26fd7867-5e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.753 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap26fd7867-5e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.762 2 INFO os_vif [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:80:14:b0,bridge_name='br-int',has_traffic_filtering=True,id=26fd7867-5e43-40ee-bb0a-95d52010310c,network=Network(4c4ff335-9221-4a73-8694-cb9e35a2f586),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26fd7867-5e')#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.807 2 DEBUG nova.compute.manager [req-35bd0804-bad9-44f2-a15a-fdcde9d4ff42 req-241a4e8c-2e69-4b65-9469-5488a0bc1d29 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-vif-unplugged-26fd7867-5e43-40ee-bb0a-95d52010310c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.807 2 DEBUG oslo_concurrency.lockutils [req-35bd0804-bad9-44f2-a15a-fdcde9d4ff42 req-241a4e8c-2e69-4b65-9469-5488a0bc1d29 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.808 2 DEBUG oslo_concurrency.lockutils [req-35bd0804-bad9-44f2-a15a-fdcde9d4ff42 req-241a4e8c-2e69-4b65-9469-5488a0bc1d29 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.808 2 DEBUG oslo_concurrency.lockutils [req-35bd0804-bad9-44f2-a15a-fdcde9d4ff42 req-241a4e8c-2e69-4b65-9469-5488a0bc1d29 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.808 2 DEBUG nova.compute.manager [req-35bd0804-bad9-44f2-a15a-fdcde9d4ff42 req-241a4e8c-2e69-4b65-9469-5488a0bc1d29 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] No waiting events found dispatching network-vif-unplugged-26fd7867-5e43-40ee-bb0a-95d52010310c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:38 np0005465988 nova_compute[236126]: 2025-10-02 12:33:38.808 2 DEBUG nova.compute.manager [req-35bd0804-bad9-44f2-a15a-fdcde9d4ff42 req-241a4e8c-2e69-4b65-9469-5488a0bc1d29 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-vif-unplugged-26fd7867-5e43-40ee-bb0a-95d52010310c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:33:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:40.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:40 np0005465988 nova_compute[236126]: 2025-10-02 12:33:40.321 2 INFO nova.virt.libvirt.driver [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Deleting instance files /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a_del#033[00m
Oct  2 08:33:40 np0005465988 nova_compute[236126]: 2025-10-02 12:33:40.322 2 INFO nova.virt.libvirt.driver [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Deletion of /var/lib/nova/instances/11c595fe-756a-4f19-8c39-0c834af96d6a_del complete#033[00m
Oct  2 08:33:40 np0005465988 nova_compute[236126]: 2025-10-02 12:33:40.399 2 INFO nova.compute.manager [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Took 2.12 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:33:40 np0005465988 nova_compute[236126]: 2025-10-02 12:33:40.400 2 DEBUG oslo.service.loopingcall [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:33:40 np0005465988 nova_compute[236126]: 2025-10-02 12:33:40.400 2 DEBUG nova.compute.manager [-] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:33:40 np0005465988 nova_compute[236126]: 2025-10-02 12:33:40.400 2 DEBUG nova.network.neutron [-] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:33:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:40.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:40 np0005465988 nova_compute[236126]: 2025-10-02 12:33:40.894 2 DEBUG nova.compute.manager [req-f410c3e6-55fc-4b5e-8dfd-d66592455918 req-09849605-83a6-4250-b3e7-42ae098f3abe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:40 np0005465988 nova_compute[236126]: 2025-10-02 12:33:40.895 2 DEBUG oslo_concurrency.lockutils [req-f410c3e6-55fc-4b5e-8dfd-d66592455918 req-09849605-83a6-4250-b3e7-42ae098f3abe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:40 np0005465988 nova_compute[236126]: 2025-10-02 12:33:40.895 2 DEBUG oslo_concurrency.lockutils [req-f410c3e6-55fc-4b5e-8dfd-d66592455918 req-09849605-83a6-4250-b3e7-42ae098f3abe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:40 np0005465988 nova_compute[236126]: 2025-10-02 12:33:40.896 2 DEBUG oslo_concurrency.lockutils [req-f410c3e6-55fc-4b5e-8dfd-d66592455918 req-09849605-83a6-4250-b3e7-42ae098f3abe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:40 np0005465988 nova_compute[236126]: 2025-10-02 12:33:40.896 2 DEBUG nova.compute.manager [req-f410c3e6-55fc-4b5e-8dfd-d66592455918 req-09849605-83a6-4250-b3e7-42ae098f3abe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] No waiting events found dispatching network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:40 np0005465988 nova_compute[236126]: 2025-10-02 12:33:40.896 2 WARNING nova.compute.manager [req-f410c3e6-55fc-4b5e-8dfd-d66592455918 req-09849605-83a6-4250-b3e7-42ae098f3abe d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received unexpected event network-vif-plugged-26fd7867-5e43-40ee-bb0a-95d52010310c for instance with vm_state rescued and task_state deleting.#033[00m
Oct  2 08:33:41 np0005465988 nova_compute[236126]: 2025-10-02 12:33:41.427 2 DEBUG nova.network.neutron [-] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:41 np0005465988 nova_compute[236126]: 2025-10-02 12:33:41.446 2 INFO nova.compute.manager [-] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Took 1.05 seconds to deallocate network for instance.#033[00m
Oct  2 08:33:41 np0005465988 nova_compute[236126]: 2025-10-02 12:33:41.501 2 DEBUG oslo_concurrency.lockutils [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:41 np0005465988 nova_compute[236126]: 2025-10-02 12:33:41.501 2 DEBUG oslo_concurrency.lockutils [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:41 np0005465988 nova_compute[236126]: 2025-10-02 12:33:41.509 2 DEBUG nova.compute.manager [req-1f614bef-47ba-4f29-a824-5dde8d767315 req-31184596-a191-45ff-9290-03a2940a6e87 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Received event network-vif-deleted-26fd7867-5e43-40ee-bb0a-95d52010310c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:41 np0005465988 nova_compute[236126]: 2025-10-02 12:33:41.625 2 DEBUG oslo_concurrency.processutils [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:33:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:42.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:33:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:33:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2179889415' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:33:42 np0005465988 nova_compute[236126]: 2025-10-02 12:33:42.170 2 DEBUG oslo_concurrency.processutils [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.545s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:42 np0005465988 nova_compute[236126]: 2025-10-02 12:33:42.177 2 DEBUG nova.compute.provider_tree [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:33:42 np0005465988 nova_compute[236126]: 2025-10-02 12:33:42.195 2 DEBUG nova.scheduler.client.report [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:33:42 np0005465988 nova_compute[236126]: 2025-10-02 12:33:42.220 2 DEBUG oslo_concurrency.lockutils [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.719s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:42 np0005465988 nova_compute[236126]: 2025-10-02 12:33:42.282 2 INFO nova.scheduler.client.report [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Deleted allocations for instance 11c595fe-756a-4f19-8c39-0c834af96d6a#033[00m
Oct  2 08:33:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:42.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:42 np0005465988 nova_compute[236126]: 2025-10-02 12:33:42.618 2 DEBUG oslo_concurrency.lockutils [None req-90f935da-0f65-406c-9150-52ac07aca6ac 95612007183445418f12dc53405b3e7b f2fdda5532bd4487b413e696cfbf1197 - - default default] Lock "11c595fe-756a-4f19-8c39-0c834af96d6a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.343s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:42 np0005465988 podman[292273]: 2025-10-02 12:33:42.646899536 +0000 UTC m=+0.068606394 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Oct  2 08:33:42 np0005465988 podman[292273]: 2025-10-02 12:33:42.765080355 +0000 UTC m=+0.186787133 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct  2 08:33:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:33:43 np0005465988 podman[292411]: 2025-10-02 12:33:43.370677373 +0000 UTC m=+0.065863565 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 08:33:43 np0005465988 podman[292411]: 2025-10-02 12:33:43.406218255 +0000 UTC m=+0.101404387 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 08:33:43 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:43Z|00595|binding|INFO|Releasing lport 93989b20-c703-4abe-88be-5f6a3f1c5cdc from this chassis (sb_readonly=0)
Oct  2 08:33:43 np0005465988 nova_compute[236126]: 2025-10-02 12:33:43.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:43 np0005465988 podman[292477]: 2025-10-02 12:33:43.680724001 +0000 UTC m=+0.074470473 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph, io.buildah.version=1.28.2, name=keepalived, com.redhat.component=keepalived-container, architecture=x86_64, vendor=Red Hat, Inc.)
Oct  2 08:33:43 np0005465988 podman[292477]: 2025-10-02 12:33:43.697763131 +0000 UTC m=+0.091509553 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, release=1793, architecture=x86_64, io.buildah.version=1.28.2, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, distribution-scope=public, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, version=2.2.4, description=keepalived for Ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Oct  2 08:33:43 np0005465988 nova_compute[236126]: 2025-10-02 12:33:43.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:43 np0005465988 nova_compute[236126]: 2025-10-02 12:33:43.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:43 np0005465988 ovn_controller[132601]: 2025-10-02T12:33:43Z|00596|binding|INFO|Releasing lport 93989b20-c703-4abe-88be-5f6a3f1c5cdc from this chassis (sb_readonly=0)
Oct  2 08:33:43 np0005465988 nova_compute[236126]: 2025-10-02 12:33:43.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:44.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:44.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:33:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:33:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:33:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:33:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:33:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:33:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:44.878 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:44.879 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:33:44 np0005465988 nova_compute[236126]: 2025-10-02 12:33:44.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:44 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:33:44.882 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:33:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:33:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:33:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:46.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:33:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:46.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:33:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:33:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:48.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:48.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:48 np0005465988 nova_compute[236126]: 2025-10-02 12:33:48.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:48 np0005465988 nova_compute[236126]: 2025-10-02 12:33:48.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:50.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:33:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:50.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:33:51 np0005465988 nova_compute[236126]: 2025-10-02 12:33:51.538 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:52.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:52.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:33:53 np0005465988 nova_compute[236126]: 2025-10-02 12:33:53.722 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408418.7208114, 11c595fe-756a-4f19-8c39-0c834af96d6a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:53 np0005465988 nova_compute[236126]: 2025-10-02 12:33:53.723 2 INFO nova.compute.manager [-] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:33:53 np0005465988 nova_compute[236126]: 2025-10-02 12:33:53.744 2 DEBUG nova.compute.manager [None req-cee9ba50-447e-4bf2-b049-b9f049ffe84c - - - - - -] [instance: 11c595fe-756a-4f19-8c39-0c834af96d6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:53 np0005465988 nova_compute[236126]: 2025-10-02 12:33:53.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:53 np0005465988 nova_compute[236126]: 2025-10-02 12:33:53.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:53 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:33:53 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:33:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:54.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:54 np0005465988 nova_compute[236126]: 2025-10-02 12:33:54.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:33:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:54.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:33:54 np0005465988 nova_compute[236126]: 2025-10-02 12:33:54.499 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:54 np0005465988 nova_compute[236126]: 2025-10-02 12:33:54.500 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:54 np0005465988 nova_compute[236126]: 2025-10-02 12:33:54.501 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:54 np0005465988 nova_compute[236126]: 2025-10-02 12:33:54.501 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:33:54 np0005465988 nova_compute[236126]: 2025-10-02 12:33:54.502 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:33:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4283116276' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:33:54 np0005465988 nova_compute[236126]: 2025-10-02 12:33:54.986 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:55 np0005465988 nova_compute[236126]: 2025-10-02 12:33:55.088 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000081 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:33:55 np0005465988 nova_compute[236126]: 2025-10-02 12:33:55.089 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000081 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:33:55 np0005465988 nova_compute[236126]: 2025-10-02 12:33:55.094 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000080 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:33:55 np0005465988 nova_compute[236126]: 2025-10-02 12:33:55.095 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000080 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:33:55 np0005465988 podman[292792]: 2025-10-02 12:33:55.150042382 +0000 UTC m=+0.090339750 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:33:55 np0005465988 podman[292793]: 2025-10-02 12:33:55.156110546 +0000 UTC m=+0.093509480 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:33:55 np0005465988 podman[292791]: 2025-10-02 12:33:55.2035115 +0000 UTC m=+0.142838300 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:33:55 np0005465988 nova_compute[236126]: 2025-10-02 12:33:55.308 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:33:55 np0005465988 nova_compute[236126]: 2025-10-02 12:33:55.309 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3800MB free_disk=20.71889877319336GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:33:55 np0005465988 nova_compute[236126]: 2025-10-02 12:33:55.309 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:55 np0005465988 nova_compute[236126]: 2025-10-02 12:33:55.309 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:55 np0005465988 nova_compute[236126]: 2025-10-02 12:33:55.871 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 5c61d077-c345-4b28-9942-624c141fc0a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:33:55 np0005465988 nova_compute[236126]: 2025-10-02 12:33:55.872 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance c5460257-b47c-4a1b-8e44-96ae657d6266 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:33:55 np0005465988 nova_compute[236126]: 2025-10-02 12:33:55.872 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:33:55 np0005465988 nova_compute[236126]: 2025-10-02 12:33:55.873 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:33:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:56.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:56 np0005465988 nova_compute[236126]: 2025-10-02 12:33:56.068 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:56.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:33:56 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/148215647' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:33:56 np0005465988 nova_compute[236126]: 2025-10-02 12:33:56.560 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:56 np0005465988 nova_compute[236126]: 2025-10-02 12:33:56.567 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:33:56 np0005465988 nova_compute[236126]: 2025-10-02 12:33:56.634 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:33:56 np0005465988 nova_compute[236126]: 2025-10-02 12:33:56.937 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:33:56 np0005465988 nova_compute[236126]: 2025-10-02 12:33:56.937 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #100. Immutable memtables: 0.
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.853575) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 61] Flushing memtable with next log file: 100
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408437853618, "job": 61, "event": "flush_started", "num_memtables": 1, "num_entries": 1981, "num_deletes": 255, "total_data_size": 4330418, "memory_usage": 4415968, "flush_reason": "Manual Compaction"}
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 61] Level-0 flush table #101: started
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408437868453, "cf_name": "default", "job": 61, "event": "table_file_creation", "file_number": 101, "file_size": 2820366, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 50708, "largest_seqno": 52684, "table_properties": {"data_size": 2812169, "index_size": 4883, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2245, "raw_key_size": 18403, "raw_average_key_size": 20, "raw_value_size": 2795326, "raw_average_value_size": 3180, "num_data_blocks": 211, "num_entries": 879, "num_filter_entries": 879, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759408293, "oldest_key_time": 1759408293, "file_creation_time": 1759408437, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 101, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 61] Flush lasted 14949 microseconds, and 6705 cpu microseconds.
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.868516) [db/flush_job.cc:967] [default] [JOB 61] Level-0 flush table #101: 2820366 bytes OK
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.868551) [db/memtable_list.cc:519] [default] Level-0 commit table #101 started
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.870025) [db/memtable_list.cc:722] [default] Level-0 commit table #101: memtable #1 done
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.870045) EVENT_LOG_v1 {"time_micros": 1759408437870039, "job": 61, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.870068) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 61] Try to delete WAL files size 4321408, prev total WAL file size 4321408, number of live WAL files 2.
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000097.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.871320) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034323637' seq:72057594037927935, type:22 .. '7061786F730034353139' seq:0, type:0; will stop at (end)
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 62] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 61 Base level 0, inputs: [101(2754KB)], [99(9422KB)]
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408437871393, "job": 62, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [101], "files_L6": [99], "score": -1, "input_data_size": 12469080, "oldest_snapshot_seqno": -1}
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 62] Generated table #102: 7676 keys, 10591214 bytes, temperature: kUnknown
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408437935276, "cf_name": "default", "job": 62, "event": "table_file_creation", "file_number": 102, "file_size": 10591214, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10540701, "index_size": 30241, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19205, "raw_key_size": 198395, "raw_average_key_size": 25, "raw_value_size": 10404534, "raw_average_value_size": 1355, "num_data_blocks": 1185, "num_entries": 7676, "num_filter_entries": 7676, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759408437, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 102, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.935658) [db/compaction/compaction_job.cc:1663] [default] [JOB 62] Compacted 1@0 + 1@6 files to L6 => 10591214 bytes
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.936847) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 194.7 rd, 165.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 9.2 +0.0 blob) out(10.1 +0.0 blob), read-write-amplify(8.2) write-amplify(3.8) OK, records in: 8207, records dropped: 531 output_compression: NoCompression
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.936865) EVENT_LOG_v1 {"time_micros": 1759408437936856, "job": 62, "event": "compaction_finished", "compaction_time_micros": 64045, "compaction_time_cpu_micros": 36599, "output_level": 6, "num_output_files": 1, "total_output_size": 10591214, "num_input_records": 8207, "num_output_records": 7676, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000101.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408437937729, "job": 62, "event": "table_file_deletion", "file_number": 101}
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000099.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408437940013, "job": 62, "event": "table_file_deletion", "file_number": 99}
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.871233) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.940167) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.940175) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.940178) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.940187) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:33:57 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:33:57.940189) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:33:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:33:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:33:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:33:58.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:33:58 np0005465988 nova_compute[236126]: 2025-10-02 12:33:58.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:58 np0005465988 nova_compute[236126]: 2025-10-02 12:33:58.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:58 np0005465988 nova_compute[236126]: 2025-10-02 12:33:58.476 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:58 np0005465988 nova_compute[236126]: 2025-10-02 12:33:58.476 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:33:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:33:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:33:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:33:58.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:33:58 np0005465988 nova_compute[236126]: 2025-10-02 12:33:58.532 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:33:58 np0005465988 nova_compute[236126]: 2025-10-02 12:33:58.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:58 np0005465988 nova_compute[236126]: 2025-10-02 12:33:58.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:00.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.299 2 DEBUG oslo_concurrency.lockutils [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "c5460257-b47c-4a1b-8e44-96ae657d6266" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.300 2 DEBUG oslo_concurrency.lockutils [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.300 2 DEBUG oslo_concurrency.lockutils [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.300 2 DEBUG oslo_concurrency.lockutils [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.300 2 DEBUG oslo_concurrency.lockutils [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.302 2 INFO nova.compute.manager [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Terminating instance#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.303 2 DEBUG nova.compute.manager [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:34:00 np0005465988 kernel: tap717a51c1-3d (unregistering): left promiscuous mode
Oct  2 08:34:00 np0005465988 NetworkManager[45041]: <info>  [1759408440.3709] device (tap717a51c1-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:00Z|00597|binding|INFO|Releasing lport 717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 from this chassis (sb_readonly=0)
Oct  2 08:34:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:00Z|00598|binding|INFO|Setting lport 717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 down in Southbound
Oct  2 08:34:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:00Z|00599|binding|INFO|Removing iface tap717a51c1-3d ovn-installed in OVS
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:00.421 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:67:e4:be 10.100.0.11'], port_security=['fa:16:3e:67:e4:be 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c5460257-b47c-4a1b-8e44-96ae657d6266', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f3d344f-7e5f-4676-877b-da313e338dc0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '385766b9209941f3ab805e8d5e2af163', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1389a46f-eb3b-49c0-bee4-ea4be4a55967', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d6cf2f8e-38d5-4acc-9afc-6fc6835becad, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:00.423 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 in datapath 9f3d344f-7e5f-4676-877b-da313e338dc0 unbound from our chassis#033[00m
Oct  2 08:34:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:00.425 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f3d344f-7e5f-4676-877b-da313e338dc0#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:00.447 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[de3ca0c0-209a-4383-9d22-754df7332e43]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:00 np0005465988 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000081.scope: Deactivated successfully.
Oct  2 08:34:00 np0005465988 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000081.scope: Consumed 15.434s CPU time.
Oct  2 08:34:00 np0005465988 systemd-machined[192594]: Machine qemu-58-instance-00000081 terminated.
Oct  2 08:34:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:00.491 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f1d0ef9b-b877-4729-b4b1-91add096c898]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:00.495 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[636f92a5-8df9-4877-a788-4483f01e5977]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:00.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.530 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.532 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:34:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:00.537 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[98fd50e2-9289-44fd-8a93-cd4ecd74b8a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.550 2 INFO nova.virt.libvirt.driver [-] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Instance destroyed successfully.#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.551 2 DEBUG nova.objects.instance [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lazy-loading 'resources' on Instance uuid c5460257-b47c-4a1b-8e44-96ae657d6266 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:00.566 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[509a6735-9a48-4668-9d6e-f1b43f0eb386]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f3d344f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:00:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 645719, 'reachable_time': 19506, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 292900, 'error': None, 'target': 'ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.570 2 DEBUG nova.virt.libvirt.vif [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:33:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1285518133',display_name='tempest-ListServerFiltersTestJSON-instance-1285518133',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1285518133',id=129,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:33:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='385766b9209941f3ab805e8d5e2af163',ramdisk_id='',reservation_id='r-hl6uzcrq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-542915701',owner_user_name='tempest-ListServerFiltersTestJSON-542915701-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:33:19Z,user_data=None,user_id='1c2fbed9aaf84b4e864db97bec4c797c',uuid=c5460257-b47c-4a1b-8e44-96ae657d6266,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "address": "fa:16:3e:67:e4:be", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap717a51c1-3d", "ovs_interfaceid": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.571 2 DEBUG nova.network.os_vif_util [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Converting VIF {"id": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "address": "fa:16:3e:67:e4:be", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap717a51c1-3d", "ovs_interfaceid": "717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.573 2 DEBUG nova.network.os_vif_util [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:e4:be,bridge_name='br-int',has_traffic_filtering=True,id=717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2,network=Network(9f3d344f-7e5f-4676-877b-da313e338dc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap717a51c1-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.574 2 DEBUG os_vif [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:e4:be,bridge_name='br-int',has_traffic_filtering=True,id=717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2,network=Network(9f3d344f-7e5f-4676-877b-da313e338dc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap717a51c1-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.576 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap717a51c1-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.583 2 INFO os_vif [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:e4:be,bridge_name='br-int',has_traffic_filtering=True,id=717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2,network=Network(9f3d344f-7e5f-4676-877b-da313e338dc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap717a51c1-3d')#033[00m
Oct  2 08:34:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:00.595 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dd3b2b2d-ba20-480f-a54b-742db41825bf]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9f3d344f-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 645733, 'tstamp': 645733}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 292903, 'error': None, 'target': 'ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9f3d344f-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 645736, 'tstamp': 645736}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 292903, 'error': None, 'target': 'ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:00.598 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f3d344f-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:00.600 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f3d344f-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:00.601 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:34:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:00.601 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f3d344f-70, col_values=(('external_ids', {'iface-id': '93989b20-c703-4abe-88be-5f6a3f1c5cdc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:00.602 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:34:00 np0005465988 nova_compute[236126]: 2025-10-02 12:34:00.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:01 np0005465988 nova_compute[236126]: 2025-10-02 12:34:01.476 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:02.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.487 2 INFO nova.virt.libvirt.driver [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Deleting instance files /var/lib/nova/instances/c5460257-b47c-4a1b-8e44-96ae657d6266_del#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.488 2 INFO nova.virt.libvirt.driver [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Deletion of /var/lib/nova/instances/c5460257-b47c-4a1b-8e44-96ae657d6266_del complete#033[00m
Oct  2 08:34:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:02.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.564 2 INFO nova.compute.manager [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Took 2.26 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.565 2 DEBUG oslo.service.loopingcall [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.566 2 DEBUG nova.compute.manager [-] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.566 2 DEBUG nova.network.neutron [-] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.640 2 DEBUG nova.compute.manager [req-78cd5ce1-e938-4c6c-88b9-ded554512c49 req-fbc02b68-0350-44f1-b08c-3a32408d32a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Received event network-vif-unplugged-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.641 2 DEBUG oslo_concurrency.lockutils [req-78cd5ce1-e938-4c6c-88b9-ded554512c49 req-fbc02b68-0350-44f1-b08c-3a32408d32a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.641 2 DEBUG oslo_concurrency.lockutils [req-78cd5ce1-e938-4c6c-88b9-ded554512c49 req-fbc02b68-0350-44f1-b08c-3a32408d32a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.642 2 DEBUG oslo_concurrency.lockutils [req-78cd5ce1-e938-4c6c-88b9-ded554512c49 req-fbc02b68-0350-44f1-b08c-3a32408d32a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.642 2 DEBUG nova.compute.manager [req-78cd5ce1-e938-4c6c-88b9-ded554512c49 req-fbc02b68-0350-44f1-b08c-3a32408d32a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] No waiting events found dispatching network-vif-unplugged-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.642 2 DEBUG nova.compute.manager [req-78cd5ce1-e938-4c6c-88b9-ded554512c49 req-fbc02b68-0350-44f1-b08c-3a32408d32a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Received event network-vif-unplugged-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.642 2 DEBUG nova.compute.manager [req-78cd5ce1-e938-4c6c-88b9-ded554512c49 req-fbc02b68-0350-44f1-b08c-3a32408d32a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Received event network-vif-plugged-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.643 2 DEBUG oslo_concurrency.lockutils [req-78cd5ce1-e938-4c6c-88b9-ded554512c49 req-fbc02b68-0350-44f1-b08c-3a32408d32a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.643 2 DEBUG oslo_concurrency.lockutils [req-78cd5ce1-e938-4c6c-88b9-ded554512c49 req-fbc02b68-0350-44f1-b08c-3a32408d32a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.643 2 DEBUG oslo_concurrency.lockutils [req-78cd5ce1-e938-4c6c-88b9-ded554512c49 req-fbc02b68-0350-44f1-b08c-3a32408d32a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.644 2 DEBUG nova.compute.manager [req-78cd5ce1-e938-4c6c-88b9-ded554512c49 req-fbc02b68-0350-44f1-b08c-3a32408d32a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] No waiting events found dispatching network-vif-plugged-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:02 np0005465988 nova_compute[236126]: 2025-10-02 12:34:02.644 2 WARNING nova.compute.manager [req-78cd5ce1-e938-4c6c-88b9-ded554512c49 req-fbc02b68-0350-44f1-b08c-3a32408d32a2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Received unexpected event network-vif-plugged-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:34:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:34:03 np0005465988 nova_compute[236126]: 2025-10-02 12:34:03.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:03 np0005465988 nova_compute[236126]: 2025-10-02 12:34:03.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:03 np0005465988 nova_compute[236126]: 2025-10-02 12:34:03.959 2 DEBUG nova.network.neutron [-] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:03 np0005465988 nova_compute[236126]: 2025-10-02 12:34:03.996 2 INFO nova.compute.manager [-] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Took 1.43 seconds to deallocate network for instance.#033[00m
Oct  2 08:34:04 np0005465988 nova_compute[236126]: 2025-10-02 12:34:04.052 2 DEBUG oslo_concurrency.lockutils [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:04 np0005465988 nova_compute[236126]: 2025-10-02 12:34:04.053 2 DEBUG oslo_concurrency.lockutils [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:04.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:04 np0005465988 nova_compute[236126]: 2025-10-02 12:34:04.150 2 DEBUG oslo_concurrency.processutils [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:04.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:34:04 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3713526240' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:34:04 np0005465988 nova_compute[236126]: 2025-10-02 12:34:04.631 2 DEBUG oslo_concurrency.processutils [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:04 np0005465988 nova_compute[236126]: 2025-10-02 12:34:04.639 2 DEBUG nova.compute.provider_tree [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:04 np0005465988 nova_compute[236126]: 2025-10-02 12:34:04.686 2 DEBUG nova.scheduler.client.report [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:04 np0005465988 nova_compute[236126]: 2025-10-02 12:34:04.758 2 DEBUG oslo_concurrency.lockutils [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.705s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:04 np0005465988 nova_compute[236126]: 2025-10-02 12:34:04.808 2 DEBUG nova.compute.manager [req-a46d72c7-dfe5-404f-a67a-e205365b029a req-0d4f7ba9-2f52-4968-a3bb-d5033b196ec9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Received event network-vif-deleted-717a51c1-3dbe-4f80-bea0-3a0b3fc47ac2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:04 np0005465988 nova_compute[236126]: 2025-10-02 12:34:04.809 2 INFO nova.scheduler.client.report [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Deleted allocations for instance c5460257-b47c-4a1b-8e44-96ae657d6266#033[00m
Oct  2 08:34:04 np0005465988 nova_compute[236126]: 2025-10-02 12:34:04.931 2 DEBUG oslo_concurrency.lockutils [None req-76817b35-feba-4073-ba47-a7b82cb92ce6 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "c5460257-b47c-4a1b-8e44-96ae657d6266" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.219 2 DEBUG oslo_concurrency.lockutils [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "5c61d077-c345-4b28-9942-624c141fc0a2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.220 2 DEBUG oslo_concurrency.lockutils [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "5c61d077-c345-4b28-9942-624c141fc0a2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.220 2 DEBUG oslo_concurrency.lockutils [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "5c61d077-c345-4b28-9942-624c141fc0a2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.220 2 DEBUG oslo_concurrency.lockutils [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "5c61d077-c345-4b28-9942-624c141fc0a2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.221 2 DEBUG oslo_concurrency.lockutils [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "5c61d077-c345-4b28-9942-624c141fc0a2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.222 2 INFO nova.compute.manager [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Terminating instance#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.223 2 DEBUG nova.compute.manager [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:34:05 np0005465988 kernel: tap61ec6de3-7b (unregistering): left promiscuous mode
Oct  2 08:34:05 np0005465988 NetworkManager[45041]: <info>  [1759408445.3317] device (tap61ec6de3-7b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:34:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:05Z|00600|binding|INFO|Releasing lport 61ec6de3-7b6b-4f24-bf93-ce21a666d398 from this chassis (sb_readonly=0)
Oct  2 08:34:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:05Z|00601|binding|INFO|Setting lport 61ec6de3-7b6b-4f24-bf93-ce21a666d398 down in Southbound
Oct  2 08:34:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:05Z|00602|binding|INFO|Removing iface tap61ec6de3-7b ovn-installed in OVS
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:05.351 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:5a:15 10.100.0.6'], port_security=['fa:16:3e:d2:5a:15 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5c61d077-c345-4b28-9942-624c141fc0a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f3d344f-7e5f-4676-877b-da313e338dc0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '385766b9209941f3ab805e8d5e2af163', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1389a46f-eb3b-49c0-bee4-ea4be4a55967', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d6cf2f8e-38d5-4acc-9afc-6fc6835becad, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=61ec6de3-7b6b-4f24-bf93-ce21a666d398) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:05.353 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 61ec6de3-7b6b-4f24-bf93-ce21a666d398 in datapath 9f3d344f-7e5f-4676-877b-da313e338dc0 unbound from our chassis#033[00m
Oct  2 08:34:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:05.355 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9f3d344f-7e5f-4676-877b-da313e338dc0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:34:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:05.356 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[274a6fc8-1552-410e-9fba-9a05c5a32aeb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:05.357 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0 namespace which is not needed anymore#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:05 np0005465988 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000080.scope: Deactivated successfully.
Oct  2 08:34:05 np0005465988 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000080.scope: Consumed 15.317s CPU time.
Oct  2 08:34:05 np0005465988 systemd-machined[192594]: Machine qemu-56-instance-00000080 terminated.
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.459 2 INFO nova.virt.libvirt.driver [-] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Instance destroyed successfully.#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.459 2 DEBUG nova.objects.instance [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lazy-loading 'resources' on Instance uuid 5c61d077-c345-4b28-9942-624c141fc0a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.486 2 DEBUG nova.virt.libvirt.vif [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:33:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1879466512',display_name='tempest-ListServerFiltersTestJSON-instance-1879466512',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1879466512',id=128,image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:33:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='385766b9209941f3ab805e8d5e2af163',ramdisk_id='',reservation_id='r-a5ez8lhs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-542915701',owner_user_name='tempest-ListServerFiltersTestJSON-542915701-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:33:17Z,user_data=None,user_id='1c2fbed9aaf84b4e864db97bec4c797c',uuid=5c61d077-c345-4b28-9942-624c141fc0a2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "address": "fa:16:3e:d2:5a:15", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61ec6de3-7b", "ovs_interfaceid": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.486 2 DEBUG nova.network.os_vif_util [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Converting VIF {"id": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "address": "fa:16:3e:d2:5a:15", "network": {"id": "9f3d344f-7e5f-4676-877b-da313e338dc0", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1391478832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "385766b9209941f3ab805e8d5e2af163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61ec6de3-7b", "ovs_interfaceid": "61ec6de3-7b6b-4f24-bf93-ce21a666d398", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.487 2 DEBUG nova.network.os_vif_util [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:5a:15,bridge_name='br-int',has_traffic_filtering=True,id=61ec6de3-7b6b-4f24-bf93-ce21a666d398,network=Network(9f3d344f-7e5f-4676-877b-da313e338dc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61ec6de3-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.487 2 DEBUG os_vif [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:5a:15,bridge_name='br-int',has_traffic_filtering=True,id=61ec6de3-7b6b-4f24-bf93-ce21a666d398,network=Network(9f3d344f-7e5f-4676-877b-da313e338dc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61ec6de3-7b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.489 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61ec6de3-7b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.497 2 INFO os_vif [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:5a:15,bridge_name='br-int',has_traffic_filtering=True,id=61ec6de3-7b6b-4f24-bf93-ce21a666d398,network=Network(9f3d344f-7e5f-4676-877b-da313e338dc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61ec6de3-7b')#033[00m
Oct  2 08:34:05 np0005465988 neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0[291811]: [NOTICE]   (291835) : haproxy version is 2.8.14-c23fe91
Oct  2 08:34:05 np0005465988 neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0[291811]: [NOTICE]   (291835) : path to executable is /usr/sbin/haproxy
Oct  2 08:34:05 np0005465988 neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0[291811]: [WARNING]  (291835) : Exiting Master process...
Oct  2 08:34:05 np0005465988 neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0[291811]: [ALERT]    (291835) : Current worker (291837) exited with code 143 (Terminated)
Oct  2 08:34:05 np0005465988 neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0[291811]: [WARNING]  (291835) : All workers exited. Exiting... (0)
Oct  2 08:34:05 np0005465988 systemd[1]: libpod-946b97132c68cc1489e1ba10af5c8d85f0acba6a96fb80e30c00ea350cbd80c9.scope: Deactivated successfully.
Oct  2 08:34:05 np0005465988 podman[292974]: 2025-10-02 12:34:05.517020577 +0000 UTC m=+0.064223088 container died 946b97132c68cc1489e1ba10af5c8d85f0acba6a96fb80e30c00ea350cbd80c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:34:05 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-946b97132c68cc1489e1ba10af5c8d85f0acba6a96fb80e30c00ea350cbd80c9-userdata-shm.mount: Deactivated successfully.
Oct  2 08:34:05 np0005465988 systemd[1]: var-lib-containers-storage-overlay-0bfe66de165e6c52ee2152b2e186d42f032af1b601ee4ba169de08b7de572689-merged.mount: Deactivated successfully.
Oct  2 08:34:05 np0005465988 podman[292974]: 2025-10-02 12:34:05.572096641 +0000 UTC m=+0.119299162 container cleanup 946b97132c68cc1489e1ba10af5c8d85f0acba6a96fb80e30c00ea350cbd80c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:34:05 np0005465988 systemd[1]: libpod-conmon-946b97132c68cc1489e1ba10af5c8d85f0acba6a96fb80e30c00ea350cbd80c9.scope: Deactivated successfully.
Oct  2 08:34:05 np0005465988 podman[293028]: 2025-10-02 12:34:05.644985868 +0000 UTC m=+0.050646038 container remove 946b97132c68cc1489e1ba10af5c8d85f0acba6a96fb80e30c00ea350cbd80c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:34:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:05.673 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6a786720-1f93-47a0-9bb8-2bf774a27464]: (4, ('Thu Oct  2 12:34:05 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0 (946b97132c68cc1489e1ba10af5c8d85f0acba6a96fb80e30c00ea350cbd80c9)\n946b97132c68cc1489e1ba10af5c8d85f0acba6a96fb80e30c00ea350cbd80c9\nThu Oct  2 12:34:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0 (946b97132c68cc1489e1ba10af5c8d85f0acba6a96fb80e30c00ea350cbd80c9)\n946b97132c68cc1489e1ba10af5c8d85f0acba6a96fb80e30c00ea350cbd80c9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:05.677 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[08f3aa01-fe8f-49c3-9a29-1619e111647e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:05.678 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f3d344f-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:05 np0005465988 kernel: tap9f3d344f-70: left promiscuous mode
Oct  2 08:34:05 np0005465988 nova_compute[236126]: 2025-10-02 12:34:05.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:05.710 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e747bce7-536f-4dc4-9c9d-643efa04df48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:05.734 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[021b822e-bc08-467b-998f-ed7458171b22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:05.736 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ce52f4c9-054d-431f-bbf3-e565ed52f4a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:05.756 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f35898b9-019e-47fd-b02f-a4952471e353]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 645710, 'reachable_time': 22899, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293042, 'error': None, 'target': 'ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:05.760 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9f3d344f-7e5f-4676-877b-da313e338dc0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:34:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:05.760 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[cfcb8c05-35f6-4a78-9685-3f6b0a424f5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:05 np0005465988 systemd[1]: run-netns-ovnmeta\x2d9f3d344f\x2d7e5f\x2d4676\x2d877b\x2dda313e338dc0.mount: Deactivated successfully.
Oct  2 08:34:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:06.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:06 np0005465988 nova_compute[236126]: 2025-10-02 12:34:06.450 2 INFO nova.virt.libvirt.driver [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Deleting instance files /var/lib/nova/instances/5c61d077-c345-4b28-9942-624c141fc0a2_del#033[00m
Oct  2 08:34:06 np0005465988 nova_compute[236126]: 2025-10-02 12:34:06.451 2 INFO nova.virt.libvirt.driver [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Deletion of /var/lib/nova/instances/5c61d077-c345-4b28-9942-624c141fc0a2_del complete#033[00m
Oct  2 08:34:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:06.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 08:34:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3601.0 total, 600.0 interval#012Cumulative writes: 48K writes, 195K keys, 48K commit groups, 1.0 writes per commit group, ingest: 0.19 GB, 0.05 MB/s#012Cumulative WAL: 48K writes, 17K syncs, 2.73 writes per sync, written: 0.19 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 10K writes, 43K keys, 10K commit groups, 1.0 writes per commit group, ingest: 44.54 MB, 0.07 MB/s#012Interval WAL: 10K writes, 4087 syncs, 2.64 writes per sync, written: 0.04 GB, 0.07 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct  2 08:34:06 np0005465988 nova_compute[236126]: 2025-10-02 12:34:06.567 2 INFO nova.compute.manager [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Took 1.34 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:34:06 np0005465988 nova_compute[236126]: 2025-10-02 12:34:06.567 2 DEBUG oslo.service.loopingcall [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:34:06 np0005465988 nova_compute[236126]: 2025-10-02 12:34:06.568 2 DEBUG nova.compute.manager [-] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:34:06 np0005465988 nova_compute[236126]: 2025-10-02 12:34:06.568 2 DEBUG nova.network.neutron [-] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:34:07 np0005465988 podman[293069]: 2025-10-02 12:34:07.200098417 +0000 UTC m=+0.078228211 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Oct  2 08:34:07 np0005465988 nova_compute[236126]: 2025-10-02 12:34:07.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:07 np0005465988 nova_compute[236126]: 2025-10-02 12:34:07.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:34:07 np0005465988 nova_compute[236126]: 2025-10-02 12:34:07.553 2 DEBUG nova.network.neutron [-] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:07 np0005465988 nova_compute[236126]: 2025-10-02 12:34:07.576 2 INFO nova.compute.manager [-] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Took 1.01 seconds to deallocate network for instance.#033[00m
Oct  2 08:34:07 np0005465988 nova_compute[236126]: 2025-10-02 12:34:07.651 2 DEBUG oslo_concurrency.lockutils [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:07 np0005465988 nova_compute[236126]: 2025-10-02 12:34:07.652 2 DEBUG oslo_concurrency.lockutils [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:07 np0005465988 nova_compute[236126]: 2025-10-02 12:34:07.657 2 DEBUG nova.compute.manager [req-a483169d-fd6c-491d-b2f8-a141fa1a89bf req-59259613-f508-4000-b79c-8ba0a5babe3c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Received event network-vif-deleted-61ec6de3-7b6b-4f24-bf93-ce21a666d398 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:07 np0005465988 nova_compute[236126]: 2025-10-02 12:34:07.701 2 DEBUG oslo_concurrency.processutils [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.067 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.068 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:34:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:08.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.086 2 DEBUG nova.compute.manager [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:34:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:34:08 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1341908670' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.167 2 DEBUG oslo_concurrency.processutils [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.171 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.178 2 DEBUG nova.compute.provider_tree [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.204 2 DEBUG nova.scheduler.client.report [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.224 2 DEBUG oslo_concurrency.lockutils [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.227 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.236 2 DEBUG nova.virt.hardware [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.237 2 INFO nova.compute.claims [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.256 2 INFO nova.scheduler.client.report [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Deleted allocations for instance 5c61d077-c345-4b28-9942-624c141fc0a2#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.336 2 DEBUG oslo_concurrency.lockutils [None req-6e2d50c2-70d8-44d4-bb8f-1ea8250c523a 1c2fbed9aaf84b4e864db97bec4c797c 385766b9209941f3ab805e8d5e2af163 - - default default] Lock "5c61d077-c345-4b28-9942-624c141fc0a2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.338 2 DEBUG oslo_concurrency.processutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:34:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:08.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:34:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:34:08 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/612066073' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.772 2 DEBUG oslo_concurrency.processutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.777 2 DEBUG nova.compute.provider_tree [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.803 2 DEBUG nova.scheduler.client.report [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.843 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.844 2 DEBUG nova.compute.manager [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.894 2 DEBUG nova.compute.manager [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.896 2 DEBUG nova.network.neutron [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.937 2 INFO nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:34:08 np0005465988 nova_compute[236126]: 2025-10-02 12:34:08.963 2 DEBUG nova.compute.manager [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.085 2 DEBUG nova.compute.manager [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.087 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.087 2 INFO nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Creating image(s)#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.129 2 DEBUG nova.storage.rbd_utils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.181 2 DEBUG nova.storage.rbd_utils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.223 2 DEBUG nova.storage.rbd_utils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.228 2 DEBUG oslo_concurrency.processutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.295 2 DEBUG nova.policy [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bcd36ab668f449959719ba7058f25e72', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1a05e525420b4aa8adcc9561158e73d1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.335 2 DEBUG oslo_concurrency.processutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.107s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.336 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.337 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.337 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.378 2 DEBUG nova.storage.rbd_utils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.383 2 DEBUG oslo_concurrency.processutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.490 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.491 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.491 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.514 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:34:09 np0005465988 nova_compute[236126]: 2025-10-02 12:34:09.515 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:34:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:10.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:10 np0005465988 nova_compute[236126]: 2025-10-02 12:34:10.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:10.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:10 np0005465988 nova_compute[236126]: 2025-10-02 12:34:10.696 2 DEBUG nova.network.neutron [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Successfully created port: c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:34:11 np0005465988 nova_compute[236126]: 2025-10-02 12:34:11.197 2 DEBUG oslo_concurrency.processutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.814s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:11 np0005465988 nova_compute[236126]: 2025-10-02 12:34:11.319 2 DEBUG nova.storage.rbd_utils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] resizing rbd image 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:34:11 np0005465988 nova_compute[236126]: 2025-10-02 12:34:11.909 2 DEBUG nova.objects.instance [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'migration_context' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:12 np0005465988 nova_compute[236126]: 2025-10-02 12:34:12.031 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:34:12 np0005465988 nova_compute[236126]: 2025-10-02 12:34:12.032 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Ensure instance console log exists: /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:34:12 np0005465988 nova_compute[236126]: 2025-10-02 12:34:12.033 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:12 np0005465988 nova_compute[236126]: 2025-10-02 12:34:12.033 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:12 np0005465988 nova_compute[236126]: 2025-10-02 12:34:12.034 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:12.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:12 np0005465988 nova_compute[236126]: 2025-10-02 12:34:12.406 2 DEBUG nova.network.neutron [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Successfully updated port: c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:34:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:34:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:12.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:34:12 np0005465988 nova_compute[236126]: 2025-10-02 12:34:12.563 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:12 np0005465988 nova_compute[236126]: 2025-10-02 12:34:12.563 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquired lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:12 np0005465988 nova_compute[236126]: 2025-10-02 12:34:12.563 2 DEBUG nova.network.neutron [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:34:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:34:13 np0005465988 nova_compute[236126]: 2025-10-02 12:34:13.888 2 DEBUG nova.network.neutron [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:34:13 np0005465988 nova_compute[236126]: 2025-10-02 12:34:13.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:13 np0005465988 nova_compute[236126]: 2025-10-02 12:34:13.973 2 DEBUG nova.compute.manager [req-833ccd0c-ba5b-4f6b-82ba-98a7d41e8922 req-3ac894fc-2ea9-4413-95e2-8aea64a5f1ab d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received event network-changed-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:13 np0005465988 nova_compute[236126]: 2025-10-02 12:34:13.974 2 DEBUG nova.compute.manager [req-833ccd0c-ba5b-4f6b-82ba-98a7d41e8922 req-3ac894fc-2ea9-4413-95e2-8aea64a5f1ab d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Refreshing instance network info cache due to event network-changed-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:34:13 np0005465988 nova_compute[236126]: 2025-10-02 12:34:13.974 2 DEBUG oslo_concurrency.lockutils [req-833ccd0c-ba5b-4f6b-82ba-98a7d41e8922 req-3ac894fc-2ea9-4413-95e2-8aea64a5f1ab d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:14.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:14.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.543 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408440.5428793, c5460257-b47c-4a1b-8e44-96ae657d6266 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.544 2 INFO nova.compute.manager [-] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.582 2 DEBUG nova.network.neutron [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updating instance_info_cache with network_info: [{"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.662 2 DEBUG nova.compute.manager [None req-4a4bc12b-cfaf-4f86-8778-0542284f626c - - - - - -] [instance: c5460257-b47c-4a1b-8e44-96ae657d6266] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.890 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Releasing lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.891 2 DEBUG nova.compute.manager [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Instance network_info: |[{"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.892 2 DEBUG oslo_concurrency.lockutils [req-833ccd0c-ba5b-4f6b-82ba-98a7d41e8922 req-3ac894fc-2ea9-4413-95e2-8aea64a5f1ab d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.893 2 DEBUG nova.network.neutron [req-833ccd0c-ba5b-4f6b-82ba-98a7d41e8922 req-3ac894fc-2ea9-4413-95e2-8aea64a5f1ab d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Refreshing network info cache for port c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.897 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Start _get_guest_xml network_info=[{"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.903 2 WARNING nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.908 2 DEBUG nova.virt.libvirt.host [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.909 2 DEBUG nova.virt.libvirt.host [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.929 2 DEBUG nova.virt.libvirt.host [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.930 2 DEBUG nova.virt.libvirt.host [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.932 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.933 2 DEBUG nova.virt.hardware [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.934 2 DEBUG nova.virt.hardware [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.934 2 DEBUG nova.virt.hardware [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.935 2 DEBUG nova.virt.hardware [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.935 2 DEBUG nova.virt.hardware [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.936 2 DEBUG nova.virt.hardware [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.937 2 DEBUG nova.virt.hardware [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.937 2 DEBUG nova.virt.hardware [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.938 2 DEBUG nova.virt.hardware [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.939 2 DEBUG nova.virt.hardware [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.939 2 DEBUG nova.virt.hardware [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:34:15 np0005465988 nova_compute[236126]: 2025-10-02 12:34:15.944 2 DEBUG oslo_concurrency.processutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:16.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:34:16 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1936495245' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:34:16 np0005465988 nova_compute[236126]: 2025-10-02 12:34:16.448 2 DEBUG oslo_concurrency.processutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:16 np0005465988 nova_compute[236126]: 2025-10-02 12:34:16.482 2 DEBUG nova.storage.rbd_utils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:16 np0005465988 nova_compute[236126]: 2025-10-02 12:34:16.488 2 DEBUG oslo_concurrency.processutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:16.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:34:16 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1760207176' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:34:16 np0005465988 nova_compute[236126]: 2025-10-02 12:34:16.928 2 DEBUG oslo_concurrency.processutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:16 np0005465988 nova_compute[236126]: 2025-10-02 12:34:16.931 2 DEBUG nova.virt.libvirt.vif [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:34:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-639305243',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-639305243',id=130,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH7i72qZf0LyRp/akt/bIu4snLJg8XuAUIsHpF3xOK1XlpVLYZ/YFzz7wr2QY5za8QZBy0/Efb6X+c12F9Zi3EqjS+0mqhH0nerFk7xvdGE6zlwRcwJDWaW/qlypPLaWbQ==',key_name='tempest-keypair-1186187448',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a05e525420b4aa8adcc9561158e73d1',ramdisk_id='',reservation_id='r-tje0hz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-405673070',owner_user_name='tempest-AttachVolumeShelveTestJSON-405673070-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:34:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bcd36ab668f449959719ba7058f25e72',uuid=8736e2a4-70c8-46c1-8ce5-ff68395a22c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:34:16 np0005465988 nova_compute[236126]: 2025-10-02 12:34:16.931 2 DEBUG nova.network.os_vif_util [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converting VIF {"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:16 np0005465988 nova_compute[236126]: 2025-10-02 12:34:16.932 2 DEBUG nova.network.os_vif_util [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:a3:24,bridge_name='br-int',has_traffic_filtering=True,id=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9dd6bc4-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:16 np0005465988 nova_compute[236126]: 2025-10-02 12:34:16.933 2 DEBUG nova.objects.instance [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.091 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  <uuid>8736e2a4-70c8-46c1-8ce5-ff68395a22c9</uuid>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  <name>instance-00000082</name>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-639305243</nova:name>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:34:15</nova:creationTime>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <nova:user uuid="bcd36ab668f449959719ba7058f25e72">tempest-AttachVolumeShelveTestJSON-405673070-project-member</nova:user>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <nova:project uuid="1a05e525420b4aa8adcc9561158e73d1">tempest-AttachVolumeShelveTestJSON-405673070</nova:project>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <nova:port uuid="c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <entry name="serial">8736e2a4-70c8-46c1-8ce5-ff68395a22c9</entry>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <entry name="uuid">8736e2a4-70c8-46c1-8ce5-ff68395a22c9</entry>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk.config">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:00:a3:24"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <target dev="tapc9dd6bc4-09"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/console.log" append="off"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:34:17 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:34:17 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:34:17 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:34:17 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.093 2 DEBUG nova.compute.manager [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Preparing to wait for external event network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.094 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.094 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.095 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.097 2 DEBUG nova.virt.libvirt.vif [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:34:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-639305243',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-639305243',id=130,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH7i72qZf0LyRp/akt/bIu4snLJg8XuAUIsHpF3xOK1XlpVLYZ/YFzz7wr2QY5za8QZBy0/Efb6X+c12F9Zi3EqjS+0mqhH0nerFk7xvdGE6zlwRcwJDWaW/qlypPLaWbQ==',key_name='tempest-keypair-1186187448',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a05e525420b4aa8adcc9561158e73d1',ramdisk_id='',reservation_id='r-tje0hz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-405673070',owner_user_name='tempest-AttachVolumeShelveTestJSON-405673070-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:34:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bcd36ab668f449959719ba7058f25e72',uuid=8736e2a4-70c8-46c1-8ce5-ff68395a22c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.097 2 DEBUG nova.network.os_vif_util [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converting VIF {"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.098 2 DEBUG nova.network.os_vif_util [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:a3:24,bridge_name='br-int',has_traffic_filtering=True,id=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9dd6bc4-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.098 2 DEBUG os_vif [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:a3:24,bridge_name='br-int',has_traffic_filtering=True,id=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9dd6bc4-09') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.100 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.100 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.105 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc9dd6bc4-09, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.106 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc9dd6bc4-09, col_values=(('external_ids', {'iface-id': 'c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:00:a3:24', 'vm-uuid': '8736e2a4-70c8-46c1-8ce5-ff68395a22c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:17 np0005465988 NetworkManager[45041]: <info>  [1759408457.1090] manager: (tapc9dd6bc4-09): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/265)
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.116 2 INFO os_vif [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:a3:24,bridge_name='br-int',has_traffic_filtering=True,id=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9dd6bc4-09')#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.484 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.486 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.486 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No VIF found with MAC fa:16:3e:00:a3:24, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.486 2 INFO nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Using config drive#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.517 2 DEBUG nova.storage.rbd_utils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.713 2 DEBUG nova.network.neutron [req-833ccd0c-ba5b-4f6b-82ba-98a7d41e8922 req-3ac894fc-2ea9-4413-95e2-8aea64a5f1ab d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updated VIF entry in instance network info cache for port c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.714 2 DEBUG nova.network.neutron [req-833ccd0c-ba5b-4f6b-82ba-98a7d41e8922 req-3ac894fc-2ea9-4413-95e2-8aea64a5f1ab d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updating instance_info_cache with network_info: [{"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:17 np0005465988 nova_compute[236126]: 2025-10-02 12:34:17.844 2 DEBUG oslo_concurrency.lockutils [req-833ccd0c-ba5b-4f6b-82ba-98a7d41e8922 req-3ac894fc-2ea9-4413-95e2-8aea64a5f1ab d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:34:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:18.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:34:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:34:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:18.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:34:18 np0005465988 nova_compute[236126]: 2025-10-02 12:34:18.529 2 INFO nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Creating config drive at /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/disk.config#033[00m
Oct  2 08:34:18 np0005465988 nova_compute[236126]: 2025-10-02 12:34:18.541 2 DEBUG oslo_concurrency.processutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7v3fkjpa execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:18 np0005465988 nova_compute[236126]: 2025-10-02 12:34:18.689 2 DEBUG oslo_concurrency.processutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7v3fkjpa" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:18 np0005465988 nova_compute[236126]: 2025-10-02 12:34:18.738 2 DEBUG nova.storage.rbd_utils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:18 np0005465988 nova_compute[236126]: 2025-10-02 12:34:18.744 2 DEBUG oslo_concurrency.processutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/disk.config 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:18 np0005465988 nova_compute[236126]: 2025-10-02 12:34:18.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:18 np0005465988 nova_compute[236126]: 2025-10-02 12:34:18.943 2 DEBUG oslo_concurrency.processutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/disk.config 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.199s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:18 np0005465988 nova_compute[236126]: 2025-10-02 12:34:18.944 2 INFO nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Deleting local config drive /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/disk.config because it was imported into RBD.#033[00m
Oct  2 08:34:19 np0005465988 kernel: tapc9dd6bc4-09: entered promiscuous mode
Oct  2 08:34:19 np0005465988 NetworkManager[45041]: <info>  [1759408459.0127] manager: (tapc9dd6bc4-09): new Tun device (/org/freedesktop/NetworkManager/Devices/266)
Oct  2 08:34:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:19Z|00603|binding|INFO|Claiming lport c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b for this chassis.
Oct  2 08:34:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:19Z|00604|binding|INFO|c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b: Claiming fa:16:3e:00:a3:24 10.100.0.11
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:19 np0005465988 systemd-udevd[293465]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.054 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:a3:24 10.100.0.11'], port_security=['fa:16:3e:00:a3:24 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '8736e2a4-70c8-46c1-8ce5-ff68395a22c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a05e525420b4aa8adcc9561158e73d1', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cd7967e6-b4ee-4d94-ab54-c08775c150e7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=709db70f-1209-49b9-bf90-2b91d986925d, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.056 142124 INFO neutron.agent.ovn.metadata.agent [-] Port c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b in datapath 7b216831-24ac-41f0-ac1c-99aae9bc897b bound to our chassis#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.059 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b216831-24ac-41f0-ac1c-99aae9bc897b#033[00m
Oct  2 08:34:19 np0005465988 systemd-machined[192594]: New machine qemu-59-instance-00000082.
Oct  2 08:34:19 np0005465988 NetworkManager[45041]: <info>  [1759408459.0749] device (tapc9dd6bc4-09): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.074 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[012c5eaa-ff0d-4f72-9013-ba3ccbe6596a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.075 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b216831-21 in ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:34:19 np0005465988 NetworkManager[45041]: <info>  [1759408459.0766] device (tapc9dd6bc4-09): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.079 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b216831-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.079 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d2791eb7-4982-469b-b0a3-2a157c38b1a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.080 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4736e372-6581-4f4f-a240-2df03a9feb03]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 systemd[1]: Started Virtual Machine qemu-59-instance-00000082.
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.094 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[ecb169ce-a9ed-43c2-a53f-46fe5d000f18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.117 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6903b56f-748d-4aa4-ae0d-01f01f0d4dbe]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:19Z|00605|binding|INFO|Setting lport c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b ovn-installed in OVS
Oct  2 08:34:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:19Z|00606|binding|INFO|Setting lport c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b up in Southbound
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.155 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4824a40f-b6fa-4356-91e6-fd7306fd1670]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.161 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2feb2b43-e431-405d-9d39-9dce43f326d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 systemd-udevd[293469]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:34:19 np0005465988 NetworkManager[45041]: <info>  [1759408459.1627] manager: (tap7b216831-20): new Veth device (/org/freedesktop/NetworkManager/Devices/267)
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.207 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c059334e-aa85-40fc-97db-73f604c064fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.211 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d05c468b-1808-40dd-8d28-4bf5673d3e77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 NetworkManager[45041]: <info>  [1759408459.2408] device (tap7b216831-20): carrier: link connected
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.249 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[771f52a9-8750-4528-ba0d-bf8dda3f3eb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.274 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ad124bef-ae9c-4246-be5b-9613354ad781]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b216831-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:a4:15'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 175], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651957, 'reachable_time': 29866, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293499, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.298 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0b46a6bd-483f-48ee-b767-ca9e0df14aa1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:a415'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 651957, 'tstamp': 651957}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 293500, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.327 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e03b260a-3659-4baa-b675-335a03ab3ddb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b216831-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:a4:15'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 175], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651957, 'reachable_time': 29866, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 293501, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.373 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[81de8614-8106-4bc3-8938-0840b7ed2bd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.443 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[071bdfd4-638b-439b-bf14-eef36e27948a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.446 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b216831-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.446 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.446 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b216831-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:19 np0005465988 kernel: tap7b216831-20: entered promiscuous mode
Oct  2 08:34:19 np0005465988 NetworkManager[45041]: <info>  [1759408459.4502] manager: (tap7b216831-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/268)
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.453 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b216831-20, col_values=(('external_ids', {'iface-id': '7b6901ce-64cc-402d-847e-45c0d79bbb3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:19Z|00607|binding|INFO|Releasing lport 7b6901ce-64cc-402d-847e-45c0d79bbb3b from this chassis (sb_readonly=0)
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.476 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b216831-24ac-41f0-ac1c-99aae9bc897b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b216831-24ac-41f0-ac1c-99aae9bc897b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.477 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0f1e8f2f-a9f3-4033-b64c-9d4a7fb788b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.478 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-7b216831-24ac-41f0-ac1c-99aae9bc897b
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/7b216831-24ac-41f0-ac1c-99aae9bc897b.pid.haproxy
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 7b216831-24ac-41f0-ac1c-99aae9bc897b
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:34:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:19.479 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'env', 'PROCESS_TAG=haproxy-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b216831-24ac-41f0-ac1c-99aae9bc897b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.529 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:19Z|00608|binding|INFO|Releasing lport 7b6901ce-64cc-402d-847e-45c0d79bbb3b from this chassis (sb_readonly=0)
Oct  2 08:34:19 np0005465988 podman[293575]: 2025-10-02 12:34:19.89230454 +0000 UTC m=+0.064786144 container create acafb5a43a9b2058ea7a44b8b6bc45bcdc1e399fa19dd46e539b8505e90743db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.906 2 DEBUG nova.compute.manager [req-4357da38-5688-49a1-8f54-e2e4069b55eb req-233473aa-f3e7-4659-a562-a980a503abb6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received event network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.907 2 DEBUG oslo_concurrency.lockutils [req-4357da38-5688-49a1-8f54-e2e4069b55eb req-233473aa-f3e7-4659-a562-a980a503abb6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.907 2 DEBUG oslo_concurrency.lockutils [req-4357da38-5688-49a1-8f54-e2e4069b55eb req-233473aa-f3e7-4659-a562-a980a503abb6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.907 2 DEBUG oslo_concurrency.lockutils [req-4357da38-5688-49a1-8f54-e2e4069b55eb req-233473aa-f3e7-4659-a562-a980a503abb6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.907 2 DEBUG nova.compute.manager [req-4357da38-5688-49a1-8f54-e2e4069b55eb req-233473aa-f3e7-4659-a562-a980a503abb6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Processing event network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:34:19 np0005465988 nova_compute[236126]: 2025-10-02 12:34:19.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:19 np0005465988 systemd[1]: Started libpod-conmon-acafb5a43a9b2058ea7a44b8b6bc45bcdc1e399fa19dd46e539b8505e90743db.scope.
Oct  2 08:34:19 np0005465988 podman[293575]: 2025-10-02 12:34:19.859579609 +0000 UTC m=+0.032061243 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:34:19 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:34:19 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f17edf508c213b2dc6412910d807ce0d2a5200c66d5f31f89a9482a26e3add7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:34:19 np0005465988 podman[293575]: 2025-10-02 12:34:19.97887003 +0000 UTC m=+0.151351654 container init acafb5a43a9b2058ea7a44b8b6bc45bcdc1e399fa19dd46e539b8505e90743db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:34:19 np0005465988 podman[293575]: 2025-10-02 12:34:19.989901708 +0000 UTC m=+0.162383312 container start acafb5a43a9b2058ea7a44b8b6bc45bcdc1e399fa19dd46e539b8505e90743db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:34:20 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[293589]: [NOTICE]   (293593) : New worker (293595) forked
Oct  2 08:34:20 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[293589]: [NOTICE]   (293593) : Loading success.
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.037 2 DEBUG nova.compute.manager [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.038 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408460.03725, 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.038 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] VM Started (Lifecycle Event)#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.042 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.045 2 INFO nova.virt.libvirt.driver [-] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Instance spawned successfully.#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.046 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.071 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.075 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.075 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.076 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.076 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.076 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.077 2 DEBUG nova.virt.libvirt.driver [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.082 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:34:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:20.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.112 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.113 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408460.042829, 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.113 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.139 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.148 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408460.0429335, 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.148 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.158 2 INFO nova.compute.manager [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Took 11.07 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.158 2 DEBUG nova.compute.manager [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.317 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.325 2 INFO nova.compute.manager [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Took 12.19 seconds to build instance.#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.328 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.359 2 DEBUG oslo_concurrency.lockutils [None req-ba69fd4b-13cd-4129-bc4e-6e6458bdb501 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.456 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408445.45592, 5c61d077-c345-4b28-9942-624c141fc0a2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.457 2 INFO nova.compute.manager [-] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:34:20 np0005465988 nova_compute[236126]: 2025-10-02 12:34:20.479 2 DEBUG nova.compute.manager [None req-98608515-bc89-4e92-bbd5-ec9622412e8b - - - - - -] [instance: 5c61d077-c345-4b28-9942-624c141fc0a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:20.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:22.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:22 np0005465988 nova_compute[236126]: 2025-10-02 12:34:22.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:22.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:22 np0005465988 nova_compute[236126]: 2025-10-02 12:34:22.746 2 DEBUG nova.compute.manager [req-e2a024e1-27a8-4817-88f4-f6c2ba68fb26 req-28f4679a-e80a-42ee-8f8e-d5339db18b5b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received event network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:22 np0005465988 nova_compute[236126]: 2025-10-02 12:34:22.746 2 DEBUG oslo_concurrency.lockutils [req-e2a024e1-27a8-4817-88f4-f6c2ba68fb26 req-28f4679a-e80a-42ee-8f8e-d5339db18b5b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:22 np0005465988 nova_compute[236126]: 2025-10-02 12:34:22.747 2 DEBUG oslo_concurrency.lockutils [req-e2a024e1-27a8-4817-88f4-f6c2ba68fb26 req-28f4679a-e80a-42ee-8f8e-d5339db18b5b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:22 np0005465988 nova_compute[236126]: 2025-10-02 12:34:22.747 2 DEBUG oslo_concurrency.lockutils [req-e2a024e1-27a8-4817-88f4-f6c2ba68fb26 req-28f4679a-e80a-42ee-8f8e-d5339db18b5b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:22 np0005465988 nova_compute[236126]: 2025-10-02 12:34:22.747 2 DEBUG nova.compute.manager [req-e2a024e1-27a8-4817-88f4-f6c2ba68fb26 req-28f4679a-e80a-42ee-8f8e-d5339db18b5b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] No waiting events found dispatching network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:22 np0005465988 nova_compute[236126]: 2025-10-02 12:34:22.748 2 WARNING nova.compute.manager [req-e2a024e1-27a8-4817-88f4-f6c2ba68fb26 req-28f4679a-e80a-42ee-8f8e-d5339db18b5b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received unexpected event network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:34:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:34:23 np0005465988 nova_compute[236126]: 2025-10-02 12:34:23.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:24.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:24 np0005465988 nova_compute[236126]: 2025-10-02 12:34:24.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:24 np0005465988 NetworkManager[45041]: <info>  [1759408464.3473] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/269)
Oct  2 08:34:24 np0005465988 NetworkManager[45041]: <info>  [1759408464.3481] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Oct  2 08:34:24 np0005465988 nova_compute[236126]: 2025-10-02 12:34:24.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:24 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:24Z|00609|binding|INFO|Releasing lport 7b6901ce-64cc-402d-847e-45c0d79bbb3b from this chassis (sb_readonly=0)
Oct  2 08:34:24 np0005465988 nova_compute[236126]: 2025-10-02 12:34:24.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:34:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:24.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:34:25 np0005465988 nova_compute[236126]: 2025-10-02 12:34:25.279 2 DEBUG nova.compute.manager [req-01aec323-1411-4c30-8e3e-adf79ab41ecd req-99c8ac3a-1fe6-46c4-be8f-a50677708d29 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received event network-changed-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:25 np0005465988 nova_compute[236126]: 2025-10-02 12:34:25.280 2 DEBUG nova.compute.manager [req-01aec323-1411-4c30-8e3e-adf79ab41ecd req-99c8ac3a-1fe6-46c4-be8f-a50677708d29 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Refreshing instance network info cache due to event network-changed-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:34:25 np0005465988 nova_compute[236126]: 2025-10-02 12:34:25.280 2 DEBUG oslo_concurrency.lockutils [req-01aec323-1411-4c30-8e3e-adf79ab41ecd req-99c8ac3a-1fe6-46c4-be8f-a50677708d29 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:25 np0005465988 nova_compute[236126]: 2025-10-02 12:34:25.280 2 DEBUG oslo_concurrency.lockutils [req-01aec323-1411-4c30-8e3e-adf79ab41ecd req-99c8ac3a-1fe6-46c4-be8f-a50677708d29 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:25 np0005465988 nova_compute[236126]: 2025-10-02 12:34:25.281 2 DEBUG nova.network.neutron [req-01aec323-1411-4c30-8e3e-adf79ab41ecd req-99c8ac3a-1fe6-46c4-be8f-a50677708d29 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Refreshing network info cache for port c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:34:25 np0005465988 podman[293609]: 2025-10-02 12:34:25.543907616 +0000 UTC m=+0.070664413 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  2 08:34:25 np0005465988 podman[293610]: 2025-10-02 12:34:25.554483721 +0000 UTC m=+0.075602336 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Oct  2 08:34:25 np0005465988 podman[293608]: 2025-10-02 12:34:25.585263496 +0000 UTC m=+0.111720654 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:34:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:34:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:26.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:34:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:34:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:26.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:34:27 np0005465988 nova_compute[236126]: 2025-10-02 12:34:27.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:27.363 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:27.364 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:27.364 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:34:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:28.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:34:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:34:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:28.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:28 np0005465988 nova_compute[236126]: 2025-10-02 12:34:28.677 2 DEBUG nova.network.neutron [req-01aec323-1411-4c30-8e3e-adf79ab41ecd req-99c8ac3a-1fe6-46c4-be8f-a50677708d29 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updated VIF entry in instance network info cache for port c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:34:28 np0005465988 nova_compute[236126]: 2025-10-02 12:34:28.678 2 DEBUG nova.network.neutron [req-01aec323-1411-4c30-8e3e-adf79ab41ecd req-99c8ac3a-1fe6-46c4-be8f-a50677708d29 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updating instance_info_cache with network_info: [{"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:28 np0005465988 nova_compute[236126]: 2025-10-02 12:34:28.705 2 DEBUG oslo_concurrency.lockutils [req-01aec323-1411-4c30-8e3e-adf79ab41ecd req-99c8ac3a-1fe6-46c4-be8f-a50677708d29 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:34:28 np0005465988 nova_compute[236126]: 2025-10-02 12:34:28.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:34:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:30.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:34:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:30.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:31 np0005465988 nova_compute[236126]: 2025-10-02 12:34:31.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:32.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:32 np0005465988 nova_compute[236126]: 2025-10-02 12:34:32.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:32.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:32 np0005465988 nova_compute[236126]: 2025-10-02 12:34:32.778 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "a4df176d-5fef-490e-8cee-3424098213f5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:32 np0005465988 nova_compute[236126]: 2025-10-02 12:34:32.779 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:32 np0005465988 nova_compute[236126]: 2025-10-02 12:34:32.799 2 DEBUG nova.compute.manager [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:34:32 np0005465988 nova_compute[236126]: 2025-10-02 12:34:32.938 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:32 np0005465988 nova_compute[236126]: 2025-10-02 12:34:32.939 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:32 np0005465988 nova_compute[236126]: 2025-10-02 12:34:32.950 2 DEBUG nova.virt.hardware [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:34:32 np0005465988 nova_compute[236126]: 2025-10-02 12:34:32.951 2 INFO nova.compute.claims [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.072 2 DEBUG oslo_concurrency.processutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:34:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:34:33 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3941843408' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.589 2 DEBUG oslo_concurrency.processutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.596 2 DEBUG nova.compute.provider_tree [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.653 2 DEBUG nova.scheduler.client.report [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.689 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.750s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.690 2 DEBUG nova.compute.manager [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.758 2 DEBUG nova.compute.manager [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.759 2 DEBUG nova.network.neutron [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:34:33 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:33Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:00:a3:24 10.100.0.11
Oct  2 08:34:33 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:33Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:00:a3:24 10.100.0.11
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.788 2 INFO nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.830 2 DEBUG nova.compute.manager [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.923 2 DEBUG nova.compute.manager [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.924 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.925 2 INFO nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Creating image(s)#033[00m
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.950 2 DEBUG nova.storage.rbd_utils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image a4df176d-5fef-490e-8cee-3424098213f5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:33 np0005465988 nova_compute[236126]: 2025-10-02 12:34:33.982 2 DEBUG nova.storage.rbd_utils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image a4df176d-5fef-490e-8cee-3424098213f5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:34 np0005465988 nova_compute[236126]: 2025-10-02 12:34:34.018 2 DEBUG nova.storage.rbd_utils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image a4df176d-5fef-490e-8cee-3424098213f5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:34 np0005465988 nova_compute[236126]: 2025-10-02 12:34:34.024 2 DEBUG oslo_concurrency.processutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:34 np0005465988 nova_compute[236126]: 2025-10-02 12:34:34.071 2 DEBUG nova.policy [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fe9cc788734f406d826446a848700331', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bc0d63d3b4404ef8858166e8836dd0af', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:34:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:34.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:34 np0005465988 nova_compute[236126]: 2025-10-02 12:34:34.112 2 DEBUG oslo_concurrency.processutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:34 np0005465988 nova_compute[236126]: 2025-10-02 12:34:34.113 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:34 np0005465988 nova_compute[236126]: 2025-10-02 12:34:34.114 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:34 np0005465988 nova_compute[236126]: 2025-10-02 12:34:34.114 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:34 np0005465988 nova_compute[236126]: 2025-10-02 12:34:34.147 2 DEBUG nova.storage.rbd_utils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image a4df176d-5fef-490e-8cee-3424098213f5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:34 np0005465988 nova_compute[236126]: 2025-10-02 12:34:34.153 2 DEBUG oslo_concurrency.processutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 a4df176d-5fef-490e-8cee-3424098213f5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:34 np0005465988 nova_compute[236126]: 2025-10-02 12:34:34.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:34.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:34 np0005465988 nova_compute[236126]: 2025-10-02 12:34:34.957 2 DEBUG oslo_concurrency.processutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 a4df176d-5fef-490e-8cee-3424098213f5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.804s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:35 np0005465988 nova_compute[236126]: 2025-10-02 12:34:35.066 2 DEBUG nova.storage.rbd_utils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] resizing rbd image a4df176d-5fef-490e-8cee-3424098213f5_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:34:35 np0005465988 nova_compute[236126]: 2025-10-02 12:34:35.191 2 DEBUG nova.objects.instance [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lazy-loading 'migration_context' on Instance uuid a4df176d-5fef-490e-8cee-3424098213f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:35 np0005465988 nova_compute[236126]: 2025-10-02 12:34:35.207 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:34:35 np0005465988 nova_compute[236126]: 2025-10-02 12:34:35.208 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Ensure instance console log exists: /var/lib/nova/instances/a4df176d-5fef-490e-8cee-3424098213f5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:34:35 np0005465988 nova_compute[236126]: 2025-10-02 12:34:35.209 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:35 np0005465988 nova_compute[236126]: 2025-10-02 12:34:35.210 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:35 np0005465988 nova_compute[236126]: 2025-10-02 12:34:35.210 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:35 np0005465988 nova_compute[236126]: 2025-10-02 12:34:35.219 2 DEBUG nova.network.neutron [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Successfully created port: 7079d58e-139e-4183-9126-02a8a7b45012 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:34:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:36.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:36.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:37 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #53. Immutable memtables: 9.
Oct  2 08:34:37 np0005465988 nova_compute[236126]: 2025-10-02 12:34:37.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:37 np0005465988 podman[293915]: 2025-10-02 12:34:37.522282479 +0000 UTC m=+0.058298918 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent)
Oct  2 08:34:37 np0005465988 nova_compute[236126]: 2025-10-02 12:34:37.586 2 DEBUG nova.network.neutron [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Successfully updated port: 7079d58e-139e-4183-9126-02a8a7b45012 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:34:37 np0005465988 nova_compute[236126]: 2025-10-02 12:34:37.604 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "refresh_cache-a4df176d-5fef-490e-8cee-3424098213f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:37 np0005465988 nova_compute[236126]: 2025-10-02 12:34:37.604 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquired lock "refresh_cache-a4df176d-5fef-490e-8cee-3424098213f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:37 np0005465988 nova_compute[236126]: 2025-10-02 12:34:37.604 2 DEBUG nova.network.neutron [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:34:37 np0005465988 nova_compute[236126]: 2025-10-02 12:34:37.708 2 DEBUG nova.compute.manager [req-fa5657b3-ad13-4fcf-8f0b-7f61e4f06082 req-52f45b8b-4e32-48bc-9eee-da08420d4589 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Received event network-changed-7079d58e-139e-4183-9126-02a8a7b45012 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:37 np0005465988 nova_compute[236126]: 2025-10-02 12:34:37.708 2 DEBUG nova.compute.manager [req-fa5657b3-ad13-4fcf-8f0b-7f61e4f06082 req-52f45b8b-4e32-48bc-9eee-da08420d4589 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Refreshing instance network info cache due to event network-changed-7079d58e-139e-4183-9126-02a8a7b45012. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:34:37 np0005465988 nova_compute[236126]: 2025-10-02 12:34:37.709 2 DEBUG oslo_concurrency.lockutils [req-fa5657b3-ad13-4fcf-8f0b-7f61e4f06082 req-52f45b8b-4e32-48bc-9eee-da08420d4589 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-a4df176d-5fef-490e-8cee-3424098213f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:34:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:38.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:38 np0005465988 nova_compute[236126]: 2025-10-02 12:34:38.230 2 DEBUG nova.network.neutron [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:34:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:38.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:38 np0005465988 nova_compute[236126]: 2025-10-02 12:34:38.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.576 2 DEBUG nova.network.neutron [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Updating instance_info_cache with network_info: [{"id": "7079d58e-139e-4183-9126-02a8a7b45012", "address": "fa:16:3e:af:56:97", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7079d58e-13", "ovs_interfaceid": "7079d58e-139e-4183-9126-02a8a7b45012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.669 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Releasing lock "refresh_cache-a4df176d-5fef-490e-8cee-3424098213f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.669 2 DEBUG nova.compute.manager [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Instance network_info: |[{"id": "7079d58e-139e-4183-9126-02a8a7b45012", "address": "fa:16:3e:af:56:97", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7079d58e-13", "ovs_interfaceid": "7079d58e-139e-4183-9126-02a8a7b45012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.670 2 DEBUG oslo_concurrency.lockutils [req-fa5657b3-ad13-4fcf-8f0b-7f61e4f06082 req-52f45b8b-4e32-48bc-9eee-da08420d4589 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-a4df176d-5fef-490e-8cee-3424098213f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.670 2 DEBUG nova.network.neutron [req-fa5657b3-ad13-4fcf-8f0b-7f61e4f06082 req-52f45b8b-4e32-48bc-9eee-da08420d4589 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Refreshing network info cache for port 7079d58e-139e-4183-9126-02a8a7b45012 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.672 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Start _get_guest_xml network_info=[{"id": "7079d58e-139e-4183-9126-02a8a7b45012", "address": "fa:16:3e:af:56:97", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7079d58e-13", "ovs_interfaceid": "7079d58e-139e-4183-9126-02a8a7b45012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.677 2 WARNING nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.684 2 DEBUG nova.virt.libvirt.host [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.684 2 DEBUG nova.virt.libvirt.host [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.688 2 DEBUG nova.virt.libvirt.host [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.688 2 DEBUG nova.virt.libvirt.host [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.689 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.690 2 DEBUG nova.virt.hardware [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.690 2 DEBUG nova.virt.hardware [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.690 2 DEBUG nova.virt.hardware [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.690 2 DEBUG nova.virt.hardware [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.690 2 DEBUG nova.virt.hardware [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.691 2 DEBUG nova.virt.hardware [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.691 2 DEBUG nova.virt.hardware [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.691 2 DEBUG nova.virt.hardware [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.691 2 DEBUG nova.virt.hardware [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.691 2 DEBUG nova.virt.hardware [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.691 2 DEBUG nova.virt.hardware [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:34:39 np0005465988 nova_compute[236126]: 2025-10-02 12:34:39.694 2 DEBUG oslo_concurrency.processutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:40.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:34:40 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2983616495' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.270 2 DEBUG oslo_concurrency.processutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.576s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.297 2 DEBUG nova.storage.rbd_utils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image a4df176d-5fef-490e-8cee-3424098213f5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.302 2 DEBUG oslo_concurrency.processutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:34:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:40.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:34:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:34:40 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1430517384' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.748 2 DEBUG oslo_concurrency.processutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.749 2 DEBUG nova.virt.libvirt.vif [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:34:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1877577260',display_name='tempest-ServersTestJSON-server-1877577260',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1877577260',id=133,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bc0d63d3b4404ef8858166e8836dd0af',ramdisk_id='',reservation_id='r-ozlceizv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-80077074',owner_user_name='tempest-ServersTestJSON-80077074-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:34:33Z,user_data=None,user_id='fe9cc788734f406d826446a848700331',uuid=a4df176d-5fef-490e-8cee-3424098213f5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7079d58e-139e-4183-9126-02a8a7b45012", "address": "fa:16:3e:af:56:97", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7079d58e-13", "ovs_interfaceid": "7079d58e-139e-4183-9126-02a8a7b45012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.750 2 DEBUG nova.network.os_vif_util [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converting VIF {"id": "7079d58e-139e-4183-9126-02a8a7b45012", "address": "fa:16:3e:af:56:97", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7079d58e-13", "ovs_interfaceid": "7079d58e-139e-4183-9126-02a8a7b45012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.751 2 DEBUG nova.network.os_vif_util [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:56:97,bridge_name='br-int',has_traffic_filtering=True,id=7079d58e-139e-4183-9126-02a8a7b45012,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7079d58e-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.752 2 DEBUG nova.objects.instance [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lazy-loading 'pci_devices' on Instance uuid a4df176d-5fef-490e-8cee-3424098213f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.860 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  <uuid>a4df176d-5fef-490e-8cee-3424098213f5</uuid>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  <name>instance-00000085</name>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServersTestJSON-server-1877577260</nova:name>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:34:39</nova:creationTime>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <nova:user uuid="fe9cc788734f406d826446a848700331">tempest-ServersTestJSON-80077074-project-member</nova:user>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <nova:project uuid="bc0d63d3b4404ef8858166e8836dd0af">tempest-ServersTestJSON-80077074</nova:project>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <nova:port uuid="7079d58e-139e-4183-9126-02a8a7b45012">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <entry name="serial">a4df176d-5fef-490e-8cee-3424098213f5</entry>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <entry name="uuid">a4df176d-5fef-490e-8cee-3424098213f5</entry>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/a4df176d-5fef-490e-8cee-3424098213f5_disk">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/a4df176d-5fef-490e-8cee-3424098213f5_disk.config">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:af:56:97"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <target dev="tap7079d58e-13"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/a4df176d-5fef-490e-8cee-3424098213f5/console.log" append="off"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:34:40 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:34:40 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:34:40 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:34:40 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.861 2 DEBUG nova.compute.manager [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Preparing to wait for external event network-vif-plugged-7079d58e-139e-4183-9126-02a8a7b45012 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.861 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "a4df176d-5fef-490e-8cee-3424098213f5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.861 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.862 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.862 2 DEBUG nova.virt.libvirt.vif [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:34:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1877577260',display_name='tempest-ServersTestJSON-server-1877577260',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1877577260',id=133,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bc0d63d3b4404ef8858166e8836dd0af',ramdisk_id='',reservation_id='r-ozlceizv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-80077074',owner_user_name='tempest-ServersTestJSON-80077074-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:34:33Z,user_data=None,user_id='fe9cc788734f406d826446a848700331',uuid=a4df176d-5fef-490e-8cee-3424098213f5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7079d58e-139e-4183-9126-02a8a7b45012", "address": "fa:16:3e:af:56:97", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7079d58e-13", "ovs_interfaceid": "7079d58e-139e-4183-9126-02a8a7b45012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.863 2 DEBUG nova.network.os_vif_util [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converting VIF {"id": "7079d58e-139e-4183-9126-02a8a7b45012", "address": "fa:16:3e:af:56:97", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7079d58e-13", "ovs_interfaceid": "7079d58e-139e-4183-9126-02a8a7b45012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.863 2 DEBUG nova.network.os_vif_util [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:56:97,bridge_name='br-int',has_traffic_filtering=True,id=7079d58e-139e-4183-9126-02a8a7b45012,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7079d58e-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.864 2 DEBUG os_vif [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:56:97,bridge_name='br-int',has_traffic_filtering=True,id=7079d58e-139e-4183-9126-02a8a7b45012,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7079d58e-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.865 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.865 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.868 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7079d58e-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.869 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7079d58e-13, col_values=(('external_ids', {'iface-id': '7079d58e-139e-4183-9126-02a8a7b45012', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:af:56:97', 'vm-uuid': 'a4df176d-5fef-490e-8cee-3424098213f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:40 np0005465988 NetworkManager[45041]: <info>  [1759408480.8719] manager: (tap7079d58e-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.878 2 INFO os_vif [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:56:97,bridge_name='br-int',has_traffic_filtering=True,id=7079d58e-139e-4183-9126-02a8a7b45012,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7079d58e-13')#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.962 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.963 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.963 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] No VIF found with MAC fa:16:3e:af:56:97, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.964 2 INFO nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Using config drive#033[00m
Oct  2 08:34:40 np0005465988 nova_compute[236126]: 2025-10-02 12:34:40.991 2 DEBUG nova.storage.rbd_utils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image a4df176d-5fef-490e-8cee-3424098213f5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:41 np0005465988 nova_compute[236126]: 2025-10-02 12:34:41.640 2 INFO nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Creating config drive at /var/lib/nova/instances/a4df176d-5fef-490e-8cee-3424098213f5/disk.config#033[00m
Oct  2 08:34:41 np0005465988 nova_compute[236126]: 2025-10-02 12:34:41.647 2 DEBUG oslo_concurrency.processutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a4df176d-5fef-490e-8cee-3424098213f5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj6zuch1l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:41 np0005465988 nova_compute[236126]: 2025-10-02 12:34:41.689 2 DEBUG nova.network.neutron [req-fa5657b3-ad13-4fcf-8f0b-7f61e4f06082 req-52f45b8b-4e32-48bc-9eee-da08420d4589 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Updated VIF entry in instance network info cache for port 7079d58e-139e-4183-9126-02a8a7b45012. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:34:41 np0005465988 nova_compute[236126]: 2025-10-02 12:34:41.690 2 DEBUG nova.network.neutron [req-fa5657b3-ad13-4fcf-8f0b-7f61e4f06082 req-52f45b8b-4e32-48bc-9eee-da08420d4589 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Updating instance_info_cache with network_info: [{"id": "7079d58e-139e-4183-9126-02a8a7b45012", "address": "fa:16:3e:af:56:97", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7079d58e-13", "ovs_interfaceid": "7079d58e-139e-4183-9126-02a8a7b45012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:41 np0005465988 nova_compute[236126]: 2025-10-02 12:34:41.723 2 DEBUG oslo_concurrency.lockutils [req-fa5657b3-ad13-4fcf-8f0b-7f61e4f06082 req-52f45b8b-4e32-48bc-9eee-da08420d4589 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-a4df176d-5fef-490e-8cee-3424098213f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:34:41 np0005465988 nova_compute[236126]: 2025-10-02 12:34:41.794 2 DEBUG oslo_concurrency.processutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a4df176d-5fef-490e-8cee-3424098213f5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj6zuch1l" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:41 np0005465988 nova_compute[236126]: 2025-10-02 12:34:41.825 2 DEBUG nova.storage.rbd_utils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image a4df176d-5fef-490e-8cee-3424098213f5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:41 np0005465988 nova_compute[236126]: 2025-10-02 12:34:41.830 2 DEBUG oslo_concurrency.processutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a4df176d-5fef-490e-8cee-3424098213f5/disk.config a4df176d-5fef-490e-8cee-3424098213f5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.021 2 DEBUG oslo_concurrency.processutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a4df176d-5fef-490e-8cee-3424098213f5/disk.config a4df176d-5fef-490e-8cee-3424098213f5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.191s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.022 2 INFO nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Deleting local config drive /var/lib/nova/instances/a4df176d-5fef-490e-8cee-3424098213f5/disk.config because it was imported into RBD.#033[00m
Oct  2 08:34:42 np0005465988 kernel: tap7079d58e-13: entered promiscuous mode
Oct  2 08:34:42 np0005465988 NetworkManager[45041]: <info>  [1759408482.0951] manager: (tap7079d58e-13): new Tun device (/org/freedesktop/NetworkManager/Devices/272)
Oct  2 08:34:42 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:42Z|00610|binding|INFO|Claiming lport 7079d58e-139e-4183-9126-02a8a7b45012 for this chassis.
Oct  2 08:34:42 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:42Z|00611|binding|INFO|7079d58e-139e-4183-9126-02a8a7b45012: Claiming fa:16:3e:af:56:97 10.100.0.10
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.105 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:56:97 10.100.0.10'], port_security=['fa:16:3e:af:56:97 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a4df176d-5fef-490e-8cee-3424098213f5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc0d63d3b4404ef8858166e8836dd0af', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2a11ff87-bec6-4638-b302-adcd655efba9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=797b6af2-473b-4626-9e97-a0a489119419, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7079d58e-139e-4183-9126-02a8a7b45012) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.108 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7079d58e-139e-4183-9126-02a8a7b45012 in datapath d7203b00-e5e4-402e-b777-ac6280fa23ac bound to our chassis#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.113 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7203b00-e5e4-402e-b777-ac6280fa23ac#033[00m
Oct  2 08:34:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:42 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:42Z|00612|binding|INFO|Setting lport 7079d58e-139e-4183-9126-02a8a7b45012 ovn-installed in OVS
Oct  2 08:34:42 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:42Z|00613|binding|INFO|Setting lport 7079d58e-139e-4183-9126-02a8a7b45012 up in Southbound
Oct  2 08:34:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.011000316s ======
Oct  2 08:34:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:42.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.011000316s
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.131 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f3794d46-f69a-4250-aa3c-7690c2dc053c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.132 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7203b00-e1 in ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.134 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7203b00-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.134 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[61df81de-42a4-4dd1-8f50-c1e8458fd915]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.136 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0d997172-91f4-4ebe-8697-f4c896cd4f23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:42 np0005465988 systemd-machined[192594]: New machine qemu-60-instance-00000085.
Oct  2 08:34:42 np0005465988 systemd[1]: Started Virtual Machine qemu-60-instance-00000085.
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.161 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[53141449-a1ef-4ca5-abab-d096362c2fe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 systemd-udevd[294076]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.188 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[28b0464e-34e2-4ab7-9518-17b057725a49]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 NetworkManager[45041]: <info>  [1759408482.1910] device (tap7079d58e-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:34:42 np0005465988 NetworkManager[45041]: <info>  [1759408482.1931] device (tap7079d58e-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.220 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c27d9c65-64cb-479d-b276-5709036806c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 NetworkManager[45041]: <info>  [1759408482.2262] manager: (tapd7203b00-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/273)
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.224 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[33047c0b-84d3-4ff1-84b5-84818eb6e457]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.261 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5994febd-5f29-40a2-83d7-77f1ed5db5b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.264 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[015931a3-21cc-4114-b843-a0b49f38c077]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 NetworkManager[45041]: <info>  [1759408482.2918] device (tapd7203b00-e0): carrier: link connected
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.295 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ed268071-35bb-459b-abaa-4cb09774bbd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.312 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e1781a7a-d65d-4218-b61b-e0f2093f8ba5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7203b00-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:c4:e9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 654262, 'reachable_time': 41472, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294106, 'error': None, 'target': 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.329 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2d79ecfd-28e2-4339-b64c-8234c6b4bb35]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6a:c4e9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 654262, 'tstamp': 654262}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294107, 'error': None, 'target': 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.346 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[11486804-0163-4c9f-98a7-49ffa750abf1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7203b00-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:c4:e9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 654262, 'reachable_time': 41472, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 294108, 'error': None, 'target': 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.378 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[00fb10f9-d1c8-43fa-b38b-685ac395cedc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.444 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[65747c33-671a-49e6-a8a4-862055340218]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.445 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7203b00-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.445 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.446 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7203b00-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:42 np0005465988 kernel: tapd7203b00-e0: entered promiscuous mode
Oct  2 08:34:42 np0005465988 NetworkManager[45041]: <info>  [1759408482.4490] manager: (tapd7203b00-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.452 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7203b00-e0, col_values=(('external_ids', {'iface-id': '6f9d54ba-3cfb-48b9-bef7-b2077e6931d7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:42 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:42Z|00614|binding|INFO|Releasing lport 6f9d54ba-3cfb-48b9-bef7-b2077e6931d7 from this chassis (sb_readonly=0)
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.471 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7203b00-e5e4-402e-b777-ac6280fa23ac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7203b00-e5e4-402e-b777-ac6280fa23ac.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.472 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[84e65d65-0ba0-41e2-ba5d-9e0343b4821c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.473 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-d7203b00-e5e4-402e-b777-ac6280fa23ac
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/d7203b00-e5e4-402e-b777-ac6280fa23ac.pid.haproxy
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID d7203b00-e5e4-402e-b777-ac6280fa23ac
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:34:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:42.473 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'env', 'PROCESS_TAG=haproxy-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7203b00-e5e4-402e-b777-ac6280fa23ac.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.496 2 DEBUG nova.compute.manager [req-9e3602e1-1cc6-4ba6-a5f2-243eac2bea41 req-d22cbd8b-6ac6-4921-a284-748b8384672e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Received event network-vif-plugged-7079d58e-139e-4183-9126-02a8a7b45012 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.497 2 DEBUG oslo_concurrency.lockutils [req-9e3602e1-1cc6-4ba6-a5f2-243eac2bea41 req-d22cbd8b-6ac6-4921-a284-748b8384672e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "a4df176d-5fef-490e-8cee-3424098213f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.497 2 DEBUG oslo_concurrency.lockutils [req-9e3602e1-1cc6-4ba6-a5f2-243eac2bea41 req-d22cbd8b-6ac6-4921-a284-748b8384672e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.498 2 DEBUG oslo_concurrency.lockutils [req-9e3602e1-1cc6-4ba6-a5f2-243eac2bea41 req-d22cbd8b-6ac6-4921-a284-748b8384672e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:42 np0005465988 nova_compute[236126]: 2025-10-02 12:34:42.498 2 DEBUG nova.compute.manager [req-9e3602e1-1cc6-4ba6-a5f2-243eac2bea41 req-d22cbd8b-6ac6-4921-a284-748b8384672e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Processing event network-vif-plugged-7079d58e-139e-4183-9126-02a8a7b45012 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:34:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:34:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:42.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:34:42 np0005465988 podman[294174]: 2025-10-02 12:34:42.847343701 +0000 UTC m=+0.050727140 container create bcbbd46df6478be0805ca6e7de8179e5af7a0691274f359604c36d184db4d9ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:34:42 np0005465988 systemd[1]: Started libpod-conmon-bcbbd46df6478be0805ca6e7de8179e5af7a0691274f359604c36d184db4d9ef.scope.
Oct  2 08:34:42 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:34:42 np0005465988 podman[294174]: 2025-10-02 12:34:42.822776095 +0000 UTC m=+0.026159554 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:34:42 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b69793b0997210d2d8ebf2901adbfffeb98b1b1c64fec1c1a4143056fe839307/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:34:42 np0005465988 podman[294174]: 2025-10-02 12:34:42.944012662 +0000 UTC m=+0.147396121 container init bcbbd46df6478be0805ca6e7de8179e5af7a0691274f359604c36d184db4d9ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:34:42 np0005465988 podman[294174]: 2025-10-02 12:34:42.952749453 +0000 UTC m=+0.156132932 container start bcbbd46df6478be0805ca6e7de8179e5af7a0691274f359604c36d184db4d9ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:34:42 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[294197]: [NOTICE]   (294201) : New worker (294204) forked
Oct  2 08:34:42 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[294197]: [NOTICE]   (294201) : Loading success.
Oct  2 08:34:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.362 2 DEBUG nova.compute.manager [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.363 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408483.3618493, a4df176d-5fef-490e-8cee-3424098213f5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.363 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a4df176d-5fef-490e-8cee-3424098213f5] VM Started (Lifecycle Event)#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.368 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.371 2 INFO nova.virt.libvirt.driver [-] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Instance spawned successfully.#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.371 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.391 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.396 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.406 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.406 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.407 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.407 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.408 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.408 2 DEBUG nova.virt.libvirt.driver [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.416 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a4df176d-5fef-490e-8cee-3424098213f5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.417 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408483.3654084, a4df176d-5fef-490e-8cee-3424098213f5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.417 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a4df176d-5fef-490e-8cee-3424098213f5] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.445 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.450 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408483.3668022, a4df176d-5fef-490e-8cee-3424098213f5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.450 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a4df176d-5fef-490e-8cee-3424098213f5] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.490 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.495 2 INFO nova.compute.manager [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Took 9.57 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.495 2 DEBUG nova.compute.manager [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.498 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.519 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: a4df176d-5fef-490e-8cee-3424098213f5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.594 2 INFO nova.compute.manager [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Took 10.68 seconds to build instance.#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.611 2 DEBUG oslo_concurrency.lockutils [None req-330fcb6d-d37c-4126-823a-f59d893f40ea fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.832s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:43 np0005465988 nova_compute[236126]: 2025-10-02 12:34:43.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:44 np0005465988 nova_compute[236126]: 2025-10-02 12:34:44.025 2 DEBUG oslo_concurrency.lockutils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:44 np0005465988 nova_compute[236126]: 2025-10-02 12:34:44.025 2 DEBUG oslo_concurrency.lockutils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:44 np0005465988 nova_compute[236126]: 2025-10-02 12:34:44.026 2 INFO nova.compute.manager [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Shelving#033[00m
Oct  2 08:34:44 np0005465988 nova_compute[236126]: 2025-10-02 12:34:44.052 2 DEBUG nova.virt.libvirt.driver [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:34:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:44.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:44.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:44 np0005465988 nova_compute[236126]: 2025-10-02 12:34:44.647 2 DEBUG nova.compute.manager [req-43e36d8a-efd0-4454-9863-e928a8b58133 req-1e737258-aa36-4ea7-93fb-92b6620696b2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Received event network-vif-plugged-7079d58e-139e-4183-9126-02a8a7b45012 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:44 np0005465988 nova_compute[236126]: 2025-10-02 12:34:44.648 2 DEBUG oslo_concurrency.lockutils [req-43e36d8a-efd0-4454-9863-e928a8b58133 req-1e737258-aa36-4ea7-93fb-92b6620696b2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "a4df176d-5fef-490e-8cee-3424098213f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:44 np0005465988 nova_compute[236126]: 2025-10-02 12:34:44.648 2 DEBUG oslo_concurrency.lockutils [req-43e36d8a-efd0-4454-9863-e928a8b58133 req-1e737258-aa36-4ea7-93fb-92b6620696b2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:44 np0005465988 nova_compute[236126]: 2025-10-02 12:34:44.648 2 DEBUG oslo_concurrency.lockutils [req-43e36d8a-efd0-4454-9863-e928a8b58133 req-1e737258-aa36-4ea7-93fb-92b6620696b2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:44 np0005465988 nova_compute[236126]: 2025-10-02 12:34:44.648 2 DEBUG nova.compute.manager [req-43e36d8a-efd0-4454-9863-e928a8b58133 req-1e737258-aa36-4ea7-93fb-92b6620696b2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] No waiting events found dispatching network-vif-plugged-7079d58e-139e-4183-9126-02a8a7b45012 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:44 np0005465988 nova_compute[236126]: 2025-10-02 12:34:44.648 2 WARNING nova.compute.manager [req-43e36d8a-efd0-4454-9863-e928a8b58133 req-1e737258-aa36-4ea7-93fb-92b6620696b2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Received unexpected event network-vif-plugged-7079d58e-139e-4183-9126-02a8a7b45012 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:34:45 np0005465988 nova_compute[236126]: 2025-10-02 12:34:45.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:45 np0005465988 nova_compute[236126]: 2025-10-02 12:34:45.920 2 DEBUG oslo_concurrency.lockutils [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "a4df176d-5fef-490e-8cee-3424098213f5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:45 np0005465988 nova_compute[236126]: 2025-10-02 12:34:45.921 2 DEBUG oslo_concurrency.lockutils [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:45 np0005465988 nova_compute[236126]: 2025-10-02 12:34:45.921 2 DEBUG oslo_concurrency.lockutils [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "a4df176d-5fef-490e-8cee-3424098213f5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:45 np0005465988 nova_compute[236126]: 2025-10-02 12:34:45.921 2 DEBUG oslo_concurrency.lockutils [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:45 np0005465988 nova_compute[236126]: 2025-10-02 12:34:45.922 2 DEBUG oslo_concurrency.lockutils [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:45 np0005465988 nova_compute[236126]: 2025-10-02 12:34:45.924 2 INFO nova.compute.manager [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Terminating instance#033[00m
Oct  2 08:34:45 np0005465988 nova_compute[236126]: 2025-10-02 12:34:45.925 2 DEBUG nova.compute.manager [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:34:45 np0005465988 kernel: tap7079d58e-13 (unregistering): left promiscuous mode
Oct  2 08:34:45 np0005465988 NetworkManager[45041]: <info>  [1759408485.9640] device (tap7079d58e-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:34:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:45Z|00615|binding|INFO|Releasing lport 7079d58e-139e-4183-9126-02a8a7b45012 from this chassis (sb_readonly=0)
Oct  2 08:34:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:45Z|00616|binding|INFO|Setting lport 7079d58e-139e-4183-9126-02a8a7b45012 down in Southbound
Oct  2 08:34:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:45Z|00617|binding|INFO|Removing iface tap7079d58e-13 ovn-installed in OVS
Oct  2 08:34:45 np0005465988 nova_compute[236126]: 2025-10-02 12:34:45.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:45.983 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:56:97 10.100.0.10'], port_security=['fa:16:3e:af:56:97 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a4df176d-5fef-490e-8cee-3424098213f5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc0d63d3b4404ef8858166e8836dd0af', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2a11ff87-bec6-4638-b302-adcd655efba9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=797b6af2-473b-4626-9e97-a0a489119419, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7079d58e-139e-4183-9126-02a8a7b45012) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:45.984 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7079d58e-139e-4183-9126-02a8a7b45012 in datapath d7203b00-e5e4-402e-b777-ac6280fa23ac unbound from our chassis#033[00m
Oct  2 08:34:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:45.986 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7203b00-e5e4-402e-b777-ac6280fa23ac, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:34:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:45.987 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2c82e51d-bde0-4aed-8ea2-82dd88913c5c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:45.988 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac namespace which is not needed anymore#033[00m
Oct  2 08:34:45 np0005465988 nova_compute[236126]: 2025-10-02 12:34:45.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:46 np0005465988 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000085.scope: Deactivated successfully.
Oct  2 08:34:46 np0005465988 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000085.scope: Consumed 3.744s CPU time.
Oct  2 08:34:46 np0005465988 systemd-machined[192594]: Machine qemu-60-instance-00000085 terminated.
Oct  2 08:34:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:34:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:46.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:34:46 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[294197]: [NOTICE]   (294201) : haproxy version is 2.8.14-c23fe91
Oct  2 08:34:46 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[294197]: [NOTICE]   (294201) : path to executable is /usr/sbin/haproxy
Oct  2 08:34:46 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[294197]: [WARNING]  (294201) : Exiting Master process...
Oct  2 08:34:46 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[294197]: [ALERT]    (294201) : Current worker (294204) exited with code 143 (Terminated)
Oct  2 08:34:46 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[294197]: [WARNING]  (294201) : All workers exited. Exiting... (0)
Oct  2 08:34:46 np0005465988 systemd[1]: libpod-bcbbd46df6478be0805ca6e7de8179e5af7a0691274f359604c36d184db4d9ef.scope: Deactivated successfully.
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:46 np0005465988 podman[294237]: 2025-10-02 12:34:46.150914281 +0000 UTC m=+0.054388105 container died bcbbd46df6478be0805ca6e7de8179e5af7a0691274f359604c36d184db4d9ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.163 2 INFO nova.virt.libvirt.driver [-] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Instance destroyed successfully.#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.164 2 DEBUG nova.objects.instance [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lazy-loading 'resources' on Instance uuid a4df176d-5fef-490e-8cee-3424098213f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:46 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bcbbd46df6478be0805ca6e7de8179e5af7a0691274f359604c36d184db4d9ef-userdata-shm.mount: Deactivated successfully.
Oct  2 08:34:46 np0005465988 systemd[1]: var-lib-containers-storage-overlay-b69793b0997210d2d8ebf2901adbfffeb98b1b1c64fec1c1a4143056fe839307-merged.mount: Deactivated successfully.
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.187 2 DEBUG nova.virt.libvirt.vif [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:34:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1877577260',display_name='tempest-ServersTestJSON-server-1877577260',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1877577260',id=133,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:34:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bc0d63d3b4404ef8858166e8836dd0af',ramdisk_id='',reservation_id='r-ozlceizv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-80077074',owner_user_name='tempest-ServersTestJSON-80077074-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:34:43Z,user_data=None,user_id='fe9cc788734f406d826446a848700331',uuid=a4df176d-5fef-490e-8cee-3424098213f5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7079d58e-139e-4183-9126-02a8a7b45012", "address": "fa:16:3e:af:56:97", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7079d58e-13", "ovs_interfaceid": "7079d58e-139e-4183-9126-02a8a7b45012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.188 2 DEBUG nova.network.os_vif_util [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converting VIF {"id": "7079d58e-139e-4183-9126-02a8a7b45012", "address": "fa:16:3e:af:56:97", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7079d58e-13", "ovs_interfaceid": "7079d58e-139e-4183-9126-02a8a7b45012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.189 2 DEBUG nova.network.os_vif_util [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:56:97,bridge_name='br-int',has_traffic_filtering=True,id=7079d58e-139e-4183-9126-02a8a7b45012,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7079d58e-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.190 2 DEBUG os_vif [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:56:97,bridge_name='br-int',has_traffic_filtering=True,id=7079d58e-139e-4183-9126-02a8a7b45012,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7079d58e-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.192 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7079d58e-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:34:46 np0005465988 podman[294237]: 2025-10-02 12:34:46.198997574 +0000 UTC m=+0.102471388 container cleanup bcbbd46df6478be0805ca6e7de8179e5af7a0691274f359604c36d184db4d9ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.200 2 INFO os_vif [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:56:97,bridge_name='br-int',has_traffic_filtering=True,id=7079d58e-139e-4183-9126-02a8a7b45012,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7079d58e-13')#033[00m
Oct  2 08:34:46 np0005465988 systemd[1]: libpod-conmon-bcbbd46df6478be0805ca6e7de8179e5af7a0691274f359604c36d184db4d9ef.scope: Deactivated successfully.
Oct  2 08:34:46 np0005465988 podman[294282]: 2025-10-02 12:34:46.284516444 +0000 UTC m=+0.054628592 container remove bcbbd46df6478be0805ca6e7de8179e5af7a0691274f359604c36d184db4d9ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.290 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cbd94d7d-7ff6-4f20-a8e8-c1de81463412]: (4, ('Thu Oct  2 12:34:46 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac (bcbbd46df6478be0805ca6e7de8179e5af7a0691274f359604c36d184db4d9ef)\nbcbbd46df6478be0805ca6e7de8179e5af7a0691274f359604c36d184db4d9ef\nThu Oct  2 12:34:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac (bcbbd46df6478be0805ca6e7de8179e5af7a0691274f359604c36d184db4d9ef)\nbcbbd46df6478be0805ca6e7de8179e5af7a0691274f359604c36d184db4d9ef\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.291 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7beb3805-761d-4a85-8581-de8e31ce42ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.293 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7203b00-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:46 np0005465988 kernel: tapd7203b00-e0: left promiscuous mode
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.319 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8371d484-41c9-4464-9380-28cfcf693dd4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.344 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a45430-0020-4638-a1b5-0162cccc82ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.346 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[be538e7a-89fd-4131-9292-ca306407314d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.363 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c07c8ee0-3f0d-435a-95d3-eff2dc32d9aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 654254, 'reachable_time': 28245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294308, 'error': None, 'target': 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:46 np0005465988 systemd[1]: run-netns-ovnmeta\x2dd7203b00\x2de5e4\x2d402e\x2db777\x2dac6280fa23ac.mount: Deactivated successfully.
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.368 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.369 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[f3af9c38-8982-4651-b9ec-cd327bccfd35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:34:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:46.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:34:46 np0005465988 kernel: tapc9dd6bc4-09 (unregistering): left promiscuous mode
Oct  2 08:34:46 np0005465988 NetworkManager[45041]: <info>  [1759408486.6549] device (tapc9dd6bc4-09): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:34:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:46Z|00618|binding|INFO|Releasing lport c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b from this chassis (sb_readonly=0)
Oct  2 08:34:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:46Z|00619|binding|INFO|Setting lport c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b down in Southbound
Oct  2 08:34:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:34:46Z|00620|binding|INFO|Removing iface tapc9dd6bc4-09 ovn-installed in OVS
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.683 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:a3:24 10.100.0.11'], port_security=['fa:16:3e:00:a3:24 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '8736e2a4-70c8-46c1-8ce5-ff68395a22c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a05e525420b4aa8adcc9561158e73d1', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cd7967e6-b4ee-4d94-ab54-c08775c150e7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.179'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=709db70f-1209-49b9-bf90-2b91d986925d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.684 142124 INFO neutron.agent.ovn.metadata.agent [-] Port c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b in datapath 7b216831-24ac-41f0-ac1c-99aae9bc897b unbound from our chassis#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.686 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b216831-24ac-41f0-ac1c-99aae9bc897b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.688 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0b2ffe11-e847-4c01-8e33-7ffda4a95b36]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.689 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b namespace which is not needed anymore#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:46 np0005465988 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000082.scope: Deactivated successfully.
Oct  2 08:34:46 np0005465988 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000082.scope: Consumed 14.097s CPU time.
Oct  2 08:34:46 np0005465988 systemd-machined[192594]: Machine qemu-59-instance-00000082 terminated.
Oct  2 08:34:46 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[293589]: [NOTICE]   (293593) : haproxy version is 2.8.14-c23fe91
Oct  2 08:34:46 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[293589]: [NOTICE]   (293593) : path to executable is /usr/sbin/haproxy
Oct  2 08:34:46 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[293589]: [WARNING]  (293593) : Exiting Master process...
Oct  2 08:34:46 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[293589]: [ALERT]    (293593) : Current worker (293595) exited with code 143 (Terminated)
Oct  2 08:34:46 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[293589]: [WARNING]  (293593) : All workers exited. Exiting... (0)
Oct  2 08:34:46 np0005465988 systemd[1]: libpod-acafb5a43a9b2058ea7a44b8b6bc45bcdc1e399fa19dd46e539b8505e90743db.scope: Deactivated successfully.
Oct  2 08:34:46 np0005465988 podman[294332]: 2025-10-02 12:34:46.828645615 +0000 UTC m=+0.047603441 container died acafb5a43a9b2058ea7a44b8b6bc45bcdc1e399fa19dd46e539b8505e90743db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.856 2 DEBUG nova.compute.manager [req-d6986339-7ac9-497c-ab65-b707b05859fe req-a7c72354-9063-444b-a7f5-284df397ba5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Received event network-vif-unplugged-7079d58e-139e-4183-9126-02a8a7b45012 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.856 2 DEBUG oslo_concurrency.lockutils [req-d6986339-7ac9-497c-ab65-b707b05859fe req-a7c72354-9063-444b-a7f5-284df397ba5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "a4df176d-5fef-490e-8cee-3424098213f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.857 2 DEBUG oslo_concurrency.lockutils [req-d6986339-7ac9-497c-ab65-b707b05859fe req-a7c72354-9063-444b-a7f5-284df397ba5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.857 2 DEBUG oslo_concurrency.lockutils [req-d6986339-7ac9-497c-ab65-b707b05859fe req-a7c72354-9063-444b-a7f5-284df397ba5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.857 2 DEBUG nova.compute.manager [req-d6986339-7ac9-497c-ab65-b707b05859fe req-a7c72354-9063-444b-a7f5-284df397ba5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] No waiting events found dispatching network-vif-unplugged-7079d58e-139e-4183-9126-02a8a7b45012 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.858 2 DEBUG nova.compute.manager [req-d6986339-7ac9-497c-ab65-b707b05859fe req-a7c72354-9063-444b-a7f5-284df397ba5e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Received event network-vif-unplugged-7079d58e-139e-4183-9126-02a8a7b45012 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:34:46 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-acafb5a43a9b2058ea7a44b8b6bc45bcdc1e399fa19dd46e539b8505e90743db-userdata-shm.mount: Deactivated successfully.
Oct  2 08:34:46 np0005465988 systemd[1]: var-lib-containers-storage-overlay-2f17edf508c213b2dc6412910d807ce0d2a5200c66d5f31f89a9482a26e3add7-merged.mount: Deactivated successfully.
Oct  2 08:34:46 np0005465988 podman[294332]: 2025-10-02 12:34:46.866330819 +0000 UTC m=+0.085288665 container cleanup acafb5a43a9b2058ea7a44b8b6bc45bcdc1e399fa19dd46e539b8505e90743db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:34:46 np0005465988 systemd[1]: libpod-conmon-acafb5a43a9b2058ea7a44b8b6bc45bcdc1e399fa19dd46e539b8505e90743db.scope: Deactivated successfully.
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.894 2 INFO nova.virt.libvirt.driver [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Deleting instance files /var/lib/nova/instances/a4df176d-5fef-490e-8cee-3424098213f5_del#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.894 2 INFO nova.virt.libvirt.driver [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Deletion of /var/lib/nova/instances/a4df176d-5fef-490e-8cee-3424098213f5_del complete#033[00m
Oct  2 08:34:46 np0005465988 NetworkManager[45041]: <info>  [1759408486.8977] manager: (tapc9dd6bc4-09): new Tun device (/org/freedesktop/NetworkManager/Devices/275)
Oct  2 08:34:46 np0005465988 podman[294363]: 2025-10-02 12:34:46.9494857 +0000 UTC m=+0.047196988 container remove acafb5a43a9b2058ea7a44b8b6bc45bcdc1e399fa19dd46e539b8505e90743db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.956 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[423806a0-271e-4e89-b834-bf03c05be0b7]: (4, ('Thu Oct  2 12:34:46 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b (acafb5a43a9b2058ea7a44b8b6bc45bcdc1e399fa19dd46e539b8505e90743db)\nacafb5a43a9b2058ea7a44b8b6bc45bcdc1e399fa19dd46e539b8505e90743db\nThu Oct  2 12:34:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b (acafb5a43a9b2058ea7a44b8b6bc45bcdc1e399fa19dd46e539b8505e90743db)\nacafb5a43a9b2058ea7a44b8b6bc45bcdc1e399fa19dd46e539b8505e90743db\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.958 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[32c38d3f-df13-4e93-8845-55b0065bba8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.959 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b216831-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:46 np0005465988 kernel: tap7b216831-20: left promiscuous mode
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.981 2 INFO nova.compute.manager [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Took 1.06 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.982 2 DEBUG oslo.service.loopingcall [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.982 2 DEBUG nova.compute.manager [-] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.983 2 DEBUG nova.network.neutron [-] [instance: a4df176d-5fef-490e-8cee-3424098213f5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:34:46 np0005465988 nova_compute[236126]: 2025-10-02 12:34:46.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:46.992 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf67565-87a4-4101-b88b-4a2cfdda6ae5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:47 np0005465988 nova_compute[236126]: 2025-10-02 12:34:47.004 2 DEBUG nova.compute.manager [req-ad11ba27-7878-489a-92ea-5e9727979e9d req-ea548dd5-ca5e-4208-b0ce-c5bab543df72 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received event network-vif-unplugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:47 np0005465988 nova_compute[236126]: 2025-10-02 12:34:47.004 2 DEBUG oslo_concurrency.lockutils [req-ad11ba27-7878-489a-92ea-5e9727979e9d req-ea548dd5-ca5e-4208-b0ce-c5bab543df72 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:47 np0005465988 nova_compute[236126]: 2025-10-02 12:34:47.005 2 DEBUG oslo_concurrency.lockutils [req-ad11ba27-7878-489a-92ea-5e9727979e9d req-ea548dd5-ca5e-4208-b0ce-c5bab543df72 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:47 np0005465988 nova_compute[236126]: 2025-10-02 12:34:47.005 2 DEBUG oslo_concurrency.lockutils [req-ad11ba27-7878-489a-92ea-5e9727979e9d req-ea548dd5-ca5e-4208-b0ce-c5bab543df72 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:47 np0005465988 nova_compute[236126]: 2025-10-02 12:34:47.005 2 DEBUG nova.compute.manager [req-ad11ba27-7878-489a-92ea-5e9727979e9d req-ea548dd5-ca5e-4208-b0ce-c5bab543df72 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] No waiting events found dispatching network-vif-unplugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:47 np0005465988 nova_compute[236126]: 2025-10-02 12:34:47.006 2 WARNING nova.compute.manager [req-ad11ba27-7878-489a-92ea-5e9727979e9d req-ea548dd5-ca5e-4208-b0ce-c5bab543df72 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received unexpected event network-vif-unplugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b for instance with vm_state active and task_state shelving.#033[00m
Oct  2 08:34:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:47.023 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[45e2aab2-d7a3-48a2-ae47-34221b4ff669]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:47.025 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[34107933-d45d-4b1c-a2a3-0182f5a5bb05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:47.046 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b40746d1-1e14-49de-bd0e-0a0263ed58d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651947, 'reachable_time': 22431, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294391, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:47.048 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:34:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:47.048 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[c5e9547a-d722-458c-a8d0-9f3027d12ce5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:47 np0005465988 nova_compute[236126]: 2025-10-02 12:34:47.071 2 INFO nova.virt.libvirt.driver [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:34:47 np0005465988 nova_compute[236126]: 2025-10-02 12:34:47.076 2 INFO nova.virt.libvirt.driver [-] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Instance destroyed successfully.#033[00m
Oct  2 08:34:47 np0005465988 nova_compute[236126]: 2025-10-02 12:34:47.076 2 DEBUG nova.objects.instance [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'numa_topology' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:47 np0005465988 systemd[1]: run-netns-ovnmeta\x2d7b216831\x2d24ac\x2d41f0\x2dac1c\x2d99aae9bc897b.mount: Deactivated successfully.
Oct  2 08:34:47 np0005465988 nova_compute[236126]: 2025-10-02 12:34:47.503 2 INFO nova.virt.libvirt.driver [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Beginning cold snapshot process#033[00m
Oct  2 08:34:47 np0005465988 nova_compute[236126]: 2025-10-02 12:34:47.669 2 DEBUG nova.virt.libvirt.imagebackend [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No parent info for c2d0c2bc-fe21-4689-86ae-d6728c15874c; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Oct  2 08:34:47 np0005465988 nova_compute[236126]: 2025-10-02 12:34:47.967 2 DEBUG nova.storage.rbd_utils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] creating snapshot(dec55bdf15924e0981dff2f0ade8a20c) on rbd image(8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:34:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:34:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:48.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:48 np0005465988 nova_compute[236126]: 2025-10-02 12:34:48.288 2 DEBUG nova.network.neutron [-] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e326 e326: 3 total, 3 up, 3 in
Oct  2 08:34:48 np0005465988 nova_compute[236126]: 2025-10-02 12:34:48.483 2 INFO nova.compute.manager [-] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Took 1.50 seconds to deallocate network for instance.#033[00m
Oct  2 08:34:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:48.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:48 np0005465988 nova_compute[236126]: 2025-10-02 12:34:48.678 2 DEBUG nova.storage.rbd_utils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] cloning vms/8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk@dec55bdf15924e0981dff2f0ade8a20c to images/656ad3f4-b233-4eb8-822c-07efc986a6e0 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:34:48 np0005465988 nova_compute[236126]: 2025-10-02 12:34:48.754 2 DEBUG oslo_concurrency.lockutils [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:48 np0005465988 nova_compute[236126]: 2025-10-02 12:34:48.754 2 DEBUG oslo_concurrency.lockutils [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:48 np0005465988 nova_compute[236126]: 2025-10-02 12:34:48.831 2 DEBUG nova.storage.rbd_utils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] flattening images/656ad3f4-b233-4eb8-822c-07efc986a6e0 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:34:48 np0005465988 nova_compute[236126]: 2025-10-02 12:34:48.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.336 2 DEBUG oslo_concurrency.processutils [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.402 2 DEBUG nova.compute.manager [req-2f699647-f198-44f5-9cf6-2829c5f90813 req-70e22716-bc04-4637-b127-e7cd3de49834 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Received event network-vif-plugged-7079d58e-139e-4183-9126-02a8a7b45012 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.404 2 DEBUG oslo_concurrency.lockutils [req-2f699647-f198-44f5-9cf6-2829c5f90813 req-70e22716-bc04-4637-b127-e7cd3de49834 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "a4df176d-5fef-490e-8cee-3424098213f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.404 2 DEBUG oslo_concurrency.lockutils [req-2f699647-f198-44f5-9cf6-2829c5f90813 req-70e22716-bc04-4637-b127-e7cd3de49834 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.405 2 DEBUG oslo_concurrency.lockutils [req-2f699647-f198-44f5-9cf6-2829c5f90813 req-70e22716-bc04-4637-b127-e7cd3de49834 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.405 2 DEBUG nova.compute.manager [req-2f699647-f198-44f5-9cf6-2829c5f90813 req-70e22716-bc04-4637-b127-e7cd3de49834 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] No waiting events found dispatching network-vif-plugged-7079d58e-139e-4183-9126-02a8a7b45012 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.406 2 WARNING nova.compute.manager [req-2f699647-f198-44f5-9cf6-2829c5f90813 req-70e22716-bc04-4637-b127-e7cd3de49834 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Received unexpected event network-vif-plugged-7079d58e-139e-4183-9126-02a8a7b45012 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.711 2 DEBUG nova.compute.manager [req-0c63f6c0-ca05-4fcd-b8c9-cf9e5399b186 req-f7ed2ea0-60b2-4eb7-a5be-102a21e3930d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received event network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.712 2 DEBUG oslo_concurrency.lockutils [req-0c63f6c0-ca05-4fcd-b8c9-cf9e5399b186 req-f7ed2ea0-60b2-4eb7-a5be-102a21e3930d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.713 2 DEBUG oslo_concurrency.lockutils [req-0c63f6c0-ca05-4fcd-b8c9-cf9e5399b186 req-f7ed2ea0-60b2-4eb7-a5be-102a21e3930d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.713 2 DEBUG oslo_concurrency.lockutils [req-0c63f6c0-ca05-4fcd-b8c9-cf9e5399b186 req-f7ed2ea0-60b2-4eb7-a5be-102a21e3930d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.714 2 DEBUG nova.compute.manager [req-0c63f6c0-ca05-4fcd-b8c9-cf9e5399b186 req-f7ed2ea0-60b2-4eb7-a5be-102a21e3930d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] No waiting events found dispatching network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.714 2 WARNING nova.compute.manager [req-0c63f6c0-ca05-4fcd-b8c9-cf9e5399b186 req-f7ed2ea0-60b2-4eb7-a5be-102a21e3930d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received unexpected event network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.715 2 DEBUG nova.compute.manager [req-0c63f6c0-ca05-4fcd-b8c9-cf9e5399b186 req-f7ed2ea0-60b2-4eb7-a5be-102a21e3930d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Received event network-vif-deleted-7079d58e-139e-4183-9126-02a8a7b45012 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:34:49 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3434496002' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.784 2 DEBUG oslo_concurrency.processutils [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.791 2 DEBUG nova.compute.provider_tree [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:49.837 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:49.840 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.855 2 DEBUG nova.storage.rbd_utils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] removing snapshot(dec55bdf15924e0981dff2f0ade8a20c) on rbd image(8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.863 2 DEBUG nova.scheduler.client.report [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:49 np0005465988 nova_compute[236126]: 2025-10-02 12:34:49.982 2 DEBUG oslo_concurrency.lockutils [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:50 np0005465988 nova_compute[236126]: 2025-10-02 12:34:50.040 2 INFO nova.scheduler.client.report [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Deleted allocations for instance a4df176d-5fef-490e-8cee-3424098213f5#033[00m
Oct  2 08:34:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:50.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:50 np0005465988 nova_compute[236126]: 2025-10-02 12:34:50.267 2 DEBUG oslo_concurrency.lockutils [None req-d9d5cd4a-b732-4c99-80b8-9edd80be2596 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "a4df176d-5fef-490e-8cee-3424098213f5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:50.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e327 e327: 3 total, 3 up, 3 in
Oct  2 08:34:51 np0005465988 nova_compute[236126]: 2025-10-02 12:34:51.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:51 np0005465988 nova_compute[236126]: 2025-10-02 12:34:51.390 2 DEBUG nova.storage.rbd_utils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] creating snapshot(snap) on rbd image(656ad3f4-b233-4eb8-822c-07efc986a6e0) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:34:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:52.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e328 e328: 3 total, 3 up, 3 in
Oct  2 08:34:52 np0005465988 nova_compute[236126]: 2025-10-02 12:34:52.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:52.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:34:53 np0005465988 nova_compute[236126]: 2025-10-02 12:34:53.820 2 INFO nova.virt.libvirt.driver [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Snapshot image upload complete#033[00m
Oct  2 08:34:53 np0005465988 nova_compute[236126]: 2025-10-02 12:34:53.822 2 DEBUG nova.compute.manager [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:53 np0005465988 nova_compute[236126]: 2025-10-02 12:34:53.906 2 INFO nova.compute.manager [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Shelve offloading#033[00m
Oct  2 08:34:53 np0005465988 nova_compute[236126]: 2025-10-02 12:34:53.914 2 INFO nova.virt.libvirt.driver [-] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Instance destroyed successfully.#033[00m
Oct  2 08:34:53 np0005465988 nova_compute[236126]: 2025-10-02 12:34:53.914 2 DEBUG nova.compute.manager [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:53 np0005465988 nova_compute[236126]: 2025-10-02 12:34:53.917 2 DEBUG oslo_concurrency.lockutils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:53 np0005465988 nova_compute[236126]: 2025-10-02 12:34:53.918 2 DEBUG oslo_concurrency.lockutils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquired lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:53 np0005465988 nova_compute[236126]: 2025-10-02 12:34:53.918 2 DEBUG nova.network.neutron [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:34:53 np0005465988 nova_compute[236126]: 2025-10-02 12:34:53.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:54.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:54 np0005465988 nova_compute[236126]: 2025-10-02 12:34:54.508 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:54 np0005465988 nova_compute[236126]: 2025-10-02 12:34:54.509 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:54 np0005465988 nova_compute[236126]: 2025-10-02 12:34:54.544 2 DEBUG nova.compute.manager [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:34:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:34:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:54.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:34:54 np0005465988 nova_compute[236126]: 2025-10-02 12:34:54.653 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:54 np0005465988 nova_compute[236126]: 2025-10-02 12:34:54.654 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:54 np0005465988 nova_compute[236126]: 2025-10-02 12:34:54.670 2 DEBUG nova.virt.hardware [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:34:54 np0005465988 nova_compute[236126]: 2025-10-02 12:34:54.671 2 INFO nova.compute.claims [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:34:54 np0005465988 nova_compute[236126]: 2025-10-02 12:34:54.748 2 DEBUG nova.scheduler.client.report [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:34:54 np0005465988 nova_compute[236126]: 2025-10-02 12:34:54.768 2 DEBUG nova.scheduler.client.report [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:34:54 np0005465988 nova_compute[236126]: 2025-10-02 12:34:54.768 2 DEBUG nova.compute.provider_tree [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:34:54 np0005465988 nova_compute[236126]: 2025-10-02 12:34:54.782 2 DEBUG nova.scheduler.client.report [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:34:54 np0005465988 nova_compute[236126]: 2025-10-02 12:34:54.801 2 DEBUG nova.scheduler.client.report [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:34:54 np0005465988 nova_compute[236126]: 2025-10-02 12:34:54.884 2 DEBUG oslo_concurrency.processutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.173 2 DEBUG nova.network.neutron [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updating instance_info_cache with network_info: [{"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.205 2 DEBUG oslo_concurrency.lockutils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Releasing lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.420 2 DEBUG oslo_concurrency.processutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.536s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.429 2 DEBUG nova.compute.provider_tree [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.472 2 DEBUG nova.scheduler.client.report [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.479 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.520 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.866s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.521 2 DEBUG nova.compute.manager [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.523 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.524 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.524 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.524 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.524 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.777 2 DEBUG nova.compute.manager [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.778 2 DEBUG nova.network.neutron [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.868 2 INFO nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:34:55 np0005465988 nova_compute[236126]: 2025-10-02 12:34:55.993 2 DEBUG nova.compute.manager [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:34:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:34:56 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/763028527' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.034 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.055 2 DEBUG nova.policy [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fe9cc788734f406d826446a848700331', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bc0d63d3b4404ef8858166e8836dd0af', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:34:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:56.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:56 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:34:56 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:34:56 np0005465988 podman[294906]: 2025-10-02 12:34:56.179813991 +0000 UTC m=+0.082537975 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.license=GPLv2)
Oct  2 08:34:56 np0005465988 podman[294907]: 2025-10-02 12:34:56.191224719 +0000 UTC m=+0.084564853 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.200 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000082 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.200 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000082 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:34:56 np0005465988 podman[294905]: 2025-10-02 12:34:56.211684718 +0000 UTC m=+0.114587947 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.355 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.356 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4181MB free_disk=20.697750091552734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.356 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.356 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.367 2 INFO nova.virt.libvirt.driver [-] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Instance destroyed successfully.#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.367 2 DEBUG nova.objects.instance [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'resources' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.440 2 DEBUG nova.compute.manager [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.442 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.442 2 INFO nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Creating image(s)#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.480 2 DEBUG nova.storage.rbd_utils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image 9e1649d5-a78b-44ea-bbdb-8da86ace9900_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.507 2 DEBUG nova.storage.rbd_utils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image 9e1649d5-a78b-44ea-bbdb-8da86ace9900_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.539 2 DEBUG nova.storage.rbd_utils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image 9e1649d5-a78b-44ea-bbdb-8da86ace9900_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.544 2 DEBUG oslo_concurrency.processutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:56.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.581 2 DEBUG nova.virt.libvirt.vif [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:34:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-639305243',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-639305243',id=130,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH7i72qZf0LyRp/akt/bIu4snLJg8XuAUIsHpF3xOK1XlpVLYZ/YFzz7wr2QY5za8QZBy0/Efb6X+c12F9Zi3EqjS+0mqhH0nerFk7xvdGE6zlwRcwJDWaW/qlypPLaWbQ==',key_name='tempest-keypair-1186187448',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:34:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='1a05e525420b4aa8adcc9561158e73d1',ramdisk_id='',reservation_id='r-tje0hz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-405673070',owner_user_name='tempest-AttachVolumeShelveTestJSON-405673070-project-member',shelved_at='2025-10-02T12:34:53.822058',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='656ad3f4-b233-4eb8-822c-07efc986a6e0'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:34:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bcd36ab668f449959719ba7058f25e72',uuid=8736e2a4-70c8-46c1-8ce5-ff68395a22c9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.582 2 DEBUG nova.network.os_vif_util [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converting VIF {"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.584 2 DEBUG nova.network.os_vif_util [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:a3:24,bridge_name='br-int',has_traffic_filtering=True,id=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9dd6bc4-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.584 2 DEBUG os_vif [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:a3:24,bridge_name='br-int',has_traffic_filtering=True,id=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9dd6bc4-09') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.590 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc9dd6bc4-09, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.626 2 DEBUG oslo_concurrency.processutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.627 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.628 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.628 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.654 2 DEBUG nova.storage.rbd_utils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image 9e1649d5-a78b-44ea-bbdb-8da86ace9900_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.658 2 DEBUG oslo_concurrency.processutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 9e1649d5-a78b-44ea-bbdb-8da86ace9900_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.699 2 DEBUG nova.compute.manager [req-1e6ecb7f-93e5-46a7-a0a3-3ffda646e424 req-1f500c0c-3634-42db-9d5e-194ff7907aae d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received event network-changed-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.699 2 DEBUG nova.compute.manager [req-1e6ecb7f-93e5-46a7-a0a3-3ffda646e424 req-1f500c0c-3634-42db-9d5e-194ff7907aae d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Refreshing instance network info cache due to event network-changed-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.700 2 DEBUG oslo_concurrency.lockutils [req-1e6ecb7f-93e5-46a7-a0a3-3ffda646e424 req-1f500c0c-3634-42db-9d5e-194ff7907aae d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.700 2 DEBUG oslo_concurrency.lockutils [req-1e6ecb7f-93e5-46a7-a0a3-3ffda646e424 req-1f500c0c-3634-42db-9d5e-194ff7907aae d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.700 2 DEBUG nova.network.neutron [req-1e6ecb7f-93e5-46a7-a0a3-3ffda646e424 req-1f500c0c-3634-42db-9d5e-194ff7907aae d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Refreshing network info cache for port c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.705 2 INFO os_vif [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:a3:24,bridge_name='br-int',has_traffic_filtering=True,id=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9dd6bc4-09')#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.726 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.727 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 9e1649d5-a78b-44ea-bbdb-8da86ace9900 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.727 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.727 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:34:56 np0005465988 nova_compute[236126]: 2025-10-02 12:34:56.784 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:34:57 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2061018038' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:34:57 np0005465988 nova_compute[236126]: 2025-10-02 12:34:57.222 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:57 np0005465988 nova_compute[236126]: 2025-10-02 12:34:57.229 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:57 np0005465988 nova_compute[236126]: 2025-10-02 12:34:57.232 2 DEBUG nova.network.neutron [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Successfully created port: 4158634f-0525-4c7b-b9b1-047f0a82eb12 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:34:57 np0005465988 nova_compute[236126]: 2025-10-02 12:34:57.279 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:57 np0005465988 nova_compute[236126]: 2025-10-02 12:34:57.621 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:34:57 np0005465988 nova_compute[236126]: 2025-10-02 12:34:57.622 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:57 np0005465988 nova_compute[236126]: 2025-10-02 12:34:57.803 2 DEBUG oslo_concurrency.processutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 9e1649d5-a78b-44ea-bbdb-8da86ace9900_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:57 np0005465988 nova_compute[236126]: 2025-10-02 12:34:57.904 2 DEBUG nova.storage.rbd_utils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] resizing rbd image 9e1649d5-a78b-44ea-bbdb-8da86ace9900_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:34:58 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:34:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:34:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:34:58.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:34:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.310 2 DEBUG nova.network.neutron [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Successfully updated port: 4158634f-0525-4c7b-b9b1-047f0a82eb12 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.391 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "refresh_cache-9e1649d5-a78b-44ea-bbdb-8da86ace9900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.391 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquired lock "refresh_cache-9e1649d5-a78b-44ea-bbdb-8da86ace9900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.391 2 DEBUG nova.network.neutron [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.407 2 DEBUG nova.compute.manager [req-32809645-fe0d-42f5-b954-5916be268375 req-2f3633cd-6aaa-4c00-8520-23a12c470d66 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Received event network-changed-4158634f-0525-4c7b-b9b1-047f0a82eb12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.408 2 DEBUG nova.compute.manager [req-32809645-fe0d-42f5-b954-5916be268375 req-2f3633cd-6aaa-4c00-8520-23a12c470d66 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Refreshing instance network info cache due to event network-changed-4158634f-0525-4c7b-b9b1-047f0a82eb12. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.408 2 DEBUG oslo_concurrency.lockutils [req-32809645-fe0d-42f5-b954-5916be268375 req-2f3633cd-6aaa-4c00-8520-23a12c470d66 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-9e1649d5-a78b-44ea-bbdb-8da86ace9900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.530 2 DEBUG nova.network.neutron [req-1e6ecb7f-93e5-46a7-a0a3-3ffda646e424 req-1f500c0c-3634-42db-9d5e-194ff7907aae d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updated VIF entry in instance network info cache for port c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.531 2 DEBUG nova.network.neutron [req-1e6ecb7f-93e5-46a7-a0a3-3ffda646e424 req-1f500c0c-3634-42db-9d5e-194ff7907aae d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updating instance_info_cache with network_info: [{"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": null, "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.539 2 DEBUG nova.objects.instance [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lazy-loading 'migration_context' on Instance uuid 9e1649d5-a78b-44ea-bbdb-8da86ace9900 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:34:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:34:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:34:58.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.644 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.645 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Ensure instance console log exists: /var/lib/nova/instances/9e1649d5-a78b-44ea-bbdb-8da86ace9900/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.646 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.647 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.647 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.650 2 DEBUG oslo_concurrency.lockutils [req-1e6ecb7f-93e5-46a7-a0a3-3ffda646e424 req-1f500c0c-3634-42db-9d5e-194ff7907aae d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:34:58 np0005465988 nova_compute[236126]: 2025-10-02 12:34:58.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:59 np0005465988 nova_compute[236126]: 2025-10-02 12:34:59.251 2 DEBUG nova.network.neutron [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:34:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e329 e329: 3 total, 3 up, 3 in
Oct  2 08:34:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:34:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:34:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:34:59 np0005465988 nova_compute[236126]: 2025-10-02 12:34:59.617 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:59 np0005465988 nova_compute[236126]: 2025-10-02 12:34:59.618 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:34:59.842 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:00.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:00.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:35:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:35:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.162 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408486.1616304, a4df176d-5fef-490e-8cee-3424098213f5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.163 2 INFO nova.compute.manager [-] [instance: a4df176d-5fef-490e-8cee-3424098213f5] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.304 2 DEBUG nova.compute.manager [None req-f960f384-3b9c-4b30-b57f-610ec88f2232 - - - - - -] [instance: a4df176d-5fef-490e-8cee-3424098213f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.306 2 DEBUG nova.network.neutron [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Updating instance_info_cache with network_info: [{"id": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "address": "fa:16:3e:80:96:e2", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4158634f-05", "ovs_interfaceid": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.910 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408486.9086564, 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.910 2 INFO nova.compute.manager [-] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.985 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Releasing lock "refresh_cache-9e1649d5-a78b-44ea-bbdb-8da86ace9900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.985 2 DEBUG nova.compute.manager [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Instance network_info: |[{"id": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "address": "fa:16:3e:80:96:e2", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4158634f-05", "ovs_interfaceid": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.985 2 DEBUG oslo_concurrency.lockutils [req-32809645-fe0d-42f5-b954-5916be268375 req-2f3633cd-6aaa-4c00-8520-23a12c470d66 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-9e1649d5-a78b-44ea-bbdb-8da86ace9900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.986 2 DEBUG nova.network.neutron [req-32809645-fe0d-42f5-b954-5916be268375 req-2f3633cd-6aaa-4c00-8520-23a12c470d66 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Refreshing network info cache for port 4158634f-0525-4c7b-b9b1-047f0a82eb12 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.988 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Start _get_guest_xml network_info=[{"id": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "address": "fa:16:3e:80:96:e2", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4158634f-05", "ovs_interfaceid": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.993 2 WARNING nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.998 2 DEBUG nova.virt.libvirt.host [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:35:01 np0005465988 nova_compute[236126]: 2025-10-02 12:35:01.999 2 DEBUG nova.virt.libvirt.host [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.002 2 DEBUG nova.virt.libvirt.host [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.002 2 DEBUG nova.virt.libvirt.host [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.003 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.003 2 DEBUG nova.virt.hardware [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.003 2 DEBUG nova.virt.hardware [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.004 2 DEBUG nova.virt.hardware [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.004 2 DEBUG nova.virt.hardware [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.004 2 DEBUG nova.virt.hardware [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.004 2 DEBUG nova.virt.hardware [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.004 2 DEBUG nova.virt.hardware [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.004 2 DEBUG nova.virt.hardware [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.005 2 DEBUG nova.virt.hardware [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.005 2 DEBUG nova.virt.hardware [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.005 2 DEBUG nova.virt.hardware [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.007 2 DEBUG oslo_concurrency.processutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.132 2 DEBUG nova.compute.manager [None req-7194e110-29cb-4a61-9cda-11d6f373fa9c - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.137 2 DEBUG nova.compute.manager [None req-7194e110-29cb-4a61-9cda-11d6f373fa9c - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: shelved, current task_state: shelving_offloading, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:35:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:02.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:35:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4276342209' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.458 2 INFO nova.compute.manager [None req-7194e110-29cb-4a61-9cda-11d6f373fa9c - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] During sync_power_state the instance has a pending task (shelving_offloading). Skip.#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.563 2 DEBUG oslo_concurrency.processutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.556s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.598 2 DEBUG nova.storage.rbd_utils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image 9e1649d5-a78b-44ea-bbdb-8da86ace9900_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:35:02 np0005465988 nova_compute[236126]: 2025-10-02 12:35:02.603 2 DEBUG oslo_concurrency.processutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:02.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:35:03 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3877520165' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:35:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e329 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.444 2 DEBUG oslo_concurrency.processutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.841s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.447 2 DEBUG nova.virt.libvirt.vif [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:34:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-192500961',display_name='tempest-ServersTestJSON-server-192500961',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-192500961',id=136,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5yqdemPRSv/m+i5ogmEl+ivj7kcDM1udJ4tkcei/4AP6qUNnO+lTrS7vg8PUvVvug9lLB/spCLtqYH7A5wTPmeda4+Mcg76kFWzsdjncekZs4BfCndGERKlMIyOQ01rQ==',key_name='tempest-key-1099587481',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bc0d63d3b4404ef8858166e8836dd0af',ramdisk_id='',reservation_id='r-popyf3zc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-80077074',owner_user_name='tempest-ServersTestJSON-80077074-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:34:56Z,user_data=None,user_id='fe9cc788734f406d826446a848700331',uuid=9e1649d5-a78b-44ea-bbdb-8da86ace9900,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "address": "fa:16:3e:80:96:e2", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4158634f-05", "ovs_interfaceid": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.447 2 DEBUG nova.network.os_vif_util [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converting VIF {"id": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "address": "fa:16:3e:80:96:e2", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4158634f-05", "ovs_interfaceid": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.449 2 DEBUG nova.network.os_vif_util [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:96:e2,bridge_name='br-int',has_traffic_filtering=True,id=4158634f-0525-4c7b-b9b1-047f0a82eb12,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4158634f-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.451 2 DEBUG nova.objects.instance [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lazy-loading 'pci_devices' on Instance uuid 9e1649d5-a78b-44ea-bbdb-8da86ace9900 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.495 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  <uuid>9e1649d5-a78b-44ea-bbdb-8da86ace9900</uuid>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  <name>instance-00000088</name>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServersTestJSON-server-192500961</nova:name>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:35:01</nova:creationTime>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <nova:user uuid="fe9cc788734f406d826446a848700331">tempest-ServersTestJSON-80077074-project-member</nova:user>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <nova:project uuid="bc0d63d3b4404ef8858166e8836dd0af">tempest-ServersTestJSON-80077074</nova:project>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <nova:port uuid="4158634f-0525-4c7b-b9b1-047f0a82eb12">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <entry name="serial">9e1649d5-a78b-44ea-bbdb-8da86ace9900</entry>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <entry name="uuid">9e1649d5-a78b-44ea-bbdb-8da86ace9900</entry>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/9e1649d5-a78b-44ea-bbdb-8da86ace9900_disk">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/9e1649d5-a78b-44ea-bbdb-8da86ace9900_disk.config">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:80:96:e2"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <target dev="tap4158634f-05"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/9e1649d5-a78b-44ea-bbdb-8da86ace9900/console.log" append="off"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:35:03 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:35:03 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:35:03 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:35:03 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.497 2 DEBUG nova.compute.manager [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Preparing to wait for external event network-vif-plugged-4158634f-0525-4c7b-b9b1-047f0a82eb12 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.498 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.499 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.499 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.501 2 DEBUG nova.virt.libvirt.vif [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:34:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-192500961',display_name='tempest-ServersTestJSON-server-192500961',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-192500961',id=136,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5yqdemPRSv/m+i5ogmEl+ivj7kcDM1udJ4tkcei/4AP6qUNnO+lTrS7vg8PUvVvug9lLB/spCLtqYH7A5wTPmeda4+Mcg76kFWzsdjncekZs4BfCndGERKlMIyOQ01rQ==',key_name='tempest-key-1099587481',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bc0d63d3b4404ef8858166e8836dd0af',ramdisk_id='',reservation_id='r-popyf3zc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-80077074',owner_user_name='tempest-ServersTestJSON-80077074-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:34:56Z,user_data=None,user_id='fe9cc788734f406d826446a848700331',uuid=9e1649d5-a78b-44ea-bbdb-8da86ace9900,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "address": "fa:16:3e:80:96:e2", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4158634f-05", "ovs_interfaceid": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.501 2 DEBUG nova.network.os_vif_util [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converting VIF {"id": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "address": "fa:16:3e:80:96:e2", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4158634f-05", "ovs_interfaceid": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.502 2 DEBUG nova.network.os_vif_util [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:96:e2,bridge_name='br-int',has_traffic_filtering=True,id=4158634f-0525-4c7b-b9b1-047f0a82eb12,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4158634f-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.503 2 DEBUG os_vif [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:96:e2,bridge_name='br-int',has_traffic_filtering=True,id=4158634f-0525-4c7b-b9b1-047f0a82eb12,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4158634f-05') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.505 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.506 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.510 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4158634f-05, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.511 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4158634f-05, col_values=(('external_ids', {'iface-id': '4158634f-0525-4c7b-b9b1-047f0a82eb12', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:96:e2', 'vm-uuid': '9e1649d5-a78b-44ea-bbdb-8da86ace9900'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:03 np0005465988 NetworkManager[45041]: <info>  [1759408503.5147] manager: (tap4158634f-05): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.524 2 INFO os_vif [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:96:e2,bridge_name='br-int',has_traffic_filtering=True,id=4158634f-0525-4c7b-b9b1-047f0a82eb12,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4158634f-05')#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.744 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.745 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.746 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] No VIF found with MAC fa:16:3e:80:96:e2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.747 2 INFO nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Using config drive#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.794 2 DEBUG nova.storage.rbd_utils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image 9e1649d5-a78b-44ea-bbdb-8da86ace9900_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:35:03 np0005465988 nova_compute[236126]: 2025-10-02 12:35:03.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:04.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:04 np0005465988 nova_compute[236126]: 2025-10-02 12:35:04.384 2 INFO nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Creating config drive at /var/lib/nova/instances/9e1649d5-a78b-44ea-bbdb-8da86ace9900/disk.config#033[00m
Oct  2 08:35:04 np0005465988 nova_compute[236126]: 2025-10-02 12:35:04.389 2 DEBUG oslo_concurrency.processutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9e1649d5-a78b-44ea-bbdb-8da86ace9900/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx6vdvabg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:04 np0005465988 nova_compute[236126]: 2025-10-02 12:35:04.529 2 DEBUG oslo_concurrency.processutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9e1649d5-a78b-44ea-bbdb-8da86ace9900/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx6vdvabg" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:04 np0005465988 nova_compute[236126]: 2025-10-02 12:35:04.570 2 DEBUG nova.storage.rbd_utils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image 9e1649d5-a78b-44ea-bbdb-8da86ace9900_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:35:04 np0005465988 nova_compute[236126]: 2025-10-02 12:35:04.573 2 DEBUG oslo_concurrency.processutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/9e1649d5-a78b-44ea-bbdb-8da86ace9900/disk.config 9e1649d5-a78b-44ea-bbdb-8da86ace9900_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:04.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:05 np0005465988 nova_compute[236126]: 2025-10-02 12:35:05.017 2 DEBUG nova.network.neutron [req-32809645-fe0d-42f5-b954-5916be268375 req-2f3633cd-6aaa-4c00-8520-23a12c470d66 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Updated VIF entry in instance network info cache for port 4158634f-0525-4c7b-b9b1-047f0a82eb12. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:35:05 np0005465988 nova_compute[236126]: 2025-10-02 12:35:05.018 2 DEBUG nova.network.neutron [req-32809645-fe0d-42f5-b954-5916be268375 req-2f3633cd-6aaa-4c00-8520-23a12c470d66 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Updating instance_info_cache with network_info: [{"id": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "address": "fa:16:3e:80:96:e2", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4158634f-05", "ovs_interfaceid": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:05 np0005465988 nova_compute[236126]: 2025-10-02 12:35:05.062 2 DEBUG oslo_concurrency.lockutils [req-32809645-fe0d-42f5-b954-5916be268375 req-2f3633cd-6aaa-4c00-8520-23a12c470d66 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-9e1649d5-a78b-44ea-bbdb-8da86ace9900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:05 np0005465988 nova_compute[236126]: 2025-10-02 12:35:05.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:06.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:06 np0005465988 nova_compute[236126]: 2025-10-02 12:35:06.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:06.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:07 np0005465988 podman[295327]: 2025-10-02 12:35:07.803542312 +0000 UTC m=+0.079411385 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 08:35:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:08.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e329 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:35:08 np0005465988 nova_compute[236126]: 2025-10-02 12:35:08.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:08.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:08 np0005465988 nova_compute[236126]: 2025-10-02 12:35:08.855 2 DEBUG oslo_concurrency.processutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/9e1649d5-a78b-44ea-bbdb-8da86ace9900/disk.config 9e1649d5-a78b-44ea-bbdb-8da86ace9900_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.282s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:08 np0005465988 nova_compute[236126]: 2025-10-02 12:35:08.856 2 INFO nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Deleting local config drive /var/lib/nova/instances/9e1649d5-a78b-44ea-bbdb-8da86ace9900/disk.config because it was imported into RBD.#033[00m
Oct  2 08:35:08 np0005465988 nova_compute[236126]: 2025-10-02 12:35:08.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:08 np0005465988 kernel: tap4158634f-05: entered promiscuous mode
Oct  2 08:35:08 np0005465988 NetworkManager[45041]: <info>  [1759408508.9563] manager: (tap4158634f-05): new Tun device (/org/freedesktop/NetworkManager/Devices/277)
Oct  2 08:35:08 np0005465988 nova_compute[236126]: 2025-10-02 12:35:08.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:08 np0005465988 ovn_controller[132601]: 2025-10-02T12:35:08Z|00621|binding|INFO|Claiming lport 4158634f-0525-4c7b-b9b1-047f0a82eb12 for this chassis.
Oct  2 08:35:08 np0005465988 ovn_controller[132601]: 2025-10-02T12:35:08Z|00622|binding|INFO|4158634f-0525-4c7b-b9b1-047f0a82eb12: Claiming fa:16:3e:80:96:e2 10.100.0.9
Oct  2 08:35:08 np0005465988 ovn_controller[132601]: 2025-10-02T12:35:08Z|00623|binding|INFO|Setting lport 4158634f-0525-4c7b-b9b1-047f0a82eb12 ovn-installed in OVS
Oct  2 08:35:08 np0005465988 nova_compute[236126]: 2025-10-02 12:35:08.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:08 np0005465988 nova_compute[236126]: 2025-10-02 12:35:08.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:08 np0005465988 systemd-udevd[295385]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:35:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:08.995 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:96:e2 10.100.0.9'], port_security=['fa:16:3e:80:96:e2 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9e1649d5-a78b-44ea-bbdb-8da86ace9900', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc0d63d3b4404ef8858166e8836dd0af', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2a11ff87-bec6-4638-b302-adcd655efba9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=797b6af2-473b-4626-9e97-a0a489119419, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=4158634f-0525-4c7b-b9b1-047f0a82eb12) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:35:08 np0005465988 ovn_controller[132601]: 2025-10-02T12:35:08Z|00624|binding|INFO|Setting lport 4158634f-0525-4c7b-b9b1-047f0a82eb12 up in Southbound
Oct  2 08:35:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:08.996 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 4158634f-0525-4c7b-b9b1-047f0a82eb12 in datapath d7203b00-e5e4-402e-b777-ac6280fa23ac bound to our chassis#033[00m
Oct  2 08:35:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:08.998 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7203b00-e5e4-402e-b777-ac6280fa23ac#033[00m
Oct  2 08:35:09 np0005465988 NetworkManager[45041]: <info>  [1759408509.0011] device (tap4158634f-05): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:35:09 np0005465988 NetworkManager[45041]: <info>  [1759408509.0021] device (tap4158634f-05): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:35:09 np0005465988 systemd-machined[192594]: New machine qemu-61-instance-00000088.
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.012 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b5a62f9a-36d9-40ed-b052-1bc98e400c6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.012 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7203b00-e1 in ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.014 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7203b00-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.014 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[85eed37c-d67c-459b-8f37-2551411256d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.015 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[db845734-c4cf-4550-a5a9-e24ef685a414]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.027 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[7483c286-164a-4160-b424-842b0f3ba42f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 systemd[1]: Started Virtual Machine qemu-61-instance-00000088.
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.050 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[876e5073-95b9-4296-bb6a-a5b4f889980f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.086 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[07ce5f06-b429-4098-b61d-e593586eae2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 NetworkManager[45041]: <info>  [1759408509.0919] manager: (tapd7203b00-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/278)
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.091 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[539b06d1-c708-4162-b8df-18b2091f73ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.128 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d8896dfb-7474-44a2-929d-ccc5f1157d47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.131 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[627060c5-e425-4047-82d6-e022fbd0ccb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 NetworkManager[45041]: <info>  [1759408509.1564] device (tapd7203b00-e0): carrier: link connected
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.161 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f1854e66-71f3-42b6-9755-4edeabab1317]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.181 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0f30b224-85f0-4cdb-bca1-9717a84ffed5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7203b00-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:c4:e9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 656948, 'reachable_time': 17815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295419, 'error': None, 'target': 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.199 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e8572417-9b25-43a8-bd41-6bd561914b93]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6a:c4e9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 656948, 'tstamp': 656948}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 295420, 'error': None, 'target': 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.217 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[21262d8d-69b8-4e45-81fe-2ed7a583ef67]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7203b00-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:c4:e9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 656948, 'reachable_time': 17815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 295421, 'error': None, 'target': 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.248 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[03271c6f-4c45-45df-9161-2f1eed9979f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.307 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6984ab02-bbca-455f-91dd-47e552aefb91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.309 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7203b00-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.309 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.310 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7203b00-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:09 np0005465988 NetworkManager[45041]: <info>  [1759408509.3131] manager: (tapd7203b00-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/279)
Oct  2 08:35:09 np0005465988 kernel: tapd7203b00-e0: entered promiscuous mode
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.318 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7203b00-e0, col_values=(('external_ids', {'iface-id': '6f9d54ba-3cfb-48b9-bef7-b2077e6931d7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:35:09Z|00625|binding|INFO|Releasing lport 6f9d54ba-3cfb-48b9-bef7-b2077e6931d7 from this chassis (sb_readonly=0)
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.335 2 DEBUG nova.compute.manager [req-7506537b-7e0b-4f78-8afb-eec1fc7f46bb req-c6f11955-4f9d-4988-8dd5-a24020be9e36 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Received event network-vif-plugged-4158634f-0525-4c7b-b9b1-047f0a82eb12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.335 2 DEBUG oslo_concurrency.lockutils [req-7506537b-7e0b-4f78-8afb-eec1fc7f46bb req-c6f11955-4f9d-4988-8dd5-a24020be9e36 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.335 2 DEBUG oslo_concurrency.lockutils [req-7506537b-7e0b-4f78-8afb-eec1fc7f46bb req-c6f11955-4f9d-4988-8dd5-a24020be9e36 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.336 2 DEBUG oslo_concurrency.lockutils [req-7506537b-7e0b-4f78-8afb-eec1fc7f46bb req-c6f11955-4f9d-4988-8dd5-a24020be9e36 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.336 2 DEBUG nova.compute.manager [req-7506537b-7e0b-4f78-8afb-eec1fc7f46bb req-c6f11955-4f9d-4988-8dd5-a24020be9e36 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Processing event network-vif-plugged-4158634f-0525-4c7b-b9b1-047f0a82eb12 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.386 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7203b00-e5e4-402e-b777-ac6280fa23ac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7203b00-e5e4-402e-b777-ac6280fa23ac.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.387 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dad65472-42ac-48ad-adfb-62a7c4a99d32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.388 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-d7203b00-e5e4-402e-b777-ac6280fa23ac
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/d7203b00-e5e4-402e-b777-ac6280fa23ac.pid.haproxy
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID d7203b00-e5e4-402e-b777-ac6280fa23ac
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:35:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:09.389 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'env', 'PROCESS_TAG=haproxy-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7203b00-e5e4-402e-b777-ac6280fa23ac.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.545 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.546 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.546 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.547 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:35:09 np0005465988 nova_compute[236126]: 2025-10-02 12:35:09.547 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:09 np0005465988 podman[295460]: 2025-10-02 12:35:09.792436847 +0000 UTC m=+0.041744752 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:35:10 np0005465988 podman[295460]: 2025-10-02 12:35:10.046144795 +0000 UTC m=+0.295452650 container create e3aee06136bc921f0f6ea6bb6c5944bbcd3fa894ffc6777ac04be58ca2a83978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:35:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:10.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:10 np0005465988 systemd[1]: Started libpod-conmon-e3aee06136bc921f0f6ea6bb6c5944bbcd3fa894ffc6777ac04be58ca2a83978.scope.
Oct  2 08:35:10 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:35:10 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96812dcca62091f2b5a49420a61c0cf2fd236baa8d03b87d4b8215ebe320773f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:35:10 np0005465988 podman[295460]: 2025-10-02 12:35:10.291645436 +0000 UTC m=+0.540953341 container init e3aee06136bc921f0f6ea6bb6c5944bbcd3fa894ffc6777ac04be58ca2a83978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:35:10 np0005465988 podman[295460]: 2025-10-02 12:35:10.301894801 +0000 UTC m=+0.551202656 container start e3aee06136bc921f0f6ea6bb6c5944bbcd3fa894ffc6777ac04be58ca2a83978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:35:10 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[295476]: [NOTICE]   (295480) : New worker (295490) forked
Oct  2 08:35:10 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[295476]: [NOTICE]   (295480) : Loading success.
Oct  2 08:35:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:10.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.047 2 DEBUG nova.compute.manager [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.048 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408511.0467877, 9e1649d5-a78b-44ea-bbdb-8da86ace9900 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.048 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] VM Started (Lifecycle Event)#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.051 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.055 2 INFO nova.virt.libvirt.driver [-] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Instance spawned successfully.#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.055 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.118 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.125 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.129 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.130 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.130 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.131 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.131 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.131 2 DEBUG nova.virt.libvirt.driver [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.180 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.181 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408511.047809, 9e1649d5-a78b-44ea-bbdb-8da86ace9900 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.182 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.346 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.350 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408511.050671, 9e1649d5-a78b-44ea-bbdb-8da86ace9900 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.351 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.447 2 INFO nova.compute.manager [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Took 15.01 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.448 2 DEBUG nova.compute.manager [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.452 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.462 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.556 2 INFO nova.virt.libvirt.driver [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Deleting instance files /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9_del#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.557 2 INFO nova.virt.libvirt.driver [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Deletion of /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9_del complete#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.754 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.800 2 INFO nova.compute.manager [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Took 17.21 seconds to build instance.#033[00m
Oct  2 08:35:11 np0005465988 nova_compute[236126]: 2025-10-02 12:35:11.979 2 DEBUG oslo_concurrency.lockutils [None req-5367864f-e8cf-42e2-9690-2e94d4d4d2a7 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.470s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:12.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:12 np0005465988 nova_compute[236126]: 2025-10-02 12:35:12.286 2 INFO nova.scheduler.client.report [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Deleted allocations for instance 8736e2a4-70c8-46c1-8ce5-ff68395a22c9#033[00m
Oct  2 08:35:12 np0005465988 nova_compute[236126]: 2025-10-02 12:35:12.562 2 DEBUG oslo_concurrency.lockutils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:12 np0005465988 nova_compute[236126]: 2025-10-02 12:35:12.563 2 DEBUG oslo_concurrency.lockutils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:12 np0005465988 nova_compute[236126]: 2025-10-02 12:35:12.600 2 DEBUG oslo_concurrency.processutils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:12.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:35:13 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1174958944' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:35:13 np0005465988 nova_compute[236126]: 2025-10-02 12:35:13.043 2 DEBUG oslo_concurrency.processutils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:13 np0005465988 nova_compute[236126]: 2025-10-02 12:35:13.050 2 DEBUG nova.compute.provider_tree [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:35:13 np0005465988 nova_compute[236126]: 2025-10-02 12:35:13.140 2 DEBUG nova.scheduler.client.report [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:35:13 np0005465988 nova_compute[236126]: 2025-10-02 12:35:13.183 2 DEBUG oslo_concurrency.lockutils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e329 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:35:13 np0005465988 nova_compute[236126]: 2025-10-02 12:35:13.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:13 np0005465988 nova_compute[236126]: 2025-10-02 12:35:13.521 2 DEBUG nova.compute.manager [req-5bbded86-dac5-44a0-b131-0c8f7f8f7e03 req-11086e20-b460-484f-9ddb-77a455e627cd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Received event network-vif-plugged-4158634f-0525-4c7b-b9b1-047f0a82eb12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:13 np0005465988 nova_compute[236126]: 2025-10-02 12:35:13.522 2 DEBUG oslo_concurrency.lockutils [req-5bbded86-dac5-44a0-b131-0c8f7f8f7e03 req-11086e20-b460-484f-9ddb-77a455e627cd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:13 np0005465988 nova_compute[236126]: 2025-10-02 12:35:13.523 2 DEBUG oslo_concurrency.lockutils [req-5bbded86-dac5-44a0-b131-0c8f7f8f7e03 req-11086e20-b460-484f-9ddb-77a455e627cd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:13 np0005465988 nova_compute[236126]: 2025-10-02 12:35:13.523 2 DEBUG oslo_concurrency.lockutils [req-5bbded86-dac5-44a0-b131-0c8f7f8f7e03 req-11086e20-b460-484f-9ddb-77a455e627cd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:13 np0005465988 nova_compute[236126]: 2025-10-02 12:35:13.524 2 DEBUG nova.compute.manager [req-5bbded86-dac5-44a0-b131-0c8f7f8f7e03 req-11086e20-b460-484f-9ddb-77a455e627cd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] No waiting events found dispatching network-vif-plugged-4158634f-0525-4c7b-b9b1-047f0a82eb12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:35:13 np0005465988 nova_compute[236126]: 2025-10-02 12:35:13.524 2 WARNING nova.compute.manager [req-5bbded86-dac5-44a0-b131-0c8f7f8f7e03 req-11086e20-b460-484f-9ddb-77a455e627cd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Received unexpected event network-vif-plugged-4158634f-0525-4c7b-b9b1-047f0a82eb12 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:35:13 np0005465988 nova_compute[236126]: 2025-10-02 12:35:13.546 2 DEBUG oslo_concurrency.lockutils [None req-286ac369-f846-453c-b6b1-99d839870657 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 29.520s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:13 np0005465988 nova_compute[236126]: 2025-10-02 12:35:13.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:14.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:14.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:15 np0005465988 nova_compute[236126]: 2025-10-02 12:35:15.433 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updating instance_info_cache with network_info: [{"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": null, "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:15 np0005465988 nova_compute[236126]: 2025-10-02 12:35:15.521 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:15 np0005465988 nova_compute[236126]: 2025-10-02 12:35:15.521 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:35:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:16.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:16.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.272 2 DEBUG oslo_concurrency.lockutils [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.273 2 DEBUG oslo_concurrency.lockutils [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.273 2 DEBUG oslo_concurrency.lockutils [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.274 2 DEBUG oslo_concurrency.lockutils [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.274 2 DEBUG oslo_concurrency.lockutils [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.277 2 INFO nova.compute.manager [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Terminating instance#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.278 2 DEBUG nova.compute.manager [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:35:17 np0005465988 kernel: tap4158634f-05 (unregistering): left promiscuous mode
Oct  2 08:35:17 np0005465988 NetworkManager[45041]: <info>  [1759408517.4426] device (tap4158634f-05): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:35:17 np0005465988 ovn_controller[132601]: 2025-10-02T12:35:17Z|00626|binding|INFO|Releasing lport 4158634f-0525-4c7b-b9b1-047f0a82eb12 from this chassis (sb_readonly=0)
Oct  2 08:35:17 np0005465988 ovn_controller[132601]: 2025-10-02T12:35:17Z|00627|binding|INFO|Setting lport 4158634f-0525-4c7b-b9b1-047f0a82eb12 down in Southbound
Oct  2 08:35:17 np0005465988 ovn_controller[132601]: 2025-10-02T12:35:17Z|00628|binding|INFO|Removing iface tap4158634f-05 ovn-installed in OVS
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:17 np0005465988 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000088.scope: Deactivated successfully.
Oct  2 08:35:17 np0005465988 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000088.scope: Consumed 7.535s CPU time.
Oct  2 08:35:17 np0005465988 systemd-machined[192594]: Machine qemu-61-instance-00000088 terminated.
Oct  2 08:35:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:17.546 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:96:e2 10.100.0.9'], port_security=['fa:16:3e:80:96:e2 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9e1649d5-a78b-44ea-bbdb-8da86ace9900', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc0d63d3b4404ef8858166e8836dd0af', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2a11ff87-bec6-4638-b302-adcd655efba9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=797b6af2-473b-4626-9e97-a0a489119419, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=4158634f-0525-4c7b-b9b1-047f0a82eb12) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:35:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:17.548 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 4158634f-0525-4c7b-b9b1-047f0a82eb12 in datapath d7203b00-e5e4-402e-b777-ac6280fa23ac unbound from our chassis#033[00m
Oct  2 08:35:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:17.550 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7203b00-e5e4-402e-b777-ac6280fa23ac, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:35:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:17.552 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[27a7e806-367b-4976-979b-50874474eaa9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:17.553 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac namespace which is not needed anymore#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.727 2 INFO nova.virt.libvirt.driver [-] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Instance destroyed successfully.#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.729 2 DEBUG nova.objects.instance [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lazy-loading 'resources' on Instance uuid 9e1649d5-a78b-44ea-bbdb-8da86ace9900 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:17 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[295476]: [NOTICE]   (295480) : haproxy version is 2.8.14-c23fe91
Oct  2 08:35:17 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[295476]: [NOTICE]   (295480) : path to executable is /usr/sbin/haproxy
Oct  2 08:35:17 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[295476]: [WARNING]  (295480) : Exiting Master process...
Oct  2 08:35:17 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[295476]: [ALERT]    (295480) : Current worker (295490) exited with code 143 (Terminated)
Oct  2 08:35:17 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[295476]: [WARNING]  (295480) : All workers exited. Exiting... (0)
Oct  2 08:35:17 np0005465988 systemd[1]: libpod-e3aee06136bc921f0f6ea6bb6c5944bbcd3fa894ffc6777ac04be58ca2a83978.scope: Deactivated successfully.
Oct  2 08:35:17 np0005465988 podman[295576]: 2025-10-02 12:35:17.785310504 +0000 UTC m=+0.075507413 container died e3aee06136bc921f0f6ea6bb6c5944bbcd3fa894ffc6777ac04be58ca2a83978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:35:17 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e3aee06136bc921f0f6ea6bb6c5944bbcd3fa894ffc6777ac04be58ca2a83978-userdata-shm.mount: Deactivated successfully.
Oct  2 08:35:17 np0005465988 systemd[1]: var-lib-containers-storage-overlay-96812dcca62091f2b5a49420a61c0cf2fd236baa8d03b87d4b8215ebe320773f-merged.mount: Deactivated successfully.
Oct  2 08:35:17 np0005465988 podman[295576]: 2025-10-02 12:35:17.834388766 +0000 UTC m=+0.124585665 container cleanup e3aee06136bc921f0f6ea6bb6c5944bbcd3fa894ffc6777ac04be58ca2a83978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:35:17 np0005465988 systemd[1]: libpod-conmon-e3aee06136bc921f0f6ea6bb6c5944bbcd3fa894ffc6777ac04be58ca2a83978.scope: Deactivated successfully.
Oct  2 08:35:17 np0005465988 podman[295615]: 2025-10-02 12:35:17.926852825 +0000 UTC m=+0.059137342 container remove e3aee06136bc921f0f6ea6bb6c5944bbcd3fa894ffc6777ac04be58ca2a83978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:35:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:17.939 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[97d41780-228e-4504-88b0-b861d18ad574]: (4, ('Thu Oct  2 12:35:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac (e3aee06136bc921f0f6ea6bb6c5944bbcd3fa894ffc6777ac04be58ca2a83978)\ne3aee06136bc921f0f6ea6bb6c5944bbcd3fa894ffc6777ac04be58ca2a83978\nThu Oct  2 12:35:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac (e3aee06136bc921f0f6ea6bb6c5944bbcd3fa894ffc6777ac04be58ca2a83978)\ne3aee06136bc921f0f6ea6bb6c5944bbcd3fa894ffc6777ac04be58ca2a83978\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:17.941 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8e179ae6-1506-4649-aa3a-6899f8fba408]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:17.944 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7203b00-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:17 np0005465988 kernel: tapd7203b00-e0: left promiscuous mode
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:17.986 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fd45f81e-04c1-4650-a304-3b3f148b4a04]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.987 2 DEBUG nova.virt.libvirt.vif [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:34:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-192500961',display_name='tempest-ServersTestJSON-server-192500961',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-192500961',id=136,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5yqdemPRSv/m+i5ogmEl+ivj7kcDM1udJ4tkcei/4AP6qUNnO+lTrS7vg8PUvVvug9lLB/spCLtqYH7A5wTPmeda4+Mcg76kFWzsdjncekZs4BfCndGERKlMIyOQ01rQ==',key_name='tempest-key-1099587481',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:35:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bc0d63d3b4404ef8858166e8836dd0af',ramdisk_id='',reservation_id='r-popyf3zc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-80077074',owner_user_name='tempest-ServersTestJSON-80077074-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:35:11Z,user_data=None,user_id='fe9cc788734f406d826446a848700331',uuid=9e1649d5-a78b-44ea-bbdb-8da86ace9900,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "address": "fa:16:3e:80:96:e2", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4158634f-05", "ovs_interfaceid": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.989 2 DEBUG nova.network.os_vif_util [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converting VIF {"id": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "address": "fa:16:3e:80:96:e2", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4158634f-05", "ovs_interfaceid": "4158634f-0525-4c7b-b9b1-047f0a82eb12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.991 2 DEBUG nova.network.os_vif_util [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:96:e2,bridge_name='br-int',has_traffic_filtering=True,id=4158634f-0525-4c7b-b9b1-047f0a82eb12,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4158634f-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.992 2 DEBUG os_vif [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:96:e2,bridge_name='br-int',has_traffic_filtering=True,id=4158634f-0525-4c7b-b9b1-047f0a82eb12,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4158634f-05') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:17 np0005465988 nova_compute[236126]: 2025-10-02 12:35:17.997 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4158634f-05, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:18 np0005465988 nova_compute[236126]: 2025-10-02 12:35:18.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:18 np0005465988 nova_compute[236126]: 2025-10-02 12:35:18.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:35:18 np0005465988 nova_compute[236126]: 2025-10-02 12:35:18.007 2 INFO os_vif [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:96:e2,bridge_name='br-int',has_traffic_filtering=True,id=4158634f-0525-4c7b-b9b1-047f0a82eb12,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4158634f-05')#033[00m
Oct  2 08:35:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:18.020 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0691579f-cefa-4ffd-9ff2-c97933f2d9e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:18.021 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6ceb1bff-c331-4f07-8825-bc9cda1dfa13]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:18.044 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[72196975-bb02-4aa0-a322-08667cbffafa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 656941, 'reachable_time': 38125, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295647, 'error': None, 'target': 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:18 np0005465988 systemd[1]: run-netns-ovnmeta\x2dd7203b00\x2de5e4\x2d402e\x2db777\x2dac6280fa23ac.mount: Deactivated successfully.
Oct  2 08:35:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:18.049 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:35:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:18.050 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[9c5648bd-64dd-43d7-a4ee-a9f71cc16f92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:18.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e329 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:35:18 np0005465988 nova_compute[236126]: 2025-10-02 12:35:18.393 2 DEBUG nova.compute.manager [req-34678e60-209c-465a-b3d9-75bf2ac35c68 req-80ca9fa6-0309-409d-ada3-c12dc2575233 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Received event network-vif-unplugged-4158634f-0525-4c7b-b9b1-047f0a82eb12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:18 np0005465988 nova_compute[236126]: 2025-10-02 12:35:18.393 2 DEBUG oslo_concurrency.lockutils [req-34678e60-209c-465a-b3d9-75bf2ac35c68 req-80ca9fa6-0309-409d-ada3-c12dc2575233 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:18 np0005465988 nova_compute[236126]: 2025-10-02 12:35:18.394 2 DEBUG oslo_concurrency.lockutils [req-34678e60-209c-465a-b3d9-75bf2ac35c68 req-80ca9fa6-0309-409d-ada3-c12dc2575233 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:18 np0005465988 nova_compute[236126]: 2025-10-02 12:35:18.394 2 DEBUG oslo_concurrency.lockutils [req-34678e60-209c-465a-b3d9-75bf2ac35c68 req-80ca9fa6-0309-409d-ada3-c12dc2575233 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:18 np0005465988 nova_compute[236126]: 2025-10-02 12:35:18.395 2 DEBUG nova.compute.manager [req-34678e60-209c-465a-b3d9-75bf2ac35c68 req-80ca9fa6-0309-409d-ada3-c12dc2575233 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] No waiting events found dispatching network-vif-unplugged-4158634f-0525-4c7b-b9b1-047f0a82eb12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:35:18 np0005465988 nova_compute[236126]: 2025-10-02 12:35:18.395 2 DEBUG nova.compute.manager [req-34678e60-209c-465a-b3d9-75bf2ac35c68 req-80ca9fa6-0309-409d-ada3-c12dc2575233 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Received event network-vif-unplugged-4158634f-0525-4c7b-b9b1-047f0a82eb12 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:35:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:18.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:18 np0005465988 nova_compute[236126]: 2025-10-02 12:35:18.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:20.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:20 np0005465988 nova_compute[236126]: 2025-10-02 12:35:20.177 2 INFO nova.virt.libvirt.driver [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Deleting instance files /var/lib/nova/instances/9e1649d5-a78b-44ea-bbdb-8da86ace9900_del#033[00m
Oct  2 08:35:20 np0005465988 nova_compute[236126]: 2025-10-02 12:35:20.177 2 INFO nova.virt.libvirt.driver [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Deletion of /var/lib/nova/instances/9e1649d5-a78b-44ea-bbdb-8da86ace9900_del complete#033[00m
Oct  2 08:35:20 np0005465988 nova_compute[236126]: 2025-10-02 12:35:20.422 2 INFO nova.compute.manager [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Took 3.14 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:35:20 np0005465988 nova_compute[236126]: 2025-10-02 12:35:20.424 2 DEBUG oslo.service.loopingcall [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:35:20 np0005465988 nova_compute[236126]: 2025-10-02 12:35:20.424 2 DEBUG nova.compute.manager [-] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:35:20 np0005465988 nova_compute[236126]: 2025-10-02 12:35:20.425 2 DEBUG nova.network.neutron [-] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:35:20 np0005465988 nova_compute[236126]: 2025-10-02 12:35:20.540 2 DEBUG nova.compute.manager [req-7be37b4e-7362-48c5-94b0-343cc7d1f5e2 req-1fb15343-fb03-4a02-8b43-0fd388c32be0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Received event network-vif-plugged-4158634f-0525-4c7b-b9b1-047f0a82eb12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:35:20 np0005465988 nova_compute[236126]: 2025-10-02 12:35:20.540 2 DEBUG oslo_concurrency.lockutils [req-7be37b4e-7362-48c5-94b0-343cc7d1f5e2 req-1fb15343-fb03-4a02-8b43-0fd388c32be0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:35:20 np0005465988 nova_compute[236126]: 2025-10-02 12:35:20.541 2 DEBUG oslo_concurrency.lockutils [req-7be37b4e-7362-48c5-94b0-343cc7d1f5e2 req-1fb15343-fb03-4a02-8b43-0fd388c32be0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:35:20 np0005465988 nova_compute[236126]: 2025-10-02 12:35:20.541 2 DEBUG oslo_concurrency.lockutils [req-7be37b4e-7362-48c5-94b0-343cc7d1f5e2 req-1fb15343-fb03-4a02-8b43-0fd388c32be0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:35:20 np0005465988 nova_compute[236126]: 2025-10-02 12:35:20.541 2 DEBUG nova.compute.manager [req-7be37b4e-7362-48c5-94b0-343cc7d1f5e2 req-1fb15343-fb03-4a02-8b43-0fd388c32be0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] No waiting events found dispatching network-vif-plugged-4158634f-0525-4c7b-b9b1-047f0a82eb12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:35:20 np0005465988 nova_compute[236126]: 2025-10-02 12:35:20.541 2 WARNING nova.compute.manager [req-7be37b4e-7362-48c5-94b0-343cc7d1f5e2 req-1fb15343-fb03-4a02-8b43-0fd388c32be0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Received unexpected event network-vif-plugged-4158634f-0525-4c7b-b9b1-047f0a82eb12 for instance with vm_state active and task_state deleting.
Oct  2 08:35:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:20.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:20 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:35:20 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:35:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:22.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:22.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:23 np0005465988 nova_compute[236126]: 2025-10-02 12:35:23.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:35:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e329 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:35:23 np0005465988 nova_compute[236126]: 2025-10-02 12:35:23.758 2 DEBUG nova.network.neutron [-] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:35:23 np0005465988 nova_compute[236126]: 2025-10-02 12:35:23.906 2 INFO nova.compute.manager [-] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Took 3.48 seconds to deallocate network for instance.
Oct  2 08:35:23 np0005465988 nova_compute[236126]: 2025-10-02 12:35:23.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.001 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.001 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.002 2 INFO nova.compute.manager [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Unshelving
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.106 2 DEBUG oslo_concurrency.lockutils [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.106 2 DEBUG oslo_concurrency.lockutils [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:35:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:24.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.190 2 DEBUG oslo_concurrency.processutils [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.549 2 INFO nova.virt.block_device [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Booting with volume d674cd8c-5d45-4ef7-9cb1-cd5cf5c7d43c at /dev/vdc
Oct  2 08:35:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:24.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.702 2 DEBUG os_brick.utils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.704 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:35:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:35:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2371010568' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.723 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.723 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[9a61220b-0e54-46e9-aec4-74b88ad8e4e6]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.725 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.736 2 DEBUG oslo_concurrency.processutils [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.546s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.737 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.738 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f331e7-5358-4f04-b0c1-eb7c1e16ed4d]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.740 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.745 2 DEBUG nova.compute.provider_tree [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.748 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.749 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[90470c1e-6e64-407c-9c88-7587e85cc0ad]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.751 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[fdee5486-8c9c-41f3-891e-ab9f7ba799ac]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.752 2 DEBUG oslo_concurrency.processutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.803 2 DEBUG oslo_concurrency.processutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "nvme version" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.807 2 DEBUG os_brick.initiator.connectors.lightos [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.808 2 DEBUG os_brick.initiator.connectors.lightos [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.808 2 DEBUG os_brick.initiator.connectors.lightos [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.809 2 DEBUG os_brick.utils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] <== get_connector_properties: return (105ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Oct  2 08:35:24 np0005465988 nova_compute[236126]: 2025-10-02 12:35:24.810 2 DEBUG nova.virt.block_device [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updating existing volume attachment record: 6e0d7c38-c1b5-4721-97f8-ac34aebd9ec3 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Oct  2 08:35:25 np0005465988 nova_compute[236126]: 2025-10-02 12:35:25.015 2 DEBUG nova.scheduler.client.report [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:35:25 np0005465988 nova_compute[236126]: 2025-10-02 12:35:25.262 2 DEBUG oslo_concurrency.lockutils [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:35:25 np0005465988 nova_compute[236126]: 2025-10-02 12:35:25.597 2 INFO nova.scheduler.client.report [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Deleted allocations for instance 9e1649d5-a78b-44ea-bbdb-8da86ace9900
Oct  2 08:35:26 np0005465988 nova_compute[236126]: 2025-10-02 12:35:26.063 2 DEBUG nova.compute.manager [req-e218bfcb-032b-41fe-98f3-4d77ac69c712 req-898c5923-1222-4b1e-98c6-0778b2a03ad2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Received event network-vif-deleted-4158634f-0525-4c7b-b9b1-047f0a82eb12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:35:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:26.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:26 np0005465988 nova_compute[236126]: 2025-10-02 12:35:26.483 2 DEBUG oslo_concurrency.lockutils [None req-e629ae7d-eb25-445c-9fd1-d370bdd37ccf fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "9e1649d5-a78b-44ea-bbdb-8da86ace9900" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:35:26 np0005465988 podman[295738]: 2025-10-02 12:35:26.542812794 +0000 UTC m=+0.070794487 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:35:26 np0005465988 podman[295739]: 2025-10-02 12:35:26.555407667 +0000 UTC m=+0.079960341 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:35:26 np0005465988 podman[295737]: 2025-10-02 12:35:26.581416325 +0000 UTC m=+0.110492449 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:35:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:26.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:26 np0005465988 nova_compute[236126]: 2025-10-02 12:35:26.752 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:35:26 np0005465988 nova_compute[236126]: 2025-10-02 12:35:26.753 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:35:26 np0005465988 nova_compute[236126]: 2025-10-02 12:35:26.759 2 DEBUG nova.objects.instance [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'pci_requests' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:35:27 np0005465988 nova_compute[236126]: 2025-10-02 12:35:26.999 2 DEBUG nova.objects.instance [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'numa_topology' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:35:27 np0005465988 nova_compute[236126]: 2025-10-02 12:35:27.130 2 DEBUG nova.virt.hardware [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:35:27 np0005465988 nova_compute[236126]: 2025-10-02 12:35:27.131 2 INFO nova.compute.claims [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:35:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:27.365 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:35:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:27.365 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:35:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:27.365 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:35:27 np0005465988 nova_compute[236126]: 2025-10-02 12:35:27.704 2 DEBUG oslo_concurrency.processutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:35:28 np0005465988 nova_compute[236126]: 2025-10-02 12:35:28.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:35:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:35:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:28.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:35:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:35:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4226396856' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:35:28 np0005465988 nova_compute[236126]: 2025-10-02 12:35:28.248 2 DEBUG oslo_concurrency.processutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:35:28 np0005465988 nova_compute[236126]: 2025-10-02 12:35:28.257 2 DEBUG nova.compute.provider_tree [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:35:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e329 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:35:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:28.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:28 np0005465988 nova_compute[236126]: 2025-10-02 12:35:28.845 2 DEBUG nova.scheduler.client.report [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:35:28 np0005465988 nova_compute[236126]: 2025-10-02 12:35:28.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:35:29 np0005465988 nova_compute[236126]: 2025-10-02 12:35:29.540 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:35:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:30.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:30 np0005465988 nova_compute[236126]: 2025-10-02 12:35:30.248 2 INFO nova.network.neutron [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updating port c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Oct  2 08:35:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:35:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:30.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:35:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:31.583 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:35:31 np0005465988 nova_compute[236126]: 2025-10-02 12:35:31.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:31.586 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:35:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:32.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.274 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "974fea45-f024-430a-bdbb-a615e05d954c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.275 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.410 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.410 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquired lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.411 2 DEBUG nova.network.neutron [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.425 2 DEBUG nova.compute.manager [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.655 2 DEBUG nova.compute.manager [req-36e85e2a-425d-44b0-ba53-3718a54179f6 req-64409161-d560-46cd-bf1a-a2bdcd7f4fe5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received event network-changed-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.656 2 DEBUG nova.compute.manager [req-36e85e2a-425d-44b0-ba53-3718a54179f6 req-64409161-d560-46cd-bf1a-a2bdcd7f4fe5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Refreshing instance network info cache due to event network-changed-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.656 2 DEBUG oslo_concurrency.lockutils [req-36e85e2a-425d-44b0-ba53-3718a54179f6 req-64409161-d560-46cd-bf1a-a2bdcd7f4fe5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:32.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.723 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408517.722882, 9e1649d5-a78b-44ea-bbdb-8da86ace9900 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.724 2 INFO nova.compute.manager [-] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.738 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.739 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.746 2 DEBUG nova.virt.hardware [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:35:32 np0005465988 nova_compute[236126]: 2025-10-02 12:35:32.746 2 INFO nova.compute.claims [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:35:33 np0005465988 nova_compute[236126]: 2025-10-02 12:35:33.020 2 DEBUG nova.compute.manager [None req-18e1ae1a-1977-4f4d-8f5c-7543af1e0a08 - - - - - -] [instance: 9e1649d5-a78b-44ea-bbdb-8da86ace9900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:33 np0005465988 nova_compute[236126]: 2025-10-02 12:35:33.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e329 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:35:33 np0005465988 nova_compute[236126]: 2025-10-02 12:35:33.614 2 DEBUG oslo_concurrency.processutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:33 np0005465988 nova_compute[236126]: 2025-10-02 12:35:33.856 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:33 np0005465988 nova_compute[236126]: 2025-10-02 12:35:33.938 2 WARNING nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] While synchronizing instance power states, found 2 instances in the database and 0 instances on the hypervisor.#033[00m
Oct  2 08:35:33 np0005465988 nova_compute[236126]: 2025-10-02 12:35:33.938 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Triggering sync for uuid 974fea45-f024-430a-bdbb-a615e05d954c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  2 08:35:33 np0005465988 nova_compute[236126]: 2025-10-02 12:35:33.939 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Triggering sync for uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  2 08:35:33 np0005465988 nova_compute[236126]: 2025-10-02 12:35:33.939 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "974fea45-f024-430a-bdbb-a615e05d954c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:33 np0005465988 nova_compute[236126]: 2025-10-02 12:35:33.940 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:33 np0005465988 nova_compute[236126]: 2025-10-02 12:35:33.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:34.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:35:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2685851625' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:35:34 np0005465988 nova_compute[236126]: 2025-10-02 12:35:34.214 2 DEBUG oslo_concurrency.processutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.600s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:34 np0005465988 nova_compute[236126]: 2025-10-02 12:35:34.223 2 DEBUG nova.compute.provider_tree [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:35:34 np0005465988 nova_compute[236126]: 2025-10-02 12:35:34.283 2 DEBUG nova.network.neutron [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updating instance_info_cache with network_info: [{"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:34 np0005465988 nova_compute[236126]: 2025-10-02 12:35:34.622 2 DEBUG nova.scheduler.client.report [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:35:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:34.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:34 np0005465988 nova_compute[236126]: 2025-10-02 12:35:34.859 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Releasing lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:34 np0005465988 nova_compute[236126]: 2025-10-02 12:35:34.862 2 DEBUG nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:35:34 np0005465988 nova_compute[236126]: 2025-10-02 12:35:34.863 2 INFO nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Creating image(s)#033[00m
Oct  2 08:35:34 np0005465988 nova_compute[236126]: 2025-10-02 12:35:34.904 2 DEBUG nova.storage.rbd_utils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:35:34 np0005465988 nova_compute[236126]: 2025-10-02 12:35:34.910 2 DEBUG nova.objects.instance [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:34 np0005465988 nova_compute[236126]: 2025-10-02 12:35:34.913 2 DEBUG oslo_concurrency.lockutils [req-36e85e2a-425d-44b0-ba53-3718a54179f6 req-64409161-d560-46cd-bf1a-a2bdcd7f4fe5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:34 np0005465988 nova_compute[236126]: 2025-10-02 12:35:34.913 2 DEBUG nova.network.neutron [req-36e85e2a-425d-44b0-ba53-3718a54179f6 req-64409161-d560-46cd-bf1a-a2bdcd7f4fe5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Refreshing network info cache for port c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:35:35 np0005465988 nova_compute[236126]: 2025-10-02 12:35:35.070 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:35 np0005465988 nova_compute[236126]: 2025-10-02 12:35:35.071 2 DEBUG nova.compute.manager [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:35:35 np0005465988 nova_compute[236126]: 2025-10-02 12:35:35.188 2 DEBUG nova.storage.rbd_utils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:35:35 np0005465988 nova_compute[236126]: 2025-10-02 12:35:35.223 2 DEBUG nova.storage.rbd_utils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:35:35 np0005465988 nova_compute[236126]: 2025-10-02 12:35:35.229 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "f4581e9241b6a870e35024d9f8550c9535e31e5f" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:35 np0005465988 nova_compute[236126]: 2025-10-02 12:35:35.230 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "f4581e9241b6a870e35024d9f8550c9535e31e5f" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:35 np0005465988 nova_compute[236126]: 2025-10-02 12:35:35.479 2 DEBUG nova.virt.libvirt.imagebackend [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Image locations are: [{'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/656ad3f4-b233-4eb8-822c-07efc986a6e0/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/656ad3f4-b233-4eb8-822c-07efc986a6e0/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Oct  2 08:35:35 np0005465988 nova_compute[236126]: 2025-10-02 12:35:35.543 2 DEBUG nova.compute.manager [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:35:35 np0005465988 nova_compute[236126]: 2025-10-02 12:35:35.544 2 DEBUG nova.network.neutron [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:35:35 np0005465988 nova_compute[236126]: 2025-10-02 12:35:35.551 2 DEBUG nova.virt.libvirt.imagebackend [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Selected location: {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/656ad3f4-b233-4eb8-822c-07efc986a6e0/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Oct  2 08:35:35 np0005465988 nova_compute[236126]: 2025-10-02 12:35:35.552 2 DEBUG nova.storage.rbd_utils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] cloning images/656ad3f4-b233-4eb8-822c-07efc986a6e0@snap to None/8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:35:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:35.588 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:35 np0005465988 nova_compute[236126]: 2025-10-02 12:35:35.839 2 INFO nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:35:36 np0005465988 nova_compute[236126]: 2025-10-02 12:35:36.085 2 DEBUG nova.compute.manager [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:35:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:36.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:36 np0005465988 nova_compute[236126]: 2025-10-02 12:35:36.297 2 DEBUG nova.policy [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fe9cc788734f406d826446a848700331', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bc0d63d3b4404ef8858166e8836dd0af', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:35:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:36.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:36 np0005465988 nova_compute[236126]: 2025-10-02 12:35:36.855 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "f4581e9241b6a870e35024d9f8550c9535e31e5f" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.625s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.052 2 DEBUG nova.compute.manager [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.054 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.054 2 INFO nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Creating image(s)#033[00m
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.092 2 DEBUG nova.storage.rbd_utils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image 974fea45-f024-430a-bdbb-a615e05d954c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.136 2 DEBUG nova.storage.rbd_utils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image 974fea45-f024-430a-bdbb-a615e05d954c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.177 2 DEBUG nova.storage.rbd_utils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image 974fea45-f024-430a-bdbb-a615e05d954c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.182 2 DEBUG oslo_concurrency.processutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.240 2 DEBUG nova.objects.instance [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'migration_context' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.280 2 DEBUG oslo_concurrency.processutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.281 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.282 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.283 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.319 2 DEBUG nova.storage.rbd_utils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image 974fea45-f024-430a-bdbb-a615e05d954c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.324 2 DEBUG oslo_concurrency.processutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 974fea45-f024-430a-bdbb-a615e05d954c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:35:37 np0005465988 nova_compute[236126]: 2025-10-02 12:35:37.439 2 DEBUG nova.storage.rbd_utils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] flattening vms/8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Oct  2 08:35:38 np0005465988 nova_compute[236126]: 2025-10-02 12:35:38.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:35:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:38.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e329 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:35:38 np0005465988 podman[296206]: 2025-10-02 12:35:38.534775945 +0000 UTC m=+0.069421327 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct  2 08:35:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:38.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:38 np0005465988 nova_compute[236126]: 2025-10-02 12:35:38.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:35:39 np0005465988 nova_compute[236126]: 2025-10-02 12:35:39.271 2 DEBUG nova.network.neutron [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Successfully created port: 91400667-e168-40d0-8f0a-ffc8c9dd7fa4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:35:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:40.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:40 np0005465988 nova_compute[236126]: 2025-10-02 12:35:40.298 2 DEBUG nova.network.neutron [req-36e85e2a-425d-44b0-ba53-3718a54179f6 req-64409161-d560-46cd-bf1a-a2bdcd7f4fe5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updated VIF entry in instance network info cache for port c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:35:40 np0005465988 nova_compute[236126]: 2025-10-02 12:35:40.299 2 DEBUG nova.network.neutron [req-36e85e2a-425d-44b0-ba53-3718a54179f6 req-64409161-d560-46cd-bf1a-a2bdcd7f4fe5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updating instance_info_cache with network_info: [{"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:35:40 np0005465988 nova_compute[236126]: 2025-10-02 12:35:40.545 2 DEBUG oslo_concurrency.lockutils [req-36e85e2a-425d-44b0-ba53-3718a54179f6 req-64409161-d560-46cd-bf1a-a2bdcd7f4fe5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-8736e2a4-70c8-46c1-8ce5-ff68395a22c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:35:40 np0005465988 nova_compute[236126]: 2025-10-02 12:35:40.615 2 DEBUG oslo_concurrency.processutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 974fea45-f024-430a-bdbb-a615e05d954c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.291s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:35:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:40.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:42.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:42.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:43 np0005465988 ceph-mds[84851]: mds.beacon.cephfs.compute-2.gpiyct missed beacon ack from the monitors
Oct  2 08:35:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e329 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.071 2 DEBUG nova.network.neutron [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Successfully updated port: 91400667-e168-40d0-8f0a-ffc8c9dd7fa4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.076 2 DEBUG nova.compute.manager [req-6f356ebd-5429-41ef-9fb3-21c1231d7798 req-cf40ca95-3d23-48d3-9fc3-00783abf226f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Received event network-changed-91400667-e168-40d0-8f0a-ffc8c9dd7fa4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.077 2 DEBUG nova.compute.manager [req-6f356ebd-5429-41ef-9fb3-21c1231d7798 req-cf40ca95-3d23-48d3-9fc3-00783abf226f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Refreshing instance network info cache due to event network-changed-91400667-e168-40d0-8f0a-ffc8c9dd7fa4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.077 2 DEBUG oslo_concurrency.lockutils [req-6f356ebd-5429-41ef-9fb3-21c1231d7798 req-cf40ca95-3d23-48d3-9fc3-00783abf226f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-974fea45-f024-430a-bdbb-a615e05d954c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.077 2 DEBUG oslo_concurrency.lockutils [req-6f356ebd-5429-41ef-9fb3-21c1231d7798 req-cf40ca95-3d23-48d3-9fc3-00783abf226f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-974fea45-f024-430a-bdbb-a615e05d954c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.077 2 DEBUG nova.network.neutron [req-6f356ebd-5429-41ef-9fb3-21c1231d7798 req-cf40ca95-3d23-48d3-9fc3-00783abf226f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Refreshing network info cache for port 91400667-e168-40d0-8f0a-ffc8c9dd7fa4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.118 2 DEBUG nova.storage.rbd_utils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] resizing rbd image 974fea45-f024-430a-bdbb-a615e05d954c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct  2 08:35:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:44.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.272 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "refresh_cache-974fea45-f024-430a-bdbb-a615e05d954c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.282 2 DEBUG nova.objects.instance [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lazy-loading 'migration_context' on Instance uuid 974fea45-f024-430a-bdbb-a615e05d954c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.402 2 DEBUG nova.network.neutron [req-6f356ebd-5429-41ef-9fb3-21c1231d7798 req-cf40ca95-3d23-48d3-9fc3-00783abf226f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.468 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.469 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Ensure instance console log exists: /var/lib/nova/instances/974fea45-f024-430a-bdbb-a615e05d954c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.470 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.470 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.471 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:35:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:44.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.724 2 DEBUG nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Image rbd:vms/8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.725 2 DEBUG nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.725 2 DEBUG nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Ensure instance console log exists: /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.726 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.727 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.727 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.733 2 DEBUG nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Start _get_guest_xml network_info=[{"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdc': {'bus': 'virtio', 'dev': 'vdc', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2025-10-02T12:34:43Z,direct_url=<?>,disk_format='raw',id=656ad3f4-b233-4eb8-822c-07efc986a6e0,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-639305243-shelved',owner='1a05e525420b4aa8adcc9561158e73d1',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-10-02T12:34:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': None, 'mount_device': '/dev/vdc', 'attachment_id': '6e0d7c38-c1b5-4721-97f8-ac34aebd9ec3', 'disk_bus': 'virtio', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-d674cd8c-5d45-4ef7-9cb1-cd5cf5c7d43c', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'd674cd8c-5d45-4ef7-9cb1-cd5cf5c7d43c', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'attached', 'instance': '8736e2a4-70c8-46c1-8ce5-ff68395a22c9', 'attached_at': '', 'detached_at': '', 'volume_id': 'd674cd8c-5d45-4ef7-9cb1-cd5cf5c7d43c', 'serial': 'd674cd8c-5d45-4ef7-9cb1-cd5cf5c7d43c'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.741 2 WARNING nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.747 2 DEBUG nova.virt.libvirt.host [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.748 2 DEBUG nova.virt.libvirt.host [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.751 2 DEBUG nova.virt.libvirt.host [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.751 2 DEBUG nova.virt.libvirt.host [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.753 2 DEBUG nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.753 2 DEBUG nova.virt.hardware [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2025-10-02T12:34:43Z,direct_url=<?>,disk_format='raw',id=656ad3f4-b233-4eb8-822c-07efc986a6e0,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-639305243-shelved',owner='1a05e525420b4aa8adcc9561158e73d1',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-10-02T12:34:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.753 2 DEBUG nova.virt.hardware [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.754 2 DEBUG nova.virt.hardware [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.754 2 DEBUG nova.virt.hardware [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.754 2 DEBUG nova.virt.hardware [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.754 2 DEBUG nova.virt.hardware [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.755 2 DEBUG nova.virt.hardware [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.755 2 DEBUG nova.virt.hardware [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.755 2 DEBUG nova.virt.hardware [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.755 2 DEBUG nova.virt.hardware [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.756 2 DEBUG nova.virt.hardware [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.756 2 DEBUG nova.objects.instance [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.759 2 DEBUG nova.network.neutron [req-6f356ebd-5429-41ef-9fb3-21c1231d7798 req-cf40ca95-3d23-48d3-9fc3-00783abf226f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.805 2 DEBUG oslo_concurrency.processutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.940 2 DEBUG oslo_concurrency.lockutils [req-6f356ebd-5429-41ef-9fb3-21c1231d7798 req-cf40ca95-3d23-48d3-9fc3-00783abf226f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-974fea45-f024-430a-bdbb-a615e05d954c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.942 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquired lock "refresh_cache-974fea45-f024-430a-bdbb-a615e05d954c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:44 np0005465988 nova_compute[236126]: 2025-10-02 12:35:44.942 2 DEBUG nova.network.neutron [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:35:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:35:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1047015636' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:35:45 np0005465988 nova_compute[236126]: 2025-10-02 12:35:45.376 2 DEBUG oslo_concurrency.processutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.571s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:45 np0005465988 nova_compute[236126]: 2025-10-02 12:35:45.411 2 DEBUG nova.storage.rbd_utils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:35:45 np0005465988 nova_compute[236126]: 2025-10-02 12:35:45.417 2 DEBUG oslo_concurrency.processutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:35:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2897118727' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:35:45 np0005465988 nova_compute[236126]: 2025-10-02 12:35:45.866 2 DEBUG oslo_concurrency.processutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.043 2 DEBUG nova.virt.libvirt.vif [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:34:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-639305243',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-639305243',id=130,image_ref='656ad3f4-b233-4eb8-822c-07efc986a6e0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1186187448',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:34:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='1a05e525420b4aa8adcc9561158e73d1',ramdisk_id='',reservation_id='r-tje0hz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-405673070',owner_user_name='tempest-AttachVolumeShelveTestJSON-405673070-project-member',shelved_at='2025-10-02T12:34:53.822058',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='656ad3f4-b233-4eb8-822c-07efc986a6e0'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:35:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bcd36ab668f449959719ba7058f25e72',uuid=8736e2a4-70c8-46c1-8ce5-ff68395a22c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.044 2 DEBUG nova.network.os_vif_util [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converting VIF {"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.044 2 DEBUG nova.network.os_vif_util [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:a3:24,bridge_name='br-int',has_traffic_filtering=True,id=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9dd6bc4-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.046 2 DEBUG nova.objects.instance [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.100 2 DEBUG nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  <uuid>8736e2a4-70c8-46c1-8ce5-ff68395a22c9</uuid>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  <name>instance-00000082</name>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-639305243</nova:name>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:35:44</nova:creationTime>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <nova:user uuid="bcd36ab668f449959719ba7058f25e72">tempest-AttachVolumeShelveTestJSON-405673070-project-member</nova:user>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <nova:project uuid="1a05e525420b4aa8adcc9561158e73d1">tempest-AttachVolumeShelveTestJSON-405673070</nova:project>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="656ad3f4-b233-4eb8-822c-07efc986a6e0"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <nova:port uuid="c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <entry name="serial">8736e2a4-70c8-46c1-8ce5-ff68395a22c9</entry>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <entry name="uuid">8736e2a4-70c8-46c1-8ce5-ff68395a22c9</entry>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk.config">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-d674cd8c-5d45-4ef7-9cb1-cd5cf5c7d43c">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <target dev="vdc" bus="virtio"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <serial>d674cd8c-5d45-4ef7-9cb1-cd5cf5c7d43c</serial>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:00:a3:24"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <target dev="tapc9dd6bc4-09"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/console.log" append="off"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <input type="keyboard" bus="usb"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:35:46 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:35:46 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:35:46 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:35:46 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.101 2 DEBUG nova.compute.manager [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Preparing to wait for external event network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.101 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.102 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.102 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.103 2 DEBUG nova.virt.libvirt.vif [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:34:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-639305243',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-639305243',id=130,image_ref='656ad3f4-b233-4eb8-822c-07efc986a6e0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1186187448',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:34:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='1a05e525420b4aa8adcc9561158e73d1',ramdisk_id='',reservation_id='r-tje0hz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_
hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-405673070',owner_user_name='tempest-AttachVolumeShelveTestJSON-405673070-project-member',shelved_at='2025-10-02T12:34:53.822058',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='656ad3f4-b233-4eb8-822c-07efc986a6e0'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:35:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bcd36ab668f449959719ba7058f25e72',uuid=8736e2a4-70c8-46c1-8ce5-ff68395a22c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.103 2 DEBUG nova.network.os_vif_util [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converting VIF {"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.103 2 DEBUG nova.network.os_vif_util [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:a3:24,bridge_name='br-int',has_traffic_filtering=True,id=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9dd6bc4-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.104 2 DEBUG os_vif [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:a3:24,bridge_name='br-int',has_traffic_filtering=True,id=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9dd6bc4-09') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.105 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.105 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.107 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc9dd6bc4-09, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.108 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc9dd6bc4-09, col_values=(('external_ids', {'iface-id': 'c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:00:a3:24', 'vm-uuid': '8736e2a4-70c8-46c1-8ce5-ff68395a22c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:46 np0005465988 NetworkManager[45041]: <info>  [1759408546.1102] manager: (tapc9dd6bc4-09): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/280)
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.117 2 INFO os_vif [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:a3:24,bridge_name='br-int',has_traffic_filtering=True,id=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9dd6bc4-09')#033[00m
Oct  2 08:35:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:46.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.457 2 DEBUG nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.458 2 DEBUG nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.458 2 DEBUG nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.458 2 DEBUG nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No VIF found with MAC fa:16:3e:00:a3:24, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.459 2 INFO nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Using config drive#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.494 2 DEBUG nova.storage.rbd_utils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.680 2 DEBUG nova.objects.instance [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:46.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:46 np0005465988 nova_compute[236126]: 2025-10-02 12:35:46.962 2 DEBUG nova.objects.instance [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'keypairs' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:47 np0005465988 nova_compute[236126]: 2025-10-02 12:35:47.363 2 DEBUG nova.network.neutron [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:35:47 np0005465988 nova_compute[236126]: 2025-10-02 12:35:47.555 2 INFO nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Creating config drive at /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/disk.config#033[00m
Oct  2 08:35:47 np0005465988 nova_compute[236126]: 2025-10-02 12:35:47.563 2 DEBUG oslo_concurrency.processutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzq946dyd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:47 np0005465988 nova_compute[236126]: 2025-10-02 12:35:47.709 2 DEBUG oslo_concurrency.processutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzq946dyd" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:47 np0005465988 nova_compute[236126]: 2025-10-02 12:35:47.810 2 DEBUG nova.storage.rbd_utils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:35:47 np0005465988 nova_compute[236126]: 2025-10-02 12:35:47.817 2 DEBUG oslo_concurrency.processutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/disk.config 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:48.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e329 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:35:48 np0005465988 nova_compute[236126]: 2025-10-02 12:35:48.613 2 DEBUG oslo_concurrency.processutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/disk.config 8736e2a4-70c8-46c1-8ce5-ff68395a22c9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.796s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:48 np0005465988 nova_compute[236126]: 2025-10-02 12:35:48.614 2 INFO nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Deleting local config drive /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9/disk.config because it was imported into RBD.#033[00m
Oct  2 08:35:48 np0005465988 kernel: tapc9dd6bc4-09: entered promiscuous mode
Oct  2 08:35:48 np0005465988 NetworkManager[45041]: <info>  [1759408548.6712] manager: (tapc9dd6bc4-09): new Tun device (/org/freedesktop/NetworkManager/Devices/281)
Oct  2 08:35:48 np0005465988 nova_compute[236126]: 2025-10-02 12:35:48.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:35:48Z|00629|binding|INFO|Claiming lport c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b for this chassis.
Oct  2 08:35:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:35:48Z|00630|binding|INFO|c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b: Claiming fa:16:3e:00:a3:24 10.100.0.11
Oct  2 08:35:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:35:48Z|00631|binding|INFO|Setting lport c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b ovn-installed in OVS
Oct  2 08:35:48 np0005465988 nova_compute[236126]: 2025-10-02 12:35:48.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:35:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:48.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:35:48 np0005465988 nova_compute[236126]: 2025-10-02 12:35:48.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:48 np0005465988 systemd-udevd[296489]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:35:48 np0005465988 systemd-machined[192594]: New machine qemu-62-instance-00000082.
Oct  2 08:35:48 np0005465988 NetworkManager[45041]: <info>  [1759408548.7133] device (tapc9dd6bc4-09): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:35:48 np0005465988 NetworkManager[45041]: <info>  [1759408548.7143] device (tapc9dd6bc4-09): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:35:48 np0005465988 systemd[1]: Started Virtual Machine qemu-62-instance-00000082.
Oct  2 08:35:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:35:48Z|00632|binding|INFO|Setting lport c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b up in Southbound
Oct  2 08:35:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:48.812 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:a3:24 10.100.0.11'], port_security=['fa:16:3e:00:a3:24 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '8736e2a4-70c8-46c1-8ce5-ff68395a22c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a05e525420b4aa8adcc9561158e73d1', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'cd7967e6-b4ee-4d94-ab54-c08775c150e7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.179'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=709db70f-1209-49b9-bf90-2b91d986925d, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:35:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:48.814 142124 INFO neutron.agent.ovn.metadata.agent [-] Port c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b in datapath 7b216831-24ac-41f0-ac1c-99aae9bc897b bound to our chassis#033[00m
Oct  2 08:35:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:48.817 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b216831-24ac-41f0-ac1c-99aae9bc897b#033[00m
Oct  2 08:35:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:48.832 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2b5822eb-5fca-4faa-bed6-a2300f9b3b26]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:48.833 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b216831-21 in ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:35:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:48.835 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b216831-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:35:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:48.835 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf4d67a-84e8-426e-84cd-02b8442d166b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:48.836 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dca74cfd-5044-4e83-99d3-2610eab47a61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:48.853 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8230dc-4d7d-44ba-92ae-ffb79cccf698]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:48.878 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[69334b6d-e04f-4636-b0db-ee2cb8a2a67c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:48.913 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2e61f84e-1d36-4112-a8b0-90d4c6fc0360]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:48 np0005465988 systemd-udevd[296491]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:35:48 np0005465988 NetworkManager[45041]: <info>  [1759408548.9225] manager: (tap7b216831-20): new Veth device (/org/freedesktop/NetworkManager/Devices/282)
Oct  2 08:35:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:48.930 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[32628ad8-c894-4e1c-928b-f9bad57ebed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.021 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[0c8aca06-1bd7-4331-81e9-70a09631eb20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.024 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d20d81d4-4cd7-4148-87ca-22aeec20dca9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:49 np0005465988 NetworkManager[45041]: <info>  [1759408549.0516] device (tap7b216831-20): carrier: link connected
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.058 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[752d843a-e2fc-4f07-9be0-93e995f95c3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.080 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7c4eda2e-ac51-4b5d-b792-aa6493c70527]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b216831-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:a4:15'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660938, 'reachable_time': 30975, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 296523, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.100 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4f34878d-a6d0-48e4-b8d2-73c9a67a390d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:a415'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 660938, 'tstamp': 660938}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 296524, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.121 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6ac1d027-3969-455c-8610-3d73dbac4713]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b216831-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:a4:15'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660938, 'reachable_time': 30975, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 296525, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.156 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6d5942ae-5a33-4d4b-ac08-5fa72fc9cbfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.221 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed27d3b-5c21-499f-a695-6268a52d0f7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.222 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b216831-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.223 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.224 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b216831-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:49 np0005465988 NetworkManager[45041]: <info>  [1759408549.2279] manager: (tap7b216831-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/283)
Oct  2 08:35:49 np0005465988 kernel: tap7b216831-20: entered promiscuous mode
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.233 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b216831-20, col_values=(('external_ids', {'iface-id': '7b6901ce-64cc-402d-847e-45c0d79bbb3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:49 np0005465988 ovn_controller[132601]: 2025-10-02T12:35:49Z|00633|binding|INFO|Releasing lport 7b6901ce-64cc-402d-847e-45c0d79bbb3b from this chassis (sb_readonly=0)
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.252 2 DEBUG nova.network.neutron [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Updating instance_info_cache with network_info: [{"id": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "address": "fa:16:3e:95:df:98", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91400667-e1", "ovs_interfaceid": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.262 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b216831-24ac-41f0-ac1c-99aae9bc897b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b216831-24ac-41f0-ac1c-99aae9bc897b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.263 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2a345421-0e64-49c7-a886-0a504b1bedfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.264 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-7b216831-24ac-41f0-ac1c-99aae9bc897b
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/7b216831-24ac-41f0-ac1c-99aae9bc897b.pid.haproxy
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 7b216831-24ac-41f0-ac1c-99aae9bc897b
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:35:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:35:49.264 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'env', 'PROCESS_TAG=haproxy-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b216831-24ac-41f0-ac1c-99aae9bc897b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.440 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Releasing lock "refresh_cache-974fea45-f024-430a-bdbb-a615e05d954c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.441 2 DEBUG nova.compute.manager [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Instance network_info: |[{"id": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "address": "fa:16:3e:95:df:98", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91400667-e1", "ovs_interfaceid": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.444 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Start _get_guest_xml network_info=[{"id": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "address": "fa:16:3e:95:df:98", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91400667-e1", "ovs_interfaceid": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.451 2 WARNING nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.455 2 DEBUG nova.virt.libvirt.host [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.456 2 DEBUG nova.virt.libvirt.host [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.458 2 DEBUG nova.virt.libvirt.host [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.459 2 DEBUG nova.virt.libvirt.host [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.460 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.460 2 DEBUG nova.virt.hardware [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.461 2 DEBUG nova.virt.hardware [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.461 2 DEBUG nova.virt.hardware [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.461 2 DEBUG nova.virt.hardware [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.461 2 DEBUG nova.virt.hardware [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.461 2 DEBUG nova.virt.hardware [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.462 2 DEBUG nova.virt.hardware [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.462 2 DEBUG nova.virt.hardware [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.462 2 DEBUG nova.virt.hardware [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.462 2 DEBUG nova.virt.hardware [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.462 2 DEBUG nova.virt.hardware [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.465 2 DEBUG oslo_concurrency.processutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.530 2 DEBUG nova.compute.manager [req-74755814-a2ee-48e1-8068-7ba4f8f2ccdc req-7b566666-46fc-4e14-9653-7271b306d82a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received event network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.530 2 DEBUG oslo_concurrency.lockutils [req-74755814-a2ee-48e1-8068-7ba4f8f2ccdc req-7b566666-46fc-4e14-9653-7271b306d82a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.531 2 DEBUG oslo_concurrency.lockutils [req-74755814-a2ee-48e1-8068-7ba4f8f2ccdc req-7b566666-46fc-4e14-9653-7271b306d82a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.531 2 DEBUG oslo_concurrency.lockutils [req-74755814-a2ee-48e1-8068-7ba4f8f2ccdc req-7b566666-46fc-4e14-9653-7271b306d82a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.531 2 DEBUG nova.compute.manager [req-74755814-a2ee-48e1-8068-7ba4f8f2ccdc req-7b566666-46fc-4e14-9653-7271b306d82a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Processing event network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:35:49 np0005465988 podman[296595]: 2025-10-02 12:35:49.659257058 +0000 UTC m=+0.067495353 container create 99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:35:49 np0005465988 systemd[1]: Started libpod-conmon-99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2.scope.
Oct  2 08:35:49 np0005465988 podman[296595]: 2025-10-02 12:35:49.6110126 +0000 UTC m=+0.019250925 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:35:49 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:35:49 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0847bb8c16e3554679516ff56dcaa4602d206105fae67a01472a4830dd8bcb22/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:35:49 np0005465988 podman[296595]: 2025-10-02 12:35:49.750605175 +0000 UTC m=+0.158843520 container init 99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:35:49 np0005465988 podman[296595]: 2025-10-02 12:35:49.755616459 +0000 UTC m=+0.163854774 container start 99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  2 08:35:49 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[296627]: [NOTICE]   (296631) : New worker (296648) forked
Oct  2 08:35:49 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[296627]: [NOTICE]   (296631) : Loading success.
Oct  2 08:35:49 np0005465988 ovn_controller[132601]: 2025-10-02T12:35:49Z|00634|binding|INFO|Releasing lport 7b6901ce-64cc-402d-847e-45c0d79bbb3b from this chassis (sb_readonly=0)
Oct  2 08:35:49 np0005465988 nova_compute[236126]: 2025-10-02 12:35:49.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:35:49 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2869866938' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:35:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:50.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:50 np0005465988 nova_compute[236126]: 2025-10-02 12:35:50.344 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408550.3441687, 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:50 np0005465988 nova_compute[236126]: 2025-10-02 12:35:50.345 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] VM Started (Lifecycle Event)#033[00m
Oct  2 08:35:50 np0005465988 nova_compute[236126]: 2025-10-02 12:35:50.348 2 DEBUG nova.compute.manager [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:35:50 np0005465988 nova_compute[236126]: 2025-10-02 12:35:50.352 2 DEBUG nova.virt.libvirt.driver [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:35:50 np0005465988 nova_compute[236126]: 2025-10-02 12:35:50.355 2 INFO nova.virt.libvirt.driver [-] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Instance spawned successfully.#033[00m
Oct  2 08:35:50 np0005465988 nova_compute[236126]: 2025-10-02 12:35:50.494 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:50 np0005465988 nova_compute[236126]: 2025-10-02 12:35:50.499 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:35:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:50.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:50 np0005465988 nova_compute[236126]: 2025-10-02 12:35:50.790 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:35:50 np0005465988 nova_compute[236126]: 2025-10-02 12:35:50.792 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408550.3443592, 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:50 np0005465988 nova_compute[236126]: 2025-10-02 12:35:50.793 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:35:51 np0005465988 nova_compute[236126]: 2025-10-02 12:35:51.049 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:51 np0005465988 nova_compute[236126]: 2025-10-02 12:35:51.053 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408550.351488, 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:51 np0005465988 nova_compute[236126]: 2025-10-02 12:35:51.053 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:35:51 np0005465988 nova_compute[236126]: 2025-10-02 12:35:51.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:51 np0005465988 nova_compute[236126]: 2025-10-02 12:35:51.732 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:51 np0005465988 nova_compute[236126]: 2025-10-02 12:35:51.737 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:35:51 np0005465988 nova_compute[236126]: 2025-10-02 12:35:51.988 2 DEBUG oslo_concurrency.processutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.523s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:52 np0005465988 nova_compute[236126]: 2025-10-02 12:35:52.015 2 DEBUG nova.storage.rbd_utils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image 974fea45-f024-430a-bdbb-a615e05d954c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:35:52 np0005465988 nova_compute[236126]: 2025-10-02 12:35:52.019 2 DEBUG oslo_concurrency.processutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:52 np0005465988 nova_compute[236126]: 2025-10-02 12:35:52.110 2 DEBUG nova.compute.manager [req-5892beba-e5b0-4168-844b-f86bdf9a61ed req-adf27d48-6809-4e87-af19-576684440be1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received event network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:52 np0005465988 nova_compute[236126]: 2025-10-02 12:35:52.110 2 DEBUG oslo_concurrency.lockutils [req-5892beba-e5b0-4168-844b-f86bdf9a61ed req-adf27d48-6809-4e87-af19-576684440be1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:52 np0005465988 nova_compute[236126]: 2025-10-02 12:35:52.110 2 DEBUG oslo_concurrency.lockutils [req-5892beba-e5b0-4168-844b-f86bdf9a61ed req-adf27d48-6809-4e87-af19-576684440be1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:52 np0005465988 nova_compute[236126]: 2025-10-02 12:35:52.111 2 DEBUG oslo_concurrency.lockutils [req-5892beba-e5b0-4168-844b-f86bdf9a61ed req-adf27d48-6809-4e87-af19-576684440be1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:52 np0005465988 nova_compute[236126]: 2025-10-02 12:35:52.111 2 DEBUG nova.compute.manager [req-5892beba-e5b0-4168-844b-f86bdf9a61ed req-adf27d48-6809-4e87-af19-576684440be1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] No waiting events found dispatching network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:35:52 np0005465988 nova_compute[236126]: 2025-10-02 12:35:52.111 2 WARNING nova.compute.manager [req-5892beba-e5b0-4168-844b-f86bdf9a61ed req-adf27d48-6809-4e87-af19-576684440be1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received unexpected event network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b for instance with vm_state shelved_offloaded and task_state spawning.#033[00m
Oct  2 08:35:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:52.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:52 np0005465988 nova_compute[236126]: 2025-10-02 12:35:52.231 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:35:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:35:52 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3010537431' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:35:52 np0005465988 nova_compute[236126]: 2025-10-02 12:35:52.531 2 DEBUG oslo_concurrency.processutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:52 np0005465988 nova_compute[236126]: 2025-10-02 12:35:52.533 2 DEBUG nova.virt.libvirt.vif [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:35:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1334211739',display_name='tempest-ServersTestJSON-server-1334211739',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1334211739',id=139,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bc0d63d3b4404ef8858166e8836dd0af',ramdisk_id='',reservation_id='r-2fd6gv8p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-80077074',owner_user_name='tempest-ServersTestJSON-80077074-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:35:36Z,user_data=None,user_id='fe9cc788734f406d826446a848700331',uuid=974fea45-f024-430a-bdbb-a615e05d954c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "address": "fa:16:3e:95:df:98", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91400667-e1", "ovs_interfaceid": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:35:52 np0005465988 nova_compute[236126]: 2025-10-02 12:35:52.533 2 DEBUG nova.network.os_vif_util [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converting VIF {"id": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "address": "fa:16:3e:95:df:98", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91400667-e1", "ovs_interfaceid": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:35:52 np0005465988 nova_compute[236126]: 2025-10-02 12:35:52.534 2 DEBUG nova.network.os_vif_util [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:df:98,bridge_name='br-int',has_traffic_filtering=True,id=91400667-e168-40d0-8f0a-ffc8c9dd7fa4,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91400667-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:35:52 np0005465988 nova_compute[236126]: 2025-10-02 12:35:52.538 2 DEBUG nova.objects.instance [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lazy-loading 'pci_devices' on Instance uuid 974fea45-f024-430a-bdbb-a615e05d954c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:35:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:52.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.103 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  <uuid>974fea45-f024-430a-bdbb-a615e05d954c</uuid>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  <name>instance-0000008b</name>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServersTestJSON-server-1334211739</nova:name>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:35:49</nova:creationTime>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <nova:user uuid="fe9cc788734f406d826446a848700331">tempest-ServersTestJSON-80077074-project-member</nova:user>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <nova:project uuid="bc0d63d3b4404ef8858166e8836dd0af">tempest-ServersTestJSON-80077074</nova:project>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <nova:port uuid="91400667-e168-40d0-8f0a-ffc8c9dd7fa4">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <entry name="serial">974fea45-f024-430a-bdbb-a615e05d954c</entry>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <entry name="uuid">974fea45-f024-430a-bdbb-a615e05d954c</entry>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/974fea45-f024-430a-bdbb-a615e05d954c_disk">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/974fea45-f024-430a-bdbb-a615e05d954c_disk.config">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:95:df:98"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <target dev="tap91400667-e1"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/974fea45-f024-430a-bdbb-a615e05d954c/console.log" append="off"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:35:53 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:35:53 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:35:53 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:35:53 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.117 2 DEBUG nova.compute.manager [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Preparing to wait for external event network-vif-plugged-91400667-e168-40d0-8f0a-ffc8c9dd7fa4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.118 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "974fea45-f024-430a-bdbb-a615e05d954c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.118 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.118 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.119 2 DEBUG nova.virt.libvirt.vif [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:35:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1334211739',display_name='tempest-ServersTestJSON-server-1334211739',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1334211739',id=139,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bc0d63d3b4404ef8858166e8836dd0af',ramdisk_id='',reservation_id='r-2fd6gv8p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-80077074',owner_user_name='tempest-ServersTestJSON-80077074-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:35:36Z,user_data=None,user_id='fe9cc788734f406d826446a848700331',uuid=974fea45-f024-430a-bdbb-a615e05d954c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "address": "fa:16:3e:95:df:98", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91400667-e1", "ovs_interfaceid": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.119 2 DEBUG nova.network.os_vif_util [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converting VIF {"id": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "address": "fa:16:3e:95:df:98", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91400667-e1", "ovs_interfaceid": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.120 2 DEBUG nova.network.os_vif_util [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:df:98,bridge_name='br-int',has_traffic_filtering=True,id=91400667-e168-40d0-8f0a-ffc8c9dd7fa4,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91400667-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.121 2 DEBUG os_vif [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:df:98,bridge_name='br-int',has_traffic_filtering=True,id=91400667-e168-40d0-8f0a-ffc8c9dd7fa4,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91400667-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.122 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.122 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.133 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap91400667-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.134 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap91400667-e1, col_values=(('external_ids', {'iface-id': '91400667-e168-40d0-8f0a-ffc8c9dd7fa4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:df:98', 'vm-uuid': '974fea45-f024-430a-bdbb-a615e05d954c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:53 np0005465988 NetworkManager[45041]: <info>  [1759408553.1380] manager: (tap91400667-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/284)
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.145 2 INFO os_vif [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:df:98,bridge_name='br-int',has_traffic_filtering=True,id=91400667-e168-40d0-8f0a-ffc8c9dd7fa4,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91400667-e1')#033[00m
Oct  2 08:35:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e329 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.557 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.984 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.985 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.985 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] No VIF found with MAC fa:16:3e:95:df:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:35:53 np0005465988 nova_compute[236126]: 2025-10-02 12:35:53.986 2 INFO nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Using config drive#033[00m
Oct  2 08:35:54 np0005465988 nova_compute[236126]: 2025-10-02 12:35:54.019 2 DEBUG nova.storage.rbd_utils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image 974fea45-f024-430a-bdbb-a615e05d954c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:35:54 np0005465988 nova_compute[236126]: 2025-10-02 12:35:54.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:54.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e330 e330: 3 total, 3 up, 3 in
Oct  2 08:35:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:54.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:56 np0005465988 nova_compute[236126]: 2025-10-02 12:35:56.045 2 INFO nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Creating config drive at /var/lib/nova/instances/974fea45-f024-430a-bdbb-a615e05d954c/disk.config#033[00m
Oct  2 08:35:56 np0005465988 nova_compute[236126]: 2025-10-02 12:35:56.052 2 DEBUG oslo_concurrency.processutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/974fea45-f024-430a-bdbb-a615e05d954c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvf9z50sd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:56.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:56 np0005465988 nova_compute[236126]: 2025-10-02 12:35:56.222 2 DEBUG oslo_concurrency.processutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/974fea45-f024-430a-bdbb-a615e05d954c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvf9z50sd" returned: 0 in 0.170s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:56 np0005465988 nova_compute[236126]: 2025-10-02 12:35:56.418 2 DEBUG nova.storage.rbd_utils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] rbd image 974fea45-f024-430a-bdbb-a615e05d954c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:35:56 np0005465988 nova_compute[236126]: 2025-10-02 12:35:56.423 2 DEBUG oslo_concurrency.processutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/974fea45-f024-430a-bdbb-a615e05d954c/disk.config 974fea45-f024-430a-bdbb-a615e05d954c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:56.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:57 np0005465988 nova_compute[236126]: 2025-10-02 12:35:57.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:57 np0005465988 nova_compute[236126]: 2025-10-02 12:35:57.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:57 np0005465988 podman[296771]: 2025-10-02 12:35:57.548561356 +0000 UTC m=+0.072457115 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:35:57 np0005465988 podman[296776]: 2025-10-02 12:35:57.556281008 +0000 UTC m=+0.069583812 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:35:57 np0005465988 podman[296770]: 2025-10-02 12:35:57.580277098 +0000 UTC m=+0.111972611 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 08:35:57 np0005465988 nova_compute[236126]: 2025-10-02 12:35:57.626 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:57 np0005465988 nova_compute[236126]: 2025-10-02 12:35:57.627 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:57 np0005465988 nova_compute[236126]: 2025-10-02 12:35:57.627 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:57 np0005465988 nova_compute[236126]: 2025-10-02 12:35:57.627 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:35:57 np0005465988 nova_compute[236126]: 2025-10-02 12:35:57.628 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:35:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2397694068' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:35:58 np0005465988 nova_compute[236126]: 2025-10-02 12:35:58.071 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:58 np0005465988 nova_compute[236126]: 2025-10-02 12:35:58.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:35:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:35:58.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:35:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e330 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:35:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:35:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:35:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:35:58.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:35:58 np0005465988 nova_compute[236126]: 2025-10-02 12:35:58.952 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:35:58 np0005465988 nova_compute[236126]: 2025-10-02 12:35:58.953 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:35:58 np0005465988 nova_compute[236126]: 2025-10-02 12:35:58.965 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000082 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:35:58 np0005465988 nova_compute[236126]: 2025-10-02 12:35:58.965 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000082 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:35:58 np0005465988 nova_compute[236126]: 2025-10-02 12:35:58.965 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000082 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:35:59 np0005465988 nova_compute[236126]: 2025-10-02 12:35:59.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:59 np0005465988 nova_compute[236126]: 2025-10-02 12:35:59.161 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:35:59 np0005465988 nova_compute[236126]: 2025-10-02 12:35:59.163 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4076MB free_disk=20.71868896484375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:35:59 np0005465988 nova_compute[236126]: 2025-10-02 12:35:59.163 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:59 np0005465988 nova_compute[236126]: 2025-10-02 12:35:59.163 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:36:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:00.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:36:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:00.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:00 np0005465988 nova_compute[236126]: 2025-10-02 12:36:00.818 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:36:00 np0005465988 nova_compute[236126]: 2025-10-02 12:36:00.819 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 974fea45-f024-430a-bdbb-a615e05d954c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:36:00 np0005465988 nova_compute[236126]: 2025-10-02 12:36:00.819 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:36:00 np0005465988 nova_compute[236126]: 2025-10-02 12:36:00.820 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:36:00 np0005465988 nova_compute[236126]: 2025-10-02 12:36:00.924 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:36:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/109249967' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:36:01 np0005465988 nova_compute[236126]: 2025-10-02 12:36:01.377 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:01 np0005465988 nova_compute[236126]: 2025-10-02 12:36:01.385 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:36:01 np0005465988 nova_compute[236126]: 2025-10-02 12:36:01.598 2 DEBUG nova.compute.manager [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:01 np0005465988 nova_compute[236126]: 2025-10-02 12:36:01.631 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:36:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:02.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:02 np0005465988 nova_compute[236126]: 2025-10-02 12:36:02.644 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:36:02 np0005465988 nova_compute[236126]: 2025-10-02 12:36:02.644 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.481s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:02.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:02 np0005465988 nova_compute[236126]: 2025-10-02 12:36:02.744 2 DEBUG oslo_concurrency.lockutils [None req-ef4c8647-9a2e-4ce4-a2ca-cd00caf7d245 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 38.742s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:02 np0005465988 nova_compute[236126]: 2025-10-02 12:36:02.746 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 28.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:02 np0005465988 nova_compute[236126]: 2025-10-02 12:36:02.746 2 INFO nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:36:02 np0005465988 nova_compute[236126]: 2025-10-02 12:36:02.746 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:03 np0005465988 nova_compute[236126]: 2025-10-02 12:36:03.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e330 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:36:04 np0005465988 nova_compute[236126]: 2025-10-02 12:36:04.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:04.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:04 np0005465988 nova_compute[236126]: 2025-10-02 12:36:04.284 2 DEBUG oslo_concurrency.processutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/974fea45-f024-430a-bdbb-a615e05d954c/disk.config 974fea45-f024-430a-bdbb-a615e05d954c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 7.861s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:04 np0005465988 nova_compute[236126]: 2025-10-02 12:36:04.284 2 INFO nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Deleting local config drive /var/lib/nova/instances/974fea45-f024-430a-bdbb-a615e05d954c/disk.config because it was imported into RBD.#033[00m
Oct  2 08:36:04 np0005465988 kernel: tap91400667-e1: entered promiscuous mode
Oct  2 08:36:04 np0005465988 NetworkManager[45041]: <info>  [1759408564.3382] manager: (tap91400667-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/285)
Oct  2 08:36:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:04Z|00635|binding|INFO|Claiming lport 91400667-e168-40d0-8f0a-ffc8c9dd7fa4 for this chassis.
Oct  2 08:36:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:04Z|00636|binding|INFO|91400667-e168-40d0-8f0a-ffc8c9dd7fa4: Claiming fa:16:3e:95:df:98 10.100.0.8
Oct  2 08:36:04 np0005465988 nova_compute[236126]: 2025-10-02 12:36:04.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:04 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:04Z|00637|binding|INFO|Setting lport 91400667-e168-40d0-8f0a-ffc8c9dd7fa4 ovn-installed in OVS
Oct  2 08:36:04 np0005465988 nova_compute[236126]: 2025-10-02 12:36:04.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:04 np0005465988 nova_compute[236126]: 2025-10-02 12:36:04.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:04 np0005465988 systemd-udevd[296899]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:36:04 np0005465988 systemd-machined[192594]: New machine qemu-63-instance-0000008b.
Oct  2 08:36:04 np0005465988 NetworkManager[45041]: <info>  [1759408564.3879] device (tap91400667-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:36:04 np0005465988 NetworkManager[45041]: <info>  [1759408564.3886] device (tap91400667-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:36:04 np0005465988 systemd[1]: Started Virtual Machine qemu-63-instance-0000008b.
Oct  2 08:36:04 np0005465988 nova_compute[236126]: 2025-10-02 12:36:04.645 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:04 np0005465988 nova_compute[236126]: 2025-10-02 12:36:04.647 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:04 np0005465988 nova_compute[236126]: 2025-10-02 12:36:04.647 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:04 np0005465988 nova_compute[236126]: 2025-10-02 12:36:04.647 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:36:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:04.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:05 np0005465988 nova_compute[236126]: 2025-10-02 12:36:05.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 e331: 3 total, 3 up, 3 in
Oct  2 08:36:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:05Z|00638|binding|INFO|Setting lport 91400667-e168-40d0-8f0a-ffc8c9dd7fa4 up in Southbound
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.679 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:df:98 10.100.0.8'], port_security=['fa:16:3e:95:df:98 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '974fea45-f024-430a-bdbb-a615e05d954c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc0d63d3b4404ef8858166e8836dd0af', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2a11ff87-bec6-4638-b302-adcd655efba9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=797b6af2-473b-4626-9e97-a0a489119419, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=91400667-e168-40d0-8f0a-ffc8c9dd7fa4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.680 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 91400667-e168-40d0-8f0a-ffc8c9dd7fa4 in datapath d7203b00-e5e4-402e-b777-ac6280fa23ac bound to our chassis#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.682 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7203b00-e5e4-402e-b777-ac6280fa23ac#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.695 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[05a4db3f-5728-4d5e-a552-596befe74fef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.696 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7203b00-e1 in ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.701 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7203b00-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.702 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1fcce466-d2fe-44e2-8170-b5075225a5e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.702 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ff11c73a-d572-4b91-8e6b-61a26297a365]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.724 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[c86c981b-e120-4244-b826-21779eb3ab90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.753 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e0e2a21a-effb-4596-8133-7527682350dc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.788 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e61c1a66-ea40-4d70-8936-7ac0ab11ad6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:05 np0005465988 systemd-udevd[296902]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:36:05 np0005465988 NetworkManager[45041]: <info>  [1759408565.7971] manager: (tapd7203b00-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/286)
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.793 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[238f5be9-7870-4389-8b6c-6f55bf2a9973]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.835 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[b36e16a0-6993-4f64-851a-febc9de5cc73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.839 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[a4a2b00a-c030-4740-bb39-e7298a3b2dc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:05 np0005465988 NetworkManager[45041]: <info>  [1759408565.8806] device (tapd7203b00-e0): carrier: link connected
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.885 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0eea72-44ab-49a7-b6c2-282b19e41982]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:05 np0005465988 nova_compute[236126]: 2025-10-02 12:36:05.902 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408565.902182, 974fea45-f024-430a-bdbb-a615e05d954c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:05 np0005465988 nova_compute[236126]: 2025-10-02 12:36:05.903 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] VM Started (Lifecycle Event)#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.905 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c172e3b6-7e07-4bd7-a642-1f24400ec2d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7203b00-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:c4:e9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662620, 'reachable_time': 35389, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 296976, 'error': None, 'target': 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.918 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1355087d-94b2-4a8f-9ba7-1807896c9cc3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6a:c4e9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662620, 'tstamp': 662620}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 296977, 'error': None, 'target': 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.936 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[76fb2be9-1c03-4080-8abc-5bbc77013e0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7203b00-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:c4:e9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662620, 'reachable_time': 35389, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 296978, 'error': None, 'target': 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:05.971 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9ac0c6de-5596-4ee5-bfb3-3ea7307996eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:06.033 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6a8ddc30-ae43-4630-8819-a79d1c174a7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:06.037 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7203b00-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:06.037 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:06.038 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7203b00-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:06 np0005465988 nova_compute[236126]: 2025-10-02 12:36:06.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:06 np0005465988 NetworkManager[45041]: <info>  [1759408566.0417] manager: (tapd7203b00-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/287)
Oct  2 08:36:06 np0005465988 kernel: tapd7203b00-e0: entered promiscuous mode
Oct  2 08:36:06 np0005465988 nova_compute[236126]: 2025-10-02 12:36:06.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:06.046 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7203b00-e0, col_values=(('external_ids', {'iface-id': '6f9d54ba-3cfb-48b9-bef7-b2077e6931d7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:06 np0005465988 nova_compute[236126]: 2025-10-02 12:36:06.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:06 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:06Z|00639|binding|INFO|Releasing lport 6f9d54ba-3cfb-48b9-bef7-b2077e6931d7 from this chassis (sb_readonly=0)
Oct  2 08:36:06 np0005465988 nova_compute[236126]: 2025-10-02 12:36:06.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:06.076 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7203b00-e5e4-402e-b777-ac6280fa23ac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7203b00-e5e4-402e-b777-ac6280fa23ac.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:06.077 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b339aa3a-e6ed-4565-b1d5-0e0109a53a99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:06.078 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-d7203b00-e5e4-402e-b777-ac6280fa23ac
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/d7203b00-e5e4-402e-b777-ac6280fa23ac.pid.haproxy
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID d7203b00-e5e4-402e-b777-ac6280fa23ac
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:36:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:06.079 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'env', 'PROCESS_TAG=haproxy-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7203b00-e5e4-402e-b777-ac6280fa23ac.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:36:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:06.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:06 np0005465988 nova_compute[236126]: 2025-10-02 12:36:06.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:06 np0005465988 podman[297011]: 2025-10-02 12:36:06.497919834 +0000 UTC m=+0.076240384 container create d8e2ffc4b8aa694f7d1a87d26be081249e4c388371daadc440adc4be2592fef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 08:36:06 np0005465988 podman[297011]: 2025-10-02 12:36:06.455459063 +0000 UTC m=+0.033779603 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:36:06 np0005465988 systemd[1]: Started libpod-conmon-d8e2ffc4b8aa694f7d1a87d26be081249e4c388371daadc440adc4be2592fef6.scope.
Oct  2 08:36:06 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:36:06 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec5c323268d4993353a6a72de6b0e9ad592189fa974d6b70b2cbb12938ba1852/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:36:06 np0005465988 podman[297011]: 2025-10-02 12:36:06.617192615 +0000 UTC m=+0.195513155 container init d8e2ffc4b8aa694f7d1a87d26be081249e4c388371daadc440adc4be2592fef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:36:06 np0005465988 podman[297011]: 2025-10-02 12:36:06.624351711 +0000 UTC m=+0.202672231 container start d8e2ffc4b8aa694f7d1a87d26be081249e4c388371daadc440adc4be2592fef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:36:06 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[297027]: [NOTICE]   (297031) : New worker (297033) forked
Oct  2 08:36:06 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[297027]: [NOTICE]   (297031) : Loading success.
Oct  2 08:36:06 np0005465988 nova_compute[236126]: 2025-10-02 12:36:06.668 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:06 np0005465988 nova_compute[236126]: 2025-10-02 12:36:06.673 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408565.902946, 974fea45-f024-430a-bdbb-a615e05d954c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:06 np0005465988 nova_compute[236126]: 2025-10-02 12:36:06.673 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:36:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:06.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.211 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.217 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.370 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:36:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:07Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:00:a3:24 10.100.0.11
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.561 2 DEBUG nova.compute.manager [req-ffb38ec5-a980-4da4-8f85-05872e7fda98 req-6a39691d-f54d-4159-9333-4ef81de0fe25 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Received event network-vif-plugged-91400667-e168-40d0-8f0a-ffc8c9dd7fa4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.561 2 DEBUG oslo_concurrency.lockutils [req-ffb38ec5-a980-4da4-8f85-05872e7fda98 req-6a39691d-f54d-4159-9333-4ef81de0fe25 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "974fea45-f024-430a-bdbb-a615e05d954c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.562 2 DEBUG oslo_concurrency.lockutils [req-ffb38ec5-a980-4da4-8f85-05872e7fda98 req-6a39691d-f54d-4159-9333-4ef81de0fe25 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.562 2 DEBUG oslo_concurrency.lockutils [req-ffb38ec5-a980-4da4-8f85-05872e7fda98 req-6a39691d-f54d-4159-9333-4ef81de0fe25 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.562 2 DEBUG nova.compute.manager [req-ffb38ec5-a980-4da4-8f85-05872e7fda98 req-6a39691d-f54d-4159-9333-4ef81de0fe25 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Processing event network-vif-plugged-91400667-e168-40d0-8f0a-ffc8c9dd7fa4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.563 2 DEBUG nova.compute.manager [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.568 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.569 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408567.5685875, 974fea45-f024-430a-bdbb-a615e05d954c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.569 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.573 2 INFO nova.virt.libvirt.driver [-] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Instance spawned successfully.#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.574 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.672 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.679 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.680 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.681 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.681 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.682 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.682 2 DEBUG nova.virt.libvirt.driver [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:07 np0005465988 nova_compute[236126]: 2025-10-02 12:36:07.686 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:36:08 np0005465988 nova_compute[236126]: 2025-10-02 12:36:08.038 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:36:08 np0005465988 nova_compute[236126]: 2025-10-02 12:36:08.092 2 INFO nova.compute.manager [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Took 31.04 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:36:08 np0005465988 nova_compute[236126]: 2025-10-02 12:36:08.093 2 DEBUG nova.compute.manager [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:08 np0005465988 nova_compute[236126]: 2025-10-02 12:36:08.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:08 np0005465988 nova_compute[236126]: 2025-10-02 12:36:08.217 2 INFO nova.compute.manager [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Took 35.51 seconds to build instance.#033[00m
Oct  2 08:36:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:36:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:08.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:36:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:36:08 np0005465988 nova_compute[236126]: 2025-10-02 12:36:08.302 2 DEBUG oslo_concurrency.lockutils [None req-7e70ed3b-f115-43fa-a1cd-1dc841608c45 fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 36.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:08 np0005465988 nova_compute[236126]: 2025-10-02 12:36:08.305 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "974fea45-f024-430a-bdbb-a615e05d954c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 34.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:08 np0005465988 nova_compute[236126]: 2025-10-02 12:36:08.457 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "974fea45-f024-430a-bdbb-a615e05d954c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:08.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:09 np0005465988 nova_compute[236126]: 2025-10-02 12:36:09.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:09 np0005465988 podman[297094]: 2025-10-02 12:36:09.542934397 +0000 UTC m=+0.078933571 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:36:09 np0005465988 nova_compute[236126]: 2025-10-02 12:36:09.724 2 DEBUG nova.compute.manager [req-e8abd3b9-78f9-4879-96f1-feef239ac65b req-cd0f7d7e-c273-4d51-aca4-59bb9b8bdc34 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Received event network-vif-plugged-91400667-e168-40d0-8f0a-ffc8c9dd7fa4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:09 np0005465988 nova_compute[236126]: 2025-10-02 12:36:09.724 2 DEBUG oslo_concurrency.lockutils [req-e8abd3b9-78f9-4879-96f1-feef239ac65b req-cd0f7d7e-c273-4d51-aca4-59bb9b8bdc34 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "974fea45-f024-430a-bdbb-a615e05d954c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:09 np0005465988 nova_compute[236126]: 2025-10-02 12:36:09.725 2 DEBUG oslo_concurrency.lockutils [req-e8abd3b9-78f9-4879-96f1-feef239ac65b req-cd0f7d7e-c273-4d51-aca4-59bb9b8bdc34 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:09 np0005465988 nova_compute[236126]: 2025-10-02 12:36:09.725 2 DEBUG oslo_concurrency.lockutils [req-e8abd3b9-78f9-4879-96f1-feef239ac65b req-cd0f7d7e-c273-4d51-aca4-59bb9b8bdc34 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:09 np0005465988 nova_compute[236126]: 2025-10-02 12:36:09.725 2 DEBUG nova.compute.manager [req-e8abd3b9-78f9-4879-96f1-feef239ac65b req-cd0f7d7e-c273-4d51-aca4-59bb9b8bdc34 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] No waiting events found dispatching network-vif-plugged-91400667-e168-40d0-8f0a-ffc8c9dd7fa4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:09 np0005465988 nova_compute[236126]: 2025-10-02 12:36:09.725 2 WARNING nova.compute.manager [req-e8abd3b9-78f9-4879-96f1-feef239ac65b req-cd0f7d7e-c273-4d51-aca4-59bb9b8bdc34 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Received unexpected event network-vif-plugged-91400667-e168-40d0-8f0a-ffc8c9dd7fa4 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:36:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000057s ======
Oct  2 08:36:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:10.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Oct  2 08:36:10 np0005465988 nova_compute[236126]: 2025-10-02 12:36:10.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:10 np0005465988 nova_compute[236126]: 2025-10-02 12:36:10.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:36:10 np0005465988 nova_compute[236126]: 2025-10-02 12:36:10.476 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:36:10 np0005465988 nova_compute[236126]: 2025-10-02 12:36:10.666 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-974fea45-f024-430a-bdbb-a615e05d954c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:36:10 np0005465988 nova_compute[236126]: 2025-10-02 12:36:10.666 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-974fea45-f024-430a-bdbb-a615e05d954c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:36:10 np0005465988 nova_compute[236126]: 2025-10-02 12:36:10.667 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:36:10 np0005465988 nova_compute[236126]: 2025-10-02 12:36:10.667 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 974fea45-f024-430a-bdbb-a615e05d954c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:10.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:12 np0005465988 nova_compute[236126]: 2025-10-02 12:36:12.016 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Updating instance_info_cache with network_info: [{"id": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "address": "fa:16:3e:95:df:98", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91400667-e1", "ovs_interfaceid": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:12 np0005465988 nova_compute[236126]: 2025-10-02 12:36:12.072 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-974fea45-f024-430a-bdbb-a615e05d954c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:12 np0005465988 nova_compute[236126]: 2025-10-02 12:36:12.073 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:36:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:36:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:12.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:36:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:12.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:13 np0005465988 nova_compute[236126]: 2025-10-02 12:36:13.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:36:14 np0005465988 nova_compute[236126]: 2025-10-02 12:36:14.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:14.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:14.527 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:14.529 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:36:14 np0005465988 nova_compute[236126]: 2025-10-02 12:36:14.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:36:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:14.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:36:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:16.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:16.531 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:36:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:16.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:36:18 np0005465988 nova_compute[236126]: 2025-10-02 12:36:18.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:18.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:36:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:18.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:19 np0005465988 nova_compute[236126]: 2025-10-02 12:36:19.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:36:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:20.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:36:20 np0005465988 nova_compute[236126]: 2025-10-02 12:36:20.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:20 np0005465988 nova_compute[236126]: 2025-10-02 12:36:20.633 2 DEBUG oslo_concurrency.lockutils [None req-6166fc38-faae-43e6-9976-490fe9ed7226 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:20 np0005465988 nova_compute[236126]: 2025-10-02 12:36:20.634 2 DEBUG oslo_concurrency.lockutils [None req-6166fc38-faae-43e6-9976-490fe9ed7226 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:20 np0005465988 nova_compute[236126]: 2025-10-02 12:36:20.657 2 INFO nova.compute.manager [None req-6166fc38-faae-43e6-9976-490fe9ed7226 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Detaching volume d674cd8c-5d45-4ef7-9cb1-cd5cf5c7d43c#033[00m
Oct  2 08:36:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:20.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:20 np0005465988 nova_compute[236126]: 2025-10-02 12:36:20.867 2 INFO nova.virt.block_device [None req-6166fc38-faae-43e6-9976-490fe9ed7226 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Attempting to driver detach volume d674cd8c-5d45-4ef7-9cb1-cd5cf5c7d43c from mountpoint /dev/vdc#033[00m
Oct  2 08:36:20 np0005465988 nova_compute[236126]: 2025-10-02 12:36:20.877 2 DEBUG nova.virt.libvirt.driver [None req-6166fc38-faae-43e6-9976-490fe9ed7226 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Attempting to detach device vdc from instance 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:36:20 np0005465988 nova_compute[236126]: 2025-10-02 12:36:20.878 2 DEBUG nova.virt.libvirt.guest [None req-6166fc38-faae-43e6-9976-490fe9ed7226 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:36:20 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:36:20 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-d674cd8c-5d45-4ef7-9cb1-cd5cf5c7d43c">
Oct  2 08:36:20 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:36:20 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:36:20 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:36:20 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:36:20 np0005465988 nova_compute[236126]:  <target dev="vdc" bus="virtio"/>
Oct  2 08:36:20 np0005465988 nova_compute[236126]:  <serial>d674cd8c-5d45-4ef7-9cb1-cd5cf5c7d43c</serial>
Oct  2 08:36:20 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct  2 08:36:20 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:36:20 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:36:20 np0005465988 nova_compute[236126]: 2025-10-02 12:36:20.956 2 INFO nova.virt.libvirt.driver [None req-6166fc38-faae-43e6-9976-490fe9ed7226 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Successfully detached device vdc from instance 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 from the persistent domain config.#033[00m
Oct  2 08:36:20 np0005465988 nova_compute[236126]: 2025-10-02 12:36:20.957 2 DEBUG nova.virt.libvirt.driver [None req-6166fc38-faae-43e6-9976-490fe9ed7226 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Oct  2 08:36:20 np0005465988 nova_compute[236126]: 2025-10-02 12:36:20.958 2 DEBUG nova.virt.libvirt.guest [None req-6166fc38-faae-43e6-9976-490fe9ed7226 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:36:20 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:36:20 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-d674cd8c-5d45-4ef7-9cb1-cd5cf5c7d43c">
Oct  2 08:36:20 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:36:20 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:36:20 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:36:20 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:36:20 np0005465988 nova_compute[236126]:  <target dev="vdc" bus="virtio"/>
Oct  2 08:36:20 np0005465988 nova_compute[236126]:  <serial>d674cd8c-5d45-4ef7-9cb1-cd5cf5c7d43c</serial>
Oct  2 08:36:20 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct  2 08:36:20 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:36:20 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:36:21 np0005465988 nova_compute[236126]: 2025-10-02 12:36:21.429 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Received event <DeviceRemovedEvent: 1759408581.4292598, 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Oct  2 08:36:21 np0005465988 nova_compute[236126]: 2025-10-02 12:36:21.432 2 DEBUG nova.virt.libvirt.driver [None req-6166fc38-faae-43e6-9976-490fe9ed7226 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Oct  2 08:36:21 np0005465988 nova_compute[236126]: 2025-10-02 12:36:21.434 2 INFO nova.virt.libvirt.driver [None req-6166fc38-faae-43e6-9976-490fe9ed7226 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Successfully detached device vdc from instance 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 from the live domain config.#033[00m
Oct  2 08:36:21 np0005465988 nova_compute[236126]: 2025-10-02 12:36:21.594 2 DEBUG nova.objects.instance [None req-6166fc38-faae-43e6-9976-490fe9ed7226 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'flavor' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:21 np0005465988 nova_compute[236126]: 2025-10-02 12:36:21.629 2 DEBUG oslo_concurrency.lockutils [None req-6166fc38-faae-43e6-9976-490fe9ed7226 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.995s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:21 np0005465988 podman[297395]: 2025-10-02 12:36:21.718420569 +0000 UTC m=+0.022658923 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 08:36:21 np0005465988 podman[297395]: 2025-10-02 12:36:21.913070488 +0000 UTC m=+0.217308802 container create bef1eb54ca847e1fefadc86536a6856bb69a528359f343d3ae985c2b405a4fea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_snyder, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 08:36:21 np0005465988 systemd[1]: Started libpod-conmon-bef1eb54ca847e1fefadc86536a6856bb69a528359f343d3ae985c2b405a4fea.scope.
Oct  2 08:36:22 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:36:22 np0005465988 nova_compute[236126]: 2025-10-02 12:36:22.159 2 DEBUG oslo_concurrency.lockutils [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:22 np0005465988 nova_compute[236126]: 2025-10-02 12:36:22.160 2 DEBUG oslo_concurrency.lockutils [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:22 np0005465988 nova_compute[236126]: 2025-10-02 12:36:22.160 2 DEBUG oslo_concurrency.lockutils [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:22 np0005465988 nova_compute[236126]: 2025-10-02 12:36:22.160 2 DEBUG oslo_concurrency.lockutils [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:22 np0005465988 nova_compute[236126]: 2025-10-02 12:36:22.160 2 DEBUG oslo_concurrency.lockutils [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:22 np0005465988 nova_compute[236126]: 2025-10-02 12:36:22.161 2 INFO nova.compute.manager [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Terminating instance#033[00m
Oct  2 08:36:22 np0005465988 nova_compute[236126]: 2025-10-02 12:36:22.162 2 DEBUG nova.compute.manager [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:36:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:36:22 np0005465988 podman[297395]: 2025-10-02 12:36:22.200545707 +0000 UTC m=+0.504784051 container init bef1eb54ca847e1fefadc86536a6856bb69a528359f343d3ae985c2b405a4fea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_snyder, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Oct  2 08:36:22 np0005465988 podman[297395]: 2025-10-02 12:36:22.209955717 +0000 UTC m=+0.514194031 container start bef1eb54ca847e1fefadc86536a6856bb69a528359f343d3ae985c2b405a4fea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_snyder, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Oct  2 08:36:22 np0005465988 systemd[1]: libpod-bef1eb54ca847e1fefadc86536a6856bb69a528359f343d3ae985c2b405a4fea.scope: Deactivated successfully.
Oct  2 08:36:22 np0005465988 serene_snyder[297412]: 167 167
Oct  2 08:36:22 np0005465988 conmon[297412]: conmon bef1eb54ca847e1fefad <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-bef1eb54ca847e1fefadc86536a6856bb69a528359f343d3ae985c2b405a4fea.scope/container/memory.events
Oct  2 08:36:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:22.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:22 np0005465988 podman[297395]: 2025-10-02 12:36:22.469485452 +0000 UTC m=+0.773723846 container attach bef1eb54ca847e1fefadc86536a6856bb69a528359f343d3ae985c2b405a4fea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_snyder, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 08:36:22 np0005465988 podman[297395]: 2025-10-02 12:36:22.470352887 +0000 UTC m=+0.774591231 container died bef1eb54ca847e1fefadc86536a6856bb69a528359f343d3ae985c2b405a4fea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_snyder, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True)
Oct  2 08:36:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:22.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:22 np0005465988 kernel: tapc9dd6bc4-09 (unregistering): left promiscuous mode
Oct  2 08:36:22 np0005465988 NetworkManager[45041]: <info>  [1759408582.7682] device (tapc9dd6bc4-09): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:36:22 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:22Z|00640|binding|INFO|Releasing lport c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b from this chassis (sb_readonly=0)
Oct  2 08:36:22 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:22Z|00641|binding|INFO|Setting lport c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b down in Southbound
Oct  2 08:36:22 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:22Z|00642|binding|INFO|Removing iface tapc9dd6bc4-09 ovn-installed in OVS
Oct  2 08:36:22 np0005465988 nova_compute[236126]: 2025-10-02 12:36:22.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:22 np0005465988 nova_compute[236126]: 2025-10-02 12:36:22.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:22.794 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:a3:24 10.100.0.11'], port_security=['fa:16:3e:00:a3:24 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '8736e2a4-70c8-46c1-8ce5-ff68395a22c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a05e525420b4aa8adcc9561158e73d1', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'cd7967e6-b4ee-4d94-ab54-c08775c150e7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.179', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=709db70f-1209-49b9-bf90-2b91d986925d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:22.796 142124 INFO neutron.agent.ovn.metadata.agent [-] Port c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b in datapath 7b216831-24ac-41f0-ac1c-99aae9bc897b unbound from our chassis#033[00m
Oct  2 08:36:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:22.798 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b216831-24ac-41f0-ac1c-99aae9bc897b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:36:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:22.800 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7d3b1fe0-958f-4122-a372-2f8ce48f49c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:22.800 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b namespace which is not needed anymore#033[00m
Oct  2 08:36:22 np0005465988 nova_compute[236126]: 2025-10-02 12:36:22.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:22 np0005465988 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000082.scope: Deactivated successfully.
Oct  2 08:36:22 np0005465988 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000082.scope: Consumed 16.177s CPU time.
Oct  2 08:36:22 np0005465988 systemd-machined[192594]: Machine qemu-62-instance-00000082 terminated.
Oct  2 08:36:22 np0005465988 systemd[1]: var-lib-containers-storage-overlay-c57a8138b784ebb2040d86450a6a1e525db497cbe3fbcd4da629b4c5d6459e95-merged.mount: Deactivated successfully.
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.007 2 INFO nova.virt.libvirt.driver [-] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Instance destroyed successfully.#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.007 2 DEBUG nova.objects.instance [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'resources' on Instance uuid 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.031 2 DEBUG nova.virt.libvirt.vif [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:34:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-639305243',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-639305243',id=130,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH7i72qZf0LyRp/akt/bIu4snLJg8XuAUIsHpF3xOK1XlpVLYZ/YFzz7wr2QY5za8QZBy0/Efb6X+c12F9Zi3EqjS+0mqhH0nerFk7xvdGE6zlwRcwJDWaW/qlypPLaWbQ==',key_name='tempest-keypair-1186187448',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1a05e525420b4aa8adcc9561158e73d1',ramdisk_id='',reservation_id='r-tje0hz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-405673070',owner_user_name='tempest-AttachVolumeShelveTestJSON-405673070-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:36:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bcd36ab668f449959719ba7058f25e72',uuid=8736e2a4-70c8-46c1-8ce5-ff68395a22c9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.032 2 DEBUG nova.network.os_vif_util [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converting VIF {"id": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "address": "fa:16:3e:00:a3:24", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9dd6bc4-09", "ovs_interfaceid": "c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.033 2 DEBUG nova.network.os_vif_util [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:a3:24,bridge_name='br-int',has_traffic_filtering=True,id=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9dd6bc4-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.033 2 DEBUG os_vif [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:a3:24,bridge_name='br-int',has_traffic_filtering=True,id=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9dd6bc4-09') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.035 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc9dd6bc4-09, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.040 2 INFO os_vif [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:a3:24,bridge_name='br-int',has_traffic_filtering=True,id=c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9dd6bc4-09')#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.084 2 DEBUG nova.compute.manager [req-3b5ed7d1-b421-4bf8-b761-24b1713319db req-8c43de5e-4eb4-4736-9084-9d986b45be30 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received event network-vif-unplugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.085 2 DEBUG oslo_concurrency.lockutils [req-3b5ed7d1-b421-4bf8-b761-24b1713319db req-8c43de5e-4eb4-4736-9084-9d986b45be30 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.085 2 DEBUG oslo_concurrency.lockutils [req-3b5ed7d1-b421-4bf8-b761-24b1713319db req-8c43de5e-4eb4-4736-9084-9d986b45be30 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.085 2 DEBUG oslo_concurrency.lockutils [req-3b5ed7d1-b421-4bf8-b761-24b1713319db req-8c43de5e-4eb4-4736-9084-9d986b45be30 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.085 2 DEBUG nova.compute.manager [req-3b5ed7d1-b421-4bf8-b761-24b1713319db req-8c43de5e-4eb4-4736-9084-9d986b45be30 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] No waiting events found dispatching network-vif-unplugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.086 2 DEBUG nova.compute.manager [req-3b5ed7d1-b421-4bf8-b761-24b1713319db req-8c43de5e-4eb4-4736-9084-9d986b45be30 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received event network-vif-unplugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:36:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:36:23 np0005465988 podman[297395]: 2025-10-02 12:36:23.438859714 +0000 UTC m=+1.743098068 container remove bef1eb54ca847e1fefadc86536a6856bb69a528359f343d3ae985c2b405a4fea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_snyder, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct  2 08:36:23 np0005465988 systemd[1]: libpod-conmon-bef1eb54ca847e1fefadc86536a6856bb69a528359f343d3ae985c2b405a4fea.scope: Deactivated successfully.
Oct  2 08:36:23 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:36:23 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:36:23 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:36:23 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[296627]: [NOTICE]   (296631) : haproxy version is 2.8.14-c23fe91
Oct  2 08:36:23 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[296627]: [NOTICE]   (296631) : path to executable is /usr/sbin/haproxy
Oct  2 08:36:23 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[296627]: [WARNING]  (296631) : Exiting Master process...
Oct  2 08:36:23 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[296627]: [ALERT]    (296631) : Current worker (296648) exited with code 143 (Terminated)
Oct  2 08:36:23 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[296627]: [WARNING]  (296631) : All workers exited. Exiting... (0)
Oct  2 08:36:23 np0005465988 systemd[1]: libpod-99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2.scope: Deactivated successfully.
Oct  2 08:36:23 np0005465988 conmon[296627]: conmon 99a56af69af5abd7f6ad <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2.scope/container/memory.events
Oct  2 08:36:23 np0005465988 podman[297483]: 2025-10-02 12:36:23.751894468 +0000 UTC m=+0.179800863 container died 99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:36:23 np0005465988 nova_compute[236126]: 2025-10-02 12:36:23.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:23 np0005465988 podman[297501]: 2025-10-02 12:36:23.756398207 +0000 UTC m=+0.114438512 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 08:36:23 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:23Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:df:98 10.100.0.8
Oct  2 08:36:23 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:23Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:df:98 10.100.0.8
Oct  2 08:36:24 np0005465988 nova_compute[236126]: 2025-10-02 12:36:24.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:24 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2-userdata-shm.mount: Deactivated successfully.
Oct  2 08:36:24 np0005465988 systemd[1]: var-lib-containers-storage-overlay-0847bb8c16e3554679516ff56dcaa4602d206105fae67a01472a4830dd8bcb22-merged.mount: Deactivated successfully.
Oct  2 08:36:24 np0005465988 nova_compute[236126]: 2025-10-02 12:36:24.067 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:24.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:24 np0005465988 podman[297483]: 2025-10-02 12:36:24.624038933 +0000 UTC m=+1.051945358 container cleanup 99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:36:24 np0005465988 systemd[1]: libpod-conmon-99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2.scope: Deactivated successfully.
Oct  2 08:36:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:36:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:24.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:36:24 np0005465988 podman[297501]: 2025-10-02 12:36:24.92153287 +0000 UTC m=+1.279573175 container create 6c8f1212c0114eaf0a6fb7c8eaee9e1eb63abba59102ffe6ebbe36d0fa101f86 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_perlman, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct  2 08:36:25 np0005465988 nova_compute[236126]: 2025-10-02 12:36:25.167 2 DEBUG nova.compute.manager [req-bd01d185-dafc-4748-b15a-ddf8ed3e9a4f req-74750094-9c05-437c-9a82-ff9619a22b40 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received event network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:25 np0005465988 nova_compute[236126]: 2025-10-02 12:36:25.169 2 DEBUG oslo_concurrency.lockutils [req-bd01d185-dafc-4748-b15a-ddf8ed3e9a4f req-74750094-9c05-437c-9a82-ff9619a22b40 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:25 np0005465988 nova_compute[236126]: 2025-10-02 12:36:25.169 2 DEBUG oslo_concurrency.lockutils [req-bd01d185-dafc-4748-b15a-ddf8ed3e9a4f req-74750094-9c05-437c-9a82-ff9619a22b40 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:25 np0005465988 nova_compute[236126]: 2025-10-02 12:36:25.170 2 DEBUG oslo_concurrency.lockutils [req-bd01d185-dafc-4748-b15a-ddf8ed3e9a4f req-74750094-9c05-437c-9a82-ff9619a22b40 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:25 np0005465988 nova_compute[236126]: 2025-10-02 12:36:25.170 2 DEBUG nova.compute.manager [req-bd01d185-dafc-4748-b15a-ddf8ed3e9a4f req-74750094-9c05-437c-9a82-ff9619a22b40 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] No waiting events found dispatching network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:25 np0005465988 nova_compute[236126]: 2025-10-02 12:36:25.171 2 WARNING nova.compute.manager [req-bd01d185-dafc-4748-b15a-ddf8ed3e9a4f req-74750094-9c05-437c-9a82-ff9619a22b40 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received unexpected event network-vif-plugged-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:36:25 np0005465988 systemd[1]: Started libpod-conmon-6c8f1212c0114eaf0a6fb7c8eaee9e1eb63abba59102ffe6ebbe36d0fa101f86.scope.
Oct  2 08:36:25 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:36:25 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74400aacdd7f4a4434c7e7fc4488bdd5bb11f0285110d501bdcbdb07e469086f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct  2 08:36:25 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74400aacdd7f4a4434c7e7fc4488bdd5bb11f0285110d501bdcbdb07e469086f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 08:36:25 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74400aacdd7f4a4434c7e7fc4488bdd5bb11f0285110d501bdcbdb07e469086f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 08:36:25 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74400aacdd7f4a4434c7e7fc4488bdd5bb11f0285110d501bdcbdb07e469086f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct  2 08:36:25 np0005465988 podman[297501]: 2025-10-02 12:36:25.327470956 +0000 UTC m=+1.685511331 container init 6c8f1212c0114eaf0a6fb7c8eaee9e1eb63abba59102ffe6ebbe36d0fa101f86 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_perlman, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Oct  2 08:36:25 np0005465988 podman[297501]: 2025-10-02 12:36:25.342177239 +0000 UTC m=+1.700217524 container start 6c8f1212c0114eaf0a6fb7c8eaee9e1eb63abba59102ffe6ebbe36d0fa101f86 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_perlman, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True)
Oct  2 08:36:25 np0005465988 podman[297501]: 2025-10-02 12:36:25.346772761 +0000 UTC m=+1.704813146 container attach 6c8f1212c0114eaf0a6fb7c8eaee9e1eb63abba59102ffe6ebbe36d0fa101f86 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_perlman, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct  2 08:36:25 np0005465988 podman[297532]: 2025-10-02 12:36:25.355093431 +0000 UTC m=+0.695274110 container remove 99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:36:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:25.367 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[729a37b2-f119-4243-971d-918a7909bd32]: (4, ('Thu Oct  2 12:36:23 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b (99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2)\n99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2\nThu Oct  2 12:36:24 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b (99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2)\n99a56af69af5abd7f6ad7f2be35a440a28be05a22190824cddb1638a91635ac2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:25.369 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b1bc853a-ee80-40db-8d84-2929743f10d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:25.370 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b216831-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:25 np0005465988 nova_compute[236126]: 2025-10-02 12:36:25.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:25 np0005465988 kernel: tap7b216831-20: left promiscuous mode
Oct  2 08:36:25 np0005465988 nova_compute[236126]: 2025-10-02 12:36:25.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:25 np0005465988 nova_compute[236126]: 2025-10-02 12:36:25.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:25.417 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a0600e-4611-4d2a-b75a-78a3681a3601]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:25.446 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5eeffae3-2b9c-4970-8c80-2a335dbed3b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:25.448 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cf6dc090-865f-467f-bcdf-08d163df648d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:25.473 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[db917c16-0b61-4f4f-ba39-df9139e370b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660923, 'reachable_time': 26714, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297557, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:25 np0005465988 systemd[1]: run-netns-ovnmeta\x2d7b216831\x2d24ac\x2d41f0\x2dac1c\x2d99aae9bc897b.mount: Deactivated successfully.
Oct  2 08:36:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:25.476 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:36:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:25.477 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[81f8c1a3-2a37-4f2a-ac18-10e76641f536]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:26.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:36:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:26.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]: [
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:    {
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:        "available": false,
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:        "ceph_device": false,
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:        "device_id": "QEMU_DVD-ROM_QM00001",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:        "lsm_data": {},
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:        "lvs": [],
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:        "path": "/dev/sr0",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:        "rejected_reasons": [
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "Has a FileSystem",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "Insufficient space (<5GB)"
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:        ],
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:        "sys_api": {
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "actuators": null,
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "device_nodes": "sr0",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "devname": "sr0",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "human_readable_size": "482.00 KB",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "id_bus": "ata",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "model": "QEMU DVD-ROM",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "nr_requests": "2",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "parent": "/dev/sr0",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "partitions": {},
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "path": "/dev/sr0",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "removable": "1",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "rev": "2.5+",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "ro": "0",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "rotational": "0",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "sas_address": "",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "sas_device_handle": "",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "scheduler_mode": "mq-deadline",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "sectors": 0,
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "sectorsize": "2048",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "size": 493568.0,
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "support_discard": "2048",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "type": "disk",
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:            "vendor": "QEMU"
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:        }
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]:    }
Oct  2 08:36:26 np0005465988 reverent_perlman[297549]: ]
Oct  2 08:36:26 np0005465988 systemd[1]: libpod-6c8f1212c0114eaf0a6fb7c8eaee9e1eb63abba59102ffe6ebbe36d0fa101f86.scope: Deactivated successfully.
Oct  2 08:36:26 np0005465988 podman[297501]: 2025-10-02 12:36:26.896719512 +0000 UTC m=+3.254759807 container died 6c8f1212c0114eaf0a6fb7c8eaee9e1eb63abba59102ffe6ebbe36d0fa101f86 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_perlman, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:36:26 np0005465988 systemd[1]: libpod-6c8f1212c0114eaf0a6fb7c8eaee9e1eb63abba59102ffe6ebbe36d0fa101f86.scope: Consumed 1.450s CPU time.
Oct  2 08:36:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:27.366 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:27.366 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:27.367 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:27 np0005465988 systemd[1]: var-lib-containers-storage-overlay-74400aacdd7f4a4434c7e7fc4488bdd5bb11f0285110d501bdcbdb07e469086f-merged.mount: Deactivated successfully.
Oct  2 08:36:27 np0005465988 podman[297501]: 2025-10-02 12:36:27.539861631 +0000 UTC m=+3.897901946 container remove 6c8f1212c0114eaf0a6fb7c8eaee9e1eb63abba59102ffe6ebbe36d0fa101f86 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_perlman, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 08:36:27 np0005465988 systemd[1]: libpod-conmon-6c8f1212c0114eaf0a6fb7c8eaee9e1eb63abba59102ffe6ebbe36d0fa101f86.scope: Deactivated successfully.
Oct  2 08:36:27 np0005465988 podman[298826]: 2025-10-02 12:36:27.741196191 +0000 UTC m=+0.077638994 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 08:36:27 np0005465988 podman[298825]: 2025-10-02 12:36:27.749699535 +0000 UTC m=+0.080972550 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:36:27 np0005465988 podman[298824]: 2025-10-02 12:36:27.776169047 +0000 UTC m=+0.120483947 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:36:28 np0005465988 nova_compute[236126]: 2025-10-02 12:36:28.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:36:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:28.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:36:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:36:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:28.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:29 np0005465988 nova_compute[236126]: 2025-10-02 12:36:29.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:36:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:36:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:36:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:36:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:36:30 np0005465988 nova_compute[236126]: 2025-10-02 12:36:30.202 2 INFO nova.virt.libvirt.driver [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Deleting instance files /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9_del#033[00m
Oct  2 08:36:30 np0005465988 nova_compute[236126]: 2025-10-02 12:36:30.203 2 INFO nova.virt.libvirt.driver [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Deletion of /var/lib/nova/instances/8736e2a4-70c8-46c1-8ce5-ff68395a22c9_del complete#033[00m
Oct  2 08:36:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:30.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:30 np0005465988 nova_compute[236126]: 2025-10-02 12:36:30.271 2 INFO nova.compute.manager [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Took 8.11 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:36:30 np0005465988 nova_compute[236126]: 2025-10-02 12:36:30.272 2 DEBUG oslo.service.loopingcall [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:36:30 np0005465988 nova_compute[236126]: 2025-10-02 12:36:30.273 2 DEBUG nova.compute.manager [-] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:36:30 np0005465988 nova_compute[236126]: 2025-10-02 12:36:30.273 2 DEBUG nova.network.neutron [-] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:36:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:36:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:30.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:36:31 np0005465988 nova_compute[236126]: 2025-10-02 12:36:31.321 2 DEBUG nova.network.neutron [-] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:31 np0005465988 nova_compute[236126]: 2025-10-02 12:36:31.394 2 INFO nova.compute.manager [-] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Took 1.12 seconds to deallocate network for instance.#033[00m
Oct  2 08:36:31 np0005465988 nova_compute[236126]: 2025-10-02 12:36:31.507 2 DEBUG oslo_concurrency.lockutils [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:31 np0005465988 nova_compute[236126]: 2025-10-02 12:36:31.508 2 DEBUG oslo_concurrency.lockutils [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:31 np0005465988 nova_compute[236126]: 2025-10-02 12:36:31.513 2 DEBUG nova.compute.manager [req-4f8e5dd6-5ddc-495e-b906-b99f88d37b4e req-f84a0f64-8e42-4cc4-bcc8-94bbb2f49d5b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Received event network-vif-deleted-c9dd6bc4-09d5-4e1e-aff9-af9aae80a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:31 np0005465988 nova_compute[236126]: 2025-10-02 12:36:31.576 2 DEBUG oslo_concurrency.processutils [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:36:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/86529349' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:36:32 np0005465988 nova_compute[236126]: 2025-10-02 12:36:32.047 2 DEBUG oslo_concurrency.processutils [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:32 np0005465988 nova_compute[236126]: 2025-10-02 12:36:32.056 2 DEBUG nova.compute.provider_tree [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:36:32 np0005465988 nova_compute[236126]: 2025-10-02 12:36:32.077 2 DEBUG nova.scheduler.client.report [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:36:32 np0005465988 nova_compute[236126]: 2025-10-02 12:36:32.116 2 DEBUG oslo_concurrency.lockutils [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:32 np0005465988 nova_compute[236126]: 2025-10-02 12:36:32.174 2 INFO nova.scheduler.client.report [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Deleted allocations for instance 8736e2a4-70c8-46c1-8ce5-ff68395a22c9#033[00m
Oct  2 08:36:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:32.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:32 np0005465988 nova_compute[236126]: 2025-10-02 12:36:32.271 2 DEBUG oslo_concurrency.lockutils [None req-c5b74599-c437-479a-85e1-7abbf2fc91cb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "8736e2a4-70c8-46c1-8ce5-ff68395a22c9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:32.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:33 np0005465988 nova_compute[236126]: 2025-10-02 12:36:33.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:36:34 np0005465988 nova_compute[236126]: 2025-10-02 12:36:34.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:36:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:34.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:36:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:36:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:34.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #103. Immutable memtables: 0.
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.073324) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 63] Flushing memtable with next log file: 103
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408595073443, "job": 63, "event": "flush_started", "num_memtables": 1, "num_entries": 1895, "num_deletes": 257, "total_data_size": 4475138, "memory_usage": 4537888, "flush_reason": "Manual Compaction"}
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 63] Level-0 flush table #104: started
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408595118243, "cf_name": "default", "job": 63, "event": "table_file_creation", "file_number": 104, "file_size": 2909933, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 52689, "largest_seqno": 54579, "table_properties": {"data_size": 2901821, "index_size": 4862, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 17419, "raw_average_key_size": 20, "raw_value_size": 2885420, "raw_average_value_size": 3406, "num_data_blocks": 211, "num_entries": 847, "num_filter_entries": 847, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759408438, "oldest_key_time": 1759408438, "file_creation_time": 1759408595, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 104, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 63] Flush lasted 45081 microseconds, and 11028 cpu microseconds.
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.118340) [db/flush_job.cc:967] [default] [JOB 63] Level-0 flush table #104: 2909933 bytes OK
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.118437) [db/memtable_list.cc:519] [default] Level-0 commit table #104 started
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.120574) [db/memtable_list.cc:722] [default] Level-0 commit table #104: memtable #1 done
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.120591) EVENT_LOG_v1 {"time_micros": 1759408595120585, "job": 63, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.120609) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 63] Try to delete WAL files size 4466488, prev total WAL file size 4466488, number of live WAL files 2.
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000100.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.122198) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031373536' seq:72057594037927935, type:22 .. '6C6F676D0032303037' seq:0, type:0; will stop at (end)
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 64] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 63 Base level 0, inputs: [104(2841KB)], [102(10MB)]
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408595122252, "job": 64, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [104], "files_L6": [102], "score": -1, "input_data_size": 13501147, "oldest_snapshot_seqno": -1}
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 64] Generated table #105: 7988 keys, 13336419 bytes, temperature: kUnknown
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408595199179, "cf_name": "default", "job": 64, "event": "table_file_creation", "file_number": 105, "file_size": 13336419, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13280884, "index_size": 34431, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20037, "raw_key_size": 206157, "raw_average_key_size": 25, "raw_value_size": 13136450, "raw_average_value_size": 1644, "num_data_blocks": 1361, "num_entries": 7988, "num_filter_entries": 7988, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759408595, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 105, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.199449) [db/compaction/compaction_job.cc:1663] [default] [JOB 64] Compacted 1@0 + 1@6 files to L6 => 13336419 bytes
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.200712) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 175.4 rd, 173.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.8, 10.1 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(9.2) write-amplify(4.6) OK, records in: 8523, records dropped: 535 output_compression: NoCompression
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.200727) EVENT_LOG_v1 {"time_micros": 1759408595200720, "job": 64, "event": "compaction_finished", "compaction_time_micros": 76994, "compaction_time_cpu_micros": 41675, "output_level": 6, "num_output_files": 1, "total_output_size": 13336419, "num_input_records": 8523, "num_output_records": 7988, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000104.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408595201261, "job": 64, "event": "table_file_deletion", "file_number": 104}
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000102.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408595203109, "job": 64, "event": "table_file_deletion", "file_number": 102}
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.122053) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.203174) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.203182) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.203185) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.203188) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:36:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:36:35.203191) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:36:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:36.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:36:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:36.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:36:38 np0005465988 nova_compute[236126]: 2025-10-02 12:36:38.006 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408583.0040839, 8736e2a4-70c8-46c1-8ce5-ff68395a22c9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:38 np0005465988 nova_compute[236126]: 2025-10-02 12:36:38.006 2 INFO nova.compute.manager [-] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:36:38 np0005465988 nova_compute[236126]: 2025-10-02 12:36:38.026 2 DEBUG nova.compute.manager [None req-fc13ffde-7e26-4f8e-8f22-2ccc2eb15021 - - - - - -] [instance: 8736e2a4-70c8-46c1-8ce5-ff68395a22c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:38 np0005465988 nova_compute[236126]: 2025-10-02 12:36:38.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:38.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:36:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:36:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:38.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.120 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.121 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.144 2 DEBUG nova.compute.manager [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.219 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.220 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.229 2 DEBUG nova.virt.hardware [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.230 2 INFO nova.compute.claims [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.354 2 DEBUG oslo_concurrency.processutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:36:39 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/531003767' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.853 2 DEBUG oslo_concurrency.processutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.861 2 DEBUG nova.compute.provider_tree [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.892 2 DEBUG nova.scheduler.client.report [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.926 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.927 2 DEBUG nova.compute.manager [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.997 2 DEBUG nova.compute.manager [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:36:39 np0005465988 nova_compute[236126]: 2025-10-02 12:36:39.997 2 DEBUG nova.network.neutron [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.018 2 INFO nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.057 2 DEBUG nova.compute.manager [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.155 2 DEBUG nova.compute.manager [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.157 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.157 2 INFO nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Creating image(s)#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.188 2 DEBUG nova.storage.rbd_utils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image be1174bf-d7e1-4801-a2eb-67020632d637_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.218 2 DEBUG nova.storage.rbd_utils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image be1174bf-d7e1-4801-a2eb-67020632d637_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.247 2 DEBUG nova.storage.rbd_utils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image be1174bf-d7e1-4801-a2eb-67020632d637_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.251 2 DEBUG oslo_concurrency.processutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:40.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.290 2 DEBUG nova.policy [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bcd36ab668f449959719ba7058f25e72', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1a05e525420b4aa8adcc9561158e73d1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.338 2 DEBUG oslo_concurrency.processutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.339 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.340 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.340 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.374 2 DEBUG nova.storage.rbd_utils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image be1174bf-d7e1-4801-a2eb-67020632d637_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.378 2 DEBUG oslo_concurrency.processutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 be1174bf-d7e1-4801-a2eb-67020632d637_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:40 np0005465988 podman[299071]: 2025-10-02 12:36:40.518134003 +0000 UTC m=+0.058448142 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:36:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:40.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:40 np0005465988 nova_compute[236126]: 2025-10-02 12:36:40.940 2 DEBUG oslo_concurrency.processutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 be1174bf-d7e1-4801-a2eb-67020632d637_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.562s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:41 np0005465988 nova_compute[236126]: 2025-10-02 12:36:41.016 2 DEBUG nova.storage.rbd_utils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] resizing rbd image be1174bf-d7e1-4801-a2eb-67020632d637_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:36:41 np0005465988 nova_compute[236126]: 2025-10-02 12:36:41.152 2 DEBUG nova.objects.instance [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'migration_context' on Instance uuid be1174bf-d7e1-4801-a2eb-67020632d637 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:41 np0005465988 nova_compute[236126]: 2025-10-02 12:36:41.169 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:36:41 np0005465988 nova_compute[236126]: 2025-10-02 12:36:41.169 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Ensure instance console log exists: /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:36:41 np0005465988 nova_compute[236126]: 2025-10-02 12:36:41.170 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:41 np0005465988 nova_compute[236126]: 2025-10-02 12:36:41.170 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:41 np0005465988 nova_compute[236126]: 2025-10-02 12:36:41.170 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:41 np0005465988 nova_compute[236126]: 2025-10-02 12:36:41.346 2 DEBUG nova.network.neutron [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Successfully created port: 2af53b80-5072-4407-80f0-88120c0351f7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:36:42 np0005465988 nova_compute[236126]: 2025-10-02 12:36:42.235 2 DEBUG nova.network.neutron [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Successfully updated port: 2af53b80-5072-4407-80f0-88120c0351f7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:36:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:36:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:42.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:36:42 np0005465988 nova_compute[236126]: 2025-10-02 12:36:42.265 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:36:42 np0005465988 nova_compute[236126]: 2025-10-02 12:36:42.265 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquired lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:36:42 np0005465988 nova_compute[236126]: 2025-10-02 12:36:42.265 2 DEBUG nova.network.neutron [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:36:42 np0005465988 nova_compute[236126]: 2025-10-02 12:36:42.371 2 DEBUG nova.compute.manager [req-396b4ce2-381d-4e71-b00b-11da5ae419e4 req-dd566772-a67d-4b9c-9d77-7cf25da0f83c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received event network-changed-2af53b80-5072-4407-80f0-88120c0351f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:42 np0005465988 nova_compute[236126]: 2025-10-02 12:36:42.372 2 DEBUG nova.compute.manager [req-396b4ce2-381d-4e71-b00b-11da5ae419e4 req-dd566772-a67d-4b9c-9d77-7cf25da0f83c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Refreshing instance network info cache due to event network-changed-2af53b80-5072-4407-80f0-88120c0351f7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:36:42 np0005465988 nova_compute[236126]: 2025-10-02 12:36:42.373 2 DEBUG oslo_concurrency.lockutils [req-396b4ce2-381d-4e71-b00b-11da5ae419e4 req-dd566772-a67d-4b9c-9d77-7cf25da0f83c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:36:42 np0005465988 nova_compute[236126]: 2025-10-02 12:36:42.439 2 DEBUG nova.network.neutron [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:36:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:36:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:42.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.291 2 DEBUG nova.network.neutron [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updating instance_info_cache with network_info: [{"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.312 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Releasing lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.313 2 DEBUG nova.compute.manager [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Instance network_info: |[{"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.314 2 DEBUG oslo_concurrency.lockutils [req-396b4ce2-381d-4e71-b00b-11da5ae419e4 req-dd566772-a67d-4b9c-9d77-7cf25da0f83c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.314 2 DEBUG nova.network.neutron [req-396b4ce2-381d-4e71-b00b-11da5ae419e4 req-dd566772-a67d-4b9c-9d77-7cf25da0f83c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Refreshing network info cache for port 2af53b80-5072-4407-80f0-88120c0351f7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.320 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Start _get_guest_xml network_info=[{"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.326 2 WARNING nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:36:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.341 2 DEBUG nova.virt.libvirt.host [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.343 2 DEBUG nova.virt.libvirt.host [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.349 2 DEBUG nova.virt.libvirt.host [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.349 2 DEBUG nova.virt.libvirt.host [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.351 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.352 2 DEBUG nova.virt.hardware [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.353 2 DEBUG nova.virt.hardware [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.353 2 DEBUG nova.virt.hardware [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.354 2 DEBUG nova.virt.hardware [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.354 2 DEBUG nova.virt.hardware [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.355 2 DEBUG nova.virt.hardware [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.355 2 DEBUG nova.virt.hardware [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.355 2 DEBUG nova.virt.hardware [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.356 2 DEBUG nova.virt.hardware [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.356 2 DEBUG nova.virt.hardware [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.357 2 DEBUG nova.virt.hardware [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.361 2 DEBUG oslo_concurrency.processutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:36:43 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/397089190' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.830 2 DEBUG oslo_concurrency.processutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.867 2 DEBUG nova.storage.rbd_utils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image be1174bf-d7e1-4801-a2eb-67020632d637_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:36:43 np0005465988 nova_compute[236126]: 2025-10-02 12:36:43.873 2 DEBUG oslo_concurrency.processutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:44.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:36:44 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3417013561' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.425 2 DEBUG oslo_concurrency.processutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.552s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.426 2 DEBUG nova.virt.libvirt.vif [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:36:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2076879726',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-2076879726',id=142,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA2KA3xcRPebgvgTazx0E34aPT9rxhs35D4g1Uzjz4PIuwR8cc5jli8pSQUOimuckXeWKODOH/ieI/CBtPk6/J+xp5vS+z3Hichw9+q7Uc2dLlyh1Q0msK0J8MjXf0hI6Q==',key_name='tempest-keypair-152732924',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a05e525420b4aa8adcc9561158e73d1',ramdisk_id='',reservation_id='r-hgcyfy0h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-405673070',owner_user_name='tempest-AttachVolumeShelveTestJSON-405673070-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:36:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bcd36ab668f449959719ba7058f25e72',uuid=be1174bf-d7e1-4801-a2eb-67020632d637,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.426 2 DEBUG nova.network.os_vif_util [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converting VIF {"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.427 2 DEBUG nova.network.os_vif_util [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3b:23,bridge_name='br-int',has_traffic_filtering=True,id=2af53b80-5072-4407-80f0-88120c0351f7,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2af53b80-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.428 2 DEBUG nova.objects.instance [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'pci_devices' on Instance uuid be1174bf-d7e1-4801-a2eb-67020632d637 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.448 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  <uuid>be1174bf-d7e1-4801-a2eb-67020632d637</uuid>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  <name>instance-0000008e</name>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-2076879726</nova:name>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:36:43</nova:creationTime>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <nova:user uuid="bcd36ab668f449959719ba7058f25e72">tempest-AttachVolumeShelveTestJSON-405673070-project-member</nova:user>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <nova:project uuid="1a05e525420b4aa8adcc9561158e73d1">tempest-AttachVolumeShelveTestJSON-405673070</nova:project>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <nova:port uuid="2af53b80-5072-4407-80f0-88120c0351f7">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <entry name="serial">be1174bf-d7e1-4801-a2eb-67020632d637</entry>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <entry name="uuid">be1174bf-d7e1-4801-a2eb-67020632d637</entry>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/be1174bf-d7e1-4801-a2eb-67020632d637_disk">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/be1174bf-d7e1-4801-a2eb-67020632d637_disk.config">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:c9:3b:23"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <target dev="tap2af53b80-50"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/console.log" append="off"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:36:44 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:36:44 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:36:44 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:36:44 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.449 2 DEBUG nova.compute.manager [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Preparing to wait for external event network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.450 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.450 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.450 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.451 2 DEBUG nova.virt.libvirt.vif [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:36:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2076879726',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-2076879726',id=142,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA2KA3xcRPebgvgTazx0E34aPT9rxhs35D4g1Uzjz4PIuwR8cc5jli8pSQUOimuckXeWKODOH/ieI/CBtPk6/J+xp5vS+z3Hichw9+q7Uc2dLlyh1Q0msK0J8MjXf0hI6Q==',key_name='tempest-keypair-152732924',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a05e525420b4aa8adcc9561158e73d1',ramdisk_id='',reservation_id='r-hgcyfy0h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-405673070',owner_user_name='tempest-AttachVolumeShelveTestJSON-405673070-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:36:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bcd36ab668f449959719ba7058f25e72',uuid=be1174bf-d7e1-4801-a2eb-67020632d637,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.451 2 DEBUG nova.network.os_vif_util [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converting VIF {"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.452 2 DEBUG nova.network.os_vif_util [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3b:23,bridge_name='br-int',has_traffic_filtering=True,id=2af53b80-5072-4407-80f0-88120c0351f7,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2af53b80-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.452 2 DEBUG os_vif [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3b:23,bridge_name='br-int',has_traffic_filtering=True,id=2af53b80-5072-4407-80f0-88120c0351f7,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2af53b80-50') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.453 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.453 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:36:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:36:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.457 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2af53b80-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.457 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2af53b80-50, col_values=(('external_ids', {'iface-id': '2af53b80-5072-4407-80f0-88120c0351f7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:3b:23', 'vm-uuid': 'be1174bf-d7e1-4801-a2eb-67020632d637'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:44 np0005465988 NetworkManager[45041]: <info>  [1759408604.4607] manager: (tap2af53b80-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.470 2 INFO os_vif [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3b:23,bridge_name='br-int',has_traffic_filtering=True,id=2af53b80-5072-4407-80f0-88120c0351f7,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2af53b80-50')#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.569 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.570 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.570 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No VIF found with MAC fa:16:3e:c9:3b:23, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.570 2 INFO nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Using config drive#033[00m
Oct  2 08:36:44 np0005465988 nova_compute[236126]: 2025-10-02 12:36:44.598 2 DEBUG nova.storage.rbd_utils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image be1174bf-d7e1-4801-a2eb-67020632d637_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:36:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:44.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:46.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:46 np0005465988 nova_compute[236126]: 2025-10-02 12:36:46.274 2 INFO nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Creating config drive at /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/disk.config#033[00m
Oct  2 08:36:46 np0005465988 nova_compute[236126]: 2025-10-02 12:36:46.286 2 DEBUG oslo_concurrency.processutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3tco17hk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:46 np0005465988 nova_compute[236126]: 2025-10-02 12:36:46.433 2 DEBUG oslo_concurrency.processutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3tco17hk" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:46 np0005465988 nova_compute[236126]: 2025-10-02 12:36:46.467 2 DEBUG nova.storage.rbd_utils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image be1174bf-d7e1-4801-a2eb-67020632d637_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:36:46 np0005465988 nova_compute[236126]: 2025-10-02 12:36:46.471 2 DEBUG oslo_concurrency.processutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/disk.config be1174bf-d7e1-4801-a2eb-67020632d637_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:46 np0005465988 nova_compute[236126]: 2025-10-02 12:36:46.745 2 DEBUG oslo_concurrency.processutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/disk.config be1174bf-d7e1-4801-a2eb-67020632d637_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.273s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:46 np0005465988 nova_compute[236126]: 2025-10-02 12:36:46.746 2 INFO nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Deleting local config drive /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/disk.config because it was imported into RBD.#033[00m
Oct  2 08:36:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:46.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:46 np0005465988 kernel: tap2af53b80-50: entered promiscuous mode
Oct  2 08:36:46 np0005465988 NetworkManager[45041]: <info>  [1759408606.8251] manager: (tap2af53b80-50): new Tun device (/org/freedesktop/NetworkManager/Devices/289)
Oct  2 08:36:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:46Z|00643|binding|INFO|Claiming lport 2af53b80-5072-4407-80f0-88120c0351f7 for this chassis.
Oct  2 08:36:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:46Z|00644|binding|INFO|2af53b80-5072-4407-80f0-88120c0351f7: Claiming fa:16:3e:c9:3b:23 10.100.0.3
Oct  2 08:36:46 np0005465988 nova_compute[236126]: 2025-10-02 12:36:46.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:46.841 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:3b:23 10.100.0.3'], port_security=['fa:16:3e:c9:3b:23 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'be1174bf-d7e1-4801-a2eb-67020632d637', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a05e525420b4aa8adcc9561158e73d1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '13f3412c-42ad-420a-aa32-1b2881f511f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=709db70f-1209-49b9-bf90-2b91d986925d, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=2af53b80-5072-4407-80f0-88120c0351f7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:46.842 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 2af53b80-5072-4407-80f0-88120c0351f7 in datapath 7b216831-24ac-41f0-ac1c-99aae9bc897b bound to our chassis#033[00m
Oct  2 08:36:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:46.844 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b216831-24ac-41f0-ac1c-99aae9bc897b#033[00m
Oct  2 08:36:46 np0005465988 systemd-udevd[299362]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:36:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:46.858 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b56c2132-75cd-465a-8806-f9eea3f39a94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:46.859 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b216831-21 in ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:36:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:46.861 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b216831-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:36:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:46.861 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8fce3907-ef68-4650-9b22-075275d15059]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:46.862 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9e5dd67c-2bc1-46e6-a321-7456b02947bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:46Z|00645|binding|INFO|Setting lport 2af53b80-5072-4407-80f0-88120c0351f7 ovn-installed in OVS
Oct  2 08:36:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:46Z|00646|binding|INFO|Setting lport 2af53b80-5072-4407-80f0-88120c0351f7 up in Southbound
Oct  2 08:36:46 np0005465988 nova_compute[236126]: 2025-10-02 12:36:46.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:46 np0005465988 NetworkManager[45041]: <info>  [1759408606.8712] device (tap2af53b80-50): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:36:46 np0005465988 NetworkManager[45041]: <info>  [1759408606.8724] device (tap2af53b80-50): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:36:46 np0005465988 nova_compute[236126]: 2025-10-02 12:36:46.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:46.879 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[1d7f0f1b-2029-4522-afd3-2f587b1c376a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:46 np0005465988 systemd-machined[192594]: New machine qemu-64-instance-0000008e.
Oct  2 08:36:46 np0005465988 systemd[1]: Started Virtual Machine qemu-64-instance-0000008e.
Oct  2 08:36:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:46.897 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[62307ebb-1d6d-4a3c-9141-8697e2b2efd8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:46.938 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[fd5f836d-bf79-4be0-af3a-e711907d6904]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:46.945 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ea4005a8-6c5d-4b16-89e0-6ef61e9ce329]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:46 np0005465988 NetworkManager[45041]: <info>  [1759408606.9477] manager: (tap7b216831-20): new Veth device (/org/freedesktop/NetworkManager/Devices/290)
Oct  2 08:36:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:46.983 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b6e8a3-3da8-4a6e-a9a0-6185d150c29e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:46.987 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[bc2a7fa1-b709-4655-b80c-b2a926370d90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:47 np0005465988 NetworkManager[45041]: <info>  [1759408607.0147] device (tap7b216831-20): carrier: link connected
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:47.021 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2305133e-3898-47dc-9ea5-3bfc5628adf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:47.038 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[82433b85-3f14-4276-b35d-5bb83b6d8de4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b216831-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:a4:15'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 666734, 'reachable_time': 41253, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 299398, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:47.058 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[617db2f0-f0d0-4f9b-99ba-953ca3fdf43d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:a415'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 666734, 'tstamp': 666734}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 299399, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:47.079 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[45910723-2017-47fb-9a0c-dc14765aadc9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b216831-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:a4:15'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 666734, 'reachable_time': 41253, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 299400, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:47.119 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4be643-b18c-454c-99aa-f4c47f684861]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:47.195 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3788aa22-a643-4180-b6de-830a828d0e3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:47.197 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b216831-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:47.198 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:47.199 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b216831-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:47 np0005465988 kernel: tap7b216831-20: entered promiscuous mode
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:47 np0005465988 NetworkManager[45041]: <info>  [1759408607.2047] manager: (tap7b216831-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:47.206 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b216831-20, col_values=(('external_ids', {'iface-id': '7b6901ce-64cc-402d-847e-45c0d79bbb3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:47Z|00647|binding|INFO|Releasing lport 7b6901ce-64cc-402d-847e-45c0d79bbb3b from this chassis (sb_readonly=0)
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:47.226 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b216831-24ac-41f0-ac1c-99aae9bc897b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b216831-24ac-41f0-ac1c-99aae9bc897b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:47.227 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ca4794-6ced-4042-9326-d7b8a4ad5095]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:47.228 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-7b216831-24ac-41f0-ac1c-99aae9bc897b
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/7b216831-24ac-41f0-ac1c-99aae9bc897b.pid.haproxy
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 7b216831-24ac-41f0-ac1c-99aae9bc897b
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:36:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:47.229 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'env', 'PROCESS_TAG=haproxy-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b216831-24ac-41f0-ac1c-99aae9bc897b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.345 2 DEBUG nova.network.neutron [req-396b4ce2-381d-4e71-b00b-11da5ae419e4 req-dd566772-a67d-4b9c-9d77-7cf25da0f83c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updated VIF entry in instance network info cache for port 2af53b80-5072-4407-80f0-88120c0351f7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.346 2 DEBUG nova.network.neutron [req-396b4ce2-381d-4e71-b00b-11da5ae419e4 req-dd566772-a67d-4b9c-9d77-7cf25da0f83c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updating instance_info_cache with network_info: [{"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.373 2 DEBUG oslo_concurrency.lockutils [req-396b4ce2-381d-4e71-b00b-11da5ae419e4 req-dd566772-a67d-4b9c-9d77-7cf25da0f83c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.688 2 DEBUG nova.compute.manager [req-1ba29881-a2ac-45cd-ad3e-2bc09dec4cca req-f2b92364-1aea-407e-8b89-4689f654b6e5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received event network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.689 2 DEBUG oslo_concurrency.lockutils [req-1ba29881-a2ac-45cd-ad3e-2bc09dec4cca req-f2b92364-1aea-407e-8b89-4689f654b6e5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.689 2 DEBUG oslo_concurrency.lockutils [req-1ba29881-a2ac-45cd-ad3e-2bc09dec4cca req-f2b92364-1aea-407e-8b89-4689f654b6e5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.689 2 DEBUG oslo_concurrency.lockutils [req-1ba29881-a2ac-45cd-ad3e-2bc09dec4cca req-f2b92364-1aea-407e-8b89-4689f654b6e5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.689 2 DEBUG nova.compute.manager [req-1ba29881-a2ac-45cd-ad3e-2bc09dec4cca req-f2b92364-1aea-407e-8b89-4689f654b6e5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Processing event network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:36:47 np0005465988 podman[299474]: 2025-10-02 12:36:47.599671186 +0000 UTC m=+0.026997637 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:36:47 np0005465988 podman[299474]: 2025-10-02 12:36:47.780636241 +0000 UTC m=+0.207962672 container create bb0abf89c391c08fd42c8e6c20f703b643afbfcb70e4e6d27001c12c97b49d8a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:36:47 np0005465988 systemd[1]: Started libpod-conmon-bb0abf89c391c08fd42c8e6c20f703b643afbfcb70e4e6d27001c12c97b49d8a.scope.
Oct  2 08:36:47 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:36:47 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1532f05480f538890b0892213430a6d5775035e09f2b892f6205899a9b2521d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.985 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408607.9842713, be1174bf-d7e1-4801-a2eb-67020632d637 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.986 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] VM Started (Lifecycle Event)#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.989 2 DEBUG nova.compute.manager [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.993 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.997 2 INFO nova.virt.libvirt.driver [-] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Instance spawned successfully.#033[00m
Oct  2 08:36:47 np0005465988 nova_compute[236126]: 2025-10-02 12:36:47.997 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.004 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.008 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.020 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.021 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.021 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.021 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.022 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.022 2 DEBUG nova.virt.libvirt.driver [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.027 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.028 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408607.9843996, be1174bf-d7e1-4801-a2eb-67020632d637 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.028 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.062 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.066 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408607.991301, be1174bf-d7e1-4801-a2eb-67020632d637 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.066 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.096 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.100 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:36:48 np0005465988 podman[299474]: 2025-10-02 12:36:48.111851318 +0000 UTC m=+0.539177759 container init bb0abf89c391c08fd42c8e6c20f703b643afbfcb70e4e6d27001c12c97b49d8a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.111 2 INFO nova.compute.manager [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Took 7.96 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.112 2 DEBUG nova.compute.manager [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:48 np0005465988 podman[299474]: 2025-10-02 12:36:48.119402995 +0000 UTC m=+0.546729426 container start bb0abf89c391c08fd42c8e6c20f703b643afbfcb70e4e6d27001c12c97b49d8a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.120 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:36:48 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[299489]: [NOTICE]   (299493) : New worker (299495) forked
Oct  2 08:36:48 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[299489]: [NOTICE]   (299493) : Loading success.
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.178 2 INFO nova.compute.manager [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Took 8.99 seconds to build instance.#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.198 2 DEBUG oslo_concurrency.lockutils [None req-52fec356-3787-4008-928c-4487aa4179eb bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:36:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:48.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:36:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:36:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:36:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:48.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.891 2 DEBUG oslo_concurrency.lockutils [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "974fea45-f024-430a-bdbb-a615e05d954c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.892 2 DEBUG oslo_concurrency.lockutils [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.892 2 DEBUG oslo_concurrency.lockutils [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "974fea45-f024-430a-bdbb-a615e05d954c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.892 2 DEBUG oslo_concurrency.lockutils [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.893 2 DEBUG oslo_concurrency.lockutils [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.894 2 INFO nova.compute.manager [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Terminating instance#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.895 2 DEBUG nova.compute.manager [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:36:48 np0005465988 kernel: tap91400667-e1 (unregistering): left promiscuous mode
Oct  2 08:36:48 np0005465988 NetworkManager[45041]: <info>  [1759408608.9579] device (tap91400667-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:36:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:48Z|00648|binding|INFO|Releasing lport 91400667-e168-40d0-8f0a-ffc8c9dd7fa4 from this chassis (sb_readonly=0)
Oct  2 08:36:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:48Z|00649|binding|INFO|Setting lport 91400667-e168-40d0-8f0a-ffc8c9dd7fa4 down in Southbound
Oct  2 08:36:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:48Z|00650|binding|INFO|Removing iface tap91400667-e1 ovn-installed in OVS
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:48.976 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:df:98 10.100.0.8'], port_security=['fa:16:3e:95:df:98 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '974fea45-f024-430a-bdbb-a615e05d954c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc0d63d3b4404ef8858166e8836dd0af', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2a11ff87-bec6-4638-b302-adcd655efba9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=797b6af2-473b-4626-9e97-a0a489119419, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=91400667-e168-40d0-8f0a-ffc8c9dd7fa4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:48.978 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 91400667-e168-40d0-8f0a-ffc8c9dd7fa4 in datapath d7203b00-e5e4-402e-b777-ac6280fa23ac unbound from our chassis#033[00m
Oct  2 08:36:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:48.980 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7203b00-e5e4-402e-b777-ac6280fa23ac, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:36:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:48.981 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[03113967-2f42-4918-9776-42ef0c02a673]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:48.985 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac namespace which is not needed anymore#033[00m
Oct  2 08:36:48 np0005465988 nova_compute[236126]: 2025-10-02 12:36:48.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:49 np0005465988 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000008b.scope: Deactivated successfully.
Oct  2 08:36:49 np0005465988 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000008b.scope: Consumed 15.105s CPU time.
Oct  2 08:36:49 np0005465988 systemd-machined[192594]: Machine qemu-63-instance-0000008b terminated.
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:49 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[297027]: [NOTICE]   (297031) : haproxy version is 2.8.14-c23fe91
Oct  2 08:36:49 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[297027]: [NOTICE]   (297031) : path to executable is /usr/sbin/haproxy
Oct  2 08:36:49 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[297027]: [WARNING]  (297031) : Exiting Master process...
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.138 2 INFO nova.virt.libvirt.driver [-] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Instance destroyed successfully.#033[00m
Oct  2 08:36:49 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[297027]: [ALERT]    (297031) : Current worker (297033) exited with code 143 (Terminated)
Oct  2 08:36:49 np0005465988 neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac[297027]: [WARNING]  (297031) : All workers exited. Exiting... (0)
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.139 2 DEBUG nova.objects.instance [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lazy-loading 'resources' on Instance uuid 974fea45-f024-430a-bdbb-a615e05d954c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:49 np0005465988 systemd[1]: libpod-d8e2ffc4b8aa694f7d1a87d26be081249e4c388371daadc440adc4be2592fef6.scope: Deactivated successfully.
Oct  2 08:36:49 np0005465988 podman[299576]: 2025-10-02 12:36:49.148814973 +0000 UTC m=+0.057787593 container died d8e2ffc4b8aa694f7d1a87d26be081249e4c388371daadc440adc4be2592fef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.156 2 DEBUG nova.virt.libvirt.vif [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:35:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1334211739',display_name='tempest-ServersTestJSON-server-1334211739',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1334211739',id=139,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:08Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bc0d63d3b4404ef8858166e8836dd0af',ramdisk_id='',reservation_id='r-2fd6gv8p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min
_ram='0',owner_project_name='tempest-ServersTestJSON-80077074',owner_user_name='tempest-ServersTestJSON-80077074-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:36:08Z,user_data=None,user_id='fe9cc788734f406d826446a848700331',uuid=974fea45-f024-430a-bdbb-a615e05d954c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "address": "fa:16:3e:95:df:98", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91400667-e1", "ovs_interfaceid": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.157 2 DEBUG nova.network.os_vif_util [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converting VIF {"id": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "address": "fa:16:3e:95:df:98", "network": {"id": "d7203b00-e5e4-402e-b777-ac6280fa23ac", "bridge": "br-int", "label": "tempest-ServersTestJSON-1524378232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc0d63d3b4404ef8858166e8836dd0af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91400667-e1", "ovs_interfaceid": "91400667-e168-40d0-8f0a-ffc8c9dd7fa4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.158 2 DEBUG nova.network.os_vif_util [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:95:df:98,bridge_name='br-int',has_traffic_filtering=True,id=91400667-e168-40d0-8f0a-ffc8c9dd7fa4,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91400667-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.158 2 DEBUG os_vif [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:df:98,bridge_name='br-int',has_traffic_filtering=True,id=91400667-e168-40d0-8f0a-ffc8c9dd7fa4,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91400667-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.167 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91400667-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.175 2 INFO os_vif [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:df:98,bridge_name='br-int',has_traffic_filtering=True,id=91400667-e168-40d0-8f0a-ffc8c9dd7fa4,network=Network(d7203b00-e5e4-402e-b777-ac6280fa23ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91400667-e1')#033[00m
Oct  2 08:36:49 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d8e2ffc4b8aa694f7d1a87d26be081249e4c388371daadc440adc4be2592fef6-userdata-shm.mount: Deactivated successfully.
Oct  2 08:36:49 np0005465988 systemd[1]: var-lib-containers-storage-overlay-ec5c323268d4993353a6a72de6b0e9ad592189fa974d6b70b2cbb12938ba1852-merged.mount: Deactivated successfully.
Oct  2 08:36:49 np0005465988 podman[299576]: 2025-10-02 12:36:49.214483042 +0000 UTC m=+0.123455662 container cleanup d8e2ffc4b8aa694f7d1a87d26be081249e4c388371daadc440adc4be2592fef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:36:49 np0005465988 systemd[1]: libpod-conmon-d8e2ffc4b8aa694f7d1a87d26be081249e4c388371daadc440adc4be2592fef6.scope: Deactivated successfully.
Oct  2 08:36:49 np0005465988 podman[299636]: 2025-10-02 12:36:49.299980181 +0000 UTC m=+0.059668427 container remove d8e2ffc4b8aa694f7d1a87d26be081249e4c388371daadc440adc4be2592fef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:36:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:49.310 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b2c586f6-c45a-444f-a196-07dc9c707488]: (4, ('Thu Oct  2 12:36:49 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac (d8e2ffc4b8aa694f7d1a87d26be081249e4c388371daadc440adc4be2592fef6)\nd8e2ffc4b8aa694f7d1a87d26be081249e4c388371daadc440adc4be2592fef6\nThu Oct  2 12:36:49 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac (d8e2ffc4b8aa694f7d1a87d26be081249e4c388371daadc440adc4be2592fef6)\nd8e2ffc4b8aa694f7d1a87d26be081249e4c388371daadc440adc4be2592fef6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:49.312 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[64d432e4-12d1-4a2d-958a-7dad201eedc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:49.314 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7203b00-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:49 np0005465988 kernel: tapd7203b00-e0: left promiscuous mode
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:49.324 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[92e30c07-fd1d-4657-9620-32801a64e8f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:49.359 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4a6d2db3-75b2-41a5-bbb4-27d9d69ffc94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:49.361 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5a2352fc-13a6-4e48-989c-e6eecc191ac5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:49.378 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cdead880-364d-477d-938c-1388ad2e2448]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662611, 'reachable_time': 20478, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 299651, 'error': None, 'target': 'ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:49.381 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7203b00-e5e4-402e-b777-ac6280fa23ac deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:36:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:36:49.381 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e4b15b-7c1c-4352-bbf5-18ee6b497025]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:49 np0005465988 systemd[1]: run-netns-ovnmeta\x2dd7203b00\x2de5e4\x2d402e\x2db777\x2dac6280fa23ac.mount: Deactivated successfully.
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.766 2 DEBUG nova.compute.manager [req-b5d9c8e9-e965-4bc6-8ded-0636c3094fcd req-d903b01a-8c8e-4d0c-b106-fe43532ccb31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received event network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.766 2 DEBUG oslo_concurrency.lockutils [req-b5d9c8e9-e965-4bc6-8ded-0636c3094fcd req-d903b01a-8c8e-4d0c-b106-fe43532ccb31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.767 2 DEBUG oslo_concurrency.lockutils [req-b5d9c8e9-e965-4bc6-8ded-0636c3094fcd req-d903b01a-8c8e-4d0c-b106-fe43532ccb31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.767 2 DEBUG oslo_concurrency.lockutils [req-b5d9c8e9-e965-4bc6-8ded-0636c3094fcd req-d903b01a-8c8e-4d0c-b106-fe43532ccb31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.767 2 DEBUG nova.compute.manager [req-b5d9c8e9-e965-4bc6-8ded-0636c3094fcd req-d903b01a-8c8e-4d0c-b106-fe43532ccb31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] No waiting events found dispatching network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.767 2 WARNING nova.compute.manager [req-b5d9c8e9-e965-4bc6-8ded-0636c3094fcd req-d903b01a-8c8e-4d0c-b106-fe43532ccb31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received unexpected event network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.767 2 DEBUG nova.compute.manager [req-b5d9c8e9-e965-4bc6-8ded-0636c3094fcd req-d903b01a-8c8e-4d0c-b106-fe43532ccb31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Received event network-vif-unplugged-91400667-e168-40d0-8f0a-ffc8c9dd7fa4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.767 2 DEBUG oslo_concurrency.lockutils [req-b5d9c8e9-e965-4bc6-8ded-0636c3094fcd req-d903b01a-8c8e-4d0c-b106-fe43532ccb31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "974fea45-f024-430a-bdbb-a615e05d954c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.768 2 DEBUG oslo_concurrency.lockutils [req-b5d9c8e9-e965-4bc6-8ded-0636c3094fcd req-d903b01a-8c8e-4d0c-b106-fe43532ccb31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.768 2 DEBUG oslo_concurrency.lockutils [req-b5d9c8e9-e965-4bc6-8ded-0636c3094fcd req-d903b01a-8c8e-4d0c-b106-fe43532ccb31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.768 2 DEBUG nova.compute.manager [req-b5d9c8e9-e965-4bc6-8ded-0636c3094fcd req-d903b01a-8c8e-4d0c-b106-fe43532ccb31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] No waiting events found dispatching network-vif-unplugged-91400667-e168-40d0-8f0a-ffc8c9dd7fa4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.768 2 DEBUG nova.compute.manager [req-b5d9c8e9-e965-4bc6-8ded-0636c3094fcd req-d903b01a-8c8e-4d0c-b106-fe43532ccb31 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Received event network-vif-unplugged-91400667-e168-40d0-8f0a-ffc8c9dd7fa4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.990 2 INFO nova.virt.libvirt.driver [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Deleting instance files /var/lib/nova/instances/974fea45-f024-430a-bdbb-a615e05d954c_del#033[00m
Oct  2 08:36:49 np0005465988 nova_compute[236126]: 2025-10-02 12:36:49.991 2 INFO nova.virt.libvirt.driver [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Deletion of /var/lib/nova/instances/974fea45-f024-430a-bdbb-a615e05d954c_del complete#033[00m
Oct  2 08:36:50 np0005465988 nova_compute[236126]: 2025-10-02 12:36:50.140 2 INFO nova.compute.manager [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Took 1.24 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:36:50 np0005465988 nova_compute[236126]: 2025-10-02 12:36:50.141 2 DEBUG oslo.service.loopingcall [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:36:50 np0005465988 nova_compute[236126]: 2025-10-02 12:36:50.141 2 DEBUG nova.compute.manager [-] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:36:50 np0005465988 nova_compute[236126]: 2025-10-02 12:36:50.141 2 DEBUG nova.network.neutron [-] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:36:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:50.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:50.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:51 np0005465988 nova_compute[236126]: 2025-10-02 12:36:51.288 2 DEBUG nova.network.neutron [-] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:51 np0005465988 nova_compute[236126]: 2025-10-02 12:36:51.550 2 INFO nova.compute.manager [-] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Took 1.41 seconds to deallocate network for instance.#033[00m
Oct  2 08:36:51 np0005465988 nova_compute[236126]: 2025-10-02 12:36:51.607 2 DEBUG oslo_concurrency.lockutils [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:51 np0005465988 nova_compute[236126]: 2025-10-02 12:36:51.607 2 DEBUG oslo_concurrency.lockutils [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:51 np0005465988 nova_compute[236126]: 2025-10-02 12:36:51.673 2 DEBUG oslo_concurrency.processutils [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:51 np0005465988 nova_compute[236126]: 2025-10-02 12:36:51.867 2 DEBUG nova.compute.manager [req-f3429c61-8e56-4804-a131-bb6fb45e6ac7 req-6b2bfa14-4b0d-4d91-a0ff-3c706353abcc d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Received event network-vif-plugged-91400667-e168-40d0-8f0a-ffc8c9dd7fa4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:51 np0005465988 nova_compute[236126]: 2025-10-02 12:36:51.868 2 DEBUG oslo_concurrency.lockutils [req-f3429c61-8e56-4804-a131-bb6fb45e6ac7 req-6b2bfa14-4b0d-4d91-a0ff-3c706353abcc d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "974fea45-f024-430a-bdbb-a615e05d954c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:51 np0005465988 nova_compute[236126]: 2025-10-02 12:36:51.869 2 DEBUG oslo_concurrency.lockutils [req-f3429c61-8e56-4804-a131-bb6fb45e6ac7 req-6b2bfa14-4b0d-4d91-a0ff-3c706353abcc d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:51 np0005465988 nova_compute[236126]: 2025-10-02 12:36:51.869 2 DEBUG oslo_concurrency.lockutils [req-f3429c61-8e56-4804-a131-bb6fb45e6ac7 req-6b2bfa14-4b0d-4d91-a0ff-3c706353abcc d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:51 np0005465988 nova_compute[236126]: 2025-10-02 12:36:51.869 2 DEBUG nova.compute.manager [req-f3429c61-8e56-4804-a131-bb6fb45e6ac7 req-6b2bfa14-4b0d-4d91-a0ff-3c706353abcc d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] No waiting events found dispatching network-vif-plugged-91400667-e168-40d0-8f0a-ffc8c9dd7fa4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:51 np0005465988 nova_compute[236126]: 2025-10-02 12:36:51.869 2 WARNING nova.compute.manager [req-f3429c61-8e56-4804-a131-bb6fb45e6ac7 req-6b2bfa14-4b0d-4d91-a0ff-3c706353abcc d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Received unexpected event network-vif-plugged-91400667-e168-40d0-8f0a-ffc8c9dd7fa4 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:36:51 np0005465988 nova_compute[236126]: 2025-10-02 12:36:51.870 2 DEBUG nova.compute.manager [req-f3429c61-8e56-4804-a131-bb6fb45e6ac7 req-6b2bfa14-4b0d-4d91-a0ff-3c706353abcc d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Received event network-vif-deleted-91400667-e168-40d0-8f0a-ffc8c9dd7fa4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:36:52 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2831796320' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:36:52 np0005465988 nova_compute[236126]: 2025-10-02 12:36:52.166 2 DEBUG oslo_concurrency.processutils [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:52 np0005465988 nova_compute[236126]: 2025-10-02 12:36:52.171 2 DEBUG nova.compute.provider_tree [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:36:52 np0005465988 nova_compute[236126]: 2025-10-02 12:36:52.198 2 DEBUG nova.scheduler.client.report [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:36:52 np0005465988 nova_compute[236126]: 2025-10-02 12:36:52.225 2 DEBUG oslo_concurrency.lockutils [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:52 np0005465988 nova_compute[236126]: 2025-10-02 12:36:52.263 2 INFO nova.scheduler.client.report [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Deleted allocations for instance 974fea45-f024-430a-bdbb-a615e05d954c#033[00m
Oct  2 08:36:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:52.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:52 np0005465988 nova_compute[236126]: 2025-10-02 12:36:52.358 2 DEBUG oslo_concurrency.lockutils [None req-ec1fc58d-4774-4e21-b6c8-80849b48772b fe9cc788734f406d826446a848700331 bc0d63d3b4404ef8858166e8836dd0af - - default default] Lock "974fea45-f024-430a-bdbb-a615e05d954c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.466s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:52 np0005465988 nova_compute[236126]: 2025-10-02 12:36:52.626 2 DEBUG nova.compute.manager [req-55f1c07e-70c5-466b-b1f3-62d0ff624486 req-b9b485a2-e61d-44d6-9872-58c6d360e9ff d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received event network-changed-2af53b80-5072-4407-80f0-88120c0351f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:52 np0005465988 nova_compute[236126]: 2025-10-02 12:36:52.626 2 DEBUG nova.compute.manager [req-55f1c07e-70c5-466b-b1f3-62d0ff624486 req-b9b485a2-e61d-44d6-9872-58c6d360e9ff d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Refreshing instance network info cache due to event network-changed-2af53b80-5072-4407-80f0-88120c0351f7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:36:52 np0005465988 nova_compute[236126]: 2025-10-02 12:36:52.627 2 DEBUG oslo_concurrency.lockutils [req-55f1c07e-70c5-466b-b1f3-62d0ff624486 req-b9b485a2-e61d-44d6-9872-58c6d360e9ff d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:36:52 np0005465988 nova_compute[236126]: 2025-10-02 12:36:52.627 2 DEBUG oslo_concurrency.lockutils [req-55f1c07e-70c5-466b-b1f3-62d0ff624486 req-b9b485a2-e61d-44d6-9872-58c6d360e9ff d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:36:52 np0005465988 nova_compute[236126]: 2025-10-02 12:36:52.627 2 DEBUG nova.network.neutron [req-55f1c07e-70c5-466b-b1f3-62d0ff624486 req-b9b485a2-e61d-44d6-9872-58c6d360e9ff d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Refreshing network info cache for port 2af53b80-5072-4407-80f0-88120c0351f7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:36:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:52.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:36:54 np0005465988 nova_compute[236126]: 2025-10-02 12:36:54.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:54 np0005465988 nova_compute[236126]: 2025-10-02 12:36:54.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:36:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:54.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:36:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:54.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:55 np0005465988 nova_compute[236126]: 2025-10-02 12:36:55.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:56 np0005465988 nova_compute[236126]: 2025-10-02 12:36:55.997 2 DEBUG nova.network.neutron [req-55f1c07e-70c5-466b-b1f3-62d0ff624486 req-b9b485a2-e61d-44d6-9872-58c6d360e9ff d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updated VIF entry in instance network info cache for port 2af53b80-5072-4407-80f0-88120c0351f7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:36:56 np0005465988 nova_compute[236126]: 2025-10-02 12:36:55.998 2 DEBUG nova.network.neutron [req-55f1c07e-70c5-466b-b1f3-62d0ff624486 req-b9b485a2-e61d-44d6-9872-58c6d360e9ff d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updating instance_info_cache with network_info: [{"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:56 np0005465988 nova_compute[236126]: 2025-10-02 12:36:56.028 2 DEBUG oslo_concurrency.lockutils [req-55f1c07e-70c5-466b-b1f3-62d0ff624486 req-b9b485a2-e61d-44d6-9872-58c6d360e9ff d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:56.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:56.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:57 np0005465988 nova_compute[236126]: 2025-10-02 12:36:57.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:36:58Z|00651|binding|INFO|Releasing lport 7b6901ce-64cc-402d-847e-45c0d79bbb3b from this chassis (sb_readonly=0)
Oct  2 08:36:58 np0005465988 nova_compute[236126]: 2025-10-02 12:36:58.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:36:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:36:58.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:36:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:36:58 np0005465988 nova_compute[236126]: 2025-10-02 12:36:58.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:58 np0005465988 nova_compute[236126]: 2025-10-02 12:36:58.499 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:58 np0005465988 nova_compute[236126]: 2025-10-02 12:36:58.500 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:58 np0005465988 nova_compute[236126]: 2025-10-02 12:36:58.500 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:58 np0005465988 nova_compute[236126]: 2025-10-02 12:36:58.500 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:36:58 np0005465988 nova_compute[236126]: 2025-10-02 12:36:58.500 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:58 np0005465988 podman[299680]: 2025-10-02 12:36:58.534188193 +0000 UTC m=+0.073548906 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:36:58 np0005465988 podman[299679]: 2025-10-02 12:36:58.569308583 +0000 UTC m=+0.108119261 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001)
Oct  2 08:36:58 np0005465988 podman[299681]: 2025-10-02 12:36:58.589064871 +0000 UTC m=+0.115382359 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:36:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:36:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:36:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:36:58.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:36:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:36:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4287727608' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.001 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.102 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000008e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.102 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000008e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.319 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.321 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4054MB free_disk=20.900936126708984GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.322 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.322 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.426 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance be1174bf-d7e1-4801-a2eb-67020632d637 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.426 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.426 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.470 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:36:59 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4056615805' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.930 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.936 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.953 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.974 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:36:59 np0005465988 nova_compute[236126]: 2025-10-02 12:36:59.974 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:00.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:00.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:01 np0005465988 nova_compute[236126]: 2025-10-02 12:37:01.975 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:02.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:02.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:03 np0005465988 nova_compute[236126]: 2025-10-02 12:37:03.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:03 np0005465988 nova_compute[236126]: 2025-10-02 12:37:03.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:37:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:37:04 np0005465988 nova_compute[236126]: 2025-10-02 12:37:04.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:04 np0005465988 nova_compute[236126]: 2025-10-02 12:37:04.133 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408609.1312785, 974fea45-f024-430a-bdbb-a615e05d954c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:37:04 np0005465988 nova_compute[236126]: 2025-10-02 12:37:04.134 2 INFO nova.compute.manager [-] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:37:04 np0005465988 nova_compute[236126]: 2025-10-02 12:37:04.158 2 DEBUG nova.compute.manager [None req-3f403f70-c7b7-4620-81cd-6f39ee39532b - - - - - -] [instance: 974fea45-f024-430a-bdbb-a615e05d954c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:04 np0005465988 nova_compute[236126]: 2025-10-02 12:37:04.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:04.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:04 np0005465988 nova_compute[236126]: 2025-10-02 12:37:04.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:37:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:04.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:37:05 np0005465988 nova_compute[236126]: 2025-10-02 12:37:05.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:06 np0005465988 nova_compute[236126]: 2025-10-02 12:37:06.040 2 DEBUG oslo_concurrency.lockutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "c9f6d037-d76c-47e5-b6c4-9b36ef31f934" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:06 np0005465988 nova_compute[236126]: 2025-10-02 12:37:06.041 2 DEBUG oslo_concurrency.lockutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "c9f6d037-d76c-47e5-b6c4-9b36ef31f934" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:06.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:06 np0005465988 ovn_controller[132601]: 2025-10-02T12:37:06Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c9:3b:23 10.100.0.3
Oct  2 08:37:06 np0005465988 ovn_controller[132601]: 2025-10-02T12:37:06Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c9:3b:23 10.100.0.3
Oct  2 08:37:06 np0005465988 nova_compute[236126]: 2025-10-02 12:37:06.400 2 DEBUG nova.compute.manager [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:37:06 np0005465988 nova_compute[236126]: 2025-10-02 12:37:06.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:06 np0005465988 nova_compute[236126]: 2025-10-02 12:37:06.500 2 DEBUG oslo_concurrency.lockutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:06 np0005465988 nova_compute[236126]: 2025-10-02 12:37:06.500 2 DEBUG oslo_concurrency.lockutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:06 np0005465988 nova_compute[236126]: 2025-10-02 12:37:06.506 2 DEBUG nova.virt.hardware [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:37:06 np0005465988 nova_compute[236126]: 2025-10-02 12:37:06.506 2 INFO nova.compute.claims [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:37:06 np0005465988 nova_compute[236126]: 2025-10-02 12:37:06.652 2 DEBUG oslo_concurrency.processutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:06.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:37:07 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1727174770' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.074 2 DEBUG oslo_concurrency.processutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.079 2 DEBUG nova.compute.provider_tree [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.093 2 DEBUG oslo_concurrency.lockutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "63f78a44-925c-43f9-84ec-ba97ed0bceeb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.093 2 DEBUG oslo_concurrency.lockutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "63f78a44-925c-43f9-84ec-ba97ed0bceeb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.110 2 DEBUG nova.scheduler.client.report [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.119 2 DEBUG nova.compute.manager [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.144 2 DEBUG oslo_concurrency.lockutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.146 2 DEBUG nova.compute.manager [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.189 2 DEBUG oslo_concurrency.lockutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.190 2 DEBUG oslo_concurrency.lockutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.194 2 DEBUG nova.compute.manager [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.204 2 DEBUG nova.virt.hardware [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.204 2 INFO nova.compute.claims [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.225 2 INFO nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.249 2 DEBUG nova.compute.manager [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.355 2 DEBUG nova.compute.manager [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.357 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.358 2 INFO nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Creating image(s)#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.397 2 DEBUG nova.storage.rbd_utils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image c9f6d037-d76c-47e5-b6c4-9b36ef31f934_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.442 2 DEBUG nova.storage.rbd_utils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image c9f6d037-d76c-47e5-b6c4-9b36ef31f934_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.488 2 DEBUG nova.storage.rbd_utils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image c9f6d037-d76c-47e5-b6c4-9b36ef31f934_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.494 2 DEBUG oslo_concurrency.processutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.544 2 DEBUG oslo_concurrency.processutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.603 2 DEBUG oslo_concurrency.processutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.109s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.604 2 DEBUG oslo_concurrency.lockutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.605 2 DEBUG oslo_concurrency.lockutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.605 2 DEBUG oslo_concurrency.lockutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.643 2 DEBUG nova.storage.rbd_utils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image c9f6d037-d76c-47e5-b6c4-9b36ef31f934_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:37:07 np0005465988 nova_compute[236126]: 2025-10-02 12:37:07.650 2 DEBUG oslo_concurrency.processutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 c9f6d037-d76c-47e5-b6c4-9b36ef31f934_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.088 2 DEBUG oslo_concurrency.processutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.096 2 DEBUG nova.compute.provider_tree [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.113 2 DEBUG nova.scheduler.client.report [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.134 2 DEBUG oslo_concurrency.lockutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.944s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.135 2 DEBUG nova.compute.manager [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.192 2 DEBUG nova.compute.manager [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.215 2 INFO nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.238 2 DEBUG nova.compute.manager [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:37:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:08.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.321 2 DEBUG nova.compute.manager [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.324 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.325 2 INFO nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Creating image(s)
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.412 2 DEBUG nova.storage.rbd_utils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.451 2 DEBUG nova.storage.rbd_utils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.487 2 DEBUG nova.storage.rbd_utils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.493 2 DEBUG oslo_concurrency.processutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:37:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.597 2 DEBUG oslo_concurrency.processutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.599 2 DEBUG oslo_concurrency.lockutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.601 2 DEBUG oslo_concurrency.lockutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.601 2 DEBUG oslo_concurrency.lockutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.634 2 DEBUG nova.storage.rbd_utils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.638 2 DEBUG oslo_concurrency.processutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.678 2 DEBUG oslo_concurrency.processutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 c9f6d037-d76c-47e5-b6c4-9b36ef31f934_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:37:08 np0005465988 nova_compute[236126]: 2025-10-02 12:37:08.770 2 DEBUG nova.storage.rbd_utils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] resizing rbd image c9f6d037-d76c-47e5-b6c4-9b36ef31f934_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct  2 08:37:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:08.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.080 2 DEBUG oslo_concurrency.processutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.147 2 DEBUG nova.storage.rbd_utils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] resizing rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.238 2 DEBUG nova.objects.instance [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lazy-loading 'migration_context' on Instance uuid 63f78a44-925c-43f9-84ec-ba97ed0bceeb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.273 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.274 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Ensure instance console log exists: /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.274 2 DEBUG oslo_concurrency.lockutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.275 2 DEBUG oslo_concurrency.lockutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.275 2 DEBUG oslo_concurrency.lockutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.277 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.283 2 WARNING nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.288 2 DEBUG nova.virt.libvirt.host [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.289 2 DEBUG nova.virt.libvirt.host [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.292 2 DEBUG nova.virt.libvirt.host [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.293 2 DEBUG nova.virt.libvirt.host [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.294 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.294 2 DEBUG nova.virt.hardware [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.295 2 DEBUG nova.virt.hardware [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.295 2 DEBUG nova.virt.hardware [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.295 2 DEBUG nova.virt.hardware [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.296 2 DEBUG nova.virt.hardware [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.296 2 DEBUG nova.virt.hardware [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.296 2 DEBUG nova.virt.hardware [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.297 2 DEBUG nova.virt.hardware [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.297 2 DEBUG nova.virt.hardware [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.297 2 DEBUG nova.virt.hardware [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.297 2 DEBUG nova.virt.hardware [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.300 2 DEBUG oslo_concurrency.processutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.462 2 DEBUG nova.objects.instance [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lazy-loading 'migration_context' on Instance uuid c9f6d037-d76c-47e5-b6c4-9b36ef31f934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.482 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.483 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Ensure instance console log exists: /var/lib/nova/instances/c9f6d037-d76c-47e5-b6c4-9b36ef31f934/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.483 2 DEBUG oslo_concurrency.lockutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.483 2 DEBUG oslo_concurrency.lockutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.484 2 DEBUG oslo_concurrency.lockutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.485 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.490 2 WARNING nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.494 2 DEBUG nova.virt.libvirt.host [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.494 2 DEBUG nova.virt.libvirt.host [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.498 2 DEBUG nova.virt.libvirt.host [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.498 2 DEBUG nova.virt.libvirt.host [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.499 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.499 2 DEBUG nova.virt.hardware [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.500 2 DEBUG nova.virt.hardware [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.500 2 DEBUG nova.virt.hardware [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.500 2 DEBUG nova.virt.hardware [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.500 2 DEBUG nova.virt.hardware [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.501 2 DEBUG nova.virt.hardware [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.501 2 DEBUG nova.virt.hardware [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.501 2 DEBUG nova.virt.hardware [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.501 2 DEBUG nova.virt.hardware [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.501 2 DEBUG nova.virt.hardware [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.502 2 DEBUG nova.virt.hardware [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.505 2 DEBUG oslo_concurrency.processutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:37:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/346022943' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.829 2 DEBUG oslo_concurrency.processutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.529s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.858 2 DEBUG nova.storage.rbd_utils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.862 2 DEBUG oslo_concurrency.processutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:37:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:37:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3394177219' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.960 2 DEBUG oslo_concurrency.processutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:37:09 np0005465988 nova_compute[236126]: 2025-10-02 12:37:09.995 2 DEBUG nova.storage.rbd_utils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image c9f6d037-d76c-47e5-b6c4-9b36ef31f934_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.000 2 DEBUG oslo_concurrency.processutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:37:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:37:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/615945225' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:37:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:10.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.319 2 DEBUG oslo_concurrency.processutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.320 2 DEBUG nova.objects.instance [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lazy-loading 'pci_devices' on Instance uuid 63f78a44-925c-43f9-84ec-ba97ed0bceeb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.340 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <uuid>63f78a44-925c-43f9-84ec-ba97ed0bceeb</uuid>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <name>instance-00000091</name>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerShowV247Test-server-741150032</nova:name>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:37:09</nova:creationTime>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <nova:user uuid="f7d78f04152a425b81d486a834213a76">tempest-ServerShowV247Test-377191045-project-member</nova:user>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <nova:project uuid="af873f6cea354e099a97c3b09ce8ca27">tempest-ServerShowV247Test-377191045</nova:project>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <nova:ports/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <entry name="serial">63f78a44-925c-43f9-84ec-ba97ed0bceeb</entry>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <entry name="uuid">63f78a44-925c-43f9-84ec-ba97ed0bceeb</entry>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk.config">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/console.log" append="off"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:37:10 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:37:10 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.400 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.401 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.401 2 INFO nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Using config drive
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.433 2 DEBUG nova.storage.rbd_utils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:37:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:37:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4168009933' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.785 2 DEBUG oslo_concurrency.processutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.786s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.786 2 DEBUG nova.objects.instance [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lazy-loading 'pci_devices' on Instance uuid c9f6d037-d76c-47e5-b6c4-9b36ef31f934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.804 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <uuid>c9f6d037-d76c-47e5-b6c4-9b36ef31f934</uuid>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <name>instance-00000090</name>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerShowV247Test-server-1655457864</nova:name>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:37:09</nova:creationTime>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <nova:user uuid="f7d78f04152a425b81d486a834213a76">tempest-ServerShowV247Test-377191045-project-member</nova:user>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <nova:project uuid="af873f6cea354e099a97c3b09ce8ca27">tempest-ServerShowV247Test-377191045</nova:project>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <nova:ports/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <entry name="serial">c9f6d037-d76c-47e5-b6c4-9b36ef31f934</entry>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <entry name="uuid">c9f6d037-d76c-47e5-b6c4-9b36ef31f934</entry>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/c9f6d037-d76c-47e5-b6c4-9b36ef31f934_disk">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/c9f6d037-d76c-47e5-b6c4-9b36ef31f934_disk.config">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/c9f6d037-d76c-47e5-b6c4-9b36ef31f934/console.log" append="off"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:37:10 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:37:10 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:37:10 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:37:10 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:37:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:10.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.856 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.856 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.858 2 INFO nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Using config drive#033[00m
Oct  2 08:37:10 np0005465988 nova_compute[236126]: 2025-10-02 12:37:10.915 2 DEBUG nova.storage.rbd_utils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image c9f6d037-d76c-47e5-b6c4-9b36ef31f934_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:37:11 np0005465988 nova_compute[236126]: 2025-10-02 12:37:11.244 2 INFO nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Creating config drive at /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/disk.config#033[00m
Oct  2 08:37:11 np0005465988 nova_compute[236126]: 2025-10-02 12:37:11.255 2 DEBUG oslo_concurrency.processutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcz73x4ry execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:11 np0005465988 nova_compute[236126]: 2025-10-02 12:37:11.301 2 INFO nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Creating config drive at /var/lib/nova/instances/c9f6d037-d76c-47e5-b6c4-9b36ef31f934/disk.config#033[00m
Oct  2 08:37:11 np0005465988 nova_compute[236126]: 2025-10-02 12:37:11.308 2 DEBUG oslo_concurrency.processutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c9f6d037-d76c-47e5-b6c4-9b36ef31f934/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnx6oeito execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:11 np0005465988 nova_compute[236126]: 2025-10-02 12:37:11.411 2 DEBUG oslo_concurrency.processutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcz73x4ry" returned: 0 in 0.156s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:11 np0005465988 nova_compute[236126]: 2025-10-02 12:37:11.454 2 DEBUG nova.storage.rbd_utils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:37:11 np0005465988 nova_compute[236126]: 2025-10-02 12:37:11.459 2 DEBUG oslo_concurrency.processutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/disk.config 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:11 np0005465988 nova_compute[236126]: 2025-10-02 12:37:11.496 2 DEBUG oslo_concurrency.processutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c9f6d037-d76c-47e5-b6c4-9b36ef31f934/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnx6oeito" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:11 np0005465988 nova_compute[236126]: 2025-10-02 12:37:11.538 2 DEBUG nova.storage.rbd_utils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image c9f6d037-d76c-47e5-b6c4-9b36ef31f934_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:37:11 np0005465988 podman[300402]: 2025-10-02 12:37:11.544091934 +0000 UTC m=+0.066283947 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:37:11 np0005465988 nova_compute[236126]: 2025-10-02 12:37:11.545 2 DEBUG oslo_concurrency.processutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c9f6d037-d76c-47e5-b6c4-9b36ef31f934/disk.config c9f6d037-d76c-47e5-b6c4-9b36ef31f934_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:11 np0005465988 nova_compute[236126]: 2025-10-02 12:37:11.979 2 DEBUG oslo_concurrency.processutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/disk.config 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:11 np0005465988 nova_compute[236126]: 2025-10-02 12:37:11.981 2 INFO nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Deleting local config drive /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/disk.config because it was imported into RBD.#033[00m
Oct  2 08:37:12 np0005465988 systemd-machined[192594]: New machine qemu-65-instance-00000091.
Oct  2 08:37:12 np0005465988 systemd[1]: Started Virtual Machine qemu-65-instance-00000091.
Oct  2 08:37:12 np0005465988 nova_compute[236126]: 2025-10-02 12:37:12.187 2 DEBUG oslo_concurrency.processutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c9f6d037-d76c-47e5-b6c4-9b36ef31f934/disk.config c9f6d037-d76c-47e5-b6c4-9b36ef31f934_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.642s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:12 np0005465988 nova_compute[236126]: 2025-10-02 12:37:12.188 2 INFO nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Deleting local config drive /var/lib/nova/instances/c9f6d037-d76c-47e5-b6c4-9b36ef31f934/disk.config because it was imported into RBD.#033[00m
Oct  2 08:37:12 np0005465988 systemd-machined[192594]: New machine qemu-66-instance-00000090.
Oct  2 08:37:12 np0005465988 systemd[1]: Started Virtual Machine qemu-66-instance-00000090.
Oct  2 08:37:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:12.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:12 np0005465988 nova_compute[236126]: 2025-10-02 12:37:12.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:12 np0005465988 nova_compute[236126]: 2025-10-02 12:37:12.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:37:12 np0005465988 nova_compute[236126]: 2025-10-02 12:37:12.499 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:37:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:12.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.247 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408633.247401, 63f78a44-925c-43f9-84ec-ba97ed0bceeb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.248 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.251 2 DEBUG nova.compute.manager [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.251 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.255 2 INFO nova.virt.libvirt.driver [-] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Instance spawned successfully.#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.255 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.281 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.286 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.289 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.289 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.290 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.290 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.290 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.291 2 DEBUG nova.virt.libvirt.driver [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.322 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.323 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408633.2505326, 63f78a44-925c-43f9-84ec-ba97ed0bceeb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.323 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] VM Started (Lifecycle Event)#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.524 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.532 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:37:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.565 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.598 2 INFO nova.compute.manager [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Took 5.28 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.598 2 DEBUG nova.compute.manager [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.650 2 INFO nova.compute.manager [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Took 6.48 seconds to build instance.#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.656 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408633.6560228, c9f6d037-d76c-47e5-b6c4-9b36ef31f934 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.656 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.658 2 DEBUG nova.compute.manager [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.658 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.662 2 INFO nova.virt.libvirt.driver [-] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Instance spawned successfully.#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.662 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.673 2 DEBUG oslo_concurrency.lockutils [None req-21bb96f1-1908-46f1-a57c-a2a42f34914d f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "63f78a44-925c-43f9-84ec-ba97ed0bceeb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.693 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.700 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.705 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.706 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.706 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.707 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.707 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.708 2 DEBUG nova.virt.libvirt.driver [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.736 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.737 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408633.658022, c9f6d037-d76c-47e5-b6c4-9b36ef31f934 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.737 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] VM Started (Lifecycle Event)
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.768 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.772 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.797 2 INFO nova.compute.manager [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Took 6.44 seconds to spawn the instance on the hypervisor.
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.798 2 DEBUG nova.compute.manager [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.811 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.902 2 INFO nova.compute.manager [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Took 7.44 seconds to build instance.
Oct  2 08:37:13 np0005465988 nova_compute[236126]: 2025-10-02 12:37:13.932 2 DEBUG oslo_concurrency.lockutils [None req-528cfd20-f959-48f2-bd2b-6b6c8c6a41c1 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "c9f6d037-d76c-47e5-b6c4-9b36ef31f934" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.892s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:37:14 np0005465988 nova_compute[236126]: 2025-10-02 12:37:14.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:14 np0005465988 nova_compute[236126]: 2025-10-02 12:37:14.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:14.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:14.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:15.441 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:37:15 np0005465988 nova_compute[236126]: 2025-10-02 12:37:15.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:15.443 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 08:37:15 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:15.444 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:37:15 np0005465988 nova_compute[236126]: 2025-10-02 12:37:15.873 2 INFO nova.compute.manager [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Rebuilding instance
Oct  2 08:37:16 np0005465988 nova_compute[236126]: 2025-10-02 12:37:16.136 2 DEBUG nova.objects.instance [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 63f78a44-925c-43f9-84ec-ba97ed0bceeb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:37:16 np0005465988 nova_compute[236126]: 2025-10-02 12:37:16.157 2 DEBUG nova.compute.manager [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:37:16 np0005465988 nova_compute[236126]: 2025-10-02 12:37:16.241 2 DEBUG nova.objects.instance [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lazy-loading 'pci_requests' on Instance uuid 63f78a44-925c-43f9-84ec-ba97ed0bceeb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:37:16 np0005465988 nova_compute[236126]: 2025-10-02 12:37:16.264 2 DEBUG nova.objects.instance [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lazy-loading 'pci_devices' on Instance uuid 63f78a44-925c-43f9-84ec-ba97ed0bceeb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:37:16 np0005465988 nova_compute[236126]: 2025-10-02 12:37:16.285 2 DEBUG nova.objects.instance [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lazy-loading 'resources' on Instance uuid 63f78a44-925c-43f9-84ec-ba97ed0bceeb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:37:16 np0005465988 nova_compute[236126]: 2025-10-02 12:37:16.307 2 DEBUG nova.objects.instance [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lazy-loading 'migration_context' on Instance uuid 63f78a44-925c-43f9-84ec-ba97ed0bceeb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:37:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:16.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:16 np0005465988 nova_compute[236126]: 2025-10-02 12:37:16.334 2 DEBUG nova.objects.instance [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct  2 08:37:16 np0005465988 nova_compute[236126]: 2025-10-02 12:37:16.338 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Oct  2 08:37:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:16.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:18.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:37:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:18.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:19 np0005465988 nova_compute[236126]: 2025-10-02 12:37:19.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:19 np0005465988 nova_compute[236126]: 2025-10-02 12:37:19.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:20.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:20.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:22.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:22.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:37:24 np0005465988 nova_compute[236126]: 2025-10-02 12:37:24.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:24 np0005465988 nova_compute[236126]: 2025-10-02 12:37:24.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:37:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:24.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:37:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:24.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:26.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:26 np0005465988 nova_compute[236126]: 2025-10-02 12:37:26.383 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Oct  2 08:37:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:26.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:26 np0005465988 nova_compute[236126]: 2025-10-02 12:37:26.902 2 DEBUG oslo_concurrency.lockutils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:37:26 np0005465988 nova_compute[236126]: 2025-10-02 12:37:26.902 2 DEBUG oslo_concurrency.lockutils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:37:26 np0005465988 nova_compute[236126]: 2025-10-02 12:37:26.903 2 INFO nova.compute.manager [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Shelving
Oct  2 08:37:26 np0005465988 nova_compute[236126]: 2025-10-02 12:37:26.958 2 DEBUG nova.virt.libvirt.driver [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Oct  2 08:37:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:27.367 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:37:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:27.368 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:37:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:27.368 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:37:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:28.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:37:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:28.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:29 np0005465988 nova_compute[236126]: 2025-10-02 12:37:29.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:29 np0005465988 podman[300627]: 2025-10-02 12:37:29.124418636 +0000 UTC m=+0.091950325 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd)
Oct  2 08:37:29 np0005465988 podman[300625]: 2025-10-02 12:37:29.124677284 +0000 UTC m=+0.103480178 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:37:29 np0005465988 podman[300626]: 2025-10-02 12:37:29.137215184 +0000 UTC m=+0.114069962 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:37:29 np0005465988 nova_compute[236126]: 2025-10-02 12:37:29.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:29 np0005465988 nova_compute[236126]: 2025-10-02 12:37:29.399 2 INFO nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Instance shutdown successfully after 13 seconds.
Oct  2 08:37:29 np0005465988 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000091.scope: Deactivated successfully.
Oct  2 08:37:29 np0005465988 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000091.scope: Consumed 13.570s CPU time.
Oct  2 08:37:29 np0005465988 systemd-machined[192594]: Machine qemu-65-instance-00000091 terminated.
Oct  2 08:37:29 np0005465988 nova_compute[236126]: 2025-10-02 12:37:29.829 2 INFO nova.virt.libvirt.driver [-] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Instance destroyed successfully.
Oct  2 08:37:29 np0005465988 nova_compute[236126]: 2025-10-02 12:37:29.835 2 INFO nova.virt.libvirt.driver [-] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Instance destroyed successfully.
Oct  2 08:37:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #106. Immutable memtables: 0.
Oct  2 08:37:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:29.943745) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:37:29 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 65] Flushing memtable with next log file: 106
Oct  2 08:37:29 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408649943792, "job": 65, "event": "flush_started", "num_memtables": 1, "num_entries": 823, "num_deletes": 251, "total_data_size": 1506884, "memory_usage": 1523984, "flush_reason": "Manual Compaction"}
Oct  2 08:37:29 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 65] Level-0 flush table #107: started
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408650074122, "cf_name": "default", "job": 65, "event": "table_file_creation", "file_number": 107, "file_size": 993471, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 54584, "largest_seqno": 55402, "table_properties": {"data_size": 989649, "index_size": 1602, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 9032, "raw_average_key_size": 19, "raw_value_size": 981854, "raw_average_value_size": 2143, "num_data_blocks": 70, "num_entries": 458, "num_filter_entries": 458, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759408595, "oldest_key_time": 1759408595, "file_creation_time": 1759408649, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 107, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 65] Flush lasted 130425 microseconds, and 3695 cpu microseconds.
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.074163) [db/flush_job.cc:967] [default] [JOB 65] Level-0 flush table #107: 993471 bytes OK
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.074188) [db/memtable_list.cc:519] [default] Level-0 commit table #107 started
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.082616) [db/memtable_list.cc:722] [default] Level-0 commit table #107: memtable #1 done
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.082662) EVENT_LOG_v1 {"time_micros": 1759408650082651, "job": 65, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.082686) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 65] Try to delete WAL files size 1502612, prev total WAL file size 1502612, number of live WAL files 2.
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000103.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.083598) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034353138' seq:72057594037927935, type:22 .. '7061786F730034373730' seq:0, type:0; will stop at (end)
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 66] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 65 Base level 0, inputs: [107(970KB)], [105(12MB)]
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408650083637, "job": 66, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [107], "files_L6": [105], "score": -1, "input_data_size": 14329890, "oldest_snapshot_seqno": -1}
Oct  2 08:37:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:37:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:30.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 66] Generated table #108: 7933 keys, 12464525 bytes, temperature: kUnknown
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408650409210, "cf_name": "default", "job": 66, "event": "table_file_creation", "file_number": 108, "file_size": 12464525, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12410231, "index_size": 33382, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19845, "raw_key_size": 205802, "raw_average_key_size": 25, "raw_value_size": 12267577, "raw_average_value_size": 1546, "num_data_blocks": 1312, "num_entries": 7933, "num_filter_entries": 7933, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759408650, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 108, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.409580) [db/compaction/compaction_job.cc:1663] [default] [JOB 66] Compacted 1@0 + 1@6 files to L6 => 12464525 bytes
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.437354) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 44.0 rd, 38.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 12.7 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(27.0) write-amplify(12.5) OK, records in: 8446, records dropped: 513 output_compression: NoCompression
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.437443) EVENT_LOG_v1 {"time_micros": 1759408650437429, "job": 66, "event": "compaction_finished", "compaction_time_micros": 325668, "compaction_time_cpu_micros": 35431, "output_level": 6, "num_output_files": 1, "total_output_size": 12464525, "num_input_records": 8446, "num_output_records": 7933, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000107.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408650437833, "job": 66, "event": "table_file_deletion", "file_number": 107}
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000105.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408650440866, "job": 66, "event": "table_file_deletion", "file_number": 105}
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.083472) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.440969) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.440984) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.440988) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.440992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:37:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:37:30.440996) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:37:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:30.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:30 np0005465988 nova_compute[236126]: 2025-10-02 12:37:30.987 2 INFO nova.virt.libvirt.driver [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Instance shutdown successfully after 4 seconds.#033[00m
Oct  2 08:37:31 np0005465988 kernel: tap2af53b80-50 (unregistering): left promiscuous mode
Oct  2 08:37:31 np0005465988 NetworkManager[45041]: <info>  [1759408651.0762] device (tap2af53b80-50): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:37:31Z|00652|binding|INFO|Releasing lport 2af53b80-5072-4407-80f0-88120c0351f7 from this chassis (sb_readonly=0)
Oct  2 08:37:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:37:31Z|00653|binding|INFO|Setting lport 2af53b80-5072-4407-80f0-88120c0351f7 down in Southbound
Oct  2 08:37:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:37:31Z|00654|binding|INFO|Removing iface tap2af53b80-50 ovn-installed in OVS
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:31 np0005465988 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000008e.scope: Deactivated successfully.
Oct  2 08:37:31 np0005465988 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000008e.scope: Consumed 15.673s CPU time.
Oct  2 08:37:31 np0005465988 systemd-machined[192594]: Machine qemu-64-instance-0000008e terminated.
Oct  2 08:37:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:31.164 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:3b:23 10.100.0.3'], port_security=['fa:16:3e:c9:3b:23 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'be1174bf-d7e1-4801-a2eb-67020632d637', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a05e525420b4aa8adcc9561158e73d1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '13f3412c-42ad-420a-aa32-1b2881f511f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.198'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=709db70f-1209-49b9-bf90-2b91d986925d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=2af53b80-5072-4407-80f0-88120c0351f7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:37:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:31.165 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 2af53b80-5072-4407-80f0-88120c0351f7 in datapath 7b216831-24ac-41f0-ac1c-99aae9bc897b unbound from our chassis#033[00m
Oct  2 08:37:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:31.167 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b216831-24ac-41f0-ac1c-99aae9bc897b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:37:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:31.169 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[81663310-2546-4943-b19e-8ef2154ab34d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:31.169 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b namespace which is not needed anymore#033[00m
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.224 2 INFO nova.virt.libvirt.driver [-] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Instance destroyed successfully.#033[00m
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.224 2 DEBUG nova.objects.instance [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'numa_topology' on Instance uuid be1174bf-d7e1-4801-a2eb-67020632d637 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:31 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[299489]: [NOTICE]   (299493) : haproxy version is 2.8.14-c23fe91
Oct  2 08:37:31 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[299489]: [NOTICE]   (299493) : path to executable is /usr/sbin/haproxy
Oct  2 08:37:31 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[299489]: [WARNING]  (299493) : Exiting Master process...
Oct  2 08:37:31 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[299489]: [ALERT]    (299493) : Current worker (299495) exited with code 143 (Terminated)
Oct  2 08:37:31 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[299489]: [WARNING]  (299493) : All workers exited. Exiting... (0)
Oct  2 08:37:31 np0005465988 systemd[1]: libpod-bb0abf89c391c08fd42c8e6c20f703b643afbfcb70e4e6d27001c12c97b49d8a.scope: Deactivated successfully.
Oct  2 08:37:31 np0005465988 podman[300766]: 2025-10-02 12:37:31.440560367 +0000 UTC m=+0.164347558 container died bb0abf89c391c08fd42c8e6c20f703b643afbfcb70e4e6d27001c12c97b49d8a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.556 2 DEBUG nova.compute.manager [req-07834ebe-ed88-4b54-a48d-d640bf84ab51 req-3875a5e4-e64b-4e2f-a2c1-14e80e4e2323 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received event network-vif-unplugged-2af53b80-5072-4407-80f0-88120c0351f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.556 2 DEBUG oslo_concurrency.lockutils [req-07834ebe-ed88-4b54-a48d-d640bf84ab51 req-3875a5e4-e64b-4e2f-a2c1-14e80e4e2323 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.557 2 DEBUG oslo_concurrency.lockutils [req-07834ebe-ed88-4b54-a48d-d640bf84ab51 req-3875a5e4-e64b-4e2f-a2c1-14e80e4e2323 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.557 2 DEBUG oslo_concurrency.lockutils [req-07834ebe-ed88-4b54-a48d-d640bf84ab51 req-3875a5e4-e64b-4e2f-a2c1-14e80e4e2323 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.557 2 DEBUG nova.compute.manager [req-07834ebe-ed88-4b54-a48d-d640bf84ab51 req-3875a5e4-e64b-4e2f-a2c1-14e80e4e2323 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] No waiting events found dispatching network-vif-unplugged-2af53b80-5072-4407-80f0-88120c0351f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.558 2 WARNING nova.compute.manager [req-07834ebe-ed88-4b54-a48d-d640bf84ab51 req-3875a5e4-e64b-4e2f-a2c1-14e80e4e2323 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received unexpected event network-vif-unplugged-2af53b80-5072-4407-80f0-88120c0351f7 for instance with vm_state active and task_state shelving.#033[00m
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.564 2 INFO nova.virt.libvirt.driver [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Beginning cold snapshot process#033[00m
Oct  2 08:37:31 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb0abf89c391c08fd42c8e6c20f703b643afbfcb70e4e6d27001c12c97b49d8a-userdata-shm.mount: Deactivated successfully.
Oct  2 08:37:31 np0005465988 systemd[1]: var-lib-containers-storage-overlay-b1532f05480f538890b0892213430a6d5775035e09f2b892f6205899a9b2521d-merged.mount: Deactivated successfully.
Oct  2 08:37:31 np0005465988 podman[300766]: 2025-10-02 12:37:31.654658405 +0000 UTC m=+0.378445596 container cleanup bb0abf89c391c08fd42c8e6c20f703b643afbfcb70e4e6d27001c12c97b49d8a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:37:31 np0005465988 systemd[1]: libpod-conmon-bb0abf89c391c08fd42c8e6c20f703b643afbfcb70e4e6d27001c12c97b49d8a.scope: Deactivated successfully.
Oct  2 08:37:31 np0005465988 podman[300795]: 2025-10-02 12:37:31.725688768 +0000 UTC m=+0.044915173 container remove bb0abf89c391c08fd42c8e6c20f703b643afbfcb70e4e6d27001c12c97b49d8a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 08:37:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:31.733 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a0de5387-9415-45fc-9821-aed420c67486]: (4, ('Thu Oct  2 12:37:31 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b (bb0abf89c391c08fd42c8e6c20f703b643afbfcb70e4e6d27001c12c97b49d8a)\nbb0abf89c391c08fd42c8e6c20f703b643afbfcb70e4e6d27001c12c97b49d8a\nThu Oct  2 12:37:31 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b (bb0abf89c391c08fd42c8e6c20f703b643afbfcb70e4e6d27001c12c97b49d8a)\nbb0abf89c391c08fd42c8e6c20f703b643afbfcb70e4e6d27001c12c97b49d8a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:31.735 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f49687fd-e049-40a9-8e13-fc1d84d6baa2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:31.737 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b216831-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:31 np0005465988 kernel: tap7b216831-20: left promiscuous mode
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.747 2 DEBUG nova.virt.libvirt.imagebackend [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No parent info for c2d0c2bc-fe21-4689-86ae-d6728c15874c; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:31 np0005465988 nova_compute[236126]: 2025-10-02 12:37:31.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:31.760 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[64b47930-d8e9-41c7-9928-7d319f2bed33]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:31.790 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1b1cedb5-de67-4880-a3e1-045432c9f2fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:31.792 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f69a7bfd-eafd-46f7-8514-40c6f59c5824]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:31.810 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b40ef926-913c-4780-bcd4-5d525f9704a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 666726, 'reachable_time': 39798, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 300846, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005465988 systemd[1]: run-netns-ovnmeta\x2d7b216831\x2d24ac\x2d41f0\x2dac1c\x2d99aae9bc897b.mount: Deactivated successfully.
Oct  2 08:37:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:31.814 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:37:31 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:37:31.814 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[772bd20c-ac80-4f87-9a88-d7770665d80b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:32 np0005465988 nova_compute[236126]: 2025-10-02 12:37:32.059 2 DEBUG nova.storage.rbd_utils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] creating snapshot(6efb2b2c30a04e2ca8d38fceb5093c9b) on rbd image(be1174bf-d7e1-4801-a2eb-67020632d637_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:37:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e332 e332: 3 total, 3 up, 3 in
Oct  2 08:37:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:37:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:32.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:37:32 np0005465988 nova_compute[236126]: 2025-10-02 12:37:32.432 2 DEBUG nova.storage.rbd_utils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] cloning vms/be1174bf-d7e1-4801-a2eb-67020632d637_disk@6efb2b2c30a04e2ca8d38fceb5093c9b to images/766f42a5-3020-41dc-a077-fda5642a1d60 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:37:32 np0005465988 nova_compute[236126]: 2025-10-02 12:37:32.667 2 DEBUG nova.storage.rbd_utils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] flattening images/766f42a5-3020-41dc-a077-fda5642a1d60 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:37:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:37:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:32.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.564 2 INFO nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Deleting instance files /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb_del#033[00m
Oct  2 08:37:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.565 2 INFO nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Deletion of /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb_del complete#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.640 2 DEBUG nova.compute.manager [req-95094d23-f980-4dbf-ae9c-b2a430918cfe req-74f15478-04d8-47f0-bbe4-c54e3f6e9f90 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received event network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.641 2 DEBUG oslo_concurrency.lockutils [req-95094d23-f980-4dbf-ae9c-b2a430918cfe req-74f15478-04d8-47f0-bbe4-c54e3f6e9f90 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.641 2 DEBUG oslo_concurrency.lockutils [req-95094d23-f980-4dbf-ae9c-b2a430918cfe req-74f15478-04d8-47f0-bbe4-c54e3f6e9f90 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.642 2 DEBUG oslo_concurrency.lockutils [req-95094d23-f980-4dbf-ae9c-b2a430918cfe req-74f15478-04d8-47f0-bbe4-c54e3f6e9f90 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.642 2 DEBUG nova.compute.manager [req-95094d23-f980-4dbf-ae9c-b2a430918cfe req-74f15478-04d8-47f0-bbe4-c54e3f6e9f90 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] No waiting events found dispatching network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.642 2 WARNING nova.compute.manager [req-95094d23-f980-4dbf-ae9c-b2a430918cfe req-74f15478-04d8-47f0-bbe4-c54e3f6e9f90 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received unexpected event network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.712 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.713 2 INFO nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Creating image(s)#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.739 2 DEBUG nova.storage.rbd_utils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.772 2 DEBUG nova.storage.rbd_utils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.804 2 DEBUG nova.storage.rbd_utils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.812 2 DEBUG oslo_concurrency.processutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.918 2 DEBUG oslo_concurrency.processutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 --force-share --output=json" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.920 2 DEBUG oslo_concurrency.lockutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "5133c8c7459ce4fa1cf043a638fc1b5c66ed8609" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.920 2 DEBUG oslo_concurrency.lockutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "5133c8c7459ce4fa1cf043a638fc1b5c66ed8609" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.921 2 DEBUG oslo_concurrency.lockutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "5133c8c7459ce4fa1cf043a638fc1b5c66ed8609" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.950 2 DEBUG nova.storage.rbd_utils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:37:33 np0005465988 nova_compute[236126]: 2025-10-02 12:37:33.956 2 DEBUG oslo_concurrency.processutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:34 np0005465988 nova_compute[236126]: 2025-10-02 12:37:34.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:34 np0005465988 nova_compute[236126]: 2025-10-02 12:37:34.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:34 np0005465988 nova_compute[236126]: 2025-10-02 12:37:34.241 2 DEBUG nova.storage.rbd_utils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] removing snapshot(6efb2b2c30a04e2ca8d38fceb5093c9b) on rbd image(be1174bf-d7e1-4801-a2eb-67020632d637_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:37:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:37:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:34.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:37:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:34.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e333 e333: 3 total, 3 up, 3 in
Oct  2 08:37:35 np0005465988 nova_compute[236126]: 2025-10-02 12:37:35.016 2 DEBUG nova.storage.rbd_utils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] creating snapshot(snap) on rbd image(766f42a5-3020-41dc-a077-fda5642a1d60) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:37:35 np0005465988 nova_compute[236126]: 2025-10-02 12:37:35.445 2 DEBUG oslo_concurrency.processutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:35 np0005465988 nova_compute[236126]: 2025-10-02 12:37:35.532 2 DEBUG nova.storage.rbd_utils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] resizing rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.216 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.217 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Ensure instance console log exists: /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.218 2 DEBUG oslo_concurrency.lockutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.218 2 DEBUG oslo_concurrency.lockutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.219 2 DEBUG oslo_concurrency.lockutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.222 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:47Z,direct_url=<?>,disk_format='qcow2',id=db05f54c-61f8-42d6-a1e2-da3219a77b12,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.229 2 WARNING nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.236 2 DEBUG nova.virt.libvirt.host [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.237 2 DEBUG nova.virt.libvirt.host [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.241 2 DEBUG nova.virt.libvirt.host [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.242 2 DEBUG nova.virt.libvirt.host [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.243 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.244 2 DEBUG nova.virt.hardware [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:47Z,direct_url=<?>,disk_format='qcow2',id=db05f54c-61f8-42d6-a1e2-da3219a77b12,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.245 2 DEBUG nova.virt.hardware [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.245 2 DEBUG nova.virt.hardware [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.245 2 DEBUG nova.virt.hardware [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.246 2 DEBUG nova.virt.hardware [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.246 2 DEBUG nova.virt.hardware [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.246 2 DEBUG nova.virt.hardware [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.247 2 DEBUG nova.virt.hardware [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.247 2 DEBUG nova.virt.hardware [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.248 2 DEBUG nova.virt.hardware [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.248 2 DEBUG nova.virt.hardware [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.248 2 DEBUG nova.objects.instance [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 63f78a44-925c-43f9-84ec-ba97ed0bceeb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.298 2 DEBUG oslo_concurrency.processutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:36.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e334 e334: 3 total, 3 up, 3 in
Oct  2 08:37:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:37:36 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3408644407' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.828 2 DEBUG oslo_concurrency.processutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:36.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.870 2 DEBUG nova.storage.rbd_utils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:37:36 np0005465988 nova_compute[236126]: 2025-10-02 12:37:36.875 2 DEBUG oslo_concurrency.processutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:37:37 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1957920708' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:37:37 np0005465988 nova_compute[236126]: 2025-10-02 12:37:37.316 2 DEBUG oslo_concurrency.processutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:37 np0005465988 nova_compute[236126]: 2025-10-02 12:37:37.320 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  <uuid>63f78a44-925c-43f9-84ec-ba97ed0bceeb</uuid>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  <name>instance-00000091</name>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerShowV247Test-server-741150032</nova:name>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:37:36</nova:creationTime>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <nova:user uuid="f7d78f04152a425b81d486a834213a76">tempest-ServerShowV247Test-377191045-project-member</nova:user>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <nova:project uuid="af873f6cea354e099a97c3b09ce8ca27">tempest-ServerShowV247Test-377191045</nova:project>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="db05f54c-61f8-42d6-a1e2-da3219a77b12"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <nova:ports/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <entry name="serial">63f78a44-925c-43f9-84ec-ba97ed0bceeb</entry>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <entry name="uuid">63f78a44-925c-43f9-84ec-ba97ed0bceeb</entry>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk.config">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/console.log" append="off"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:37:37 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:37:37 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:37:37 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:37:37 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:37:37 np0005465988 nova_compute[236126]: 2025-10-02 12:37:37.428 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:37:37 np0005465988 nova_compute[236126]: 2025-10-02 12:37:37.429 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:37:37 np0005465988 nova_compute[236126]: 2025-10-02 12:37:37.429 2 INFO nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Using config drive#033[00m
Oct  2 08:37:37 np0005465988 nova_compute[236126]: 2025-10-02 12:37:37.506 2 DEBUG nova.storage.rbd_utils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:37:37 np0005465988 nova_compute[236126]: 2025-10-02 12:37:37.569 2 DEBUG nova.objects.instance [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 63f78a44-925c-43f9-84ec-ba97ed0bceeb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:37 np0005465988 nova_compute[236126]: 2025-10-02 12:37:37.606 2 DEBUG nova.objects.instance [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lazy-loading 'keypairs' on Instance uuid 63f78a44-925c-43f9-84ec-ba97ed0bceeb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:37 np0005465988 nova_compute[236126]: 2025-10-02 12:37:37.792 2 INFO nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Creating config drive at /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/disk.config#033[00m
Oct  2 08:37:37 np0005465988 nova_compute[236126]: 2025-10-02 12:37:37.804 2 DEBUG oslo_concurrency.processutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp02y1zukf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:37 np0005465988 nova_compute[236126]: 2025-10-02 12:37:37.963 2 DEBUG oslo_concurrency.processutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp02y1zukf" returned: 0 in 0.159s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:37 np0005465988 nova_compute[236126]: 2025-10-02 12:37:37.996 2 DEBUG nova.storage.rbd_utils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] rbd image 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:37:38 np0005465988 nova_compute[236126]: 2025-10-02 12:37:38.000 2 DEBUG oslo_concurrency.processutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/disk.config 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:38 np0005465988 nova_compute[236126]: 2025-10-02 12:37:38.045 2 INFO nova.virt.libvirt.driver [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Snapshot image upload complete#033[00m
Oct  2 08:37:38 np0005465988 nova_compute[236126]: 2025-10-02 12:37:38.046 2 DEBUG nova.compute.manager [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:38 np0005465988 nova_compute[236126]: 2025-10-02 12:37:38.100 2 INFO nova.compute.manager [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Shelve offloading#033[00m
Oct  2 08:37:38 np0005465988 nova_compute[236126]: 2025-10-02 12:37:38.107 2 INFO nova.virt.libvirt.driver [-] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Instance destroyed successfully.#033[00m
Oct  2 08:37:38 np0005465988 nova_compute[236126]: 2025-10-02 12:37:38.107 2 DEBUG nova.compute.manager [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:38 np0005465988 nova_compute[236126]: 2025-10-02 12:37:38.110 2 DEBUG oslo_concurrency.lockutils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:37:38 np0005465988 nova_compute[236126]: 2025-10-02 12:37:38.110 2 DEBUG oslo_concurrency.lockutils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquired lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:37:38 np0005465988 nova_compute[236126]: 2025-10-02 12:37:38.110 2 DEBUG nova.network.neutron [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:37:38 np0005465988 nova_compute[236126]: 2025-10-02 12:37:38.205 2 DEBUG oslo_concurrency.processutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/disk.config 63f78a44-925c-43f9-84ec-ba97ed0bceeb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.204s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:38 np0005465988 nova_compute[236126]: 2025-10-02 12:37:38.205 2 INFO nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Deleting local config drive /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb/disk.config because it was imported into RBD.#033[00m
Oct  2 08:37:38 np0005465988 systemd-machined[192594]: New machine qemu-67-instance-00000091.
Oct  2 08:37:38 np0005465988 systemd[1]: Started Virtual Machine qemu-67-instance-00000091.
Oct  2 08:37:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:38.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e334 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:37:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:38.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.411 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 63f78a44-925c-43f9-84ec-ba97ed0bceeb due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.411 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408659.4104342, 63f78a44-925c-43f9-84ec-ba97ed0bceeb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.412 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.414 2 DEBUG nova.compute.manager [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.415 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.419 2 INFO nova.virt.libvirt.driver [-] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Instance spawned successfully.#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.420 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.437 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.442 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.445 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.445 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.446 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.446 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.447 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.447 2 DEBUG nova.virt.libvirt.driver [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.475 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.475 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408659.4121802, 63f78a44-925c-43f9-84ec-ba97ed0bceeb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.476 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] VM Started (Lifecycle Event)#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.507 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.509 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.512 2 DEBUG nova.compute.manager [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.535 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.562 2 DEBUG oslo_concurrency.lockutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.562 2 DEBUG oslo_concurrency.lockutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.562 2 DEBUG nova.objects.instance [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:37:39 np0005465988 nova_compute[236126]: 2025-10-02 12:37:39.625 2 DEBUG oslo_concurrency.lockutils [None req-802f5fab-0ad5-464f-90d4-da6903681a57 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e335 e335: 3 total, 3 up, 3 in
Oct  2 08:37:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:40.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:40.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:41 np0005465988 nova_compute[236126]: 2025-10-02 12:37:41.276 2 DEBUG nova.network.neutron [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updating instance_info_cache with network_info: [{"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:41 np0005465988 nova_compute[236126]: 2025-10-02 12:37:41.299 2 DEBUG oslo_concurrency.lockutils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Releasing lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:37:41 np0005465988 nova_compute[236126]: 2025-10-02 12:37:41.964 2 DEBUG oslo_concurrency.lockutils [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "63f78a44-925c-43f9-84ec-ba97ed0bceeb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:41 np0005465988 nova_compute[236126]: 2025-10-02 12:37:41.964 2 DEBUG oslo_concurrency.lockutils [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "63f78a44-925c-43f9-84ec-ba97ed0bceeb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:41 np0005465988 nova_compute[236126]: 2025-10-02 12:37:41.965 2 DEBUG oslo_concurrency.lockutils [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "63f78a44-925c-43f9-84ec-ba97ed0bceeb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:41 np0005465988 nova_compute[236126]: 2025-10-02 12:37:41.965 2 DEBUG oslo_concurrency.lockutils [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "63f78a44-925c-43f9-84ec-ba97ed0bceeb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:41 np0005465988 nova_compute[236126]: 2025-10-02 12:37:41.966 2 DEBUG oslo_concurrency.lockutils [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "63f78a44-925c-43f9-84ec-ba97ed0bceeb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:41 np0005465988 nova_compute[236126]: 2025-10-02 12:37:41.967 2 INFO nova.compute.manager [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Terminating instance#033[00m
Oct  2 08:37:41 np0005465988 nova_compute[236126]: 2025-10-02 12:37:41.969 2 DEBUG oslo_concurrency.lockutils [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "refresh_cache-63f78a44-925c-43f9-84ec-ba97ed0bceeb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:37:41 np0005465988 nova_compute[236126]: 2025-10-02 12:37:41.970 2 DEBUG oslo_concurrency.lockutils [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquired lock "refresh_cache-63f78a44-925c-43f9-84ec-ba97ed0bceeb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:37:41 np0005465988 nova_compute[236126]: 2025-10-02 12:37:41.972 2 DEBUG nova.network.neutron [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.112 2 DEBUG nova.network.neutron [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.342 2 DEBUG nova.network.neutron [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:37:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:42.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.359 2 DEBUG oslo_concurrency.lockutils [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Releasing lock "refresh_cache-63f78a44-925c-43f9-84ec-ba97ed0bceeb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.360 2 DEBUG nova.compute.manager [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.388 2 INFO nova.virt.libvirt.driver [-] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Instance destroyed successfully.#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.389 2 DEBUG nova.objects.instance [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'resources' on Instance uuid be1174bf-d7e1-4801-a2eb-67020632d637 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.400 2 DEBUG nova.virt.libvirt.vif [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:36:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2076879726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-2076879726',id=142,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA2KA3xcRPebgvgTazx0E34aPT9rxhs35D4g1Uzjz4PIuwR8cc5jli8pSQUOimuckXeWKODOH/ieI/CBtPk6/J+xp5vS+z3Hichw9+q7Uc2dLlyh1Q0msK0J8MjXf0hI6Q==',key_name='tempest-keypair-152732924',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='1a05e525420b4aa8adcc9561158e73d1',ramdisk_id='',reservation_id='r-hgcyfy0h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-405673070',owner_user_name='tempest-AttachVolumeShelveTestJSON-405673070-project-member',shelved_at='2025-10-02T12:37:38.046792',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='766f42a5-3020-41dc-a077-fda5642a1d60'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:37:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bcd36ab668f449959719ba7058f25e72',uuid=be1174bf-d7e1-4801-a2eb-67020632d637,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": 
"7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.400 2 DEBUG nova.network.os_vif_util [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converting VIF {"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.401 2 DEBUG nova.network.os_vif_util [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3b:23,bridge_name='br-int',has_traffic_filtering=True,id=2af53b80-5072-4407-80f0-88120c0351f7,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2af53b80-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.402 2 DEBUG os_vif [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3b:23,bridge_name='br-int',has_traffic_filtering=True,id=2af53b80-5072-4407-80f0-88120c0351f7,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2af53b80-50') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.404 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2af53b80-50, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.412 2 INFO os_vif [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3b:23,bridge_name='br-int',has_traffic_filtering=True,id=2af53b80-5072-4407-80f0-88120c0351f7,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2af53b80-50')#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.462 2 DEBUG nova.compute.manager [req-934bee2d-29a9-4704-97c1-bcd39c3372ce req-dbc88580-bfae-460f-aacf-c9d35c480500 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received event network-changed-2af53b80-5072-4407-80f0-88120c0351f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.463 2 DEBUG nova.compute.manager [req-934bee2d-29a9-4704-97c1-bcd39c3372ce req-dbc88580-bfae-460f-aacf-c9d35c480500 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Refreshing instance network info cache due to event network-changed-2af53b80-5072-4407-80f0-88120c0351f7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.463 2 DEBUG oslo_concurrency.lockutils [req-934bee2d-29a9-4704-97c1-bcd39c3372ce req-dbc88580-bfae-460f-aacf-c9d35c480500 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.463 2 DEBUG oslo_concurrency.lockutils [req-934bee2d-29a9-4704-97c1-bcd39c3372ce req-dbc88580-bfae-460f-aacf-c9d35c480500 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.463 2 DEBUG nova.network.neutron [req-934bee2d-29a9-4704-97c1-bcd39c3372ce req-dbc88580-bfae-460f-aacf-c9d35c480500 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Refreshing network info cache for port 2af53b80-5072-4407-80f0-88120c0351f7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:37:42 np0005465988 podman[301320]: 2025-10-02 12:37:42.520295441 +0000 UTC m=+0.054113137 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:37:42 np0005465988 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000091.scope: Deactivated successfully.
Oct  2 08:37:42 np0005465988 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000091.scope: Consumed 4.088s CPU time.
Oct  2 08:37:42 np0005465988 systemd-machined[192594]: Machine qemu-67-instance-00000091 terminated.
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.787 2 INFO nova.virt.libvirt.driver [-] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Instance destroyed successfully.#033[00m
Oct  2 08:37:42 np0005465988 nova_compute[236126]: 2025-10-02 12:37:42.789 2 DEBUG nova.objects.instance [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lazy-loading 'resources' on Instance uuid 63f78a44-925c-43f9-84ec-ba97ed0bceeb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:42.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:37:44 np0005465988 nova_compute[236126]: 2025-10-02 12:37:44.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:44.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:44.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:45 np0005465988 nova_compute[236126]: 2025-10-02 12:37:45.544 2 DEBUG nova.network.neutron [req-934bee2d-29a9-4704-97c1-bcd39c3372ce req-dbc88580-bfae-460f-aacf-c9d35c480500 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updated VIF entry in instance network info cache for port 2af53b80-5072-4407-80f0-88120c0351f7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:37:45 np0005465988 nova_compute[236126]: 2025-10-02 12:37:45.544 2 DEBUG nova.network.neutron [req-934bee2d-29a9-4704-97c1-bcd39c3372ce req-dbc88580-bfae-460f-aacf-c9d35c480500 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updating instance_info_cache with network_info: [{"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": null, "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap2af53b80-50", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:45 np0005465988 nova_compute[236126]: 2025-10-02 12:37:45.564 2 DEBUG oslo_concurrency.lockutils [req-934bee2d-29a9-4704-97c1-bcd39c3372ce req-dbc88580-bfae-460f-aacf-c9d35c480500 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:37:46 np0005465988 nova_compute[236126]: 2025-10-02 12:37:46.221 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408651.2202337, be1174bf-d7e1-4801-a2eb-67020632d637 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:37:46 np0005465988 nova_compute[236126]: 2025-10-02 12:37:46.222 2 INFO nova.compute.manager [-] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:37:46 np0005465988 nova_compute[236126]: 2025-10-02 12:37:46.254 2 DEBUG nova.compute.manager [None req-55b304fb-fe2e-4602-9aa2-957892aa62be - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:46 np0005465988 nova_compute[236126]: 2025-10-02 12:37:46.263 2 DEBUG nova.compute.manager [None req-55b304fb-fe2e-4602-9aa2-957892aa62be - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: shelved, current task_state: shelving_offloading, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:37:46 np0005465988 nova_compute[236126]: 2025-10-02 12:37:46.297 2 INFO nova.compute.manager [None req-55b304fb-fe2e-4602-9aa2-957892aa62be - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] During sync_power_state the instance has a pending task (shelving_offloading). Skip.#033[00m
Oct  2 08:37:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:46.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct  2 08:37:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct  2 08:37:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:37:46 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:37:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:46.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:47 np0005465988 nova_compute[236126]: 2025-10-02 12:37:47.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct  2 08:37:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:37:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:37:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:37:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:48.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:37:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:48.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:49 np0005465988 nova_compute[236126]: 2025-10-02 12:37:49.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:50.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:50.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:51 np0005465988 nova_compute[236126]: 2025-10-02 12:37:51.575 2 INFO nova.virt.libvirt.driver [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Deleting instance files /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb_del#033[00m
Oct  2 08:37:51 np0005465988 nova_compute[236126]: 2025-10-02 12:37:51.576 2 INFO nova.virt.libvirt.driver [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Deletion of /var/lib/nova/instances/63f78a44-925c-43f9-84ec-ba97ed0bceeb_del complete#033[00m
Oct  2 08:37:51 np0005465988 nova_compute[236126]: 2025-10-02 12:37:51.634 2 INFO nova.compute.manager [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Took 9.27 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:37:51 np0005465988 nova_compute[236126]: 2025-10-02 12:37:51.635 2 DEBUG oslo.service.loopingcall [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:37:51 np0005465988 nova_compute[236126]: 2025-10-02 12:37:51.636 2 DEBUG nova.compute.manager [-] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:37:51 np0005465988 nova_compute[236126]: 2025-10-02 12:37:51.636 2 DEBUG nova.network.neutron [-] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:37:51 np0005465988 nova_compute[236126]: 2025-10-02 12:37:51.771 2 DEBUG nova.network.neutron [-] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:37:51 np0005465988 nova_compute[236126]: 2025-10-02 12:37:51.782 2 DEBUG nova.network.neutron [-] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:51 np0005465988 nova_compute[236126]: 2025-10-02 12:37:51.796 2 INFO nova.compute.manager [-] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Took 0.16 seconds to deallocate network for instance.#033[00m
Oct  2 08:37:51 np0005465988 nova_compute[236126]: 2025-10-02 12:37:51.857 2 DEBUG oslo_concurrency.lockutils [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:51 np0005465988 nova_compute[236126]: 2025-10-02 12:37:51.858 2 DEBUG oslo_concurrency.lockutils [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:51 np0005465988 nova_compute[236126]: 2025-10-02 12:37:51.972 2 DEBUG oslo_concurrency.processutils [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:52 np0005465988 nova_compute[236126]: 2025-10-02 12:37:52.360 2 INFO nova.virt.libvirt.driver [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Deleting instance files /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637_del#033[00m
Oct  2 08:37:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:52.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:52 np0005465988 nova_compute[236126]: 2025-10-02 12:37:52.363 2 INFO nova.virt.libvirt.driver [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Deletion of /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637_del complete#033[00m
Oct  2 08:37:52 np0005465988 nova_compute[236126]: 2025-10-02 12:37:52.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:37:52 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2421256036' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:37:52 np0005465988 nova_compute[236126]: 2025-10-02 12:37:52.458 2 DEBUG oslo_concurrency.processutils [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:52 np0005465988 nova_compute[236126]: 2025-10-02 12:37:52.463 2 INFO nova.scheduler.client.report [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Deleted allocations for instance be1174bf-d7e1-4801-a2eb-67020632d637#033[00m
Oct  2 08:37:52 np0005465988 nova_compute[236126]: 2025-10-02 12:37:52.475 2 DEBUG nova.compute.provider_tree [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:37:52 np0005465988 nova_compute[236126]: 2025-10-02 12:37:52.504 2 DEBUG nova.scheduler.client.report [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:37:52 np0005465988 nova_compute[236126]: 2025-10-02 12:37:52.546 2 DEBUG oslo_concurrency.lockutils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:52 np0005465988 nova_compute[236126]: 2025-10-02 12:37:52.560 2 DEBUG oslo_concurrency.lockutils [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:52 np0005465988 nova_compute[236126]: 2025-10-02 12:37:52.563 2 DEBUG oslo_concurrency.lockutils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.017s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:52 np0005465988 nova_compute[236126]: 2025-10-02 12:37:52.597 2 INFO nova.scheduler.client.report [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Deleted allocations for instance 63f78a44-925c-43f9-84ec-ba97ed0bceeb#033[00m
Oct  2 08:37:52 np0005465988 nova_compute[236126]: 2025-10-02 12:37:52.623 2 DEBUG oslo_concurrency.processutils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:52 np0005465988 nova_compute[236126]: 2025-10-02 12:37:52.674 2 DEBUG oslo_concurrency.lockutils [None req-a4dd000f-b25e-45f3-8e36-e072ebdd3e9f f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "63f78a44-925c-43f9-84ec-ba97ed0bceeb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:52.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:37:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/195267348' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.087 2 DEBUG oslo_concurrency.processutils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.091 2 DEBUG oslo_concurrency.lockutils [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "c9f6d037-d76c-47e5-b6c4-9b36ef31f934" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.091 2 DEBUG oslo_concurrency.lockutils [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "c9f6d037-d76c-47e5-b6c4-9b36ef31f934" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.092 2 DEBUG oslo_concurrency.lockutils [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "c9f6d037-d76c-47e5-b6c4-9b36ef31f934-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.092 2 DEBUG oslo_concurrency.lockutils [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "c9f6d037-d76c-47e5-b6c4-9b36ef31f934-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.092 2 DEBUG oslo_concurrency.lockutils [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "c9f6d037-d76c-47e5-b6c4-9b36ef31f934-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.093 2 INFO nova.compute.manager [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Terminating instance#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.094 2 DEBUG oslo_concurrency.lockutils [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "refresh_cache-c9f6d037-d76c-47e5-b6c4-9b36ef31f934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.095 2 DEBUG oslo_concurrency.lockutils [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquired lock "refresh_cache-c9f6d037-d76c-47e5-b6c4-9b36ef31f934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.095 2 DEBUG nova.network.neutron [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.098 2 DEBUG nova.compute.provider_tree [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.121 2 DEBUG nova.scheduler.client.report [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.151 2 DEBUG oslo_concurrency.lockutils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.227 2 DEBUG oslo_concurrency.lockutils [None req-5263d100-7189-487d-8b4c-0f9293f2005f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 26.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.255 2 DEBUG nova.network.neutron [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.532 2 DEBUG nova.network.neutron [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.551 2 DEBUG oslo_concurrency.lockutils [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Releasing lock "refresh_cache-c9f6d037-d76c-47e5-b6c4-9b36ef31f934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:37:53 np0005465988 nova_compute[236126]: 2025-10-02 12:37:53.552 2 DEBUG nova.compute.manager [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:37:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:37:54 np0005465988 nova_compute[236126]: 2025-10-02 12:37:54.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:54 np0005465988 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000090.scope: Deactivated successfully.
Oct  2 08:37:54 np0005465988 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000090.scope: Consumed 14.614s CPU time.
Oct  2 08:37:54 np0005465988 systemd-machined[192594]: Machine qemu-66-instance-00000090 terminated.
Oct  2 08:37:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:54.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:54 np0005465988 nova_compute[236126]: 2025-10-02 12:37:54.376 2 INFO nova.virt.libvirt.driver [-] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Instance destroyed successfully.#033[00m
Oct  2 08:37:54 np0005465988 nova_compute[236126]: 2025-10-02 12:37:54.376 2 DEBUG nova.objects.instance [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lazy-loading 'resources' on Instance uuid c9f6d037-d76c-47e5-b6c4-9b36ef31f934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:54.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:55 np0005465988 nova_compute[236126]: 2025-10-02 12:37:55.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:37:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:56.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:37:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:56.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:57 np0005465988 nova_compute[236126]: 2025-10-02 12:37:57.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:57 np0005465988 nova_compute[236126]: 2025-10-02 12:37:57.787 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408662.7852216, 63f78a44-925c-43f9-84ec-ba97ed0bceeb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:37:57 np0005465988 nova_compute[236126]: 2025-10-02 12:37:57.787 2 INFO nova.compute.manager [-] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:37:57 np0005465988 nova_compute[236126]: 2025-10-02 12:37:57.832 2 DEBUG nova.compute.manager [None req-8d119430-2eb6-4e66-9e34-630fbb4d9c8e - - - - - -] [instance: 63f78a44-925c-43f9-84ec-ba97ed0bceeb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:37:58Z|00655|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Oct  2 08:37:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:37:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:37:58.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.507 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.508 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.509 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.509 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.510 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.693 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.694 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.695 2 INFO nova.compute.manager [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Unshelving#033[00m
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.791 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.792 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.801 2 DEBUG nova.objects.instance [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'pci_requests' on Instance uuid be1174bf-d7e1-4801-a2eb-67020632d637 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.825 2 DEBUG nova.objects.instance [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'numa_topology' on Instance uuid be1174bf-d7e1-4801-a2eb-67020632d637 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.849 2 DEBUG nova.virt.hardware [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.849 2 INFO nova.compute.claims [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:37:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:37:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:37:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:37:58.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:37:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:37:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3759468216' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:37:58 np0005465988 nova_compute[236126]: 2025-10-02 12:37:58.979 2 DEBUG oslo_concurrency.processutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.011 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.089 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000090 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.089 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000090 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.214 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.215 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4193MB free_disk=20.76099395751953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.215 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:37:59 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3809355517' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.534 2 DEBUG oslo_concurrency.processutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.555s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.539 2 DEBUG nova.compute.provider_tree [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:37:59 np0005465988 podman[301784]: 2025-10-02 12:37:59.544278016 +0000 UTC m=+0.069366936 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:37:59 np0005465988 podman[301782]: 2025-10-02 12:37:59.551355289 +0000 UTC m=+0.089004101 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.565 2 DEBUG nova.scheduler.client.report [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:37:59 np0005465988 podman[301783]: 2025-10-02 12:37:59.571847209 +0000 UTC m=+0.099088721 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.595 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.803s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.600 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.385s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.662 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance c9f6d037-d76c-47e5-b6c4-9b36ef31f934 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.663 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance be1174bf-d7e1-4801-a2eb-67020632d637 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.663 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.664 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.716 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:59 np0005465988 nova_compute[236126]: 2025-10-02 12:37:59.749 2 INFO nova.network.neutron [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updating port 2af53b80-5072-4407-80f0-88120c0351f7 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Oct  2 08:38:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:38:00 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3194258724' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:38:00 np0005465988 nova_compute[236126]: 2025-10-02 12:38:00.190 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:00 np0005465988 nova_compute[236126]: 2025-10-02 12:38:00.198 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:38:00 np0005465988 nova_compute[236126]: 2025-10-02 12:38:00.216 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:38:00 np0005465988 nova_compute[236126]: 2025-10-02 12:38:00.242 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:38:00 np0005465988 nova_compute[236126]: 2025-10-02 12:38:00.243 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.642s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:38:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:00.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:38:00 np0005465988 nova_compute[236126]: 2025-10-02 12:38:00.454 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:38:00 np0005465988 nova_compute[236126]: 2025-10-02 12:38:00.455 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquired lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:38:00 np0005465988 nova_compute[236126]: 2025-10-02 12:38:00.455 2 DEBUG nova.network.neutron [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:38:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:00.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:01 np0005465988 nova_compute[236126]: 2025-10-02 12:38:01.244 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.032 2 DEBUG nova.compute.manager [req-0f2242bd-9dcc-4186-89fb-c7613f064403 req-eeff4001-3bd0-4182-a907-0dd32689160b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received event network-changed-2af53b80-5072-4407-80f0-88120c0351f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.033 2 DEBUG nova.compute.manager [req-0f2242bd-9dcc-4186-89fb-c7613f064403 req-eeff4001-3bd0-4182-a907-0dd32689160b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Refreshing instance network info cache due to event network-changed-2af53b80-5072-4407-80f0-88120c0351f7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.033 2 DEBUG oslo_concurrency.lockutils [req-0f2242bd-9dcc-4186-89fb-c7613f064403 req-eeff4001-3bd0-4182-a907-0dd32689160b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.070 2 DEBUG nova.network.neutron [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updating instance_info_cache with network_info: [{"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.194 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Releasing lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.196 2 DEBUG nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.197 2 INFO nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Creating image(s)#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.229 2 DEBUG nova.storage.rbd_utils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image be1174bf-d7e1-4801-a2eb-67020632d637_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.232 2 DEBUG nova.objects.instance [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'trusted_certs' on Instance uuid be1174bf-d7e1-4801-a2eb-67020632d637 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.233 2 DEBUG oslo_concurrency.lockutils [req-0f2242bd-9dcc-4186-89fb-c7613f064403 req-eeff4001-3bd0-4182-a907-0dd32689160b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.234 2 DEBUG nova.network.neutron [req-0f2242bd-9dcc-4186-89fb-c7613f064403 req-eeff4001-3bd0-4182-a907-0dd32689160b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Refreshing network info cache for port 2af53b80-5072-4407-80f0-88120c0351f7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.341 2 DEBUG nova.storage.rbd_utils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image be1174bf-d7e1-4801-a2eb-67020632d637_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:38:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:02.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.382 2 DEBUG nova.storage.rbd_utils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image be1174bf-d7e1-4801-a2eb-67020632d637_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.385 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "b1dea7fc799d5d6db6eb697392d92961fa892f91" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.386 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "b1dea7fc799d5d6db6eb697392d92961fa892f91" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.658 2 DEBUG nova.virt.libvirt.imagebackend [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Image locations are: [{'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/766f42a5-3020-41dc-a077-fda5642a1d60/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/766f42a5-3020-41dc-a077-fda5642a1d60/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.724 2 DEBUG nova.virt.libvirt.imagebackend [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Selected location: {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/766f42a5-3020-41dc-a077-fda5642a1d60/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Oct  2 08:38:02 np0005465988 nova_compute[236126]: 2025-10-02 12:38:02.725 2 DEBUG nova.storage.rbd_utils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] cloning images/766f42a5-3020-41dc-a077-fda5642a1d60@snap to None/be1174bf-d7e1-4801-a2eb-67020632d637_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:38:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:38:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:02.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:38:03 np0005465988 nova_compute[236126]: 2025-10-02 12:38:03.262 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "b1dea7fc799d5d6db6eb697392d92961fa892f91" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.877s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:03 np0005465988 nova_compute[236126]: 2025-10-02 12:38:03.446 2 DEBUG nova.objects.instance [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'migration_context' on Instance uuid be1174bf-d7e1-4801-a2eb-67020632d637 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:03 np0005465988 nova_compute[236126]: 2025-10-02 12:38:03.519 2 DEBUG nova.storage.rbd_utils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] flattening vms/be1174bf-d7e1-4801-a2eb-67020632d637_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:38:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:38:03 np0005465988 nova_compute[236126]: 2025-10-02 12:38:03.586 2 DEBUG nova.network.neutron [req-0f2242bd-9dcc-4186-89fb-c7613f064403 req-eeff4001-3bd0-4182-a907-0dd32689160b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updated VIF entry in instance network info cache for port 2af53b80-5072-4407-80f0-88120c0351f7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:38:03 np0005465988 nova_compute[236126]: 2025-10-02 12:38:03.587 2 DEBUG nova.network.neutron [req-0f2242bd-9dcc-4186-89fb-c7613f064403 req-eeff4001-3bd0-4182-a907-0dd32689160b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updating instance_info_cache with network_info: [{"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:03 np0005465988 nova_compute[236126]: 2025-10-02 12:38:03.605 2 DEBUG oslo_concurrency.lockutils [req-0f2242bd-9dcc-4186-89fb-c7613f064403 req-eeff4001-3bd0-4182-a907-0dd32689160b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:38:04 np0005465988 nova_compute[236126]: 2025-10-02 12:38:04.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000057s ======
Oct  2 08:38:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:04.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Oct  2 08:38:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:04.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:05 np0005465988 nova_compute[236126]: 2025-10-02 12:38:05.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:05 np0005465988 nova_compute[236126]: 2025-10-02 12:38:05.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:05 np0005465988 nova_compute[236126]: 2025-10-02 12:38:05.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:38:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:06.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:06 np0005465988 nova_compute[236126]: 2025-10-02 12:38:06.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:38:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:06.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.300 2 DEBUG nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Image rbd:vms/be1174bf-d7e1-4801-a2eb-67020632d637_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.301 2 DEBUG nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.302 2 DEBUG nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Ensure instance console log exists: /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.302 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.303 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.303 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.308 2 DEBUG nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Start _get_guest_xml network_info=[{"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2025-10-02T12:37:26Z,direct_url=<?>,disk_format='raw',id=766f42a5-3020-41dc-a077-fda5642a1d60,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-2076879726-shelved',owner='1a05e525420b4aa8adcc9561158e73d1',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-10-02T12:37:37Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.313 2 WARNING nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.321 2 DEBUG nova.virt.libvirt.host [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.322 2 DEBUG nova.virt.libvirt.host [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.326 2 DEBUG nova.virt.libvirt.host [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.327 2 DEBUG nova.virt.libvirt.host [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.329 2 DEBUG nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.329 2 DEBUG nova.virt.hardware [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2025-10-02T12:37:26Z,direct_url=<?>,disk_format='raw',id=766f42a5-3020-41dc-a077-fda5642a1d60,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-2076879726-shelved',owner='1a05e525420b4aa8adcc9561158e73d1',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-10-02T12:37:37Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.330 2 DEBUG nova.virt.hardware [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.331 2 DEBUG nova.virt.hardware [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.331 2 DEBUG nova.virt.hardware [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.332 2 DEBUG nova.virt.hardware [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.332 2 DEBUG nova.virt.hardware [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.333 2 DEBUG nova.virt.hardware [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.333 2 DEBUG nova.virt.hardware [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.334 2 DEBUG nova.virt.hardware [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.334 2 DEBUG nova.virt.hardware [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.335 2 DEBUG nova.virt.hardware [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.335 2 DEBUG nova.objects.instance [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'vcpu_model' on Instance uuid be1174bf-d7e1-4801-a2eb-67020632d637 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.373 2 DEBUG oslo_concurrency.processutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:38:07 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2084835883' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.871 2 DEBUG oslo_concurrency.processutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.902 2 DEBUG nova.storage.rbd_utils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image be1174bf-d7e1-4801-a2eb-67020632d637_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:38:07 np0005465988 nova_compute[236126]: 2025-10-02 12:38:07.907 2 DEBUG oslo_concurrency.processutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:38:08 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1445632556' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.382 2 DEBUG oslo_concurrency.processutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.385 2 DEBUG nova.virt.libvirt.vif [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:36:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2076879726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-2076879726',id=142,image_ref='766f42a5-3020-41dc-a077-fda5642a1d60',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-152732924',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='1a05e525420b4aa8adcc9561158e73d1',ramdisk_id='',reservation_id='r-hgcyfy0h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mo
del='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-405673070',owner_user_name='tempest-AttachVolumeShelveTestJSON-405673070-project-member',shelved_at='2025-10-02T12:37:38.046792',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='766f42a5-3020-41dc-a077-fda5642a1d60'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:37:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bcd36ab668f449959719ba7058f25e72',uuid=be1174bf-d7e1-4801-a2eb-67020632d637,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.386 2 DEBUG nova.network.os_vif_util [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converting VIF {"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:38:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:08.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.388 2 DEBUG nova.network.os_vif_util [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3b:23,bridge_name='br-int',has_traffic_filtering=True,id=2af53b80-5072-4407-80f0-88120c0351f7,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2af53b80-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.390 2 DEBUG nova.objects.instance [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'pci_devices' on Instance uuid be1174bf-d7e1-4801-a2eb-67020632d637 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.413 2 DEBUG nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  <uuid>be1174bf-d7e1-4801-a2eb-67020632d637</uuid>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  <name>instance-0000008e</name>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-2076879726</nova:name>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:38:07</nova:creationTime>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <nova:user uuid="bcd36ab668f449959719ba7058f25e72">tempest-AttachVolumeShelveTestJSON-405673070-project-member</nova:user>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <nova:project uuid="1a05e525420b4aa8adcc9561158e73d1">tempest-AttachVolumeShelveTestJSON-405673070</nova:project>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="766f42a5-3020-41dc-a077-fda5642a1d60"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <nova:port uuid="2af53b80-5072-4407-80f0-88120c0351f7">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <entry name="serial">be1174bf-d7e1-4801-a2eb-67020632d637</entry>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <entry name="uuid">be1174bf-d7e1-4801-a2eb-67020632d637</entry>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/be1174bf-d7e1-4801-a2eb-67020632d637_disk">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/be1174bf-d7e1-4801-a2eb-67020632d637_disk.config">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:c9:3b:23"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <target dev="tap2af53b80-50"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/console.log" append="off"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <input type="keyboard" bus="usb"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:38:08 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:38:08 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:38:08 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:38:08 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.416 2 DEBUG nova.compute.manager [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Preparing to wait for external event network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.416 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.417 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.417 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.418 2 DEBUG nova.virt.libvirt.vif [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:36:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2076879726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-2076879726',id=142,image_ref='766f42a5-3020-41dc-a077-fda5642a1d60',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-152732924',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='1a05e525420b4aa8adcc9561158e73d1',ramdisk_id='',reservation_id='r-hgcyfy0h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image
_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-405673070',owner_user_name='tempest-AttachVolumeShelveTestJSON-405673070-project-member',shelved_at='2025-10-02T12:37:38.046792',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='766f42a5-3020-41dc-a077-fda5642a1d60'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:37:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bcd36ab668f449959719ba7058f25e72',uuid=be1174bf-d7e1-4801-a2eb-67020632d637,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.419 2 DEBUG nova.network.os_vif_util [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converting VIF {"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.420 2 DEBUG nova.network.os_vif_util [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3b:23,bridge_name='br-int',has_traffic_filtering=True,id=2af53b80-5072-4407-80f0-88120c0351f7,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2af53b80-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.420 2 DEBUG os_vif [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3b:23,bridge_name='br-int',has_traffic_filtering=True,id=2af53b80-5072-4407-80f0-88120c0351f7,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2af53b80-50') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.422 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.423 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.428 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2af53b80-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.429 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2af53b80-50, col_values=(('external_ids', {'iface-id': '2af53b80-5072-4407-80f0-88120c0351f7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:3b:23', 'vm-uuid': 'be1174bf-d7e1-4801-a2eb-67020632d637'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:08 np0005465988 NetworkManager[45041]: <info>  [1759408688.4339] manager: (tap2af53b80-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/292)
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.441 2 INFO os_vif [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3b:23,bridge_name='br-int',has_traffic_filtering=True,id=2af53b80-5072-4407-80f0-88120c0351f7,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2af53b80-50')#033[00m
Oct  2 08:38:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:38:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.533 2 DEBUG nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.534 2 DEBUG nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.534 2 DEBUG nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] No VIF found with MAC fa:16:3e:c9:3b:23, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.535 2 INFO nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Using config drive#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.568 2 DEBUG nova.storage.rbd_utils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image be1174bf-d7e1-4801-a2eb-67020632d637_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:38:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.640 2 DEBUG nova.objects.instance [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'ec2_ids' on Instance uuid be1174bf-d7e1-4801-a2eb-67020632d637 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:08 np0005465988 nova_compute[236126]: 2025-10-02 12:38:08.682 2 DEBUG nova.objects.instance [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'keypairs' on Instance uuid be1174bf-d7e1-4801-a2eb-67020632d637 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:38:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:08.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.363 2 INFO nova.virt.libvirt.driver [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Deleting instance files /var/lib/nova/instances/c9f6d037-d76c-47e5-b6c4-9b36ef31f934_del#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.364 2 INFO nova.virt.libvirt.driver [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Deletion of /var/lib/nova/instances/c9f6d037-d76c-47e5-b6c4-9b36ef31f934_del complete#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.374 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408674.37367, c9f6d037-d76c-47e5-b6c4-9b36ef31f934 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.374 2 INFO nova.compute.manager [-] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.390 2 DEBUG nova.compute.manager [None req-729ba710-296a-4131-8f5b-07874d0af6b0 - - - - - -] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.395 2 DEBUG nova.compute.manager [None req-729ba710-296a-4131-8f5b-07874d0af6b0 - - - - - -] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: deleting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.420 2 INFO nova.compute.manager [None req-729ba710-296a-4131-8f5b-07874d0af6b0 - - - - - -] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.423 2 INFO nova.compute.manager [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Took 15.87 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.424 2 DEBUG oslo.service.loopingcall [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.424 2 DEBUG nova.compute.manager [-] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.424 2 DEBUG nova.network.neutron [-] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.488 2 INFO nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Creating config drive at /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/disk.config#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.495 2 DEBUG oslo_concurrency.processutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0590g_7g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.593 2 DEBUG nova.network.neutron [-] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.604 2 DEBUG nova.network.neutron [-] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.618 2 INFO nova.compute.manager [-] [instance: c9f6d037-d76c-47e5-b6c4-9b36ef31f934] Took 0.19 seconds to deallocate network for instance.#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.644 2 DEBUG oslo_concurrency.processutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0590g_7g" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.686 2 DEBUG nova.storage.rbd_utils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] rbd image be1174bf-d7e1-4801-a2eb-67020632d637_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.691 2 DEBUG oslo_concurrency.processutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/disk.config be1174bf-d7e1-4801-a2eb-67020632d637_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.744 2 DEBUG oslo_concurrency.lockutils [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.745 2 DEBUG oslo_concurrency.lockutils [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:09 np0005465988 nova_compute[236126]: 2025-10-02 12:38:09.842 2 DEBUG oslo_concurrency.processutils [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:38:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4040968926' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.266 2 DEBUG oslo_concurrency.processutils [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.272 2 DEBUG nova.compute.provider_tree [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.287 2 DEBUG nova.scheduler.client.report [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.308 2 DEBUG oslo_concurrency.lockutils [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.341 2 INFO nova.scheduler.client.report [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Deleted allocations for instance c9f6d037-d76c-47e5-b6c4-9b36ef31f934#033[00m
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.350 2 DEBUG oslo_concurrency.processutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/disk.config be1174bf-d7e1-4801-a2eb-67020632d637_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.659s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.351 2 INFO nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Deleting local config drive /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637/disk.config because it was imported into RBD.#033[00m
Oct  2 08:38:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:10.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:10 np0005465988 kernel: tap2af53b80-50: entered promiscuous mode
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.413 2 DEBUG oslo_concurrency.lockutils [None req-a64bfb19-04b4-4ef2-b241-23c049704683 f7d78f04152a425b81d486a834213a76 af873f6cea354e099a97c3b09ce8ca27 - - default default] Lock "c9f6d037-d76c-47e5-b6c4-9b36ef31f934" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 17.322s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:10 np0005465988 NetworkManager[45041]: <info>  [1759408690.4149] manager: (tap2af53b80-50): new Tun device (/org/freedesktop/NetworkManager/Devices/293)
Oct  2 08:38:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:10Z|00656|binding|INFO|Claiming lport 2af53b80-5072-4407-80f0-88120c0351f7 for this chassis.
Oct  2 08:38:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:10Z|00657|binding|INFO|2af53b80-5072-4407-80f0-88120c0351f7: Claiming fa:16:3e:c9:3b:23 10.100.0.3
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.466 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:3b:23 10.100.0.3'], port_security=['fa:16:3e:c9:3b:23 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'be1174bf-d7e1-4801-a2eb-67020632d637', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a05e525420b4aa8adcc9561158e73d1', 'neutron:revision_number': '7', 'neutron:security_group_ids': '13f3412c-42ad-420a-aa32-1b2881f511f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.198'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=709db70f-1209-49b9-bf90-2b91d986925d, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=2af53b80-5072-4407-80f0-88120c0351f7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.467 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 2af53b80-5072-4407-80f0-88120c0351f7 in datapath 7b216831-24ac-41f0-ac1c-99aae9bc897b bound to our chassis#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.469 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b216831-24ac-41f0-ac1c-99aae9bc897b#033[00m
Oct  2 08:38:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:10Z|00658|binding|INFO|Setting lport 2af53b80-5072-4407-80f0-88120c0351f7 ovn-installed in OVS
Oct  2 08:38:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:10Z|00659|binding|INFO|Setting lport 2af53b80-5072-4407-80f0-88120c0351f7 up in Southbound
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.483 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[97cdbd51-e157-44f1-bde7-35239d284a32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.484 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b216831-21 in ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.486 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b216831-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.486 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[be0f1303-9717-4e78-8365-7f02315f38ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.487 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e7341a62-f3e2-4170-960e-8c7f0d2cee66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 systemd-udevd[302345]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:38:10 np0005465988 systemd-machined[192594]: New machine qemu-68-instance-0000008e.
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.498 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[ee995642-674d-4d83-8509-270d6cf0691d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 NetworkManager[45041]: <info>  [1759408690.5095] device (tap2af53b80-50): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:38:10 np0005465988 NetworkManager[45041]: <info>  [1759408690.5106] device (tap2af53b80-50): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:38:10 np0005465988 systemd[1]: Started Virtual Machine qemu-68-instance-0000008e.
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.524 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[06da9f76-167e-4abf-a4b8-e72f1fd1a6a4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.550 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c1024f9f-9cbd-4f27-a11a-554a3aab1ce3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 NetworkManager[45041]: <info>  [1759408690.5574] manager: (tap7b216831-20): new Veth device (/org/freedesktop/NetworkManager/Devices/294)
Oct  2 08:38:10 np0005465988 systemd-udevd[302349]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.556 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[29de4e27-2410-4de3-896b-9ec521e4ac57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.590 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[63b8d8f3-138f-46a6-bc24-798efea7b4fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.592 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[be173b21-23b6-4250-955e-48324bcdeec6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.599 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:38:10 np0005465988 NetworkManager[45041]: <info>  [1759408690.6207] device (tap7b216831-20): carrier: link connected
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.631 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[b4b8d651-42d1-41c4-b247-914db5bacf8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.647 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2d0b80a9-1457-497a-9c1b-2b81c164924f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b216831-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:a4:15'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675095, 'reachable_time': 18888, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302377, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.661 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6a878d6b-05ed-45e1-8ec0-481d8ef9f5a2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:a415'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675095, 'tstamp': 675095}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 302378, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.685 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6601ebfc-6091-4cc9-9c28-ed80889b70cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b216831-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:a4:15'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675095, 'reachable_time': 18888, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 302379, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.719 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[36e518c3-3024-4dd8-8ce0-d272c464115e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.797 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cc23cf0e-3bc2-4c11-bf91-a1553e38c22f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.799 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b216831-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.800 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.800 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b216831-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:10 np0005465988 NetworkManager[45041]: <info>  [1759408690.8030] manager: (tap7b216831-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Oct  2 08:38:10 np0005465988 kernel: tap7b216831-20: entered promiscuous mode
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.813 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b216831-20, col_values=(('external_ids', {'iface-id': '7b6901ce-64cc-402d-847e-45c0d79bbb3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:10Z|00660|binding|INFO|Releasing lport 7b6901ce-64cc-402d-847e-45c0d79bbb3b from this chassis (sb_readonly=0)
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:10 np0005465988 nova_compute[236126]: 2025-10-02 12:38:10.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.834 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b216831-24ac-41f0-ac1c-99aae9bc897b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b216831-24ac-41f0-ac1c-99aae9bc897b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.836 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ea87e66b-868a-473a-9e40-9e4b07833438]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.836 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-7b216831-24ac-41f0-ac1c-99aae9bc897b
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/7b216831-24ac-41f0-ac1c-99aae9bc897b.pid.haproxy
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 7b216831-24ac-41f0-ac1c-99aae9bc897b
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:38:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:10.837 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'env', 'PROCESS_TAG=haproxy-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b216831-24ac-41f0-ac1c-99aae9bc897b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:38:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:10.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:11 np0005465988 podman[302454]: 2025-10-02 12:38:11.288392171 +0000 UTC m=+0.074709910 container create f43debc6bacedb17a05a67263f6d19ec404bcd8178b84137edab47a35a8bc797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:38:11 np0005465988 podman[302454]: 2025-10-02 12:38:11.248652558 +0000 UTC m=+0.034970317 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:38:11 np0005465988 systemd[1]: Started libpod-conmon-f43debc6bacedb17a05a67263f6d19ec404bcd8178b84137edab47a35a8bc797.scope.
Oct  2 08:38:11 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:38:11 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee2dc8b092953ec90516b3573505bf614228ecc99c484080c724ba96684e08cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:38:11 np0005465988 podman[302454]: 2025-10-02 12:38:11.409919935 +0000 UTC m=+0.196237724 container init f43debc6bacedb17a05a67263f6d19ec404bcd8178b84137edab47a35a8bc797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:38:11 np0005465988 podman[302454]: 2025-10-02 12:38:11.417354119 +0000 UTC m=+0.203671888 container start f43debc6bacedb17a05a67263f6d19ec404bcd8178b84137edab47a35a8bc797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:38:11 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[302469]: [NOTICE]   (302473) : New worker (302475) forked
Oct  2 08:38:11 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[302469]: [NOTICE]   (302473) : Loading success.
Oct  2 08:38:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:11.511 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.704 2 DEBUG nova.compute.manager [req-2810dea6-4fb0-4173-818b-d1cf22cfe757 req-fb116730-a804-4844-b6a5-54dcc6ea17b2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received event network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.705 2 DEBUG oslo_concurrency.lockutils [req-2810dea6-4fb0-4173-818b-d1cf22cfe757 req-fb116730-a804-4844-b6a5-54dcc6ea17b2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.705 2 DEBUG oslo_concurrency.lockutils [req-2810dea6-4fb0-4173-818b-d1cf22cfe757 req-fb116730-a804-4844-b6a5-54dcc6ea17b2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.706 2 DEBUG oslo_concurrency.lockutils [req-2810dea6-4fb0-4173-818b-d1cf22cfe757 req-fb116730-a804-4844-b6a5-54dcc6ea17b2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.706 2 DEBUG nova.compute.manager [req-2810dea6-4fb0-4173-818b-d1cf22cfe757 req-fb116730-a804-4844-b6a5-54dcc6ea17b2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Processing event network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.706 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408691.6999025, be1174bf-d7e1-4801-a2eb-67020632d637 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.707 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] VM Started (Lifecycle Event)#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.709 2 DEBUG nova.compute.manager [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.714 2 DEBUG nova.virt.libvirt.driver [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.721 2 INFO nova.virt.libvirt.driver [-] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Instance spawned successfully.#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.731 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.738 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.768 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.769 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408691.7055001, be1174bf-d7e1-4801-a2eb-67020632d637 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.770 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.789 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.795 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408691.7125368, be1174bf-d7e1-4801-a2eb-67020632d637 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.796 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.832 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.839 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:38:11 np0005465988 nova_compute[236126]: 2025-10-02 12:38:11.869 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:38:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:12.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:12.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e336 e336: 3 total, 3 up, 3 in
Oct  2 08:38:13 np0005465988 nova_compute[236126]: 2025-10-02 12:38:13.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:13 np0005465988 podman[302485]: 2025-10-02 12:38:13.553034658 +0000 UTC m=+0.066345850 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 08:38:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e336 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:38:13 np0005465988 nova_compute[236126]: 2025-10-02 12:38:13.834 2 DEBUG nova.compute.manager [req-046faa70-0d66-4311-a589-21fd103dcc1c req-93fc85be-5648-4920-902c-fd4408241df4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received event network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:13 np0005465988 nova_compute[236126]: 2025-10-02 12:38:13.835 2 DEBUG oslo_concurrency.lockutils [req-046faa70-0d66-4311-a589-21fd103dcc1c req-93fc85be-5648-4920-902c-fd4408241df4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:13 np0005465988 nova_compute[236126]: 2025-10-02 12:38:13.835 2 DEBUG oslo_concurrency.lockutils [req-046faa70-0d66-4311-a589-21fd103dcc1c req-93fc85be-5648-4920-902c-fd4408241df4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:13 np0005465988 nova_compute[236126]: 2025-10-02 12:38:13.836 2 DEBUG oslo_concurrency.lockutils [req-046faa70-0d66-4311-a589-21fd103dcc1c req-93fc85be-5648-4920-902c-fd4408241df4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:13 np0005465988 nova_compute[236126]: 2025-10-02 12:38:13.836 2 DEBUG nova.compute.manager [req-046faa70-0d66-4311-a589-21fd103dcc1c req-93fc85be-5648-4920-902c-fd4408241df4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] No waiting events found dispatching network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:38:13 np0005465988 nova_compute[236126]: 2025-10-02 12:38:13.837 2 WARNING nova.compute.manager [req-046faa70-0d66-4311-a589-21fd103dcc1c req-93fc85be-5648-4920-902c-fd4408241df4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received unexpected event network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 for instance with vm_state shelved_offloaded and task_state spawning.#033[00m
Oct  2 08:38:14 np0005465988 nova_compute[236126]: 2025-10-02 12:38:14.094 2 DEBUG nova.compute.manager [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:38:14 np0005465988 nova_compute[236126]: 2025-10-02 12:38:14.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:14 np0005465988 nova_compute[236126]: 2025-10-02 12:38:14.176 2 DEBUG oslo_concurrency.lockutils [None req-d73ec667-80d7-401e-a8a4-83aba40fef9f bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 15.482s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:14 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #54. Immutable memtables: 10.
Oct  2 08:38:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:14.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:14 np0005465988 nova_compute[236126]: 2025-10-02 12:38:14.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:14 np0005465988 nova_compute[236126]: 2025-10-02 12:38:14.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:38:14 np0005465988 nova_compute[236126]: 2025-10-02 12:38:14.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:38:14 np0005465988 nova_compute[236126]: 2025-10-02 12:38:14.723 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:38:14 np0005465988 nova_compute[236126]: 2025-10-02 12:38:14.724 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:38:14 np0005465988 nova_compute[236126]: 2025-10-02 12:38:14.724 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:38:14 np0005465988 nova_compute[236126]: 2025-10-02 12:38:14.724 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid be1174bf-d7e1-4801-a2eb-67020632d637 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:14.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:16 np0005465988 nova_compute[236126]: 2025-10-02 12:38:16.202 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updating instance_info_cache with network_info: [{"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:16 np0005465988 nova_compute[236126]: 2025-10-02 12:38:16.221 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-be1174bf-d7e1-4801-a2eb-67020632d637" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:38:16 np0005465988 nova_compute[236126]: 2025-10-02 12:38:16.221 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:38:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:16.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:16.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:38:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:18.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:38:18 np0005465988 nova_compute[236126]: 2025-10-02 12:38:18.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e336 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:38:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:18.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:19 np0005465988 nova_compute[236126]: 2025-10-02 12:38:19.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:20.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:20.514 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:20.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e337 e337: 3 total, 3 up, 3 in
Oct  2 08:38:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:22.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:38:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:22.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:38:23 np0005465988 nova_compute[236126]: 2025-10-02 12:38:23.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:38:24 np0005465988 nova_compute[236126]: 2025-10-02 12:38:24.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:24 np0005465988 nova_compute[236126]: 2025-10-02 12:38:24.216 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:38:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:24.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:38:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:24.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:25Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c9:3b:23 10.100.0.3
Oct  2 08:38:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:38:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:26.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:38:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:26.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:27.368 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:27.369 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:27.369 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:27 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:27Z|00661|binding|INFO|Releasing lport 7b6901ce-64cc-402d-847e-45c0d79bbb3b from this chassis (sb_readonly=0)
Oct  2 08:38:28 np0005465988 nova_compute[236126]: 2025-10-02 12:38:28.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:28.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:28 np0005465988 nova_compute[236126]: 2025-10-02 12:38:28.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:38:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:38:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:28.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:38:29 np0005465988 nova_compute[236126]: 2025-10-02 12:38:29.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:29 np0005465988 podman[302565]: 2025-10-02 12:38:29.674631189 +0000 UTC m=+0.059573365 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible)
Oct  2 08:38:29 np0005465988 podman[302563]: 2025-10-02 12:38:29.701186623 +0000 UTC m=+0.094527200 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:38:29 np0005465988 podman[302564]: 2025-10-02 12:38:29.701246714 +0000 UTC m=+0.092124320 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 08:38:30 np0005465988 nova_compute[236126]: 2025-10-02 12:38:30.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:30.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:30.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:31 np0005465988 nova_compute[236126]: 2025-10-02 12:38:31.514 2 DEBUG oslo_concurrency.lockutils [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:31 np0005465988 nova_compute[236126]: 2025-10-02 12:38:31.515 2 DEBUG oslo_concurrency.lockutils [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:31 np0005465988 nova_compute[236126]: 2025-10-02 12:38:31.516 2 DEBUG oslo_concurrency.lockutils [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:31 np0005465988 nova_compute[236126]: 2025-10-02 12:38:31.516 2 DEBUG oslo_concurrency.lockutils [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:31 np0005465988 nova_compute[236126]: 2025-10-02 12:38:31.517 2 DEBUG oslo_concurrency.lockutils [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:31 np0005465988 nova_compute[236126]: 2025-10-02 12:38:31.518 2 INFO nova.compute.manager [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Terminating instance#033[00m
Oct  2 08:38:31 np0005465988 nova_compute[236126]: 2025-10-02 12:38:31.519 2 DEBUG nova.compute.manager [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:38:32 np0005465988 kernel: tap2af53b80-50 (unregistering): left promiscuous mode
Oct  2 08:38:32 np0005465988 NetworkManager[45041]: <info>  [1759408712.2780] device (tap2af53b80-50): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:32Z|00662|binding|INFO|Releasing lport 2af53b80-5072-4407-80f0-88120c0351f7 from this chassis (sb_readonly=0)
Oct  2 08:38:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:32Z|00663|binding|INFO|Setting lport 2af53b80-5072-4407-80f0-88120c0351f7 down in Southbound
Oct  2 08:38:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:32Z|00664|binding|INFO|Removing iface tap2af53b80-50 ovn-installed in OVS
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:32.318 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:3b:23 10.100.0.3'], port_security=['fa:16:3e:c9:3b:23 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'be1174bf-d7e1-4801-a2eb-67020632d637', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a05e525420b4aa8adcc9561158e73d1', 'neutron:revision_number': '9', 'neutron:security_group_ids': '13f3412c-42ad-420a-aa32-1b2881f511f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.198', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=709db70f-1209-49b9-bf90-2b91d986925d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=2af53b80-5072-4407-80f0-88120c0351f7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:38:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:32.319 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 2af53b80-5072-4407-80f0-88120c0351f7 in datapath 7b216831-24ac-41f0-ac1c-99aae9bc897b unbound from our chassis#033[00m
Oct  2 08:38:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:32.321 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b216831-24ac-41f0-ac1c-99aae9bc897b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:38:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:32.324 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0752d78d-04da-488a-92de-e47d4c0cea84]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:32.325 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b namespace which is not needed anymore#033[00m
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:32 np0005465988 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000008e.scope: Deactivated successfully.
Oct  2 08:38:32 np0005465988 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000008e.scope: Consumed 14.602s CPU time.
Oct  2 08:38:32 np0005465988 systemd-machined[192594]: Machine qemu-68-instance-0000008e terminated.
Oct  2 08:38:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:32.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:32 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[302469]: [NOTICE]   (302473) : haproxy version is 2.8.14-c23fe91
Oct  2 08:38:32 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[302469]: [NOTICE]   (302473) : path to executable is /usr/sbin/haproxy
Oct  2 08:38:32 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[302469]: [WARNING]  (302473) : Exiting Master process...
Oct  2 08:38:32 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[302469]: [ALERT]    (302473) : Current worker (302475) exited with code 143 (Terminated)
Oct  2 08:38:32 np0005465988 neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b[302469]: [WARNING]  (302473) : All workers exited. Exiting... (0)
Oct  2 08:38:32 np0005465988 systemd[1]: libpod-f43debc6bacedb17a05a67263f6d19ec404bcd8178b84137edab47a35a8bc797.scope: Deactivated successfully.
Oct  2 08:38:32 np0005465988 podman[302656]: 2025-10-02 12:38:32.513947176 +0000 UTC m=+0.075781091 container died f43debc6bacedb17a05a67263f6d19ec404bcd8178b84137edab47a35a8bc797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.572 2 INFO nova.virt.libvirt.driver [-] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Instance destroyed successfully.#033[00m
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.573 2 DEBUG nova.objects.instance [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lazy-loading 'resources' on Instance uuid be1174bf-d7e1-4801-a2eb-67020632d637 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.595 2 DEBUG nova.virt.libvirt.vif [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:36:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2076879726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-2076879726',id=142,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA2KA3xcRPebgvgTazx0E34aPT9rxhs35D4g1Uzjz4PIuwR8cc5jli8pSQUOimuckXeWKODOH/ieI/CBtPk6/J+xp5vS+z3Hichw9+q7Uc2dLlyh1Q0msK0J8MjXf0hI6Q==',key_name='tempest-keypair-152732924',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:38:14Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1a05e525420b4aa8adcc9561158e73d1',ramdisk_id='',reservation_id='r-hgcyfy0h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-405673070',owner_user_name='tempest-AttachVolumeShelveTestJSON-405673070-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:38:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bcd36ab668f449959719ba7058f25e72',uuid=be1174bf-d7e1-4801-a2eb-67020632d637,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.596 2 DEBUG nova.network.os_vif_util [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converting VIF {"id": "2af53b80-5072-4407-80f0-88120c0351f7", "address": "fa:16:3e:c9:3b:23", "network": {"id": "7b216831-24ac-41f0-ac1c-99aae9bc897b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-116436084-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a05e525420b4aa8adcc9561158e73d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2af53b80-50", "ovs_interfaceid": "2af53b80-5072-4407-80f0-88120c0351f7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.596 2 DEBUG nova.network.os_vif_util [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c9:3b:23,bridge_name='br-int',has_traffic_filtering=True,id=2af53b80-5072-4407-80f0-88120c0351f7,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2af53b80-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.597 2 DEBUG os_vif [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:3b:23,bridge_name='br-int',has_traffic_filtering=True,id=2af53b80-5072-4407-80f0-88120c0351f7,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2af53b80-50') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.599 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2af53b80-50, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:32 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f43debc6bacedb17a05a67263f6d19ec404bcd8178b84137edab47a35a8bc797-userdata-shm.mount: Deactivated successfully.
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:38:32 np0005465988 systemd[1]: var-lib-containers-storage-overlay-ee2dc8b092953ec90516b3573505bf614228ecc99c484080c724ba96684e08cc-merged.mount: Deactivated successfully.
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.607 2 INFO os_vif [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:3b:23,bridge_name='br-int',has_traffic_filtering=True,id=2af53b80-5072-4407-80f0-88120c0351f7,network=Network(7b216831-24ac-41f0-ac1c-99aae9bc897b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2af53b80-50')#033[00m
Oct  2 08:38:32 np0005465988 podman[302656]: 2025-10-02 12:38:32.699299477 +0000 UTC m=+0.261133392 container cleanup f43debc6bacedb17a05a67263f6d19ec404bcd8178b84137edab47a35a8bc797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:38:32 np0005465988 systemd[1]: libpod-conmon-f43debc6bacedb17a05a67263f6d19ec404bcd8178b84137edab47a35a8bc797.scope: Deactivated successfully.
Oct  2 08:38:32 np0005465988 podman[302714]: 2025-10-02 12:38:32.920465018 +0000 UTC m=+0.193212078 container remove f43debc6bacedb17a05a67263f6d19ec404bcd8178b84137edab47a35a8bc797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:38:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:32.926 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec8329b-e902-4233-bc63-5a00ff3c4c14]: (4, ('Thu Oct  2 12:38:32 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b (f43debc6bacedb17a05a67263f6d19ec404bcd8178b84137edab47a35a8bc797)\nf43debc6bacedb17a05a67263f6d19ec404bcd8178b84137edab47a35a8bc797\nThu Oct  2 12:38:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b (f43debc6bacedb17a05a67263f6d19ec404bcd8178b84137edab47a35a8bc797)\nf43debc6bacedb17a05a67263f6d19ec404bcd8178b84137edab47a35a8bc797\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:32.928 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bd744639-2406-43ca-8a2e-6bcd2d1cd6c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:32.929 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b216831-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:32 np0005465988 kernel: tap7b216831-20: left promiscuous mode
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:32.936 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a4856f30-c2c1-4b7f-b058-ee28f77b5f0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:32.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:32 np0005465988 nova_compute[236126]: 2025-10-02 12:38:32.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:32.970 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[de4b3c28-57f6-4301-a63d-b93efe76fd95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:32.972 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[60098fae-ec6c-4220-b2af-cdb342a2be6c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:32.986 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4f46ec3b-f849-41f4-8593-a5eb0e253453]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675087, 'reachable_time': 20153, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302730, 'error': None, 'target': 'ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:32.988 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b216831-24ac-41f0-ac1c-99aae9bc897b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:38:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:32.988 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[f787caa8-6e40-4149-9075-f359360835dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:32 np0005465988 systemd[1]: run-netns-ovnmeta\x2d7b216831\x2d24ac\x2d41f0\x2dac1c\x2d99aae9bc897b.mount: Deactivated successfully.
Oct  2 08:38:33 np0005465988 nova_compute[236126]: 2025-10-02 12:38:33.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:38:33 np0005465988 nova_compute[236126]: 2025-10-02 12:38:33.723 2 DEBUG nova.compute.manager [req-7705122b-c2ca-4fd9-924b-939142bf8126 req-03f071ce-0ac6-404b-b962-e58da4ece8ee d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received event network-vif-unplugged-2af53b80-5072-4407-80f0-88120c0351f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:33 np0005465988 nova_compute[236126]: 2025-10-02 12:38:33.724 2 DEBUG oslo_concurrency.lockutils [req-7705122b-c2ca-4fd9-924b-939142bf8126 req-03f071ce-0ac6-404b-b962-e58da4ece8ee d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:33 np0005465988 nova_compute[236126]: 2025-10-02 12:38:33.724 2 DEBUG oslo_concurrency.lockutils [req-7705122b-c2ca-4fd9-924b-939142bf8126 req-03f071ce-0ac6-404b-b962-e58da4ece8ee d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:33 np0005465988 nova_compute[236126]: 2025-10-02 12:38:33.725 2 DEBUG oslo_concurrency.lockutils [req-7705122b-c2ca-4fd9-924b-939142bf8126 req-03f071ce-0ac6-404b-b962-e58da4ece8ee d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:33 np0005465988 nova_compute[236126]: 2025-10-02 12:38:33.725 2 DEBUG nova.compute.manager [req-7705122b-c2ca-4fd9-924b-939142bf8126 req-03f071ce-0ac6-404b-b962-e58da4ece8ee d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] No waiting events found dispatching network-vif-unplugged-2af53b80-5072-4407-80f0-88120c0351f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:38:33 np0005465988 nova_compute[236126]: 2025-10-02 12:38:33.725 2 DEBUG nova.compute.manager [req-7705122b-c2ca-4fd9-924b-939142bf8126 req-03f071ce-0ac6-404b-b962-e58da4ece8ee d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received event network-vif-unplugged-2af53b80-5072-4407-80f0-88120c0351f7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:38:34 np0005465988 nova_compute[236126]: 2025-10-02 12:38:34.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:34.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:34 np0005465988 nova_compute[236126]: 2025-10-02 12:38:34.627 2 INFO nova.virt.libvirt.driver [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Deleting instance files /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637_del#033[00m
Oct  2 08:38:34 np0005465988 nova_compute[236126]: 2025-10-02 12:38:34.629 2 INFO nova.virt.libvirt.driver [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Deletion of /var/lib/nova/instances/be1174bf-d7e1-4801-a2eb-67020632d637_del complete#033[00m
Oct  2 08:38:34 np0005465988 nova_compute[236126]: 2025-10-02 12:38:34.682 2 INFO nova.compute.manager [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Took 3.16 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:38:34 np0005465988 nova_compute[236126]: 2025-10-02 12:38:34.683 2 DEBUG oslo.service.loopingcall [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:38:34 np0005465988 nova_compute[236126]: 2025-10-02 12:38:34.683 2 DEBUG nova.compute.manager [-] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:38:34 np0005465988 nova_compute[236126]: 2025-10-02 12:38:34.683 2 DEBUG nova.network.neutron [-] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:38:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:34.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:35 np0005465988 nova_compute[236126]: 2025-10-02 12:38:35.773 2 DEBUG nova.network.neutron [-] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:35 np0005465988 nova_compute[236126]: 2025-10-02 12:38:35.803 2 INFO nova.compute.manager [-] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Took 1.12 seconds to deallocate network for instance.#033[00m
Oct  2 08:38:35 np0005465988 nova_compute[236126]: 2025-10-02 12:38:35.866 2 DEBUG oslo_concurrency.lockutils [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:35 np0005465988 nova_compute[236126]: 2025-10-02 12:38:35.867 2 DEBUG oslo_concurrency.lockutils [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:35 np0005465988 nova_compute[236126]: 2025-10-02 12:38:35.889 2 DEBUG nova.compute.manager [req-6a080b47-9b5a-4b8b-9995-dfc8380a42d8 req-f496dedb-7de6-41f7-be87-1d46a68e0420 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received event network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:35 np0005465988 nova_compute[236126]: 2025-10-02 12:38:35.890 2 DEBUG oslo_concurrency.lockutils [req-6a080b47-9b5a-4b8b-9995-dfc8380a42d8 req-f496dedb-7de6-41f7-be87-1d46a68e0420 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:35 np0005465988 nova_compute[236126]: 2025-10-02 12:38:35.891 2 DEBUG oslo_concurrency.lockutils [req-6a080b47-9b5a-4b8b-9995-dfc8380a42d8 req-f496dedb-7de6-41f7-be87-1d46a68e0420 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:35 np0005465988 nova_compute[236126]: 2025-10-02 12:38:35.891 2 DEBUG oslo_concurrency.lockutils [req-6a080b47-9b5a-4b8b-9995-dfc8380a42d8 req-f496dedb-7de6-41f7-be87-1d46a68e0420 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:35 np0005465988 nova_compute[236126]: 2025-10-02 12:38:35.891 2 DEBUG nova.compute.manager [req-6a080b47-9b5a-4b8b-9995-dfc8380a42d8 req-f496dedb-7de6-41f7-be87-1d46a68e0420 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] No waiting events found dispatching network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:38:35 np0005465988 nova_compute[236126]: 2025-10-02 12:38:35.892 2 WARNING nova.compute.manager [req-6a080b47-9b5a-4b8b-9995-dfc8380a42d8 req-f496dedb-7de6-41f7-be87-1d46a68e0420 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received unexpected event network-vif-plugged-2af53b80-5072-4407-80f0-88120c0351f7 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:38:35 np0005465988 nova_compute[236126]: 2025-10-02 12:38:35.963 2 DEBUG nova.compute.manager [req-081d2bac-3997-41c2-851b-3e7ec651caae req-6f9b8a4b-d7be-4d66-8640-76cc383b2997 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Received event network-vif-deleted-2af53b80-5072-4407-80f0-88120c0351f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:35 np0005465988 nova_compute[236126]: 2025-10-02 12:38:35.965 2 DEBUG oslo_concurrency.processutils [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:36.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:38:36 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3882477945' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:38:36 np0005465988 nova_compute[236126]: 2025-10-02 12:38:36.482 2 DEBUG oslo_concurrency.processutils [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:36 np0005465988 nova_compute[236126]: 2025-10-02 12:38:36.488 2 DEBUG nova.compute.provider_tree [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:38:36 np0005465988 nova_compute[236126]: 2025-10-02 12:38:36.508 2 DEBUG nova.scheduler.client.report [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:38:36 np0005465988 nova_compute[236126]: 2025-10-02 12:38:36.536 2 DEBUG oslo_concurrency.lockutils [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.669s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:36 np0005465988 nova_compute[236126]: 2025-10-02 12:38:36.586 2 INFO nova.scheduler.client.report [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Deleted allocations for instance be1174bf-d7e1-4801-a2eb-67020632d637#033[00m
Oct  2 08:38:36 np0005465988 nova_compute[236126]: 2025-10-02 12:38:36.667 2 DEBUG oslo_concurrency.lockutils [None req-b22aec99-9b76-45ea-b443-d76880207305 bcd36ab668f449959719ba7058f25e72 1a05e525420b4aa8adcc9561158e73d1 - - default default] Lock "be1174bf-d7e1-4801-a2eb-67020632d637" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:36.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:37 np0005465988 nova_compute[236126]: 2025-10-02 12:38:37.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:37 np0005465988 nova_compute[236126]: 2025-10-02 12:38:37.847 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:37 np0005465988 nova_compute[236126]: 2025-10-02 12:38:37.847 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:37 np0005465988 nova_compute[236126]: 2025-10-02 12:38:37.914 2 DEBUG nova.compute.manager [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:38:37 np0005465988 nova_compute[236126]: 2025-10-02 12:38:37.984 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:37 np0005465988 nova_compute[236126]: 2025-10-02 12:38:37.984 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:37 np0005465988 nova_compute[236126]: 2025-10-02 12:38:37.992 2 DEBUG nova.virt.hardware [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:38:37 np0005465988 nova_compute[236126]: 2025-10-02 12:38:37.992 2 INFO nova.compute.claims [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:38:38 np0005465988 nova_compute[236126]: 2025-10-02 12:38:38.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:38 np0005465988 nova_compute[236126]: 2025-10-02 12:38:38.175 2 DEBUG oslo_concurrency.processutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:38:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:38.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:38:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:38:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:38:38 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1546098081' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:38:38 np0005465988 nova_compute[236126]: 2025-10-02 12:38:38.635 2 DEBUG oslo_concurrency.processutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:38 np0005465988 nova_compute[236126]: 2025-10-02 12:38:38.642 2 DEBUG nova.compute.provider_tree [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:38:38 np0005465988 nova_compute[236126]: 2025-10-02 12:38:38.683 2 DEBUG nova.scheduler.client.report [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:38:38 np0005465988 nova_compute[236126]: 2025-10-02 12:38:38.811 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:38 np0005465988 nova_compute[236126]: 2025-10-02 12:38:38.813 2 DEBUG nova.compute.manager [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:38:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:38.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:38 np0005465988 nova_compute[236126]: 2025-10-02 12:38:38.974 2 DEBUG nova.compute.manager [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:38:38 np0005465988 nova_compute[236126]: 2025-10-02 12:38:38.975 2 DEBUG nova.network.neutron [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.114 2 INFO nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.153 2 DEBUG nova.compute.manager [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.254 2 DEBUG nova.compute.manager [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.256 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.256 2 INFO nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Creating image(s)#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.293 2 DEBUG nova.storage.rbd_utils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] rbd image 45079fff-1c54-42d6-921b-150592757d59_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.324 2 DEBUG nova.storage.rbd_utils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] rbd image 45079fff-1c54-42d6-921b-150592757d59_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.354 2 DEBUG nova.storage.rbd_utils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] rbd image 45079fff-1c54-42d6-921b-150592757d59_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.358 2 DEBUG oslo_concurrency.processutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.403 2 DEBUG nova.policy [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c02d1dcc10ea4e57bbc6b7a3c100dc7b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6822f02d5ca04c659329a75d487054cf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.450 2 DEBUG oslo_concurrency.processutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.451 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.452 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.452 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.480 2 DEBUG nova.storage.rbd_utils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] rbd image 45079fff-1c54-42d6-921b-150592757d59_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.484 2 DEBUG oslo_concurrency.processutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 45079fff-1c54-42d6-921b-150592757d59_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:39 np0005465988 nova_compute[236126]: 2025-10-02 12:38:39.991 2 DEBUG oslo_concurrency.processutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 45079fff-1c54-42d6-921b-150592757d59_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:40 np0005465988 nova_compute[236126]: 2025-10-02 12:38:40.084 2 DEBUG nova.storage.rbd_utils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] resizing rbd image 45079fff-1c54-42d6-921b-150592757d59_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:38:40 np0005465988 nova_compute[236126]: 2025-10-02 12:38:40.214 2 DEBUG nova.objects.instance [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lazy-loading 'migration_context' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:40 np0005465988 nova_compute[236126]: 2025-10-02 12:38:40.237 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:38:40 np0005465988 nova_compute[236126]: 2025-10-02 12:38:40.237 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Ensure instance console log exists: /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:38:40 np0005465988 nova_compute[236126]: 2025-10-02 12:38:40.238 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:40 np0005465988 nova_compute[236126]: 2025-10-02 12:38:40.239 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:40 np0005465988 nova_compute[236126]: 2025-10-02 12:38:40.239 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:40 np0005465988 nova_compute[236126]: 2025-10-02 12:38:40.323 2 DEBUG nova.network.neutron [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Successfully created port: fbca287a-897d-4532-bd1a-8bd100ed84e5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:38:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:40.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:38:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:40.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:38:41 np0005465988 nova_compute[236126]: 2025-10-02 12:38:41.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:41 np0005465988 nova_compute[236126]: 2025-10-02 12:38:41.681 2 DEBUG nova.network.neutron [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Successfully updated port: fbca287a-897d-4532-bd1a-8bd100ed84e5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:38:41 np0005465988 nova_compute[236126]: 2025-10-02 12:38:41.700 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "refresh_cache-45079fff-1c54-42d6-921b-150592757d59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:38:41 np0005465988 nova_compute[236126]: 2025-10-02 12:38:41.700 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquired lock "refresh_cache-45079fff-1c54-42d6-921b-150592757d59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:38:41 np0005465988 nova_compute[236126]: 2025-10-02 12:38:41.701 2 DEBUG nova.network.neutron [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:38:41 np0005465988 nova_compute[236126]: 2025-10-02 12:38:41.781 2 DEBUG nova.compute.manager [req-2222c20a-70c0-47a9-94f6-62d7c08bca58 req-c85a0fd2-638d-4dcc-aa48-c0c1e196e152 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received event network-changed-fbca287a-897d-4532-bd1a-8bd100ed84e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:41 np0005465988 nova_compute[236126]: 2025-10-02 12:38:41.781 2 DEBUG nova.compute.manager [req-2222c20a-70c0-47a9-94f6-62d7c08bca58 req-c85a0fd2-638d-4dcc-aa48-c0c1e196e152 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Refreshing instance network info cache due to event network-changed-fbca287a-897d-4532-bd1a-8bd100ed84e5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:38:41 np0005465988 nova_compute[236126]: 2025-10-02 12:38:41.782 2 DEBUG oslo_concurrency.lockutils [req-2222c20a-70c0-47a9-94f6-62d7c08bca58 req-c85a0fd2-638d-4dcc-aa48-c0c1e196e152 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-45079fff-1c54-42d6-921b-150592757d59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:38:41 np0005465988 nova_compute[236126]: 2025-10-02 12:38:41.895 2 DEBUG nova.network.neutron [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:38:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:38:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:42.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.888 2 DEBUG nova.network.neutron [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Updating instance_info_cache with network_info: [{"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.911 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Releasing lock "refresh_cache-45079fff-1c54-42d6-921b-150592757d59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.912 2 DEBUG nova.compute.manager [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Instance network_info: |[{"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.912 2 DEBUG oslo_concurrency.lockutils [req-2222c20a-70c0-47a9-94f6-62d7c08bca58 req-c85a0fd2-638d-4dcc-aa48-c0c1e196e152 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-45079fff-1c54-42d6-921b-150592757d59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.912 2 DEBUG nova.network.neutron [req-2222c20a-70c0-47a9-94f6-62d7c08bca58 req-c85a0fd2-638d-4dcc-aa48-c0c1e196e152 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Refreshing network info cache for port fbca287a-897d-4532-bd1a-8bd100ed84e5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.915 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Start _get_guest_xml network_info=[{"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.922 2 WARNING nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.928 2 DEBUG nova.virt.libvirt.host [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.929 2 DEBUG nova.virt.libvirt.host [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.944 2 DEBUG nova.virt.libvirt.host [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.945 2 DEBUG nova.virt.libvirt.host [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.949 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.949 2 DEBUG nova.virt.hardware [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.950 2 DEBUG nova.virt.hardware [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.951 2 DEBUG nova.virt.hardware [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.951 2 DEBUG nova.virt.hardware [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.952 2 DEBUG nova.virt.hardware [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.952 2 DEBUG nova.virt.hardware [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.953 2 DEBUG nova.virt.hardware [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.954 2 DEBUG nova.virt.hardware [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.954 2 DEBUG nova.virt.hardware [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.955 2 DEBUG nova.virt.hardware [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.955 2 DEBUG nova.virt.hardware [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:38:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:42.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:42 np0005465988 nova_compute[236126]: 2025-10-02 12:38:42.961 2 DEBUG oslo_concurrency.processutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:38:43 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1066104501' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:38:43 np0005465988 nova_compute[236126]: 2025-10-02 12:38:43.417 2 DEBUG oslo_concurrency.processutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:43 np0005465988 nova_compute[236126]: 2025-10-02 12:38:43.444 2 DEBUG nova.storage.rbd_utils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] rbd image 45079fff-1c54-42d6-921b-150592757d59_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:38:43 np0005465988 nova_compute[236126]: 2025-10-02 12:38:43.450 2 DEBUG oslo_concurrency.processutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:38:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:38:43 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1558871007' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:38:43 np0005465988 nova_compute[236126]: 2025-10-02 12:38:43.991 2 DEBUG oslo_concurrency.processutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.542s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:43 np0005465988 nova_compute[236126]: 2025-10-02 12:38:43.993 2 DEBUG nova.virt.libvirt.vif [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:38:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1744856244',display_name='tempest-tempest.common.compute-instance-1744856244',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1744856244',id=149,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB901wljkng8woQCRBVLpBhg2UNOo8jpxGS+GKRpCYtBXlVGSdH5T4M1H660T/Pgy6wslJmiip7oPjF07ONt3YODwmkR9Q/tCQnjaZPJ2wEKdN/e6CKDf4DgL6XjoOZqHA==',key_name='tempest-keypair-1625658847',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6822f02d5ca04c659329a75d487054cf',ramdisk_id='',reservation_id='r-6ppa0vgd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1680083910',owner_user_name='tempest-ServerActionsTestOtherA-1680083910-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:38:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c02d1dcc10ea4e57bbc6b7a3c100dc7b',uuid=45079fff-1c54-42d6-921b-150592757d59,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:38:43 np0005465988 nova_compute[236126]: 2025-10-02 12:38:43.994 2 DEBUG nova.network.os_vif_util [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Converting VIF {"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:43 np0005465988 nova_compute[236126]: 2025-10-02 12:38:43.995 2 DEBUG nova.network.os_vif_util [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:e8:ba,bridge_name='br-int',has_traffic_filtering=True,id=fbca287a-897d-4532-bd1a-8bd100ed84e5,network=Network(00455285-97a7-4fa2-ba83-e8060936877e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbca287a-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:43 np0005465988 nova_compute[236126]: 2025-10-02 12:38:43.996 2 DEBUG nova.objects.instance [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lazy-loading 'pci_devices' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.022 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  <uuid>45079fff-1c54-42d6-921b-150592757d59</uuid>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  <name>instance-00000095</name>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <nova:name>tempest-tempest.common.compute-instance-1744856244</nova:name>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:38:42</nova:creationTime>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <nova:user uuid="c02d1dcc10ea4e57bbc6b7a3c100dc7b">tempest-ServerActionsTestOtherA-1680083910-project-member</nova:user>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <nova:project uuid="6822f02d5ca04c659329a75d487054cf">tempest-ServerActionsTestOtherA-1680083910</nova:project>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <nova:port uuid="fbca287a-897d-4532-bd1a-8bd100ed84e5">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <entry name="serial">45079fff-1c54-42d6-921b-150592757d59</entry>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <entry name="uuid">45079fff-1c54-42d6-921b-150592757d59</entry>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/45079fff-1c54-42d6-921b-150592757d59_disk">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/45079fff-1c54-42d6-921b-150592757d59_disk.config">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:1f:e8:ba"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <target dev="tapfbca287a-89"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/console.log" append="off"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:38:44 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:38:44 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:38:44 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:38:44 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.024 2 DEBUG nova.compute.manager [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Preparing to wait for external event network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.025 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.025 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.025 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.026 2 DEBUG nova.virt.libvirt.vif [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:38:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1744856244',display_name='tempest-tempest.common.compute-instance-1744856244',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1744856244',id=149,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB901wljkng8woQCRBVLpBhg2UNOo8jpxGS+GKRpCYtBXlVGSdH5T4M1H660T/Pgy6wslJmiip7oPjF07ONt3YODwmkR9Q/tCQnjaZPJ2wEKdN/e6CKDf4DgL6XjoOZqHA==',key_name='tempest-keypair-1625658847',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6822f02d5ca04c659329a75d487054cf',ramdisk_id='',reservation_id='r-6ppa0vgd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1680083910',owner_user_name='tempest-ServerActionsTestOtherA-1680083910-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:38:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c02d1dcc10ea4e57bbc6b7a3c100dc7b',uuid=45079fff-1c54-42d6-921b-150592757d59,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.027 2 DEBUG nova.network.os_vif_util [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Converting VIF {"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.028 2 DEBUG nova.network.os_vif_util [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:e8:ba,bridge_name='br-int',has_traffic_filtering=True,id=fbca287a-897d-4532-bd1a-8bd100ed84e5,network=Network(00455285-97a7-4fa2-ba83-e8060936877e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbca287a-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.028 2 DEBUG os_vif [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:e8:ba,bridge_name='br-int',has_traffic_filtering=True,id=fbca287a-897d-4532-bd1a-8bd100ed84e5,network=Network(00455285-97a7-4fa2-ba83-e8060936877e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbca287a-89') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.029 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.030 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.035 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbca287a-89, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.036 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfbca287a-89, col_values=(('external_ids', {'iface-id': 'fbca287a-897d-4532-bd1a-8bd100ed84e5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:e8:ba', 'vm-uuid': '45079fff-1c54-42d6-921b-150592757d59'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:44 np0005465988 NetworkManager[45041]: <info>  [1759408724.0934] manager: (tapfbca287a-89): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/296)
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.100 2 INFO os_vif [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:e8:ba,bridge_name='br-int',has_traffic_filtering=True,id=fbca287a-897d-4532-bd1a-8bd100ed84e5,network=Network(00455285-97a7-4fa2-ba83-e8060936877e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbca287a-89')#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.171 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.172 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.172 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] No VIF found with MAC fa:16:3e:1f:e8:ba, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.173 2 INFO nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Using config drive#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.211 2 DEBUG nova.storage.rbd_utils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] rbd image 45079fff-1c54-42d6-921b-150592757d59_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:38:44 np0005465988 podman[303011]: 2025-10-02 12:38:44.235266525 +0000 UTC m=+0.084606574 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.296 2 DEBUG nova.network.neutron [req-2222c20a-70c0-47a9-94f6-62d7c08bca58 req-c85a0fd2-638d-4dcc-aa48-c0c1e196e152 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Updated VIF entry in instance network info cache for port fbca287a-897d-4532-bd1a-8bd100ed84e5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.297 2 DEBUG nova.network.neutron [req-2222c20a-70c0-47a9-94f6-62d7c08bca58 req-c85a0fd2-638d-4dcc-aa48-c0c1e196e152 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Updating instance_info_cache with network_info: [{"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.316 2 DEBUG oslo_concurrency.lockutils [req-2222c20a-70c0-47a9-94f6-62d7c08bca58 req-c85a0fd2-638d-4dcc-aa48-c0c1e196e152 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-45079fff-1c54-42d6-921b-150592757d59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:38:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:44.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:44.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.993 2 INFO nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Creating config drive at /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/disk.config#033[00m
Oct  2 08:38:44 np0005465988 nova_compute[236126]: 2025-10-02 12:38:44.998 2 DEBUG oslo_concurrency.processutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo3ti_03j execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:45 np0005465988 nova_compute[236126]: 2025-10-02 12:38:45.139 2 DEBUG oslo_concurrency.processutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo3ti_03j" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:45 np0005465988 nova_compute[236126]: 2025-10-02 12:38:45.165 2 DEBUG nova.storage.rbd_utils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] rbd image 45079fff-1c54-42d6-921b-150592757d59_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:38:45 np0005465988 nova_compute[236126]: 2025-10-02 12:38:45.170 2 DEBUG oslo_concurrency.processutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/disk.config 45079fff-1c54-42d6-921b-150592757d59_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:45 np0005465988 nova_compute[236126]: 2025-10-02 12:38:45.923 2 DEBUG oslo_concurrency.processutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/disk.config 45079fff-1c54-42d6-921b-150592757d59_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.753s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:45 np0005465988 nova_compute[236126]: 2025-10-02 12:38:45.924 2 INFO nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Deleting local config drive /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/disk.config because it was imported into RBD.#033[00m
Oct  2 08:38:45 np0005465988 kernel: tapfbca287a-89: entered promiscuous mode
Oct  2 08:38:45 np0005465988 NetworkManager[45041]: <info>  [1759408725.9835] manager: (tapfbca287a-89): new Tun device (/org/freedesktop/NetworkManager/Devices/297)
Oct  2 08:38:45 np0005465988 nova_compute[236126]: 2025-10-02 12:38:45.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:45 np0005465988 nova_compute[236126]: 2025-10-02 12:38:45.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:45Z|00665|binding|INFO|Claiming lport fbca287a-897d-4532-bd1a-8bd100ed84e5 for this chassis.
Oct  2 08:38:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:45Z|00666|binding|INFO|fbca287a-897d-4532-bd1a-8bd100ed84e5: Claiming fa:16:3e:1f:e8:ba 10.100.0.9
Oct  2 08:38:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:45.995 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:e8:ba 10.100.0.9'], port_security=['fa:16:3e:1f:e8:ba 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '45079fff-1c54-42d6-921b-150592757d59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00455285-97a7-4fa2-ba83-e8060936877e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6822f02d5ca04c659329a75d487054cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6969c0ae-a584-46fb-9098-5fddbc560ddc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f978d0a7-f86b-440f-a8b5-5432c3a4bc91, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=fbca287a-897d-4532-bd1a-8bd100ed84e5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:38:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:45.997 142124 INFO neutron.agent.ovn.metadata.agent [-] Port fbca287a-897d-4532-bd1a-8bd100ed84e5 in datapath 00455285-97a7-4fa2-ba83-e8060936877e bound to our chassis
Oct  2 08:38:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:45.999 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 00455285-97a7-4fa2-ba83-e8060936877e
Oct  2 08:38:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:46Z|00667|binding|INFO|Setting lport fbca287a-897d-4532-bd1a-8bd100ed84e5 ovn-installed in OVS
Oct  2 08:38:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:46Z|00668|binding|INFO|Setting lport fbca287a-897d-4532-bd1a-8bd100ed84e5 up in Southbound
Oct  2 08:38:46 np0005465988 nova_compute[236126]: 2025-10-02 12:38:46.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:46 np0005465988 nova_compute[236126]: 2025-10-02 12:38:46.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:46 np0005465988 systemd-udevd[303103]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.015 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[77a67f9e-6e35-4945-b94f-33c3b765cd7e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.019 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap00455285-91 in ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.022 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap00455285-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.022 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0f5e7e4a-a6d4-4531-9c28-69282feb12e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.023 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[29a4322f-a384-4a14-bc73-053d19497e34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 systemd-machined[192594]: New machine qemu-69-instance-00000095.
Oct  2 08:38:46 np0005465988 NetworkManager[45041]: <info>  [1759408726.0318] device (tapfbca287a-89): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:38:46 np0005465988 NetworkManager[45041]: <info>  [1759408726.0327] device (tapfbca287a-89): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:38:46 np0005465988 systemd[1]: Started Virtual Machine qemu-69-instance-00000095.
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.040 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[c29dfa57-5106-4a10-8fe0-46f32d926abe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.071 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6ad5a964-5392-47b7-bd0c-755f252f7ac2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.099 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[0386d836-4975-4b25-b30e-2615a056c5ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 NetworkManager[45041]: <info>  [1759408726.1052] manager: (tap00455285-90): new Veth device (/org/freedesktop/NetworkManager/Devices/298)
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.104 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d89f14e2-5ec1-42fd-8b00-e30294aa1239]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.144 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[b402ddfb-b232-4471-90f2-9ca1e363626d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.147 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[fbaad372-6ac2-4c94-a4ea-4b9cd54697e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 NetworkManager[45041]: <info>  [1759408726.1760] device (tap00455285-90): carrier: link connected
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.181 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[bb51fe9a-f3a6-4b0d-8098-64ef0b3ba8fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.197 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b1b00e40-271d-48f7-b743-5e9689a94a10]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00455285-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:8a:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678650, 'reachable_time': 28231, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 303136, 'error': None, 'target': 'ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.211 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[995e2d18-acc1-4ba2-8290-092fa08e8b8a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef6:8a3c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 678650, 'tstamp': 678650}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 303137, 'error': None, 'target': 'ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.227 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4b9bef95-5d63-4d8a-b4a8-dfc034097eba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00455285-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:8a:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678650, 'reachable_time': 28231, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 303138, 'error': None, 'target': 'ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.261 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[26697439-6bf3-4649-be05-0d00da5093e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.386 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c10f3209-e20d-45cd-a71c-f28404a1d16e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.389 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00455285-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.390 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.391 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00455285-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:38:46 np0005465988 NetworkManager[45041]: <info>  [1759408726.3939] manager: (tap00455285-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/299)
Oct  2 08:38:46 np0005465988 kernel: tap00455285-90: entered promiscuous mode
Oct  2 08:38:46 np0005465988 nova_compute[236126]: 2025-10-02 12:38:46.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:46 np0005465988 nova_compute[236126]: 2025-10-02 12:38:46.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.399 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap00455285-90, col_values=(('external_ids', {'iface-id': '293fb87a-10df-4698-a69e-3023bca5a6a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:38:46 np0005465988 nova_compute[236126]: 2025-10-02 12:38:46.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:46Z|00669|binding|INFO|Releasing lport 293fb87a-10df-4698-a69e-3023bca5a6a3 from this chassis (sb_readonly=0)
Oct  2 08:38:46 np0005465988 nova_compute[236126]: 2025-10-02 12:38:46.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.403 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00455285-97a7-4fa2-ba83-e8060936877e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00455285-97a7-4fa2-ba83-e8060936877e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.404 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6ccb80b5-b5c4-4f79-abd5-f665d206bb37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.405 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-00455285-97a7-4fa2-ba83-e8060936877e
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/00455285-97a7-4fa2-ba83-e8060936877e.pid.haproxy
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 00455285-97a7-4fa2-ba83-e8060936877e
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  2 08:38:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:38:46.407 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e', 'env', 'PROCESS_TAG=haproxy-00455285-97a7-4fa2-ba83-e8060936877e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/00455285-97a7-4fa2-ba83-e8060936877e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  2 08:38:46 np0005465988 nova_compute[236126]: 2025-10-02 12:38:46.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:46.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:46 np0005465988 podman[303212]: 2025-10-02 12:38:46.825273801 +0000 UTC m=+0.054581351 container create e3175009eb8d539997012ff6d891c3ed01ab2810d9cd7424f4f95d095f449844 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:38:46 np0005465988 nova_compute[236126]: 2025-10-02 12:38:46.835 2 DEBUG nova.compute.manager [req-192024b0-21cd-4c61-9723-8ac5ebf5e859 req-04ac1209-781c-43a3-8082-fbe232fcff55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received event network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:38:46 np0005465988 nova_compute[236126]: 2025-10-02 12:38:46.836 2 DEBUG oslo_concurrency.lockutils [req-192024b0-21cd-4c61-9723-8ac5ebf5e859 req-04ac1209-781c-43a3-8082-fbe232fcff55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:38:46 np0005465988 nova_compute[236126]: 2025-10-02 12:38:46.836 2 DEBUG oslo_concurrency.lockutils [req-192024b0-21cd-4c61-9723-8ac5ebf5e859 req-04ac1209-781c-43a3-8082-fbe232fcff55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:38:46 np0005465988 nova_compute[236126]: 2025-10-02 12:38:46.837 2 DEBUG oslo_concurrency.lockutils [req-192024b0-21cd-4c61-9723-8ac5ebf5e859 req-04ac1209-781c-43a3-8082-fbe232fcff55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:38:46 np0005465988 nova_compute[236126]: 2025-10-02 12:38:46.837 2 DEBUG nova.compute.manager [req-192024b0-21cd-4c61-9723-8ac5ebf5e859 req-04ac1209-781c-43a3-8082-fbe232fcff55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Processing event network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:38:46 np0005465988 systemd[1]: Started libpod-conmon-e3175009eb8d539997012ff6d891c3ed01ab2810d9cd7424f4f95d095f449844.scope.
Oct  2 08:38:46 np0005465988 podman[303212]: 2025-10-02 12:38:46.793742424 +0000 UTC m=+0.023049994 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:38:46 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:38:46 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6de903c98d5e64123f7861ad16816be41f2dbd058bd04b839b4935e36ce11880/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:38:46 np0005465988 podman[303212]: 2025-10-02 12:38:46.915068393 +0000 UTC m=+0.144375963 container init e3175009eb8d539997012ff6d891c3ed01ab2810d9cd7424f4f95d095f449844 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:38:46 np0005465988 podman[303212]: 2025-10-02 12:38:46.919784689 +0000 UTC m=+0.149092239 container start e3175009eb8d539997012ff6d891c3ed01ab2810d9cd7424f4f95d095f449844 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:38:46 np0005465988 neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e[303228]: [NOTICE]   (303232) : New worker (303234) forked
Oct  2 08:38:46 np0005465988 neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e[303228]: [NOTICE]   (303232) : Loading success.
Oct  2 08:38:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:38:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:46.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.196 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408727.1962903, 45079fff-1c54-42d6-921b-150592757d59 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.197 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] VM Started (Lifecycle Event)#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.199 2 DEBUG nova.compute.manager [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.205 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.208 2 INFO nova.virt.libvirt.driver [-] [instance: 45079fff-1c54-42d6-921b-150592757d59] Instance spawned successfully.#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.208 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.238 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.246 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.251 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.252 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.253 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.253 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.254 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.254 2 DEBUG nova.virt.libvirt.driver [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.297 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.298 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408727.197164, 45079fff-1c54-42d6-921b-150592757d59 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.298 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.337 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.341 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408727.2027874, 45079fff-1c54-42d6-921b-150592757d59 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.341 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.370 2 INFO nova.compute.manager [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Took 8.11 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.370 2 DEBUG nova.compute.manager [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.372 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.380 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.410 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.430 2 INFO nova.compute.manager [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Took 9.47 seconds to build instance.#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.444 2 DEBUG oslo_concurrency.lockutils [None req-c12a56b2-981f-4993-8ed6-be8e0c5258f4 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.570 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408712.5691483, be1174bf-d7e1-4801-a2eb-67020632d637 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.570 2 INFO nova.compute.manager [-] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:38:47 np0005465988 nova_compute[236126]: 2025-10-02 12:38:47.594 2 DEBUG nova.compute.manager [None req-416d0e55-a062-42ae-9df3-d94a8c081353 - - - - - -] [instance: be1174bf-d7e1-4801-a2eb-67020632d637] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:38:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:48.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:38:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:48.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:48 np0005465988 nova_compute[236126]: 2025-10-02 12:38:48.984 2 DEBUG nova.compute.manager [req-bd3991d0-12f5-4a3a-9173-9008a9886828 req-353d468a-52ef-43ab-924e-d553baee7583 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received event network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:48 np0005465988 nova_compute[236126]: 2025-10-02 12:38:48.985 2 DEBUG oslo_concurrency.lockutils [req-bd3991d0-12f5-4a3a-9173-9008a9886828 req-353d468a-52ef-43ab-924e-d553baee7583 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:48 np0005465988 nova_compute[236126]: 2025-10-02 12:38:48.985 2 DEBUG oslo_concurrency.lockutils [req-bd3991d0-12f5-4a3a-9173-9008a9886828 req-353d468a-52ef-43ab-924e-d553baee7583 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:48 np0005465988 nova_compute[236126]: 2025-10-02 12:38:48.986 2 DEBUG oslo_concurrency.lockutils [req-bd3991d0-12f5-4a3a-9173-9008a9886828 req-353d468a-52ef-43ab-924e-d553baee7583 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:48 np0005465988 nova_compute[236126]: 2025-10-02 12:38:48.986 2 DEBUG nova.compute.manager [req-bd3991d0-12f5-4a3a-9173-9008a9886828 req-353d468a-52ef-43ab-924e-d553baee7583 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] No waiting events found dispatching network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:38:48 np0005465988 nova_compute[236126]: 2025-10-02 12:38:48.987 2 WARNING nova.compute.manager [req-bd3991d0-12f5-4a3a-9173-9008a9886828 req-353d468a-52ef-43ab-924e-d553baee7583 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received unexpected event network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:38:49 np0005465988 nova_compute[236126]: 2025-10-02 12:38:49.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:49 np0005465988 nova_compute[236126]: 2025-10-02 12:38:49.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:49 np0005465988 ovn_controller[132601]: 2025-10-02T12:38:49Z|00670|binding|INFO|Releasing lport 293fb87a-10df-4698-a69e-3023bca5a6a3 from this chassis (sb_readonly=0)
Oct  2 08:38:49 np0005465988 nova_compute[236126]: 2025-10-02 12:38:49.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:38:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:50.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:38:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:50.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:51 np0005465988 nova_compute[236126]: 2025-10-02 12:38:51.099 2 DEBUG nova.compute.manager [req-fd5d9223-df01-46f9-8090-c48360f2d905 req-3a3a464c-1fc5-4266-bd1a-f3b48463e205 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received event network-changed-fbca287a-897d-4532-bd1a-8bd100ed84e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:51 np0005465988 nova_compute[236126]: 2025-10-02 12:38:51.099 2 DEBUG nova.compute.manager [req-fd5d9223-df01-46f9-8090-c48360f2d905 req-3a3a464c-1fc5-4266-bd1a-f3b48463e205 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Refreshing instance network info cache due to event network-changed-fbca287a-897d-4532-bd1a-8bd100ed84e5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:38:51 np0005465988 nova_compute[236126]: 2025-10-02 12:38:51.099 2 DEBUG oslo_concurrency.lockutils [req-fd5d9223-df01-46f9-8090-c48360f2d905 req-3a3a464c-1fc5-4266-bd1a-f3b48463e205 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-45079fff-1c54-42d6-921b-150592757d59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:38:51 np0005465988 nova_compute[236126]: 2025-10-02 12:38:51.100 2 DEBUG oslo_concurrency.lockutils [req-fd5d9223-df01-46f9-8090-c48360f2d905 req-3a3a464c-1fc5-4266-bd1a-f3b48463e205 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-45079fff-1c54-42d6-921b-150592757d59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:38:51 np0005465988 nova_compute[236126]: 2025-10-02 12:38:51.100 2 DEBUG nova.network.neutron [req-fd5d9223-df01-46f9-8090-c48360f2d905 req-3a3a464c-1fc5-4266-bd1a-f3b48463e205 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Refreshing network info cache for port fbca287a-897d-4532-bd1a-8bd100ed84e5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:38:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:38:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:52.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:38:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:52.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:52 np0005465988 nova_compute[236126]: 2025-10-02 12:38:52.981 2 DEBUG nova.network.neutron [req-fd5d9223-df01-46f9-8090-c48360f2d905 req-3a3a464c-1fc5-4266-bd1a-f3b48463e205 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Updated VIF entry in instance network info cache for port fbca287a-897d-4532-bd1a-8bd100ed84e5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:38:52 np0005465988 nova_compute[236126]: 2025-10-02 12:38:52.981 2 DEBUG nova.network.neutron [req-fd5d9223-df01-46f9-8090-c48360f2d905 req-3a3a464c-1fc5-4266-bd1a-f3b48463e205 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Updating instance_info_cache with network_info: [{"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:53 np0005465988 nova_compute[236126]: 2025-10-02 12:38:53.013 2 DEBUG oslo_concurrency.lockutils [req-fd5d9223-df01-46f9-8090-c48360f2d905 req-3a3a464c-1fc5-4266-bd1a-f3b48463e205 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-45079fff-1c54-42d6-921b-150592757d59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:38:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:38:54 np0005465988 nova_compute[236126]: 2025-10-02 12:38:54.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:54 np0005465988 nova_compute[236126]: 2025-10-02 12:38:54.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:38:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:54.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:38:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:38:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:54.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:38:55 np0005465988 nova_compute[236126]: 2025-10-02 12:38:55.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:56.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:56.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:38:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:38:58.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:38:58 np0005465988 nova_compute[236126]: 2025-10-02 12:38:58.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:38:58 np0005465988 nova_compute[236126]: 2025-10-02 12:38:58.705 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:58 np0005465988 nova_compute[236126]: 2025-10-02 12:38:58.705 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:58 np0005465988 nova_compute[236126]: 2025-10-02 12:38:58.706 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:58 np0005465988 nova_compute[236126]: 2025-10-02 12:38:58.706 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:38:58 np0005465988 nova_compute[236126]: 2025-10-02 12:38:58.706 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:38:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:38:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:38:58.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:38:59 np0005465988 nova_compute[236126]: 2025-10-02 12:38:59.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:59 np0005465988 nova_compute[236126]: 2025-10-02 12:38:59.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:38:59 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2078892857' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:38:59 np0005465988 nova_compute[236126]: 2025-10-02 12:38:59.262 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.556s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:59 np0005465988 nova_compute[236126]: 2025-10-02 12:38:59.363 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000095 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:38:59 np0005465988 nova_compute[236126]: 2025-10-02 12:38:59.364 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000095 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:38:59 np0005465988 nova_compute[236126]: 2025-10-02 12:38:59.537 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:38:59 np0005465988 nova_compute[236126]: 2025-10-02 12:38:59.538 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4049MB free_disk=20.881362915039062GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:38:59 np0005465988 nova_compute[236126]: 2025-10-02 12:38:59.539 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:59 np0005465988 nova_compute[236126]: 2025-10-02 12:38:59.539 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:59 np0005465988 nova_compute[236126]: 2025-10-02 12:38:59.658 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 45079fff-1c54-42d6-921b-150592757d59 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:38:59 np0005465988 nova_compute[236126]: 2025-10-02 12:38:59.659 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:38:59 np0005465988 nova_compute[236126]: 2025-10-02 12:38:59.659 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:38:59 np0005465988 nova_compute[236126]: 2025-10-02 12:38:59.933 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:39:00 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2822882890' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:39:00 np0005465988 nova_compute[236126]: 2025-10-02 12:39:00.377 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:00 np0005465988 nova_compute[236126]: 2025-10-02 12:39:00.387 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:39:00 np0005465988 nova_compute[236126]: 2025-10-02 12:39:00.413 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:39:00 np0005465988 nova_compute[236126]: 2025-10-02 12:39:00.437 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:39:00 np0005465988 nova_compute[236126]: 2025-10-02 12:39:00.438 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.899s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:00.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:00 np0005465988 podman[303347]: 2025-10-02 12:39:00.537243074 +0000 UTC m=+0.062749946 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:39:00 np0005465988 podman[303348]: 2025-10-02 12:39:00.542519736 +0000 UTC m=+0.067288746 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, org.label-schema.schema-version=1.0)
Oct  2 08:39:00 np0005465988 podman[303346]: 2025-10-02 12:39:00.573566159 +0000 UTC m=+0.093767148 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Oct  2 08:39:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:00.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:02.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:02.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:03 np0005465988 ceph-mds[84851]: mds.beacon.cephfs.compute-2.gpiyct missed beacon ack from the monitors
Oct  2 08:39:03 np0005465988 nova_compute[236126]: 2025-10-02 12:39:03.438 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:03 np0005465988 nova_compute[236126]: 2025-10-02 12:39:03.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:39:04 np0005465988 nova_compute[236126]: 2025-10-02 12:39:04.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:04 np0005465988 nova_compute[236126]: 2025-10-02 12:39:04.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:04.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:04.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:05 np0005465988 nova_compute[236126]: 2025-10-02 12:39:05.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:05 np0005465988 nova_compute[236126]: 2025-10-02 12:39:05.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:39:05 np0005465988 nova_compute[236126]: 2025-10-02 12:39:05.520 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:39:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).paxos(paxos updating c 4770..5379) lease_timeout -- calling new election
Oct  2 08:39:06 np0005465988 ceph-mon[76355]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Oct  2 08:39:06 np0005465988 ceph-mon[76355]: paxos.1).electionLogic(18) init, last seen epoch 18
Oct  2 08:39:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct  2 08:39:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:39:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:06.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:39:06 np0005465988 nova_compute[236126]: 2025-10-02 12:39:06.520 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:06.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:07 np0005465988 nova_compute[236126]: 2025-10-02 12:39:07.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:07 np0005465988 nova_compute[236126]: 2025-10-02 12:39:07.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:07 np0005465988 nova_compute[236126]: 2025-10-02 12:39:07.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:07 np0005465988 nova_compute[236126]: 2025-10-02 12:39:07.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:39:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct  2 08:39:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:08.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:08.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:09 np0005465988 nova_compute[236126]: 2025-10-02 12:39:09.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:09 np0005465988 nova_compute[236126]: 2025-10-02 12:39:09.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct  2 08:39:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:10.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e338 e338: 3 total, 3 up, 3 in
Oct  2 08:39:10 np0005465988 ceph-mon[76355]: mon.compute-1 calling monitor election
Oct  2 08:39:10 np0005465988 ceph-mon[76355]: mon.compute-2 calling monitor election
Oct  2 08:39:10 np0005465988 ceph-mon[76355]: mon.compute-0 calling monitor election
Oct  2 08:39:10 np0005465988 ceph-mon[76355]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Oct  2 08:39:10 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 08:39:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:39:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:10.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:39:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:39:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:12.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:13.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e338 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:39:14 np0005465988 nova_compute[236126]: 2025-10-02 12:39:14.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:14 np0005465988 nova_compute[236126]: 2025-10-02 12:39:14.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:14 np0005465988 nova_compute[236126]: 2025-10-02 12:39:14.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:14 np0005465988 nova_compute[236126]: 2025-10-02 12:39:14.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:39:14 np0005465988 nova_compute[236126]: 2025-10-02 12:39:14.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:39:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:39:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:14.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:39:14 np0005465988 podman[303600]: 2025-10-02 12:39:14.562275146 +0000 UTC m=+0.086703075 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:39:14 np0005465988 nova_compute[236126]: 2025-10-02 12:39:14.692 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-45079fff-1c54-42d6-921b-150592757d59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:39:14 np0005465988 nova_compute[236126]: 2025-10-02 12:39:14.693 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-45079fff-1c54-42d6-921b-150592757d59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:39:14 np0005465988 nova_compute[236126]: 2025-10-02 12:39:14.693 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:39:14 np0005465988 nova_compute[236126]: 2025-10-02 12:39:14.693 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:39:14Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1f:e8:ba 10.100.0.9
Oct  2 08:39:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:39:14Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:e8:ba 10.100.0.9
Oct  2 08:39:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:15.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:16.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e339 e339: 3 total, 3 up, 3 in
Oct  2 08:39:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:17.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:17 np0005465988 nova_compute[236126]: 2025-10-02 12:39:17.079 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Updating instance_info_cache with network_info: [{"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:39:17 np0005465988 nova_compute[236126]: 2025-10-02 12:39:17.191 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-45079fff-1c54-42d6-921b-150592757d59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:39:17 np0005465988 nova_compute[236126]: 2025-10-02 12:39:17.192 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:39:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:18.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:39:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:19.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:19 np0005465988 nova_compute[236126]: 2025-10-02 12:39:19.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:39:19 np0005465988 nova_compute[236126]: 2025-10-02 12:39:19.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:39:19 np0005465988 nova_compute[236126]: 2025-10-02 12:39:19.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:39:19 np0005465988 nova_compute[236126]: 2025-10-02 12:39:19.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct  2 08:39:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:20.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:21.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:22.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:39:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:23.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:39:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:39:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:23.879 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:39:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:23.880 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 08:39:23 np0005465988 nova_compute[236126]: 2025-10-02 12:39:23.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:39:24 np0005465988 nova_compute[236126]: 2025-10-02 12:39:24.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:39:24 np0005465988 nova_compute[236126]: 2025-10-02 12:39:24.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:39:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:24.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:25.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:25.883 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:39:25 np0005465988 nova_compute[236126]: 2025-10-02 12:39:25.953 2 DEBUG oslo_concurrency.lockutils [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:39:25 np0005465988 nova_compute[236126]: 2025-10-02 12:39:25.954 2 DEBUG oslo_concurrency.lockutils [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:39:25 np0005465988 nova_compute[236126]: 2025-10-02 12:39:25.980 2 DEBUG nova.objects.instance [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lazy-loading 'flavor' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.024 2 DEBUG oslo_concurrency.lockutils [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.304 2 DEBUG oslo_concurrency.lockutils [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.305 2 DEBUG oslo_concurrency.lockutils [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.306 2 INFO nova.compute.manager [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Attaching volume a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19 to /dev/vdb
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.486 2 DEBUG os_brick.utils [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.487 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:39:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:26.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.501 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.501 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[b4eec748-796c-4fde-9fad-420240ce5e7c]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.503 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.516 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.516 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[2cdf1dd5-a684-4038-8d1b-1a175adc3f0d]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.518 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.525 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.526 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[a2fa9aca-dfc8-4ade-b785-34cd33f6fda3]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.527 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[5b65e310-010a-4bb1-8506-9ddc27bf64a6]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.528 2 DEBUG oslo_concurrency.processutils [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.565 2 DEBUG oslo_concurrency.processutils [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "nvme version" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.568 2 DEBUG os_brick.initiator.connectors.lightos [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.568 2 DEBUG os_brick.initiator.connectors.lightos [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.568 2 DEBUG os_brick.initiator.connectors.lightos [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.569 2 DEBUG os_brick.utils [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] <== get_connector_properties: return (82ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Oct  2 08:39:26 np0005465988 nova_compute[236126]: 2025-10-02 12:39:26.569 2 DEBUG nova.virt.block_device [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Updating existing volume attachment record: b5db2386-7cee-4628-9b69-fa0ef8b30fec _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Oct  2 08:39:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:27.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:27.370 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:39:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:27.371 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:39:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e340 e340: 3 total, 3 up, 3 in
Oct  2 08:39:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:27.371 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:39:27 np0005465988 nova_compute[236126]: 2025-10-02 12:39:27.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:39:28 np0005465988 nova_compute[236126]: 2025-10-02 12:39:28.148 2 DEBUG nova.objects.instance [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lazy-loading 'flavor' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:39:28 np0005465988 nova_compute[236126]: 2025-10-02 12:39:28.182 2 DEBUG nova.virt.libvirt.driver [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Attempting to attach volume a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168
Oct  2 08:39:28 np0005465988 nova_compute[236126]: 2025-10-02 12:39:28.187 2 DEBUG nova.virt.libvirt.guest [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] attach device xml: <disk type="network" device="disk">
Oct  2 08:39:28 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:39:28 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19">
Oct  2 08:39:28 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:39:28 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:39:28 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:39:28 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:39:28 np0005465988 nova_compute[236126]:  <auth username="openstack">
Oct  2 08:39:28 np0005465988 nova_compute[236126]:    <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:39:28 np0005465988 nova_compute[236126]:  </auth>
Oct  2 08:39:28 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:39:28 np0005465988 nova_compute[236126]:  <serial>a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19</serial>
Oct  2 08:39:28 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:39:28 np0005465988 nova_compute[236126]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Oct  2 08:39:28 np0005465988 nova_compute[236126]: 2025-10-02 12:39:28.341 2 DEBUG nova.virt.libvirt.driver [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:39:28 np0005465988 nova_compute[236126]: 2025-10-02 12:39:28.341 2 DEBUG nova.virt.libvirt.driver [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:39:28 np0005465988 nova_compute[236126]: 2025-10-02 12:39:28.342 2 DEBUG nova.virt.libvirt.driver [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:39:28 np0005465988 nova_compute[236126]: 2025-10-02 12:39:28.342 2 DEBUG nova.virt.libvirt.driver [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] No VIF found with MAC fa:16:3e:1f:e8:ba, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  2 08:39:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:28.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:39:28 np0005465988 nova_compute[236126]: 2025-10-02 12:39:28.628 2 DEBUG oslo_concurrency.lockutils [None req-f8a1becd-e134-4f1f-9eff-60bf314b96f8 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.323s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:39:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:29.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:29 np0005465988 nova_compute[236126]: 2025-10-02 12:39:29.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:39:29 np0005465988 nova_compute[236126]: 2025-10-02 12:39:29.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:39:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:30.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:31.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:31 np0005465988 nova_compute[236126]: 2025-10-02 12:39:31.142 2 INFO nova.compute.manager [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Rebuilding instance
Oct  2 08:39:31 np0005465988 nova_compute[236126]: 2025-10-02 12:39:31.431 2 DEBUG nova.objects.instance [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lazy-loading 'trusted_certs' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:39:31 np0005465988 podman[303730]: 2025-10-02 12:39:31.486300527 +0000 UTC m=+0.075644027 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:39:31 np0005465988 podman[303731]: 2025-10-02 12:39:31.486487982 +0000 UTC m=+0.073383582 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Oct  2 08:39:31 np0005465988 podman[303729]: 2025-10-02 12:39:31.500542666 +0000 UTC m=+0.100228453 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:39:31 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:39:31 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:39:32 np0005465988 nova_compute[236126]: 2025-10-02 12:39:32.405 2 DEBUG nova.compute.manager [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:39:32 np0005465988 nova_compute[236126]: 2025-10-02 12:39:32.468 2 DEBUG nova.objects.instance [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lazy-loading 'pci_requests' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:39:32 np0005465988 nova_compute[236126]: 2025-10-02 12:39:32.485 2 DEBUG nova.objects.instance [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lazy-loading 'pci_devices' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:39:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:32.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:32 np0005465988 nova_compute[236126]: 2025-10-02 12:39:32.527 2 DEBUG nova.objects.instance [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lazy-loading 'resources' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:32 np0005465988 nova_compute[236126]: 2025-10-02 12:39:32.562 2 DEBUG nova.objects.instance [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lazy-loading 'migration_context' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:32 np0005465988 nova_compute[236126]: 2025-10-02 12:39:32.587 2 DEBUG nova.objects.instance [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:39:32 np0005465988 nova_compute[236126]: 2025-10-02 12:39:32.590 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:39:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:33.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:33 np0005465988 nova_compute[236126]: 2025-10-02 12:39:33.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:39:34 np0005465988 nova_compute[236126]: 2025-10-02 12:39:34.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:34 np0005465988 nova_compute[236126]: 2025-10-02 12:39:34.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:34.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:39:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:35.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:39:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:36.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:37.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:37 np0005465988 kernel: tapfbca287a-89 (unregistering): left promiscuous mode
Oct  2 08:39:37 np0005465988 NetworkManager[45041]: <info>  [1759408777.1233] device (tapfbca287a-89): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:39:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:39:37Z|00671|binding|INFO|Releasing lport fbca287a-897d-4532-bd1a-8bd100ed84e5 from this chassis (sb_readonly=0)
Oct  2 08:39:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:39:37Z|00672|binding|INFO|Setting lport fbca287a-897d-4532-bd1a-8bd100ed84e5 down in Southbound
Oct  2 08:39:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:39:37Z|00673|binding|INFO|Removing iface tapfbca287a-89 ovn-installed in OVS
Oct  2 08:39:37 np0005465988 nova_compute[236126]: 2025-10-02 12:39:37.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:37 np0005465988 nova_compute[236126]: 2025-10-02 12:39:37.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:37 np0005465988 nova_compute[236126]: 2025-10-02 12:39:37.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:37 np0005465988 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000095.scope: Deactivated successfully.
Oct  2 08:39:37 np0005465988 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000095.scope: Consumed 15.813s CPU time.
Oct  2 08:39:37 np0005465988 systemd-machined[192594]: Machine qemu-69-instance-00000095 terminated.
Oct  2 08:39:37 np0005465988 nova_compute[236126]: 2025-10-02 12:39:37.619 2 INFO nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Instance shutdown successfully after 5 seconds.#033[00m
Oct  2 08:39:37 np0005465988 nova_compute[236126]: 2025-10-02 12:39:37.627 2 INFO nova.virt.libvirt.driver [-] [instance: 45079fff-1c54-42d6-921b-150592757d59] Instance destroyed successfully.#033[00m
Oct  2 08:39:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:37.732 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:e8:ba 10.100.0.9'], port_security=['fa:16:3e:1f:e8:ba 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '45079fff-1c54-42d6-921b-150592757d59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00455285-97a7-4fa2-ba83-e8060936877e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6822f02d5ca04c659329a75d487054cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6969c0ae-a584-46fb-9098-5fddbc560ddc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.193'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f978d0a7-f86b-440f-a8b5-5432c3a4bc91, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=fbca287a-897d-4532-bd1a-8bd100ed84e5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:39:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:37.735 142124 INFO neutron.agent.ovn.metadata.agent [-] Port fbca287a-897d-4532-bd1a-8bd100ed84e5 in datapath 00455285-97a7-4fa2-ba83-e8060936877e unbound from our chassis#033[00m
Oct  2 08:39:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:37.739 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 00455285-97a7-4fa2-ba83-e8060936877e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:39:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:37.741 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c1fa0339-f310-4248-8a78-fca7114b4a7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:37.742 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e namespace which is not needed anymore#033[00m
Oct  2 08:39:37 np0005465988 neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e[303228]: [NOTICE]   (303232) : haproxy version is 2.8.14-c23fe91
Oct  2 08:39:37 np0005465988 neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e[303228]: [NOTICE]   (303232) : path to executable is /usr/sbin/haproxy
Oct  2 08:39:37 np0005465988 neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e[303228]: [WARNING]  (303232) : Exiting Master process...
Oct  2 08:39:37 np0005465988 neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e[303228]: [ALERT]    (303232) : Current worker (303234) exited with code 143 (Terminated)
Oct  2 08:39:37 np0005465988 neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e[303228]: [WARNING]  (303232) : All workers exited. Exiting... (0)
Oct  2 08:39:37 np0005465988 nova_compute[236126]: 2025-10-02 12:39:37.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:37 np0005465988 systemd[1]: libpod-e3175009eb8d539997012ff6d891c3ed01ab2810d9cd7424f4f95d095f449844.scope: Deactivated successfully.
Oct  2 08:39:37 np0005465988 podman[303855]: 2025-10-02 12:39:37.954753456 +0000 UTC m=+0.063674522 container died e3175009eb8d539997012ff6d891c3ed01ab2810d9cd7424f4f95d095f449844 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.014 2 INFO nova.compute.manager [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Detaching volume a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19#033[00m
Oct  2 08:39:38 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e3175009eb8d539997012ff6d891c3ed01ab2810d9cd7424f4f95d095f449844-userdata-shm.mount: Deactivated successfully.
Oct  2 08:39:38 np0005465988 systemd[1]: var-lib-containers-storage-overlay-6de903c98d5e64123f7861ad16816be41f2dbd058bd04b839b4935e36ce11880-merged.mount: Deactivated successfully.
Oct  2 08:39:38 np0005465988 podman[303855]: 2025-10-02 12:39:38.075250162 +0000 UTC m=+0.184171208 container cleanup e3175009eb8d539997012ff6d891c3ed01ab2810d9cd7424f4f95d095f449844 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:39:38 np0005465988 systemd[1]: libpod-conmon-e3175009eb8d539997012ff6d891c3ed01ab2810d9cd7424f4f95d095f449844.scope: Deactivated successfully.
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.168 2 INFO nova.virt.block_device [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Attempting to driver detach volume a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19 from mountpoint /dev/vdb#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.178 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Attempting to detach device vdb from instance 45079fff-1c54-42d6-921b-150592757d59 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.179 2 DEBUG nova.virt.libvirt.guest [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:39:38 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:39:38 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19">
Oct  2 08:39:38 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:39:38 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:39:38 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:39:38 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:39:38 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:39:38 np0005465988 nova_compute[236126]:  <serial>a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19</serial>
Oct  2 08:39:38 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Oct  2 08:39:38 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:39:38 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:39:38 np0005465988 podman[303885]: 2025-10-02 12:39:38.184454863 +0000 UTC m=+0.070537840 container remove e3175009eb8d539997012ff6d891c3ed01ab2810d9cd7424f4f95d095f449844 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:39:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:38.197 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[00a6cd2d-922b-4b78-93d5-330536115ea8]: (4, ('Thu Oct  2 12:39:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e (e3175009eb8d539997012ff6d891c3ed01ab2810d9cd7424f4f95d095f449844)\ne3175009eb8d539997012ff6d891c3ed01ab2810d9cd7424f4f95d095f449844\nThu Oct  2 12:39:38 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e (e3175009eb8d539997012ff6d891c3ed01ab2810d9cd7424f4f95d095f449844)\ne3175009eb8d539997012ff6d891c3ed01ab2810d9cd7424f4f95d095f449844\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:38.200 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4f941907-5728-4004-8df6-e1091eb3bef4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:38.202 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00455285-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:38 np0005465988 kernel: tap00455285-90: left promiscuous mode
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.210 2 DEBUG nova.compute.manager [req-f2c69a65-642e-442d-8302-f00ae61f93ab req-43992312-704a-4519-86c9-dfbef8d77263 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received event network-vif-unplugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.211 2 DEBUG oslo_concurrency.lockutils [req-f2c69a65-642e-442d-8302-f00ae61f93ab req-43992312-704a-4519-86c9-dfbef8d77263 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.211 2 DEBUG oslo_concurrency.lockutils [req-f2c69a65-642e-442d-8302-f00ae61f93ab req-43992312-704a-4519-86c9-dfbef8d77263 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.212 2 DEBUG oslo_concurrency.lockutils [req-f2c69a65-642e-442d-8302-f00ae61f93ab req-43992312-704a-4519-86c9-dfbef8d77263 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.212 2 DEBUG nova.compute.manager [req-f2c69a65-642e-442d-8302-f00ae61f93ab req-43992312-704a-4519-86c9-dfbef8d77263 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] No waiting events found dispatching network-vif-unplugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.212 2 WARNING nova.compute.manager [req-f2c69a65-642e-442d-8302-f00ae61f93ab req-43992312-704a-4519-86c9-dfbef8d77263 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received unexpected event network-vif-unplugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 for instance with vm_state active and task_state rebuilding.#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.221 2 INFO nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Successfully detached device vdb from instance 45079fff-1c54-42d6-921b-150592757d59 from the persistent domain config.#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:38.232 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a4752d8b-6c13-42a1-bbda-0c4873ff0cd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:38.271 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c917c04f-94e2-4ed5-992f-73200cb46f45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:38.273 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4e7bb136-bc33-4d95-873c-1338841cbb74]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:38.288 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8222b40f-861c-43a6-8994-a273ebd2966e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678642, 'reachable_time': 41205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 303905, 'error': None, 'target': 'ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:38 np0005465988 systemd[1]: run-netns-ovnmeta\x2d00455285\x2d97a7\x2d4fa2\x2dba83\x2de8060936877e.mount: Deactivated successfully.
Oct  2 08:39:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:38.292 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:39:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:38.292 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[ae42ba84-06dc-4ced-8d38-c43ab65656b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:38.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.551 2 INFO nova.virt.libvirt.driver [-] [instance: 45079fff-1c54-42d6-921b-150592757d59] Instance destroyed successfully.#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.553 2 DEBUG nova.virt.libvirt.vif [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:38:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1744856244',display_name='tempest-ServerActionsTestOtherA-server-969449302',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1744856244',id=149,image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB901wljkng8woQCRBVLpBhg2UNOo8jpxGS+GKRpCYtBXlVGSdH5T4M1H660T/Pgy6wslJmiip7oPjF07ONt3YODwmkR9Q/tCQnjaZPJ2wEKdN/e6CKDf4DgL6XjoOZqHA==',key_name='tempest-keypair-1625658847',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:38:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6822f02d5ca04c659329a75d487054cf',ramdisk_id='',reservation_id='r-6ppa0vgd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1680083910',owner_user_name='tempest-ServerActionsTestOtherA-1680083910-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:39:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c02d1dcc10ea4e57bbc6b7a3c100dc7b',uuid=45079fff-1c54-42d6-921b-150592757d59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.553 2 DEBUG nova.network.os_vif_util [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Converting VIF {"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.554 2 DEBUG nova.network.os_vif_util [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:e8:ba,bridge_name='br-int',has_traffic_filtering=True,id=fbca287a-897d-4532-bd1a-8bd100ed84e5,network=Network(00455285-97a7-4fa2-ba83-e8060936877e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbca287a-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.555 2 DEBUG os_vif [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:e8:ba,bridge_name='br-int',has_traffic_filtering=True,id=fbca287a-897d-4532-bd1a-8bd100ed84e5,network=Network(00455285-97a7-4fa2-ba83-e8060936877e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbca287a-89') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.558 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbca287a-89, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:38 np0005465988 nova_compute[236126]: 2025-10-02 12:39:38.567 2 INFO os_vif [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:e8:ba,bridge_name='br-int',has_traffic_filtering=True,id=fbca287a-897d-4532-bd1a-8bd100ed84e5,network=Network(00455285-97a7-4fa2-ba83-e8060936877e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbca287a-89')#033[00m
Oct  2 08:39:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:39:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:39.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:39 np0005465988 nova_compute[236126]: 2025-10-02 12:39:39.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:40 np0005465988 nova_compute[236126]: 2025-10-02 12:39:40.304 2 DEBUG nova.compute.manager [req-825a9546-5c68-486e-a7c7-e71dc3471402 req-b6a1a261-95f2-4eff-bacc-c19e0e3afa9b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received event network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:39:40 np0005465988 nova_compute[236126]: 2025-10-02 12:39:40.304 2 DEBUG oslo_concurrency.lockutils [req-825a9546-5c68-486e-a7c7-e71dc3471402 req-b6a1a261-95f2-4eff-bacc-c19e0e3afa9b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:40 np0005465988 nova_compute[236126]: 2025-10-02 12:39:40.304 2 DEBUG oslo_concurrency.lockutils [req-825a9546-5c68-486e-a7c7-e71dc3471402 req-b6a1a261-95f2-4eff-bacc-c19e0e3afa9b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:40 np0005465988 nova_compute[236126]: 2025-10-02 12:39:40.305 2 DEBUG oslo_concurrency.lockutils [req-825a9546-5c68-486e-a7c7-e71dc3471402 req-b6a1a261-95f2-4eff-bacc-c19e0e3afa9b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:40 np0005465988 nova_compute[236126]: 2025-10-02 12:39:40.305 2 DEBUG nova.compute.manager [req-825a9546-5c68-486e-a7c7-e71dc3471402 req-b6a1a261-95f2-4eff-bacc-c19e0e3afa9b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] No waiting events found dispatching network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:39:40 np0005465988 nova_compute[236126]: 2025-10-02 12:39:40.305 2 WARNING nova.compute.manager [req-825a9546-5c68-486e-a7c7-e71dc3471402 req-b6a1a261-95f2-4eff-bacc-c19e0e3afa9b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received unexpected event network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 for instance with vm_state active and task_state rebuilding.#033[00m
Oct  2 08:39:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:40.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:41 np0005465988 nova_compute[236126]: 2025-10-02 12:39:41.012 2 INFO nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Deleting instance files /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59_del#033[00m
Oct  2 08:39:41 np0005465988 nova_compute[236126]: 2025-10-02 12:39:41.013 2 INFO nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Deletion of /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59_del complete#033[00m
Oct  2 08:39:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:41.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:42.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:43.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:43 np0005465988 nova_compute[236126]: 2025-10-02 12:39:43.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:39:43 np0005465988 nova_compute[236126]: 2025-10-02 12:39:43.695 2 INFO nova.virt.block_device [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Booting with volume a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19 at /dev/vdb#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:44.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.685 2 DEBUG os_brick.utils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.687 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.699 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.699 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[26498e77-b20e-4647-a628-8a9570e4b186]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.700 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.715 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.715 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[4aa6c562-5b2f-4ca9-917b-e323555eedb1]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.718 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.730 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.730 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[41031469-cd00-49b8-84a8-2e855dbfda3f]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.733 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[3b66643f-8ac2-47c0-8037-d3e8e91aaf0a]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.734 2 DEBUG oslo_concurrency.processutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.801 2 DEBUG oslo_concurrency.processutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "nvme version" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.805 2 DEBUG os_brick.initiator.connectors.lightos [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.805 2 DEBUG os_brick.initiator.connectors.lightos [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.806 2 DEBUG os_brick.initiator.connectors.lightos [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.806 2 DEBUG os_brick.utils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] <== get_connector_properties: return (120ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:39:44 np0005465988 nova_compute[236126]: 2025-10-02 12:39:44.807 2 DEBUG nova.virt.block_device [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Updating existing volume attachment record: 45eedd3d-5843-49db-8e74-7d74193b28dd _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:39:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:45.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:45 np0005465988 podman[303937]: 2025-10-02 12:39:45.536527199 +0000 UTC m=+0.071932690 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct  2 08:39:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:46.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:47.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:47 np0005465988 nova_compute[236126]: 2025-10-02 12:39:47.066 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:39:47 np0005465988 nova_compute[236126]: 2025-10-02 12:39:47.067 2 INFO nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Creating image(s)#033[00m
Oct  2 08:39:47 np0005465988 nova_compute[236126]: 2025-10-02 12:39:47.113 2 DEBUG nova.storage.rbd_utils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] rbd image 45079fff-1c54-42d6-921b-150592757d59_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:39:47 np0005465988 nova_compute[236126]: 2025-10-02 12:39:47.152 2 DEBUG nova.storage.rbd_utils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] rbd image 45079fff-1c54-42d6-921b-150592757d59_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:39:47 np0005465988 nova_compute[236126]: 2025-10-02 12:39:47.184 2 DEBUG nova.storage.rbd_utils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] rbd image 45079fff-1c54-42d6-921b-150592757d59_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:39:47 np0005465988 nova_compute[236126]: 2025-10-02 12:39:47.190 2 DEBUG oslo_concurrency.processutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:47 np0005465988 nova_compute[236126]: 2025-10-02 12:39:47.271 2 DEBUG oslo_concurrency.processutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:47 np0005465988 nova_compute[236126]: 2025-10-02 12:39:47.272 2 DEBUG oslo_concurrency.lockutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "5133c8c7459ce4fa1cf043a638fc1b5c66ed8609" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:47 np0005465988 nova_compute[236126]: 2025-10-02 12:39:47.273 2 DEBUG oslo_concurrency.lockutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "5133c8c7459ce4fa1cf043a638fc1b5c66ed8609" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:47 np0005465988 nova_compute[236126]: 2025-10-02 12:39:47.274 2 DEBUG oslo_concurrency.lockutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "5133c8c7459ce4fa1cf043a638fc1b5c66ed8609" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:47 np0005465988 nova_compute[236126]: 2025-10-02 12:39:47.308 2 DEBUG nova.storage.rbd_utils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] rbd image 45079fff-1c54-42d6-921b-150592757d59_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:39:47 np0005465988 nova_compute[236126]: 2025-10-02 12:39:47.313 2 DEBUG oslo_concurrency.processutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 45079fff-1c54-42d6-921b-150592757d59_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:48.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:48 np0005465988 nova_compute[236126]: 2025-10-02 12:39:48.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:39:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:49.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:49 np0005465988 nova_compute[236126]: 2025-10-02 12:39:49.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:50 np0005465988 nova_compute[236126]: 2025-10-02 12:39:50.424 2 DEBUG oslo_concurrency.processutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609 45079fff-1c54-42d6-921b-150592757d59_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.111s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:50.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:50 np0005465988 nova_compute[236126]: 2025-10-02 12:39:50.603 2 DEBUG nova.storage.rbd_utils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] resizing rbd image 45079fff-1c54-42d6-921b-150592757d59_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:39:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:51.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.945 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.946 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Ensure instance console log exists: /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.947 2 DEBUG oslo_concurrency.lockutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.947 2 DEBUG oslo_concurrency.lockutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.948 2 DEBUG oslo_concurrency.lockutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.952 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Start _get_guest_xml network_info=[{"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:47Z,direct_url=<?>,disk_format='qcow2',id=db05f54c-61f8-42d6-a1e2-da3219a77b12,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': None, 'mount_device': '/dev/vdb', 'attachment_id': '45eedd3d-5843-49db-8e74-7d74193b28dd', 'disk_bus': 'virtio', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '45079fff-1c54-42d6-921b-150592757d59', 'attached_at': '', 'detached_at': '', 'volume_id': 'a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19', 'serial': 'a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.957 2 WARNING nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.965 2 DEBUG nova.virt.libvirt.host [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.966 2 DEBUG nova.virt.libvirt.host [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.970 2 DEBUG nova.virt.libvirt.host [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.971 2 DEBUG nova.virt.libvirt.host [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.972 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.972 2 DEBUG nova.virt.hardware [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:47Z,direct_url=<?>,disk_format='qcow2',id=db05f54c-61f8-42d6-a1e2-da3219a77b12,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.973 2 DEBUG nova.virt.hardware [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.973 2 DEBUG nova.virt.hardware [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.973 2 DEBUG nova.virt.hardware [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.973 2 DEBUG nova.virt.hardware [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.974 2 DEBUG nova.virt.hardware [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.974 2 DEBUG nova.virt.hardware [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.974 2 DEBUG nova.virt.hardware [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.975 2 DEBUG nova.virt.hardware [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.975 2 DEBUG nova.virt.hardware [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.975 2 DEBUG nova.virt.hardware [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:39:51 np0005465988 nova_compute[236126]: 2025-10-02 12:39:51.976 2 DEBUG nova.objects.instance [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lazy-loading 'vcpu_model' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:52 np0005465988 nova_compute[236126]: 2025-10-02 12:39:52.004 2 DEBUG oslo_concurrency.processutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:52 np0005465988 nova_compute[236126]: 2025-10-02 12:39:52.393 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408777.392167, 45079fff-1c54-42d6-921b-150592757d59 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:39:52 np0005465988 nova_compute[236126]: 2025-10-02 12:39:52.394 2 INFO nova.compute.manager [-] [instance: 45079fff-1c54-42d6-921b-150592757d59] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:39:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:39:52 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3309376384' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:39:52 np0005465988 nova_compute[236126]: 2025-10-02 12:39:52.469 2 DEBUG oslo_concurrency.processutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:52 np0005465988 nova_compute[236126]: 2025-10-02 12:39:52.500 2 DEBUG nova.storage.rbd_utils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] rbd image 45079fff-1c54-42d6-921b-150592757d59_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:39:52 np0005465988 nova_compute[236126]: 2025-10-02 12:39:52.506 2 DEBUG oslo_concurrency.processutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:52.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:39:52 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/390193139' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:39:52 np0005465988 nova_compute[236126]: 2025-10-02 12:39:52.955 2 DEBUG oslo_concurrency.processutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:53.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.776 2 DEBUG nova.compute.manager [None req-fb8b1f72-89e3-4eae-ac35-099097a051b8 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.815 2 DEBUG nova.virt.libvirt.vif [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:38:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1744856244',display_name='tempest-ServerActionsTestOtherA-server-969449302',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1744856244',id=149,image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB901wljkng8woQCRBVLpBhg2UNOo8jpxGS+GKRpCYtBXlVGSdH5T4M1H660T/Pgy6wslJmiip7oPjF07ONt3YODwmkR9Q/tCQnjaZPJ2wEKdN/e6CKDf4DgL6XjoOZqHA==',key_name='tempest-keypair-1625658847',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:38:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6822f02d5ca04c659329a75d487054cf',ramdisk_id='',reservation_id='r-6ppa0vgd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1680083910',owner_user_name='tempest-ServerActionsTestOtherA-1680083910-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:39:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c02d1dcc10ea4e57bbc6b7a3c100dc7b',uuid=45079fff-1c54-42d6-921b-150592757d59,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.815 2 DEBUG nova.network.os_vif_util [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Converting VIF {"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.816 2 DEBUG nova.network.os_vif_util [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:e8:ba,bridge_name='br-int',has_traffic_filtering=True,id=fbca287a-897d-4532-bd1a-8bd100ed84e5,network=Network(00455285-97a7-4fa2-ba83-e8060936877e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbca287a-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.818 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  <uuid>45079fff-1c54-42d6-921b-150592757d59</uuid>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  <name>instance-00000095</name>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerActionsTestOtherA-server-969449302</nova:name>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:39:51</nova:creationTime>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <nova:user uuid="c02d1dcc10ea4e57bbc6b7a3c100dc7b">tempest-ServerActionsTestOtherA-1680083910-project-member</nova:user>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <nova:project uuid="6822f02d5ca04c659329a75d487054cf">tempest-ServerActionsTestOtherA-1680083910</nova:project>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="db05f54c-61f8-42d6-a1e2-da3219a77b12"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <nova:port uuid="fbca287a-897d-4532-bd1a-8bd100ed84e5">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <entry name="serial">45079fff-1c54-42d6-921b-150592757d59</entry>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <entry name="uuid">45079fff-1c54-42d6-921b-150592757d59</entry>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/45079fff-1c54-42d6-921b-150592757d59_disk">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/45079fff-1c54-42d6-921b-150592757d59_disk.config">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <target dev="vdb" bus="virtio"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <serial>a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19</serial>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:1f:e8:ba"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <target dev="tapfbca287a-89"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/console.log" append="off"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:39:53 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:39:53 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:39:53 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:39:53 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.821 2 DEBUG nova.virt.libvirt.vif [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:38:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1744856244',display_name='tempest-ServerActionsTestOtherA-server-969449302',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1744856244',id=149,image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB901wljkng8woQCRBVLpBhg2UNOo8jpxGS+GKRpCYtBXlVGSdH5T4M1H660T/Pgy6wslJmiip7oPjF07ONt3YODwmkR9Q/tCQnjaZPJ2wEKdN/e6CKDf4DgL6XjoOZqHA==',key_name='tempest-keypair-1625658847',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:38:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6822f02d5ca04c659329a75d487054cf',ramdisk_id='',reservation_id='r-6ppa0vgd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1680083910',owner_user_name='tempest-ServerActionsTestOtherA-1680083910-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:39:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c02d1dcc10ea4e57bbc6b7a3c100dc7b',uuid=45079fff-1c54-42d6-921b-150592757d59,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.821 2 DEBUG nova.network.os_vif_util [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Converting VIF {"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.822 2 DEBUG nova.network.os_vif_util [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:e8:ba,bridge_name='br-int',has_traffic_filtering=True,id=fbca287a-897d-4532-bd1a-8bd100ed84e5,network=Network(00455285-97a7-4fa2-ba83-e8060936877e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbca287a-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.822 2 DEBUG os_vif [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:e8:ba,bridge_name='br-int',has_traffic_filtering=True,id=fbca287a-897d-4532-bd1a-8bd100ed84e5,network=Network(00455285-97a7-4fa2-ba83-e8060936877e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbca287a-89') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.823 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.824 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.826 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbca287a-89, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.827 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfbca287a-89, col_values=(('external_ids', {'iface-id': 'fbca287a-897d-4532-bd1a-8bd100ed84e5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:e8:ba', 'vm-uuid': '45079fff-1c54-42d6-921b-150592757d59'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:53 np0005465988 NetworkManager[45041]: <info>  [1759408793.8303] manager: (tapfbca287a-89): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:53 np0005465988 nova_compute[236126]: 2025-10-02 12:39:53.836 2 INFO os_vif [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:e8:ba,bridge_name='br-int',has_traffic_filtering=True,id=fbca287a-897d-4532-bd1a-8bd100ed84e5,network=Network(00455285-97a7-4fa2-ba83-e8060936877e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbca287a-89')#033[00m
Oct  2 08:39:54 np0005465988 nova_compute[236126]: 2025-10-02 12:39:54.085 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:39:54 np0005465988 nova_compute[236126]: 2025-10-02 12:39:54.086 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:39:54 np0005465988 nova_compute[236126]: 2025-10-02 12:39:54.086 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:39:54 np0005465988 nova_compute[236126]: 2025-10-02 12:39:54.087 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] No VIF found with MAC fa:16:3e:1f:e8:ba, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:39:54 np0005465988 nova_compute[236126]: 2025-10-02 12:39:54.088 2 INFO nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Using config drive#033[00m
Oct  2 08:39:54 np0005465988 nova_compute[236126]: 2025-10-02 12:39:54.122 2 DEBUG nova.storage.rbd_utils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] rbd image 45079fff-1c54-42d6-921b-150592757d59_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:39:54 np0005465988 nova_compute[236126]: 2025-10-02 12:39:54.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:54 np0005465988 nova_compute[236126]: 2025-10-02 12:39:54.323 2 DEBUG nova.objects.instance [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lazy-loading 'ec2_ids' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:54 np0005465988 nova_compute[236126]: 2025-10-02 12:39:54.394 2 DEBUG nova.objects.instance [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lazy-loading 'keypairs' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:54.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:55.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:55 np0005465988 nova_compute[236126]: 2025-10-02 12:39:55.398 2 INFO nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Creating config drive at /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/disk.config#033[00m
Oct  2 08:39:55 np0005465988 nova_compute[236126]: 2025-10-02 12:39:55.407 2 DEBUG oslo_concurrency.processutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9bto4jlm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:55 np0005465988 nova_compute[236126]: 2025-10-02 12:39:55.498 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:55 np0005465988 nova_compute[236126]: 2025-10-02 12:39:55.569 2 DEBUG oslo_concurrency.processutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9bto4jlm" returned: 0 in 0.163s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:55 np0005465988 nova_compute[236126]: 2025-10-02 12:39:55.602 2 DEBUG nova.storage.rbd_utils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] rbd image 45079fff-1c54-42d6-921b-150592757d59_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:39:55 np0005465988 nova_compute[236126]: 2025-10-02 12:39:55.607 2 DEBUG oslo_concurrency.processutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/disk.config 45079fff-1c54-42d6-921b-150592757d59_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:56 np0005465988 nova_compute[236126]: 2025-10-02 12:39:56.367 2 DEBUG oslo_concurrency.processutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/disk.config 45079fff-1c54-42d6-921b-150592757d59_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.760s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:56 np0005465988 nova_compute[236126]: 2025-10-02 12:39:56.368 2 INFO nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Deleting local config drive /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59/disk.config because it was imported into RBD.#033[00m
Oct  2 08:39:56 np0005465988 kernel: tapfbca287a-89: entered promiscuous mode
Oct  2 08:39:56 np0005465988 NetworkManager[45041]: <info>  [1759408796.4530] manager: (tapfbca287a-89): new Tun device (/org/freedesktop/NetworkManager/Devices/301)
Oct  2 08:39:56 np0005465988 nova_compute[236126]: 2025-10-02 12:39:56.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:39:56Z|00674|binding|INFO|Claiming lport fbca287a-897d-4532-bd1a-8bd100ed84e5 for this chassis.
Oct  2 08:39:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:39:56Z|00675|binding|INFO|fbca287a-897d-4532-bd1a-8bd100ed84e5: Claiming fa:16:3e:1f:e8:ba 10.100.0.9
Oct  2 08:39:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:39:56Z|00676|binding|INFO|Setting lport fbca287a-897d-4532-bd1a-8bd100ed84e5 ovn-installed in OVS
Oct  2 08:39:56 np0005465988 nova_compute[236126]: 2025-10-02 12:39:56.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:56 np0005465988 nova_compute[236126]: 2025-10-02 12:39:56.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:56 np0005465988 systemd-machined[192594]: New machine qemu-70-instance-00000095.
Oct  2 08:39:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:39:56Z|00677|binding|INFO|Setting lport fbca287a-897d-4532-bd1a-8bd100ed84e5 up in Southbound
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.499 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:e8:ba 10.100.0.9'], port_security=['fa:16:3e:1f:e8:ba 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '45079fff-1c54-42d6-921b-150592757d59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00455285-97a7-4fa2-ba83-e8060936877e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6822f02d5ca04c659329a75d487054cf', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6969c0ae-a584-46fb-9098-5fddbc560ddc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.193'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f978d0a7-f86b-440f-a8b5-5432c3a4bc91, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=fbca287a-897d-4532-bd1a-8bd100ed84e5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.500 142124 INFO neutron.agent.ovn.metadata.agent [-] Port fbca287a-897d-4532-bd1a-8bd100ed84e5 in datapath 00455285-97a7-4fa2-ba83-e8060936877e bound to our chassis#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.501 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 00455285-97a7-4fa2-ba83-e8060936877e#033[00m
Oct  2 08:39:56 np0005465988 systemd[1]: Started Virtual Machine qemu-70-instance-00000095.
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.515 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e47009da-3552-4384-88c0-f65417d85be3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.516 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap00455285-91 in ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.518 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap00455285-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.518 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ea921de6-813a-4b8b-acbb-2cf3db6e977c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.518 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b6896ca1-fe38-4776-b7db-135bfde64753]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 systemd-udevd[304314]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.530 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[8ffb9e48-2e75-49f6-aa09-b48a95799cca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 NetworkManager[45041]: <info>  [1759408796.5384] device (tapfbca287a-89): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:39:56 np0005465988 NetworkManager[45041]: <info>  [1759408796.5392] device (tapfbca287a-89): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:39:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:56.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.553 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[92b9441b-f4d6-44bf-95c6-ebb808cd9f9a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.589 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f37d979d-47f2-4ebb-810c-40bc76334428]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 NetworkManager[45041]: <info>  [1759408796.5948] manager: (tap00455285-90): new Veth device (/org/freedesktop/NetworkManager/Devices/302)
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.594 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3f54732c-f4dd-4820-8b7b-9df00e23a239]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.636 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[81c1f197-8dbc-40b7-bd9f-e5a02b986f9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.639 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f882ca97-a0db-4c81-b5a7-ec4121e5840a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 NetworkManager[45041]: <info>  [1759408796.6659] device (tap00455285-90): carrier: link connected
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.673 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8d48a8da-baf1-4e7b-b8ae-745b46ac37f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.689 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[57389d78-49cc-49f2-8802-8ede30dec675]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00455285-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:8a:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 199], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685699, 'reachable_time': 20113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304346, 'error': None, 'target': 'ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.701 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b69462-5c7c-4cae-885f-dcfe38ea9f25]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef6:8a3c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 685699, 'tstamp': 685699}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304347, 'error': None, 'target': 'ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.715 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9c0d50a4-789e-4ef7-a601-e97d807adc26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00455285-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:8a:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 199], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685699, 'reachable_time': 20113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 304348, 'error': None, 'target': 'ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.748 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f39eda94-89eb-44b1-903a-f1ceba2f58fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.811 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a8dd31-6967-4a4a-b6e0-b54649ea6c5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.813 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00455285-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.814 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.815 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00455285-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:56 np0005465988 kernel: tap00455285-90: entered promiscuous mode
Oct  2 08:39:56 np0005465988 NetworkManager[45041]: <info>  [1759408796.8202] manager: (tap00455285-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/303)
Oct  2 08:39:56 np0005465988 nova_compute[236126]: 2025-10-02 12:39:56.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.824 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap00455285-90, col_values=(('external_ids', {'iface-id': '293fb87a-10df-4698-a69e-3023bca5a6a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:39:56Z|00678|binding|INFO|Releasing lport 293fb87a-10df-4698-a69e-3023bca5a6a3 from this chassis (sb_readonly=0)
Oct  2 08:39:56 np0005465988 nova_compute[236126]: 2025-10-02 12:39:56.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.830 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00455285-97a7-4fa2-ba83-e8060936877e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00455285-97a7-4fa2-ba83-e8060936877e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.831 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[553fdcdd-bb2c-4e3c-b2ba-7ed36bd8a79a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.832 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-00455285-97a7-4fa2-ba83-e8060936877e
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/00455285-97a7-4fa2-ba83-e8060936877e.pid.haproxy
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 00455285-97a7-4fa2-ba83-e8060936877e
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:39:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:39:56.834 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e', 'env', 'PROCESS_TAG=haproxy-00455285-97a7-4fa2-ba83-e8060936877e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/00455285-97a7-4fa2-ba83-e8060936877e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:39:56 np0005465988 nova_compute[236126]: 2025-10-02 12:39:56.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:39:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:57.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:39:57 np0005465988 podman[304435]: 2025-10-02 12:39:57.233123296 +0000 UTC m=+0.057283839 container create a347fadcb47544518efc7a337523f6b546c738adfe67d5a8f02a079fa10459f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:39:57 np0005465988 systemd[1]: Started libpod-conmon-a347fadcb47544518efc7a337523f6b546c738adfe67d5a8f02a079fa10459f6.scope.
Oct  2 08:39:57 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:39:57 np0005465988 podman[304435]: 2025-10-02 12:39:57.203525895 +0000 UTC m=+0.027686458 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:39:57 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cacac35373d9eba5b80043ec3c5527569fddd13895469815c997b04eea6cd242/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:39:57 np0005465988 podman[304435]: 2025-10-02 12:39:57.325009589 +0000 UTC m=+0.149170162 container init a347fadcb47544518efc7a337523f6b546c738adfe67d5a8f02a079fa10459f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:39:57 np0005465988 podman[304435]: 2025-10-02 12:39:57.334322507 +0000 UTC m=+0.158483050 container start a347fadcb47544518efc7a337523f6b546c738adfe67d5a8f02a079fa10459f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:39:57 np0005465988 neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e[304450]: [NOTICE]   (304459) : New worker (304461) forked
Oct  2 08:39:57 np0005465988 neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e[304450]: [NOTICE]   (304459) : Loading success.
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.811 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408797.81086, 45079fff-1c54-42d6-921b-150592757d59 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.811 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.815 2 DEBUG nova.compute.manager [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.815 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.819 2 INFO nova.virt.libvirt.driver [-] [instance: 45079fff-1c54-42d6-921b-150592757d59] Instance spawned successfully.#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.819 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.865 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.870 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.885 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.886 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.886 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.887 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.887 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.888 2 DEBUG nova.virt.libvirt.driver [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.938 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.939 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408797.8122213, 45079fff-1c54-42d6-921b-150592757d59 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:39:57 np0005465988 nova_compute[236126]: 2025-10-02 12:39:57.939 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] VM Started (Lifecycle Event)#033[00m
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.004 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.010 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.031 2 DEBUG nova.compute.manager [req-94a7c137-6ed4-442a-9951-685a1082c675 req-90161657-bc19-4bfd-9f00-f84658b911ab d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received event network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.032 2 DEBUG oslo_concurrency.lockutils [req-94a7c137-6ed4-442a-9951-685a1082c675 req-90161657-bc19-4bfd-9f00-f84658b911ab d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.032 2 DEBUG oslo_concurrency.lockutils [req-94a7c137-6ed4-442a-9951-685a1082c675 req-90161657-bc19-4bfd-9f00-f84658b911ab d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.032 2 DEBUG oslo_concurrency.lockutils [req-94a7c137-6ed4-442a-9951-685a1082c675 req-90161657-bc19-4bfd-9f00-f84658b911ab d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.033 2 DEBUG nova.compute.manager [req-94a7c137-6ed4-442a-9951-685a1082c675 req-90161657-bc19-4bfd-9f00-f84658b911ab d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] No waiting events found dispatching network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.033 2 WARNING nova.compute.manager [req-94a7c137-6ed4-442a-9951-685a1082c675 req-90161657-bc19-4bfd-9f00-f84658b911ab d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received unexpected event network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.061 2 DEBUG nova.compute.manager [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.085 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.207 2 DEBUG oslo_concurrency.lockutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.209 2 DEBUG oslo_concurrency.lockutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.209 2 DEBUG nova.objects.instance [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:39:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:39:58.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.574 2 DEBUG oslo_concurrency.lockutils [None req-31acfc5b-4e27-460a-85c3-cc01fb481af1 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.365s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
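The `Acquiring lock … / Lock … acquired :: waited … / Lock … "released" :: held …` trace above is oslo_concurrency's lock instrumentation. A minimal stdlib sketch of that same pattern (illustrative only, not the actual `oslo_concurrency.lockutils` implementation; the `TimedLock` class name is invented here):

```python
import threading
import time


class TimedLock:
    """Context manager that logs wait and hold times around a lock,
    mimicking the oslo_concurrency trace lines seen in the log above."""

    def __init__(self, name):
        self.name = name
        self._lock = threading.Lock()
        self._acquired_at = None

    def __enter__(self):
        start = time.monotonic()
        self._lock.acquire()
        waited = time.monotonic() - start
        self._acquired_at = time.monotonic()
        print(f'Lock "{self.name}" acquired :: waited {waited:.3f}s')
        return self

    def __exit__(self, *exc):
        held = time.monotonic() - self._acquired_at
        self._lock.release()
        print(f'Lock "{self.name}" "released" :: held {held:.3f}s')


with TimedLock("compute_resources"):
    pass  # critical section, e.g. a resource tracker update
```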
Oct  2 08:39:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:39:58 np0005465988 nova_compute[236126]: 2025-10-02 12:39:58.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:39:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:39:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:39:59.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:39:59 np0005465988 nova_compute[236126]: 2025-10-02 12:39:59.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:59 np0005465988 nova_compute[236126]: 2025-10-02 12:39:59.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:59 np0005465988 nova_compute[236126]: 2025-10-02 12:39:59.857 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:59 np0005465988 nova_compute[236126]: 2025-10-02 12:39:59.857 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:59 np0005465988 nova_compute[236126]: 2025-10-02 12:39:59.858 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:59 np0005465988 nova_compute[236126]: 2025-10-02 12:39:59.858 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:39:59 np0005465988 nova_compute[236126]: 2025-10-02 12:39:59.859 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:40:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:40:00 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1802728507' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:40:00 np0005465988 nova_compute[236126]: 2025-10-02 12:40:00.340 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
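The resource tracker above shells out to `ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf` and parses the JSON reply for pool capacity. A sketch of that call pattern (the `runner` hook and `get_ceph_capacity` name are assumptions added here so the command can be stubbed where no Ceph cluster is available; the `stats` keys match Ceph's `ceph df` JSON output):

```python
import json
import subprocess


def get_ceph_capacity(ceph_id="openstack", conf="/etc/ceph/ceph.conf",
                      runner=subprocess.check_output):
    """Run `ceph df --format=json` and return (total_bytes, total_avail_bytes).

    `runner` defaults to subprocess.check_output but can be replaced
    with a stub for testing without a live cluster.
    """
    cmd = ["ceph", "df", "--format=json", "--id", ceph_id, "--conf", conf]
    out = runner(cmd)
    data = json.loads(out)
    stats = data["stats"]
    return stats["total_bytes"], stats["total_avail_bytes"]
```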
Oct  2 08:40:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:00.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:00 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 08:40:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:01.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:02 np0005465988 podman[304498]: 2025-10-02 12:40:02.535590419 +0000 UTC m=+0.064868517 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:40:02 np0005465988 podman[304497]: 2025-10-02 12:40:02.545907996 +0000 UTC m=+0.072306581 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:40:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:40:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:02.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:40:02 np0005465988 podman[304496]: 2025-10-02 12:40:02.591305171 +0000 UTC m=+0.126648453 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct  2 08:40:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:03.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:40:03 np0005465988 nova_compute[236126]: 2025-10-02 12:40:03.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:04 np0005465988 nova_compute[236126]: 2025-10-02 12:40:04.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:40:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:04.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:40:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:05.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:40:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:06.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:40:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:07.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:40:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:08.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:40:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:40:08 np0005465988 nova_compute[236126]: 2025-10-02 12:40:08.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:40:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:09.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:40:09 np0005465988 nova_compute[236126]: 2025-10-02 12:40:09.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:09 np0005465988 nova_compute[236126]: 2025-10-02 12:40:09.894 2 DEBUG nova.compute.manager [req-513f9a01-d8da-477e-9081-12b5011b0027 req-388a10ba-df3b-4722-9134-73a7b8a8cd1a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received event network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:40:09 np0005465988 nova_compute[236126]: 2025-10-02 12:40:09.895 2 DEBUG oslo_concurrency.lockutils [req-513f9a01-d8da-477e-9081-12b5011b0027 req-388a10ba-df3b-4722-9134-73a7b8a8cd1a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:09 np0005465988 nova_compute[236126]: 2025-10-02 12:40:09.896 2 DEBUG oslo_concurrency.lockutils [req-513f9a01-d8da-477e-9081-12b5011b0027 req-388a10ba-df3b-4722-9134-73a7b8a8cd1a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:09 np0005465988 nova_compute[236126]: 2025-10-02 12:40:09.896 2 DEBUG oslo_concurrency.lockutils [req-513f9a01-d8da-477e-9081-12b5011b0027 req-388a10ba-df3b-4722-9134-73a7b8a8cd1a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:09 np0005465988 nova_compute[236126]: 2025-10-02 12:40:09.896 2 DEBUG nova.compute.manager [req-513f9a01-d8da-477e-9081-12b5011b0027 req-388a10ba-df3b-4722-9134-73a7b8a8cd1a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] No waiting events found dispatching network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:40:09 np0005465988 nova_compute[236126]: 2025-10-02 12:40:09.897 2 WARNING nova.compute.manager [req-513f9a01-d8da-477e-9081-12b5011b0027 req-388a10ba-df3b-4722-9134-73a7b8a8cd1a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received unexpected event network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.129 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000095 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.130 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000095 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.131 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000095 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.187 2 DEBUG oslo_concurrency.lockutils [None req-8210cb25-79a4-4234-8fee-122051d65984 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.188 2 DEBUG oslo_concurrency.lockutils [None req-8210cb25-79a4-4234-8fee-122051d65984 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.208 2 INFO nova.compute.manager [None req-8210cb25-79a4-4234-8fee-122051d65984 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Detaching volume a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.332 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.334 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3975MB free_disk=20.876338958740234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.335 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.335 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:10.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:10.686 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:40:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:10.688 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.707 2 INFO nova.virt.block_device [None req-8210cb25-79a4-4234-8fee-122051d65984 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Attempting to driver detach volume a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19 from mountpoint /dev/vdb#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.719 2 DEBUG nova.virt.libvirt.driver [None req-8210cb25-79a4-4234-8fee-122051d65984 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Attempting to detach device vdb from instance 45079fff-1c54-42d6-921b-150592757d59 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.720 2 DEBUG nova.virt.libvirt.guest [None req-8210cb25-79a4-4234-8fee-122051d65984 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:40:10 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:40:10 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19">
Oct  2 08:40:10 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:40:10 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:40:10 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:40:10 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:40:10 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:40:10 np0005465988 nova_compute[236126]:  <serial>a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19</serial>
Oct  2 08:40:10 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct  2 08:40:10 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:40:10 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.733 2 INFO nova.virt.libvirt.driver [None req-8210cb25-79a4-4234-8fee-122051d65984 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Successfully detached device vdb from instance 45079fff-1c54-42d6-921b-150592757d59 from the persistent domain config.#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.734 2 DEBUG nova.virt.libvirt.driver [None req-8210cb25-79a4-4234-8fee-122051d65984 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 45079fff-1c54-42d6-921b-150592757d59 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.734 2 DEBUG nova.virt.libvirt.guest [None req-8210cb25-79a4-4234-8fee-122051d65984 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:40:10 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:40:10 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19">
Oct  2 08:40:10 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:40:10 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:40:10 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:40:10 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:40:10 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:40:10 np0005465988 nova_compute[236126]:  <serial>a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19</serial>
Oct  2 08:40:10 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct  2 08:40:10 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:40:10 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
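Nova hands libvirt the `<disk>` XML logged above so libvirt can match the device to detach by its target dev and serial. A sketch of parsing out those identifying fields with the stdlib (the `disk_identity` helper is invented for illustration; the XML is abridged from the log):

```python
import xml.etree.ElementTree as ET

# Abridged copy of the detach XML nova logged above.
DISK_XML = """<disk type="network" device="disk">
  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
  <source protocol="rbd" name="volumes/volume-a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19">
    <host name="192.168.122.100" port="6789"/>
  </source>
  <target dev="vdb" bus="virtio"/>
  <serial>a1bae65f-c0b4-46c3-b2fd-8ed16aa61b19</serial>
</disk>"""


def disk_identity(xml_str):
    """Return (target dev, volume serial) from a libvirt <disk> element --
    the fields that identify which device is being detached."""
    root = ET.fromstring(xml_str)
    return root.find("target").get("dev"), root.findtext("serial")
```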
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.754 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 45079fff-1c54-42d6-921b-150592757d59 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.754 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.755 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.783 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.803 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.804 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.806 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Received event <DeviceRemovedEvent: 1759408810.8066878, 45079fff-1c54-42d6-921b-150592757d59 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.807 2 DEBUG nova.virt.libvirt.driver [None req-8210cb25-79a4-4234-8fee-122051d65984 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 45079fff-1c54-42d6-921b-150592757d59 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.809 2 INFO nova.virt.libvirt.driver [None req-8210cb25-79a4-4234-8fee-122051d65984 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Successfully detached device vdb from instance 45079fff-1c54-42d6-921b-150592757d59 from the live domain config.
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.860 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.904 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct  2 08:40:10 np0005465988 nova_compute[236126]: 2025-10-02 12:40:10.979 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:40:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:11.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.196 2 DEBUG nova.objects.instance [None req-8210cb25-79a4-4234-8fee-122051d65984 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lazy-loading 'flavor' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.284 2 DEBUG oslo_concurrency.lockutils [None req-8210cb25-79a4-4234-8fee-122051d65984 c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:40:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:40:11 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3604971004' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.443 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.453 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.483 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.486 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.487 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:40:11 np0005465988 ovn_controller[132601]: 2025-10-02T12:40:11Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1f:e8:ba 10.100.0.9
Oct  2 08:40:11 np0005465988 ovn_controller[132601]: 2025-10-02T12:40:11Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:e8:ba 10.100.0.9
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.793 2 DEBUG oslo_concurrency.lockutils [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.794 2 DEBUG oslo_concurrency.lockutils [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.794 2 DEBUG oslo_concurrency.lockutils [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.795 2 DEBUG oslo_concurrency.lockutils [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.796 2 DEBUG oslo_concurrency.lockutils [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.798 2 INFO nova.compute.manager [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Terminating instance
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.800 2 DEBUG nova.compute.manager [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:40:11 np0005465988 kernel: tapfbca287a-89 (unregistering): left promiscuous mode
Oct  2 08:40:11 np0005465988 NetworkManager[45041]: <info>  [1759408811.8941] device (tapfbca287a-89): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:40:11 np0005465988 ovn_controller[132601]: 2025-10-02T12:40:11Z|00679|binding|INFO|Releasing lport fbca287a-897d-4532-bd1a-8bd100ed84e5 from this chassis (sb_readonly=0)
Oct  2 08:40:11 np0005465988 ovn_controller[132601]: 2025-10-02T12:40:11Z|00680|binding|INFO|Setting lport fbca287a-897d-4532-bd1a-8bd100ed84e5 down in Southbound
Oct  2 08:40:11 np0005465988 ovn_controller[132601]: 2025-10-02T12:40:11Z|00681|binding|INFO|Removing iface tapfbca287a-89 ovn-installed in OVS
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:40:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:11.933 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:e8:ba 10.100.0.9'], port_security=['fa:16:3e:1f:e8:ba 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '45079fff-1c54-42d6-921b-150592757d59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00455285-97a7-4fa2-ba83-e8060936877e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6822f02d5ca04c659329a75d487054cf', 'neutron:revision_number': '6', 'neutron:security_group_ids': '6969c0ae-a584-46fb-9098-5fddbc560ddc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.193', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f978d0a7-f86b-440f-a8b5-5432c3a4bc91, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=fbca287a-897d-4532-bd1a-8bd100ed84e5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:40:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:11.936 142124 INFO neutron.agent.ovn.metadata.agent [-] Port fbca287a-897d-4532-bd1a-8bd100ed84e5 in datapath 00455285-97a7-4fa2-ba83-e8060936877e unbound from our chassis
Oct  2 08:40:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:11.939 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 00455285-97a7-4fa2-ba83-e8060936877e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  2 08:40:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:11.941 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[694ab7f2-6374-4b02-ad44-ad5ca3ba9fe1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:40:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:11.943 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e namespace which is not needed anymore
Oct  2 08:40:11 np0005465988 nova_compute[236126]: 2025-10-02 12:40:11.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:40:11 np0005465988 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000095.scope: Deactivated successfully.
Oct  2 08:40:11 np0005465988 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000095.scope: Consumed 13.442s CPU time.
Oct  2 08:40:11 np0005465988 systemd-machined[192594]: Machine qemu-70-instance-00000095 terminated.
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.040 2 INFO nova.virt.libvirt.driver [-] [instance: 45079fff-1c54-42d6-921b-150592757d59] Instance destroyed successfully.
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.041 2 DEBUG nova.objects.instance [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lazy-loading 'resources' on Instance uuid 45079fff-1c54-42d6-921b-150592757d59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.084 2 DEBUG nova.virt.libvirt.vif [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:38:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1744856244',display_name='tempest-ServerActionsTestOtherA-server-969449302',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1744856244',id=149,image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB901wljkng8woQCRBVLpBhg2UNOo8jpxGS+GKRpCYtBXlVGSdH5T4M1H660T/Pgy6wslJmiip7oPjF07ONt3YODwmkR9Q/tCQnjaZPJ2wEKdN/e6CKDf4DgL6XjoOZqHA==',key_name='tempest-keypair-1625658847',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:39:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6822f02d5ca04c659329a75d487054cf',ramdisk_id='',reservation_id='r-6ppa0vgd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='db05f54c-61f8-42d6-a1e2-da3219a77b12',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1680083910',owner_user_name='tempest-ServerActionsTestOtherA-1680083910-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:39:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c02d1dcc10ea4e57bbc6b7a3c100dc7b',uuid=45079fff-1c54-42d6-921b-150592757d59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.085 2 DEBUG nova.network.os_vif_util [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Converting VIF {"id": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "address": "fa:16:3e:1f:e8:ba", "network": {"id": "00455285-97a7-4fa2-ba83-e8060936877e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1293599148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6822f02d5ca04c659329a75d487054cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbca287a-89", "ovs_interfaceid": "fbca287a-897d-4532-bd1a-8bd100ed84e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.085 2 DEBUG nova.network.os_vif_util [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:e8:ba,bridge_name='br-int',has_traffic_filtering=True,id=fbca287a-897d-4532-bd1a-8bd100ed84e5,network=Network(00455285-97a7-4fa2-ba83-e8060936877e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbca287a-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.086 2 DEBUG os_vif [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:e8:ba,bridge_name='br-int',has_traffic_filtering=True,id=fbca287a-897d-4532-bd1a-8bd100ed84e5,network=Network(00455285-97a7-4fa2-ba83-e8060936877e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbca287a-89') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.087 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbca287a-89, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.093 2 INFO os_vif [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:e8:ba,bridge_name='br-int',has_traffic_filtering=True,id=fbca287a-897d-4532-bd1a-8bd100ed84e5,network=Network(00455285-97a7-4fa2-ba83-e8060936877e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbca287a-89')
Oct  2 08:40:12 np0005465988 neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e[304450]: [NOTICE]   (304459) : haproxy version is 2.8.14-c23fe91
Oct  2 08:40:12 np0005465988 neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e[304450]: [NOTICE]   (304459) : path to executable is /usr/sbin/haproxy
Oct  2 08:40:12 np0005465988 neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e[304450]: [WARNING]  (304459) : Exiting Master process...
Oct  2 08:40:12 np0005465988 neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e[304450]: [ALERT]    (304459) : Current worker (304461) exited with code 143 (Terminated)
Oct  2 08:40:12 np0005465988 neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e[304450]: [WARNING]  (304459) : All workers exited. Exiting... (0)
Oct  2 08:40:12 np0005465988 systemd[1]: libpod-a347fadcb47544518efc7a337523f6b546c738adfe67d5a8f02a079fa10459f6.scope: Deactivated successfully.
Oct  2 08:40:12 np0005465988 podman[304667]: 2025-10-02 12:40:12.131828353 +0000 UTC m=+0.081398282 container died a347fadcb47544518efc7a337523f6b546c738adfe67d5a8f02a079fa10459f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:40:12 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a347fadcb47544518efc7a337523f6b546c738adfe67d5a8f02a079fa10459f6-userdata-shm.mount: Deactivated successfully.
Oct  2 08:40:12 np0005465988 systemd[1]: var-lib-containers-storage-overlay-cacac35373d9eba5b80043ec3c5527569fddd13895469815c997b04eea6cd242-merged.mount: Deactivated successfully.
Oct  2 08:40:12 np0005465988 podman[304667]: 2025-10-02 12:40:12.301955096 +0000 UTC m=+0.251525055 container cleanup a347fadcb47544518efc7a337523f6b546c738adfe67d5a8f02a079fa10459f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:40:12 np0005465988 systemd[1]: libpod-conmon-a347fadcb47544518efc7a337523f6b546c738adfe67d5a8f02a079fa10459f6.scope: Deactivated successfully.
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #109. Immutable memtables: 0.
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.422901) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 67] Flushing memtable with next log file: 109
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408812422963, "job": 67, "event": "flush_started", "num_memtables": 1, "num_entries": 1888, "num_deletes": 253, "total_data_size": 4253382, "memory_usage": 4322648, "flush_reason": "Manual Compaction"}
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 67] Level-0 flush table #110: started
Oct  2 08:40:12 np0005465988 podman[304719]: 2025-10-02 12:40:12.452581559 +0000 UTC m=+0.116422370 container remove a347fadcb47544518efc7a337523f6b546c738adfe67d5a8f02a079fa10459f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  2 08:40:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:12.529 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[531526d2-034c-4c1c-b706-d21e5e201836]: (4, ('Thu Oct  2 12:40:12 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e (a347fadcb47544518efc7a337523f6b546c738adfe67d5a8f02a079fa10459f6)\na347fadcb47544518efc7a337523f6b546c738adfe67d5a8f02a079fa10459f6\nThu Oct  2 12:40:12 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e (a347fadcb47544518efc7a337523f6b546c738adfe67d5a8f02a079fa10459f6)\na347fadcb47544518efc7a337523f6b546c738adfe67d5a8f02a079fa10459f6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:40:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:12.532 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[198f72e4-5855-4288-8e8e-c2f61d37f450]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:40:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:12.534 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00455285-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:40:12 np0005465988 kernel: tap00455285-90: left promiscuous mode
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408812566495, "cf_name": "default", "job": 67, "event": "table_file_creation", "file_number": 110, "file_size": 1734257, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 55407, "largest_seqno": 57290, "table_properties": {"data_size": 1728274, "index_size": 2993, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 16574, "raw_average_key_size": 21, "raw_value_size": 1714755, "raw_average_value_size": 2235, "num_data_blocks": 131, "num_entries": 767, "num_filter_entries": 767, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759408650, "oldest_key_time": 1759408650, "file_creation_time": 1759408812, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 110, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 67] Flush lasted 143690 microseconds, and 8058 cpu microseconds.
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:40:12 np0005465988 nova_compute[236126]: 2025-10-02 12:40:12.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:12.570 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[348b4620-76ae-4913-8e04-423d610d04f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:12.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.566591) [db/flush_job.cc:967] [default] [JOB 67] Level-0 flush table #110: 1734257 bytes OK
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.566617) [db/memtable_list.cc:519] [default] Level-0 commit table #110 started
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.603010) [db/memtable_list.cc:722] [default] Level-0 commit table #110: memtable #1 done
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.603060) EVENT_LOG_v1 {"time_micros": 1759408812603048, "job": 67, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.603088) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 67] Try to delete WAL files size 4244833, prev total WAL file size 4244833, number of live WAL files 2.
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000106.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.604880) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031373539' seq:72057594037927935, type:22 .. '6D6772737461740032303130' seq:0, type:0; will stop at (end)
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 68] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 67 Base level 0, inputs: [110(1693KB)], [108(11MB)]
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408812604963, "job": 68, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [110], "files_L6": [108], "score": -1, "input_data_size": 14198782, "oldest_snapshot_seqno": -1}
Oct  2 08:40:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:12.606 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[957cde5b-be69-4aa7-8c29-191f3a554b15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:12.608 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b1fc5acb-dbd7-457a-800a-8fc2ee94a08d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:12.631 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba223a0-39a1-46f5-9a50-e4eddbc6e08f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685691, 'reachable_time': 31677, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304734, 'error': None, 'target': 'ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:12.634 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-00455285-97a7-4fa2-ba83-e8060936877e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:40:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:12.634 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[95271722-550e-4334-85ae-9021bf0b8fc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:12 np0005465988 systemd[1]: run-netns-ovnmeta\x2d00455285\x2d97a7\x2d4fa2\x2dba83\x2de8060936877e.mount: Deactivated successfully.
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 68] Generated table #111: 8241 keys, 11383962 bytes, temperature: kUnknown
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408812724245, "cf_name": "default", "job": 68, "event": "table_file_creation", "file_number": 111, "file_size": 11383962, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11330235, "index_size": 32047, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20613, "raw_key_size": 212740, "raw_average_key_size": 25, "raw_value_size": 11184906, "raw_average_value_size": 1357, "num_data_blocks": 1257, "num_entries": 8241, "num_filter_entries": 8241, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759408812, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 111, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.724754) [db/compaction/compaction_job.cc:1663] [default] [JOB 68] Compacted 1@0 + 1@6 files to L6 => 11383962 bytes
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.729920) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 118.8 rd, 95.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 11.9 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(14.8) write-amplify(6.6) OK, records in: 8700, records dropped: 459 output_compression: NoCompression
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.729952) EVENT_LOG_v1 {"time_micros": 1759408812729938, "job": 68, "event": "compaction_finished", "compaction_time_micros": 119480, "compaction_time_cpu_micros": 35104, "output_level": 6, "num_output_files": 1, "total_output_size": 11383962, "num_input_records": 8700, "num_output_records": 8241, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000110.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408812730735, "job": 68, "event": "table_file_deletion", "file_number": 110}
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000108.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408812734934, "job": 68, "event": "table_file_deletion", "file_number": 108}
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.604733) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.735037) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.735047) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.735051) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.735055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:40:12 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:40:12.735059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:40:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:40:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:13.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:40:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:40:13 np0005465988 nova_compute[236126]: 2025-10-02 12:40:13.638 2 INFO nova.virt.libvirt.driver [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Deleting instance files /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59_del#033[00m
Oct  2 08:40:13 np0005465988 nova_compute[236126]: 2025-10-02 12:40:13.639 2 INFO nova.virt.libvirt.driver [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Deletion of /var/lib/nova/instances/45079fff-1c54-42d6-921b-150592757d59_del complete#033[00m
Oct  2 08:40:13 np0005465988 nova_compute[236126]: 2025-10-02 12:40:13.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:13 np0005465988 nova_compute[236126]: 2025-10-02 12:40:13.930 2 INFO nova.compute.manager [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Took 2.13 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:40:13 np0005465988 nova_compute[236126]: 2025-10-02 12:40:13.931 2 DEBUG oslo.service.loopingcall [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:40:13 np0005465988 nova_compute[236126]: 2025-10-02 12:40:13.932 2 DEBUG nova.compute.manager [-] [instance: 45079fff-1c54-42d6-921b-150592757d59] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:40:13 np0005465988 nova_compute[236126]: 2025-10-02 12:40:13.932 2 DEBUG nova.network.neutron [-] [instance: 45079fff-1c54-42d6-921b-150592757d59] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.369 2 DEBUG nova.compute.manager [req-1725085d-5072-485b-acb4-be16d1cea51a req-d3bcfd67-ad19-4305-9e5f-f42fe5dec611 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received event network-vif-unplugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.370 2 DEBUG oslo_concurrency.lockutils [req-1725085d-5072-485b-acb4-be16d1cea51a req-d3bcfd67-ad19-4305-9e5f-f42fe5dec611 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.370 2 DEBUG oslo_concurrency.lockutils [req-1725085d-5072-485b-acb4-be16d1cea51a req-d3bcfd67-ad19-4305-9e5f-f42fe5dec611 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.370 2 DEBUG oslo_concurrency.lockutils [req-1725085d-5072-485b-acb4-be16d1cea51a req-d3bcfd67-ad19-4305-9e5f-f42fe5dec611 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.371 2 DEBUG nova.compute.manager [req-1725085d-5072-485b-acb4-be16d1cea51a req-d3bcfd67-ad19-4305-9e5f-f42fe5dec611 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] No waiting events found dispatching network-vif-unplugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.371 2 DEBUG nova.compute.manager [req-1725085d-5072-485b-acb4-be16d1cea51a req-d3bcfd67-ad19-4305-9e5f-f42fe5dec611 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received event network-vif-unplugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.371 2 DEBUG nova.compute.manager [req-1725085d-5072-485b-acb4-be16d1cea51a req-d3bcfd67-ad19-4305-9e5f-f42fe5dec611 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received event network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.372 2 DEBUG oslo_concurrency.lockutils [req-1725085d-5072-485b-acb4-be16d1cea51a req-d3bcfd67-ad19-4305-9e5f-f42fe5dec611 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "45079fff-1c54-42d6-921b-150592757d59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.372 2 DEBUG oslo_concurrency.lockutils [req-1725085d-5072-485b-acb4-be16d1cea51a req-d3bcfd67-ad19-4305-9e5f-f42fe5dec611 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.372 2 DEBUG oslo_concurrency.lockutils [req-1725085d-5072-485b-acb4-be16d1cea51a req-d3bcfd67-ad19-4305-9e5f-f42fe5dec611 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.372 2 DEBUG nova.compute.manager [req-1725085d-5072-485b-acb4-be16d1cea51a req-d3bcfd67-ad19-4305-9e5f-f42fe5dec611 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] No waiting events found dispatching network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.373 2 WARNING nova.compute.manager [req-1725085d-5072-485b-acb4-be16d1cea51a req-d3bcfd67-ad19-4305-9e5f-f42fe5dec611 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received unexpected event network-vif-plugged-fbca287a-897d-4532-bd1a-8bd100ed84e5 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.488 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.489 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.489 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.490 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.543 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.544 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.544 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.544 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.545 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.545 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.546 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:14 np0005465988 nova_compute[236126]: 2025-10-02 12:40:14.546 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:40:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:40:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:14.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:40:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:15.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:16 np0005465988 nova_compute[236126]: 2025-10-02 12:40:16.136 2 DEBUG nova.network.neutron [-] [instance: 45079fff-1c54-42d6-921b-150592757d59] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:40:16 np0005465988 nova_compute[236126]: 2025-10-02 12:40:16.386 2 INFO nova.compute.manager [-] [instance: 45079fff-1c54-42d6-921b-150592757d59] Took 2.45 seconds to deallocate network for instance.#033[00m
Oct  2 08:40:16 np0005465988 nova_compute[236126]: 2025-10-02 12:40:16.522 2 DEBUG oslo_concurrency.lockutils [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:16 np0005465988 nova_compute[236126]: 2025-10-02 12:40:16.523 2 DEBUG oslo_concurrency.lockutils [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:16 np0005465988 podman[304738]: 2025-10-02 12:40:16.552571245 +0000 UTC m=+0.080371433 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:40:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:16.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:16 np0005465988 nova_compute[236126]: 2025-10-02 12:40:16.605 2 DEBUG oslo_concurrency.processutils [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:40:16 np0005465988 nova_compute[236126]: 2025-10-02 12:40:16.774 2 DEBUG nova.compute.manager [req-ce7e631e-8179-449a-a8e8-524e7bb67724 req-5ac2196d-77f1-42f7-8e20-ed3c1e87e12b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45079fff-1c54-42d6-921b-150592757d59] Received event network-vif-deleted-fbca287a-897d-4532-bd1a-8bd100ed84e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:40:17 np0005465988 nova_compute[236126]: 2025-10-02 12:40:17.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:40:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:17.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:40:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:40:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3669041500' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:40:17 np0005465988 nova_compute[236126]: 2025-10-02 12:40:17.172 2 DEBUG oslo_concurrency.processutils [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.566s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:40:17 np0005465988 nova_compute[236126]: 2025-10-02 12:40:17.181 2 DEBUG nova.compute.provider_tree [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:40:17 np0005465988 nova_compute[236126]: 2025-10-02 12:40:17.247 2 DEBUG nova.scheduler.client.report [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:40:17 np0005465988 nova_compute[236126]: 2025-10-02 12:40:17.419 2 DEBUG oslo_concurrency.lockutils [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.896s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:17 np0005465988 nova_compute[236126]: 2025-10-02 12:40:17.497 2 INFO nova.scheduler.client.report [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Deleted allocations for instance 45079fff-1c54-42d6-921b-150592757d59#033[00m
Oct  2 08:40:17 np0005465988 nova_compute[236126]: 2025-10-02 12:40:17.796 2 DEBUG oslo_concurrency.lockutils [None req-a456f631-299b-4025-9919-bd99ab4263bc c02d1dcc10ea4e57bbc6b7a3c100dc7b 6822f02d5ca04c659329a75d487054cf - - default default] Lock "45079fff-1c54-42d6-921b-150592757d59" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e341 e341: 3 total, 3 up, 3 in
Oct  2 08:40:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:18.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e341 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:40:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:19.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:19 np0005465988 nova_compute[236126]: 2025-10-02 12:40:19.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:40:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:20.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:40:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:20.691 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:40:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:21.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:22 np0005465988 nova_compute[236126]: 2025-10-02 12:40:22.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:22 np0005465988 nova_compute[236126]: 2025-10-02 12:40:22.527 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:22.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:23.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e341 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:40:24 np0005465988 nova_compute[236126]: 2025-10-02 12:40:24.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:40:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:24.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:40:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:25.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:26.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:27 np0005465988 nova_compute[236126]: 2025-10-02 12:40:27.040 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408812.0382314, 45079fff-1c54-42d6-921b-150592757d59 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:40:27 np0005465988 nova_compute[236126]: 2025-10-02 12:40:27.041 2 INFO nova.compute.manager [-] [instance: 45079fff-1c54-42d6-921b-150592757d59] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:40:27 np0005465988 nova_compute[236126]: 2025-10-02 12:40:27.066 2 DEBUG nova.compute.manager [None req-42e011b0-efde-4893-a950-6883a5b90fae - - - - - -] [instance: 45079fff-1c54-42d6-921b-150592757d59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:40:27 np0005465988 nova_compute[236126]: 2025-10-02 12:40:27.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:40:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:27.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:40:27 np0005465988 ceph-mds[84851]: mds.beacon.cephfs.compute-2.gpiyct missed beacon ack from the monitors
Oct  2 08:40:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:27.378 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:27.379 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:40:27.379 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:28.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e341 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:40:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:29.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:29 np0005465988 nova_compute[236126]: 2025-10-02 12:40:29.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:40:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:30.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:40:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:40:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:31.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:40:32 np0005465988 nova_compute[236126]: 2025-10-02 12:40:32.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:32.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:33.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:33 np0005465988 podman[304971]: 2025-10-02 12:40:33.557497177 +0000 UTC m=+0.081055595 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid)
Oct  2 08:40:33 np0005465988 podman[304972]: 2025-10-02 12:40:33.575111557 +0000 UTC m=+0.090001168 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:40:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e341 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:40:33 np0005465988 podman[304970]: 2025-10-02 12:40:33.653997309 +0000 UTC m=+0.176407744 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:40:34 np0005465988 nova_compute[236126]: 2025-10-02 12:40:34.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:34.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:35.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:40:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:36.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:40:37 np0005465988 nova_compute[236126]: 2025-10-02 12:40:37.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:37.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:37 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:40:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:38.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e341 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:40:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:40:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:39.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:40:39 np0005465988 nova_compute[236126]: 2025-10-02 12:40:39.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:39 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:40:40 np0005465988 nova_compute[236126]: 2025-10-02 12:40:40.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:40.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:40 np0005465988 nova_compute[236126]: 2025-10-02 12:40:40.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:41.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:41 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:40:41 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:40:42 np0005465988 nova_compute[236126]: 2025-10-02 12:40:42.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:42.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:40:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:43.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:40:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e341 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:40:43 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:40:43 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:40:43 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:40:44 np0005465988 nova_compute[236126]: 2025-10-02 12:40:44.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:40:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:44.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:40:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:40:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:45.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:40:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:40:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:46.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:40:47 np0005465988 nova_compute[236126]: 2025-10-02 12:40:47.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:47.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:47 np0005465988 podman[305040]: 2025-10-02 12:40:47.547283797 +0000 UTC m=+0.077437392 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:40:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e341 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:40:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:48.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:49.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:49 np0005465988 nova_compute[236126]: 2025-10-02 12:40:49.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:40:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:50.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:40:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:51.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:52 np0005465988 nova_compute[236126]: 2025-10-02 12:40:52.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:40:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:52.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:40:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:53.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e341 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:40:54 np0005465988 nova_compute[236126]: 2025-10-02 12:40:54.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:40:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:54.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:40:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:40:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2023610297' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:40:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:40:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2023610297' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:40:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:55.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:56 np0005465988 nova_compute[236126]: 2025-10-02 12:40:56.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:40:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:56.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:40:57 np0005465988 nova_compute[236126]: 2025-10-02 12:40:57.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:40:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:57.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:40:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e341 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:40:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:40:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:40:58.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:40:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:40:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:40:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:40:59.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:40:59 np0005465988 nova_compute[236126]: 2025-10-02 12:40:59.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:59 np0005465988 nova_compute[236126]: 2025-10-02 12:40:59.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:59 np0005465988 nova_compute[236126]: 2025-10-02 12:40:59.503 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:59 np0005465988 nova_compute[236126]: 2025-10-02 12:40:59.503 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:59 np0005465988 nova_compute[236126]: 2025-10-02 12:40:59.503 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:59 np0005465988 nova_compute[236126]: 2025-10-02 12:40:59.503 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:40:59 np0005465988 nova_compute[236126]: 2025-10-02 12:40:59.504 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:40:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:40:59 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/708667367' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:40:59 np0005465988 nova_compute[236126]: 2025-10-02 12:40:59.959 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:00 np0005465988 nova_compute[236126]: 2025-10-02 12:41:00.166 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:41:00 np0005465988 nova_compute[236126]: 2025-10-02 12:41:00.167 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4221MB free_disk=20.92177963256836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:41:00 np0005465988 nova_compute[236126]: 2025-10-02 12:41:00.167 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:00 np0005465988 nova_compute[236126]: 2025-10-02 12:41:00.168 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:00 np0005465988 nova_compute[236126]: 2025-10-02 12:41:00.241 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:41:00 np0005465988 nova_compute[236126]: 2025-10-02 12:41:00.242 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:41:00 np0005465988 nova_compute[236126]: 2025-10-02 12:41:00.281 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:00.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:41:00 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4004202773' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:41:00 np0005465988 nova_compute[236126]: 2025-10-02 12:41:00.744 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:00 np0005465988 nova_compute[236126]: 2025-10-02 12:41:00.751 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:41:00 np0005465988 nova_compute[236126]: 2025-10-02 12:41:00.774 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:41:00 np0005465988 nova_compute[236126]: 2025-10-02 12:41:00.827 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:41:00 np0005465988 nova_compute[236126]: 2025-10-02 12:41:00.827 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:01.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:02 np0005465988 nova_compute[236126]: 2025-10-02 12:41:02.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:02.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:41:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:03.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:41:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e341 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:41:04 np0005465988 nova_compute[236126]: 2025-10-02 12:41:04.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:04 np0005465988 podman[305164]: 2025-10-02 12:41:04.540722639 +0000 UTC m=+0.070410712 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:41:04 np0005465988 podman[305163]: 2025-10-02 12:41:04.551907737 +0000 UTC m=+0.093582170 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:41:04 np0005465988 podman[305165]: 2025-10-02 12:41:04.555849009 +0000 UTC m=+0.083282068 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:41:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:04.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:04 np0005465988 nova_compute[236126]: 2025-10-02 12:41:04.827 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:05.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:05 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:41:05 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:41:06 np0005465988 nova_compute[236126]: 2025-10-02 12:41:06.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:06.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:07 np0005465988 nova_compute[236126]: 2025-10-02 12:41:07.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:41:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:07.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:41:08 np0005465988 nova_compute[236126]: 2025-10-02 12:41:08.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e341 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:41:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:41:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:08.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:41:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:09.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:09 np0005465988 nova_compute[236126]: 2025-10-02 12:41:09.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:09 np0005465988 nova_compute[236126]: 2025-10-02 12:41:09.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:10 np0005465988 nova_compute[236126]: 2025-10-02 12:41:10.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:10.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:11.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:41:11 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3051522132' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:41:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:41:11 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3051522132' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:41:11 np0005465988 nova_compute[236126]: 2025-10-02 12:41:11.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:11 np0005465988 nova_compute[236126]: 2025-10-02 12:41:11.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:41:11 np0005465988 nova_compute[236126]: 2025-10-02 12:41:11.606 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:11 np0005465988 nova_compute[236126]: 2025-10-02 12:41:11.607 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:11 np0005465988 nova_compute[236126]: 2025-10-02 12:41:11.635 2 DEBUG nova.compute.manager [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:41:11 np0005465988 nova_compute[236126]: 2025-10-02 12:41:11.735 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:11 np0005465988 nova_compute[236126]: 2025-10-02 12:41:11.736 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:11 np0005465988 nova_compute[236126]: 2025-10-02 12:41:11.746 2 DEBUG nova.virt.hardware [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:41:11 np0005465988 nova_compute[236126]: 2025-10-02 12:41:11.747 2 INFO nova.compute.claims [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:41:11 np0005465988 nova_compute[236126]: 2025-10-02 12:41:11.884 2 DEBUG oslo_concurrency.processutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:41:12 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2115603472' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.361 2 DEBUG oslo_concurrency.processutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.368 2 DEBUG nova.compute.provider_tree [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.387 2 DEBUG nova.scheduler.client.report [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.421 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.422 2 DEBUG nova.compute.manager [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.528 2 DEBUG nova.compute.manager [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.529 2 DEBUG nova.network.neutron [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.597 2 INFO nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.637 2 DEBUG nova.compute.manager [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:41:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:41:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:12.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.696 2 INFO nova.virt.block_device [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Booting with volume 6d39578f-a8e9-40a7-8016-2f523024dbcc at /dev/vda#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.929 2 DEBUG os_brick.utils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.930 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.939 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.940 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[cf83e3ae-5deb-4bf2-8922-50b9bc8e834b]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.941 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.947 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.947 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[9f166c9e-a6ad-44c1-a05e-e5271d943820]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.948 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.955 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.956 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[ef48a445-3d85-4e7a-97e6-e8de7c404860]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.957 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[66cff5c2-dc7d-44ad-b726-ce23031d822c]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.957 2 DEBUG oslo_concurrency.processutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.992 2 DEBUG oslo_concurrency.processutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "nvme version" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.995 2 DEBUG os_brick.initiator.connectors.lightos [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.995 2 DEBUG os_brick.initiator.connectors.lightos [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.995 2 DEBUG os_brick.initiator.connectors.lightos [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.996 2 DEBUG os_brick.utils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] <== get_connector_properties: return (66ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:41:12 np0005465988 nova_compute[236126]: 2025-10-02 12:41:12.996 2 DEBUG nova.virt.block_device [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updating existing volume attachment record: ae623a24-470d-48aa-b464-ea5427a6c254 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:41:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:41:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:13.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:41:13 np0005465988 nova_compute[236126]: 2025-10-02 12:41:13.354 2 DEBUG nova.policy [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b82c89ad6c4a49e78943f7a92d0a6560', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a41d99312f014c65adddea4f70536a15', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:41:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e341 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #112. Immutable memtables: 0.
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.024198) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 69] Flushing memtable with next log file: 112
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408874024236, "job": 69, "event": "flush_started", "num_memtables": 1, "num_entries": 852, "num_deletes": 251, "total_data_size": 1575641, "memory_usage": 1605568, "flush_reason": "Manual Compaction"}
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 69] Level-0 flush table #113: started
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408874051444, "cf_name": "default", "job": 69, "event": "table_file_creation", "file_number": 113, "file_size": 1027815, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 57295, "largest_seqno": 58142, "table_properties": {"data_size": 1023837, "index_size": 1694, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9658, "raw_average_key_size": 20, "raw_value_size": 1015525, "raw_average_value_size": 2111, "num_data_blocks": 74, "num_entries": 481, "num_filter_entries": 481, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759408812, "oldest_key_time": 1759408812, "file_creation_time": 1759408874, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 113, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 69] Flush lasted 27288 microseconds, and 4069 cpu microseconds.
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.051486) [db/flush_job.cc:967] [default] [JOB 69] Level-0 flush table #113: 1027815 bytes OK
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.051504) [db/memtable_list.cc:519] [default] Level-0 commit table #113 started
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.056051) [db/memtable_list.cc:722] [default] Level-0 commit table #113: memtable #1 done
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.056069) EVENT_LOG_v1 {"time_micros": 1759408874056064, "job": 69, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.056088) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 69] Try to delete WAL files size 1571214, prev total WAL file size 1586657, number of live WAL files 2.
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000109.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.056741) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034373639' seq:72057594037927935, type:22 .. '7061786F730035303231' seq:0, type:0; will stop at (end)
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 70] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 69 Base level 0, inputs: [113(1003KB)], [111(10MB)]
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408874056766, "job": 70, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [113], "files_L6": [111], "score": -1, "input_data_size": 12411777, "oldest_snapshot_seqno": -1}
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 70] Generated table #114: 8198 keys, 10536339 bytes, temperature: kUnknown
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408874194838, "cf_name": "default", "job": 70, "event": "table_file_creation", "file_number": 114, "file_size": 10536339, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10483665, "index_size": 31058, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20549, "raw_key_size": 212647, "raw_average_key_size": 25, "raw_value_size": 10339828, "raw_average_value_size": 1261, "num_data_blocks": 1210, "num_entries": 8198, "num_filter_entries": 8198, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759408874, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 114, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.195233) [db/compaction/compaction_job.cc:1663] [default] [JOB 70] Compacted 1@0 + 1@6 files to L6 => 10536339 bytes
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.204662) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 89.8 rd, 76.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 10.9 +0.0 blob) out(10.0 +0.0 blob), read-write-amplify(22.3) write-amplify(10.3) OK, records in: 8722, records dropped: 524 output_compression: NoCompression
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.204712) EVENT_LOG_v1 {"time_micros": 1759408874204688, "job": 70, "event": "compaction_finished", "compaction_time_micros": 138179, "compaction_time_cpu_micros": 24555, "output_level": 6, "num_output_files": 1, "total_output_size": 10536339, "num_input_records": 8722, "num_output_records": 8198, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000113.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408874205262, "job": 70, "event": "table_file_deletion", "file_number": 113}
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000111.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759408874209720, "job": 70, "event": "table_file_deletion", "file_number": 111}
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.056664) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.209825) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.209832) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.209834) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.209836) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:41:14 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:41:14.209838) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:41:14 np0005465988 nova_compute[236126]: 2025-10-02 12:41:14.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:14.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:15.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:15 np0005465988 nova_compute[236126]: 2025-10-02 12:41:15.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:15 np0005465988 nova_compute[236126]: 2025-10-02 12:41:15.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:41:15 np0005465988 nova_compute[236126]: 2025-10-02 12:41:15.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:41:15 np0005465988 nova_compute[236126]: 2025-10-02 12:41:15.594 2 DEBUG nova.compute.manager [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:41:15 np0005465988 nova_compute[236126]: 2025-10-02 12:41:15.596 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:41:15 np0005465988 nova_compute[236126]: 2025-10-02 12:41:15.597 2 INFO nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Creating image(s)#033[00m
Oct  2 08:41:15 np0005465988 nova_compute[236126]: 2025-10-02 12:41:15.598 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  2 08:41:15 np0005465988 nova_compute[236126]: 2025-10-02 12:41:15.598 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Ensure instance console log exists: /var/lib/nova/instances/5a11dca9-ede5-4fdd-af8e-7936ff4f9980/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:41:15 np0005465988 nova_compute[236126]: 2025-10-02 12:41:15.599 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:15 np0005465988 nova_compute[236126]: 2025-10-02 12:41:15.599 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:15 np0005465988 nova_compute[236126]: 2025-10-02 12:41:15.600 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:16 np0005465988 nova_compute[236126]: 2025-10-02 12:41:16.663 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:41:16 np0005465988 nova_compute[236126]: 2025-10-02 12:41:16.664 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:41:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:16.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:16 np0005465988 nova_compute[236126]: 2025-10-02 12:41:16.754 2 DEBUG nova.network.neutron [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Successfully created port: ffc7f957-3806-432f-a6e7-5ea3c764735a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:41:17 np0005465988 nova_compute[236126]: 2025-10-02 12:41:17.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:17.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:18 np0005465988 podman[305363]: 2025-10-02 12:41:18.505255353 +0000 UTC m=+0.046121532 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  2 08:41:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e341 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:41:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:18.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:19 np0005465988 nova_compute[236126]: 2025-10-02 12:41:19.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:41:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:19.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:41:19 np0005465988 nova_compute[236126]: 2025-10-02 12:41:19.885 2 DEBUG nova.network.neutron [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Successfully updated port: ffc7f957-3806-432f-a6e7-5ea3c764735a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:41:19 np0005465988 nova_compute[236126]: 2025-10-02 12:41:19.921 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:19 np0005465988 nova_compute[236126]: 2025-10-02 12:41:19.922 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquired lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:41:19 np0005465988 nova_compute[236126]: 2025-10-02 12:41:19.922 2 DEBUG nova.network.neutron [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:41:20 np0005465988 nova_compute[236126]: 2025-10-02 12:41:20.289 2 DEBUG nova.compute.manager [req-596efe9a-d666-487c-8525-1b03060ff4cc req-3de7e64b-2635-482c-895b-86ae80feaf53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received event network-changed-ffc7f957-3806-432f-a6e7-5ea3c764735a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:20 np0005465988 nova_compute[236126]: 2025-10-02 12:41:20.289 2 DEBUG nova.compute.manager [req-596efe9a-d666-487c-8525-1b03060ff4cc req-3de7e64b-2635-482c-895b-86ae80feaf53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Refreshing instance network info cache due to event network-changed-ffc7f957-3806-432f-a6e7-5ea3c764735a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:41:20 np0005465988 nova_compute[236126]: 2025-10-02 12:41:20.290 2 DEBUG oslo_concurrency.lockutils [req-596efe9a-d666-487c-8525-1b03060ff4cc req-3de7e64b-2635-482c-895b-86ae80feaf53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:20 np0005465988 nova_compute[236126]: 2025-10-02 12:41:20.622 2 DEBUG nova.network.neutron [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:41:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:41:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:20.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:41:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:21.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:22 np0005465988 nova_compute[236126]: 2025-10-02 12:41:22.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:22.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.219 2 DEBUG nova.network.neutron [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updating instance_info_cache with network_info: [{"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:41:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:23.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.259 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Releasing lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.260 2 DEBUG nova.compute.manager [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Instance network_info: |[{"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.261 2 DEBUG oslo_concurrency.lockutils [req-596efe9a-d666-487c-8525-1b03060ff4cc req-3de7e64b-2635-482c-895b-86ae80feaf53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.262 2 DEBUG nova.network.neutron [req-596efe9a-d666-487c-8525-1b03060ff4cc req-3de7e64b-2635-482c-895b-86ae80feaf53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Refreshing network info cache for port ffc7f957-3806-432f-a6e7-5ea3c764735a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.268 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Start _get_guest_xml network_info=[{"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': 
'/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': 'ae623a24-470d-48aa-b464-ea5427a6c254', 'disk_bus': 'virtio', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-6d39578f-a8e9-40a7-8016-2f523024dbcc', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '6d39578f-a8e9-40a7-8016-2f523024dbcc', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '5a11dca9-ede5-4fdd-af8e-7936ff4f9980', 'attached_at': '', 'detached_at': '', 'volume_id': '6d39578f-a8e9-40a7-8016-2f523024dbcc', 'serial': '6d39578f-a8e9-40a7-8016-2f523024dbcc'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.274 2 WARNING nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.280 2 DEBUG nova.virt.libvirt.host [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.282 2 DEBUG nova.virt.libvirt.host [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.286 2 DEBUG nova.virt.libvirt.host [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.287 2 DEBUG nova.virt.libvirt.host [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.288 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.288 2 DEBUG nova.virt.hardware [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.288 2 DEBUG nova.virt.hardware [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.288 2 DEBUG nova.virt.hardware [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.289 2 DEBUG nova.virt.hardware [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.289 2 DEBUG nova.virt.hardware [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.289 2 DEBUG nova.virt.hardware [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.289 2 DEBUG nova.virt.hardware [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.289 2 DEBUG nova.virt.hardware [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.289 2 DEBUG nova.virt.hardware [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.289 2 DEBUG nova.virt.hardware [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.290 2 DEBUG nova.virt.hardware [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.319 2 DEBUG nova.storage.rbd_utils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] rbd image 5a11dca9-ede5-4fdd-af8e-7936ff4f9980_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.323 2 DEBUG oslo_concurrency.processutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e342 e342: 3 total, 3 up, 3 in
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.549 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.550 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.599 2 DEBUG nova.compute.manager [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:41:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.745 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.745 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:41:23 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2110376952' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.758 2 DEBUG nova.virt.hardware [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.758 2 INFO nova.compute.claims [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.768 2 DEBUG oslo_concurrency.processutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.828 2 DEBUG nova.virt.libvirt.vif [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:41:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1297354926',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1297354926',id=152,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKhJxrDtnxwBQUfhXEoiE7UJdnEItyt2MVgFBXsCoh01cS2FKjJZa0tSLP7/9uktcmwDXaXDiKLD638dMdEY8dQy2aXxdKxSuJAyk4atAc8PHb6iv+FO/634dBFNFVRVg==',key_name='tempest-TestInstancesWithCinderVolumes-1888663332',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a41d99312f014c65adddea4f70536a15',ramdisk_id='',reservation_id='r-yursnfjy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestInstancesWithCinderVolumes-99684106',owner_user_name='tempest-TestInstancesWithCinderVolumes-99684106-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:41:12Z,user_data=None,user_id='b82c89ad6c4a49e78943f7a92d0a6560',uuid=5a11dca9-ede5-4fdd-af8e-7936ff4f9980,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.829 2 DEBUG nova.network.os_vif_util [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Converting VIF {"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.829 2 DEBUG nova.network.os_vif_util [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:52:ab,bridge_name='br-int',has_traffic_filtering=True,id=ffc7f957-3806-432f-a6e7-5ea3c764735a,network=Network(e7b8a8de-b6cd-4283-854b-a2bd919c371d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffc7f957-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.831 2 DEBUG nova.objects.instance [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.859 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  <uuid>5a11dca9-ede5-4fdd-af8e-7936ff4f9980</uuid>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  <name>instance-00000098</name>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestInstancesWithCinderVolumes-server-1297354926</nova:name>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:41:23</nova:creationTime>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <nova:user uuid="b82c89ad6c4a49e78943f7a92d0a6560">tempest-TestInstancesWithCinderVolumes-99684106-project-member</nova:user>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <nova:project uuid="a41d99312f014c65adddea4f70536a15">tempest-TestInstancesWithCinderVolumes-99684106</nova:project>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <nova:port uuid="ffc7f957-3806-432f-a6e7-5ea3c764735a">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <entry name="serial">5a11dca9-ede5-4fdd-af8e-7936ff4f9980</entry>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <entry name="uuid">5a11dca9-ede5-4fdd-af8e-7936ff4f9980</entry>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/5a11dca9-ede5-4fdd-af8e-7936ff4f9980_disk.config">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-6d39578f-a8e9-40a7-8016-2f523024dbcc">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <serial>6d39578f-a8e9-40a7-8016-2f523024dbcc</serial>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:5f:52:ab"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <target dev="tapffc7f957-38"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/5a11dca9-ede5-4fdd-af8e-7936ff4f9980/console.log" append="off"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:41:23 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:41:23 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:41:23 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:41:23 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.862 2 DEBUG nova.compute.manager [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Preparing to wait for external event network-vif-plugged-ffc7f957-3806-432f-a6e7-5ea3c764735a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.862 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.863 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.863 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.864 2 DEBUG nova.virt.libvirt.vif [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:41:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1297354926',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1297354926',id=152,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKhJxrDtnxwBQUfhXEoiE7UJdnEItyt2MVgFBXsCoh01cS2FKjJZa0tSLP7/9uktcmwDXaXDiKLD638dMdEY8dQy2aXxdKxSuJAyk4atAc8PHb6iv+FO/634dBFNFVRVg==',key_name='tempest-TestInstancesWithCinderVolumes-1888663332',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a41d99312f014c65adddea4f70536a15',ramdisk_id='',reservation_id='r-yursnfjy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestInstancesWithCinderVolumes-99684106',owner_user_name='tempest-TestInstancesWithCinderVolumes-99684106-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:41:12Z,user_data=None,user_id='b82c89ad6c4a49e78943f7a92d0a6560',uuid=5a11dca9-ede5-4fdd-af8e-7936ff4f9980,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.865 2 DEBUG nova.network.os_vif_util [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Converting VIF {"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.866 2 DEBUG nova.network.os_vif_util [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:52:ab,bridge_name='br-int',has_traffic_filtering=True,id=ffc7f957-3806-432f-a6e7-5ea3c764735a,network=Network(e7b8a8de-b6cd-4283-854b-a2bd919c371d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffc7f957-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.866 2 DEBUG os_vif [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:52:ab,bridge_name='br-int',has_traffic_filtering=True,id=ffc7f957-3806-432f-a6e7-5ea3c764735a,network=Network(e7b8a8de-b6cd-4283-854b-a2bd919c371d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffc7f957-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.872 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.873 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.883 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapffc7f957-38, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.885 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapffc7f957-38, col_values=(('external_ids', {'iface-id': 'ffc7f957-3806-432f-a6e7-5ea3c764735a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:52:ab', 'vm-uuid': '5a11dca9-ede5-4fdd-af8e-7936ff4f9980'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:23 np0005465988 NetworkManager[45041]: <info>  [1759408883.8888] manager: (tapffc7f957-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:23 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.899 2 INFO os_vif [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:52:ab,bridge_name='br-int',has_traffic_filtering=True,id=ffc7f957-3806-432f-a6e7-5ea3c764735a,network=Network(e7b8a8de-b6cd-4283-854b-a2bd919c371d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffc7f957-38')#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:23.999 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.000 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.002 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No VIF found with MAC fa:16:3e:5f:52:ab, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.003 2 INFO nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Using config drive#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.036 2 DEBUG nova.storage.rbd_utils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] rbd image 5a11dca9-ede5-4fdd-af8e-7936ff4f9980_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.045 2 DEBUG oslo_concurrency.processutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:41:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/945226095' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.564 2 DEBUG oslo_concurrency.processutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.572 2 DEBUG nova.compute.provider_tree [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.610 2 DEBUG nova.scheduler.client.report [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.641 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.895s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.642 2 DEBUG nova.compute.manager [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:41:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:24.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.712 2 DEBUG nova.compute.manager [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.712 2 DEBUG nova.network.neutron [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.745 2 INFO nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Creating config drive at /var/lib/nova/instances/5a11dca9-ede5-4fdd-af8e-7936ff4f9980/disk.config#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.756 2 DEBUG oslo_concurrency.processutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5a11dca9-ede5-4fdd-af8e-7936ff4f9980/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9b372txi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.805 2 INFO nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.843 2 DEBUG nova.compute.manager [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.920 2 DEBUG oslo_concurrency.processutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5a11dca9-ede5-4fdd-af8e-7936ff4f9980/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9b372txi" returned: 0 in 0.164s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.966 2 DEBUG nova.storage.rbd_utils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] rbd image 5a11dca9-ede5-4fdd-af8e-7936ff4f9980_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:41:24 np0005465988 nova_compute[236126]: 2025-10-02 12:41:24.975 2 DEBUG oslo_concurrency.processutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5a11dca9-ede5-4fdd-af8e-7936ff4f9980/disk.config 5a11dca9-ede5-4fdd-af8e-7936ff4f9980_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.033 2 INFO nova.virt.block_device [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Booting with volume 9ccd0211-39da-4f0a-ae5b-a7561864b3d4 at /dev/vda#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.206 2 DEBUG oslo_concurrency.processutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5a11dca9-ede5-4fdd-af8e-7936ff4f9980/disk.config 5a11dca9-ede5-4fdd-af8e-7936ff4f9980_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.232s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.207 2 INFO nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Deleting local config drive /var/lib/nova/instances/5a11dca9-ede5-4fdd-af8e-7936ff4f9980/disk.config because it was imported into RBD.#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.213 2 DEBUG nova.policy [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b82c89ad6c4a49e78943f7a92d0a6560', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a41d99312f014c65adddea4f70536a15', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:41:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:25.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.225 2 DEBUG os_brick.utils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.226 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.239 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.240 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[a517c20d-e50f-4de1-a4d8-906bb95b7f8b]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.241 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.251 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.252 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[b1058c1d-8231-4621-b7cd-c153fd92c79c]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.253 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.264 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.265 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[2a796295-c8f4-45a3-b96e-3903712b9c21]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.266 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[4a4b90fa-54da-488d-897d-4b68bb0d071b]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.266 2 DEBUG oslo_concurrency.processutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:25 np0005465988 kernel: tapffc7f957-38: entered promiscuous mode
Oct  2 08:41:25 np0005465988 NetworkManager[45041]: <info>  [1759408885.2750] manager: (tapffc7f957-38): new Tun device (/org/freedesktop/NetworkManager/Devices/305)
Oct  2 08:41:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:41:25Z|00682|binding|INFO|Claiming lport ffc7f957-3806-432f-a6e7-5ea3c764735a for this chassis.
Oct  2 08:41:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:41:25Z|00683|binding|INFO|ffc7f957-3806-432f-a6e7-5ea3c764735a: Claiming fa:16:3e:5f:52:ab 10.100.0.5
Oct  2 08:41:25 np0005465988 systemd-udevd[305528]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.311 2 DEBUG oslo_concurrency.processutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "nvme version" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:25 np0005465988 NetworkManager[45041]: <info>  [1759408885.3183] device (tapffc7f957-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:41:25 np0005465988 NetworkManager[45041]: <info>  [1759408885.3191] device (tapffc7f957-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:41:25 np0005465988 systemd-machined[192594]: New machine qemu-71-instance-00000098.
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.321 2 DEBUG os_brick.initiator.connectors.lightos [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.322 2 DEBUG os_brick.initiator.connectors.lightos [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.322 2 DEBUG os_brick.initiator.connectors.lightos [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.322 2 DEBUG os_brick.utils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] <== get_connector_properties: return (96ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.323 2 DEBUG nova.virt.block_device [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updating existing volume attachment record: 91d86631-64e7-4787-887f-760c514b960b _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:25 np0005465988 systemd[1]: Started Virtual Machine qemu-71-instance-00000098.
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:41:25Z|00684|binding|INFO|Setting lport ffc7f957-3806-432f-a6e7-5ea3c764735a ovn-installed in OVS
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:41:25Z|00685|binding|INFO|Setting lport ffc7f957-3806-432f-a6e7-5ea3c764735a up in Southbound
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.392 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:52:ab 10.100.0.5'], port_security=['fa:16:3e:5f:52:ab 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '5a11dca9-ede5-4fdd-af8e-7936ff4f9980', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a41d99312f014c65adddea4f70536a15', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5f3db9ba-e6e8-41b4-b916-387b4ad385f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4eaf2b53-ef61-475e-8161-94a8e63ff149, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=ffc7f957-3806-432f-a6e7-5ea3c764735a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.393 142124 INFO neutron.agent.ovn.metadata.agent [-] Port ffc7f957-3806-432f-a6e7-5ea3c764735a in datapath e7b8a8de-b6cd-4283-854b-a2bd919c371d bound to our chassis#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.395 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e7b8a8de-b6cd-4283-854b-a2bd919c371d#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.407 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9a8b5ff5-bd2d-4f61-b74e-97d2fe52ba24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.408 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape7b8a8de-b1 in ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.410 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape7b8a8de-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.410 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[db4cef14-461f-4bad-bcc1-da70cc9964c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.410 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[afd2be9e-e989-4ad0-aa5b-e0a72254814c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.433 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad2a33f-7f1a-4fb9-bae6-774e18d2d97d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.464 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[843cdaf5-b1b6-40a4-8ce9-e281265c45c6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.501 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[1b72559a-69c8-45de-9257-a3e65efe3605]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 NetworkManager[45041]: <info>  [1759408885.5098] manager: (tape7b8a8de-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/306)
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.508 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0e4a36b4-d647-464c-9c56-f1361d2a8fbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.565 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ccece9cb-8125-44c3-a9a7-da00a55dfa56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.568 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.568 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[61752223-b94f-49eb-ad93-175a10318029]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 NetworkManager[45041]: <info>  [1759408885.6009] device (tape7b8a8de-b0): carrier: link connected
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.609 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f5050df9-ac86-4580-8221-b20f08e6295b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.633 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e19f7880-b38e-4be6-a4ed-c57315fb9ce4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape7b8a8de-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:18:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694593, 'reachable_time': 44964, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305582, 'error': None, 'target': 'ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.656 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[be196616-ffc9-47fa-b349-17f9e10ed75f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7b:1819'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 694593, 'tstamp': 694593}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 305598, 'error': None, 'target': 'ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.683 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[158305ff-e246-45c9-b261-9bf2cce54038]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape7b8a8de-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:18:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694593, 'reachable_time': 44964, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 305602, 'error': None, 'target': 'ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.725 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c375ca05-2b7e-4245-8696-6a9a389240f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.788 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0309f0bb-108c-48b5-bad6-19a6c82ea9f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.789 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7b8a8de-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.790 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.790 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7b8a8de-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:25 np0005465988 NetworkManager[45041]: <info>  [1759408885.7928] manager: (tape7b8a8de-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Oct  2 08:41:25 np0005465988 kernel: tape7b8a8de-b0: entered promiscuous mode
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.794 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape7b8a8de-b0, col_values=(('external_ids', {'iface-id': '79bf28ab-e58e-4276-adf8-279ba85b1b49'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:25 np0005465988 ovn_controller[132601]: 2025-10-02T12:41:25Z|00686|binding|INFO|Releasing lport 79bf28ab-e58e-4276-adf8-279ba85b1b49 from this chassis (sb_readonly=0)
Oct  2 08:41:25 np0005465988 nova_compute[236126]: 2025-10-02 12:41:25.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.808 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e7b8a8de-b6cd-4283-854b-a2bd919c371d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e7b8a8de-b6cd-4283-854b-a2bd919c371d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.809 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[12810a4a-548c-4e6b-b8a2-98787b6a25d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.810 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-e7b8a8de-b6cd-4283-854b-a2bd919c371d
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/e7b8a8de-b6cd-4283-854b-a2bd919c371d.pid.haproxy
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID e7b8a8de-b6cd-4283-854b-a2bd919c371d
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:41:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:25.810 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'env', 'PROCESS_TAG=haproxy-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e7b8a8de-b6cd-4283-854b-a2bd919c371d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:41:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:41:26 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/115220749' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:41:26 np0005465988 podman[305640]: 2025-10-02 12:41:26.242657735 +0000 UTC m=+0.073861090 container create 14df4f6226daf267010745f7d308552c4d89813658fe6e141342e15626dfbc42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:41:26 np0005465988 nova_compute[236126]: 2025-10-02 12:41:26.269 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408886.2688413, 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:41:26 np0005465988 nova_compute[236126]: 2025-10-02 12:41:26.269 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] VM Started (Lifecycle Event)#033[00m
Oct  2 08:41:26 np0005465988 systemd[1]: Started libpod-conmon-14df4f6226daf267010745f7d308552c4d89813658fe6e141342e15626dfbc42.scope.
Oct  2 08:41:26 np0005465988 nova_compute[236126]: 2025-10-02 12:41:26.286 2 DEBUG nova.network.neutron [req-596efe9a-d666-487c-8525-1b03060ff4cc req-3de7e64b-2635-482c-895b-86ae80feaf53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updated VIF entry in instance network info cache for port ffc7f957-3806-432f-a6e7-5ea3c764735a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:41:26 np0005465988 nova_compute[236126]: 2025-10-02 12:41:26.286 2 DEBUG nova.network.neutron [req-596efe9a-d666-487c-8525-1b03060ff4cc req-3de7e64b-2635-482c-895b-86ae80feaf53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updating instance_info_cache with network_info: [{"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:41:26 np0005465988 podman[305640]: 2025-10-02 12:41:26.213178457 +0000 UTC m=+0.044381922 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:41:26 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:41:26 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb5001e0e485dd34d2bc40f97d2448ee3f37bb25495f0a4744ddc83e66e58ccd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:41:26 np0005465988 podman[305640]: 2025-10-02 12:41:26.323584745 +0000 UTC m=+0.154788120 container init 14df4f6226daf267010745f7d308552c4d89813658fe6e141342e15626dfbc42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:41:26 np0005465988 podman[305640]: 2025-10-02 12:41:26.328389972 +0000 UTC m=+0.159593327 container start 14df4f6226daf267010745f7d308552c4d89813658fe6e141342e15626dfbc42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 08:41:26 np0005465988 neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d[305655]: [NOTICE]   (305659) : New worker (305661) forked
Oct  2 08:41:26 np0005465988 neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d[305655]: [NOTICE]   (305659) : Loading success.
Oct  2 08:41:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:26.390 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:41:26 np0005465988 nova_compute[236126]: 2025-10-02 12:41:26.458 2 DEBUG oslo_concurrency.lockutils [req-596efe9a-d666-487c-8525-1b03060ff4cc req-3de7e64b-2635-482c-895b-86ae80feaf53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:41:26 np0005465988 nova_compute[236126]: 2025-10-02 12:41:26.475 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:26 np0005465988 nova_compute[236126]: 2025-10-02 12:41:26.481 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408886.268966, 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:41:26 np0005465988 nova_compute[236126]: 2025-10-02 12:41:26.481 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:41:26 np0005465988 nova_compute[236126]: 2025-10-02 12:41:26.515 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:26 np0005465988 nova_compute[236126]: 2025-10-02 12:41:26.520 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:41:26 np0005465988 nova_compute[236126]: 2025-10-02 12:41:26.563 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:41:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:26.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:27.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:27.379 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:27.380 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:27.381 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:28.391 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:41:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:28.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:28 np0005465988 nova_compute[236126]: 2025-10-02 12:41:28.842 2 DEBUG nova.compute.manager [req-968e5f18-af42-4906-aa2f-a6ec4840c933 req-9bdbf373-4362-48d8-90db-d272884120d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received event network-vif-plugged-ffc7f957-3806-432f-a6e7-5ea3c764735a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:28 np0005465988 nova_compute[236126]: 2025-10-02 12:41:28.843 2 DEBUG oslo_concurrency.lockutils [req-968e5f18-af42-4906-aa2f-a6ec4840c933 req-9bdbf373-4362-48d8-90db-d272884120d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:28 np0005465988 nova_compute[236126]: 2025-10-02 12:41:28.843 2 DEBUG oslo_concurrency.lockutils [req-968e5f18-af42-4906-aa2f-a6ec4840c933 req-9bdbf373-4362-48d8-90db-d272884120d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:28 np0005465988 nova_compute[236126]: 2025-10-02 12:41:28.843 2 DEBUG oslo_concurrency.lockutils [req-968e5f18-af42-4906-aa2f-a6ec4840c933 req-9bdbf373-4362-48d8-90db-d272884120d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:28 np0005465988 nova_compute[236126]: 2025-10-02 12:41:28.844 2 DEBUG nova.compute.manager [req-968e5f18-af42-4906-aa2f-a6ec4840c933 req-9bdbf373-4362-48d8-90db-d272884120d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Processing event network-vif-plugged-ffc7f957-3806-432f-a6e7-5ea3c764735a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:41:28 np0005465988 nova_compute[236126]: 2025-10-02 12:41:28.844 2 DEBUG nova.compute.manager [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:41:28 np0005465988 nova_compute[236126]: 2025-10-02 12:41:28.849 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408888.849029, 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:41:28 np0005465988 nova_compute[236126]: 2025-10-02 12:41:28.850 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:41:28 np0005465988 nova_compute[236126]: 2025-10-02 12:41:28.853 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:41:28 np0005465988 nova_compute[236126]: 2025-10-02 12:41:28.858 2 INFO nova.virt.libvirt.driver [-] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Instance spawned successfully.#033[00m
Oct  2 08:41:28 np0005465988 nova_compute[236126]: 2025-10-02 12:41:28.858 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:41:28 np0005465988 nova_compute[236126]: 2025-10-02 12:41:28.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:29 np0005465988 nova_compute[236126]: 2025-10-02 12:41:29.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:29.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:29 np0005465988 nova_compute[236126]: 2025-10-02 12:41:29.424 2 DEBUG nova.compute.manager [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:41:29 np0005465988 nova_compute[236126]: 2025-10-02 12:41:29.427 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:41:29 np0005465988 nova_compute[236126]: 2025-10-02 12:41:29.427 2 INFO nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Creating image(s)#033[00m
Oct  2 08:41:29 np0005465988 nova_compute[236126]: 2025-10-02 12:41:29.428 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  2 08:41:29 np0005465988 nova_compute[236126]: 2025-10-02 12:41:29.429 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Ensure instance console log exists: /var/lib/nova/instances/1ce0c3bd-552b-4bc2-95e9-ccac7b24593c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:41:29 np0005465988 nova_compute[236126]: 2025-10-02 12:41:29.429 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:29 np0005465988 nova_compute[236126]: 2025-10-02 12:41:29.430 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:29 np0005465988 nova_compute[236126]: 2025-10-02 12:41:29.431 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:30 np0005465988 nova_compute[236126]: 2025-10-02 12:41:30.111 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:30 np0005465988 nova_compute[236126]: 2025-10-02 12:41:30.117 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:30 np0005465988 nova_compute[236126]: 2025-10-02 12:41:30.117 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:30 np0005465988 nova_compute[236126]: 2025-10-02 12:41:30.118 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:30 np0005465988 nova_compute[236126]: 2025-10-02 12:41:30.119 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:30 np0005465988 nova_compute[236126]: 2025-10-02 12:41:30.119 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:30 np0005465988 nova_compute[236126]: 2025-10-02 12:41:30.120 2 DEBUG nova.virt.libvirt.driver [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:30 np0005465988 nova_compute[236126]: 2025-10-02 12:41:30.125 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:41:30 np0005465988 nova_compute[236126]: 2025-10-02 12:41:30.143 2 DEBUG nova.network.neutron [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Successfully created port: 6797a28e-4489-4337-b4be-f09d77787856 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:41:30 np0005465988 nova_compute[236126]: 2025-10-02 12:41:30.328 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:41:30 np0005465988 nova_compute[236126]: 2025-10-02 12:41:30.401 2 INFO nova.compute.manager [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Took 14.81 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:41:30 np0005465988 nova_compute[236126]: 2025-10-02 12:41:30.402 2 DEBUG nova.compute.manager [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:30 np0005465988 nova_compute[236126]: 2025-10-02 12:41:30.495 2 INFO nova.compute.manager [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Took 18.79 seconds to build instance.#033[00m
Oct  2 08:41:30 np0005465988 nova_compute[236126]: 2025-10-02 12:41:30.538 2 DEBUG oslo_concurrency.lockutils [None req-366018ff-f5b0-4d07-b4f3-8524a27fefc6 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.931s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:30.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:31 np0005465988 nova_compute[236126]: 2025-10-02 12:41:31.061 2 DEBUG nova.compute.manager [req-f76aaed1-24aa-4bf6-b6e7-2d4f42c90ef6 req-f3cf8e62-c7e6-41d4-8673-1d303208a32a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received event network-vif-plugged-ffc7f957-3806-432f-a6e7-5ea3c764735a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:31 np0005465988 nova_compute[236126]: 2025-10-02 12:41:31.062 2 DEBUG oslo_concurrency.lockutils [req-f76aaed1-24aa-4bf6-b6e7-2d4f42c90ef6 req-f3cf8e62-c7e6-41d4-8673-1d303208a32a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:31 np0005465988 nova_compute[236126]: 2025-10-02 12:41:31.062 2 DEBUG oslo_concurrency.lockutils [req-f76aaed1-24aa-4bf6-b6e7-2d4f42c90ef6 req-f3cf8e62-c7e6-41d4-8673-1d303208a32a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:31 np0005465988 nova_compute[236126]: 2025-10-02 12:41:31.063 2 DEBUG oslo_concurrency.lockutils [req-f76aaed1-24aa-4bf6-b6e7-2d4f42c90ef6 req-f3cf8e62-c7e6-41d4-8673-1d303208a32a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:31 np0005465988 nova_compute[236126]: 2025-10-02 12:41:31.063 2 DEBUG nova.compute.manager [req-f76aaed1-24aa-4bf6-b6e7-2d4f42c90ef6 req-f3cf8e62-c7e6-41d4-8673-1d303208a32a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] No waiting events found dispatching network-vif-plugged-ffc7f957-3806-432f-a6e7-5ea3c764735a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:41:31 np0005465988 nova_compute[236126]: 2025-10-02 12:41:31.063 2 WARNING nova.compute.manager [req-f76aaed1-24aa-4bf6-b6e7-2d4f42c90ef6 req-f3cf8e62-c7e6-41d4-8673-1d303208a32a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received unexpected event network-vif-plugged-ffc7f957-3806-432f-a6e7-5ea3c764735a for instance with vm_state active and task_state None.#033[00m
Oct  2 08:41:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:41:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:31.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:41:31 np0005465988 nova_compute[236126]: 2025-10-02 12:41:31.530 2 DEBUG nova.network.neutron [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Successfully updated port: 6797a28e-4489-4337-b4be-f09d77787856 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:41:31 np0005465988 nova_compute[236126]: 2025-10-02 12:41:31.549 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:31 np0005465988 nova_compute[236126]: 2025-10-02 12:41:31.550 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquired lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:41:31 np0005465988 nova_compute[236126]: 2025-10-02 12:41:31.550 2 DEBUG nova.network.neutron [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:41:31 np0005465988 nova_compute[236126]: 2025-10-02 12:41:31.660 2 DEBUG nova.compute.manager [req-0415f5f9-7717-42f5-8675-863bd7318dc3 req-9df6e516-c90b-492d-96cb-c3ecc26e242a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received event network-changed-6797a28e-4489-4337-b4be-f09d77787856 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:31 np0005465988 nova_compute[236126]: 2025-10-02 12:41:31.661 2 DEBUG nova.compute.manager [req-0415f5f9-7717-42f5-8675-863bd7318dc3 req-9df6e516-c90b-492d-96cb-c3ecc26e242a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Refreshing instance network info cache due to event network-changed-6797a28e-4489-4337-b4be-f09d77787856. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:41:31 np0005465988 nova_compute[236126]: 2025-10-02 12:41:31.661 2 DEBUG oslo_concurrency.lockutils [req-0415f5f9-7717-42f5-8675-863bd7318dc3 req-9df6e516-c90b-492d-96cb-c3ecc26e242a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:31 np0005465988 nova_compute[236126]: 2025-10-02 12:41:31.790 2 DEBUG nova.network.neutron [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:41:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:32.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:33.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:41:33 np0005465988 nova_compute[236126]: 2025-10-02 12:41:33.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:33 np0005465988 nova_compute[236126]: 2025-10-02 12:41:33.927 2 DEBUG oslo_concurrency.lockutils [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:33 np0005465988 nova_compute[236126]: 2025-10-02 12:41:33.928 2 DEBUG oslo_concurrency.lockutils [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:33 np0005465988 nova_compute[236126]: 2025-10-02 12:41:33.993 2 DEBUG nova.objects.instance [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'flavor' on Instance uuid 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.052 2 DEBUG oslo_concurrency.lockutils [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.059 2 DEBUG nova.network.neutron [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updating instance_info_cache with network_info: [{"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.121 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Releasing lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.123 2 DEBUG nova.compute.manager [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Instance network_info: |[{"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.126 2 DEBUG oslo_concurrency.lockutils [req-0415f5f9-7717-42f5-8675-863bd7318dc3 req-9df6e516-c90b-492d-96cb-c3ecc26e242a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.126 2 DEBUG nova.network.neutron [req-0415f5f9-7717-42f5-8675-863bd7318dc3 req-9df6e516-c90b-492d-96cb-c3ecc26e242a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Refreshing network info cache for port 6797a28e-4489-4337-b4be-f09d77787856 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.131 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Start _get_guest_xml network_info=[{"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '91d86631-64e7-4787-887f-760c514b960b', 'disk_bus': 'virtio', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-9ccd0211-39da-4f0a-ae5b-a7561864b3d4', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '9ccd0211-39da-4f0a-ae5b-a7561864b3d4', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '1ce0c3bd-552b-4bc2-95e9-ccac7b24593c', 'attached_at': '', 'detached_at': '', 'volume_id': '9ccd0211-39da-4f0a-ae5b-a7561864b3d4', 'serial': '9ccd0211-39da-4f0a-ae5b-a7561864b3d4'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.136 2 WARNING nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.144 2 DEBUG nova.virt.libvirt.host [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.145 2 DEBUG nova.virt.libvirt.host [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.149 2 DEBUG nova.virt.libvirt.host [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.150 2 DEBUG nova.virt.libvirt.host [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.151 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.151 2 DEBUG nova.virt.hardware [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.152 2 DEBUG nova.virt.hardware [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.152 2 DEBUG nova.virt.hardware [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.152 2 DEBUG nova.virt.hardware [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.153 2 DEBUG nova.virt.hardware [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.153 2 DEBUG nova.virt.hardware [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.153 2 DEBUG nova.virt.hardware [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.154 2 DEBUG nova.virt.hardware [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.154 2 DEBUG nova.virt.hardware [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.154 2 DEBUG nova.virt.hardware [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.154 2 DEBUG nova.virt.hardware [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.183 2 DEBUG nova.storage.rbd_utils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] rbd image 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.189 2 DEBUG oslo_concurrency.processutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.434 2 DEBUG oslo_concurrency.lockutils [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.435 2 DEBUG oslo_concurrency.lockutils [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.436 2 INFO nova.compute.manager [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Attaching volume bc8a781d-7240-4c85-8671-db184dc7c32b to /dev/vdb#033[00m
Oct  2 08:41:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:41:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/423294808' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.676 2 DEBUG oslo_concurrency.processutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:41:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:34.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.720 2 DEBUG nova.virt.libvirt.vif [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:41:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1280805717',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1280805717',id=155,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKhJxrDtnxwBQUfhXEoiE7UJdnEItyt2MVgFBXsCoh01cS2FKjJZa0tSLP7/9uktcmwDXaXDiKLD638dMdEY8dQy2aXxdKxSuJAyk4atAc8PHb6iv+FO/634dBFNFVRVg==',key_name='tempest-TestInstancesWithCinderVolumes-1888663332',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a41d99312f014c65adddea4f70536a15',ramdisk_id='',reservation_id='r-c03q7o85',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestInstancesWithCinderVolumes-99684106',owner_user_name='tempest-TestInstancesWithCinderVolumes-99684106-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:41:24Z,user_data=None,user_id='b82c89ad6c4a49e78943f7a92d0a6560',uuid=1ce0c3bd-552b-4bc2-95e9-ccac7b24593c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.721 2 DEBUG nova.network.os_vif_util [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Converting VIF {"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.722 2 DEBUG nova.network.os_vif_util [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:5d:14,bridge_name='br-int',has_traffic_filtering=True,id=6797a28e-4489-4337-b4be-f09d77787856,network=Network(e7b8a8de-b6cd-4283-854b-a2bd919c371d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6797a28e-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.724 2 DEBUG nova.objects.instance [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.739 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  <uuid>1ce0c3bd-552b-4bc2-95e9-ccac7b24593c</uuid>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  <name>instance-0000009b</name>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestInstancesWithCinderVolumes-server-1280805717</nova:name>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:41:34</nova:creationTime>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <nova:user uuid="b82c89ad6c4a49e78943f7a92d0a6560">tempest-TestInstancesWithCinderVolumes-99684106-project-member</nova:user>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <nova:project uuid="a41d99312f014c65adddea4f70536a15">tempest-TestInstancesWithCinderVolumes-99684106</nova:project>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <nova:port uuid="6797a28e-4489-4337-b4be-f09d77787856">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <entry name="serial">1ce0c3bd-552b-4bc2-95e9-ccac7b24593c</entry>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <entry name="uuid">1ce0c3bd-552b-4bc2-95e9-ccac7b24593c</entry>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/1ce0c3bd-552b-4bc2-95e9-ccac7b24593c_disk.config">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-9ccd0211-39da-4f0a-ae5b-a7561864b3d4">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <serial>9ccd0211-39da-4f0a-ae5b-a7561864b3d4</serial>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:34:5d:14"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <target dev="tap6797a28e-44"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/1ce0c3bd-552b-4bc2-95e9-ccac7b24593c/console.log" append="off"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:41:34 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:41:34 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:41:34 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:41:34 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.745 2 DEBUG nova.compute.manager [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Preparing to wait for external event network-vif-plugged-6797a28e-4489-4337-b4be-f09d77787856 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.745 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.745 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.746 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.746 2 DEBUG nova.virt.libvirt.vif [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:41:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1280805717',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1280805717',id=155,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKhJxrDtnxwBQUfhXEoiE7UJdnEItyt2MVgFBXsCoh01cS2FKjJZa0tSLP7/9uktcmwDXaXDiKLD638dMdEY8dQy2aXxdKxSuJAyk4atAc8PHb6iv+FO/634dBFNFVRVg==',key_name='tempest-TestInstancesWithCinderVolumes-1888663332',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a41d99312f014c65adddea4f70536a15',ramdisk_id='',reservation_id='r-c03q7o85',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestInstancesWithCinderVolumes-99684106',owner_user_name='tempest-TestInstancesWithCinderVolumes-99684106-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:41:24Z,user_data=None,user_id='b82c89ad6c4a49e78943f7a92d0a6560',uuid=1ce0c3bd-552b-4bc2-95e9-ccac7b24593c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.747 2 DEBUG nova.network.os_vif_util [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Converting VIF {"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.748 2 DEBUG nova.network.os_vif_util [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:5d:14,bridge_name='br-int',has_traffic_filtering=True,id=6797a28e-4489-4337-b4be-f09d77787856,network=Network(e7b8a8de-b6cd-4283-854b-a2bd919c371d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6797a28e-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.748 2 DEBUG os_vif [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:5d:14,bridge_name='br-int',has_traffic_filtering=True,id=6797a28e-4489-4337-b4be-f09d77787856,network=Network(e7b8a8de-b6cd-4283-854b-a2bd919c371d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6797a28e-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.749 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.749 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.899 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6797a28e-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.900 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6797a28e-44, col_values=(('external_ids', {'iface-id': '6797a28e-4489-4337-b4be-f09d77787856', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:5d:14', 'vm-uuid': '1ce0c3bd-552b-4bc2-95e9-ccac7b24593c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:34 np0005465988 NetworkManager[45041]: <info>  [1759408894.9026] manager: (tap6797a28e-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.904 2 DEBUG os_brick.utils [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.906 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.910 2 INFO os_vif [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:5d:14,bridge_name='br-int',has_traffic_filtering=True,id=6797a28e-4489-4337-b4be-f09d77787856,network=Network(e7b8a8de-b6cd-4283-854b-a2bd919c371d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6797a28e-44')#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.934 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.935 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[2f4ee270-ac38-44e3-ab96-4931b34fd9c9]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.939 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.952 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.952 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[1cd7d232-104c-44ae-8108-39fb4f3e11f6]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.957 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.971 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.971 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[1061798f-6e3b-4271-a10e-dce59baf30c0]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.973 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[9a1dc9fe-24e1-4b21-ad70-0d92a7ab01c6]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:34 np0005465988 nova_compute[236126]: 2025-10-02 12:41:34.973 2 DEBUG oslo_concurrency.processutils [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:35 np0005465988 nova_compute[236126]: 2025-10-02 12:41:35.021 2 DEBUG oslo_concurrency.processutils [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "nvme version" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:35 np0005465988 podman[305771]: 2025-10-02 12:41:35.022531165 +0000 UTC m=+0.059586684 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:41:35 np0005465988 nova_compute[236126]: 2025-10-02 12:41:35.023 2 DEBUG os_brick.initiator.connectors.lightos [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:41:35 np0005465988 nova_compute[236126]: 2025-10-02 12:41:35.023 2 DEBUG os_brick.initiator.connectors.lightos [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:41:35 np0005465988 nova_compute[236126]: 2025-10-02 12:41:35.024 2 DEBUG os_brick.initiator.connectors.lightos [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:41:35 np0005465988 nova_compute[236126]: 2025-10-02 12:41:35.028 2 DEBUG os_brick.utils [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] <== get_connector_properties: return (118ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:41:35 np0005465988 nova_compute[236126]: 2025-10-02 12:41:35.031 2 DEBUG nova.virt.block_device [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updating existing volume attachment record: 1be79524-2048-4e8d-9bdd-a2a2c35a9a17 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:41:35 np0005465988 nova_compute[236126]: 2025-10-02 12:41:35.048 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:35 np0005465988 nova_compute[236126]: 2025-10-02 12:41:35.048 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:35 np0005465988 nova_compute[236126]: 2025-10-02 12:41:35.049 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No VIF found with MAC fa:16:3e:34:5d:14, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:41:35 np0005465988 nova_compute[236126]: 2025-10-02 12:41:35.049 2 INFO nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Using config drive#033[00m
Oct  2 08:41:35 np0005465988 podman[305779]: 2025-10-02 12:41:35.055862312 +0000 UTC m=+0.089328779 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:41:35 np0005465988 nova_compute[236126]: 2025-10-02 12:41:35.081 2 DEBUG nova.storage.rbd_utils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] rbd image 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:41:35 np0005465988 podman[305769]: 2025-10-02 12:41:35.087248494 +0000 UTC m=+0.133305479 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:41:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:35.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.108 2 DEBUG nova.objects.instance [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'flavor' on Instance uuid 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.142 2 DEBUG nova.virt.libvirt.driver [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Attempting to attach volume bc8a781d-7240-4c85-8671-db184dc7c32b with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.145 2 INFO nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Creating config drive at /var/lib/nova/instances/1ce0c3bd-552b-4bc2-95e9-ccac7b24593c/disk.config#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.151 2 DEBUG oslo_concurrency.processutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1ce0c3bd-552b-4bc2-95e9-ccac7b24593c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnrgbqxwq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.201 2 DEBUG nova.virt.libvirt.guest [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] attach device xml: <disk type="network" device="disk">
Oct  2 08:41:36 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:41:36 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-bc8a781d-7240-4c85-8671-db184dc7c32b">
Oct  2 08:41:36 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:41:36 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:41:36 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:41:36 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:41:36 np0005465988 nova_compute[236126]:  <auth username="openstack">
Oct  2 08:41:36 np0005465988 nova_compute[236126]:    <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:41:36 np0005465988 nova_compute[236126]:  </auth>
Oct  2 08:41:36 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:41:36 np0005465988 nova_compute[236126]:  <serial>bc8a781d-7240-4c85-8671-db184dc7c32b</serial>
Oct  2 08:41:36 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:41:36 np0005465988 nova_compute[236126]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.317 2 DEBUG oslo_concurrency.processutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1ce0c3bd-552b-4bc2-95e9-ccac7b24593c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnrgbqxwq" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.348 2 DEBUG nova.storage.rbd_utils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] rbd image 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.355 2 DEBUG oslo_concurrency.processutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1ce0c3bd-552b-4bc2-95e9-ccac7b24593c/disk.config 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.430 2 DEBUG nova.virt.libvirt.driver [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.431 2 DEBUG nova.virt.libvirt.driver [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.432 2 DEBUG nova.virt.libvirt.driver [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.432 2 DEBUG nova.virt.libvirt.driver [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No VIF found with MAC fa:16:3e:5f:52:ab, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.532 2 DEBUG nova.network.neutron [req-0415f5f9-7717-42f5-8675-863bd7318dc3 req-9df6e516-c90b-492d-96cb-c3ecc26e242a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updated VIF entry in instance network info cache for port 6797a28e-4489-4337-b4be-f09d77787856. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.534 2 DEBUG nova.network.neutron [req-0415f5f9-7717-42f5-8675-863bd7318dc3 req-9df6e516-c90b-492d-96cb-c3ecc26e242a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updating instance_info_cache with network_info: [{"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.580 2 DEBUG oslo_concurrency.lockutils [req-0415f5f9-7717-42f5-8675-863bd7318dc3 req-9df6e516-c90b-492d-96cb-c3ecc26e242a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.595 2 DEBUG oslo_concurrency.processutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1ce0c3bd-552b-4bc2-95e9-ccac7b24593c/disk.config 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.240s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.596 2 INFO nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Deleting local config drive /var/lib/nova/instances/1ce0c3bd-552b-4bc2-95e9-ccac7b24593c/disk.config because it was imported into RBD.#033[00m
Oct  2 08:41:36 np0005465988 kernel: tap6797a28e-44: entered promiscuous mode
Oct  2 08:41:36 np0005465988 NetworkManager[45041]: <info>  [1759408896.6524] manager: (tap6797a28e-44): new Tun device (/org/freedesktop/NetworkManager/Devices/309)
Oct  2 08:41:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:41:36Z|00687|binding|INFO|Claiming lport 6797a28e-4489-4337-b4be-f09d77787856 for this chassis.
Oct  2 08:41:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:41:36Z|00688|binding|INFO|6797a28e-4489-4337-b4be-f09d77787856: Claiming fa:16:3e:34:5d:14 10.100.0.4
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:36.688 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:5d:14 10.100.0.4'], port_security=['fa:16:3e:34:5d:14 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '1ce0c3bd-552b-4bc2-95e9-ccac7b24593c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a41d99312f014c65adddea4f70536a15', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5f3db9ba-e6e8-41b4-b916-387b4ad385f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4eaf2b53-ef61-475e-8161-94a8e63ff149, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=6797a28e-4489-4337-b4be-f09d77787856) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:41:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:36.689 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 6797a28e-4489-4337-b4be-f09d77787856 in datapath e7b8a8de-b6cd-4283-854b-a2bd919c371d bound to our chassis#033[00m
Oct  2 08:41:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:41:36Z|00689|binding|INFO|Setting lport 6797a28e-4489-4337-b4be-f09d77787856 ovn-installed in OVS
Oct  2 08:41:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:41:36Z|00690|binding|INFO|Setting lport 6797a28e-4489-4337-b4be-f09d77787856 up in Southbound
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:36.692 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e7b8a8de-b6cd-4283-854b-a2bd919c371d#033[00m
Oct  2 08:41:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:36.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:36 np0005465988 systemd-udevd[305928]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:41:36 np0005465988 systemd-machined[192594]: New machine qemu-72-instance-0000009b.
Oct  2 08:41:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:36.714 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9ad83648-fc04-4a2d-88dd-d884590e2c6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:36 np0005465988 systemd[1]: Started Virtual Machine qemu-72-instance-0000009b.
Oct  2 08:41:36 np0005465988 NetworkManager[45041]: <info>  [1759408896.7328] device (tap6797a28e-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:41:36 np0005465988 NetworkManager[45041]: <info>  [1759408896.7347] device (tap6797a28e-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:41:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:36.753 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[66b4a16b-1c98-45fa-b4cc-9226528252b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:36.756 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2ead1111-d526-4925-8c19-10ac5a73e544]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:36.791 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5ce1605c-3246-4722-b050-bf2ae5c6be20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:36.814 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2a13d96f-dedb-4043-81e8-465e16ac9b72]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape7b8a8de-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:18:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 832, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 832, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694593, 'reachable_time': 44964, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305940, 'error': None, 'target': 'ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:36.832 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d0000077-e33c-4521-8f36-91d215b66369]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape7b8a8de-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 694608, 'tstamp': 694608}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 305942, 'error': None, 'target': 'ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape7b8a8de-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 694611, 'tstamp': 694611}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 305942, 'error': None, 'target': 'ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:36.834 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7b8a8de-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:36 np0005465988 nova_compute[236126]: 2025-10-02 12:41:36.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:36.839 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7b8a8de-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:36.839 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:41:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:36.840 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape7b8a8de-b0, col_values=(('external_ids', {'iface-id': '79bf28ab-e58e-4276-adf8-279ba85b1b49'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:41:36.840 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:41:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:37.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:37 np0005465988 nova_compute[236126]: 2025-10-02 12:41:37.623 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408897.6225646, 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:41:37 np0005465988 nova_compute[236126]: 2025-10-02 12:41:37.624 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] VM Started (Lifecycle Event)#033[00m
Oct  2 08:41:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:41:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:38.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:39 np0005465988 nova_compute[236126]: 2025-10-02 12:41:39.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:41:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:39.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:41:39 np0005465988 nova_compute[236126]: 2025-10-02 12:41:39.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:40 np0005465988 nova_compute[236126]: 2025-10-02 12:41:40.448 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:40 np0005465988 nova_compute[236126]: 2025-10-02 12:41:40.453 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408897.6227221, 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:41:40 np0005465988 nova_compute[236126]: 2025-10-02 12:41:40.454 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:41:40 np0005465988 nova_compute[236126]: 2025-10-02 12:41:40.492 2 DEBUG oslo_concurrency.lockutils [None req-8c6161a3-e5af-449e-8de4-6d8e2a7869ba b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 6.057s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:40 np0005465988 nova_compute[236126]: 2025-10-02 12:41:40.497 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:40 np0005465988 nova_compute[236126]: 2025-10-02 12:41:40.501 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:41:40 np0005465988 nova_compute[236126]: 2025-10-02 12:41:40.565 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:41:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:40.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:41.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.084 2 DEBUG oslo_concurrency.lockutils [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.084 2 DEBUG oslo_concurrency.lockutils [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.113 2 DEBUG nova.objects.instance [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'flavor' on Instance uuid 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.199 2 DEBUG oslo_concurrency.lockutils [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.517 2 DEBUG oslo_concurrency.lockutils [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.518 2 DEBUG oslo_concurrency.lockutils [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.518 2 INFO nova.compute.manager [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Attaching volume 1fa11495-af8b-4452-bf4e-b9eb1f185956 to /dev/vdc#033[00m
Oct  2 08:41:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:42.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.788 2 DEBUG os_brick.utils [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.791 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.812 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.813 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[041203e6-918a-41f3-a740-d657f44bfc78]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.815 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.830 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.830 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[2814042d-bab7-4530-b2d3-e2f50b5065ac]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.832 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.843 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.843 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[21c2660d-be3d-4a81-bfd0-395ca4a74c99]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.845 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad13a2c-b3f5-4b12-9c8e-6c1379783c79]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.846 2 DEBUG oslo_concurrency.processutils [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.906 2 DEBUG oslo_concurrency.processutils [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "nvme version" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.910 2 DEBUG os_brick.initiator.connectors.lightos [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.911 2 DEBUG os_brick.initiator.connectors.lightos [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.912 2 DEBUG os_brick.initiator.connectors.lightos [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.912 2 DEBUG os_brick.utils [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] <== get_connector_properties: return (123ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:41:42 np0005465988 nova_compute[236126]: 2025-10-02 12:41:42.913 2 DEBUG nova.virt.block_device [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updating existing volume attachment record: 76fd7d8b-2698-4170-afbb-be0640734ebf _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:41:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:43.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:43 np0005465988 ovn_controller[132601]: 2025-10-02T12:41:43Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5f:52:ab 10.100.0.5
Oct  2 08:41:43 np0005465988 ovn_controller[132601]: 2025-10-02T12:41:43Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5f:52:ab 10.100.0.5
Oct  2 08:41:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:41:44 np0005465988 nova_compute[236126]: 2025-10-02 12:41:44.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:44 np0005465988 nova_compute[236126]: 2025-10-02 12:41:44.359 2 DEBUG nova.objects.instance [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'flavor' on Instance uuid 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:41:44 np0005465988 nova_compute[236126]: 2025-10-02 12:41:44.397 2 DEBUG nova.virt.libvirt.driver [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Attempting to attach volume 1fa11495-af8b-4452-bf4e-b9eb1f185956 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Oct  2 08:41:44 np0005465988 nova_compute[236126]: 2025-10-02 12:41:44.401 2 DEBUG nova.virt.libvirt.guest [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] attach device xml: <disk type="network" device="disk">
Oct  2 08:41:44 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:41:44 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-1fa11495-af8b-4452-bf4e-b9eb1f185956">
Oct  2 08:41:44 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:41:44 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:41:44 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:41:44 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:41:44 np0005465988 nova_compute[236126]:  <auth username="openstack">
Oct  2 08:41:44 np0005465988 nova_compute[236126]:    <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:41:44 np0005465988 nova_compute[236126]:  </auth>
Oct  2 08:41:44 np0005465988 nova_compute[236126]:  <target dev="vdc" bus="virtio"/>
Oct  2 08:41:44 np0005465988 nova_compute[236126]:  <serial>1fa11495-af8b-4452-bf4e-b9eb1f185956</serial>
Oct  2 08:41:44 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:41:44 np0005465988 nova_compute[236126]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:41:44 np0005465988 nova_compute[236126]: 2025-10-02 12:41:44.580 2 DEBUG nova.virt.libvirt.driver [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:44 np0005465988 nova_compute[236126]: 2025-10-02 12:41:44.580 2 DEBUG nova.virt.libvirt.driver [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:44 np0005465988 nova_compute[236126]: 2025-10-02 12:41:44.581 2 DEBUG nova.virt.libvirt.driver [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:44 np0005465988 nova_compute[236126]: 2025-10-02 12:41:44.581 2 DEBUG nova.virt.libvirt.driver [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:44 np0005465988 nova_compute[236126]: 2025-10-02 12:41:44.582 2 DEBUG nova.virt.libvirt.driver [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No VIF found with MAC fa:16:3e:5f:52:ab, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:41:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:41:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:44.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:41:44 np0005465988 nova_compute[236126]: 2025-10-02 12:41:44.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:45.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.510 2 DEBUG oslo_concurrency.lockutils [None req-32441eb5-10c8-4388-8b68-e6cb8d7a575c b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.762 2 DEBUG nova.compute.manager [req-a515bd84-32fe-492a-bf42-f1837d32c34d req-226ae675-14c4-47cc-aa1d-b5bbef2db117 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received event network-vif-plugged-6797a28e-4489-4337-b4be-f09d77787856 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.762 2 DEBUG oslo_concurrency.lockutils [req-a515bd84-32fe-492a-bf42-f1837d32c34d req-226ae675-14c4-47cc-aa1d-b5bbef2db117 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.763 2 DEBUG oslo_concurrency.lockutils [req-a515bd84-32fe-492a-bf42-f1837d32c34d req-226ae675-14c4-47cc-aa1d-b5bbef2db117 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.763 2 DEBUG oslo_concurrency.lockutils [req-a515bd84-32fe-492a-bf42-f1837d32c34d req-226ae675-14c4-47cc-aa1d-b5bbef2db117 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.763 2 DEBUG nova.compute.manager [req-a515bd84-32fe-492a-bf42-f1837d32c34d req-226ae675-14c4-47cc-aa1d-b5bbef2db117 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Processing event network-vif-plugged-6797a28e-4489-4337-b4be-f09d77787856 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.764 2 DEBUG nova.compute.manager [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.772 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.773 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408905.772427, 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.773 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.778 2 INFO nova.virt.libvirt.driver [-] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Instance spawned successfully.#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.778 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.809 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.812 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.825 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.826 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.826 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.826 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.827 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.827 2 DEBUG nova.virt.libvirt.driver [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.854 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.941 2 INFO nova.compute.manager [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Took 16.52 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:41:45 np0005465988 nova_compute[236126]: 2025-10-02 12:41:45.942 2 DEBUG nova.compute.manager [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:46 np0005465988 nova_compute[236126]: 2025-10-02 12:41:46.042 2 INFO nova.compute.manager [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Took 22.38 seconds to build instance.#033[00m
Oct  2 08:41:46 np0005465988 nova_compute[236126]: 2025-10-02 12:41:46.104 2 DEBUG oslo_concurrency.lockutils [None req-5039ce9a-c6c4-49a5-98c8-63f1f1431055 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:41:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:46.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:41:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:47.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:48 np0005465988 nova_compute[236126]: 2025-10-02 12:41:48.233 2 DEBUG nova.compute.manager [req-dc1c5fce-5e58-4ada-97f9-2a261924d991 req-3c2ffcd7-7557-4b8e-b965-e4be007d4169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received event network-vif-plugged-6797a28e-4489-4337-b4be-f09d77787856 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:48 np0005465988 nova_compute[236126]: 2025-10-02 12:41:48.234 2 DEBUG oslo_concurrency.lockutils [req-dc1c5fce-5e58-4ada-97f9-2a261924d991 req-3c2ffcd7-7557-4b8e-b965-e4be007d4169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:48 np0005465988 nova_compute[236126]: 2025-10-02 12:41:48.234 2 DEBUG oslo_concurrency.lockutils [req-dc1c5fce-5e58-4ada-97f9-2a261924d991 req-3c2ffcd7-7557-4b8e-b965-e4be007d4169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:48 np0005465988 nova_compute[236126]: 2025-10-02 12:41:48.235 2 DEBUG oslo_concurrency.lockutils [req-dc1c5fce-5e58-4ada-97f9-2a261924d991 req-3c2ffcd7-7557-4b8e-b965-e4be007d4169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:48 np0005465988 nova_compute[236126]: 2025-10-02 12:41:48.236 2 DEBUG nova.compute.manager [req-dc1c5fce-5e58-4ada-97f9-2a261924d991 req-3c2ffcd7-7557-4b8e-b965-e4be007d4169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] No waiting events found dispatching network-vif-plugged-6797a28e-4489-4337-b4be-f09d77787856 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:41:48 np0005465988 nova_compute[236126]: 2025-10-02 12:41:48.236 2 WARNING nova.compute.manager [req-dc1c5fce-5e58-4ada-97f9-2a261924d991 req-3c2ffcd7-7557-4b8e-b965-e4be007d4169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received unexpected event network-vif-plugged-6797a28e-4489-4337-b4be-f09d77787856 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:41:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:41:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:41:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:48.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:41:49 np0005465988 nova_compute[236126]: 2025-10-02 12:41:49.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:49.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:49 np0005465988 NetworkManager[45041]: <info>  [1759408909.3662] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/310)
Oct  2 08:41:49 np0005465988 NetworkManager[45041]: <info>  [1759408909.3678] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/311)
Oct  2 08:41:49 np0005465988 nova_compute[236126]: 2025-10-02 12:41:49.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:49 np0005465988 podman[306020]: 2025-10-02 12:41:49.58134979 +0000 UTC m=+0.100996982 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:41:49 np0005465988 nova_compute[236126]: 2025-10-02 12:41:49.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:49 np0005465988 ovn_controller[132601]: 2025-10-02T12:41:49Z|00691|binding|INFO|Releasing lport 79bf28ab-e58e-4276-adf8-279ba85b1b49 from this chassis (sb_readonly=0)
Oct  2 08:41:49 np0005465988 nova_compute[236126]: 2025-10-02 12:41:49.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:49 np0005465988 nova_compute[236126]: 2025-10-02 12:41:49.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:50.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:41:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:51.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:41:51 np0005465988 nova_compute[236126]: 2025-10-02 12:41:51.586 2 DEBUG nova.compute.manager [req-da25b644-7a22-4253-95f9-4d1a83a47488 req-077c3b15-c2ba-4c90-b1f3-6bf6d026f282 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received event network-changed-ffc7f957-3806-432f-a6e7-5ea3c764735a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:51 np0005465988 nova_compute[236126]: 2025-10-02 12:41:51.588 2 DEBUG nova.compute.manager [req-da25b644-7a22-4253-95f9-4d1a83a47488 req-077c3b15-c2ba-4c90-b1f3-6bf6d026f282 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Refreshing instance network info cache due to event network-changed-ffc7f957-3806-432f-a6e7-5ea3c764735a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:41:51 np0005465988 nova_compute[236126]: 2025-10-02 12:41:51.588 2 DEBUG oslo_concurrency.lockutils [req-da25b644-7a22-4253-95f9-4d1a83a47488 req-077c3b15-c2ba-4c90-b1f3-6bf6d026f282 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:51 np0005465988 nova_compute[236126]: 2025-10-02 12:41:51.589 2 DEBUG oslo_concurrency.lockutils [req-da25b644-7a22-4253-95f9-4d1a83a47488 req-077c3b15-c2ba-4c90-b1f3-6bf6d026f282 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:41:51 np0005465988 nova_compute[236126]: 2025-10-02 12:41:51.589 2 DEBUG nova.network.neutron [req-da25b644-7a22-4253-95f9-4d1a83a47488 req-077c3b15-c2ba-4c90-b1f3-6bf6d026f282 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Refreshing network info cache for port ffc7f957-3806-432f-a6e7-5ea3c764735a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:41:51 np0005465988 nova_compute[236126]: 2025-10-02 12:41:51.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:52 np0005465988 nova_compute[236126]: 2025-10-02 12:41:52.637 2 DEBUG nova.compute.manager [req-08172b6d-b0b7-435d-b885-364f511c9a32 req-682f04c3-322f-40b4-9d08-cb8249d3b141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received event network-changed-ffc7f957-3806-432f-a6e7-5ea3c764735a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:52 np0005465988 nova_compute[236126]: 2025-10-02 12:41:52.639 2 DEBUG nova.compute.manager [req-08172b6d-b0b7-435d-b885-364f511c9a32 req-682f04c3-322f-40b4-9d08-cb8249d3b141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Refreshing instance network info cache due to event network-changed-ffc7f957-3806-432f-a6e7-5ea3c764735a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:41:52 np0005465988 nova_compute[236126]: 2025-10-02 12:41:52.641 2 DEBUG oslo_concurrency.lockutils [req-08172b6d-b0b7-435d-b885-364f511c9a32 req-682f04c3-322f-40b4-9d08-cb8249d3b141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:41:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:52.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:41:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:53.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:53 np0005465988 nova_compute[236126]: 2025-10-02 12:41:53.584 2 DEBUG nova.network.neutron [req-da25b644-7a22-4253-95f9-4d1a83a47488 req-077c3b15-c2ba-4c90-b1f3-6bf6d026f282 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updated VIF entry in instance network info cache for port ffc7f957-3806-432f-a6e7-5ea3c764735a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:41:53 np0005465988 nova_compute[236126]: 2025-10-02 12:41:53.586 2 DEBUG nova.network.neutron [req-da25b644-7a22-4253-95f9-4d1a83a47488 req-077c3b15-c2ba-4c90-b1f3-6bf6d026f282 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updating instance_info_cache with network_info: [{"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:41:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:41:53 np0005465988 nova_compute[236126]: 2025-10-02 12:41:53.633 2 DEBUG oslo_concurrency.lockutils [req-da25b644-7a22-4253-95f9-4d1a83a47488 req-077c3b15-c2ba-4c90-b1f3-6bf6d026f282 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:41:53 np0005465988 nova_compute[236126]: 2025-10-02 12:41:53.635 2 DEBUG oslo_concurrency.lockutils [req-08172b6d-b0b7-435d-b885-364f511c9a32 req-682f04c3-322f-40b4-9d08-cb8249d3b141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:41:53 np0005465988 nova_compute[236126]: 2025-10-02 12:41:53.635 2 DEBUG nova.network.neutron [req-08172b6d-b0b7-435d-b885-364f511c9a32 req-682f04c3-322f-40b4-9d08-cb8249d3b141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Refreshing network info cache for port ffc7f957-3806-432f-a6e7-5ea3c764735a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:41:54 np0005465988 nova_compute[236126]: 2025-10-02 12:41:54.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:54.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:54 np0005465988 nova_compute[236126]: 2025-10-02 12:41:54.847 2 DEBUG nova.compute.manager [req-f8fb7a10-0a52-4c63-8c2f-3ef2518515b6 req-6320756e-8bd0-447f-a2e1-c3da13800ea3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received event network-changed-ffc7f957-3806-432f-a6e7-5ea3c764735a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:54 np0005465988 nova_compute[236126]: 2025-10-02 12:41:54.848 2 DEBUG nova.compute.manager [req-f8fb7a10-0a52-4c63-8c2f-3ef2518515b6 req-6320756e-8bd0-447f-a2e1-c3da13800ea3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Refreshing instance network info cache due to event network-changed-ffc7f957-3806-432f-a6e7-5ea3c764735a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:41:54 np0005465988 nova_compute[236126]: 2025-10-02 12:41:54.849 2 DEBUG oslo_concurrency.lockutils [req-f8fb7a10-0a52-4c63-8c2f-3ef2518515b6 req-6320756e-8bd0-447f-a2e1-c3da13800ea3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:54 np0005465988 nova_compute[236126]: 2025-10-02 12:41:54.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:41:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1073050676' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:41:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:41:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1073050676' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:41:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:55.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:56.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:57.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:57 np0005465988 nova_compute[236126]: 2025-10-02 12:41:57.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:57 np0005465988 nova_compute[236126]: 2025-10-02 12:41:57.683 2 DEBUG nova.network.neutron [req-08172b6d-b0b7-435d-b885-364f511c9a32 req-682f04c3-322f-40b4-9d08-cb8249d3b141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updated VIF entry in instance network info cache for port ffc7f957-3806-432f-a6e7-5ea3c764735a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:41:57 np0005465988 nova_compute[236126]: 2025-10-02 12:41:57.683 2 DEBUG nova.network.neutron [req-08172b6d-b0b7-435d-b885-364f511c9a32 req-682f04c3-322f-40b4-9d08-cb8249d3b141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updating instance_info_cache with network_info: [{"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:41:57 np0005465988 nova_compute[236126]: 2025-10-02 12:41:57.707 2 DEBUG oslo_concurrency.lockutils [req-08172b6d-b0b7-435d-b885-364f511c9a32 req-682f04c3-322f-40b4-9d08-cb8249d3b141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:41:57 np0005465988 nova_compute[236126]: 2025-10-02 12:41:57.707 2 DEBUG oslo_concurrency.lockutils [req-f8fb7a10-0a52-4c63-8c2f-3ef2518515b6 req-6320756e-8bd0-447f-a2e1-c3da13800ea3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:41:57 np0005465988 nova_compute[236126]: 2025-10-02 12:41:57.707 2 DEBUG nova.network.neutron [req-f8fb7a10-0a52-4c63-8c2f-3ef2518515b6 req-6320756e-8bd0-447f-a2e1-c3da13800ea3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Refreshing network info cache for port ffc7f957-3806-432f-a6e7-5ea3c764735a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:41:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:41:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:41:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:41:58.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:41:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:41:58Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:34:5d:14 10.100.0.4
Oct  2 08:41:58 np0005465988 ovn_controller[132601]: 2025-10-02T12:41:58Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:34:5d:14 10.100.0.4
Oct  2 08:41:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e343 e343: 3 total, 3 up, 3 in
Oct  2 08:41:59 np0005465988 nova_compute[236126]: 2025-10-02 12:41:59.188 2 DEBUG nova.compute.manager [req-d297f99c-204d-493c-8b73-8198cf6e3001 req-5a6efeff-deba-44ff-9289-557e562d0a11 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received event network-changed-ffc7f957-3806-432f-a6e7-5ea3c764735a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:59 np0005465988 nova_compute[236126]: 2025-10-02 12:41:59.188 2 DEBUG nova.compute.manager [req-d297f99c-204d-493c-8b73-8198cf6e3001 req-5a6efeff-deba-44ff-9289-557e562d0a11 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Refreshing instance network info cache due to event network-changed-ffc7f957-3806-432f-a6e7-5ea3c764735a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:41:59 np0005465988 nova_compute[236126]: 2025-10-02 12:41:59.188 2 DEBUG oslo_concurrency.lockutils [req-d297f99c-204d-493c-8b73-8198cf6e3001 req-5a6efeff-deba-44ff-9289-557e562d0a11 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:41:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:41:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:41:59.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:41:59 np0005465988 nova_compute[236126]: 2025-10-02 12:41:59.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:59 np0005465988 nova_compute[236126]: 2025-10-02 12:41:59.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:59 np0005465988 nova_compute[236126]: 2025-10-02 12:41:59.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:00 np0005465988 nova_compute[236126]: 2025-10-02 12:42:00.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:00 np0005465988 nova_compute[236126]: 2025-10-02 12:42:00.510 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:00 np0005465988 nova_compute[236126]: 2025-10-02 12:42:00.511 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:00 np0005465988 nova_compute[236126]: 2025-10-02 12:42:00.511 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:00 np0005465988 nova_compute[236126]: 2025-10-02 12:42:00.512 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:42:00 np0005465988 nova_compute[236126]: 2025-10-02 12:42:00.512 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:00 np0005465988 nova_compute[236126]: 2025-10-02 12:42:00.586 2 DEBUG nova.network.neutron [req-f8fb7a10-0a52-4c63-8c2f-3ef2518515b6 req-6320756e-8bd0-447f-a2e1-c3da13800ea3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updated VIF entry in instance network info cache for port ffc7f957-3806-432f-a6e7-5ea3c764735a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:42:00 np0005465988 nova_compute[236126]: 2025-10-02 12:42:00.588 2 DEBUG nova.network.neutron [req-f8fb7a10-0a52-4c63-8c2f-3ef2518515b6 req-6320756e-8bd0-447f-a2e1-c3da13800ea3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updating instance_info_cache with network_info: [{"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:00 np0005465988 nova_compute[236126]: 2025-10-02 12:42:00.616 2 DEBUG oslo_concurrency.lockutils [req-f8fb7a10-0a52-4c63-8c2f-3ef2518515b6 req-6320756e-8bd0-447f-a2e1-c3da13800ea3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:42:00 np0005465988 nova_compute[236126]: 2025-10-02 12:42:00.617 2 DEBUG oslo_concurrency.lockutils [req-d297f99c-204d-493c-8b73-8198cf6e3001 req-5a6efeff-deba-44ff-9289-557e562d0a11 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:42:00 np0005465988 nova_compute[236126]: 2025-10-02 12:42:00.617 2 DEBUG nova.network.neutron [req-d297f99c-204d-493c-8b73-8198cf6e3001 req-5a6efeff-deba-44ff-9289-557e562d0a11 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Refreshing network info cache for port ffc7f957-3806-432f-a6e7-5ea3c764735a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:42:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:00.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:42:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1853213375' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.024 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000057s ======
Oct  2 08:42:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:01.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.289 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000098 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.290 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000098 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.290 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000098 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.290 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000098 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.295 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.296 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.512 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.513 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3774MB free_disk=20.875686645507812GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.513 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.513 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.681 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.681 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.681 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.682 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:42:01 np0005465988 nova_compute[236126]: 2025-10-02 12:42:01.761 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:42:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2808035723' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:42:02 np0005465988 nova_compute[236126]: 2025-10-02 12:42:02.335 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.575s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:02 np0005465988 nova_compute[236126]: 2025-10-02 12:42:02.342 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:42:02 np0005465988 nova_compute[236126]: 2025-10-02 12:42:02.369 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:42:02 np0005465988 nova_compute[236126]: 2025-10-02 12:42:02.408 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:42:02 np0005465988 nova_compute[236126]: 2025-10-02 12:42:02.409 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.895s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:42:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:02.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:42:02 np0005465988 nova_compute[236126]: 2025-10-02 12:42:02.909 2 DEBUG nova.network.neutron [req-d297f99c-204d-493c-8b73-8198cf6e3001 req-5a6efeff-deba-44ff-9289-557e562d0a11 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updated VIF entry in instance network info cache for port ffc7f957-3806-432f-a6e7-5ea3c764735a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:42:02 np0005465988 nova_compute[236126]: 2025-10-02 12:42:02.909 2 DEBUG nova.network.neutron [req-d297f99c-204d-493c-8b73-8198cf6e3001 req-5a6efeff-deba-44ff-9289-557e562d0a11 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updating instance_info_cache with network_info: [{"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:02 np0005465988 nova_compute[236126]: 2025-10-02 12:42:02.952 2 DEBUG oslo_concurrency.lockutils [req-d297f99c-204d-493c-8b73-8198cf6e3001 req-5a6efeff-deba-44ff-9289-557e562d0a11 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:42:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:03.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:03 np0005465988 nova_compute[236126]: 2025-10-02 12:42:03.560 2 DEBUG oslo_concurrency.lockutils [None req-63f594af-9703-4fce-ba5a-54b3edb42a9a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:03 np0005465988 nova_compute[236126]: 2025-10-02 12:42:03.560 2 DEBUG oslo_concurrency.lockutils [None req-63f594af-9703-4fce-ba5a-54b3edb42a9a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:03 np0005465988 nova_compute[236126]: 2025-10-02 12:42:03.592 2 INFO nova.compute.manager [None req-63f594af-9703-4fce-ba5a-54b3edb42a9a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Detaching volume bc8a781d-7240-4c85-8671-db184dc7c32b#033[00m
Oct  2 08:42:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e343 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:42:04 np0005465988 nova_compute[236126]: 2025-10-02 12:42:04.010 2 INFO nova.virt.block_device [None req-63f594af-9703-4fce-ba5a-54b3edb42a9a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Attempting to driver detach volume bc8a781d-7240-4c85-8671-db184dc7c32b from mountpoint /dev/vdb#033[00m
Oct  2 08:42:04 np0005465988 nova_compute[236126]: 2025-10-02 12:42:04.023 2 DEBUG nova.virt.libvirt.driver [None req-63f594af-9703-4fce-ba5a-54b3edb42a9a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Attempting to detach device vdb from instance 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:42:04 np0005465988 nova_compute[236126]: 2025-10-02 12:42:04.024 2 DEBUG nova.virt.libvirt.guest [None req-63f594af-9703-4fce-ba5a-54b3edb42a9a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:42:04 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:42:04 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-bc8a781d-7240-4c85-8671-db184dc7c32b">
Oct  2 08:42:04 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:42:04 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:42:04 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:42:04 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:42:04 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:42:04 np0005465988 nova_compute[236126]:  <serial>bc8a781d-7240-4c85-8671-db184dc7c32b</serial>
Oct  2 08:42:04 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Oct  2 08:42:04 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:42:04 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:42:04 np0005465988 nova_compute[236126]: 2025-10-02 12:42:04.047 2 INFO nova.virt.libvirt.driver [None req-63f594af-9703-4fce-ba5a-54b3edb42a9a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Successfully detached device vdb from instance 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 from the persistent domain config.#033[00m
Oct  2 08:42:04 np0005465988 nova_compute[236126]: 2025-10-02 12:42:04.048 2 DEBUG nova.virt.libvirt.driver [None req-63f594af-9703-4fce-ba5a-54b3edb42a9a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Oct  2 08:42:04 np0005465988 nova_compute[236126]: 2025-10-02 12:42:04.049 2 DEBUG nova.virt.libvirt.guest [None req-63f594af-9703-4fce-ba5a-54b3edb42a9a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:42:04 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:42:04 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-bc8a781d-7240-4c85-8671-db184dc7c32b">
Oct  2 08:42:04 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:42:04 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:42:04 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:42:04 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:42:04 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:42:04 np0005465988 nova_compute[236126]:  <serial>bc8a781d-7240-4c85-8671-db184dc7c32b</serial>
Oct  2 08:42:04 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Oct  2 08:42:04 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:42:04 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:42:04 np0005465988 nova_compute[236126]: 2025-10-02 12:42:04.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:04 np0005465988 nova_compute[236126]: 2025-10-02 12:42:04.346 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Received event <DeviceRemovedEvent: 1759408924.3459778, 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Oct  2 08:42:04 np0005465988 nova_compute[236126]: 2025-10-02 12:42:04.350 2 DEBUG nova.virt.libvirt.driver [None req-63f594af-9703-4fce-ba5a-54b3edb42a9a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Oct  2 08:42:04 np0005465988 nova_compute[236126]: 2025-10-02 12:42:04.354 2 INFO nova.virt.libvirt.driver [None req-63f594af-9703-4fce-ba5a-54b3edb42a9a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Successfully detached device vdb from instance 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 from the live domain config.#033[00m
Oct  2 08:42:04 np0005465988 nova_compute[236126]: 2025-10-02 12:42:04.709 2 DEBUG nova.objects.instance [None req-63f594af-9703-4fce-ba5a-54b3edb42a9a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'flavor' on Instance uuid 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:42:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:04.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:04 np0005465988 nova_compute[236126]: 2025-10-02 12:42:04.770 2 DEBUG oslo_concurrency.lockutils [None req-63f594af-9703-4fce-ba5a-54b3edb42a9a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:04 np0005465988 nova_compute[236126]: 2025-10-02 12:42:04.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:05 np0005465988 podman[306197]: 2025-10-02 12:42:05.200507819 +0000 UTC m=+0.086601422 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd)
Oct  2 08:42:05 np0005465988 podman[306196]: 2025-10-02 12:42:05.223820142 +0000 UTC m=+0.111798629 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct  2 08:42:05 np0005465988 podman[306198]: 2025-10-02 12:42:05.255967435 +0000 UTC m=+0.129125281 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:42:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:05.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:05 np0005465988 nova_compute[236126]: 2025-10-02 12:42:05.409 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:06 np0005465988 nova_compute[236126]: 2025-10-02 12:42:06.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:42:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:42:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:42:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:42:06 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1730528600' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:42:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:06.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:42:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:07.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:42:08 np0005465988 nova_compute[236126]: 2025-10-02 12:42:08.274 2 DEBUG oslo_concurrency.lockutils [None req-d6e3e53d-ac63-4d5d-9288-628ff7b60231 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:08 np0005465988 nova_compute[236126]: 2025-10-02 12:42:08.275 2 DEBUG oslo_concurrency.lockutils [None req-d6e3e53d-ac63-4d5d-9288-628ff7b60231 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:08 np0005465988 nova_compute[236126]: 2025-10-02 12:42:08.307 2 INFO nova.compute.manager [None req-d6e3e53d-ac63-4d5d-9288-628ff7b60231 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Detaching volume 1fa11495-af8b-4452-bf4e-b9eb1f185956#033[00m
Oct  2 08:42:08 np0005465988 nova_compute[236126]: 2025-10-02 12:42:08.467 2 INFO nova.virt.block_device [None req-d6e3e53d-ac63-4d5d-9288-628ff7b60231 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Attempting to driver detach volume 1fa11495-af8b-4452-bf4e-b9eb1f185956 from mountpoint /dev/vdc#033[00m
Oct  2 08:42:08 np0005465988 nova_compute[236126]: 2025-10-02 12:42:08.485 2 DEBUG nova.virt.libvirt.driver [None req-d6e3e53d-ac63-4d5d-9288-628ff7b60231 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Attempting to detach device vdc from instance 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:42:08 np0005465988 nova_compute[236126]: 2025-10-02 12:42:08.486 2 DEBUG nova.virt.libvirt.guest [None req-d6e3e53d-ac63-4d5d-9288-628ff7b60231 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:42:08 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:42:08 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-1fa11495-af8b-4452-bf4e-b9eb1f185956">
Oct  2 08:42:08 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:42:08 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:42:08 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:42:08 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:42:08 np0005465988 nova_compute[236126]:  <target dev="vdc" bus="virtio"/>
Oct  2 08:42:08 np0005465988 nova_compute[236126]:  <serial>1fa11495-af8b-4452-bf4e-b9eb1f185956</serial>
Oct  2 08:42:08 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Oct  2 08:42:08 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:42:08 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:42:08 np0005465988 nova_compute[236126]: 2025-10-02 12:42:08.498 2 INFO nova.virt.libvirt.driver [None req-d6e3e53d-ac63-4d5d-9288-628ff7b60231 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Successfully detached device vdc from instance 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 from the persistent domain config.#033[00m
Oct  2 08:42:08 np0005465988 nova_compute[236126]: 2025-10-02 12:42:08.499 2 DEBUG nova.virt.libvirt.driver [None req-d6e3e53d-ac63-4d5d-9288-628ff7b60231 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Oct  2 08:42:08 np0005465988 nova_compute[236126]: 2025-10-02 12:42:08.500 2 DEBUG nova.virt.libvirt.guest [None req-d6e3e53d-ac63-4d5d-9288-628ff7b60231 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:42:08 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:42:08 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-1fa11495-af8b-4452-bf4e-b9eb1f185956">
Oct  2 08:42:08 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:42:08 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:42:08 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:42:08 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:42:08 np0005465988 nova_compute[236126]:  <target dev="vdc" bus="virtio"/>
Oct  2 08:42:08 np0005465988 nova_compute[236126]:  <serial>1fa11495-af8b-4452-bf4e-b9eb1f185956</serial>
Oct  2 08:42:08 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Oct  2 08:42:08 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:42:08 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:42:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e343 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:42:08 np0005465988 nova_compute[236126]: 2025-10-02 12:42:08.640 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Received event <DeviceRemovedEvent: 1759408928.6404464, 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Oct  2 08:42:08 np0005465988 nova_compute[236126]: 2025-10-02 12:42:08.643 2 DEBUG nova.virt.libvirt.driver [None req-d6e3e53d-ac63-4d5d-9288-628ff7b60231 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Oct  2 08:42:08 np0005465988 nova_compute[236126]: 2025-10-02 12:42:08.647 2 INFO nova.virt.libvirt.driver [None req-d6e3e53d-ac63-4d5d-9288-628ff7b60231 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Successfully detached device vdc from instance 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 from the live domain config.#033[00m
Oct  2 08:42:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:08.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:08 np0005465988 nova_compute[236126]: 2025-10-02 12:42:08.934 2 DEBUG nova.objects.instance [None req-d6e3e53d-ac63-4d5d-9288-628ff7b60231 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'flavor' on Instance uuid 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:42:09 np0005465988 nova_compute[236126]: 2025-10-02 12:42:09.002 2 DEBUG oslo_concurrency.lockutils [None req-d6e3e53d-ac63-4d5d-9288-628ff7b60231 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.726s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:09.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:09 np0005465988 nova_compute[236126]: 2025-10-02 12:42:09.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:09 np0005465988 nova_compute[236126]: 2025-10-02 12:42:09.404 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "c70e8f51-9397-40dd-9bbe-210e60b75364" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:09 np0005465988 nova_compute[236126]: 2025-10-02 12:42:09.405 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "c70e8f51-9397-40dd-9bbe-210e60b75364" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:09 np0005465988 nova_compute[236126]: 2025-10-02 12:42:09.426 2 DEBUG nova.compute.manager [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:42:09 np0005465988 nova_compute[236126]: 2025-10-02 12:42:09.565 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:09 np0005465988 nova_compute[236126]: 2025-10-02 12:42:09.565 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:09 np0005465988 nova_compute[236126]: 2025-10-02 12:42:09.571 2 DEBUG nova.virt.hardware [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:42:09 np0005465988 nova_compute[236126]: 2025-10-02 12:42:09.571 2 INFO nova.compute.claims [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:42:09 np0005465988 nova_compute[236126]: 2025-10-02 12:42:09.752 2 DEBUG oslo_concurrency.processutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.207 2 DEBUG nova.compute.manager [req-faf3ff73-813a-48a3-998e-09203e41bc53 req-aeead96a-409f-400d-b429-b595ef802d09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received event network-changed-ffc7f957-3806-432f-a6e7-5ea3c764735a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.207 2 DEBUG nova.compute.manager [req-faf3ff73-813a-48a3-998e-09203e41bc53 req-aeead96a-409f-400d-b429-b595ef802d09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Refreshing instance network info cache due to event network-changed-ffc7f957-3806-432f-a6e7-5ea3c764735a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.208 2 DEBUG oslo_concurrency.lockutils [req-faf3ff73-813a-48a3-998e-09203e41bc53 req-aeead96a-409f-400d-b429-b595ef802d09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.209 2 DEBUG oslo_concurrency.lockutils [req-faf3ff73-813a-48a3-998e-09203e41bc53 req-aeead96a-409f-400d-b429-b595ef802d09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.209 2 DEBUG nova.network.neutron [req-faf3ff73-813a-48a3-998e-09203e41bc53 req-aeead96a-409f-400d-b429-b595ef802d09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Refreshing network info cache for port ffc7f957-3806-432f-a6e7-5ea3c764735a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:42:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:42:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3584363981' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.304 2 DEBUG oslo_concurrency.processutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.552s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.311 2 DEBUG nova.compute.provider_tree [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.331 2 DEBUG nova.scheduler.client.report [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.366 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.800s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.367 2 DEBUG nova.compute.manager [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.446 2 DEBUG nova.compute.manager [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.447 2 DEBUG nova.network.neutron [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.470 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.471 2 INFO nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:42:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/429066828' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:42:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:42:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/429066828' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.497 2 DEBUG nova.compute.manager [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.609 2 DEBUG nova.compute.manager [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.611 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.613 2 INFO nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Creating image(s)#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.660 2 DEBUG nova.storage.rbd_utils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] rbd image c70e8f51-9397-40dd-9bbe-210e60b75364_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.695 2 DEBUG nova.storage.rbd_utils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] rbd image c70e8f51-9397-40dd-9bbe-210e60b75364_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.730 2 DEBUG nova.storage.rbd_utils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] rbd image c70e8f51-9397-40dd-9bbe-210e60b75364_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.734 2 DEBUG oslo_concurrency.processutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:42:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:10.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.780 2 DEBUG nova.policy [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56c6abe1bb704c8aa499677aeb9017f5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4b8f9114c7ab4b6e9fc9650d4bd08af9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.834 2 DEBUG oslo_concurrency.processutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.835 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.836 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.836 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.870 2 DEBUG nova.storage.rbd_utils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] rbd image c70e8f51-9397-40dd-9bbe-210e60b75364_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:42:10 np0005465988 nova_compute[236126]: 2025-10-02 12:42:10.876 2 DEBUG oslo_concurrency.processutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 c70e8f51-9397-40dd-9bbe-210e60b75364_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:11 np0005465988 nova_compute[236126]: 2025-10-02 12:42:11.203 2 DEBUG oslo_concurrency.processutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 c70e8f51-9397-40dd-9bbe-210e60b75364_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.327s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:11.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:11 np0005465988 nova_compute[236126]: 2025-10-02 12:42:11.317 2 DEBUG nova.storage.rbd_utils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] resizing rbd image c70e8f51-9397-40dd-9bbe-210e60b75364_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:42:11 np0005465988 nova_compute[236126]: 2025-10-02 12:42:11.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:11 np0005465988 nova_compute[236126]: 2025-10-02 12:42:11.476 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:42:11 np0005465988 nova_compute[236126]: 2025-10-02 12:42:11.484 2 DEBUG nova.objects.instance [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lazy-loading 'migration_context' on Instance uuid c70e8f51-9397-40dd-9bbe-210e60b75364 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:42:11 np0005465988 nova_compute[236126]: 2025-10-02 12:42:11.502 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:42:11 np0005465988 nova_compute[236126]: 2025-10-02 12:42:11.503 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Ensure instance console log exists: /var/lib/nova/instances/c70e8f51-9397-40dd-9bbe-210e60b75364/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:42:11 np0005465988 nova_compute[236126]: 2025-10-02 12:42:11.504 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:11 np0005465988 nova_compute[236126]: 2025-10-02 12:42:11.504 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:11 np0005465988 nova_compute[236126]: 2025-10-02 12:42:11.505 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:11 np0005465988 nova_compute[236126]: 2025-10-02 12:42:11.699 2 DEBUG nova.network.neutron [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Successfully created port: 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:42:12 np0005465988 nova_compute[236126]: 2025-10-02 12:42:12.326 2 DEBUG nova.network.neutron [req-faf3ff73-813a-48a3-998e-09203e41bc53 req-aeead96a-409f-400d-b429-b595ef802d09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updated VIF entry in instance network info cache for port ffc7f957-3806-432f-a6e7-5ea3c764735a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:42:12 np0005465988 nova_compute[236126]: 2025-10-02 12:42:12.327 2 DEBUG nova.network.neutron [req-faf3ff73-813a-48a3-998e-09203e41bc53 req-aeead96a-409f-400d-b429-b595ef802d09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updating instance_info_cache with network_info: [{"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:12 np0005465988 nova_compute[236126]: 2025-10-02 12:42:12.352 2 DEBUG oslo_concurrency.lockutils [req-faf3ff73-813a-48a3-998e-09203e41bc53 req-aeead96a-409f-400d-b429-b595ef802d09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:42:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:12.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:42:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:13.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:42:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:42:13 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:42:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e343 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:42:14 np0005465988 nova_compute[236126]: 2025-10-02 12:42:14.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:14.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:15 np0005465988 nova_compute[236126]: 2025-10-02 12:42:15.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:15 np0005465988 nova_compute[236126]: 2025-10-02 12:42:15.217 2 DEBUG nova.network.neutron [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Successfully updated port: 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:42:15 np0005465988 nova_compute[236126]: 2025-10-02 12:42:15.244 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:42:15 np0005465988 nova_compute[236126]: 2025-10-02 12:42:15.245 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquired lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:42:15 np0005465988 nova_compute[236126]: 2025-10-02 12:42:15.245 2 DEBUG nova.network.neutron [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:42:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:15.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:15 np0005465988 nova_compute[236126]: 2025-10-02 12:42:15.336 2 DEBUG nova.compute.manager [req-29e37ac5-62e4-4169-9cc2-93f8af8c9c48 req-9f707443-8809-4bf2-a0e8-a0170352a491 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Received event network-changed-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:15 np0005465988 nova_compute[236126]: 2025-10-02 12:42:15.337 2 DEBUG nova.compute.manager [req-29e37ac5-62e4-4169-9cc2-93f8af8c9c48 req-9f707443-8809-4bf2-a0e8-a0170352a491 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Refreshing instance network info cache due to event network-changed-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:42:15 np0005465988 nova_compute[236126]: 2025-10-02 12:42:15.337 2 DEBUG oslo_concurrency.lockutils [req-29e37ac5-62e4-4169-9cc2-93f8af8c9c48 req-9f707443-8809-4bf2-a0e8-a0170352a491 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:42:15 np0005465988 nova_compute[236126]: 2025-10-02 12:42:15.428 2 DEBUG nova.network.neutron [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:42:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:42:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:16.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.100 2 DEBUG nova.network.neutron [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Updating instance_info_cache with network_info: [{"id": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "address": "fa:16:3e:e0:bd:5e", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fd83de3-58", "ovs_interfaceid": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:17.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.398 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Releasing lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.399 2 DEBUG nova.compute.manager [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Instance network_info: |[{"id": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "address": "fa:16:3e:e0:bd:5e", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fd83de3-58", "ovs_interfaceid": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.401 2 DEBUG oslo_concurrency.lockutils [req-29e37ac5-62e4-4169-9cc2-93f8af8c9c48 req-9f707443-8809-4bf2-a0e8-a0170352a491 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.401 2 DEBUG nova.network.neutron [req-29e37ac5-62e4-4169-9cc2-93f8af8c9c48 req-9f707443-8809-4bf2-a0e8-a0170352a491 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Refreshing network info cache for port 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.407 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Start _get_guest_xml network_info=[{"id": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "address": "fa:16:3e:e0:bd:5e", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fd83de3-58", "ovs_interfaceid": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.413 2 WARNING nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.418 2 DEBUG nova.virt.libvirt.host [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.419 2 DEBUG nova.virt.libvirt.host [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.424 2 DEBUG nova.virt.libvirt.host [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.424 2 DEBUG nova.virt.libvirt.host [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.426 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.428 2 DEBUG nova.virt.hardware [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.428 2 DEBUG nova.virt.hardware [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.429 2 DEBUG nova.virt.hardware [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.429 2 DEBUG nova.virt.hardware [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.430 2 DEBUG nova.virt.hardware [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.430 2 DEBUG nova.virt.hardware [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.430 2 DEBUG nova.virt.hardware [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.431 2 DEBUG nova.virt.hardware [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.431 2 DEBUG nova.virt.hardware [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.432 2 DEBUG nova.virt.hardware [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.432 2 DEBUG nova.virt.hardware [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.438 2 DEBUG oslo_concurrency.processutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.491 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.491 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.492 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.596 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.795 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.796 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.796 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.796 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:42:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:42:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2451229646' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.925 2 DEBUG oslo_concurrency.processutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.959 2 DEBUG nova.storage.rbd_utils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] rbd image c70e8f51-9397-40dd-9bbe-210e60b75364_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:42:17 np0005465988 nova_compute[236126]: 2025-10-02 12:42:17.964 2 DEBUG oslo_concurrency.processutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:42:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2539095240' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.425 2 DEBUG oslo_concurrency.processutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.428 2 DEBUG nova.virt.libvirt.vif [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:42:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-561749725',display_name='tempest-TestShelveInstance-server-561749725',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-561749725',id=157,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEhf9w8pkEKWa7V1ngOe9fjFIi8JcNaUtJyznubChlj27hHukuq0Ytpxs3mHaFViqIafdIVxRwuOXby9NJMGuDWmrvU49YApKESuv4kV9WfKPY1JgB2zj33RiXhpo9OCqg==',key_name='tempest-TestShelveInstance-1513273409',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b8f9114c7ab4b6e9fc9650d4bd08af9',ramdisk_id='',reservation_id='r-iq5s6cd5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1219039163',owner_user_name='tempest-TestShelveInstance-1219039163-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:42:10Z,user_data=None,user_id='56c6abe1bb704c8aa499677aeb9017f5',uuid=c70e8f51-9397-40dd-9bbe-210e60b75364,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "address": "fa:16:3e:e0:bd:5e", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fd83de3-58", "ovs_interfaceid": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.428 2 DEBUG nova.network.os_vif_util [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Converting VIF {"id": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "address": "fa:16:3e:e0:bd:5e", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fd83de3-58", "ovs_interfaceid": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.430 2 DEBUG nova.network.os_vif_util [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:bd:5e,bridge_name='br-int',has_traffic_filtering=True,id=9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf,network=Network(6ea0a90a-9528-4fe1-8b35-dfde9b35e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fd83de3-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.432 2 DEBUG nova.objects.instance [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lazy-loading 'pci_devices' on Instance uuid c70e8f51-9397-40dd-9bbe-210e60b75364 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.455 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  <uuid>c70e8f51-9397-40dd-9bbe-210e60b75364</uuid>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  <name>instance-0000009d</name>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestShelveInstance-server-561749725</nova:name>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:42:17</nova:creationTime>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <nova:user uuid="56c6abe1bb704c8aa499677aeb9017f5">tempest-TestShelveInstance-1219039163-project-member</nova:user>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <nova:project uuid="4b8f9114c7ab4b6e9fc9650d4bd08af9">tempest-TestShelveInstance-1219039163</nova:project>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <nova:port uuid="9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <entry name="serial">c70e8f51-9397-40dd-9bbe-210e60b75364</entry>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <entry name="uuid">c70e8f51-9397-40dd-9bbe-210e60b75364</entry>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/c70e8f51-9397-40dd-9bbe-210e60b75364_disk">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/c70e8f51-9397-40dd-9bbe-210e60b75364_disk.config">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:e0:bd:5e"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <target dev="tap9fd83de3-58"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/c70e8f51-9397-40dd-9bbe-210e60b75364/console.log" append="off"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:42:18 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:42:18 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:42:18 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:42:18 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.455 2 DEBUG nova.compute.manager [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Preparing to wait for external event network-vif-plugged-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.456 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.456 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.457 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.458 2 DEBUG nova.virt.libvirt.vif [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:42:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-561749725',display_name='tempest-TestShelveInstance-server-561749725',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-561749725',id=157,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEhf9w8pkEKWa7V1ngOe9fjFIi8JcNaUtJyznubChlj27hHukuq0Ytpxs3mHaFViqIafdIVxRwuOXby9NJMGuDWmrvU49YApKESuv4kV9WfKPY1JgB2zj33RiXhpo9OCqg==',key_name='tempest-TestShelveInstance-1513273409',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b8f9114c7ab4b6e9fc9650d4bd08af9',ramdisk_id='',reservation_id='r-iq5s6cd5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1219039163',owner_user_name='tempest-TestShelveInstance-1219039163-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:42:10Z,user_data=None,user_id='56c6abe1bb704c8aa499677aeb9017f5',uuid=c70e8f51-9397-40dd-9bbe-210e60b75364,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "address": "fa:16:3e:e0:bd:5e", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fd83de3-58", "ovs_interfaceid": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.458 2 DEBUG nova.network.os_vif_util [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Converting VIF {"id": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "address": "fa:16:3e:e0:bd:5e", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fd83de3-58", "ovs_interfaceid": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.459 2 DEBUG nova.network.os_vif_util [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:bd:5e,bridge_name='br-int',has_traffic_filtering=True,id=9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf,network=Network(6ea0a90a-9528-4fe1-8b35-dfde9b35e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fd83de3-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.460 2 DEBUG os_vif [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:bd:5e,bridge_name='br-int',has_traffic_filtering=True,id=9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf,network=Network(6ea0a90a-9528-4fe1-8b35-dfde9b35e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fd83de3-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.462 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.462 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.468 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9fd83de3-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.468 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9fd83de3-58, col_values=(('external_ids', {'iface-id': '9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:bd:5e', 'vm-uuid': 'c70e8f51-9397-40dd-9bbe-210e60b75364'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:18 np0005465988 NetworkManager[45041]: <info>  [1759408938.4714] manager: (tap9fd83de3-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.477 2 INFO os_vif [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:bd:5e,bridge_name='br-int',has_traffic_filtering=True,id=9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf,network=Network(6ea0a90a-9528-4fe1-8b35-dfde9b35e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fd83de3-58')#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.602 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.603 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.603 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] No VIF found with MAC fa:16:3e:e0:bd:5e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.603 2 INFO nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Using config drive#033[00m
Oct  2 08:42:18 np0005465988 nova_compute[236126]: 2025-10-02 12:42:18.630 2 DEBUG nova.storage.rbd_utils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] rbd image c70e8f51-9397-40dd-9bbe-210e60b75364_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:42:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e343 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:42:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:18.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e344 e344: 3 total, 3 up, 3 in
Oct  2 08:42:19 np0005465988 nova_compute[236126]: 2025-10-02 12:42:19.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:19.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:42:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1546775005' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:42:20 np0005465988 nova_compute[236126]: 2025-10-02 12:42:20.199 2 INFO nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Creating config drive at /var/lib/nova/instances/c70e8f51-9397-40dd-9bbe-210e60b75364/disk.config#033[00m
Oct  2 08:42:20 np0005465988 nova_compute[236126]: 2025-10-02 12:42:20.211 2 DEBUG oslo_concurrency.processutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c70e8f51-9397-40dd-9bbe-210e60b75364/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfnarvz5o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:20 np0005465988 nova_compute[236126]: 2025-10-02 12:42:20.333 2 DEBUG nova.network.neutron [req-29e37ac5-62e4-4169-9cc2-93f8af8c9c48 req-9f707443-8809-4bf2-a0e8-a0170352a491 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Updated VIF entry in instance network info cache for port 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:42:20 np0005465988 nova_compute[236126]: 2025-10-02 12:42:20.333 2 DEBUG nova.network.neutron [req-29e37ac5-62e4-4169-9cc2-93f8af8c9c48 req-9f707443-8809-4bf2-a0e8-a0170352a491 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Updating instance_info_cache with network_info: [{"id": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "address": "fa:16:3e:e0:bd:5e", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fd83de3-58", "ovs_interfaceid": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:20 np0005465988 nova_compute[236126]: 2025-10-02 12:42:20.364 2 DEBUG oslo_concurrency.lockutils [req-29e37ac5-62e4-4169-9cc2-93f8af8c9c48 req-9f707443-8809-4bf2-a0e8-a0170352a491 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:42:20 np0005465988 nova_compute[236126]: 2025-10-02 12:42:20.365 2 DEBUG oslo_concurrency.processutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c70e8f51-9397-40dd-9bbe-210e60b75364/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfnarvz5o" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:20 np0005465988 nova_compute[236126]: 2025-10-02 12:42:20.409 2 DEBUG nova.storage.rbd_utils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] rbd image c70e8f51-9397-40dd-9bbe-210e60b75364_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:42:20 np0005465988 nova_compute[236126]: 2025-10-02 12:42:20.414 2 DEBUG oslo_concurrency.processutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c70e8f51-9397-40dd-9bbe-210e60b75364/disk.config c70e8f51-9397-40dd-9bbe-210e60b75364_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:20 np0005465988 podman[306745]: 2025-10-02 12:42:20.532944331 +0000 UTC m=+0.060706797 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:42:20 np0005465988 nova_compute[236126]: 2025-10-02 12:42:20.633 2 DEBUG oslo_concurrency.processutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c70e8f51-9397-40dd-9bbe-210e60b75364/disk.config c70e8f51-9397-40dd-9bbe-210e60b75364_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.219s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:20 np0005465988 nova_compute[236126]: 2025-10-02 12:42:20.635 2 INFO nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Deleting local config drive /var/lib/nova/instances/c70e8f51-9397-40dd-9bbe-210e60b75364/disk.config because it was imported into RBD.#033[00m
Oct  2 08:42:20 np0005465988 kernel: tap9fd83de3-58: entered promiscuous mode
Oct  2 08:42:20 np0005465988 NetworkManager[45041]: <info>  [1759408940.7062] manager: (tap9fd83de3-58): new Tun device (/org/freedesktop/NetworkManager/Devices/313)
Oct  2 08:42:20 np0005465988 nova_compute[236126]: 2025-10-02 12:42:20.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:42:20Z|00692|binding|INFO|Claiming lport 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf for this chassis.
Oct  2 08:42:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:42:20Z|00693|binding|INFO|9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf: Claiming fa:16:3e:e0:bd:5e 10.100.0.7
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.714 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:bd:5e 10.100.0.7'], port_security=['fa:16:3e:e0:bd:5e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c70e8f51-9397-40dd-9bbe-210e60b75364', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8f9114c7ab4b6e9fc9650d4bd08af9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'aaa1af3a-3e07-4a02-982d-cee91699f079', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04a89c39-8141-4654-8368-c858180215b3, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.716 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf in datapath 6ea0a90a-9528-4fe1-8b35-dfde9b35e85f bound to our chassis#033[00m
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.719 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6ea0a90a-9528-4fe1-8b35-dfde9b35e85f#033[00m
Oct  2 08:42:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:42:20Z|00694|binding|INFO|Setting lport 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf ovn-installed in OVS
Oct  2 08:42:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:42:20Z|00695|binding|INFO|Setting lport 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf up in Southbound
Oct  2 08:42:20 np0005465988 nova_compute[236126]: 2025-10-02 12:42:20.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.742 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[488efa6c-5a47-4890-8855-13c16b23c277]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.743 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6ea0a90a-91 in ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.746 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6ea0a90a-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.746 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ad30b2c8-075c-4d8d-9a00-778ea1665fb1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.747 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[295efb8f-1955-4f27-b89e-4ff282fbc750]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:20 np0005465988 nova_compute[236126]: 2025-10-02 12:42:20.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:20.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.768 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[536be70c-9ed1-477a-94cc-74c28e5a6cb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:20 np0005465988 systemd-machined[192594]: New machine qemu-73-instance-0000009d.
Oct  2 08:42:20 np0005465988 systemd-udevd[306794]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:42:20 np0005465988 NetworkManager[45041]: <info>  [1759408940.7915] device (tap9fd83de3-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:42:20 np0005465988 systemd[1]: Started Virtual Machine qemu-73-instance-0000009d.
Oct  2 08:42:20 np0005465988 NetworkManager[45041]: <info>  [1759408940.7929] device (tap9fd83de3-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.792 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[883fc48b-9bf3-4c5d-b537-f0fd8c091088]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.823 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ad75af-082f-4ed6-a321-3998168f288d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.829 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[36131e38-672e-43f9-a73b-e3a8b6d6fef1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:20 np0005465988 NetworkManager[45041]: <info>  [1759408940.8309] manager: (tap6ea0a90a-90): new Veth device (/org/freedesktop/NetworkManager/Devices/314)
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.870 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[1b6af861-3741-43c1-99f8-4696a983a7e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.875 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a97853-6667-4602-a225-0f9e9f25a1c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:20 np0005465988 NetworkManager[45041]: <info>  [1759408940.8966] device (tap6ea0a90a-90): carrier: link connected
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.907 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e9817e-94aa-409d-8298-46bcbab11038]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.924 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[63fcc231-ae7b-4349-a9ec-1dc63073bf61]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ea0a90a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:67:92:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 700122, 'reachable_time': 34216, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 306825, 'error': None, 'target': 'ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.939 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[86ecb66d-8c27-416c-8f92-18528395859a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe67:9244'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 700122, 'tstamp': 700122}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 306826, 'error': None, 'target': 'ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.956 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[48353eeb-72f5-4c19-84ae-7f6cd73f9c07]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ea0a90a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:67:92:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 700122, 'reachable_time': 34216, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 306827, 'error': None, 'target': 'ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:20.994 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[485110dc-2da4-4eac-bd97-300f67712418]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:21.066 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[46631dce-1a76-4a1a-b320-ae260af0bc31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:21.068 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ea0a90a-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:21.069 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:21.070 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ea0a90a-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:21 np0005465988 nova_compute[236126]: 2025-10-02 12:42:21.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:21 np0005465988 NetworkManager[45041]: <info>  [1759408941.0728] manager: (tap6ea0a90a-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Oct  2 08:42:21 np0005465988 kernel: tap6ea0a90a-90: entered promiscuous mode
Oct  2 08:42:21 np0005465988 nova_compute[236126]: 2025-10-02 12:42:21.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:21.076 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6ea0a90a-90, col_values=(('external_ids', {'iface-id': '3850aa59-d3b6-4277-b937-ad9f4b8f7b4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:21 np0005465988 nova_compute[236126]: 2025-10-02 12:42:21.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:21 np0005465988 ovn_controller[132601]: 2025-10-02T12:42:21Z|00696|binding|INFO|Releasing lport 3850aa59-d3b6-4277-b937-ad9f4b8f7b4c from this chassis (sb_readonly=0)
Oct  2 08:42:21 np0005465988 nova_compute[236126]: 2025-10-02 12:42:21.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:21.106 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6ea0a90a-9528-4fe1-8b35-dfde9b35e85f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6ea0a90a-9528-4fe1-8b35-dfde9b35e85f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:21.107 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[45d44287-45df-4b6a-a5da-b6c66a821c26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:21.108 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/6ea0a90a-9528-4fe1-8b35-dfde9b35e85f.pid.haproxy
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 6ea0a90a-9528-4fe1-8b35-dfde9b35e85f
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:42:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:21.109 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'env', 'PROCESS_TAG=haproxy-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6ea0a90a-9528-4fe1-8b35-dfde9b35e85f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:42:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:42:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:21.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:42:21 np0005465988 podman[306860]: 2025-10-02 12:42:21.524934304 +0000 UTC m=+0.074661243 container create c92eb12ac27addb211d7b40420e04eaca3ec9b8db5fa3f562e667eb2030c4540 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:42:21 np0005465988 podman[306860]: 2025-10-02 12:42:21.476206519 +0000 UTC m=+0.025933468 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:42:21 np0005465988 systemd[1]: Started libpod-conmon-c92eb12ac27addb211d7b40420e04eaca3ec9b8db5fa3f562e667eb2030c4540.scope.
Oct  2 08:42:21 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:42:21 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a107f5f6f9bd34c760d2ab1265480dc12d24ff4f1cab5e6abf038f1f661162b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:42:21 np0005465988 podman[306860]: 2025-10-02 12:42:21.600986726 +0000 UTC m=+0.150713685 container init c92eb12ac27addb211d7b40420e04eaca3ec9b8db5fa3f562e667eb2030c4540 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:42:21 np0005465988 nova_compute[236126]: 2025-10-02 12:42:21.609 2 DEBUG nova.compute.manager [req-acd3ca72-5f99-44ef-af76-4983e4170424 req-b6f18306-e474-44ca-b1a5-3639acd31989 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Received event network-vif-plugged-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:21 np0005465988 nova_compute[236126]: 2025-10-02 12:42:21.610 2 DEBUG oslo_concurrency.lockutils [req-acd3ca72-5f99-44ef-af76-4983e4170424 req-b6f18306-e474-44ca-b1a5-3639acd31989 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:21 np0005465988 nova_compute[236126]: 2025-10-02 12:42:21.610 2 DEBUG oslo_concurrency.lockutils [req-acd3ca72-5f99-44ef-af76-4983e4170424 req-b6f18306-e474-44ca-b1a5-3639acd31989 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:21 np0005465988 nova_compute[236126]: 2025-10-02 12:42:21.610 2 DEBUG oslo_concurrency.lockutils [req-acd3ca72-5f99-44ef-af76-4983e4170424 req-b6f18306-e474-44ca-b1a5-3639acd31989 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:21 np0005465988 nova_compute[236126]: 2025-10-02 12:42:21.610 2 DEBUG nova.compute.manager [req-acd3ca72-5f99-44ef-af76-4983e4170424 req-b6f18306-e474-44ca-b1a5-3639acd31989 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Processing event network-vif-plugged-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:42:21 np0005465988 podman[306860]: 2025-10-02 12:42:21.612333338 +0000 UTC m=+0.162060267 container start c92eb12ac27addb211d7b40420e04eaca3ec9b8db5fa3f562e667eb2030c4540 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:42:21 np0005465988 neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f[306893]: [NOTICE]   (306912) : New worker (306917) forked
Oct  2 08:42:21 np0005465988 neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f[306893]: [NOTICE]   (306912) : Loading success.
Oct  2 08:42:22 np0005465988 ovn_controller[132601]: 2025-10-02T12:42:22Z|00697|memory|INFO|peak resident set size grew 50% in last 3522.6 seconds, from 16000 kB to 24068 kB
Oct  2 08:42:22 np0005465988 ovn_controller[132601]: 2025-10-02T12:42:22Z|00698|memory|INFO|idl-cells-OVN_Southbound:11602 idl-cells-Open_vSwitch:1041 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:384 lflow-cache-entries-cache-matches:297 lflow-cache-size-KB:1536 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:701 ofctrl_installed_flow_usage-KB:513 ofctrl_sb_flow_ref_usage-KB:263
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.528 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updating instance_info_cache with network_info: [{"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.558 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.559 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.617 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408942.6163197, c70e8f51-9397-40dd-9bbe-210e60b75364 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.617 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] VM Started (Lifecycle Event)#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.621 2 DEBUG nova.compute.manager [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.626 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.631 2 INFO nova.virt.libvirt.driver [-] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Instance spawned successfully.#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.631 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.652 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.658 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.664 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.665 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.665 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.665 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.666 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.666 2 DEBUG nova.virt.libvirt.driver [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.694 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.695 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408942.6164489, c70e8f51-9397-40dd-9bbe-210e60b75364 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.695 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.717 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.720 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759408942.626377, c70e8f51-9397-40dd-9bbe-210e60b75364 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.720 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.726 2 INFO nova.compute.manager [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Took 12.12 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.726 2 DEBUG nova.compute.manager [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.758 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:42:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:22.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.763 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.803 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.831 2 INFO nova.compute.manager [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Took 13.33 seconds to build instance.#033[00m
Oct  2 08:42:22 np0005465988 nova_compute[236126]: 2025-10-02 12:42:22.862 2 DEBUG oslo_concurrency.lockutils [None req-53741aec-df49-4752-bca0-4dd8faf978cf 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "c70e8f51-9397-40dd-9bbe-210e60b75364" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:23.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:23 np0005465988 nova_compute[236126]: 2025-10-02 12:42:23.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:42:23 np0005465988 nova_compute[236126]: 2025-10-02 12:42:23.695 2 DEBUG nova.compute.manager [req-511e701c-69ae-4363-ba4b-bd443d2adff3 req-cdea602c-4458-4607-9923-58c16da6af0e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Received event network-vif-plugged-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:23 np0005465988 nova_compute[236126]: 2025-10-02 12:42:23.695 2 DEBUG oslo_concurrency.lockutils [req-511e701c-69ae-4363-ba4b-bd443d2adff3 req-cdea602c-4458-4607-9923-58c16da6af0e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:23 np0005465988 nova_compute[236126]: 2025-10-02 12:42:23.696 2 DEBUG oslo_concurrency.lockutils [req-511e701c-69ae-4363-ba4b-bd443d2adff3 req-cdea602c-4458-4607-9923-58c16da6af0e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:23 np0005465988 nova_compute[236126]: 2025-10-02 12:42:23.696 2 DEBUG oslo_concurrency.lockutils [req-511e701c-69ae-4363-ba4b-bd443d2adff3 req-cdea602c-4458-4607-9923-58c16da6af0e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:23 np0005465988 nova_compute[236126]: 2025-10-02 12:42:23.696 2 DEBUG nova.compute.manager [req-511e701c-69ae-4363-ba4b-bd443d2adff3 req-cdea602c-4458-4607-9923-58c16da6af0e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] No waiting events found dispatching network-vif-plugged-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:42:23 np0005465988 nova_compute[236126]: 2025-10-02 12:42:23.696 2 WARNING nova.compute.manager [req-511e701c-69ae-4363-ba4b-bd443d2adff3 req-cdea602c-4458-4607-9923-58c16da6af0e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Received unexpected event network-vif-plugged-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf for instance with vm_state active and task_state None.#033[00m
Oct  2 08:42:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:42:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3383345508' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:42:24 np0005465988 nova_compute[236126]: 2025-10-02 12:42:24.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:24.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:25.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:26.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:27.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:27.381 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:27.383 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:27.384 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:28 np0005465988 nova_compute[236126]: 2025-10-02 12:42:28.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:42:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e345 e345: 3 total, 3 up, 3 in
Oct  2 08:42:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:42:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:28.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:42:29 np0005465988 nova_compute[236126]: 2025-10-02 12:42:29.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:29.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:29 np0005465988 nova_compute[236126]: 2025-10-02 12:42:29.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:29.598 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:42:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:29.599 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:42:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:29.600 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:29 np0005465988 nova_compute[236126]: 2025-10-02 12:42:29.834 2 DEBUG nova.compute.manager [req-6c0d9d59-1207-466d-85a5-3888b5bc26a3 req-59a5e26a-b0ab-4f95-9b15-d42d56bf6853 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Received event network-changed-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:29 np0005465988 nova_compute[236126]: 2025-10-02 12:42:29.834 2 DEBUG nova.compute.manager [req-6c0d9d59-1207-466d-85a5-3888b5bc26a3 req-59a5e26a-b0ab-4f95-9b15-d42d56bf6853 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Refreshing instance network info cache due to event network-changed-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:42:29 np0005465988 nova_compute[236126]: 2025-10-02 12:42:29.835 2 DEBUG oslo_concurrency.lockutils [req-6c0d9d59-1207-466d-85a5-3888b5bc26a3 req-59a5e26a-b0ab-4f95-9b15-d42d56bf6853 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:42:29 np0005465988 nova_compute[236126]: 2025-10-02 12:42:29.835 2 DEBUG oslo_concurrency.lockutils [req-6c0d9d59-1207-466d-85a5-3888b5bc26a3 req-59a5e26a-b0ab-4f95-9b15-d42d56bf6853 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:42:29 np0005465988 nova_compute[236126]: 2025-10-02 12:42:29.835 2 DEBUG nova.network.neutron [req-6c0d9d59-1207-466d-85a5-3888b5bc26a3 req-59a5e26a-b0ab-4f95-9b15-d42d56bf6853 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Refreshing network info cache for port 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:42:30 np0005465988 nova_compute[236126]: 2025-10-02 12:42:30.537 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:30.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:42:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:31.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:42:31 np0005465988 nova_compute[236126]: 2025-10-02 12:42:31.668 2 DEBUG nova.network.neutron [req-6c0d9d59-1207-466d-85a5-3888b5bc26a3 req-59a5e26a-b0ab-4f95-9b15-d42d56bf6853 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Updated VIF entry in instance network info cache for port 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:42:31 np0005465988 nova_compute[236126]: 2025-10-02 12:42:31.669 2 DEBUG nova.network.neutron [req-6c0d9d59-1207-466d-85a5-3888b5bc26a3 req-59a5e26a-b0ab-4f95-9b15-d42d56bf6853 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Updating instance_info_cache with network_info: [{"id": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "address": "fa:16:3e:e0:bd:5e", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fd83de3-58", "ovs_interfaceid": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:31 np0005465988 nova_compute[236126]: 2025-10-02 12:42:31.690 2 DEBUG oslo_concurrency.lockutils [req-6c0d9d59-1207-466d-85a5-3888b5bc26a3 req-59a5e26a-b0ab-4f95-9b15-d42d56bf6853 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:42:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:32.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:42:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:33.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:42:33 np0005465988 nova_compute[236126]: 2025-10-02 12:42:33.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e345 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:42:34 np0005465988 nova_compute[236126]: 2025-10-02 12:42:34.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:42:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:34.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:42:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:35.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:35 np0005465988 podman[306991]: 2025-10-02 12:42:35.573885876 +0000 UTC m=+0.092322425 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:42:35 np0005465988 podman[306990]: 2025-10-02 12:42:35.582215303 +0000 UTC m=+0.111532371 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  2 08:42:35 np0005465988 podman[306989]: 2025-10-02 12:42:35.593226776 +0000 UTC m=+0.124637354 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  2 08:42:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:36.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:37.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:38 np0005465988 nova_compute[236126]: 2025-10-02 12:42:38.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:42:38Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e0:bd:5e 10.100.0.7
Oct  2 08:42:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:42:38Z|00093|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e0:bd:5e 10.100.0.7
Oct  2 08:42:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e345 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:42:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:38.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:39 np0005465988 nova_compute[236126]: 2025-10-02 12:42:39.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:42:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:39.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:42:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:42:40 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1825861267' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:42:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:42:40 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1825861267' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:42:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:42:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:40.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:42:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:42:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:41.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:42:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:42.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:43.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:43 np0005465988 nova_compute[236126]: 2025-10-02 12:42:43.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e345 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:42:44 np0005465988 nova_compute[236126]: 2025-10-02 12:42:44.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:42:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:44.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:42:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:42:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:45.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:42:46 np0005465988 nova_compute[236126]: 2025-10-02 12:42:46.376 2 DEBUG oslo_concurrency.lockutils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "c70e8f51-9397-40dd-9bbe-210e60b75364" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:46 np0005465988 nova_compute[236126]: 2025-10-02 12:42:46.377 2 DEBUG oslo_concurrency.lockutils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "c70e8f51-9397-40dd-9bbe-210e60b75364" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:46 np0005465988 nova_compute[236126]: 2025-10-02 12:42:46.377 2 INFO nova.compute.manager [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Shelving#033[00m
Oct  2 08:42:46 np0005465988 nova_compute[236126]: 2025-10-02 12:42:46.400 2 DEBUG nova.virt.libvirt.driver [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:42:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:46.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:47.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:48 np0005465988 nova_compute[236126]: 2025-10-02 12:42:48.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e345 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:42:48 np0005465988 kernel: tap9fd83de3-58 (unregistering): left promiscuous mode
Oct  2 08:42:48 np0005465988 NetworkManager[45041]: <info>  [1759408968.7283] device (tap9fd83de3-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:42:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:42:48Z|00699|binding|INFO|Releasing lport 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf from this chassis (sb_readonly=0)
Oct  2 08:42:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:42:48Z|00700|binding|INFO|Setting lport 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf down in Southbound
Oct  2 08:42:48 np0005465988 nova_compute[236126]: 2025-10-02 12:42:48.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:42:48Z|00701|binding|INFO|Removing iface tap9fd83de3-58 ovn-installed in OVS
Oct  2 08:42:48 np0005465988 nova_compute[236126]: 2025-10-02 12:42:48.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:48 np0005465988 nova_compute[236126]: 2025-10-02 12:42:48.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:42:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:48.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:42:48 np0005465988 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000009d.scope: Deactivated successfully.
Oct  2 08:42:48 np0005465988 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000009d.scope: Consumed 15.471s CPU time.
Oct  2 08:42:48 np0005465988 systemd-machined[192594]: Machine qemu-73-instance-0000009d terminated.
Oct  2 08:42:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:48.852 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:bd:5e 10.100.0.7'], port_security=['fa:16:3e:e0:bd:5e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c70e8f51-9397-40dd-9bbe-210e60b75364', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8f9114c7ab4b6e9fc9650d4bd08af9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aaa1af3a-3e07-4a02-982d-cee91699f079', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04a89c39-8141-4654-8368-c858180215b3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:42:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:48.855 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf in datapath 6ea0a90a-9528-4fe1-8b35-dfde9b35e85f unbound from our chassis#033[00m
Oct  2 08:42:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:48.860 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6ea0a90a-9528-4fe1-8b35-dfde9b35e85f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:42:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:48.861 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[780a8d39-ebf8-43ca-b44a-9edfe0d95400]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:48.862 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f namespace which is not needed anymore#033[00m
Oct  2 08:42:48 np0005465988 nova_compute[236126]: 2025-10-02 12:42:48.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:48 np0005465988 nova_compute[236126]: 2025-10-02 12:42:48.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:49 np0005465988 neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f[306893]: [NOTICE]   (306912) : haproxy version is 2.8.14-c23fe91
Oct  2 08:42:49 np0005465988 neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f[306893]: [NOTICE]   (306912) : path to executable is /usr/sbin/haproxy
Oct  2 08:42:49 np0005465988 neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f[306893]: [WARNING]  (306912) : Exiting Master process...
Oct  2 08:42:49 np0005465988 neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f[306893]: [ALERT]    (306912) : Current worker (306917) exited with code 143 (Terminated)
Oct  2 08:42:49 np0005465988 neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f[306893]: [WARNING]  (306912) : All workers exited. Exiting... (0)
Oct  2 08:42:49 np0005465988 systemd[1]: libpod-c92eb12ac27addb211d7b40420e04eaca3ec9b8db5fa3f562e667eb2030c4540.scope: Deactivated successfully.
Oct  2 08:42:49 np0005465988 podman[307080]: 2025-10-02 12:42:49.095721218 +0000 UTC m=+0.126163816 container died c92eb12ac27addb211d7b40420e04eaca3ec9b8db5fa3f562e667eb2030c4540 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:42:49 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c92eb12ac27addb211d7b40420e04eaca3ec9b8db5fa3f562e667eb2030c4540-userdata-shm.mount: Deactivated successfully.
Oct  2 08:42:49 np0005465988 systemd[1]: var-lib-containers-storage-overlay-7a107f5f6f9bd34c760d2ab1265480dc12d24ff4f1cab5e6abf038f1f661162b-merged.mount: Deactivated successfully.
Oct  2 08:42:49 np0005465988 podman[307080]: 2025-10-02 12:42:49.154294083 +0000 UTC m=+0.184736681 container cleanup c92eb12ac27addb211d7b40420e04eaca3ec9b8db5fa3f562e667eb2030c4540 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:42:49 np0005465988 systemd[1]: libpod-conmon-c92eb12ac27addb211d7b40420e04eaca3ec9b8db5fa3f562e667eb2030c4540.scope: Deactivated successfully.
Oct  2 08:42:49 np0005465988 podman[307122]: 2025-10-02 12:42:49.316421191 +0000 UTC m=+0.136684866 container remove c92eb12ac27addb211d7b40420e04eaca3ec9b8db5fa3f562e667eb2030c4540 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:42:49 np0005465988 nova_compute[236126]: 2025-10-02 12:42:49.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:49.341 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f4f6a251-b638-4c71-aa39-61d8e7dd1c91]: (4, ('Thu Oct  2 12:42:48 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f (c92eb12ac27addb211d7b40420e04eaca3ec9b8db5fa3f562e667eb2030c4540)\nc92eb12ac27addb211d7b40420e04eaca3ec9b8db5fa3f562e667eb2030c4540\nThu Oct  2 12:42:49 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f (c92eb12ac27addb211d7b40420e04eaca3ec9b8db5fa3f562e667eb2030c4540)\nc92eb12ac27addb211d7b40420e04eaca3ec9b8db5fa3f562e667eb2030c4540\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:49.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:49.343 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7b63e67d-30ba-45fd-a5e6-7d3e5cfdf5fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:49.344 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ea0a90a-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:49 np0005465988 nova_compute[236126]: 2025-10-02 12:42:49.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:49 np0005465988 kernel: tap6ea0a90a-90: left promiscuous mode
Oct  2 08:42:49 np0005465988 nova_compute[236126]: 2025-10-02 12:42:49.350 2 DEBUG oslo_concurrency.lockutils [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:49 np0005465988 nova_compute[236126]: 2025-10-02 12:42:49.351 2 DEBUG oslo_concurrency.lockutils [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:49 np0005465988 nova_compute[236126]: 2025-10-02 12:42:49.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:49 np0005465988 nova_compute[236126]: 2025-10-02 12:42:49.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:49.374 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[71e0c2c8-7900-4053-a248-ab346047a10a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:49.403 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b051ce95-7ac7-4cba-acfa-761d8d8b4516]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:49.404 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7ae9b709-325f-4eff-8389-16b390674594]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:49 np0005465988 nova_compute[236126]: 2025-10-02 12:42:49.422 2 INFO nova.virt.libvirt.driver [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:42:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:49.423 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c44b95f8-43fe-49ff-a673-9dbc5a8a4282]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 700114, 'reachable_time': 17128, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 307141, 'error': None, 'target': 'ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:49 np0005465988 systemd[1]: run-netns-ovnmeta\x2d6ea0a90a\x2d9528\x2d4fe1\x2d8b35\x2ddfde9b35e85f.mount: Deactivated successfully.
Oct  2 08:42:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:49.427 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:42:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:42:49.428 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[13b2559f-d0a5-4de7-ab02-a8d2a37b1e29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:49 np0005465988 nova_compute[236126]: 2025-10-02 12:42:49.431 2 INFO nova.virt.libvirt.driver [-] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Instance destroyed successfully.#033[00m
Oct  2 08:42:49 np0005465988 nova_compute[236126]: 2025-10-02 12:42:49.431 2 DEBUG nova.objects.instance [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lazy-loading 'numa_topology' on Instance uuid c70e8f51-9397-40dd-9bbe-210e60b75364 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:42:49 np0005465988 nova_compute[236126]: 2025-10-02 12:42:49.523 2 DEBUG nova.objects.instance [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'flavor' on Instance uuid 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:42:49 np0005465988 nova_compute[236126]: 2025-10-02 12:42:49.773 2 INFO nova.virt.libvirt.driver [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Beginning cold snapshot process#033[00m
Oct  2 08:42:49 np0005465988 nova_compute[236126]: 2025-10-02 12:42:49.953 2 DEBUG oslo_concurrency.lockutils [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:50 np0005465988 nova_compute[236126]: 2025-10-02 12:42:50.588 2 DEBUG nova.virt.libvirt.imagebackend [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] No parent info for c2d0c2bc-fe21-4689-86ae-d6728c15874c; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Oct  2 08:42:50 np0005465988 nova_compute[236126]: 2025-10-02 12:42:50.731 2 DEBUG nova.compute.manager [req-a6395a38-5e35-45a4-88b5-2e1cd1fef4e3 req-fa153ff6-aaa0-4e7a-9983-1c32ba43f1a5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Received event network-vif-unplugged-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:50 np0005465988 nova_compute[236126]: 2025-10-02 12:42:50.731 2 DEBUG oslo_concurrency.lockutils [req-a6395a38-5e35-45a4-88b5-2e1cd1fef4e3 req-fa153ff6-aaa0-4e7a-9983-1c32ba43f1a5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:50 np0005465988 nova_compute[236126]: 2025-10-02 12:42:50.732 2 DEBUG oslo_concurrency.lockutils [req-a6395a38-5e35-45a4-88b5-2e1cd1fef4e3 req-fa153ff6-aaa0-4e7a-9983-1c32ba43f1a5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:50 np0005465988 nova_compute[236126]: 2025-10-02 12:42:50.732 2 DEBUG oslo_concurrency.lockutils [req-a6395a38-5e35-45a4-88b5-2e1cd1fef4e3 req-fa153ff6-aaa0-4e7a-9983-1c32ba43f1a5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:50 np0005465988 nova_compute[236126]: 2025-10-02 12:42:50.732 2 DEBUG nova.compute.manager [req-a6395a38-5e35-45a4-88b5-2e1cd1fef4e3 req-fa153ff6-aaa0-4e7a-9983-1c32ba43f1a5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] No waiting events found dispatching network-vif-unplugged-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:42:50 np0005465988 nova_compute[236126]: 2025-10-02 12:42:50.732 2 WARNING nova.compute.manager [req-a6395a38-5e35-45a4-88b5-2e1cd1fef4e3 req-fa153ff6-aaa0-4e7a-9983-1c32ba43f1a5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Received unexpected event network-vif-unplugged-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:42:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:42:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:50.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:42:50 np0005465988 nova_compute[236126]: 2025-10-02 12:42:50.841 2 DEBUG nova.storage.rbd_utils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] creating snapshot(da447840b28447d68a111e265367804e) on rbd image(c70e8f51-9397-40dd-9bbe-210e60b75364_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.300 2 DEBUG oslo_concurrency.lockutils [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.301 2 DEBUG oslo_concurrency.lockutils [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.301 2 INFO nova.compute.manager [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Attaching volume 419667a7-d107-42dc-8a35-e16461ced816 to /dev/vdb#033[00m
Oct  2 08:42:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:51.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.468 2 DEBUG os_brick.utils [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.470 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.490 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.490 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[7c162dc2-4492-4004-a778-7c54badefa4a]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.492 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.503 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.503 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[0230433e-aef3-4a4f-b824-51e019ee90db]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.506 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e346 e346: 3 total, 3 up, 3 in
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.520 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.520 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[557820ec-b5b2-4d4c-a66a-695ed5c5c5b0]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.523 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[fc118b03-ea9e-42c0-96ea-ddc8de597306]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.524 2 DEBUG oslo_concurrency.processutils [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:51 np0005465988 podman[307195]: 2025-10-02 12:42:51.566126079 +0000 UTC m=+0.092164900 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.575 2 DEBUG oslo_concurrency.processutils [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "nvme version" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.577 2 DEBUG os_brick.initiator.connectors.lightos [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.577 2 DEBUG os_brick.initiator.connectors.lightos [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.578 2 DEBUG os_brick.initiator.connectors.lightos [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.578 2 DEBUG os_brick.utils [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] <== get_connector_properties: return (108ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:42:51 np0005465988 nova_compute[236126]: 2025-10-02 12:42:51.578 2 DEBUG nova.virt.block_device [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updating existing volume attachment record: a404e328-e693-4e3b-82e3-065059927c8f _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:42:52 np0005465988 nova_compute[236126]: 2025-10-02 12:42:52.237 2 DEBUG nova.storage.rbd_utils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] cloning vms/c70e8f51-9397-40dd-9bbe-210e60b75364_disk@da447840b28447d68a111e265367804e to images/b6b477f5-f929-4c8c-a864-0bf66fee68a2 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:42:52 np0005465988 nova_compute[236126]: 2025-10-02 12:42:52.458 2 DEBUG nova.storage.rbd_utils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] flattening images/b6b477f5-f929-4c8c-a864-0bf66fee68a2 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:42:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:42:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:52.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:42:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:42:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:53.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:42:53 np0005465988 nova_compute[236126]: 2025-10-02 12:42:53.514 2 DEBUG nova.compute.manager [req-42dfdfeb-1c9e-4683-89ad-878696b687ae req-45136f3d-c3f8-49d7-a85a-b225c3bd2e64 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Received event network-vif-plugged-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:53 np0005465988 nova_compute[236126]: 2025-10-02 12:42:53.515 2 DEBUG oslo_concurrency.lockutils [req-42dfdfeb-1c9e-4683-89ad-878696b687ae req-45136f3d-c3f8-49d7-a85a-b225c3bd2e64 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:53 np0005465988 nova_compute[236126]: 2025-10-02 12:42:53.516 2 DEBUG oslo_concurrency.lockutils [req-42dfdfeb-1c9e-4683-89ad-878696b687ae req-45136f3d-c3f8-49d7-a85a-b225c3bd2e64 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:53 np0005465988 nova_compute[236126]: 2025-10-02 12:42:53.516 2 DEBUG oslo_concurrency.lockutils [req-42dfdfeb-1c9e-4683-89ad-878696b687ae req-45136f3d-c3f8-49d7-a85a-b225c3bd2e64 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c70e8f51-9397-40dd-9bbe-210e60b75364-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:53 np0005465988 nova_compute[236126]: 2025-10-02 12:42:53.517 2 DEBUG nova.compute.manager [req-42dfdfeb-1c9e-4683-89ad-878696b687ae req-45136f3d-c3f8-49d7-a85a-b225c3bd2e64 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] No waiting events found dispatching network-vif-plugged-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:42:53 np0005465988 nova_compute[236126]: 2025-10-02 12:42:53.517 2 WARNING nova.compute.manager [req-42dfdfeb-1c9e-4683-89ad-878696b687ae req-45136f3d-c3f8-49d7-a85a-b225c3bd2e64 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Received unexpected event network-vif-plugged-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:42:53 np0005465988 nova_compute[236126]: 2025-10-02 12:42:53.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:53 np0005465988 nova_compute[236126]: 2025-10-02 12:42:53.520 2 DEBUG nova.objects.instance [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'flavor' on Instance uuid 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:42:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:42:54 np0005465988 nova_compute[236126]: 2025-10-02 12:42:54.246 2 DEBUG nova.virt.libvirt.driver [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Attempting to attach volume 419667a7-d107-42dc-8a35-e16461ced816 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Oct  2 08:42:54 np0005465988 nova_compute[236126]: 2025-10-02 12:42:54.254 2 DEBUG nova.virt.libvirt.guest [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] attach device xml: <disk type="network" device="disk">
Oct  2 08:42:54 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:42:54 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-419667a7-d107-42dc-8a35-e16461ced816">
Oct  2 08:42:54 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:42:54 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:42:54 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:42:54 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:42:54 np0005465988 nova_compute[236126]:  <auth username="openstack">
Oct  2 08:42:54 np0005465988 nova_compute[236126]:    <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:42:54 np0005465988 nova_compute[236126]:  </auth>
Oct  2 08:42:54 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:42:54 np0005465988 nova_compute[236126]:  <serial>419667a7-d107-42dc-8a35-e16461ced816</serial>
Oct  2 08:42:54 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:42:54 np0005465988 nova_compute[236126]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:42:54 np0005465988 nova_compute[236126]: 2025-10-02 12:42:54.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:54 np0005465988 nova_compute[236126]: 2025-10-02 12:42:54.366 2 DEBUG nova.storage.rbd_utils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] removing snapshot(da447840b28447d68a111e265367804e) on rbd image(c70e8f51-9397-40dd-9bbe-210e60b75364_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:42:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:54.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:55 np0005465988 nova_compute[236126]: 2025-10-02 12:42:55.122 2 DEBUG nova.virt.libvirt.driver [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:42:55 np0005465988 nova_compute[236126]: 2025-10-02 12:42:55.123 2 DEBUG nova.virt.libvirt.driver [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:42:55 np0005465988 nova_compute[236126]: 2025-10-02 12:42:55.123 2 DEBUG nova.virt.libvirt.driver [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:42:55 np0005465988 nova_compute[236126]: 2025-10-02 12:42:55.124 2 DEBUG nova.virt.libvirt.driver [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No VIF found with MAC fa:16:3e:34:5d:14, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:42:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:55.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:55 np0005465988 nova_compute[236126]: 2025-10-02 12:42:55.919 2 DEBUG oslo_concurrency.lockutils [None req-89628d15-8382-40e1-9632-0faf76a0024a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 4.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e347 e347: 3 total, 3 up, 3 in
Oct  2 08:42:55 np0005465988 nova_compute[236126]: 2025-10-02 12:42:55.992 2 DEBUG nova.storage.rbd_utils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] creating snapshot(snap) on rbd image(b6b477f5-f929-4c8c-a864-0bf66fee68a2) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:42:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:56.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e348 e348: 3 total, 3 up, 3 in
Oct  2 08:42:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:42:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:57.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:42:57 np0005465988 nova_compute[236126]: 2025-10-02 12:42:57.365 2 DEBUG oslo_concurrency.lockutils [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:57 np0005465988 nova_compute[236126]: 2025-10-02 12:42:57.365 2 DEBUG oslo_concurrency.lockutils [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:57 np0005465988 nova_compute[236126]: 2025-10-02 12:42:57.429 2 DEBUG nova.objects.instance [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'flavor' on Instance uuid 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:42:57 np0005465988 nova_compute[236126]: 2025-10-02 12:42:57.550 2 DEBUG oslo_concurrency.lockutils [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.093 2 DEBUG oslo_concurrency.lockutils [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.094 2 DEBUG oslo_concurrency.lockutils [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.094 2 INFO nova.compute.manager [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Attaching volume 438ad3ce-5006-4a29-8013-2d6621c8349e to /dev/vdc#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.215 2 DEBUG os_brick.utils [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.217 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.236 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.236 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[aa1b28bd-832c-4911-b448-29401ac9771e]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.240 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.252 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.252 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[c3cfaf0a-16b1-4855-a61d-9b8f7be0cafe]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.255 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.266 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.266 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[1a45ae3b-124b-4378-b94c-9baddceaee4b]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.268 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab7f229-58d8-4a71-a154-646dab978aa5]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.269 2 DEBUG oslo_concurrency.processutils [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.323 2 DEBUG oslo_concurrency.processutils [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "nvme version" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.327 2 DEBUG os_brick.initiator.connectors.lightos [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.327 2 DEBUG os_brick.initiator.connectors.lightos [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.328 2 DEBUG os_brick.initiator.connectors.lightos [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.328 2 DEBUG os_brick.utils [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] <== get_connector_properties: return (111ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.329 2 DEBUG nova.virt.block_device [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updating existing volume attachment record: 763fd409-e162-4b58-89f1-00ddc284204d _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e348 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:42:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:42:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:42:58.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.826 2 INFO nova.virt.libvirt.driver [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Snapshot image upload complete#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.828 2 DEBUG nova.compute.manager [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:42:58 np0005465988 nova_compute[236126]: 2025-10-02 12:42:58.994 2 INFO nova.compute.manager [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Shelve offloading#033[00m
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.004 2 INFO nova.virt.libvirt.driver [-] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Instance destroyed successfully.#033[00m
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.004 2 DEBUG nova.compute.manager [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.007 2 DEBUG oslo_concurrency.lockutils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.008 2 DEBUG oslo_concurrency.lockutils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquired lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.008 2 DEBUG nova.network.neutron [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.147 2 DEBUG nova.objects.instance [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'flavor' on Instance uuid 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.170 2 DEBUG nova.virt.libvirt.driver [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Attempting to attach volume 438ad3ce-5006-4a29-8013-2d6621c8349e with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.173 2 DEBUG nova.virt.libvirt.guest [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] attach device xml: <disk type="network" device="disk">
Oct  2 08:42:59 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:42:59 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-438ad3ce-5006-4a29-8013-2d6621c8349e">
Oct  2 08:42:59 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:42:59 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:42:59 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:42:59 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:42:59 np0005465988 nova_compute[236126]:  <auth username="openstack">
Oct  2 08:42:59 np0005465988 nova_compute[236126]:    <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:42:59 np0005465988 nova_compute[236126]:  </auth>
Oct  2 08:42:59 np0005465988 nova_compute[236126]:  <target dev="vdc" bus="virtio"/>
Oct  2 08:42:59 np0005465988 nova_compute[236126]:  <serial>438ad3ce-5006-4a29-8013-2d6621c8349e</serial>
Oct  2 08:42:59 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:42:59 np0005465988 nova_compute[236126]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.280 2 DEBUG nova.virt.libvirt.driver [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.280 2 DEBUG nova.virt.libvirt.driver [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.280 2 DEBUG nova.virt.libvirt.driver [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.281 2 DEBUG nova.virt.libvirt.driver [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.281 2 DEBUG nova.virt.libvirt.driver [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] No VIF found with MAC fa:16:3e:34:5d:14, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:42:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:42:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:42:59.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:42:59 np0005465988 nova_compute[236126]: 2025-10-02 12:42:59.468 2 DEBUG oslo_concurrency.lockutils [None req-4932b137-e92a-4512-ba0d-7a722384a81d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.374s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:00 np0005465988 nova_compute[236126]: 2025-10-02 12:43:00.354 2 DEBUG nova.network.neutron [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Updating instance_info_cache with network_info: [{"id": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "address": "fa:16:3e:e0:bd:5e", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fd83de3-58", "ovs_interfaceid": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:00 np0005465988 nova_compute[236126]: 2025-10-02 12:43:00.391 2 DEBUG oslo_concurrency.lockutils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Releasing lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:00 np0005465988 nova_compute[236126]: 2025-10-02 12:43:00.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:00 np0005465988 nova_compute[236126]: 2025-10-02 12:43:00.504 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:00 np0005465988 nova_compute[236126]: 2025-10-02 12:43:00.505 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:00 np0005465988 nova_compute[236126]: 2025-10-02 12:43:00.505 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:00 np0005465988 nova_compute[236126]: 2025-10-02 12:43:00.506 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:43:00 np0005465988 nova_compute[236126]: 2025-10-02 12:43:00.506 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:00.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:43:00 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2107379637' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.002 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.086 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000098 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.086 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000098 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.092 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.092 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.093 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.093 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.096 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000009d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.096 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-0000009d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.247 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.248 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3749MB free_disk=20.896564483642578GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.249 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.249 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:01.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.385 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.386 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.386 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance c70e8f51-9397-40dd-9bbe-210e60b75364 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.386 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.387 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.466 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.721 2 INFO nova.virt.libvirt.driver [-] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Instance destroyed successfully.#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.722 2 DEBUG nova.objects.instance [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lazy-loading 'resources' on Instance uuid c70e8f51-9397-40dd-9bbe-210e60b75364 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.759 2 DEBUG nova.virt.libvirt.vif [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:42:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-561749725',display_name='tempest-TestShelveInstance-server-561749725',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-561749725',id=157,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEhf9w8pkEKWa7V1ngOe9fjFIi8JcNaUtJyznubChlj27hHukuq0Ytpxs3mHaFViqIafdIVxRwuOXby9NJMGuDWmrvU49YApKESuv4kV9WfKPY1JgB2zj33RiXhpo9OCqg==',key_name='tempest-TestShelveInstance-1513273409',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:42:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='4b8f9114c7ab4b6e9fc9650d4bd08af9',ramdisk_id='',reservation_id='r-iq5s6cd5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1219039163',owner_user_name='tempest-TestShelveInstance-1219039163-project-member',shelved_at='2025-10-02T12:42:58.827915',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='b6b477f5-f929-4c8c-a864-0bf66fee68a2'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:42:50Z,user_data=None,user_id='56c6abe1bb704c8aa499677aeb9017f5',uuid=c70e8f51-9397-40dd-9bbe-210e60b75364,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "address": "fa:16:3e:e0:bd:5e", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fd83de3-58", "ovs_interfaceid": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.760 2 DEBUG nova.network.os_vif_util [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Converting VIF {"id": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "address": "fa:16:3e:e0:bd:5e", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fd83de3-58", "ovs_interfaceid": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.762 2 DEBUG nova.network.os_vif_util [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:bd:5e,bridge_name='br-int',has_traffic_filtering=True,id=9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf,network=Network(6ea0a90a-9528-4fe1-8b35-dfde9b35e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fd83de3-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.762 2 DEBUG os_vif [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:bd:5e,bridge_name='br-int',has_traffic_filtering=True,id=9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf,network=Network(6ea0a90a-9528-4fe1-8b35-dfde9b35e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fd83de3-58') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.767 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9fd83de3-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.776 2 INFO os_vif [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:bd:5e,bridge_name='br-int',has_traffic_filtering=True,id=9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf,network=Network(6ea0a90a-9528-4fe1-8b35-dfde9b35e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fd83de3-58')#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.892 2 DEBUG nova.compute.manager [req-8557e7dc-b09f-4555-b4bb-f3642d66b04d req-12a49e07-5c34-4164-9b92-ea39bca87182 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Received event network-changed-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.893 2 DEBUG nova.compute.manager [req-8557e7dc-b09f-4555-b4bb-f3642d66b04d req-12a49e07-5c34-4164-9b92-ea39bca87182 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Refreshing instance network info cache due to event network-changed-9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.893 2 DEBUG oslo_concurrency.lockutils [req-8557e7dc-b09f-4555-b4bb-f3642d66b04d req-12a49e07-5c34-4164-9b92-ea39bca87182 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.894 2 DEBUG oslo_concurrency.lockutils [req-8557e7dc-b09f-4555-b4bb-f3642d66b04d req-12a49e07-5c34-4164-9b92-ea39bca87182 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.894 2 DEBUG nova.network.neutron [req-8557e7dc-b09f-4555-b4bb-f3642d66b04d req-12a49e07-5c34-4164-9b92-ea39bca87182 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Refreshing network info cache for port 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:43:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:43:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/21696698' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.916 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:01 np0005465988 nova_compute[236126]: 2025-10-02 12:43:01.923 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:43:02 np0005465988 nova_compute[236126]: 2025-10-02 12:43:02.264 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:43:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:43:02Z|00702|binding|INFO|Releasing lport 79bf28ab-e58e-4276-adf8-279ba85b1b49 from this chassis (sb_readonly=0)
Oct  2 08:43:02 np0005465988 nova_compute[236126]: 2025-10-02 12:43:02.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:02 np0005465988 nova_compute[236126]: 2025-10-02 12:43:02.399 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:43:02 np0005465988 nova_compute[236126]: 2025-10-02 12:43:02.399 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:43:02Z|00703|binding|INFO|Releasing lport 79bf28ab-e58e-4276-adf8-279ba85b1b49 from this chassis (sb_readonly=0)
Oct  2 08:43:02 np0005465988 nova_compute[236126]: 2025-10-02 12:43:02.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:43:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:02.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:43:03 np0005465988 nova_compute[236126]: 2025-10-02 12:43:03.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:03 np0005465988 NetworkManager[45041]: <info>  [1759408983.0185] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/316)
Oct  2 08:43:03 np0005465988 NetworkManager[45041]: <info>  [1759408983.0203] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/317)
Oct  2 08:43:03 np0005465988 nova_compute[236126]: 2025-10-02 12:43:03.123 2 INFO nova.virt.libvirt.driver [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Deleting instance files /var/lib/nova/instances/c70e8f51-9397-40dd-9bbe-210e60b75364_del#033[00m
Oct  2 08:43:03 np0005465988 nova_compute[236126]: 2025-10-02 12:43:03.125 2 INFO nova.virt.libvirt.driver [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Deletion of /var/lib/nova/instances/c70e8f51-9397-40dd-9bbe-210e60b75364_del complete#033[00m
Oct  2 08:43:03 np0005465988 nova_compute[236126]: 2025-10-02 12:43:03.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:03 np0005465988 nova_compute[236126]: 2025-10-02 12:43:03.297 2 INFO nova.scheduler.client.report [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Deleted allocations for instance c70e8f51-9397-40dd-9bbe-210e60b75364#033[00m
Oct  2 08:43:03 np0005465988 ovn_controller[132601]: 2025-10-02T12:43:03Z|00704|binding|INFO|Releasing lport 79bf28ab-e58e-4276-adf8-279ba85b1b49 from this chassis (sb_readonly=0)
Oct  2 08:43:03 np0005465988 nova_compute[236126]: 2025-10-02 12:43:03.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:03.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:03 np0005465988 nova_compute[236126]: 2025-10-02 12:43:03.386 2 DEBUG oslo_concurrency.lockutils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:03 np0005465988 nova_compute[236126]: 2025-10-02 12:43:03.386 2 DEBUG oslo_concurrency.lockutils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e349 e349: 3 total, 3 up, 3 in
Oct  2 08:43:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:43:03 np0005465988 nova_compute[236126]: 2025-10-02 12:43:03.984 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408968.9834187, c70e8f51-9397-40dd-9bbe-210e60b75364 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:43:03 np0005465988 nova_compute[236126]: 2025-10-02 12:43:03.985 2 INFO nova.compute.manager [-] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:43:04 np0005465988 nova_compute[236126]: 2025-10-02 12:43:04.137 2 DEBUG nova.compute.manager [None req-1d9a36cc-c9d6-4102-8071-66e1fc78f6bc - - - - - -] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:04 np0005465988 nova_compute[236126]: 2025-10-02 12:43:04.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:04 np0005465988 nova_compute[236126]: 2025-10-02 12:43:04.550 2 DEBUG oslo_concurrency.processutils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:04.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:05 np0005465988 nova_compute[236126]: 2025-10-02 12:43:05.065 2 DEBUG nova.compute.manager [req-04a12be4-91a7-4062-842b-194315c04d57 req-2400601d-077d-4c1a-8376-c997b59615c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received event network-changed-6797a28e-4489-4337-b4be-f09d77787856 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:05 np0005465988 nova_compute[236126]: 2025-10-02 12:43:05.066 2 DEBUG nova.compute.manager [req-04a12be4-91a7-4062-842b-194315c04d57 req-2400601d-077d-4c1a-8376-c997b59615c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Refreshing instance network info cache due to event network-changed-6797a28e-4489-4337-b4be-f09d77787856. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:43:05 np0005465988 nova_compute[236126]: 2025-10-02 12:43:05.066 2 DEBUG oslo_concurrency.lockutils [req-04a12be4-91a7-4062-842b-194315c04d57 req-2400601d-077d-4c1a-8376-c997b59615c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:05 np0005465988 nova_compute[236126]: 2025-10-02 12:43:05.067 2 DEBUG oslo_concurrency.lockutils [req-04a12be4-91a7-4062-842b-194315c04d57 req-2400601d-077d-4c1a-8376-c997b59615c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:05 np0005465988 nova_compute[236126]: 2025-10-02 12:43:05.067 2 DEBUG nova.network.neutron [req-04a12be4-91a7-4062-842b-194315c04d57 req-2400601d-077d-4c1a-8376-c997b59615c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Refreshing network info cache for port 6797a28e-4489-4337-b4be-f09d77787856 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:43:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:05.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:43:05 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2004316980' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:43:05 np0005465988 nova_compute[236126]: 2025-10-02 12:43:05.491 2 DEBUG oslo_concurrency.processutils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.942s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:05 np0005465988 nova_compute[236126]: 2025-10-02 12:43:05.498 2 DEBUG nova.compute.provider_tree [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:43:05 np0005465988 nova_compute[236126]: 2025-10-02 12:43:05.527 2 DEBUG nova.scheduler.client.report [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:43:05 np0005465988 nova_compute[236126]: 2025-10-02 12:43:05.556 2 DEBUG oslo_concurrency.lockutils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:05 np0005465988 nova_compute[236126]: 2025-10-02 12:43:05.618 2 DEBUG oslo_concurrency.lockutils [None req-e6e608d4-3c21-478b-b3cc-a4c57ebf3d11 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "c70e8f51-9397-40dd-9bbe-210e60b75364" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 19.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:05 np0005465988 nova_compute[236126]: 2025-10-02 12:43:05.829 2 DEBUG nova.network.neutron [req-8557e7dc-b09f-4555-b4bb-f3642d66b04d req-12a49e07-5c34-4164-9b92-ea39bca87182 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Updated VIF entry in instance network info cache for port 9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:43:05 np0005465988 nova_compute[236126]: 2025-10-02 12:43:05.830 2 DEBUG nova.network.neutron [req-8557e7dc-b09f-4555-b4bb-f3642d66b04d req-12a49e07-5c34-4164-9b92-ea39bca87182 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c70e8f51-9397-40dd-9bbe-210e60b75364] Updating instance_info_cache with network_info: [{"id": "9fd83de3-587e-4631-a4e3-ee5ab5b1cbbf", "address": "fa:16:3e:e0:bd:5e", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": null, "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap9fd83de3-58", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:05 np0005465988 nova_compute[236126]: 2025-10-02 12:43:05.856 2 DEBUG oslo_concurrency.lockutils [req-8557e7dc-b09f-4555-b4bb-f3642d66b04d req-12a49e07-5c34-4164-9b92-ea39bca87182 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-c70e8f51-9397-40dd-9bbe-210e60b75364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:06 np0005465988 nova_compute[236126]: 2025-10-02 12:43:06.447 2 DEBUG nova.compute.manager [req-2ce7bf19-59ee-4622-bda3-e538c2be8281 req-f1c94396-20fa-47e9-a3df-49eb07653f18 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received event network-changed-6797a28e-4489-4337-b4be-f09d77787856 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:06 np0005465988 nova_compute[236126]: 2025-10-02 12:43:06.448 2 DEBUG nova.compute.manager [req-2ce7bf19-59ee-4622-bda3-e538c2be8281 req-f1c94396-20fa-47e9-a3df-49eb07653f18 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Refreshing instance network info cache due to event network-changed-6797a28e-4489-4337-b4be-f09d77787856. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:43:06 np0005465988 nova_compute[236126]: 2025-10-02 12:43:06.448 2 DEBUG oslo_concurrency.lockutils [req-2ce7bf19-59ee-4622-bda3-e538c2be8281 req-f1c94396-20fa-47e9-a3df-49eb07653f18 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:06 np0005465988 podman[307504]: 2025-10-02 12:43:06.53793885 +0000 UTC m=+0.070384872 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=iscsid)
Oct  2 08:43:06 np0005465988 podman[307505]: 2025-10-02 12:43:06.53793301 +0000 UTC m=+0.069052344 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:43:06 np0005465988 podman[307503]: 2025-10-02 12:43:06.555819338 +0000 UTC m=+0.094022263 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:43:06 np0005465988 nova_compute[236126]: 2025-10-02 12:43:06.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:06.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:06 np0005465988 nova_compute[236126]: 2025-10-02 12:43:06.952 2 DEBUG nova.network.neutron [req-04a12be4-91a7-4062-842b-194315c04d57 req-2400601d-077d-4c1a-8376-c997b59615c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updated VIF entry in instance network info cache for port 6797a28e-4489-4337-b4be-f09d77787856. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:43:06 np0005465988 nova_compute[236126]: 2025-10-02 12:43:06.953 2 DEBUG nova.network.neutron [req-04a12be4-91a7-4062-842b-194315c04d57 req-2400601d-077d-4c1a-8376-c997b59615c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updating instance_info_cache with network_info: [{"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:06 np0005465988 nova_compute[236126]: 2025-10-02 12:43:06.989 2 DEBUG oslo_concurrency.lockutils [req-04a12be4-91a7-4062-842b-194315c04d57 req-2400601d-077d-4c1a-8376-c997b59615c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:06 np0005465988 nova_compute[236126]: 2025-10-02 12:43:06.990 2 DEBUG oslo_concurrency.lockutils [req-2ce7bf19-59ee-4622-bda3-e538c2be8281 req-f1c94396-20fa-47e9-a3df-49eb07653f18 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:06 np0005465988 nova_compute[236126]: 2025-10-02 12:43:06.990 2 DEBUG nova.network.neutron [req-2ce7bf19-59ee-4622-bda3-e538c2be8281 req-f1c94396-20fa-47e9-a3df-49eb07653f18 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Refreshing network info cache for port 6797a28e-4489-4337-b4be-f09d77787856 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:43:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:07.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:07 np0005465988 nova_compute[236126]: 2025-10-02 12:43:07.399 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:08 np0005465988 nova_compute[236126]: 2025-10-02 12:43:08.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:08 np0005465988 nova_compute[236126]: 2025-10-02 12:43:08.501 2 DEBUG nova.network.neutron [req-2ce7bf19-59ee-4622-bda3-e538c2be8281 req-f1c94396-20fa-47e9-a3df-49eb07653f18 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updated VIF entry in instance network info cache for port 6797a28e-4489-4337-b4be-f09d77787856. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:43:08 np0005465988 nova_compute[236126]: 2025-10-02 12:43:08.502 2 DEBUG nova.network.neutron [req-2ce7bf19-59ee-4622-bda3-e538c2be8281 req-f1c94396-20fa-47e9-a3df-49eb07653f18 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updating instance_info_cache with network_info: [{"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:08 np0005465988 nova_compute[236126]: 2025-10-02 12:43:08.591 2 DEBUG oslo_concurrency.lockutils [req-2ce7bf19-59ee-4622-bda3-e538c2be8281 req-f1c94396-20fa-47e9-a3df-49eb07653f18 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:43:08 np0005465988 nova_compute[236126]: 2025-10-02 12:43:08.788 2 DEBUG nova.compute.manager [req-d0d8c853-5897-44db-8923-8cc788066c03 req-dbcc4824-97af-46e7-a37d-b68f1d2b860b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received event network-changed-6797a28e-4489-4337-b4be-f09d77787856 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:08 np0005465988 nova_compute[236126]: 2025-10-02 12:43:08.789 2 DEBUG nova.compute.manager [req-d0d8c853-5897-44db-8923-8cc788066c03 req-dbcc4824-97af-46e7-a37d-b68f1d2b860b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Refreshing instance network info cache due to event network-changed-6797a28e-4489-4337-b4be-f09d77787856. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:43:08 np0005465988 nova_compute[236126]: 2025-10-02 12:43:08.789 2 DEBUG oslo_concurrency.lockutils [req-d0d8c853-5897-44db-8923-8cc788066c03 req-dbcc4824-97af-46e7-a37d-b68f1d2b860b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:08 np0005465988 nova_compute[236126]: 2025-10-02 12:43:08.790 2 DEBUG oslo_concurrency.lockutils [req-d0d8c853-5897-44db-8923-8cc788066c03 req-dbcc4824-97af-46e7-a37d-b68f1d2b860b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:08 np0005465988 nova_compute[236126]: 2025-10-02 12:43:08.790 2 DEBUG nova.network.neutron [req-d0d8c853-5897-44db-8923-8cc788066c03 req-dbcc4824-97af-46e7-a37d-b68f1d2b860b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Refreshing network info cache for port 6797a28e-4489-4337-b4be-f09d77787856 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:43:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:08.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:09.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:09 np0005465988 nova_compute[236126]: 2025-10-02 12:43:09.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:10.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:10 np0005465988 nova_compute[236126]: 2025-10-02 12:43:10.889 2 DEBUG nova.network.neutron [req-d0d8c853-5897-44db-8923-8cc788066c03 req-dbcc4824-97af-46e7-a37d-b68f1d2b860b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updated VIF entry in instance network info cache for port 6797a28e-4489-4337-b4be-f09d77787856. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:43:10 np0005465988 nova_compute[236126]: 2025-10-02 12:43:10.890 2 DEBUG nova.network.neutron [req-d0d8c853-5897-44db-8923-8cc788066c03 req-dbcc4824-97af-46e7-a37d-b68f1d2b860b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updating instance_info_cache with network_info: [{"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:10 np0005465988 nova_compute[236126]: 2025-10-02 12:43:10.944 2 DEBUG oslo_concurrency.lockutils [req-d0d8c853-5897-44db-8923-8cc788066c03 req-dbcc4824-97af-46e7-a37d-b68f1d2b860b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:11.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:11 np0005465988 nova_compute[236126]: 2025-10-02 12:43:11.409 2 DEBUG nova.compute.manager [req-5224784f-1040-4188-836f-fa363126c35e req-d65f3cf4-5241-4247-b9ec-e25f756d8cd3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received event network-changed-6797a28e-4489-4337-b4be-f09d77787856 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:43:11 np0005465988 nova_compute[236126]: 2025-10-02 12:43:11.409 2 DEBUG nova.compute.manager [req-5224784f-1040-4188-836f-fa363126c35e req-d65f3cf4-5241-4247-b9ec-e25f756d8cd3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Refreshing instance network info cache due to event network-changed-6797a28e-4489-4337-b4be-f09d77787856. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:43:11 np0005465988 nova_compute[236126]: 2025-10-02 12:43:11.410 2 DEBUG oslo_concurrency.lockutils [req-5224784f-1040-4188-836f-fa363126c35e req-d65f3cf4-5241-4247-b9ec-e25f756d8cd3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:43:11 np0005465988 nova_compute[236126]: 2025-10-02 12:43:11.410 2 DEBUG oslo_concurrency.lockutils [req-5224784f-1040-4188-836f-fa363126c35e req-d65f3cf4-5241-4247-b9ec-e25f756d8cd3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:43:11 np0005465988 nova_compute[236126]: 2025-10-02 12:43:11.410 2 DEBUG nova.network.neutron [req-5224784f-1040-4188-836f-fa363126c35e req-d65f3cf4-5241-4247-b9ec-e25f756d8cd3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Refreshing network info cache for port 6797a28e-4489-4337-b4be-f09d77787856 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:43:11 np0005465988 nova_compute[236126]: 2025-10-02 12:43:11.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:43:11 np0005465988 nova_compute[236126]: 2025-10-02 12:43:11.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:43:11 np0005465988 nova_compute[236126]: 2025-10-02 12:43:11.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 08:43:11 np0005465988 nova_compute[236126]: 2025-10-02 12:43:11.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:43:12 np0005465988 nova_compute[236126]: 2025-10-02 12:43:12.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:43:12 np0005465988 nova_compute[236126]: 2025-10-02 12:43:12.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:43:12 np0005465988 nova_compute[236126]: 2025-10-02 12:43:12.722 2 DEBUG nova.network.neutron [req-5224784f-1040-4188-836f-fa363126c35e req-d65f3cf4-5241-4247-b9ec-e25f756d8cd3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updated VIF entry in instance network info cache for port 6797a28e-4489-4337-b4be-f09d77787856. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:43:12 np0005465988 nova_compute[236126]: 2025-10-02 12:43:12.723 2 DEBUG nova.network.neutron [req-5224784f-1040-4188-836f-fa363126c35e req-d65f3cf4-5241-4247-b9ec-e25f756d8cd3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updating instance_info_cache with network_info: [{"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:43:12 np0005465988 nova_compute[236126]: 2025-10-02 12:43:12.743 2 DEBUG oslo_concurrency.lockutils [req-5224784f-1040-4188-836f-fa363126c35e req-d65f3cf4-5241-4247-b9ec-e25f756d8cd3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:43:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:12.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:13.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:43:14 np0005465988 nova_compute[236126]: 2025-10-02 12:43:14.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:43:14 np0005465988 nova_compute[236126]: 2025-10-02 12:43:14.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:43:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:14.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:15.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:43:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:43:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:43:16 np0005465988 nova_compute[236126]: 2025-10-02 12:43:16.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:43:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:16.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:17.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:18 np0005465988 nova_compute[236126]: 2025-10-02 12:43:18.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:43:18 np0005465988 nova_compute[236126]: 2025-10-02 12:43:18.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 08:43:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:18.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:18 np0005465988 nova_compute[236126]: 2025-10-02 12:43:18.965 2 DEBUG oslo_concurrency.lockutils [None req-31c13c87-02a3-4b03-b6cc-dbed5bed99cf b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:43:18 np0005465988 nova_compute[236126]: 2025-10-02 12:43:18.967 2 DEBUG oslo_concurrency.lockutils [None req-31c13c87-02a3-4b03-b6cc-dbed5bed99cf b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:43:18 np0005465988 nova_compute[236126]: 2025-10-02 12:43:18.970 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:43:18 np0005465988 nova_compute[236126]: 2025-10-02 12:43:18.971 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:43:18 np0005465988 nova_compute[236126]: 2025-10-02 12:43:18.971 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct  2 08:43:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:43:19 np0005465988 nova_compute[236126]: 2025-10-02 12:43:19.010 2 INFO nova.compute.manager [None req-31c13c87-02a3-4b03-b6cc-dbed5bed99cf b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Detaching volume 419667a7-d107-42dc-8a35-e16461ced816
Oct  2 08:43:19 np0005465988 nova_compute[236126]: 2025-10-02 12:43:19.239 2 INFO nova.virt.block_device [None req-31c13c87-02a3-4b03-b6cc-dbed5bed99cf b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Attempting to driver detach volume 419667a7-d107-42dc-8a35-e16461ced816 from mountpoint /dev/vdb
Oct  2 08:43:19 np0005465988 nova_compute[236126]: 2025-10-02 12:43:19.255 2 DEBUG nova.virt.libvirt.driver [None req-31c13c87-02a3-4b03-b6cc-dbed5bed99cf b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Attempting to detach device vdb from instance 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Oct  2 08:43:19 np0005465988 nova_compute[236126]: 2025-10-02 12:43:19.256 2 DEBUG nova.virt.libvirt.guest [None req-31c13c87-02a3-4b03-b6cc-dbed5bed99cf b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:43:19 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:43:19 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-419667a7-d107-42dc-8a35-e16461ced816">
Oct  2 08:43:19 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:43:19 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:43:19 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:43:19 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:43:19 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:43:19 np0005465988 nova_compute[236126]:  <serial>419667a7-d107-42dc-8a35-e16461ced816</serial>
Oct  2 08:43:19 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Oct  2 08:43:19 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:43:19 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct  2 08:43:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:19.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:19 np0005465988 nova_compute[236126]: 2025-10-02 12:43:19.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:43:19 np0005465988 nova_compute[236126]: 2025-10-02 12:43:19.446 2 INFO nova.virt.libvirt.driver [None req-31c13c87-02a3-4b03-b6cc-dbed5bed99cf b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Successfully detached device vdb from instance 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c from the persistent domain config.
Oct  2 08:43:19 np0005465988 nova_compute[236126]: 2025-10-02 12:43:19.447 2 DEBUG nova.virt.libvirt.driver [None req-31c13c87-02a3-4b03-b6cc-dbed5bed99cf b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Oct  2 08:43:19 np0005465988 nova_compute[236126]: 2025-10-02 12:43:19.447 2 DEBUG nova.virt.libvirt.guest [None req-31c13c87-02a3-4b03-b6cc-dbed5bed99cf b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:43:19 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:43:19 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-419667a7-d107-42dc-8a35-e16461ced816">
Oct  2 08:43:19 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:43:19 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:43:19 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:43:19 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:43:19 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:43:19 np0005465988 nova_compute[236126]:  <serial>419667a7-d107-42dc-8a35-e16461ced816</serial>
Oct  2 08:43:19 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Oct  2 08:43:19 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:43:19 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct  2 08:43:20 np0005465988 nova_compute[236126]: 2025-10-02 12:43:20.472 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Received event <DeviceRemovedEvent: 1759409000.4720123, 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Oct  2 08:43:20 np0005465988 nova_compute[236126]: 2025-10-02 12:43:20.475 2 DEBUG nova.virt.libvirt.driver [None req-31c13c87-02a3-4b03-b6cc-dbed5bed99cf b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Oct  2 08:43:20 np0005465988 nova_compute[236126]: 2025-10-02 12:43:20.478 2 INFO nova.virt.libvirt.driver [None req-31c13c87-02a3-4b03-b6cc-dbed5bed99cf b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Successfully detached device vdb from instance 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c from the live domain config.
Oct  2 08:43:20 np0005465988 nova_compute[236126]: 2025-10-02 12:43:20.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:43:20 np0005465988 nova_compute[236126]: 2025-10-02 12:43:20.767 2 DEBUG nova.objects.instance [None req-31c13c87-02a3-4b03-b6cc-dbed5bed99cf b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'flavor' on Instance uuid 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:43:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:20.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:21 np0005465988 nova_compute[236126]: 2025-10-02 12:43:21.026 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updating instance_info_cache with network_info: [{"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:43:21 np0005465988 nova_compute[236126]: 2025-10-02 12:43:21.088 2 DEBUG oslo_concurrency.lockutils [None req-31c13c87-02a3-4b03-b6cc-dbed5bed99cf b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 2.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:43:21 np0005465988 nova_compute[236126]: 2025-10-02 12:43:21.261 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:43:21 np0005465988 nova_compute[236126]: 2025-10-02 12:43:21.262 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct  2 08:43:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:21.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:21 np0005465988 nova_compute[236126]: 2025-10-02 12:43:21.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:43:22 np0005465988 podman[307757]: 2025-10-02 12:43:22.563797049 +0000 UTC m=+0.083152965 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:43:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:22.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:43:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:23.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #115. Immutable memtables: 0.
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.573302) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 71] Flushing memtable with next log file: 115
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409003573453, "job": 71, "event": "flush_started", "num_memtables": 1, "num_entries": 1686, "num_deletes": 257, "total_data_size": 3584597, "memory_usage": 3627888, "flush_reason": "Manual Compaction"}
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 71] Level-0 flush table #116: started
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409003595317, "cf_name": "default", "job": 71, "event": "table_file_creation", "file_number": 116, "file_size": 2362261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 58147, "largest_seqno": 59828, "table_properties": {"data_size": 2355162, "index_size": 4042, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15631, "raw_average_key_size": 20, "raw_value_size": 2340659, "raw_average_value_size": 3043, "num_data_blocks": 177, "num_entries": 769, "num_filter_entries": 769, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759408874, "oldest_key_time": 1759408874, "file_creation_time": 1759409003, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 116, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 71] Flush lasted 22125 microseconds, and 6514 cpu microseconds.
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.595440) [db/flush_job.cc:967] [default] [JOB 71] Level-0 flush table #116: 2362261 bytes OK
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.595476) [db/memtable_list.cc:519] [default] Level-0 commit table #116 started
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.606938) [db/memtable_list.cc:722] [default] Level-0 commit table #116: memtable #1 done
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.606977) EVENT_LOG_v1 {"time_micros": 1759409003606969, "job": 71, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.606996) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 71] Try to delete WAL files size 3576834, prev total WAL file size 3576834, number of live WAL files 2.
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000112.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.608036) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032303036' seq:72057594037927935, type:22 .. '6C6F676D0032323537' seq:0, type:0; will stop at (end)
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 72] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 71 Base level 0, inputs: [116(2306KB)], [114(10MB)]
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409003608154, "job": 72, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [116], "files_L6": [114], "score": -1, "input_data_size": 12898600, "oldest_snapshot_seqno": -1}
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 72] Generated table #117: 8436 keys, 12766253 bytes, temperature: kUnknown
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409003702429, "cf_name": "default", "job": 72, "event": "table_file_creation", "file_number": 117, "file_size": 12766253, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12709504, "index_size": 34557, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21125, "raw_key_size": 218837, "raw_average_key_size": 25, "raw_value_size": 12559149, "raw_average_value_size": 1488, "num_data_blocks": 1355, "num_entries": 8436, "num_filter_entries": 8436, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759409003, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 117, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.702662) [db/compaction/compaction_job.cc:1663] [default] [JOB 72] Compacted 1@0 + 1@6 files to L6 => 12766253 bytes
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.705633) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 136.8 rd, 135.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 10.0 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(10.9) write-amplify(5.4) OK, records in: 8967, records dropped: 531 output_compression: NoCompression
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.705653) EVENT_LOG_v1 {"time_micros": 1759409003705644, "job": 72, "event": "compaction_finished", "compaction_time_micros": 94317, "compaction_time_cpu_micros": 32534, "output_level": 6, "num_output_files": 1, "total_output_size": 12766253, "num_input_records": 8967, "num_output_records": 8436, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000116.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409003706297, "job": 72, "event": "table_file_deletion", "file_number": 116}
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000114.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409003708675, "job": 72, "event": "table_file_deletion", "file_number": 114}
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.607860) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.708775) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.708784) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.708788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.708791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:43:23.708794) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:43:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:43:24 np0005465988 nova_compute[236126]: 2025-10-02 12:43:24.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:43:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:24.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:43:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:43:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:25.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:43:25 np0005465988 nova_compute[236126]: 2025-10-02 12:43:25.499 2 DEBUG oslo_concurrency.lockutils [None req-2bdde2e9-3f8e-4179-b58c-2327a249c93a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:25 np0005465988 nova_compute[236126]: 2025-10-02 12:43:25.500 2 DEBUG oslo_concurrency.lockutils [None req-2bdde2e9-3f8e-4179-b58c-2327a249c93a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:25 np0005465988 nova_compute[236126]: 2025-10-02 12:43:25.549 2 INFO nova.compute.manager [None req-2bdde2e9-3f8e-4179-b58c-2327a249c93a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Detaching volume 438ad3ce-5006-4a29-8013-2d6621c8349e#033[00m
Oct  2 08:43:25 np0005465988 nova_compute[236126]: 2025-10-02 12:43:25.671 2 INFO nova.virt.block_device [None req-2bdde2e9-3f8e-4179-b58c-2327a249c93a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Attempting to driver detach volume 438ad3ce-5006-4a29-8013-2d6621c8349e from mountpoint /dev/vdc#033[00m
Oct  2 08:43:25 np0005465988 nova_compute[236126]: 2025-10-02 12:43:25.685 2 DEBUG nova.virt.libvirt.driver [None req-2bdde2e9-3f8e-4179-b58c-2327a249c93a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Attempting to detach device vdc from instance 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:43:25 np0005465988 nova_compute[236126]: 2025-10-02 12:43:25.686 2 DEBUG nova.virt.libvirt.guest [None req-2bdde2e9-3f8e-4179-b58c-2327a249c93a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:43:25 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:43:25 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-438ad3ce-5006-4a29-8013-2d6621c8349e">
Oct  2 08:43:25 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:43:25 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:43:25 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:43:25 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:43:25 np0005465988 nova_compute[236126]:  <target dev="vdc" bus="virtio"/>
Oct  2 08:43:25 np0005465988 nova_compute[236126]:  <serial>438ad3ce-5006-4a29-8013-2d6621c8349e</serial>
Oct  2 08:43:25 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Oct  2 08:43:25 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:43:25 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:43:25 np0005465988 nova_compute[236126]: 2025-10-02 12:43:25.695 2 INFO nova.virt.libvirt.driver [None req-2bdde2e9-3f8e-4179-b58c-2327a249c93a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Successfully detached device vdc from instance 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c from the persistent domain config.#033[00m
Oct  2 08:43:25 np0005465988 nova_compute[236126]: 2025-10-02 12:43:25.696 2 DEBUG nova.virt.libvirt.driver [None req-2bdde2e9-3f8e-4179-b58c-2327a249c93a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Oct  2 08:43:25 np0005465988 nova_compute[236126]: 2025-10-02 12:43:25.697 2 DEBUG nova.virt.libvirt.guest [None req-2bdde2e9-3f8e-4179-b58c-2327a249c93a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:43:25 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:43:25 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-438ad3ce-5006-4a29-8013-2d6621c8349e">
Oct  2 08:43:25 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:43:25 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:43:25 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:43:25 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:43:25 np0005465988 nova_compute[236126]:  <target dev="vdc" bus="virtio"/>
Oct  2 08:43:25 np0005465988 nova_compute[236126]:  <serial>438ad3ce-5006-4a29-8013-2d6621c8349e</serial>
Oct  2 08:43:25 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Oct  2 08:43:25 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:43:25 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:43:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e350 e350: 3 total, 3 up, 3 in
Oct  2 08:43:26 np0005465988 nova_compute[236126]: 2025-10-02 12:43:26.169 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Received event <DeviceRemovedEvent: 1759409006.1686094, 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Oct  2 08:43:26 np0005465988 nova_compute[236126]: 2025-10-02 12:43:26.172 2 DEBUG nova.virt.libvirt.driver [None req-2bdde2e9-3f8e-4179-b58c-2327a249c93a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Oct  2 08:43:26 np0005465988 nova_compute[236126]: 2025-10-02 12:43:26.175 2 INFO nova.virt.libvirt.driver [None req-2bdde2e9-3f8e-4179-b58c-2327a249c93a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Successfully detached device vdc from instance 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c from the live domain config.#033[00m
Oct  2 08:43:26 np0005465988 nova_compute[236126]: 2025-10-02 12:43:26.432 2 DEBUG nova.objects.instance [None req-2bdde2e9-3f8e-4179-b58c-2327a249c93a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'flavor' on Instance uuid 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:43:26 np0005465988 nova_compute[236126]: 2025-10-02 12:43:26.651 2 DEBUG oslo_concurrency.lockutils [None req-2bdde2e9-3f8e-4179-b58c-2327a249c93a b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:26 np0005465988 nova_compute[236126]: 2025-10-02 12:43:26.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:43:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:26.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:43:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:43:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:43:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:27.382 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:27.383 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:27.383 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:27.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:28 np0005465988 nova_compute[236126]: 2025-10-02 12:43:28.548 2 DEBUG nova.compute.manager [req-c9fc2f71-abb3-47c8-ae36-29f45618b288 req-42e24488-d93a-4f0e-a8bb-c0628f27cfd4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received event network-changed-6797a28e-4489-4337-b4be-f09d77787856 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:28 np0005465988 nova_compute[236126]: 2025-10-02 12:43:28.548 2 DEBUG nova.compute.manager [req-c9fc2f71-abb3-47c8-ae36-29f45618b288 req-42e24488-d93a-4f0e-a8bb-c0628f27cfd4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Refreshing instance network info cache due to event network-changed-6797a28e-4489-4337-b4be-f09d77787856. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:43:28 np0005465988 nova_compute[236126]: 2025-10-02 12:43:28.549 2 DEBUG oslo_concurrency.lockutils [req-c9fc2f71-abb3-47c8-ae36-29f45618b288 req-42e24488-d93a-4f0e-a8bb-c0628f27cfd4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:28 np0005465988 nova_compute[236126]: 2025-10-02 12:43:28.549 2 DEBUG oslo_concurrency.lockutils [req-c9fc2f71-abb3-47c8-ae36-29f45618b288 req-42e24488-d93a-4f0e-a8bb-c0628f27cfd4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:28 np0005465988 nova_compute[236126]: 2025-10-02 12:43:28.549 2 DEBUG nova.network.neutron [req-c9fc2f71-abb3-47c8-ae36-29f45618b288 req-42e24488-d93a-4f0e-a8bb-c0628f27cfd4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Refreshing network info cache for port 6797a28e-4489-4337-b4be-f09d77787856 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:43:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e351 e351: 3 total, 3 up, 3 in
Oct  2 08:43:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:43:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:28.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:43:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e351 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:43:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 08:43:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.0 total, 600.0 interval#012Cumulative writes: 11K writes, 59K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.12 GB, 0.03 MB/s#012Cumulative WAL: 11K writes, 11K syncs, 1.00 writes per sync, written: 0.12 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1583 writes, 7870 keys, 1583 commit groups, 1.0 writes per commit group, ingest: 15.87 MB, 0.03 MB/s#012Interval WAL: 1583 writes, 1583 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     60.3      1.20              0.25        36    0.033       0      0       0.0       0.0#012  L6      1/0   12.17 MB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   4.6    124.7    105.7      3.18              1.26        35    0.091    224K    19K       0.0       0.0#012 Sum      1/0   12.17 MB   0.0      0.4     0.1      0.3       0.4      0.1       0.0   5.6     90.5     93.3      4.38              1.51        71    0.062    224K    19K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.0     63.3     65.8      1.20              0.25        12    0.100     51K   3093       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   0.0    124.7    105.7      3.18              1.26        35    0.091    224K    19K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     60.4      1.20              0.25        35    0.034       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4200.0 total, 600.0 interval#012Flush(GB): cumulative 0.071, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.40 GB write, 0.10 MB/s write, 0.39 GB read, 0.09 MB/s read, 4.4 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.07 GB read, 0.13 MB/s read, 1.2 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b3e97ad1f0#2 capacity: 304.00 MB usage: 43.51 MB table_size: 0 occupancy: 18446744073709551615 collections: 8 last_copies: 0 last_secs: 0.000494 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2529,41.82 MB,13.7574%) FilterBlock(71,634.17 KB,0.20372%) IndexBlock(71,1.06 MB,0.350014%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct  2 08:43:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:29.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:29 np0005465988 nova_compute[236126]: 2025-10-02 12:43:29.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:30 np0005465988 nova_compute[236126]: 2025-10-02 12:43:30.226 2 DEBUG nova.network.neutron [req-c9fc2f71-abb3-47c8-ae36-29f45618b288 req-42e24488-d93a-4f0e-a8bb-c0628f27cfd4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updated VIF entry in instance network info cache for port 6797a28e-4489-4337-b4be-f09d77787856. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:43:30 np0005465988 nova_compute[236126]: 2025-10-02 12:43:30.227 2 DEBUG nova.network.neutron [req-c9fc2f71-abb3-47c8-ae36-29f45618b288 req-42e24488-d93a-4f0e-a8bb-c0628f27cfd4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updating instance_info_cache with network_info: [{"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:30 np0005465988 nova_compute[236126]: 2025-10-02 12:43:30.467 2 DEBUG oslo_concurrency.lockutils [req-c9fc2f71-abb3-47c8-ae36-29f45618b288 req-42e24488-d93a-4f0e-a8bb-c0628f27cfd4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:30.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:31.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:31 np0005465988 nova_compute[236126]: 2025-10-02 12:43:31.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:43:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:32.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:43:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:33.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e352 e352: 3 total, 3 up, 3 in
Oct  2 08:43:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e352 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:43:34 np0005465988 nova_compute[236126]: 2025-10-02 12:43:34.055 2 DEBUG nova.compute.manager [req-c339703b-7dbd-48cb-852d-dbad5d8c03ec req-b1b4dbaa-161f-42d0-8add-cfa393aa867a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received event network-changed-6797a28e-4489-4337-b4be-f09d77787856 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:34 np0005465988 nova_compute[236126]: 2025-10-02 12:43:34.055 2 DEBUG nova.compute.manager [req-c339703b-7dbd-48cb-852d-dbad5d8c03ec req-b1b4dbaa-161f-42d0-8add-cfa393aa867a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Refreshing instance network info cache due to event network-changed-6797a28e-4489-4337-b4be-f09d77787856. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:43:34 np0005465988 nova_compute[236126]: 2025-10-02 12:43:34.056 2 DEBUG oslo_concurrency.lockutils [req-c339703b-7dbd-48cb-852d-dbad5d8c03ec req-b1b4dbaa-161f-42d0-8add-cfa393aa867a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:34 np0005465988 nova_compute[236126]: 2025-10-02 12:43:34.056 2 DEBUG oslo_concurrency.lockutils [req-c339703b-7dbd-48cb-852d-dbad5d8c03ec req-b1b4dbaa-161f-42d0-8add-cfa393aa867a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:34 np0005465988 nova_compute[236126]: 2025-10-02 12:43:34.056 2 DEBUG nova.network.neutron [req-c339703b-7dbd-48cb-852d-dbad5d8c03ec req-b1b4dbaa-161f-42d0-8add-cfa393aa867a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Refreshing network info cache for port 6797a28e-4489-4337-b4be-f09d77787856 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:43:34 np0005465988 nova_compute[236126]: 2025-10-02 12:43:34.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:43:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:34.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:43:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:43:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:35.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:43:35 np0005465988 nova_compute[236126]: 2025-10-02 12:43:35.662 2 DEBUG nova.network.neutron [req-c339703b-7dbd-48cb-852d-dbad5d8c03ec req-b1b4dbaa-161f-42d0-8add-cfa393aa867a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updated VIF entry in instance network info cache for port 6797a28e-4489-4337-b4be-f09d77787856. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:43:35 np0005465988 nova_compute[236126]: 2025-10-02 12:43:35.663 2 DEBUG nova.network.neutron [req-c339703b-7dbd-48cb-852d-dbad5d8c03ec req-b1b4dbaa-161f-42d0-8add-cfa393aa867a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updating instance_info_cache with network_info: [{"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:35 np0005465988 nova_compute[236126]: 2025-10-02 12:43:35.772 2 DEBUG oslo_concurrency.lockutils [req-c339703b-7dbd-48cb-852d-dbad5d8c03ec req-b1b4dbaa-161f-42d0-8add-cfa393aa867a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:36 np0005465988 nova_compute[236126]: 2025-10-02 12:43:36.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:36.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e353 e353: 3 total, 3 up, 3 in
Oct  2 08:43:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:37.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:37 np0005465988 podman[307888]: 2025-10-02 12:43:37.570129699 +0000 UTC m=+0.078230854 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:43:37 np0005465988 podman[307887]: 2025-10-02 12:43:37.571047015 +0000 UTC m=+0.090460652 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:43:37 np0005465988 podman[307886]: 2025-10-02 12:43:37.605030351 +0000 UTC m=+0.129427529 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:43:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e354 e354: 3 total, 3 up, 3 in
Oct  2 08:43:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:38.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:38.890 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:43:38 np0005465988 nova_compute[236126]: 2025-10-02 12:43:38.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:38.892 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:43:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:43:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:39.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:39 np0005465988 nova_compute[236126]: 2025-10-02 12:43:39.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:43:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:40.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:43:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:43:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:41.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:43:41 np0005465988 nova_compute[236126]: 2025-10-02 12:43:41.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:42 np0005465988 nova_compute[236126]: 2025-10-02 12:43:42.643 2 DEBUG nova.compute.manager [req-6004b449-e57c-43a8-be2b-bc5a5289526a req-02a62ea2-8653-4169-90e6-656477419446 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received event network-changed-ffc7f957-3806-432f-a6e7-5ea3c764735a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:42 np0005465988 nova_compute[236126]: 2025-10-02 12:43:42.644 2 DEBUG nova.compute.manager [req-6004b449-e57c-43a8-be2b-bc5a5289526a req-02a62ea2-8653-4169-90e6-656477419446 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Refreshing instance network info cache due to event network-changed-ffc7f957-3806-432f-a6e7-5ea3c764735a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:43:42 np0005465988 nova_compute[236126]: 2025-10-02 12:43:42.644 2 DEBUG oslo_concurrency.lockutils [req-6004b449-e57c-43a8-be2b-bc5a5289526a req-02a62ea2-8653-4169-90e6-656477419446 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:42 np0005465988 nova_compute[236126]: 2025-10-02 12:43:42.645 2 DEBUG oslo_concurrency.lockutils [req-6004b449-e57c-43a8-be2b-bc5a5289526a req-02a62ea2-8653-4169-90e6-656477419446 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:42 np0005465988 nova_compute[236126]: 2025-10-02 12:43:42.645 2 DEBUG nova.network.neutron [req-6004b449-e57c-43a8-be2b-bc5a5289526a req-02a62ea2-8653-4169-90e6-656477419446 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Refreshing network info cache for port ffc7f957-3806-432f-a6e7-5ea3c764735a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:43:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:43:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:42.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:43:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:43.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:43:44 np0005465988 nova_compute[236126]: 2025-10-02 12:43:44.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:44.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:45.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:43:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:46.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:43:46 np0005465988 nova_compute[236126]: 2025-10-02 12:43:46.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:46.894 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.143 2 DEBUG nova.network.neutron [req-6004b449-e57c-43a8-be2b-bc5a5289526a req-02a62ea2-8653-4169-90e6-656477419446 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updated VIF entry in instance network info cache for port ffc7f957-3806-432f-a6e7-5ea3c764735a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.143 2 DEBUG nova.network.neutron [req-6004b449-e57c-43a8-be2b-bc5a5289526a req-02a62ea2-8653-4169-90e6-656477419446 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updating instance_info_cache with network_info: [{"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.251 2 DEBUG oslo_concurrency.lockutils [req-6004b449-e57c-43a8-be2b-bc5a5289526a req-02a62ea2-8653-4169-90e6-656477419446 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-5a11dca9-ede5-4fdd-af8e-7936ff4f9980" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:47.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.646 2 DEBUG oslo_concurrency.lockutils [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.646 2 DEBUG oslo_concurrency.lockutils [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.647 2 DEBUG oslo_concurrency.lockutils [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.648 2 DEBUG oslo_concurrency.lockutils [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.649 2 DEBUG oslo_concurrency.lockutils [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.651 2 INFO nova.compute.manager [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Terminating instance#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.653 2 DEBUG nova.compute.manager [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:43:47 np0005465988 kernel: tap6797a28e-44 (unregistering): left promiscuous mode
Oct  2 08:43:47 np0005465988 NetworkManager[45041]: <info>  [1759409027.7466] device (tap6797a28e-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:43:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:43:47Z|00705|binding|INFO|Releasing lport 6797a28e-4489-4337-b4be-f09d77787856 from this chassis (sb_readonly=0)
Oct  2 08:43:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:43:47Z|00706|binding|INFO|Setting lport 6797a28e-4489-4337-b4be-f09d77787856 down in Southbound
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:43:47Z|00707|binding|INFO|Removing iface tap6797a28e-44 ovn-installed in OVS
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:47.807 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:5d:14 10.100.0.4'], port_security=['fa:16:3e:34:5d:14 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '1ce0c3bd-552b-4bc2-95e9-ccac7b24593c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a41d99312f014c65adddea4f70536a15', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5f3db9ba-e6e8-41b4-b916-387b4ad385f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4eaf2b53-ef61-475e-8161-94a8e63ff149, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=6797a28e-4489-4337-b4be-f09d77787856) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:43:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:47.809 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 6797a28e-4489-4337-b4be-f09d77787856 in datapath e7b8a8de-b6cd-4283-854b-a2bd919c371d unbound from our chassis#033[00m
Oct  2 08:43:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:47.812 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e7b8a8de-b6cd-4283-854b-a2bd919c371d#033[00m
Oct  2 08:43:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:47.834 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[95643c07-625b-4a58-a717-7d65a5d27887]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:47 np0005465988 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Oct  2 08:43:47 np0005465988 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000009b.scope: Consumed 20.110s CPU time.
Oct  2 08:43:47 np0005465988 systemd-machined[192594]: Machine qemu-72-instance-0000009b terminated.
Oct  2 08:43:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:47.869 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5af0b61b-733e-420e-973e-cc50bd6b5e32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:47.874 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5b006f5d-4b68-49dc-a419-ce9c25c52180]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.891 2 INFO nova.virt.libvirt.driver [-] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Instance destroyed successfully.#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.892 2 DEBUG nova.objects.instance [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'resources' on Instance uuid 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:43:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:47.915 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[7bae1252-103a-42e6-8ce8-122b027d298e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:47.941 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[753410eb-e4c2-4dd0-8d1e-737a28c7b111]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape7b8a8de-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:18:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694593, 'reachable_time': 16514, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 307976, 'error': None, 'target': 'ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.945 2 DEBUG nova.virt.libvirt.vif [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:41:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1280805717',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1280805717',id=155,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKhJxrDtnxwBQUfhXEoiE7UJdnEItyt2MVgFBXsCoh01cS2FKjJZa0tSLP7/9uktcmwDXaXDiKLD638dMdEY8dQy2aXxdKxSuJAyk4atAc8PHb6iv+FO/634dBFNFVRVg==',key_name='tempest-TestInstancesWithCinderVolumes-1888663332',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:41:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a41d99312f014c65adddea4f70536a15',ramdisk_id='',reservation_id='r-c03q7o85',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',im
age_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestInstancesWithCinderVolumes-99684106',owner_user_name='tempest-TestInstancesWithCinderVolumes-99684106-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:41:46Z,user_data=None,user_id='b82c89ad6c4a49e78943f7a92d0a6560',uuid=1ce0c3bd-552b-4bc2-95e9-ccac7b24593c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.946 2 DEBUG nova.network.os_vif_util [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Converting VIF {"id": "6797a28e-4489-4337-b4be-f09d77787856", "address": "fa:16:3e:34:5d:14", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6797a28e-44", "ovs_interfaceid": "6797a28e-4489-4337-b4be-f09d77787856", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.947 2 DEBUG nova.network.os_vif_util [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:34:5d:14,bridge_name='br-int',has_traffic_filtering=True,id=6797a28e-4489-4337-b4be-f09d77787856,network=Network(e7b8a8de-b6cd-4283-854b-a2bd919c371d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6797a28e-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.948 2 DEBUG os_vif [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:5d:14,bridge_name='br-int',has_traffic_filtering=True,id=6797a28e-4489-4337-b4be-f09d77787856,network=Network(e7b8a8de-b6cd-4283-854b-a2bd919c371d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6797a28e-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.951 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6797a28e-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:47.968 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a836226c-1e5d-4dd8-871d-79bba98a4f5c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape7b8a8de-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 694608, 'tstamp': 694608}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 307977, 'error': None, 'target': 'ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape7b8a8de-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 694611, 'tstamp': 694611}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 307977, 'error': None, 'target': 'ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:47.970 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7b8a8de-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:43:47 np0005465988 nova_compute[236126]: 2025-10-02 12:43:47.988 2 INFO os_vif [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:5d:14,bridge_name='br-int',has_traffic_filtering=True,id=6797a28e-4489-4337-b4be-f09d77787856,network=Network(e7b8a8de-b6cd-4283-854b-a2bd919c371d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6797a28e-44')#033[00m
Oct  2 08:43:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:47.987 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7b8a8de-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:47.987 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:43:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:47.988 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape7b8a8de-b0, col_values=(('external_ids', {'iface-id': '79bf28ab-e58e-4276-adf8-279ba85b1b49'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:43:47.989 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:43:48 np0005465988 nova_compute[236126]: 2025-10-02 12:43:48.544 2 INFO nova.virt.libvirt.driver [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Deleting instance files /var/lib/nova/instances/1ce0c3bd-552b-4bc2-95e9-ccac7b24593c_del#033[00m
Oct  2 08:43:48 np0005465988 nova_compute[236126]: 2025-10-02 12:43:48.545 2 INFO nova.virt.libvirt.driver [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Deletion of /var/lib/nova/instances/1ce0c3bd-552b-4bc2-95e9-ccac7b24593c_del complete#033[00m
Oct  2 08:43:48 np0005465988 nova_compute[236126]: 2025-10-02 12:43:48.825 2 INFO nova.compute.manager [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Took 1.17 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:43:48 np0005465988 nova_compute[236126]: 2025-10-02 12:43:48.826 2 DEBUG oslo.service.loopingcall [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:43:48 np0005465988 nova_compute[236126]: 2025-10-02 12:43:48.827 2 DEBUG nova.compute.manager [-] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:43:48 np0005465988 nova_compute[236126]: 2025-10-02 12:43:48.827 2 DEBUG nova.network.neutron [-] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:43:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:48.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:43:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:49.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.751 2 DEBUG nova.compute.manager [req-0c76c6c2-5e21-4be8-bf2d-b88a78430f02 req-9b581f29-0006-40a8-be69-87cabbc057c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received event network-vif-unplugged-6797a28e-4489-4337-b4be-f09d77787856 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.752 2 DEBUG oslo_concurrency.lockutils [req-0c76c6c2-5e21-4be8-bf2d-b88a78430f02 req-9b581f29-0006-40a8-be69-87cabbc057c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.752 2 DEBUG oslo_concurrency.lockutils [req-0c76c6c2-5e21-4be8-bf2d-b88a78430f02 req-9b581f29-0006-40a8-be69-87cabbc057c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.753 2 DEBUG oslo_concurrency.lockutils [req-0c76c6c2-5e21-4be8-bf2d-b88a78430f02 req-9b581f29-0006-40a8-be69-87cabbc057c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.753 2 DEBUG nova.compute.manager [req-0c76c6c2-5e21-4be8-bf2d-b88a78430f02 req-9b581f29-0006-40a8-be69-87cabbc057c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] No waiting events found dispatching network-vif-unplugged-6797a28e-4489-4337-b4be-f09d77787856 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.753 2 DEBUG nova.compute.manager [req-0c76c6c2-5e21-4be8-bf2d-b88a78430f02 req-9b581f29-0006-40a8-be69-87cabbc057c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received event network-vif-unplugged-6797a28e-4489-4337-b4be-f09d77787856 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.753 2 DEBUG nova.compute.manager [req-0c76c6c2-5e21-4be8-bf2d-b88a78430f02 req-9b581f29-0006-40a8-be69-87cabbc057c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received event network-vif-plugged-6797a28e-4489-4337-b4be-f09d77787856 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.753 2 DEBUG oslo_concurrency.lockutils [req-0c76c6c2-5e21-4be8-bf2d-b88a78430f02 req-9b581f29-0006-40a8-be69-87cabbc057c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.754 2 DEBUG oslo_concurrency.lockutils [req-0c76c6c2-5e21-4be8-bf2d-b88a78430f02 req-9b581f29-0006-40a8-be69-87cabbc057c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.754 2 DEBUG oslo_concurrency.lockutils [req-0c76c6c2-5e21-4be8-bf2d-b88a78430f02 req-9b581f29-0006-40a8-be69-87cabbc057c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.754 2 DEBUG nova.compute.manager [req-0c76c6c2-5e21-4be8-bf2d-b88a78430f02 req-9b581f29-0006-40a8-be69-87cabbc057c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] No waiting events found dispatching network-vif-plugged-6797a28e-4489-4337-b4be-f09d77787856 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.754 2 WARNING nova.compute.manager [req-0c76c6c2-5e21-4be8-bf2d-b88a78430f02 req-9b581f29-0006-40a8-be69-87cabbc057c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received unexpected event network-vif-plugged-6797a28e-4489-4337-b4be-f09d77787856 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.758 2 DEBUG nova.network.neutron [-] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:49 np0005465988 nova_compute[236126]: 2025-10-02 12:43:49.909 2 INFO nova.compute.manager [-] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Took 1.08 seconds to deallocate network for instance.#033[00m
Oct  2 08:43:50 np0005465988 nova_compute[236126]: 2025-10-02 12:43:50.265 2 INFO nova.compute.manager [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Took 0.35 seconds to detach 1 volumes for instance.#033[00m
Oct  2 08:43:50 np0005465988 nova_compute[236126]: 2025-10-02 12:43:50.586 2 DEBUG oslo_concurrency.lockutils [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:50 np0005465988 nova_compute[236126]: 2025-10-02 12:43:50.586 2 DEBUG oslo_concurrency.lockutils [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:50 np0005465988 nova_compute[236126]: 2025-10-02 12:43:50.663 2 DEBUG oslo_concurrency.processutils [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:50.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:43:51 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1716887063' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:43:51 np0005465988 nova_compute[236126]: 2025-10-02 12:43:51.118 2 DEBUG oslo_concurrency.processutils [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:51 np0005465988 nova_compute[236126]: 2025-10-02 12:43:51.124 2 DEBUG nova.compute.provider_tree [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:43:51 np0005465988 nova_compute[236126]: 2025-10-02 12:43:51.336 2 DEBUG nova.scheduler.client.report [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:43:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:51.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:51 np0005465988 nova_compute[236126]: 2025-10-02 12:43:51.582 2 DEBUG oslo_concurrency.lockutils [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.996s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:51 np0005465988 nova_compute[236126]: 2025-10-02 12:43:51.741 2 INFO nova.scheduler.client.report [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Deleted allocations for instance 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c#033[00m
Oct  2 08:43:52 np0005465988 nova_compute[236126]: 2025-10-02 12:43:52.059 2 DEBUG oslo_concurrency.lockutils [None req-1e010df6-d2fb-4800-9c44-86670f52944d b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "1ce0c3bd-552b-4bc2-95e9-ccac7b24593c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:52 np0005465988 nova_compute[236126]: 2025-10-02 12:43:52.234 2 DEBUG nova.compute.manager [req-1b90779c-47c5-4de1-9555-d882453754e4 req-5f2c4521-b6e6-4812-9e26-2dfcc8de4580 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Received event network-vif-deleted-6797a28e-4489-4337-b4be-f09d77787856 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:52.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:52 np0005465988 nova_compute[236126]: 2025-10-02 12:43:52.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:53.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:53 np0005465988 podman[308072]: 2025-10-02 12:43:53.526888312 +0000 UTC m=+0.066227193 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:43:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:43:54 np0005465988 nova_compute[236126]: 2025-10-02 12:43:54.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:43:54Z|00708|binding|INFO|Releasing lport 79bf28ab-e58e-4276-adf8-279ba85b1b49 from this chassis (sb_readonly=0)
Oct  2 08:43:54 np0005465988 nova_compute[236126]: 2025-10-02 12:43:54.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:43:54Z|00709|binding|INFO|Releasing lport 79bf28ab-e58e-4276-adf8-279ba85b1b49 from this chassis (sb_readonly=0)
Oct  2 08:43:54 np0005465988 nova_compute[236126]: 2025-10-02 12:43:54.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:43:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:54.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:43:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:43:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:55.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:43:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:43:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:56.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:43:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:43:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:57.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:43:58 np0005465988 nova_compute[236126]: 2025-10-02 12:43:58.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:58 np0005465988 nova_compute[236126]: 2025-10-02 12:43:58.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:43:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:43:58.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:43:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:43:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:43:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:43:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:43:59.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:43:59 np0005465988 nova_compute[236126]: 2025-10-02 12:43:59.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:00.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:01.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:01 np0005465988 nova_compute[236126]: 2025-10-02 12:44:01.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:01 np0005465988 nova_compute[236126]: 2025-10-02 12:44:01.500 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:01 np0005465988 nova_compute[236126]: 2025-10-02 12:44:01.501 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:01 np0005465988 nova_compute[236126]: 2025-10-02 12:44:01.501 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:01 np0005465988 nova_compute[236126]: 2025-10-02 12:44:01.501 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:44:01 np0005465988 nova_compute[236126]: 2025-10-02 12:44:01.502 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:44:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2310385740' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.024 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.092 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000098 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.092 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-00000098 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.277 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.279 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3981MB free_disk=20.96723175048828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.279 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.280 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.523 2 DEBUG oslo_concurrency.lockutils [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.524 2 DEBUG oslo_concurrency.lockutils [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.524 2 DEBUG oslo_concurrency.lockutils [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.525 2 DEBUG oslo_concurrency.lockutils [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.525 2 DEBUG oslo_concurrency.lockutils [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.527 2 INFO nova.compute.manager [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Terminating instance#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.528 2 DEBUG nova.compute.manager [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.621 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.621 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.622 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.708 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:02 np0005465988 kernel: tapffc7f957-38 (unregistering): left promiscuous mode
Oct  2 08:44:02 np0005465988 NetworkManager[45041]: <info>  [1759409042.7311] device (tapffc7f957-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:44:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:02Z|00710|binding|INFO|Releasing lport ffc7f957-3806-432f-a6e7-5ea3c764735a from this chassis (sb_readonly=0)
Oct  2 08:44:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:02Z|00711|binding|INFO|Setting lport ffc7f957-3806-432f-a6e7-5ea3c764735a down in Southbound
Oct  2 08:44:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:02Z|00712|binding|INFO|Removing iface tapffc7f957-38 ovn-installed in OVS
Oct  2 08:44:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:02.751 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:52:ab 10.100.0.5'], port_security=['fa:16:3e:5f:52:ab 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '5a11dca9-ede5-4fdd-af8e-7936ff4f9980', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a41d99312f014c65adddea4f70536a15', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5f3db9ba-e6e8-41b4-b916-387b4ad385f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4eaf2b53-ef61-475e-8161-94a8e63ff149, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=ffc7f957-3806-432f-a6e7-5ea3c764735a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:44:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:02.752 142124 INFO neutron.agent.ovn.metadata.agent [-] Port ffc7f957-3806-432f-a6e7-5ea3c764735a in datapath e7b8a8de-b6cd-4283-854b-a2bd919c371d unbound from our chassis#033[00m
Oct  2 08:44:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:02.759 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e7b8a8de-b6cd-4283-854b-a2bd919c371d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:02.763 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2c4ebe5a-92ed-406f-8281-6877585d2d0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:02.765 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d namespace which is not needed anymore#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:02 np0005465988 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000098.scope: Deactivated successfully.
Oct  2 08:44:02 np0005465988 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000098.scope: Consumed 22.497s CPU time.
Oct  2 08:44:02 np0005465988 systemd-machined[192594]: Machine qemu-71-instance-00000098 terminated.
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.888 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409027.8868785, 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.889 2 INFO nova.compute.manager [-] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:44:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:44:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:02.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.926 2 DEBUG nova.compute.manager [None req-ea1d5ed8-79fa-44d5-92f9-d56c36d3652d - - - - - -] [instance: 1ce0c3bd-552b-4bc2-95e9-ccac7b24593c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:02 np0005465988 neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d[305655]: [NOTICE]   (305659) : haproxy version is 2.8.14-c23fe91
Oct  2 08:44:02 np0005465988 neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d[305655]: [NOTICE]   (305659) : path to executable is /usr/sbin/haproxy
Oct  2 08:44:02 np0005465988 neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d[305655]: [WARNING]  (305659) : Exiting Master process...
Oct  2 08:44:02 np0005465988 neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d[305655]: [ALERT]    (305659) : Current worker (305661) exited with code 143 (Terminated)
Oct  2 08:44:02 np0005465988 neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d[305655]: [WARNING]  (305659) : All workers exited. Exiting... (0)
Oct  2 08:44:02 np0005465988 systemd[1]: libpod-14df4f6226daf267010745f7d308552c4d89813658fe6e141342e15626dfbc42.scope: Deactivated successfully.
Oct  2 08:44:02 np0005465988 podman[308145]: 2025-10-02 12:44:02.948974935 +0000 UTC m=+0.075331482 container died 14df4f6226daf267010745f7d308552c4d89813658fe6e141342e15626dfbc42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.973 2 INFO nova.virt.libvirt.driver [-] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Instance destroyed successfully.#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.974 2 DEBUG nova.objects.instance [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lazy-loading 'resources' on Instance uuid 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.988 2 DEBUG nova.virt.libvirt.vif [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:41:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1297354926',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1297354926',id=152,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKhJxrDtnxwBQUfhXEoiE7UJdnEItyt2MVgFBXsCoh01cS2FKjJZa0tSLP7/9uktcmwDXaXDiKLD638dMdEY8dQy2aXxdKxSuJAyk4atAc8PHb6iv+FO/634dBFNFVRVg==',key_name='tempest-TestInstancesWithCinderVolumes-1888663332',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:41:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a41d99312f014c65adddea4f70536a15',ramdisk_id='',reservation_id='r-yursnfjy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestInstancesWithCinderVolumes-99684106',owner_user_name='tempest-TestInstancesWithCinderVolumes-99684106-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:41:30Z,user_data=None,user_id='b82c89ad6c4a49e78943f7a92d0a6560',uuid=5a11dca9-ede5-4fdd-af8e-7936ff4f9980,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.989 2 DEBUG nova.network.os_vif_util [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Converting VIF {"id": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "address": "fa:16:3e:5f:52:ab", "network": {"id": "e7b8a8de-b6cd-4283-854b-a2bd919c371d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1851369337-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a41d99312f014c65adddea4f70536a15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffc7f957-38", "ovs_interfaceid": "ffc7f957-3806-432f-a6e7-5ea3c764735a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.989 2 DEBUG nova.network.os_vif_util [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:52:ab,bridge_name='br-int',has_traffic_filtering=True,id=ffc7f957-3806-432f-a6e7-5ea3c764735a,network=Network(e7b8a8de-b6cd-4283-854b-a2bd919c371d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffc7f957-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.990 2 DEBUG os_vif [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:52:ab,bridge_name='br-int',has_traffic_filtering=True,id=ffc7f957-3806-432f-a6e7-5ea3c764735a,network=Network(e7b8a8de-b6cd-4283-854b-a2bd919c371d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffc7f957-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.991 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapffc7f957-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:44:02 np0005465988 nova_compute[236126]: 2025-10-02 12:44:02.997 2 INFO os_vif [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:52:ab,bridge_name='br-int',has_traffic_filtering=True,id=ffc7f957-3806-432f-a6e7-5ea3c764735a,network=Network(e7b8a8de-b6cd-4283-854b-a2bd919c371d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffc7f957-38')#033[00m
Oct  2 08:44:03 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-14df4f6226daf267010745f7d308552c4d89813658fe6e141342e15626dfbc42-userdata-shm.mount: Deactivated successfully.
Oct  2 08:44:03 np0005465988 systemd[1]: var-lib-containers-storage-overlay-bb5001e0e485dd34d2bc40f97d2448ee3f37bb25495f0a4744ddc83e66e58ccd-merged.mount: Deactivated successfully.
Oct  2 08:44:03 np0005465988 podman[308145]: 2025-10-02 12:44:03.031234083 +0000 UTC m=+0.157590620 container cleanup 14df4f6226daf267010745f7d308552c4d89813658fe6e141342e15626dfbc42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:44:03 np0005465988 systemd[1]: libpod-conmon-14df4f6226daf267010745f7d308552c4d89813658fe6e141342e15626dfbc42.scope: Deactivated successfully.
Oct  2 08:44:03 np0005465988 podman[308220]: 2025-10-02 12:44:03.109580139 +0000 UTC m=+0.050160906 container remove 14df4f6226daf267010745f7d308552c4d89813658fe6e141342e15626dfbc42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  2 08:44:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:03.116 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8633c6eb-9784-4571-8bd9-820603a41a33]: (4, ('Thu Oct  2 12:44:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d (14df4f6226daf267010745f7d308552c4d89813658fe6e141342e15626dfbc42)\n14df4f6226daf267010745f7d308552c4d89813658fe6e141342e15626dfbc42\nThu Oct  2 12:44:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d (14df4f6226daf267010745f7d308552c4d89813658fe6e141342e15626dfbc42)\n14df4f6226daf267010745f7d308552c4d89813658fe6e141342e15626dfbc42\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:03.120 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4b61b061-d5ff-49b5-8694-5890b73030d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:03.122 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7b8a8de-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:03 np0005465988 nova_compute[236126]: 2025-10-02 12:44:03.122 2 DEBUG nova.compute.manager [req-4842a09a-b498-4223-b9e1-22b3a08bce75 req-8b2d9ac7-bbba-44b5-901e-e676442e2080 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received event network-vif-unplugged-ffc7f957-3806-432f-a6e7-5ea3c764735a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:03 np0005465988 nova_compute[236126]: 2025-10-02 12:44:03.122 2 DEBUG oslo_concurrency.lockutils [req-4842a09a-b498-4223-b9e1-22b3a08bce75 req-8b2d9ac7-bbba-44b5-901e-e676442e2080 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:03 np0005465988 nova_compute[236126]: 2025-10-02 12:44:03.122 2 DEBUG oslo_concurrency.lockutils [req-4842a09a-b498-4223-b9e1-22b3a08bce75 req-8b2d9ac7-bbba-44b5-901e-e676442e2080 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:03 np0005465988 nova_compute[236126]: 2025-10-02 12:44:03.123 2 DEBUG oslo_concurrency.lockutils [req-4842a09a-b498-4223-b9e1-22b3a08bce75 req-8b2d9ac7-bbba-44b5-901e-e676442e2080 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:03 np0005465988 nova_compute[236126]: 2025-10-02 12:44:03.123 2 DEBUG nova.compute.manager [req-4842a09a-b498-4223-b9e1-22b3a08bce75 req-8b2d9ac7-bbba-44b5-901e-e676442e2080 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] No waiting events found dispatching network-vif-unplugged-ffc7f957-3806-432f-a6e7-5ea3c764735a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:44:03 np0005465988 nova_compute[236126]: 2025-10-02 12:44:03.123 2 DEBUG nova.compute.manager [req-4842a09a-b498-4223-b9e1-22b3a08bce75 req-8b2d9ac7-bbba-44b5-901e-e676442e2080 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received event network-vif-unplugged-ffc7f957-3806-432f-a6e7-5ea3c764735a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:44:03 np0005465988 nova_compute[236126]: 2025-10-02 12:44:03.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:03 np0005465988 kernel: tape7b8a8de-b0: left promiscuous mode
Oct  2 08:44:03 np0005465988 nova_compute[236126]: 2025-10-02 12:44:03.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:03.129 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[17c89042-8554-47e1-a768-93c659b0c57a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:03 np0005465988 nova_compute[236126]: 2025-10-02 12:44:03.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:03.156 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3c84188a-c005-4e4a-9820-47e45cf5072c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:03.164 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d419f7de-683c-42f9-acca-41712a2267e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:03.181 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[97b79570-c5cc-451b-b4af-6190846b6ad6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694582, 'reachable_time': 24576, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308238, 'error': None, 'target': 'ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:03 np0005465988 systemd[1]: run-netns-ovnmeta\x2de7b8a8de\x2db6cd\x2d4283\x2d854b\x2da2bd919c371d.mount: Deactivated successfully.
Oct  2 08:44:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:03.185 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e7b8a8de-b6cd-4283-854b-a2bd919c371d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:44:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:03.186 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[fce694c1-a11b-47a5-b51f-67599e25799f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:44:03 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/109427468' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:44:03 np0005465988 nova_compute[236126]: 2025-10-02 12:44:03.214 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:03 np0005465988 nova_compute[236126]: 2025-10-02 12:44:03.220 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:44:03 np0005465988 nova_compute[236126]: 2025-10-02 12:44:03.239 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:44:03 np0005465988 nova_compute[236126]: 2025-10-02 12:44:03.258 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:44:03 np0005465988 nova_compute[236126]: 2025-10-02 12:44:03.258 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.979s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:44:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:03.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:44:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:44:04 np0005465988 nova_compute[236126]: 2025-10-02 12:44:04.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:44:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:04.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:44:05 np0005465988 nova_compute[236126]: 2025-10-02 12:44:05.248 2 DEBUG nova.compute.manager [req-5ab9b493-9d73-4cd2-a9ef-7f3fe6b3e93a req-970f656e-e844-4c58-a64e-e387324a0557 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received event network-vif-plugged-ffc7f957-3806-432f-a6e7-5ea3c764735a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:05 np0005465988 nova_compute[236126]: 2025-10-02 12:44:05.249 2 DEBUG oslo_concurrency.lockutils [req-5ab9b493-9d73-4cd2-a9ef-7f3fe6b3e93a req-970f656e-e844-4c58-a64e-e387324a0557 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:05 np0005465988 nova_compute[236126]: 2025-10-02 12:44:05.249 2 DEBUG oslo_concurrency.lockutils [req-5ab9b493-9d73-4cd2-a9ef-7f3fe6b3e93a req-970f656e-e844-4c58-a64e-e387324a0557 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:05 np0005465988 nova_compute[236126]: 2025-10-02 12:44:05.249 2 DEBUG oslo_concurrency.lockutils [req-5ab9b493-9d73-4cd2-a9ef-7f3fe6b3e93a req-970f656e-e844-4c58-a64e-e387324a0557 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:05 np0005465988 nova_compute[236126]: 2025-10-02 12:44:05.249 2 DEBUG nova.compute.manager [req-5ab9b493-9d73-4cd2-a9ef-7f3fe6b3e93a req-970f656e-e844-4c58-a64e-e387324a0557 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] No waiting events found dispatching network-vif-plugged-ffc7f957-3806-432f-a6e7-5ea3c764735a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:44:05 np0005465988 nova_compute[236126]: 2025-10-02 12:44:05.250 2 WARNING nova.compute.manager [req-5ab9b493-9d73-4cd2-a9ef-7f3fe6b3e93a req-970f656e-e844-4c58-a64e-e387324a0557 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received unexpected event network-vif-plugged-ffc7f957-3806-432f-a6e7-5ea3c764735a for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:44:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:44:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:05.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:44:05 np0005465988 nova_compute[236126]: 2025-10-02 12:44:05.730 2 INFO nova.virt.libvirt.driver [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Deleting instance files /var/lib/nova/instances/5a11dca9-ede5-4fdd-af8e-7936ff4f9980_del#033[00m
Oct  2 08:44:05 np0005465988 nova_compute[236126]: 2025-10-02 12:44:05.731 2 INFO nova.virt.libvirt.driver [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Deletion of /var/lib/nova/instances/5a11dca9-ede5-4fdd-af8e-7936ff4f9980_del complete#033[00m
Oct  2 08:44:05 np0005465988 nova_compute[236126]: 2025-10-02 12:44:05.816 2 INFO nova.compute.manager [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Took 3.29 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:44:05 np0005465988 nova_compute[236126]: 2025-10-02 12:44:05.817 2 DEBUG oslo.service.loopingcall [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:44:05 np0005465988 nova_compute[236126]: 2025-10-02 12:44:05.817 2 DEBUG nova.compute.manager [-] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:44:05 np0005465988 nova_compute[236126]: 2025-10-02 12:44:05.818 2 DEBUG nova.network.neutron [-] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:44:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 08:44:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4201.0 total, 600.0 interval#012Cumulative writes: 58K writes, 233K keys, 58K commit groups, 1.0 writes per commit group, ingest: 0.23 GB, 0.06 MB/s#012Cumulative WAL: 58K writes, 21K syncs, 2.71 writes per sync, written: 0.23 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 9608 writes, 38K keys, 9608 commit groups, 1.0 writes per commit group, ingest: 38.55 MB, 0.06 MB/s#012Interval WAL: 9608 writes, 3674 syncs, 2.62 writes per sync, written: 0.04 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct  2 08:44:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:44:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:06.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:44:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:44:07 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3975239224' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:44:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:44:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:07.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:44:07 np0005465988 nova_compute[236126]: 2025-10-02 12:44:07.491 2 DEBUG nova.network.neutron [-] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:44:07 np0005465988 nova_compute[236126]: 2025-10-02 12:44:07.525 2 INFO nova.compute.manager [-] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Took 1.71 seconds to deallocate network for instance.#033[00m
Oct  2 08:44:07 np0005465988 nova_compute[236126]: 2025-10-02 12:44:07.573 2 DEBUG nova.compute.manager [req-5dce351a-35f1-4606-98c9-6d388a4e757e req-5d4e5480-41a7-4d7d-a920-9fc15da4d815 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Received event network-vif-deleted-ffc7f957-3806-432f-a6e7-5ea3c764735a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:07 np0005465988 nova_compute[236126]: 2025-10-02 12:44:07.741 2 INFO nova.compute.manager [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Took 0.22 seconds to detach 1 volumes for instance.#033[00m
Oct  2 08:44:07 np0005465988 nova_compute[236126]: 2025-10-02 12:44:07.829 2 DEBUG oslo_concurrency.lockutils [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:07 np0005465988 nova_compute[236126]: 2025-10-02 12:44:07.830 2 DEBUG oslo_concurrency.lockutils [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:07 np0005465988 nova_compute[236126]: 2025-10-02 12:44:07.920 2 DEBUG oslo_concurrency.processutils [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:07 np0005465988 nova_compute[236126]: 2025-10-02 12:44:07.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:44:08 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4175023848' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:44:08 np0005465988 nova_compute[236126]: 2025-10-02 12:44:08.396 2 DEBUG oslo_concurrency.processutils [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:08 np0005465988 nova_compute[236126]: 2025-10-02 12:44:08.402 2 DEBUG nova.compute.provider_tree [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:44:08 np0005465988 nova_compute[236126]: 2025-10-02 12:44:08.419 2 DEBUG nova.scheduler.client.report [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:44:08 np0005465988 nova_compute[236126]: 2025-10-02 12:44:08.444 2 DEBUG oslo_concurrency.lockutils [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:08 np0005465988 nova_compute[236126]: 2025-10-02 12:44:08.475 2 INFO nova.scheduler.client.report [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Deleted allocations for instance 5a11dca9-ede5-4fdd-af8e-7936ff4f9980#033[00m
Oct  2 08:44:08 np0005465988 podman[308268]: 2025-10-02 12:44:08.533786371 +0000 UTC m=+0.065883324 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:44:08 np0005465988 podman[308267]: 2025-10-02 12:44:08.549298681 +0000 UTC m=+0.073339825 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:44:08 np0005465988 podman[308266]: 2025-10-02 12:44:08.554462348 +0000 UTC m=+0.093554250 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 08:44:08 np0005465988 nova_compute[236126]: 2025-10-02 12:44:08.571 2 DEBUG oslo_concurrency.lockutils [None req-6457158b-a163-423d-a209-fa356ac26111 b82c89ad6c4a49e78943f7a92d0a6560 a41d99312f014c65adddea4f70536a15 - - default default] Lock "5a11dca9-ede5-4fdd-af8e-7936ff4f9980" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.047s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:44:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:08.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:44:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:44:09 np0005465988 nova_compute[236126]: 2025-10-02 12:44:09.259 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:09 np0005465988 nova_compute[236126]: 2025-10-02 12:44:09.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:44:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:09.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:44:09 np0005465988 nova_compute[236126]: 2025-10-02 12:44:09.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:10.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:11 np0005465988 nova_compute[236126]: 2025-10-02 12:44:11.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:11 np0005465988 nova_compute[236126]: 2025-10-02 12:44:11.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:44:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:11.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:12 np0005465988 nova_compute[236126]: 2025-10-02 12:44:12.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:12.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:13 np0005465988 nova_compute[236126]: 2025-10-02 12:44:12.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:13.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:44:14 np0005465988 nova_compute[236126]: 2025-10-02 12:44:14.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:14 np0005465988 nova_compute[236126]: 2025-10-02 12:44:14.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:14 np0005465988 nova_compute[236126]: 2025-10-02 12:44:14.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:14.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:15 np0005465988 nova_compute[236126]: 2025-10-02 12:44:15.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:15 np0005465988 nova_compute[236126]: 2025-10-02 12:44:15.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:44:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:44:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:15.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:44:15 np0005465988 nova_compute[236126]: 2025-10-02 12:44:15.818 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:44:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:16.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:44:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:17.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:44:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:17.768 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:44:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:17.769 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:44:17 np0005465988 nova_compute[236126]: 2025-10-02 12:44:17.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:17 np0005465988 nova_compute[236126]: 2025-10-02 12:44:17.968 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409042.9670196, 5a11dca9-ede5-4fdd-af8e-7936ff4f9980 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:44:17 np0005465988 nova_compute[236126]: 2025-10-02 12:44:17.969 2 INFO nova.compute.manager [-] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:44:18 np0005465988 nova_compute[236126]: 2025-10-02 12:44:18.000 2 DEBUG nova.compute.manager [None req-2c715f96-f7dc-4436-ab1f-4c62203df51f - - - - - -] [instance: 5a11dca9-ede5-4fdd-af8e-7936ff4f9980] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:18 np0005465988 nova_compute[236126]: 2025-10-02 12:44:18.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:44:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:18.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:44:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:44:19 np0005465988 nova_compute[236126]: 2025-10-02 12:44:19.260 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Acquiring lock "45e8ae09-6891-40ac-8d06-222dd16bea27" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:19 np0005465988 nova_compute[236126]: 2025-10-02 12:44:19.261 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:19 np0005465988 nova_compute[236126]: 2025-10-02 12:44:19.306 2 DEBUG nova.compute.manager [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:44:19 np0005465988 nova_compute[236126]: 2025-10-02 12:44:19.465 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:19 np0005465988 nova_compute[236126]: 2025-10-02 12:44:19.465 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:19 np0005465988 nova_compute[236126]: 2025-10-02 12:44:19.473 2 DEBUG nova.virt.hardware [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:44:19 np0005465988 nova_compute[236126]: 2025-10-02 12:44:19.474 2 INFO nova.compute.claims [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:44:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:19.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:19 np0005465988 nova_compute[236126]: 2025-10-02 12:44:19.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:19 np0005465988 nova_compute[236126]: 2025-10-02 12:44:19.642 2 DEBUG oslo_concurrency.processutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e355 e355: 3 total, 3 up, 3 in
Oct  2 08:44:19 np0005465988 nova_compute[236126]: 2025-10-02 12:44:19.818 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:19 np0005465988 nova_compute[236126]: 2025-10-02 12:44:19.818 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:44:19 np0005465988 nova_compute[236126]: 2025-10-02 12:44:19.819 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:44:19 np0005465988 nova_compute[236126]: 2025-10-02 12:44:19.988 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:44:19 np0005465988 nova_compute[236126]: 2025-10-02 12:44:19.988 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:44:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:44:20 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3502647660' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:44:20 np0005465988 nova_compute[236126]: 2025-10-02 12:44:20.135 2 DEBUG oslo_concurrency.processutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:20 np0005465988 nova_compute[236126]: 2025-10-02 12:44:20.141 2 DEBUG nova.compute.provider_tree [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:44:20 np0005465988 nova_compute[236126]: 2025-10-02 12:44:20.247 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "38b13275-2908-42f3-bb70-73c050f375ea" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:20 np0005465988 nova_compute[236126]: 2025-10-02 12:44:20.248 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "38b13275-2908-42f3-bb70-73c050f375ea" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:20 np0005465988 nova_compute[236126]: 2025-10-02 12:44:20.256 2 DEBUG nova.scheduler.client.report [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:44:20 np0005465988 nova_compute[236126]: 2025-10-02 12:44:20.593 2 DEBUG nova.compute.manager [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:44:20 np0005465988 nova_compute[236126]: 2025-10-02 12:44:20.909 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.443s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:20.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:21 np0005465988 nova_compute[236126]: 2025-10-02 12:44:21.443 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Acquiring lock "92c8f6c7-12e7-47dc-9c01-28da07c93073" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:21 np0005465988 nova_compute[236126]: 2025-10-02 12:44:21.444 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "92c8f6c7-12e7-47dc-9c01-28da07c93073" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:21.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:21 np0005465988 nova_compute[236126]: 2025-10-02 12:44:21.502 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "92c8f6c7-12e7-47dc-9c01-28da07c93073" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:21 np0005465988 nova_compute[236126]: 2025-10-02 12:44:21.503 2 DEBUG nova.compute.manager [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:44:21 np0005465988 nova_compute[236126]: 2025-10-02 12:44:21.734 2 DEBUG nova.compute.manager [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:44:21 np0005465988 nova_compute[236126]: 2025-10-02 12:44:21.734 2 DEBUG nova.network.neutron [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:44:22 np0005465988 nova_compute[236126]: 2025-10-02 12:44:22.173 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:22 np0005465988 nova_compute[236126]: 2025-10-02 12:44:22.173 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:22 np0005465988 nova_compute[236126]: 2025-10-02 12:44:22.180 2 DEBUG nova.virt.hardware [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:44:22 np0005465988 nova_compute[236126]: 2025-10-02 12:44:22.180 2 INFO nova.compute.claims [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:44:22 np0005465988 nova_compute[236126]: 2025-10-02 12:44:22.184 2 INFO nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:44:22 np0005465988 nova_compute[236126]: 2025-10-02 12:44:22.515 2 DEBUG nova.compute.manager [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:44:22 np0005465988 nova_compute[236126]: 2025-10-02 12:44:22.686 2 DEBUG oslo_concurrency.processutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:22.773 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:22.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.041 2 DEBUG nova.policy [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57d85bd790b540cd81dc4d2ab9e6fb13', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1f50509a65834d86866517e09320e48b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.169 2 DEBUG nova.compute.manager [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.170 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.170 2 INFO nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Creating image(s)#033[00m
Oct  2 08:44:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:44:23 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2992762048' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.198 2 DEBUG nova.storage.rbd_utils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] rbd image 45e8ae09-6891-40ac-8d06-222dd16bea27_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.234 2 DEBUG nova.storage.rbd_utils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] rbd image 45e8ae09-6891-40ac-8d06-222dd16bea27_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.262 2 DEBUG nova.storage.rbd_utils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] rbd image 45e8ae09-6891-40ac-8d06-222dd16bea27_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.265 2 DEBUG oslo_concurrency.processutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.296 2 DEBUG oslo_concurrency.processutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.610s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.303 2 DEBUG nova.compute.provider_tree [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.333 2 DEBUG oslo_concurrency.processutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.333 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.334 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.335 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.368 2 DEBUG nova.storage.rbd_utils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] rbd image 45e8ae09-6891-40ac-8d06-222dd16bea27_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.372 2 DEBUG oslo_concurrency.processutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 45e8ae09-6891-40ac-8d06-222dd16bea27_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:23.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.530 2 DEBUG nova.scheduler.client.report [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.731 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.732 2 DEBUG nova.compute.manager [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.839 2 DEBUG nova.compute.manager [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.840 2 DEBUG nova.network.neutron [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:44:23 np0005465988 nova_compute[236126]: 2025-10-02 12:44:23.909 2 INFO nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:44:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e355 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.024 2 DEBUG nova.compute.manager [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.142 2 INFO nova.virt.block_device [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Booting with volume 857b1b6f-42d5-4289-a74d-1acb4fd6b032 at /dev/vda#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.222 2 DEBUG nova.policy [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56c6abe1bb704c8aa499677aeb9017f5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4b8f9114c7ab4b6e9fc9650d4bd08af9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.301 2 DEBUG os_brick.utils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.302 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.317 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.318 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[fef4d424-cad2-402f-9916-b00f1eb693c5]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.320 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.330 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.330 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[0be3eb4c-0a72-4bd2-9d00-fe402ff37d50]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.332 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.342 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.343 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[3c4e6afd-0e05-498e-9d9a-d2744ac6951a]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.345 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[37466696-2b43-4ccd-80c2-773d5f63faf2]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.345 2 DEBUG oslo_concurrency.processutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.396 2 DEBUG oslo_concurrency.processutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CMD "nvme version" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.402 2 DEBUG os_brick.initiator.connectors.lightos [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.403 2 DEBUG os_brick.initiator.connectors.lightos [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.403 2 DEBUG os_brick.initiator.connectors.lightos [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.403 2 DEBUG os_brick.utils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] <== get_connector_properties: return (102ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.404 2 DEBUG nova.virt.block_device [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Updating existing volume attachment record: 87ef146e-2318-4c72-a2df-e8c485467f1c _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:44:24 np0005465988 podman[308533]: 2025-10-02 12:44:24.534854085 +0000 UTC m=+0.069572388 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:24 np0005465988 nova_compute[236126]: 2025-10-02 12:44:24.766 2 DEBUG nova.network.neutron [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Successfully created port: 179b7da1-efa4-4050-801e-30bd6a2faf74 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:44:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:24.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:25.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:25 np0005465988 nova_compute[236126]: 2025-10-02 12:44:25.611 2 DEBUG oslo_concurrency.processutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 45e8ae09-6891-40ac-8d06-222dd16bea27_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.239s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:25 np0005465988 nova_compute[236126]: 2025-10-02 12:44:25.685 2 DEBUG nova.storage.rbd_utils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] resizing rbd image 45e8ae09-6891-40ac-8d06-222dd16bea27_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:44:26 np0005465988 nova_compute[236126]: 2025-10-02 12:44:26.536 2 DEBUG nova.objects.instance [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lazy-loading 'migration_context' on Instance uuid 45e8ae09-6891-40ac-8d06-222dd16bea27 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:44:26 np0005465988 nova_compute[236126]: 2025-10-02 12:44:26.711 2 DEBUG nova.compute.manager [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:44:26 np0005465988 nova_compute[236126]: 2025-10-02 12:44:26.713 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:44:26 np0005465988 nova_compute[236126]: 2025-10-02 12:44:26.714 2 INFO nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Creating image(s)#033[00m
Oct  2 08:44:26 np0005465988 nova_compute[236126]: 2025-10-02 12:44:26.714 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  2 08:44:26 np0005465988 nova_compute[236126]: 2025-10-02 12:44:26.715 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Ensure instance console log exists: /var/lib/nova/instances/38b13275-2908-42f3-bb70-73c050f375ea/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:44:26 np0005465988 nova_compute[236126]: 2025-10-02 12:44:26.715 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:26 np0005465988 nova_compute[236126]: 2025-10-02 12:44:26.716 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:26 np0005465988 nova_compute[236126]: 2025-10-02 12:44:26.716 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:26 np0005465988 nova_compute[236126]: 2025-10-02 12:44:26.754 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:44:26 np0005465988 nova_compute[236126]: 2025-10-02 12:44:26.755 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Ensure instance console log exists: /var/lib/nova/instances/45e8ae09-6891-40ac-8d06-222dd16bea27/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:44:26 np0005465988 nova_compute[236126]: 2025-10-02 12:44:26.756 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:26 np0005465988 nova_compute[236126]: 2025-10-02 12:44:26.756 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:26 np0005465988 nova_compute[236126]: 2025-10-02 12:44:26.756 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:44:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:26.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:27.383 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:27.384 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:27.384 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:27.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:28 np0005465988 nova_compute[236126]: 2025-10-02 12:44:28.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:28 np0005465988 nova_compute[236126]: 2025-10-02 12:44:28.273 2 DEBUG nova.network.neutron [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Successfully created port: 36888ba0-b822-4067-a556-6a12a1136d08 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:44:28 np0005465988 podman[308800]: 2025-10-02 12:44:28.39766082 +0000 UTC m=+0.166062351 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct  2 08:44:28 np0005465988 podman[308800]: 2025-10-02 12:44:28.508005596 +0000 UTC m=+0.276407097 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef)
Oct  2 08:44:28 np0005465988 nova_compute[236126]: 2025-10-02 12:44:28.640 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:28 np0005465988 nova_compute[236126]: 2025-10-02 12:44:28.781 2 DEBUG nova.network.neutron [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Successfully updated port: 179b7da1-efa4-4050-801e-30bd6a2faf74 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:44:28 np0005465988 nova_compute[236126]: 2025-10-02 12:44:28.884 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Acquiring lock "refresh_cache-45e8ae09-6891-40ac-8d06-222dd16bea27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:44:28 np0005465988 nova_compute[236126]: 2025-10-02 12:44:28.884 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Acquired lock "refresh_cache-45e8ae09-6891-40ac-8d06-222dd16bea27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:44:28 np0005465988 nova_compute[236126]: 2025-10-02 12:44:28.884 2 DEBUG nova.network.neutron [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:44:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 e356: 3 total, 3 up, 3 in
Oct  2 08:44:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:44:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:28.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:44:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:44:29 np0005465988 nova_compute[236126]: 2025-10-02 12:44:29.013 2 DEBUG nova.compute.manager [req-a62af1ef-b48a-4243-a439-a12cf0714c42 req-f95df0a6-6bd5-4a4e-96d9-7723bbdfd023 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Received event network-changed-179b7da1-efa4-4050-801e-30bd6a2faf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:29 np0005465988 nova_compute[236126]: 2025-10-02 12:44:29.014 2 DEBUG nova.compute.manager [req-a62af1ef-b48a-4243-a439-a12cf0714c42 req-f95df0a6-6bd5-4a4e-96d9-7723bbdfd023 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Refreshing instance network info cache due to event network-changed-179b7da1-efa4-4050-801e-30bd6a2faf74. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:44:29 np0005465988 nova_compute[236126]: 2025-10-02 12:44:29.014 2 DEBUG oslo_concurrency.lockutils [req-a62af1ef-b48a-4243-a439-a12cf0714c42 req-f95df0a6-6bd5-4a4e-96d9-7723bbdfd023 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-45e8ae09-6891-40ac-8d06-222dd16bea27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:44:29 np0005465988 nova_compute[236126]: 2025-10-02 12:44:29.276 2 DEBUG nova.network.neutron [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:44:29 np0005465988 podman[308937]: 2025-10-02 12:44:29.35926419 +0000 UTC m=+0.134393631 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 08:44:29 np0005465988 podman[308937]: 2025-10-02 12:44:29.393278036 +0000 UTC m=+0.168407447 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 08:44:29 np0005465988 nova_compute[236126]: 2025-10-02 12:44:29.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:29.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:29 np0005465988 nova_compute[236126]: 2025-10-02 12:44:29.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:29 np0005465988 podman[308998]: 2025-10-02 12:44:29.611446947 +0000 UTC m=+0.047295035 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, version=2.2.4, io.buildah.version=1.28.2, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, release=1793, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, vcs-type=git, description=keepalived for Ceph, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20)
Oct  2 08:44:29 np0005465988 podman[308998]: 2025-10-02 12:44:29.624785036 +0000 UTC m=+0.060633104 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, io.openshift.expose-services=, description=keepalived for Ceph, release=1793, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, build-date=2023-02-22T09:23:20, vcs-type=git, vendor=Red Hat, Inc.)
Oct  2 08:44:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:44:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:44:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:44:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:44:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:44:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:44:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:30.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:44:30 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #118. Immutable memtables: 0.
Oct  2 08:44:30 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:30.987119) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:44:30 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 73] Flushing memtable with next log file: 118
Oct  2 08:44:30 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409070987148, "job": 73, "event": "flush_started", "num_memtables": 1, "num_entries": 1055, "num_deletes": 253, "total_data_size": 2052129, "memory_usage": 2085840, "flush_reason": "Manual Compaction"}
Oct  2 08:44:30 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 73] Level-0 flush table #119: started
Oct  2 08:44:30 np0005465988 nova_compute[236126]: 2025-10-02 12:44:30.986 2 DEBUG nova.network.neutron [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Updating instance_info_cache with network_info: [{"id": "179b7da1-efa4-4050-801e-30bd6a2faf74", "address": "fa:16:3e:ee:6e:06", "network": {"id": "d9f8364c-62cb-4b10-886b-40ba4143ca98", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-805477975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f50509a65834d86866517e09320e48b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap179b7da1-ef", "ovs_interfaceid": "179b7da1-efa4-4050-801e-30bd6a2faf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409071001418, "cf_name": "default", "job": 73, "event": "table_file_creation", "file_number": 119, "file_size": 1352364, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 59833, "largest_seqno": 60883, "table_properties": {"data_size": 1347517, "index_size": 2371, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11219, "raw_average_key_size": 20, "raw_value_size": 1337629, "raw_average_value_size": 2440, "num_data_blocks": 102, "num_entries": 548, "num_filter_entries": 548, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759409004, "oldest_key_time": 1759409004, "file_creation_time": 1759409070, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 119, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 73] Flush lasted 14377 microseconds, and 4245 cpu microseconds.
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.001484) [db/flush_job.cc:967] [default] [JOB 73] Level-0 flush table #119: 1352364 bytes OK
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.001517) [db/memtable_list.cc:519] [default] Level-0 commit table #119 started
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.002697) [db/memtable_list.cc:722] [default] Level-0 commit table #119: memtable #1 done
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.002712) EVENT_LOG_v1 {"time_micros": 1759409071002707, "job": 73, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.002738) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 73] Try to delete WAL files size 2046867, prev total WAL file size 2046867, number of live WAL files 2.
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000115.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.003548) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035303230' seq:72057594037927935, type:22 .. '7061786F730035323732' seq:0, type:0; will stop at (end)
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 74] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 73 Base level 0, inputs: [119(1320KB)], [117(12MB)]
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409071003618, "job": 74, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [119], "files_L6": [117], "score": -1, "input_data_size": 14118617, "oldest_snapshot_seqno": -1}
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 74] Generated table #120: 8461 keys, 12095540 bytes, temperature: kUnknown
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409071093094, "cf_name": "default", "job": 74, "event": "table_file_creation", "file_number": 120, "file_size": 12095540, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12039165, "index_size": 34119, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21189, "raw_key_size": 220261, "raw_average_key_size": 26, "raw_value_size": 11888777, "raw_average_value_size": 1405, "num_data_blocks": 1331, "num_entries": 8461, "num_filter_entries": 8461, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759409071, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 120, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.093487) [db/compaction/compaction_job.cc:1663] [default] [JOB 74] Compacted 1@0 + 1@6 files to L6 => 12095540 bytes
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.095214) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 157.6 rd, 135.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 12.2 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(19.4) write-amplify(8.9) OK, records in: 8984, records dropped: 523 output_compression: NoCompression
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.095231) EVENT_LOG_v1 {"time_micros": 1759409071095222, "job": 74, "event": "compaction_finished", "compaction_time_micros": 89589, "compaction_time_cpu_micros": 37083, "output_level": 6, "num_output_files": 1, "total_output_size": 12095540, "num_input_records": 8984, "num_output_records": 8461, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000119.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409071095597, "job": 74, "event": "table_file_deletion", "file_number": 119}
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000117.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409071097872, "job": 74, "event": "table_file_deletion", "file_number": 117}
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.003414) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.098065) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.098074) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.098078) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.098081) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:44:31 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:44:31.098084) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.510 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Releasing lock "refresh_cache-45e8ae09-6891-40ac-8d06-222dd16bea27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.511 2 DEBUG nova.compute.manager [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Instance network_info: |[{"id": "179b7da1-efa4-4050-801e-30bd6a2faf74", "address": "fa:16:3e:ee:6e:06", "network": {"id": "d9f8364c-62cb-4b10-886b-40ba4143ca98", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-805477975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f50509a65834d86866517e09320e48b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap179b7da1-ef", "ovs_interfaceid": "179b7da1-efa4-4050-801e-30bd6a2faf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.511 2 DEBUG oslo_concurrency.lockutils [req-a62af1ef-b48a-4243-a439-a12cf0714c42 req-f95df0a6-6bd5-4a4e-96d9-7723bbdfd023 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-45e8ae09-6891-40ac-8d06-222dd16bea27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.512 2 DEBUG nova.network.neutron [req-a62af1ef-b48a-4243-a439-a12cf0714c42 req-f95df0a6-6bd5-4a4e-96d9-7723bbdfd023 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Refreshing network info cache for port 179b7da1-efa4-4050-801e-30bd6a2faf74 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.515 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Start _get_guest_xml network_info=[{"id": "179b7da1-efa4-4050-801e-30bd6a2faf74", "address": "fa:16:3e:ee:6e:06", "network": {"id": "d9f8364c-62cb-4b10-886b-40ba4143ca98", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-805477975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f50509a65834d86866517e09320e48b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap179b7da1-ef", "ovs_interfaceid": "179b7da1-efa4-4050-801e-30bd6a2faf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.521 2 WARNING nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.529 2 DEBUG nova.virt.libvirt.host [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.530 2 DEBUG nova.virt.libvirt.host [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:44:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:31.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.534 2 DEBUG nova.virt.libvirt.host [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.535 2 DEBUG nova.virt.libvirt.host [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.536 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.537 2 DEBUG nova.virt.hardware [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.537 2 DEBUG nova.virt.hardware [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.537 2 DEBUG nova.virt.hardware [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.538 2 DEBUG nova.virt.hardware [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.538 2 DEBUG nova.virt.hardware [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.538 2 DEBUG nova.virt.hardware [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.538 2 DEBUG nova.virt.hardware [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.539 2 DEBUG nova.virt.hardware [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.539 2 DEBUG nova.virt.hardware [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.539 2 DEBUG nova.virt.hardware [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.539 2 DEBUG nova.virt.hardware [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.542 2 DEBUG oslo_concurrency.processutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:31 np0005465988 nova_compute[236126]: 2025-10-02 12:44:31.742 2 DEBUG nova.network.neutron [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Successfully updated port: 36888ba0-b822-4067-a556-6a12a1136d08 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:44:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:44:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/665459189' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.052 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.053 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquired lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.053 2 DEBUG nova.network.neutron [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.064 2 DEBUG oslo_concurrency.processutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.523s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.094 2 DEBUG nova.storage.rbd_utils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] rbd image 45e8ae09-6891-40ac-8d06-222dd16bea27_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.098 2 DEBUG oslo_concurrency.processutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.146 2 DEBUG nova.compute.manager [req-9f0bf6c7-8a0c-4d07-9433-de8056c3ac5b req-cee8ad1e-709d-4a98-802c-70bbbf96fd76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Received event network-changed-36888ba0-b822-4067-a556-6a12a1136d08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.147 2 DEBUG nova.compute.manager [req-9f0bf6c7-8a0c-4d07-9433-de8056c3ac5b req-cee8ad1e-709d-4a98-802c-70bbbf96fd76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Refreshing instance network info cache due to event network-changed-36888ba0-b822-4067-a556-6a12a1136d08. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.147 2 DEBUG oslo_concurrency.lockutils [req-9f0bf6c7-8a0c-4d07-9433-de8056c3ac5b req-cee8ad1e-709d-4a98-802c-70bbbf96fd76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.531 2 DEBUG nova.network.neutron [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:44:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:44:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2679859747' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.586 2 DEBUG oslo_concurrency.processutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.588 2 DEBUG nova.virt.libvirt.vif [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:44:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-1475816137',display_name='tempest-ServerGroupTestJSON-server-1475816137',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-1475816137',id=160,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1f50509a65834d86866517e09320e48b',ramdisk_id='',reservation_id='r-gnaugn5n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-1031786497',owner_user_name='tempest-ServerGroupTestJSON-1031786497-
project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:44:22Z,user_data=None,user_id='57d85bd790b540cd81dc4d2ab9e6fb13',uuid=45e8ae09-6891-40ac-8d06-222dd16bea27,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "179b7da1-efa4-4050-801e-30bd6a2faf74", "address": "fa:16:3e:ee:6e:06", "network": {"id": "d9f8364c-62cb-4b10-886b-40ba4143ca98", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-805477975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f50509a65834d86866517e09320e48b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap179b7da1-ef", "ovs_interfaceid": "179b7da1-efa4-4050-801e-30bd6a2faf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.588 2 DEBUG nova.network.os_vif_util [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Converting VIF {"id": "179b7da1-efa4-4050-801e-30bd6a2faf74", "address": "fa:16:3e:ee:6e:06", "network": {"id": "d9f8364c-62cb-4b10-886b-40ba4143ca98", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-805477975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f50509a65834d86866517e09320e48b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap179b7da1-ef", "ovs_interfaceid": "179b7da1-efa4-4050-801e-30bd6a2faf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.589 2 DEBUG nova.network.os_vif_util [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:6e:06,bridge_name='br-int',has_traffic_filtering=True,id=179b7da1-efa4-4050-801e-30bd6a2faf74,network=Network(d9f8364c-62cb-4b10-886b-40ba4143ca98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap179b7da1-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.591 2 DEBUG nova.objects.instance [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lazy-loading 'pci_devices' on Instance uuid 45e8ae09-6891-40ac-8d06-222dd16bea27 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.620 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  <uuid>45e8ae09-6891-40ac-8d06-222dd16bea27</uuid>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  <name>instance-000000a0</name>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerGroupTestJSON-server-1475816137</nova:name>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:44:31</nova:creationTime>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <nova:user uuid="57d85bd790b540cd81dc4d2ab9e6fb13">tempest-ServerGroupTestJSON-1031786497-project-member</nova:user>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <nova:project uuid="1f50509a65834d86866517e09320e48b">tempest-ServerGroupTestJSON-1031786497</nova:project>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <nova:port uuid="179b7da1-efa4-4050-801e-30bd6a2faf74">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <entry name="serial">45e8ae09-6891-40ac-8d06-222dd16bea27</entry>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <entry name="uuid">45e8ae09-6891-40ac-8d06-222dd16bea27</entry>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/45e8ae09-6891-40ac-8d06-222dd16bea27_disk">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/45e8ae09-6891-40ac-8d06-222dd16bea27_disk.config">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:ee:6e:06"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <target dev="tap179b7da1-ef"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/45e8ae09-6891-40ac-8d06-222dd16bea27/console.log" append="off"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:44:32 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:44:32 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:44:32 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:44:32 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.621 2 DEBUG nova.compute.manager [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Preparing to wait for external event network-vif-plugged-179b7da1-efa4-4050-801e-30bd6a2faf74 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.622 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Acquiring lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.622 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.622 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.623 2 DEBUG nova.virt.libvirt.vif [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:44:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-1475816137',display_name='tempest-ServerGroupTestJSON-server-1475816137',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-1475816137',id=160,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1f50509a65834d86866517e09320e48b',ramdisk_id='',reservation_id='r-gnaugn5n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-1031786497',owner_user_name='tempest-ServerGroupTestJSON-1
031786497-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:44:22Z,user_data=None,user_id='57d85bd790b540cd81dc4d2ab9e6fb13',uuid=45e8ae09-6891-40ac-8d06-222dd16bea27,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "179b7da1-efa4-4050-801e-30bd6a2faf74", "address": "fa:16:3e:ee:6e:06", "network": {"id": "d9f8364c-62cb-4b10-886b-40ba4143ca98", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-805477975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f50509a65834d86866517e09320e48b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap179b7da1-ef", "ovs_interfaceid": "179b7da1-efa4-4050-801e-30bd6a2faf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.623 2 DEBUG nova.network.os_vif_util [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Converting VIF {"id": "179b7da1-efa4-4050-801e-30bd6a2faf74", "address": "fa:16:3e:ee:6e:06", "network": {"id": "d9f8364c-62cb-4b10-886b-40ba4143ca98", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-805477975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f50509a65834d86866517e09320e48b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap179b7da1-ef", "ovs_interfaceid": "179b7da1-efa4-4050-801e-30bd6a2faf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.624 2 DEBUG nova.network.os_vif_util [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:6e:06,bridge_name='br-int',has_traffic_filtering=True,id=179b7da1-efa4-4050-801e-30bd6a2faf74,network=Network(d9f8364c-62cb-4b10-886b-40ba4143ca98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap179b7da1-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.624 2 DEBUG os_vif [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:6e:06,bridge_name='br-int',has_traffic_filtering=True,id=179b7da1-efa4-4050-801e-30bd6a2faf74,network=Network(d9f8364c-62cb-4b10-886b-40ba4143ca98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap179b7da1-ef') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.626 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.626 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.630 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap179b7da1-ef, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.630 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap179b7da1-ef, col_values=(('external_ids', {'iface-id': '179b7da1-efa4-4050-801e-30bd6a2faf74', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:6e:06', 'vm-uuid': '45e8ae09-6891-40ac-8d06-222dd16bea27'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:32 np0005465988 NetworkManager[45041]: <info>  [1759409072.6326] manager: (tap179b7da1-ef): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/318)
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.640 2 INFO os_vif [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:6e:06,bridge_name='br-int',has_traffic_filtering=True,id=179b7da1-efa4-4050-801e-30bd6a2faf74,network=Network(d9f8364c-62cb-4b10-886b-40ba4143ca98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap179b7da1-ef')#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.758 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:32 np0005465988 nova_compute[236126]: 2025-10-02 12:44:32.759 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:44:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:32.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:33 np0005465988 nova_compute[236126]: 2025-10-02 12:44:33.067 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:44:33 np0005465988 nova_compute[236126]: 2025-10-02 12:44:33.067 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:44:33 np0005465988 nova_compute[236126]: 2025-10-02 12:44:33.067 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] No VIF found with MAC fa:16:3e:ee:6e:06, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:44:33 np0005465988 nova_compute[236126]: 2025-10-02 12:44:33.068 2 INFO nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Using config drive#033[00m
Oct  2 08:44:33 np0005465988 nova_compute[236126]: 2025-10-02 12:44:33.095 2 DEBUG nova.storage.rbd_utils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] rbd image 45e8ae09-6891-40ac-8d06-222dd16bea27_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:44:33 np0005465988 nova_compute[236126]: 2025-10-02 12:44:33.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:44:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:33.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:44:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:44:34 np0005465988 nova_compute[236126]: 2025-10-02 12:44:34.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:34 np0005465988 nova_compute[236126]: 2025-10-02 12:44:34.563 2 INFO nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Creating config drive at /var/lib/nova/instances/45e8ae09-6891-40ac-8d06-222dd16bea27/disk.config#033[00m
Oct  2 08:44:34 np0005465988 nova_compute[236126]: 2025-10-02 12:44:34.568 2 DEBUG oslo_concurrency.processutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/45e8ae09-6891-40ac-8d06-222dd16bea27/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg9n7llio execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:34 np0005465988 nova_compute[236126]: 2025-10-02 12:44:34.713 2 DEBUG oslo_concurrency.processutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/45e8ae09-6891-40ac-8d06-222dd16bea27/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg9n7llio" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:34 np0005465988 nova_compute[236126]: 2025-10-02 12:44:34.741 2 DEBUG nova.storage.rbd_utils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] rbd image 45e8ae09-6891-40ac-8d06-222dd16bea27_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:44:34 np0005465988 nova_compute[236126]: 2025-10-02 12:44:34.745 2 DEBUG oslo_concurrency.processutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/45e8ae09-6891-40ac-8d06-222dd16bea27/disk.config 45e8ae09-6891-40ac-8d06-222dd16bea27_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:34.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:34 np0005465988 nova_compute[236126]: 2025-10-02 12:44:34.950 2 DEBUG nova.network.neutron [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Updating instance_info_cache with network_info: [{"id": "36888ba0-b822-4067-a556-6a12a1136d08", "address": "fa:16:3e:8d:89:9a", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36888ba0-b8", "ovs_interfaceid": "36888ba0-b822-4067-a556-6a12a1136d08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:44:34 np0005465988 nova_compute[236126]: 2025-10-02 12:44:34.982 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Releasing lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:44:34 np0005465988 nova_compute[236126]: 2025-10-02 12:44:34.983 2 DEBUG nova.compute.manager [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Instance network_info: |[{"id": "36888ba0-b822-4067-a556-6a12a1136d08", "address": "fa:16:3e:8d:89:9a", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36888ba0-b8", "ovs_interfaceid": "36888ba0-b822-4067-a556-6a12a1136d08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:44:34 np0005465988 nova_compute[236126]: 2025-10-02 12:44:34.985 2 DEBUG oslo_concurrency.lockutils [req-9f0bf6c7-8a0c-4d07-9433-de8056c3ac5b req-cee8ad1e-709d-4a98-802c-70bbbf96fd76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:44:34 np0005465988 nova_compute[236126]: 2025-10-02 12:44:34.986 2 DEBUG nova.network.neutron [req-9f0bf6c7-8a0c-4d07-9433-de8056c3ac5b req-cee8ad1e-709d-4a98-802c-70bbbf96fd76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Refreshing network info cache for port 36888ba0-b822-4067-a556-6a12a1136d08 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:44:34 np0005465988 nova_compute[236126]: 2025-10-02 12:44:34.994 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Start _get_guest_xml network_info=[{"id": "36888ba0-b822-4067-a556-6a12a1136d08", "address": "fa:16:3e:8d:89:9a", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36888ba0-b8", "ovs_interfaceid": "36888ba0-b822-4067-a556-6a12a1136d08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '87ef146e-2318-4c72-a2df-e8c485467f1c', 'disk_bus': 'virtio', 'delete_on_termination': True, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-857b1b6f-42d5-4289-a74d-1acb4fd6b032', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '857b1b6f-42d5-4289-a74d-1acb4fd6b032', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '38b13275-2908-42f3-bb70-73c050f375ea', 'attached_at': '', 'detached_at': '', 'volume_id': '857b1b6f-42d5-4289-a74d-1acb4fd6b032', 'serial': '857b1b6f-42d5-4289-a74d-1acb4fd6b032'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.006 2 WARNING nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.021 2 DEBUG nova.virt.libvirt.host [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.022 2 DEBUG nova.virt.libvirt.host [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.028 2 DEBUG nova.virt.libvirt.host [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.029 2 DEBUG nova.virt.libvirt.host [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.031 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.032 2 DEBUG nova.virt.hardware [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.033 2 DEBUG nova.virt.hardware [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.033 2 DEBUG nova.virt.hardware [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.034 2 DEBUG nova.virt.hardware [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.034 2 DEBUG nova.virt.hardware [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.034 2 DEBUG nova.virt.hardware [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.035 2 DEBUG nova.virt.hardware [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.035 2 DEBUG nova.virt.hardware [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.036 2 DEBUG nova.virt.hardware [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.036 2 DEBUG nova.virt.hardware [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.037 2 DEBUG nova.virt.hardware [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.078 2 DEBUG nova.storage.rbd_utils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] rbd image 38b13275-2908-42f3-bb70-73c050f375ea_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.087 2 DEBUG oslo_concurrency.processutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.291 2 DEBUG nova.network.neutron [req-a62af1ef-b48a-4243-a439-a12cf0714c42 req-f95df0a6-6bd5-4a4e-96d9-7723bbdfd023 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Updated VIF entry in instance network info cache for port 179b7da1-efa4-4050-801e-30bd6a2faf74. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.294 2 DEBUG nova.network.neutron [req-a62af1ef-b48a-4243-a439-a12cf0714c42 req-f95df0a6-6bd5-4a4e-96d9-7723bbdfd023 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Updating instance_info_cache with network_info: [{"id": "179b7da1-efa4-4050-801e-30bd6a2faf74", "address": "fa:16:3e:ee:6e:06", "network": {"id": "d9f8364c-62cb-4b10-886b-40ba4143ca98", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-805477975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f50509a65834d86866517e09320e48b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap179b7da1-ef", "ovs_interfaceid": "179b7da1-efa4-4050-801e-30bd6a2faf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.314 2 DEBUG oslo_concurrency.processutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/45e8ae09-6891-40ac-8d06-222dd16bea27/disk.config 45e8ae09-6891-40ac-8d06-222dd16bea27_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.569s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.315 2 INFO nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Deleting local config drive /var/lib/nova/instances/45e8ae09-6891-40ac-8d06-222dd16bea27/disk.config because it was imported into RBD.#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.340 2 DEBUG oslo_concurrency.lockutils [req-a62af1ef-b48a-4243-a439-a12cf0714c42 req-f95df0a6-6bd5-4a4e-96d9-7723bbdfd023 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-45e8ae09-6891-40ac-8d06-222dd16bea27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:44:35 np0005465988 NetworkManager[45041]: <info>  [1759409075.3755] manager: (tap179b7da1-ef): new Tun device (/org/freedesktop/NetworkManager/Devices/319)
Oct  2 08:44:35 np0005465988 kernel: tap179b7da1-ef: entered promiscuous mode
Oct  2 08:44:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:35Z|00713|binding|INFO|Claiming lport 179b7da1-efa4-4050-801e-30bd6a2faf74 for this chassis.
Oct  2 08:44:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:35Z|00714|binding|INFO|179b7da1-efa4-4050-801e-30bd6a2faf74: Claiming fa:16:3e:ee:6e:06 10.100.0.7
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.395 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:6e:06 10.100.0.7'], port_security=['fa:16:3e:ee:6e:06 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '45e8ae09-6891-40ac-8d06-222dd16bea27', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f8364c-62cb-4b10-886b-40ba4143ca98', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f50509a65834d86866517e09320e48b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ecc2b733-872f-4fa9-87d7-0e3f095bc65e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5069afdb-4291-477c-88a1-921849406f34, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=179b7da1-efa4-4050-801e-30bd6a2faf74) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.398 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 179b7da1-efa4-4050-801e-30bd6a2faf74 in datapath d9f8364c-62cb-4b10-886b-40ba4143ca98 bound to our chassis#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.402 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9f8364c-62cb-4b10-886b-40ba4143ca98#033[00m
Oct  2 08:44:35 np0005465988 systemd-machined[192594]: New machine qemu-74-instance-000000a0.
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.420 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c4f7d8b1-6c25-49e1-a17e-aec6a9984f13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.421 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd9f8364c-61 in ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.423 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd9f8364c-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.424 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[41a857f3-8168-4313-8c24-015022e4cb0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.425 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ff3c4165-77a7-49b9-961c-dbc3255544d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 systemd[1]: Started Virtual Machine qemu-74-instance-000000a0.
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.448 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[df3fb6ce-2556-4c32-a077-6b795bc7841d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 systemd-udevd[309410]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:35Z|00715|binding|INFO|Setting lport 179b7da1-efa4-4050-801e-30bd6a2faf74 ovn-installed in OVS
Oct  2 08:44:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:35Z|00716|binding|INFO|Setting lport 179b7da1-efa4-4050-801e-30bd6a2faf74 up in Southbound
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:35 np0005465988 NetworkManager[45041]: <info>  [1759409075.4714] device (tap179b7da1-ef): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:44:35 np0005465988 NetworkManager[45041]: <info>  [1759409075.4728] device (tap179b7da1-ef): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.485 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ceb1c0a3-bd6f-4d58-b6c8-ae051e432991]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.533 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[991f5524-8862-4ee3-b6b4-77957a479104]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 NetworkManager[45041]: <info>  [1759409075.5404] manager: (tapd9f8364c-60): new Veth device (/org/freedesktop/NetworkManager/Devices/320)
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.539 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2d2cb2dc-0c9e-4cb0-9252-984262ef4d84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 systemd-udevd[309413]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:44:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:44:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:35.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.576 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[1bdfa04e-d316-4186-8da3-3d3e3a27eed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.580 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8295d528-964a-4670-a703-ed5a097d10f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 NetworkManager[45041]: <info>  [1759409075.5981] device (tapd9f8364c-60): carrier: link connected
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.604 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[a4bcb990-2a40-4d2f-a683-cdb26d3ab1cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.622 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9608e466-740c-46f8-9ac7-583f8e235201]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9f8364c-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:a8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 713592, 'reachable_time': 40423, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309441, 'error': None, 'target': 'ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:44:35 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1677197474' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.640 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6c1f5cd8-c8ba-4da8-9793-6fc82e920512]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feff:a859'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 713592, 'tstamp': 713592}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 309442, 'error': None, 'target': 'ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.657 2 DEBUG oslo_concurrency.processutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.570s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.667 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[86082945-220b-4a55-8a91-e38709d32be2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9f8364c-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:a8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 713592, 'reachable_time': 40423, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 309452, 'error': None, 'target': 'ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.700 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[10b71a79-2bcd-47cc-b489-62c3e8c896ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.719 2 DEBUG nova.virt.libvirt.vif [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:44:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-59400876',display_name='tempest-TestShelveInstance-server-59400876',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-59400876',id=161,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOey7OItBT8ky28T+jgqzhjbvGoB+pWZYmixBVLc/rLrtlb2/muXhDo2zj9MYH0P2A6ukY8/c6TiMhKqcmGhKPZ0/ha7STFDDz62rpDlcbzBiZArK4kjT3veuuC9b5czRQ==',key_name='tempest-TestShelveInstance-1912831029',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b8f9114c7ab4b6e9fc9650d4bd08af9',ramdisk_id='',reservation_id='r-iu4ddc0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_a
llocated='True',owner_project_name='tempest-TestShelveInstance-1219039163',owner_user_name='tempest-TestShelveInstance-1219039163-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:44:24Z,user_data=None,user_id='56c6abe1bb704c8aa499677aeb9017f5',uuid=38b13275-2908-42f3-bb70-73c050f375ea,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "36888ba0-b822-4067-a556-6a12a1136d08", "address": "fa:16:3e:8d:89:9a", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36888ba0-b8", "ovs_interfaceid": "36888ba0-b822-4067-a556-6a12a1136d08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.720 2 DEBUG nova.network.os_vif_util [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Converting VIF {"id": "36888ba0-b822-4067-a556-6a12a1136d08", "address": "fa:16:3e:8d:89:9a", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36888ba0-b8", "ovs_interfaceid": "36888ba0-b822-4067-a556-6a12a1136d08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.721 2 DEBUG nova.network.os_vif_util [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8d:89:9a,bridge_name='br-int',has_traffic_filtering=True,id=36888ba0-b822-4067-a556-6a12a1136d08,network=Network(6ea0a90a-9528-4fe1-8b35-dfde9b35e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36888ba0-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.722 2 DEBUG nova.objects.instance [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 38b13275-2908-42f3-bb70-73c050f375ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.742 2 DEBUG nova.compute.manager [req-f97f099f-8826-4078-ad69-660df8cb6329 req-1b855a34-05eb-43aa-8b4e-15731fd60a3b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Received event network-vif-plugged-179b7da1-efa4-4050-801e-30bd6a2faf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.743 2 DEBUG oslo_concurrency.lockutils [req-f97f099f-8826-4078-ad69-660df8cb6329 req-1b855a34-05eb-43aa-8b4e-15731fd60a3b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.743 2 DEBUG oslo_concurrency.lockutils [req-f97f099f-8826-4078-ad69-660df8cb6329 req-1b855a34-05eb-43aa-8b4e-15731fd60a3b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.743 2 DEBUG oslo_concurrency.lockutils [req-f97f099f-8826-4078-ad69-660df8cb6329 req-1b855a34-05eb-43aa-8b4e-15731fd60a3b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.744 2 DEBUG nova.compute.manager [req-f97f099f-8826-4078-ad69-660df8cb6329 req-1b855a34-05eb-43aa-8b4e-15731fd60a3b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Processing event network-vif-plugged-179b7da1-efa4-4050-801e-30bd6a2faf74 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.747 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  <uuid>38b13275-2908-42f3-bb70-73c050f375ea</uuid>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  <name>instance-000000a1</name>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestShelveInstance-server-59400876</nova:name>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:44:35</nova:creationTime>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <nova:user uuid="56c6abe1bb704c8aa499677aeb9017f5">tempest-TestShelveInstance-1219039163-project-member</nova:user>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <nova:project uuid="4b8f9114c7ab4b6e9fc9650d4bd08af9">tempest-TestShelveInstance-1219039163</nova:project>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <nova:port uuid="36888ba0-b822-4067-a556-6a12a1136d08">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <entry name="serial">38b13275-2908-42f3-bb70-73c050f375ea</entry>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <entry name="uuid">38b13275-2908-42f3-bb70-73c050f375ea</entry>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/38b13275-2908-42f3-bb70-73c050f375ea_disk.config">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-857b1b6f-42d5-4289-a74d-1acb4fd6b032">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <serial>857b1b6f-42d5-4289-a74d-1acb4fd6b032</serial>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:8d:89:9a"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <target dev="tap36888ba0-b8"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/38b13275-2908-42f3-bb70-73c050f375ea/console.log" append="off"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:44:35 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:44:35 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:44:35 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:44:35 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.749 2 DEBUG nova.compute.manager [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Preparing to wait for external event network-vif-plugged-36888ba0-b822-4067-a556-6a12a1136d08 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.749 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "38b13275-2908-42f3-bb70-73c050f375ea-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.749 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "38b13275-2908-42f3-bb70-73c050f375ea-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.750 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "38b13275-2908-42f3-bb70-73c050f375ea-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.751 2 DEBUG nova.virt.libvirt.vif [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:44:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-59400876',display_name='tempest-TestShelveInstance-server-59400876',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-59400876',id=161,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOey7OItBT8ky28T+jgqzhjbvGoB+pWZYmixBVLc/rLrtlb2/muXhDo2zj9MYH0P2A6ukY8/c6TiMhKqcmGhKPZ0/ha7STFDDz62rpDlcbzBiZArK4kjT3veuuC9b5czRQ==',key_name='tempest-TestShelveInstance-1912831029',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b8f9114c7ab4b6e9fc9650d4bd08af9',ramdisk_id='',reservation_id='r-iu4ddc0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False'
,network_allocated='True',owner_project_name='tempest-TestShelveInstance-1219039163',owner_user_name='tempest-TestShelveInstance-1219039163-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:44:24Z,user_data=None,user_id='56c6abe1bb704c8aa499677aeb9017f5',uuid=38b13275-2908-42f3-bb70-73c050f375ea,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "36888ba0-b822-4067-a556-6a12a1136d08", "address": "fa:16:3e:8d:89:9a", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36888ba0-b8", "ovs_interfaceid": "36888ba0-b822-4067-a556-6a12a1136d08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.751 2 DEBUG nova.network.os_vif_util [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Converting VIF {"id": "36888ba0-b822-4067-a556-6a12a1136d08", "address": "fa:16:3e:8d:89:9a", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36888ba0-b8", "ovs_interfaceid": "36888ba0-b822-4067-a556-6a12a1136d08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.752 2 DEBUG nova.network.os_vif_util [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8d:89:9a,bridge_name='br-int',has_traffic_filtering=True,id=36888ba0-b822-4067-a556-6a12a1136d08,network=Network(6ea0a90a-9528-4fe1-8b35-dfde9b35e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36888ba0-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.753 2 DEBUG os_vif [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8d:89:9a,bridge_name='br-int',has_traffic_filtering=True,id=36888ba0-b822-4067-a556-6a12a1136d08,network=Network(6ea0a90a-9528-4fe1-8b35-dfde9b35e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36888ba0-b8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.754 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.754 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.758 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap36888ba0-b8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.759 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap36888ba0-b8, col_values=(('external_ids', {'iface-id': '36888ba0-b822-4067-a556-6a12a1136d08', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8d:89:9a', 'vm-uuid': '38b13275-2908-42f3-bb70-73c050f375ea'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:35 np0005465988 NetworkManager[45041]: <info>  [1759409075.7614] manager: (tap36888ba0-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/321)
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.772 2 INFO os_vif [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8d:89:9a,bridge_name='br-int',has_traffic_filtering=True,id=36888ba0-b822-4067-a556-6a12a1136d08,network=Network(6ea0a90a-9528-4fe1-8b35-dfde9b35e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36888ba0-b8')#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.775 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0d076d73-586f-4e49-8da9-8286ffddee3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.776 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9f8364c-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.776 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.777 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9f8364c-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:35 np0005465988 NetworkManager[45041]: <info>  [1759409075.7798] manager: (tapd9f8364c-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Oct  2 08:44:35 np0005465988 kernel: tapd9f8364c-60: entered promiscuous mode
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.784 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9f8364c-60, col_values=(('external_ids', {'iface-id': 'f849a39b-48e4-449e-822d-58c01fd62a3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:35Z|00717|binding|INFO|Releasing lport f849a39b-48e4-449e-822d-58c01fd62a3b from this chassis (sb_readonly=0)
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.803 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9f8364c-62cb-4b10-886b-40ba4143ca98.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9f8364c-62cb-4b10-886b-40ba4143ca98.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.804 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1aed5290-742b-4ce4-b1e4-822a6bf26ed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.806 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-d9f8364c-62cb-4b10-886b-40ba4143ca98
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/d9f8364c-62cb-4b10-886b-40ba4143ca98.pid.haproxy
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID d9f8364c-62cb-4b10-886b-40ba4143ca98
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:44:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:35.806 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98', 'env', 'PROCESS_TAG=haproxy-d9f8364c-62cb-4b10-886b-40ba4143ca98', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d9f8364c-62cb-4b10-886b-40ba4143ca98.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.857 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.858 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.858 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] No VIF found with MAC fa:16:3e:8d:89:9a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.858 2 INFO nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Using config drive#033[00m
Oct  2 08:44:35 np0005465988 nova_compute[236126]: 2025-10-02 12:44:35.886 2 DEBUG nova.storage.rbd_utils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] rbd image 38b13275-2908-42f3-bb70-73c050f375ea_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:44:36 np0005465988 podman[309541]: 2025-10-02 12:44:36.22254779 +0000 UTC m=+0.066430299 container create 30a169550b127fedae4f72e94472483d052dc049878b0eecb38cc92b66cf90ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 08:44:36 np0005465988 systemd[1]: Started libpod-conmon-30a169550b127fedae4f72e94472483d052dc049878b0eecb38cc92b66cf90ac.scope.
Oct  2 08:44:36 np0005465988 podman[309541]: 2025-10-02 12:44:36.186590669 +0000 UTC m=+0.030473238 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.279 2 DEBUG nova.compute.manager [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.281 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409076.279219, 45e8ae09-6891-40ac-8d06-222dd16bea27 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.281 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] VM Started (Lifecycle Event)#033[00m
Oct  2 08:44:36 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.285 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:44:36 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8614b7b183f9691ac7113761752912b3f93b18f6cd4ab5e5807801e8fe460bb7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.292 2 INFO nova.virt.libvirt.driver [-] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Instance spawned successfully.#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.294 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:44:36 np0005465988 podman[309541]: 2025-10-02 12:44:36.312129206 +0000 UTC m=+0.156011765 container init 30a169550b127fedae4f72e94472483d052dc049878b0eecb38cc92b66cf90ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:44:36 np0005465988 podman[309541]: 2025-10-02 12:44:36.320302139 +0000 UTC m=+0.164184648 container start 30a169550b127fedae4f72e94472483d052dc049878b0eecb38cc92b66cf90ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:44:36 np0005465988 neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98[309556]: [NOTICE]   (309560) : New worker (309562) forked
Oct  2 08:44:36 np0005465988 neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98[309556]: [NOTICE]   (309560) : Loading success.
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.387 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.395 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.395 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.396 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.396 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.397 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.397 2 DEBUG nova.virt.libvirt.driver [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.400 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.448 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.448 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409076.2805297, 45e8ae09-6891-40ac-8d06-222dd16bea27 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.449 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.544 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.548 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409076.2846148, 45e8ae09-6891-40ac-8d06-222dd16bea27 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.548 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.579 2 INFO nova.compute.manager [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Took 13.41 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.580 2 DEBUG nova.compute.manager [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.605 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.609 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.719 2 INFO nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Creating config drive at /var/lib/nova/instances/38b13275-2908-42f3-bb70-73c050f375ea/disk.config#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.723 2 DEBUG oslo_concurrency.processutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/38b13275-2908-42f3-bb70-73c050f375ea/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4hnqlvjf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.782 2 DEBUG nova.network.neutron [req-9f0bf6c7-8a0c-4d07-9433-de8056c3ac5b req-cee8ad1e-709d-4a98-802c-70bbbf96fd76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Updated VIF entry in instance network info cache for port 36888ba0-b822-4067-a556-6a12a1136d08. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.783 2 DEBUG nova.network.neutron [req-9f0bf6c7-8a0c-4d07-9433-de8056c3ac5b req-cee8ad1e-709d-4a98-802c-70bbbf96fd76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Updating instance_info_cache with network_info: [{"id": "36888ba0-b822-4067-a556-6a12a1136d08", "address": "fa:16:3e:8d:89:9a", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36888ba0-b8", "ovs_interfaceid": "36888ba0-b822-4067-a556-6a12a1136d08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.803 2 INFO nova.compute.manager [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Took 17.36 seconds to build instance.#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.806 2 DEBUG oslo_concurrency.lockutils [req-9f0bf6c7-8a0c-4d07-9433-de8056c3ac5b req-cee8ad1e-709d-4a98-802c-70bbbf96fd76 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.895 2 DEBUG oslo_concurrency.processutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/38b13275-2908-42f3-bb70-73c050f375ea/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4hnqlvjf" returned: 0 in 0.172s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.938 2 DEBUG nova.storage.rbd_utils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] rbd image 38b13275-2908-42f3-bb70-73c050f375ea_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:44:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:44:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:36.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.946 2 DEBUG oslo_concurrency.processutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/38b13275-2908-42f3-bb70-73c050f375ea/disk.config 38b13275-2908-42f3-bb70-73c050f375ea_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:36 np0005465988 nova_compute[236126]: 2025-10-02 12:44:36.994 2 DEBUG oslo_concurrency.lockutils [None req-323eccaa-d882-4d40-9763-e203f69620d2 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.505 2 DEBUG oslo_concurrency.processutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/38b13275-2908-42f3-bb70-73c050f375ea/disk.config 38b13275-2908-42f3-bb70-73c050f375ea_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.559s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.506 2 INFO nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Deleting local config drive /var/lib/nova/instances/38b13275-2908-42f3-bb70-73c050f375ea/disk.config because it was imported into RBD.#033[00m
Oct  2 08:44:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:37.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:37 np0005465988 kernel: tap36888ba0-b8: entered promiscuous mode
Oct  2 08:44:37 np0005465988 NetworkManager[45041]: <info>  [1759409077.5765] manager: (tap36888ba0-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/323)
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:37 np0005465988 systemd-udevd[309431]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:37Z|00718|binding|INFO|Claiming lport 36888ba0-b822-4067-a556-6a12a1136d08 for this chassis.
Oct  2 08:44:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:37Z|00719|binding|INFO|36888ba0-b822-4067-a556-6a12a1136d08: Claiming fa:16:3e:8d:89:9a 10.100.0.12
Oct  2 08:44:37 np0005465988 NetworkManager[45041]: <info>  [1759409077.5975] device (tap36888ba0-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:44:37 np0005465988 NetworkManager[45041]: <info>  [1759409077.5992] device (tap36888ba0-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.619 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:89:9a 10.100.0.12'], port_security=['fa:16:3e:8d:89:9a 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '38b13275-2908-42f3-bb70-73c050f375ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8f9114c7ab4b6e9fc9650d4bd08af9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cfad1590-a7c0-4c27-a7db-b88ec54c64dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04a89c39-8141-4654-8368-c858180215b3, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=36888ba0-b822-4067-a556-6a12a1136d08) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.621 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 36888ba0-b822-4067-a556-6a12a1136d08 in datapath 6ea0a90a-9528-4fe1-8b35-dfde9b35e85f bound to our chassis#033[00m
Oct  2 08:44:37 np0005465988 systemd-machined[192594]: New machine qemu-75-instance-000000a1.
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.629 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6ea0a90a-9528-4fe1-8b35-dfde9b35e85f#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.644 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6fcf8254-3ae5-4a2f-a553-c320d7729621]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.645 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6ea0a90a-91 in ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:44:37 np0005465988 systemd[1]: Started Virtual Machine qemu-75-instance-000000a1.
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:37Z|00720|binding|INFO|Setting lport 36888ba0-b822-4067-a556-6a12a1136d08 ovn-installed in OVS
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.650 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6ea0a90a-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.650 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[671b4e1e-a894-4a0e-bf70-09afdd2ea815]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.656 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c4973dff-ff80-421e-97a2-a7c73f77f2ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:44:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:37Z|00721|binding|INFO|Setting lport 36888ba0-b822-4067-a556-6a12a1136d08 up in Southbound
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.670 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[e2260daa-e6bb-43fa-9183-81a63611cf3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.692 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[85645cf3-6350-4168-9df4-236949e69734]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.724 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[6e7e7782-09cc-4cac-9dfc-5718a20de3c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 NetworkManager[45041]: <info>  [1759409077.7320] manager: (tap6ea0a90a-90): new Veth device (/org/freedesktop/NetworkManager/Devices/324)
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.731 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b2b3cf28-cc6f-4775-a0ad-72e2c719c5ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.764 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5845e0b4-3641-4fc6-84fb-37e1c9010cac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.768 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[14f372df-0851-45d3-9f81-f9e9e518617c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 NetworkManager[45041]: <info>  [1759409077.7952] device (tap6ea0a90a-90): carrier: link connected
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.799 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[70c6f180-bb55-4d53-ba8a-98371cf2a16e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.816 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[eed7c1a8-b5a6-4593-a0c6-a232f0cbe157]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ea0a90a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:67:92:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 212], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 713812, 'reachable_time': 26225, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309651, 'error': None, 'target': 'ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.835 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2c948d15-4770-4f47-8a12-25cca4b0a617]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe67:9244'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 713812, 'tstamp': 713812}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 309652, 'error': None, 'target': 'ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.853 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f36b7f67-dd6b-4040-961c-526d95b98c75]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ea0a90a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:67:92:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 212], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 713812, 'reachable_time': 26225, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 309661, 'error': None, 'target': 'ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.884 2 DEBUG nova.compute.manager [req-41e213cd-ddb0-4229-99a1-093247e23bbb req-d1fc5263-d774-4722-874c-5a79c992b203 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Received event network-vif-plugged-179b7da1-efa4-4050-801e-30bd6a2faf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.884 2 DEBUG oslo_concurrency.lockutils [req-41e213cd-ddb0-4229-99a1-093247e23bbb req-d1fc5263-d774-4722-874c-5a79c992b203 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.884 2 DEBUG oslo_concurrency.lockutils [req-41e213cd-ddb0-4229-99a1-093247e23bbb req-d1fc5263-d774-4722-874c-5a79c992b203 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.885 2 DEBUG oslo_concurrency.lockutils [req-41e213cd-ddb0-4229-99a1-093247e23bbb req-d1fc5263-d774-4722-874c-5a79c992b203 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.885 2 DEBUG nova.compute.manager [req-41e213cd-ddb0-4229-99a1-093247e23bbb req-d1fc5263-d774-4722-874c-5a79c992b203 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] No waiting events found dispatching network-vif-plugged-179b7da1-efa4-4050-801e-30bd6a2faf74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.885 2 WARNING nova.compute.manager [req-41e213cd-ddb0-4229-99a1-093247e23bbb req-d1fc5263-d774-4722-874c-5a79c992b203 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Received unexpected event network-vif-plugged-179b7da1-efa4-4050-801e-30bd6a2faf74 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.893 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[30a37752-b39e-45e2-95d2-6526e5ed91af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.980 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f5f0918e-83fd-44a4-a652-bde14c6a4d86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.981 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ea0a90a-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.981 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.982 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ea0a90a-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:37 np0005465988 NetworkManager[45041]: <info>  [1759409077.9849] manager: (tap6ea0a90a-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/325)
Oct  2 08:44:37 np0005465988 kernel: tap6ea0a90a-90: entered promiscuous mode
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:37.990 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6ea0a90a-90, col_values=(('external_ids', {'iface-id': '3850aa59-d3b6-4277-b937-ad9f4b8f7b4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:37 np0005465988 nova_compute[236126]: 2025-10-02 12:44:37.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:37Z|00722|binding|INFO|Releasing lport 3850aa59-d3b6-4277-b937-ad9f4b8f7b4c from this chassis (sb_readonly=0)
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.003 2 DEBUG nova.compute.manager [req-375250fe-20f6-4b17-bc3d-9abde029cfc5 req-2b363009-2223-4d43-ad35-73a140065d6d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Received event network-vif-plugged-36888ba0-b822-4067-a556-6a12a1136d08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.003 2 DEBUG oslo_concurrency.lockutils [req-375250fe-20f6-4b17-bc3d-9abde029cfc5 req-2b363009-2223-4d43-ad35-73a140065d6d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "38b13275-2908-42f3-bb70-73c050f375ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.004 2 DEBUG oslo_concurrency.lockutils [req-375250fe-20f6-4b17-bc3d-9abde029cfc5 req-2b363009-2223-4d43-ad35-73a140065d6d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "38b13275-2908-42f3-bb70-73c050f375ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.004 2 DEBUG oslo_concurrency.lockutils [req-375250fe-20f6-4b17-bc3d-9abde029cfc5 req-2b363009-2223-4d43-ad35-73a140065d6d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "38b13275-2908-42f3-bb70-73c050f375ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.005 2 DEBUG nova.compute.manager [req-375250fe-20f6-4b17-bc3d-9abde029cfc5 req-2b363009-2223-4d43-ad35-73a140065d6d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Processing event network-vif-plugged-36888ba0-b822-4067-a556-6a12a1136d08 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:38.017 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6ea0a90a-9528-4fe1-8b35-dfde9b35e85f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6ea0a90a-9528-4fe1-8b35-dfde9b35e85f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:38.018 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad79f1a-541c-4c77-b35d-a34c0304ec29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:38.018 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/6ea0a90a-9528-4fe1-8b35-dfde9b35e85f.pid.haproxy
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 6ea0a90a-9528-4fe1-8b35-dfde9b35e85f
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:44:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:38.020 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'env', 'PROCESS_TAG=haproxy-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6ea0a90a-9528-4fe1-8b35-dfde9b35e85f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.413 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409078.4127853, 38b13275-2908-42f3-bb70-73c050f375ea => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.414 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] VM Started (Lifecycle Event)#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.418 2 DEBUG nova.compute.manager [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.434 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.439 2 INFO nova.virt.libvirt.driver [-] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Instance spawned successfully.#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.440 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.450 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.455 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.533 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.533 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.534 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.534 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.535 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.535 2 DEBUG nova.virt.libvirt.driver [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.539 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.540 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409078.4131892, 38b13275-2908-42f3-bb70-73c050f375ea => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.540 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:44:38 np0005465988 podman[309717]: 2025-10-02 12:44:38.560407584 +0000 UTC m=+0.051685960 container create 8a34c50c76b88edc10469b95eef8f1be02932ffe6bd3acecde65d0f7e5311b26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.596 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.606 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409078.4335456, 38b13275-2908-42f3-bb70-73c050f375ea => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.606 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:44:38 np0005465988 podman[309717]: 2025-10-02 12:44:38.535267779 +0000 UTC m=+0.026546165 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.627 2 INFO nova.compute.manager [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Took 11.92 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.628 2 DEBUG nova.compute.manager [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.629 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:38 np0005465988 systemd[1]: Started libpod-conmon-8a34c50c76b88edc10469b95eef8f1be02932ffe6bd3acecde65d0f7e5311b26.scope.
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.640 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:44:38 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:44:38 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e7670e396a6b368ee39fc57bc5e2459251b0c8d1576027628cfd9511e9b8534f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.677 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:44:38 np0005465988 podman[309739]: 2025-10-02 12:44:38.679684474 +0000 UTC m=+0.071391380 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:44:38 np0005465988 podman[309717]: 2025-10-02 12:44:38.686454426 +0000 UTC m=+0.177732822 container init 8a34c50c76b88edc10469b95eef8f1be02932ffe6bd3acecde65d0f7e5311b26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:44:38 np0005465988 podman[309731]: 2025-10-02 12:44:38.688491414 +0000 UTC m=+0.085462860 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct  2 08:44:38 np0005465988 podman[309717]: 2025-10-02 12:44:38.69188352 +0000 UTC m=+0.183161896 container start 8a34c50c76b88edc10469b95eef8f1be02932ffe6bd3acecde65d0f7e5311b26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:44:38 np0005465988 neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f[309766]: [NOTICE]   (309794) : New worker (309797) forked
Oct  2 08:44:38 np0005465988 neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f[309766]: [NOTICE]   (309794) : Loading success.
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.713 2 INFO nova.compute.manager [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Took 17.55 seconds to build instance.#033[00m
Oct  2 08:44:38 np0005465988 podman[309728]: 2025-10-02 12:44:38.713888926 +0000 UTC m=+0.120029162 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:44:38 np0005465988 nova_compute[236126]: 2025-10-02 12:44:38.731 2 DEBUG oslo_concurrency.lockutils [None req-7bc55d47-3a8c-40f1-8296-8bc55e9d1932 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "38b13275-2908-42f3-bb70-73c050f375ea" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:38.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.230 2 DEBUG oslo_concurrency.lockutils [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Acquiring lock "45e8ae09-6891-40ac-8d06-222dd16bea27" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.230 2 DEBUG oslo_concurrency.lockutils [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.231 2 DEBUG oslo_concurrency.lockutils [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Acquiring lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.231 2 DEBUG oslo_concurrency.lockutils [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.231 2 DEBUG oslo_concurrency.lockutils [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.232 2 INFO nova.compute.manager [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Terminating instance#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.234 2 DEBUG nova.compute.manager [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:44:39 np0005465988 kernel: tap179b7da1-ef (unregistering): left promiscuous mode
Oct  2 08:44:39 np0005465988 NetworkManager[45041]: <info>  [1759409079.3771] device (tap179b7da1-ef): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:44:39 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:39Z|00723|binding|INFO|Releasing lport 179b7da1-efa4-4050-801e-30bd6a2faf74 from this chassis (sb_readonly=0)
Oct  2 08:44:39 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:39Z|00724|binding|INFO|Setting lport 179b7da1-efa4-4050-801e-30bd6a2faf74 down in Southbound
Oct  2 08:44:39 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:39Z|00725|binding|INFO|Removing iface tap179b7da1-ef ovn-installed in OVS
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:39 np0005465988 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d000000a0.scope: Deactivated successfully.
Oct  2 08:44:39 np0005465988 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d000000a0.scope: Consumed 3.739s CPU time.
Oct  2 08:44:39 np0005465988 systemd-machined[192594]: Machine qemu-74-instance-000000a0 terminated.
Oct  2 08:44:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:39.528 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:6e:06 10.100.0.7'], port_security=['fa:16:3e:ee:6e:06 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '45e8ae09-6891-40ac-8d06-222dd16bea27', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f8364c-62cb-4b10-886b-40ba4143ca98', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f50509a65834d86866517e09320e48b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ecc2b733-872f-4fa9-87d7-0e3f095bc65e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5069afdb-4291-477c-88a1-921849406f34, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=179b7da1-efa4-4050-801e-30bd6a2faf74) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:44:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:39.529 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 179b7da1-efa4-4050-801e-30bd6a2faf74 in datapath d9f8364c-62cb-4b10-886b-40ba4143ca98 unbound from our chassis#033[00m
Oct  2 08:44:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:39.531 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9f8364c-62cb-4b10-886b-40ba4143ca98, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:44:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:39.532 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d87ffd80-cd42-4e70-bcaa-c4b2b88b951b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:39.532 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98 namespace which is not needed anymore#033[00m
Oct  2 08:44:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:44:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:39.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.723 2 INFO nova.virt.libvirt.driver [-] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Instance destroyed successfully.#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.724 2 DEBUG nova.objects.instance [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lazy-loading 'resources' on Instance uuid 45e8ae09-6891-40ac-8d06-222dd16bea27 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:44:39 np0005465988 neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98[309556]: [NOTICE]   (309560) : haproxy version is 2.8.14-c23fe91
Oct  2 08:44:39 np0005465988 neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98[309556]: [NOTICE]   (309560) : path to executable is /usr/sbin/haproxy
Oct  2 08:44:39 np0005465988 neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98[309556]: [WARNING]  (309560) : Exiting Master process...
Oct  2 08:44:39 np0005465988 neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98[309556]: [WARNING]  (309560) : Exiting Master process...
Oct  2 08:44:39 np0005465988 neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98[309556]: [ALERT]    (309560) : Current worker (309562) exited with code 143 (Terminated)
Oct  2 08:44:39 np0005465988 neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98[309556]: [WARNING]  (309560) : All workers exited. Exiting... (0)
Oct  2 08:44:39 np0005465988 systemd[1]: libpod-30a169550b127fedae4f72e94472483d052dc049878b0eecb38cc92b66cf90ac.scope: Deactivated successfully.
Oct  2 08:44:39 np0005465988 podman[309835]: 2025-10-02 12:44:39.777884326 +0000 UTC m=+0.051374412 container died 30a169550b127fedae4f72e94472483d052dc049878b0eecb38cc92b66cf90ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.792 2 DEBUG nova.virt.libvirt.vif [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:44:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-1475816137',display_name='tempest-ServerGroupTestJSON-server-1475816137',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-1475816137',id=160,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:44:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1f50509a65834d86866517e09320e48b',ramdisk_id='',reservation_id='r-gnaugn5n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-ServerGroupTestJSON-1031786497',owner_user_name='tempest-ServerGroupTestJSON-1031786497-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:44:36Z,user_data=None,user_id='57d85bd790b540cd81dc4d2ab9e6fb13',uuid=45e8ae09-6891-40ac-8d06-222dd16bea27,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "179b7da1-efa4-4050-801e-30bd6a2faf74", "address": "fa:16:3e:ee:6e:06", "network": {"id": "d9f8364c-62cb-4b10-886b-40ba4143ca98", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-805477975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f50509a65834d86866517e09320e48b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap179b7da1-ef", "ovs_interfaceid": "179b7da1-efa4-4050-801e-30bd6a2faf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.793 2 DEBUG nova.network.os_vif_util [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Converting VIF {"id": "179b7da1-efa4-4050-801e-30bd6a2faf74", "address": "fa:16:3e:ee:6e:06", "network": {"id": "d9f8364c-62cb-4b10-886b-40ba4143ca98", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-805477975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1f50509a65834d86866517e09320e48b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap179b7da1-ef", "ovs_interfaceid": "179b7da1-efa4-4050-801e-30bd6a2faf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.794 2 DEBUG nova.network.os_vif_util [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:6e:06,bridge_name='br-int',has_traffic_filtering=True,id=179b7da1-efa4-4050-801e-30bd6a2faf74,network=Network(d9f8364c-62cb-4b10-886b-40ba4143ca98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap179b7da1-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.794 2 DEBUG os_vif [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:6e:06,bridge_name='br-int',has_traffic_filtering=True,id=179b7da1-efa4-4050-801e-30bd6a2faf74,network=Network(d9f8364c-62cb-4b10-886b-40ba4143ca98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap179b7da1-ef') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.799 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap179b7da1-ef, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.806 2 INFO os_vif [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:6e:06,bridge_name='br-int',has_traffic_filtering=True,id=179b7da1-efa4-4050-801e-30bd6a2faf74,network=Network(d9f8364c-62cb-4b10-886b-40ba4143ca98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap179b7da1-ef')#033[00m
Oct  2 08:44:39 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-30a169550b127fedae4f72e94472483d052dc049878b0eecb38cc92b66cf90ac-userdata-shm.mount: Deactivated successfully.
Oct  2 08:44:39 np0005465988 systemd[1]: var-lib-containers-storage-overlay-8614b7b183f9691ac7113761752912b3f93b18f6cd4ab5e5807801e8fe460bb7-merged.mount: Deactivated successfully.
Oct  2 08:44:39 np0005465988 podman[309835]: 2025-10-02 12:44:39.83715442 +0000 UTC m=+0.110644506 container cleanup 30a169550b127fedae4f72e94472483d052dc049878b0eecb38cc92b66cf90ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:44:39 np0005465988 systemd[1]: libpod-conmon-30a169550b127fedae4f72e94472483d052dc049878b0eecb38cc92b66cf90ac.scope: Deactivated successfully.
Oct  2 08:44:39 np0005465988 podman[309882]: 2025-10-02 12:44:39.904664219 +0000 UTC m=+0.042166870 container remove 30a169550b127fedae4f72e94472483d052dc049878b0eecb38cc92b66cf90ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:44:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:39.911 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[04d6d8ad-505e-4ef8-9f5f-869fd8605fac]: (4, ('Thu Oct  2 12:44:39 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98 (30a169550b127fedae4f72e94472483d052dc049878b0eecb38cc92b66cf90ac)\n30a169550b127fedae4f72e94472483d052dc049878b0eecb38cc92b66cf90ac\nThu Oct  2 12:44:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98 (30a169550b127fedae4f72e94472483d052dc049878b0eecb38cc92b66cf90ac)\n30a169550b127fedae4f72e94472483d052dc049878b0eecb38cc92b66cf90ac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:39.913 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8912d8ac-1c70-4b29-9c99-4d82cd2eb67d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:39.914 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9f8364c-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:39 np0005465988 kernel: tapd9f8364c-60: left promiscuous mode
Oct  2 08:44:39 np0005465988 nova_compute[236126]: 2025-10-02 12:44:39.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:39.932 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c7819d-63d8-462c-b906-ad108e6e9ba7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:39.968 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2d685235-1f92-40d9-9f90-1979dd8b9622]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:39.970 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[65ac259e-01b4-4287-93e8-707939fe31ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:39.992 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bf001d59-94a4-44de-a2db-4b538e2d4f2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 713585, 'reachable_time': 15620, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309900, 'error': None, 'target': 'ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:39 np0005465988 systemd[1]: run-netns-ovnmeta\x2dd9f8364c\x2d62cb\x2d4b10\x2d886b\x2d40ba4143ca98.mount: Deactivated successfully.
Oct  2 08:44:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:39.999 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d9f8364c-62cb-4b10-886b-40ba4143ca98 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:44:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:44:39.999 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[15915f3e-3055-4888-8cb9-e37785b207ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:40 np0005465988 nova_compute[236126]: 2025-10-02 12:44:40.060 2 DEBUG nova.compute.manager [req-aa213de4-f52e-4d0c-bfaa-1b2d254be8ef req-1a3e02c5-6e8d-4910-bf38-37a47f603704 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Received event network-vif-unplugged-179b7da1-efa4-4050-801e-30bd6a2faf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:40 np0005465988 nova_compute[236126]: 2025-10-02 12:44:40.061 2 DEBUG oslo_concurrency.lockutils [req-aa213de4-f52e-4d0c-bfaa-1b2d254be8ef req-1a3e02c5-6e8d-4910-bf38-37a47f603704 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:40 np0005465988 nova_compute[236126]: 2025-10-02 12:44:40.062 2 DEBUG oslo_concurrency.lockutils [req-aa213de4-f52e-4d0c-bfaa-1b2d254be8ef req-1a3e02c5-6e8d-4910-bf38-37a47f603704 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:40 np0005465988 nova_compute[236126]: 2025-10-02 12:44:40.062 2 DEBUG oslo_concurrency.lockutils [req-aa213de4-f52e-4d0c-bfaa-1b2d254be8ef req-1a3e02c5-6e8d-4910-bf38-37a47f603704 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:40 np0005465988 nova_compute[236126]: 2025-10-02 12:44:40.063 2 DEBUG nova.compute.manager [req-aa213de4-f52e-4d0c-bfaa-1b2d254be8ef req-1a3e02c5-6e8d-4910-bf38-37a47f603704 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] No waiting events found dispatching network-vif-unplugged-179b7da1-efa4-4050-801e-30bd6a2faf74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:44:40 np0005465988 nova_compute[236126]: 2025-10-02 12:44:40.064 2 DEBUG nova.compute.manager [req-aa213de4-f52e-4d0c-bfaa-1b2d254be8ef req-1a3e02c5-6e8d-4910-bf38-37a47f603704 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Received event network-vif-unplugged-179b7da1-efa4-4050-801e-30bd6a2faf74 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:44:40 np0005465988 nova_compute[236126]: 2025-10-02 12:44:40.104 2 DEBUG nova.compute.manager [req-dc291d60-e450-48f6-a304-4e30b233cd98 req-82172791-c3b4-4360-a876-b08eeff5f18a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Received event network-vif-plugged-36888ba0-b822-4067-a556-6a12a1136d08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:40 np0005465988 nova_compute[236126]: 2025-10-02 12:44:40.105 2 DEBUG oslo_concurrency.lockutils [req-dc291d60-e450-48f6-a304-4e30b233cd98 req-82172791-c3b4-4360-a876-b08eeff5f18a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "38b13275-2908-42f3-bb70-73c050f375ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:40 np0005465988 nova_compute[236126]: 2025-10-02 12:44:40.105 2 DEBUG oslo_concurrency.lockutils [req-dc291d60-e450-48f6-a304-4e30b233cd98 req-82172791-c3b4-4360-a876-b08eeff5f18a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "38b13275-2908-42f3-bb70-73c050f375ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:40 np0005465988 nova_compute[236126]: 2025-10-02 12:44:40.106 2 DEBUG oslo_concurrency.lockutils [req-dc291d60-e450-48f6-a304-4e30b233cd98 req-82172791-c3b4-4360-a876-b08eeff5f18a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "38b13275-2908-42f3-bb70-73c050f375ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:40 np0005465988 nova_compute[236126]: 2025-10-02 12:44:40.106 2 DEBUG nova.compute.manager [req-dc291d60-e450-48f6-a304-4e30b233cd98 req-82172791-c3b4-4360-a876-b08eeff5f18a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] No waiting events found dispatching network-vif-plugged-36888ba0-b822-4067-a556-6a12a1136d08 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:44:40 np0005465988 nova_compute[236126]: 2025-10-02 12:44:40.106 2 WARNING nova.compute.manager [req-dc291d60-e450-48f6-a304-4e30b233cd98 req-82172791-c3b4-4360-a876-b08eeff5f18a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Received unexpected event network-vif-plugged-36888ba0-b822-4067-a556-6a12a1136d08 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:44:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:40.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:41.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:42 np0005465988 nova_compute[236126]: 2025-10-02 12:44:42.201 2 DEBUG nova.compute.manager [req-f44ff560-b343-4f54-8325-29eca08eed4d req-3fcec8d0-ccf1-4418-9c1b-674659fafa41 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Received event network-vif-plugged-179b7da1-efa4-4050-801e-30bd6a2faf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:42 np0005465988 nova_compute[236126]: 2025-10-02 12:44:42.204 2 DEBUG oslo_concurrency.lockutils [req-f44ff560-b343-4f54-8325-29eca08eed4d req-3fcec8d0-ccf1-4418-9c1b-674659fafa41 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:42 np0005465988 nova_compute[236126]: 2025-10-02 12:44:42.204 2 DEBUG oslo_concurrency.lockutils [req-f44ff560-b343-4f54-8325-29eca08eed4d req-3fcec8d0-ccf1-4418-9c1b-674659fafa41 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:42 np0005465988 nova_compute[236126]: 2025-10-02 12:44:42.205 2 DEBUG oslo_concurrency.lockutils [req-f44ff560-b343-4f54-8325-29eca08eed4d req-3fcec8d0-ccf1-4418-9c1b-674659fafa41 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:42 np0005465988 nova_compute[236126]: 2025-10-02 12:44:42.206 2 DEBUG nova.compute.manager [req-f44ff560-b343-4f54-8325-29eca08eed4d req-3fcec8d0-ccf1-4418-9c1b-674659fafa41 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] No waiting events found dispatching network-vif-plugged-179b7da1-efa4-4050-801e-30bd6a2faf74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:44:42 np0005465988 nova_compute[236126]: 2025-10-02 12:44:42.206 2 WARNING nova.compute.manager [req-f44ff560-b343-4f54-8325-29eca08eed4d req-3fcec8d0-ccf1-4418-9c1b-674659fafa41 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Received unexpected event network-vif-plugged-179b7da1-efa4-4050-801e-30bd6a2faf74 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:44:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:42.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:43.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:43 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:44:43 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:44:43 np0005465988 nova_compute[236126]: 2025-10-02 12:44:43.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:43 np0005465988 NetworkManager[45041]: <info>  [1759409083.8652] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/326)
Oct  2 08:44:43 np0005465988 NetworkManager[45041]: <info>  [1759409083.8661] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/327)
Oct  2 08:44:43 np0005465988 nova_compute[236126]: 2025-10-02 12:44:43.887 2 INFO nova.virt.libvirt.driver [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Deleting instance files /var/lib/nova/instances/45e8ae09-6891-40ac-8d06-222dd16bea27_del#033[00m
Oct  2 08:44:43 np0005465988 nova_compute[236126]: 2025-10-02 12:44:43.888 2 INFO nova.virt.libvirt.driver [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Deletion of /var/lib/nova/instances/45e8ae09-6891-40ac-8d06-222dd16bea27_del complete#033[00m
Oct  2 08:44:43 np0005465988 nova_compute[236126]: 2025-10-02 12:44:43.948 2 INFO nova.compute.manager [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Took 4.71 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:44:43 np0005465988 nova_compute[236126]: 2025-10-02 12:44:43.949 2 DEBUG oslo.service.loopingcall [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:44:43 np0005465988 nova_compute[236126]: 2025-10-02 12:44:43.950 2 DEBUG nova.compute.manager [-] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:44:43 np0005465988 nova_compute[236126]: 2025-10-02 12:44:43.950 2 DEBUG nova.network.neutron [-] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:44:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:44:44 np0005465988 nova_compute[236126]: 2025-10-02 12:44:44.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:44 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:44Z|00726|binding|INFO|Releasing lport 3850aa59-d3b6-4277-b937-ad9f4b8f7b4c from this chassis (sb_readonly=0)
Oct  2 08:44:44 np0005465988 nova_compute[236126]: 2025-10-02 12:44:44.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:44 np0005465988 nova_compute[236126]: 2025-10-02 12:44:44.317 2 DEBUG nova.compute.manager [req-34528d31-95f4-4760-bbcd-b81234884270 req-dda2133a-5446-4e42-9c62-ba7d626e5ff4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Received event network-changed-36888ba0-b822-4067-a556-6a12a1136d08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:44 np0005465988 nova_compute[236126]: 2025-10-02 12:44:44.318 2 DEBUG nova.compute.manager [req-34528d31-95f4-4760-bbcd-b81234884270 req-dda2133a-5446-4e42-9c62-ba7d626e5ff4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Refreshing instance network info cache due to event network-changed-36888ba0-b822-4067-a556-6a12a1136d08. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:44:44 np0005465988 nova_compute[236126]: 2025-10-02 12:44:44.319 2 DEBUG oslo_concurrency.lockutils [req-34528d31-95f4-4760-bbcd-b81234884270 req-dda2133a-5446-4e42-9c62-ba7d626e5ff4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:44:44 np0005465988 nova_compute[236126]: 2025-10-02 12:44:44.319 2 DEBUG oslo_concurrency.lockutils [req-34528d31-95f4-4760-bbcd-b81234884270 req-dda2133a-5446-4e42-9c62-ba7d626e5ff4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:44:44 np0005465988 nova_compute[236126]: 2025-10-02 12:44:44.319 2 DEBUG nova.network.neutron [req-34528d31-95f4-4760-bbcd-b81234884270 req-dda2133a-5446-4e42-9c62-ba7d626e5ff4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Refreshing network info cache for port 36888ba0-b822-4067-a556-6a12a1136d08 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:44:44 np0005465988 nova_compute[236126]: 2025-10-02 12:44:44.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:44 np0005465988 nova_compute[236126]: 2025-10-02 12:44:44.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:44.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:45 np0005465988 nova_compute[236126]: 2025-10-02 12:44:45.212 2 DEBUG nova.network.neutron [-] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:44:45 np0005465988 nova_compute[236126]: 2025-10-02 12:44:45.236 2 INFO nova.compute.manager [-] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Took 1.29 seconds to deallocate network for instance.#033[00m
Oct  2 08:44:45 np0005465988 nova_compute[236126]: 2025-10-02 12:44:45.290 2 DEBUG nova.compute.manager [req-85761838-f519-487f-b82c-001e898bae3b req-ae6b345c-2df7-4c4c-8dc5-b2c788eb938e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Received event network-vif-deleted-179b7da1-efa4-4050-801e-30bd6a2faf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:45 np0005465988 nova_compute[236126]: 2025-10-02 12:44:45.292 2 DEBUG oslo_concurrency.lockutils [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:45 np0005465988 nova_compute[236126]: 2025-10-02 12:44:45.293 2 DEBUG oslo_concurrency.lockutils [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:45 np0005465988 nova_compute[236126]: 2025-10-02 12:44:45.367 2 DEBUG oslo_concurrency.processutils [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:45.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:44:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1346795661' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:44:45 np0005465988 nova_compute[236126]: 2025-10-02 12:44:45.825 2 DEBUG oslo_concurrency.processutils [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:45 np0005465988 nova_compute[236126]: 2025-10-02 12:44:45.832 2 DEBUG nova.compute.provider_tree [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:44:45 np0005465988 nova_compute[236126]: 2025-10-02 12:44:45.858 2 DEBUG nova.scheduler.client.report [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:44:45 np0005465988 nova_compute[236126]: 2025-10-02 12:44:45.884 2 DEBUG oslo_concurrency.lockutils [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:45 np0005465988 nova_compute[236126]: 2025-10-02 12:44:45.910 2 INFO nova.scheduler.client.report [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Deleted allocations for instance 45e8ae09-6891-40ac-8d06-222dd16bea27#033[00m
Oct  2 08:44:45 np0005465988 nova_compute[236126]: 2025-10-02 12:44:45.989 2 DEBUG oslo_concurrency.lockutils [None req-5aee13f6-4e4d-45e7-90b4-98c795e23d37 57d85bd790b540cd81dc4d2ab9e6fb13 1f50509a65834d86866517e09320e48b - - default default] Lock "45e8ae09-6891-40ac-8d06-222dd16bea27" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:46 np0005465988 nova_compute[236126]: 2025-10-02 12:44:46.143 2 DEBUG nova.network.neutron [req-34528d31-95f4-4760-bbcd-b81234884270 req-dda2133a-5446-4e42-9c62-ba7d626e5ff4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Updated VIF entry in instance network info cache for port 36888ba0-b822-4067-a556-6a12a1136d08. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:44:46 np0005465988 nova_compute[236126]: 2025-10-02 12:44:46.144 2 DEBUG nova.network.neutron [req-34528d31-95f4-4760-bbcd-b81234884270 req-dda2133a-5446-4e42-9c62-ba7d626e5ff4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Updating instance_info_cache with network_info: [{"id": "36888ba0-b822-4067-a556-6a12a1136d08", "address": "fa:16:3e:8d:89:9a", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36888ba0-b8", "ovs_interfaceid": "36888ba0-b822-4067-a556-6a12a1136d08", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:44:46 np0005465988 nova_compute[236126]: 2025-10-02 12:44:46.166 2 DEBUG oslo_concurrency.lockutils [req-34528d31-95f4-4760-bbcd-b81234884270 req-dda2133a-5446-4e42-9c62-ba7d626e5ff4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:44:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:44:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:46.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:44:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:47.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:48.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:44:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:49.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:49 np0005465988 nova_compute[236126]: 2025-10-02 12:44:49.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:49 np0005465988 nova_compute[236126]: 2025-10-02 12:44:49.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:50.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:51Z|00727|binding|INFO|Releasing lport 3850aa59-d3b6-4277-b937-ad9f4b8f7b4c from this chassis (sb_readonly=0)
Oct  2 08:44:51 np0005465988 nova_compute[236126]: 2025-10-02 12:44:51.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:51.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:52 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:52Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8d:89:9a 10.100.0.12
Oct  2 08:44:52 np0005465988 ovn_controller[132601]: 2025-10-02T12:44:52Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8d:89:9a 10.100.0.12
Oct  2 08:44:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:52.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:44:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:53.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:44:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:44:54 np0005465988 nova_compute[236126]: 2025-10-02 12:44:54.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:54 np0005465988 nova_compute[236126]: 2025-10-02 12:44:54.712 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409079.7106571, 45e8ae09-6891-40ac-8d06-222dd16bea27 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:44:54 np0005465988 nova_compute[236126]: 2025-10-02 12:44:54.713 2 INFO nova.compute.manager [-] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:44:54 np0005465988 nova_compute[236126]: 2025-10-02 12:44:54.768 2 DEBUG nova.compute.manager [None req-c52fc546-9685-4717-b29c-eb40ee82fbe3 - - - - - -] [instance: 45e8ae09-6891-40ac-8d06-222dd16bea27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:54 np0005465988 nova_compute[236126]: 2025-10-02 12:44:54.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:54.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:55 np0005465988 podman[310034]: 2025-10-02 12:44:55.522490391 +0000 UTC m=+0.053015308 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:44:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:44:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:55.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:44:56 np0005465988 nova_compute[236126]: 2025-10-02 12:44:56.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:56.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:57.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:44:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:44:58.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:44:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:44:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:44:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:44:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:44:59.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:44:59 np0005465988 nova_compute[236126]: 2025-10-02 12:44:59.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:59 np0005465988 nova_compute[236126]: 2025-10-02 12:44:59.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:00 np0005465988 nova_compute[236126]: 2025-10-02 12:45:00.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:00.017 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:45:00 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:00.018 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:45:00 np0005465988 nova_compute[236126]: 2025-10-02 12:45:00.612 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:00 np0005465988 nova_compute[236126]: 2025-10-02 12:45:00.911 2 DEBUG oslo_concurrency.lockutils [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "38b13275-2908-42f3-bb70-73c050f375ea" by "nova.compute.manager.ComputeManager.shelve_offload_instance.<locals>.do_shelve_offload_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:00 np0005465988 nova_compute[236126]: 2025-10-02 12:45:00.911 2 DEBUG oslo_concurrency.lockutils [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "38b13275-2908-42f3-bb70-73c050f375ea" acquired by "nova.compute.manager.ComputeManager.shelve_offload_instance.<locals>.do_shelve_offload_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:00 np0005465988 nova_compute[236126]: 2025-10-02 12:45:00.912 2 INFO nova.compute.manager [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Shelve offloading#033[00m
Oct  2 08:45:00 np0005465988 nova_compute[236126]: 2025-10-02 12:45:00.941 2 DEBUG nova.virt.libvirt.driver [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:45:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:00.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:01.020 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:01 np0005465988 nova_compute[236126]: 2025-10-02 12:45:01.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:01 np0005465988 nova_compute[236126]: 2025-10-02 12:45:01.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:01 np0005465988 nova_compute[236126]: 2025-10-02 12:45:01.512 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:01 np0005465988 nova_compute[236126]: 2025-10-02 12:45:01.513 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:01 np0005465988 nova_compute[236126]: 2025-10-02 12:45:01.513 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:01 np0005465988 nova_compute[236126]: 2025-10-02 12:45:01.513 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:45:01 np0005465988 nova_compute[236126]: 2025-10-02 12:45:01.514 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:45:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000057s ======
Oct  2 08:45:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:01.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Oct  2 08:45:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:45:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2562379280' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:45:01 np0005465988 nova_compute[236126]: 2025-10-02 12:45:01.992 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:45:02 np0005465988 nova_compute[236126]: 2025-10-02 12:45:02.093 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:45:02 np0005465988 nova_compute[236126]: 2025-10-02 12:45:02.093 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:45:02 np0005465988 nova_compute[236126]: 2025-10-02 12:45:02.247 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:45:02 np0005465988 nova_compute[236126]: 2025-10-02 12:45:02.248 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3983MB free_disk=20.942489624023438GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:45:02 np0005465988 nova_compute[236126]: 2025-10-02 12:45:02.248 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:02 np0005465988 nova_compute[236126]: 2025-10-02 12:45:02.248 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:02 np0005465988 nova_compute[236126]: 2025-10-02 12:45:02.332 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 38b13275-2908-42f3-bb70-73c050f375ea actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:45:02 np0005465988 nova_compute[236126]: 2025-10-02 12:45:02.332 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:45:02 np0005465988 nova_compute[236126]: 2025-10-02 12:45:02.333 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:45:02 np0005465988 nova_compute[236126]: 2025-10-02 12:45:02.367 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:45:02 np0005465988 nova_compute[236126]: 2025-10-02 12:45:02.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:45:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4084718820' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:45:02 np0005465988 nova_compute[236126]: 2025-10-02 12:45:02.914 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.547s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:45:02 np0005465988 nova_compute[236126]: 2025-10-02 12:45:02.920 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:45:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:02.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.049 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:45:03 np0005465988 kernel: tap36888ba0-b8 (unregistering): left promiscuous mode
Oct  2 08:45:03 np0005465988 NetworkManager[45041]: <info>  [1759409103.2318] device (tap36888ba0-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:03 np0005465988 ovn_controller[132601]: 2025-10-02T12:45:03Z|00728|binding|INFO|Releasing lport 36888ba0-b822-4067-a556-6a12a1136d08 from this chassis (sb_readonly=0)
Oct  2 08:45:03 np0005465988 ovn_controller[132601]: 2025-10-02T12:45:03Z|00729|binding|INFO|Setting lport 36888ba0-b822-4067-a556-6a12a1136d08 down in Southbound
Oct  2 08:45:03 np0005465988 ovn_controller[132601]: 2025-10-02T12:45:03Z|00730|binding|INFO|Removing iface tap36888ba0-b8 ovn-installed in OVS
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.278 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.278 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:03 np0005465988 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000a1.scope: Deactivated successfully.
Oct  2 08:45:03 np0005465988 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000a1.scope: Consumed 15.005s CPU time.
Oct  2 08:45:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:03.308 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:89:9a 10.100.0.12'], port_security=['fa:16:3e:8d:89:9a 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '38b13275-2908-42f3-bb70-73c050f375ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b8f9114c7ab4b6e9fc9650d4bd08af9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cfad1590-a7c0-4c27-a7db-b88ec54c64dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.174'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04a89c39-8141-4654-8368-c858180215b3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=36888ba0-b822-4067-a556-6a12a1136d08) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:45:03 np0005465988 systemd-machined[192594]: Machine qemu-75-instance-000000a1 terminated.
Oct  2 08:45:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:03.309 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 36888ba0-b822-4067-a556-6a12a1136d08 in datapath 6ea0a90a-9528-4fe1-8b35-dfde9b35e85f unbound from our chassis#033[00m
Oct  2 08:45:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:03.311 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6ea0a90a-9528-4fe1-8b35-dfde9b35e85f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:45:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:03.312 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fd6c5a51-1198-4b7e-ae02-717337656512]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:03.313 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f namespace which is not needed anymore#033[00m
Oct  2 08:45:03 np0005465988 neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f[309766]: [NOTICE]   (309794) : haproxy version is 2.8.14-c23fe91
Oct  2 08:45:03 np0005465988 neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f[309766]: [NOTICE]   (309794) : path to executable is /usr/sbin/haproxy
Oct  2 08:45:03 np0005465988 neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f[309766]: [WARNING]  (309794) : Exiting Master process...
Oct  2 08:45:03 np0005465988 neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f[309766]: [ALERT]    (309794) : Current worker (309797) exited with code 143 (Terminated)
Oct  2 08:45:03 np0005465988 neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f[309766]: [WARNING]  (309794) : All workers exited. Exiting... (0)
Oct  2 08:45:03 np0005465988 systemd[1]: libpod-8a34c50c76b88edc10469b95eef8f1be02932ffe6bd3acecde65d0f7e5311b26.scope: Deactivated successfully.
Oct  2 08:45:03 np0005465988 podman[310126]: 2025-10-02 12:45:03.464492869 +0000 UTC m=+0.053588674 container died 8a34c50c76b88edc10469b95eef8f1be02932ffe6bd3acecde65d0f7e5311b26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:45:03 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8a34c50c76b88edc10469b95eef8f1be02932ffe6bd3acecde65d0f7e5311b26-userdata-shm.mount: Deactivated successfully.
Oct  2 08:45:03 np0005465988 systemd[1]: var-lib-containers-storage-overlay-e7670e396a6b368ee39fc57bc5e2459251b0c8d1576027628cfd9511e9b8534f-merged.mount: Deactivated successfully.
Oct  2 08:45:03 np0005465988 podman[310126]: 2025-10-02 12:45:03.507243124 +0000 UTC m=+0.096338929 container cleanup 8a34c50c76b88edc10469b95eef8f1be02932ffe6bd3acecde65d0f7e5311b26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:45:03 np0005465988 systemd[1]: libpod-conmon-8a34c50c76b88edc10469b95eef8f1be02932ffe6bd3acecde65d0f7e5311b26.scope: Deactivated successfully.
Oct  2 08:45:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:45:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:03.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:45:03 np0005465988 podman[310168]: 2025-10-02 12:45:03.585959571 +0000 UTC m=+0.047361887 container remove 8a34c50c76b88edc10469b95eef8f1be02932ffe6bd3acecde65d0f7e5311b26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:45:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:03.593 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3651d4d0-d4d2-4db4-bcf3-d39075ecba84]: (4, ('Thu Oct  2 12:45:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f (8a34c50c76b88edc10469b95eef8f1be02932ffe6bd3acecde65d0f7e5311b26)\n8a34c50c76b88edc10469b95eef8f1be02932ffe6bd3acecde65d0f7e5311b26\nThu Oct  2 12:45:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f (8a34c50c76b88edc10469b95eef8f1be02932ffe6bd3acecde65d0f7e5311b26)\n8a34c50c76b88edc10469b95eef8f1be02932ffe6bd3acecde65d0f7e5311b26\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:03.596 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ebca61c7-19e4-40c5-94ba-c3dd074b7144]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:03.597 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ea0a90a-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:03 np0005465988 kernel: tap6ea0a90a-90: left promiscuous mode
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:03.622 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[661a0f44-0405-4f9a-905d-89172deb29b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:03.655 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[851329f3-1daf-4714-803c-cccbebf0772d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:03.656 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e716cea8-07ec-4c9d-bdd2-7973033259ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:03.677 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a843c7d7-4a85-4d46-a548-d548fc72b2a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 713805, 'reachable_time': 41697, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310187, 'error': None, 'target': 'ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:03 np0005465988 systemd[1]: run-netns-ovnmeta\x2d6ea0a90a\x2d9528\x2d4fe1\x2d8b35\x2ddfde9b35e85f.mount: Deactivated successfully.
Oct  2 08:45:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:03.682 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6ea0a90a-9528-4fe1-8b35-dfde9b35e85f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:45:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:03.682 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[0fcb4571-a038-482e-ad13-8bf02322ee82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.961 2 INFO nova.virt.libvirt.driver [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.967 2 INFO nova.virt.libvirt.driver [-] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Instance destroyed successfully.#033[00m
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.968 2 DEBUG nova.objects.instance [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lazy-loading 'numa_topology' on Instance uuid 38b13275-2908-42f3-bb70-73c050f375ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.981 2 DEBUG nova.compute.manager [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.984 2 DEBUG oslo_concurrency.lockutils [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.984 2 DEBUG oslo_concurrency.lockutils [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquired lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:45:03 np0005465988 nova_compute[236126]: 2025-10-02 12:45:03.984 2 DEBUG nova.network.neutron [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:45:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:45:04 np0005465988 nova_compute[236126]: 2025-10-02 12:45:04.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:04 np0005465988 nova_compute[236126]: 2025-10-02 12:45:04.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:04.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:45:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:05.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:45:06 np0005465988 nova_compute[236126]: 2025-10-02 12:45:06.018 2 DEBUG nova.compute.manager [req-f5ae2565-df00-4c57-abfe-76c77b4debcb req-b20d698f-6bb9-441f-8b7c-31ff025e8893 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Received event network-vif-unplugged-36888ba0-b822-4067-a556-6a12a1136d08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:06 np0005465988 nova_compute[236126]: 2025-10-02 12:45:06.019 2 DEBUG oslo_concurrency.lockutils [req-f5ae2565-df00-4c57-abfe-76c77b4debcb req-b20d698f-6bb9-441f-8b7c-31ff025e8893 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "38b13275-2908-42f3-bb70-73c050f375ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:06 np0005465988 nova_compute[236126]: 2025-10-02 12:45:06.020 2 DEBUG oslo_concurrency.lockutils [req-f5ae2565-df00-4c57-abfe-76c77b4debcb req-b20d698f-6bb9-441f-8b7c-31ff025e8893 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "38b13275-2908-42f3-bb70-73c050f375ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:06 np0005465988 nova_compute[236126]: 2025-10-02 12:45:06.020 2 DEBUG oslo_concurrency.lockutils [req-f5ae2565-df00-4c57-abfe-76c77b4debcb req-b20d698f-6bb9-441f-8b7c-31ff025e8893 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "38b13275-2908-42f3-bb70-73c050f375ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:06 np0005465988 nova_compute[236126]: 2025-10-02 12:45:06.020 2 DEBUG nova.compute.manager [req-f5ae2565-df00-4c57-abfe-76c77b4debcb req-b20d698f-6bb9-441f-8b7c-31ff025e8893 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] No waiting events found dispatching network-vif-unplugged-36888ba0-b822-4067-a556-6a12a1136d08 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:45:06 np0005465988 nova_compute[236126]: 2025-10-02 12:45:06.020 2 WARNING nova.compute.manager [req-f5ae2565-df00-4c57-abfe-76c77b4debcb req-b20d698f-6bb9-441f-8b7c-31ff025e8893 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Received unexpected event network-vif-unplugged-36888ba0-b822-4067-a556-6a12a1136d08 for instance with vm_state active and task_state shelving.#033[00m
Oct  2 08:45:06 np0005465988 nova_compute[236126]: 2025-10-02 12:45:06.051 2 DEBUG nova.network.neutron [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Updating instance_info_cache with network_info: [{"id": "36888ba0-b822-4067-a556-6a12a1136d08", "address": "fa:16:3e:8d:89:9a", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36888ba0-b8", "ovs_interfaceid": "36888ba0-b822-4067-a556-6a12a1136d08", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:45:06 np0005465988 nova_compute[236126]: 2025-10-02 12:45:06.223 2 DEBUG oslo_concurrency.lockutils [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Releasing lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:45:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:06.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.523 2 INFO nova.virt.libvirt.driver [-] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Instance destroyed successfully.#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.524 2 DEBUG nova.objects.instance [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lazy-loading 'resources' on Instance uuid 38b13275-2908-42f3-bb70-73c050f375ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.546 2 DEBUG nova.virt.libvirt.vif [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:44:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-59400876',display_name='tempest-TestShelveInstance-server-59400876',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-59400876',id=161,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOey7OItBT8ky28T+jgqzhjbvGoB+pWZYmixBVLc/rLrtlb2/muXhDo2zj9MYH0P2A6ukY8/c6TiMhKqcmGhKPZ0/ha7STFDDz62rpDlcbzBiZArK4kjT3veuuC9b5czRQ==',key_name='tempest-TestShelveInstance-1912831029',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:44:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4b8f9114c7ab4b6e9fc9650d4bd08af9',ramdisk_id='',reservation_id='r-iu4ddc0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',im
age_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestShelveInstance-1219039163',owner_user_name='tempest-TestShelveInstance-1219039163-project-member'},tags=<?>,task_state='shelving',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:44:38Z,user_data=None,user_id='56c6abe1bb704c8aa499677aeb9017f5',uuid=38b13275-2908-42f3-bb70-73c050f375ea,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "36888ba0-b822-4067-a556-6a12a1136d08", "address": "fa:16:3e:8d:89:9a", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36888ba0-b8", "ovs_interfaceid": "36888ba0-b822-4067-a556-6a12a1136d08", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.547 2 DEBUG nova.network.os_vif_util [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Converting VIF {"id": "36888ba0-b822-4067-a556-6a12a1136d08", "address": "fa:16:3e:8d:89:9a", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": "br-int", "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36888ba0-b8", "ovs_interfaceid": "36888ba0-b822-4067-a556-6a12a1136d08", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.548 2 DEBUG nova.network.os_vif_util [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8d:89:9a,bridge_name='br-int',has_traffic_filtering=True,id=36888ba0-b822-4067-a556-6a12a1136d08,network=Network(6ea0a90a-9528-4fe1-8b35-dfde9b35e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36888ba0-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.548 2 DEBUG os_vif [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8d:89:9a,bridge_name='br-int',has_traffic_filtering=True,id=36888ba0-b822-4067-a556-6a12a1136d08,network=Network(6ea0a90a-9528-4fe1-8b35-dfde9b35e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36888ba0-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.550 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap36888ba0-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.559 2 INFO os_vif [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8d:89:9a,bridge_name='br-int',has_traffic_filtering=True,id=36888ba0-b822-4067-a556-6a12a1136d08,network=Network(6ea0a90a-9528-4fe1-8b35-dfde9b35e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36888ba0-b8')#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:07.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.641 2 DEBUG nova.compute.manager [req-dd6be8ac-1ce2-4896-a631-fd8f603c5714 req-e429f604-5de8-4fa0-83c6-b90321b2158c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Received event network-changed-36888ba0-b822-4067-a556-6a12a1136d08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.642 2 DEBUG nova.compute.manager [req-dd6be8ac-1ce2-4896-a631-fd8f603c5714 req-e429f604-5de8-4fa0-83c6-b90321b2158c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Refreshing instance network info cache due to event network-changed-36888ba0-b822-4067-a556-6a12a1136d08. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.642 2 DEBUG oslo_concurrency.lockutils [req-dd6be8ac-1ce2-4896-a631-fd8f603c5714 req-e429f604-5de8-4fa0-83c6-b90321b2158c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.643 2 DEBUG oslo_concurrency.lockutils [req-dd6be8ac-1ce2-4896-a631-fd8f603c5714 req-e429f604-5de8-4fa0-83c6-b90321b2158c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:45:07 np0005465988 nova_compute[236126]: 2025-10-02 12:45:07.643 2 DEBUG nova.network.neutron [req-dd6be8ac-1ce2-4896-a631-fd8f603c5714 req-e429f604-5de8-4fa0-83c6-b90321b2158c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Refreshing network info cache for port 36888ba0-b822-4067-a556-6a12a1136d08 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:45:08 np0005465988 nova_compute[236126]: 2025-10-02 12:45:08.120 2 DEBUG nova.compute.manager [req-765e84a6-3449-4493-84c1-62fbe9070885 req-d1892acf-d23c-412a-9f7d-392d4ef3a972 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Received event network-vif-plugged-36888ba0-b822-4067-a556-6a12a1136d08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:08 np0005465988 nova_compute[236126]: 2025-10-02 12:45:08.121 2 DEBUG oslo_concurrency.lockutils [req-765e84a6-3449-4493-84c1-62fbe9070885 req-d1892acf-d23c-412a-9f7d-392d4ef3a972 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "38b13275-2908-42f3-bb70-73c050f375ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:08 np0005465988 nova_compute[236126]: 2025-10-02 12:45:08.121 2 DEBUG oslo_concurrency.lockutils [req-765e84a6-3449-4493-84c1-62fbe9070885 req-d1892acf-d23c-412a-9f7d-392d4ef3a972 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "38b13275-2908-42f3-bb70-73c050f375ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:08 np0005465988 nova_compute[236126]: 2025-10-02 12:45:08.121 2 DEBUG oslo_concurrency.lockutils [req-765e84a6-3449-4493-84c1-62fbe9070885 req-d1892acf-d23c-412a-9f7d-392d4ef3a972 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "38b13275-2908-42f3-bb70-73c050f375ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:08 np0005465988 nova_compute[236126]: 2025-10-02 12:45:08.122 2 DEBUG nova.compute.manager [req-765e84a6-3449-4493-84c1-62fbe9070885 req-d1892acf-d23c-412a-9f7d-392d4ef3a972 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] No waiting events found dispatching network-vif-plugged-36888ba0-b822-4067-a556-6a12a1136d08 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:45:08 np0005465988 nova_compute[236126]: 2025-10-02 12:45:08.122 2 WARNING nova.compute.manager [req-765e84a6-3449-4493-84c1-62fbe9070885 req-d1892acf-d23c-412a-9f7d-392d4ef3a972 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Received unexpected event network-vif-plugged-36888ba0-b822-4067-a556-6a12a1136d08 for instance with vm_state active and task_state shelving.#033[00m
Oct  2 08:45:08 np0005465988 nova_compute[236126]: 2025-10-02 12:45:08.773 2 INFO nova.virt.libvirt.driver [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Deleting instance files /var/lib/nova/instances/38b13275-2908-42f3-bb70-73c050f375ea_del#033[00m
Oct  2 08:45:08 np0005465988 nova_compute[236126]: 2025-10-02 12:45:08.774 2 INFO nova.virt.libvirt.driver [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Deletion of /var/lib/nova/instances/38b13275-2908-42f3-bb70-73c050f375ea_del complete#033[00m
Oct  2 08:45:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:45:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:08.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:45:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:45:09 np0005465988 nova_compute[236126]: 2025-10-02 12:45:09.376 2 DEBUG nova.network.neutron [req-dd6be8ac-1ce2-4896-a631-fd8f603c5714 req-e429f604-5de8-4fa0-83c6-b90321b2158c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Updated VIF entry in instance network info cache for port 36888ba0-b822-4067-a556-6a12a1136d08. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:45:09 np0005465988 nova_compute[236126]: 2025-10-02 12:45:09.376 2 DEBUG nova.network.neutron [req-dd6be8ac-1ce2-4896-a631-fd8f603c5714 req-e429f604-5de8-4fa0-83c6-b90321b2158c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Updating instance_info_cache with network_info: [{"id": "36888ba0-b822-4067-a556-6a12a1136d08", "address": "fa:16:3e:8d:89:9a", "network": {"id": "6ea0a90a-9528-4fe1-8b35-dfde9b35e85f", "bridge": null, "label": "tempest-TestShelveInstance-563697374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b8f9114c7ab4b6e9fc9650d4bd08af9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap36888ba0-b8", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:45:09 np0005465988 nova_compute[236126]: 2025-10-02 12:45:09.426 2 DEBUG oslo_concurrency.lockutils [req-dd6be8ac-1ce2-4896-a631-fd8f603c5714 req-e429f604-5de8-4fa0-83c6-b90321b2158c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-38b13275-2908-42f3-bb70-73c050f375ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:45:09 np0005465988 nova_compute[236126]: 2025-10-02 12:45:09.430 2 INFO nova.scheduler.client.report [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Deleted allocations for instance 38b13275-2908-42f3-bb70-73c050f375ea#033[00m
Oct  2 08:45:09 np0005465988 nova_compute[236126]: 2025-10-02 12:45:09.482 2 DEBUG oslo_concurrency.lockutils [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:09 np0005465988 nova_compute[236126]: 2025-10-02 12:45:09.482 2 DEBUG oslo_concurrency.lockutils [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:09 np0005465988 nova_compute[236126]: 2025-10-02 12:45:09.521 2 DEBUG oslo_concurrency.processutils [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:45:09 np0005465988 podman[310212]: 2025-10-02 12:45:09.540394912 +0000 UTC m=+0.064155945 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  2 08:45:09 np0005465988 podman[310213]: 2025-10-02 12:45:09.579788022 +0000 UTC m=+0.093030165 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  2 08:45:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:45:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:09.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:45:09 np0005465988 podman[310211]: 2025-10-02 12:45:09.601670763 +0000 UTC m=+0.123649725 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:45:09 np0005465988 nova_compute[236126]: 2025-10-02 12:45:09.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:45:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3072466742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:45:10 np0005465988 nova_compute[236126]: 2025-10-02 12:45:10.001 2 DEBUG oslo_concurrency.processutils [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:45:10 np0005465988 nova_compute[236126]: 2025-10-02 12:45:10.008 2 DEBUG nova.compute.provider_tree [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:45:10 np0005465988 nova_compute[236126]: 2025-10-02 12:45:10.035 2 DEBUG nova.scheduler.client.report [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:45:10 np0005465988 nova_compute[236126]: 2025-10-02 12:45:10.065 2 DEBUG oslo_concurrency.lockutils [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:10 np0005465988 nova_compute[236126]: 2025-10-02 12:45:10.108 2 DEBUG oslo_concurrency.lockutils [None req-8c98fa33-fe72-441e-926a-391b6fdafe3c 56c6abe1bb704c8aa499677aeb9017f5 4b8f9114c7ab4b6e9fc9650d4bd08af9 - - default default] Lock "38b13275-2908-42f3-bb70-73c050f375ea" "released" by "nova.compute.manager.ComputeManager.shelve_offload_instance.<locals>.do_shelve_offload_instance" :: held 9.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:10 np0005465988 nova_compute[236126]: 2025-10-02 12:45:10.660 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Acquiring lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:10 np0005465988 nova_compute[236126]: 2025-10-02 12:45:10.660 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:10 np0005465988 nova_compute[236126]: 2025-10-02 12:45:10.676 2 DEBUG nova.compute.manager [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:45:10 np0005465988 nova_compute[236126]: 2025-10-02 12:45:10.734 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:10 np0005465988 nova_compute[236126]: 2025-10-02 12:45:10.735 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:10 np0005465988 nova_compute[236126]: 2025-10-02 12:45:10.743 2 DEBUG nova.virt.hardware [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:45:10 np0005465988 nova_compute[236126]: 2025-10-02 12:45:10.743 2 INFO nova.compute.claims [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:45:10 np0005465988 nova_compute[236126]: 2025-10-02 12:45:10.929 2 DEBUG oslo_concurrency.processutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:45:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:45:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:10.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.279 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.280 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:45:11 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1711706408' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.364 2 DEBUG oslo_concurrency.processutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.372 2 DEBUG nova.compute.provider_tree [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.400 2 DEBUG nova.scheduler.client.report [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.431 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.432 2 DEBUG nova.compute.manager [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.468 2 DEBUG nova.compute.manager [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.468 2 DEBUG nova.network.neutron [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.488 2 INFO nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.507 2 DEBUG nova.compute.manager [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:45:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:45:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:11.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.600 2 DEBUG nova.compute.manager [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.602 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.603 2 INFO nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Creating image(s)#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.646 2 DEBUG nova.storage.rbd_utils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] rbd image 790d8b15-8028-41cb-9b70-51e04c6ba7ff_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.674 2 DEBUG nova.storage.rbd_utils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] rbd image 790d8b15-8028-41cb-9b70-51e04c6ba7ff_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.699 2 DEBUG nova.storage.rbd_utils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] rbd image 790d8b15-8028-41cb-9b70-51e04c6ba7ff_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.709 2 DEBUG oslo_concurrency.processutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.752 2 DEBUG nova.policy [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f5203753507439b848f7dd6c0782f0e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '338177849a8045758e5c446cc24ffaa8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.788 2 DEBUG oslo_concurrency.processutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.789 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.790 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.790 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.814 2 DEBUG nova.storage.rbd_utils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] rbd image 790d8b15-8028-41cb-9b70-51e04c6ba7ff_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:45:11 np0005465988 nova_compute[236126]: 2025-10-02 12:45:11.818 2 DEBUG oslo_concurrency.processutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 790d8b15-8028-41cb-9b70-51e04c6ba7ff_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:45:12 np0005465988 nova_compute[236126]: 2025-10-02 12:45:12.172 2 DEBUG oslo_concurrency.processutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 790d8b15-8028-41cb-9b70-51e04c6ba7ff_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.354s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:45:12 np0005465988 nova_compute[236126]: 2025-10-02 12:45:12.250 2 DEBUG nova.storage.rbd_utils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] resizing rbd image 790d8b15-8028-41cb-9b70-51e04c6ba7ff_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:45:12 np0005465988 nova_compute[236126]: 2025-10-02 12:45:12.438 2 DEBUG nova.objects.instance [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lazy-loading 'migration_context' on Instance uuid 790d8b15-8028-41cb-9b70-51e04c6ba7ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:45:12 np0005465988 nova_compute[236126]: 2025-10-02 12:45:12.464 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:45:12 np0005465988 nova_compute[236126]: 2025-10-02 12:45:12.465 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Ensure instance console log exists: /var/lib/nova/instances/790d8b15-8028-41cb-9b70-51e04c6ba7ff/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:45:12 np0005465988 nova_compute[236126]: 2025-10-02 12:45:12.466 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:12 np0005465988 nova_compute[236126]: 2025-10-02 12:45:12.466 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:12 np0005465988 nova_compute[236126]: 2025-10-02 12:45:12.467 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:12 np0005465988 nova_compute[236126]: 2025-10-02 12:45:12.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:12 np0005465988 nova_compute[236126]: 2025-10-02 12:45:12.559 2 DEBUG nova.network.neutron [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Successfully created port: d4d92197-21c2-4463-8aed-7ad6d4d075d5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:45:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:45:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:12.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:45:13 np0005465988 nova_compute[236126]: 2025-10-02 12:45:13.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:13 np0005465988 nova_compute[236126]: 2025-10-02 12:45:13.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:45:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:13.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:13 np0005465988 nova_compute[236126]: 2025-10-02 12:45:13.782 2 DEBUG nova.network.neutron [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Successfully updated port: d4d92197-21c2-4463-8aed-7ad6d4d075d5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:45:13 np0005465988 nova_compute[236126]: 2025-10-02 12:45:13.819 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Acquiring lock "refresh_cache-790d8b15-8028-41cb-9b70-51e04c6ba7ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:45:13 np0005465988 nova_compute[236126]: 2025-10-02 12:45:13.820 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Acquired lock "refresh_cache-790d8b15-8028-41cb-9b70-51e04c6ba7ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:45:13 np0005465988 nova_compute[236126]: 2025-10-02 12:45:13.820 2 DEBUG nova.network.neutron [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:45:13 np0005465988 nova_compute[236126]: 2025-10-02 12:45:13.906 2 DEBUG nova.compute.manager [req-ecce6b51-aea5-4bfb-bdff-b6f6b429ec3f req-2c87224d-baa2-438e-bd92-748dfadeb8ed d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Received event network-changed-d4d92197-21c2-4463-8aed-7ad6d4d075d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:13 np0005465988 nova_compute[236126]: 2025-10-02 12:45:13.907 2 DEBUG nova.compute.manager [req-ecce6b51-aea5-4bfb-bdff-b6f6b429ec3f req-2c87224d-baa2-438e-bd92-748dfadeb8ed d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Refreshing instance network info cache due to event network-changed-d4d92197-21c2-4463-8aed-7ad6d4d075d5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:45:13 np0005465988 nova_compute[236126]: 2025-10-02 12:45:13.908 2 DEBUG oslo_concurrency.lockutils [req-ecce6b51-aea5-4bfb-bdff-b6f6b429ec3f req-2c87224d-baa2-438e-bd92-748dfadeb8ed d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-790d8b15-8028-41cb-9b70-51e04c6ba7ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:45:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:45:14 np0005465988 nova_compute[236126]: 2025-10-02 12:45:14.087 2 DEBUG nova.network.neutron [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:45:14 np0005465988 nova_compute[236126]: 2025-10-02 12:45:14.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:14 np0005465988 nova_compute[236126]: 2025-10-02 12:45:14.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:14 np0005465988 nova_compute[236126]: 2025-10-02 12:45:14.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:14 np0005465988 nova_compute[236126]: 2025-10-02 12:45:14.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:14.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.083 2 DEBUG nova.network.neutron [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Updating instance_info_cache with network_info: [{"id": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "address": "fa:16:3e:31:a5:18", "network": {"id": "325a04dc-c467-4853-b5e6-2fae10dff6bd", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1215879349-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "338177849a8045758e5c446cc24ffaa8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4d92197-21", "ovs_interfaceid": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.114 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Releasing lock "refresh_cache-790d8b15-8028-41cb-9b70-51e04c6ba7ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.115 2 DEBUG nova.compute.manager [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Instance network_info: |[{"id": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "address": "fa:16:3e:31:a5:18", "network": {"id": "325a04dc-c467-4853-b5e6-2fae10dff6bd", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1215879349-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "338177849a8045758e5c446cc24ffaa8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4d92197-21", "ovs_interfaceid": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.115 2 DEBUG oslo_concurrency.lockutils [req-ecce6b51-aea5-4bfb-bdff-b6f6b429ec3f req-2c87224d-baa2-438e-bd92-748dfadeb8ed d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-790d8b15-8028-41cb-9b70-51e04c6ba7ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.116 2 DEBUG nova.network.neutron [req-ecce6b51-aea5-4bfb-bdff-b6f6b429ec3f req-2c87224d-baa2-438e-bd92-748dfadeb8ed d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Refreshing network info cache for port d4d92197-21c2-4463-8aed-7ad6d4d075d5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.121 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Start _get_guest_xml network_info=[{"id": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "address": "fa:16:3e:31:a5:18", "network": {"id": "325a04dc-c467-4853-b5e6-2fae10dff6bd", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1215879349-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "338177849a8045758e5c446cc24ffaa8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4d92197-21", "ovs_interfaceid": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.128 2 WARNING nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.134 2 DEBUG nova.virt.libvirt.host [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.136 2 DEBUG nova.virt.libvirt.host [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.149 2 DEBUG nova.virt.libvirt.host [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.150 2 DEBUG nova.virt.libvirt.host [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.151 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.152 2 DEBUG nova.virt.hardware [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.153 2 DEBUG nova.virt.hardware [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.154 2 DEBUG nova.virt.hardware [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.154 2 DEBUG nova.virt.hardware [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.155 2 DEBUG nova.virt.hardware [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.155 2 DEBUG nova.virt.hardware [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.155 2 DEBUG nova.virt.hardware [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.156 2 DEBUG nova.virt.hardware [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.157 2 DEBUG nova.virt.hardware [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.157 2 DEBUG nova.virt.hardware [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.157 2 DEBUG nova.virt.hardware [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.162 2 DEBUG oslo_concurrency.processutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:45:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:45:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:15.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:45:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:45:15 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3693070582' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.677 2 DEBUG oslo_concurrency.processutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.718 2 DEBUG nova.storage.rbd_utils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] rbd image 790d8b15-8028-41cb-9b70-51e04c6ba7ff_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:45:15 np0005465988 nova_compute[236126]: 2025-10-02 12:45:15.725 2 DEBUG oslo_concurrency.processutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:45:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:45:16 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3783318637' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.234 2 DEBUG oslo_concurrency.processutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.237 2 DEBUG nova.virt.libvirt.vif [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:45:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1393000338',display_name='tempest-ServerMetadataTestJSON-server-1393000338',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1393000338',id=163,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='338177849a8045758e5c446cc24ffaa8',ramdisk_id='',reservation_id='r-mhzbov3s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-413132607',owner_user_name='tempest-ServerMetadataTestJS
ON-413132607-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:45:11Z,user_data=None,user_id='3f5203753507439b848f7dd6c0782f0e',uuid=790d8b15-8028-41cb-9b70-51e04c6ba7ff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "address": "fa:16:3e:31:a5:18", "network": {"id": "325a04dc-c467-4853-b5e6-2fae10dff6bd", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1215879349-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "338177849a8045758e5c446cc24ffaa8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4d92197-21", "ovs_interfaceid": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.238 2 DEBUG nova.network.os_vif_util [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Converting VIF {"id": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "address": "fa:16:3e:31:a5:18", "network": {"id": "325a04dc-c467-4853-b5e6-2fae10dff6bd", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1215879349-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "338177849a8045758e5c446cc24ffaa8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4d92197-21", "ovs_interfaceid": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.240 2 DEBUG nova.network.os_vif_util [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:a5:18,bridge_name='br-int',has_traffic_filtering=True,id=d4d92197-21c2-4463-8aed-7ad6d4d075d5,network=Network(325a04dc-c467-4853-b5e6-2fae10dff6bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4d92197-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.242 2 DEBUG nova.objects.instance [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 790d8b15-8028-41cb-9b70-51e04c6ba7ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.469 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  <uuid>790d8b15-8028-41cb-9b70-51e04c6ba7ff</uuid>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  <name>instance-000000a3</name>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerMetadataTestJSON-server-1393000338</nova:name>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:45:15</nova:creationTime>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <nova:user uuid="3f5203753507439b848f7dd6c0782f0e">tempest-ServerMetadataTestJSON-413132607-project-member</nova:user>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <nova:project uuid="338177849a8045758e5c446cc24ffaa8">tempest-ServerMetadataTestJSON-413132607</nova:project>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <nova:port uuid="d4d92197-21c2-4463-8aed-7ad6d4d075d5">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <entry name="serial">790d8b15-8028-41cb-9b70-51e04c6ba7ff</entry>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <entry name="uuid">790d8b15-8028-41cb-9b70-51e04c6ba7ff</entry>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/790d8b15-8028-41cb-9b70-51e04c6ba7ff_disk">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/790d8b15-8028-41cb-9b70-51e04c6ba7ff_disk.config">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:31:a5:18"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <target dev="tapd4d92197-21"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/790d8b15-8028-41cb-9b70-51e04c6ba7ff/console.log" append="off"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:45:16 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:45:16 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:45:16 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:45:16 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.470 2 DEBUG nova.compute.manager [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Preparing to wait for external event network-vif-plugged-d4d92197-21c2-4463-8aed-7ad6d4d075d5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.471 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Acquiring lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.472 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.472 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.474 2 DEBUG nova.virt.libvirt.vif [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:45:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1393000338',display_name='tempest-ServerMetadataTestJSON-server-1393000338',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1393000338',id=163,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='338177849a8045758e5c446cc24ffaa8',ramdisk_id='',reservation_id='r-mhzbov3s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-413132607',owner_user_name='tempest-ServerMetadataTestJSON-413132607-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:45:11Z,user_data=None,user_id='3f5203753507439b848f7dd6c0782f0e',uuid=790d8b15-8028-41cb-9b70-51e04c6ba7ff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "address": "fa:16:3e:31:a5:18", "network": {"id": "325a04dc-c467-4853-b5e6-2fae10dff6bd", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1215879349-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "338177849a8045758e5c446cc24ffaa8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4d92197-21", "ovs_interfaceid": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.474 2 DEBUG nova.network.os_vif_util [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Converting VIF {"id": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "address": "fa:16:3e:31:a5:18", "network": {"id": "325a04dc-c467-4853-b5e6-2fae10dff6bd", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1215879349-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "338177849a8045758e5c446cc24ffaa8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4d92197-21", "ovs_interfaceid": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.475 2 DEBUG nova.network.os_vif_util [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:a5:18,bridge_name='br-int',has_traffic_filtering=True,id=d4d92197-21c2-4463-8aed-7ad6d4d075d5,network=Network(325a04dc-c467-4853-b5e6-2fae10dff6bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4d92197-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.476 2 DEBUG os_vif [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:a5:18,bridge_name='br-int',has_traffic_filtering=True,id=d4d92197-21c2-4463-8aed-7ad6d4d075d5,network=Network(325a04dc-c467-4853-b5e6-2fae10dff6bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4d92197-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.478 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.479 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.484 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4d92197-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.484 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd4d92197-21, col_values=(('external_ids', {'iface-id': 'd4d92197-21c2-4463-8aed-7ad6d4d075d5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:a5:18', 'vm-uuid': '790d8b15-8028-41cb-9b70-51e04c6ba7ff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:16 np0005465988 NetworkManager[45041]: <info>  [1759409116.4881] manager: (tapd4d92197-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.495 2 INFO os_vif [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:a5:18,bridge_name='br-int',has_traffic_filtering=True,id=d4d92197-21c2-4463-8aed-7ad6d4d075d5,network=Network(325a04dc-c467-4853-b5e6-2fae10dff6bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4d92197-21')#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.930 2 DEBUG nova.network.neutron [req-ecce6b51-aea5-4bfb-bdff-b6f6b429ec3f req-2c87224d-baa2-438e-bd92-748dfadeb8ed d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Updated VIF entry in instance network info cache for port d4d92197-21c2-4463-8aed-7ad6d4d075d5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.932 2 DEBUG nova.network.neutron [req-ecce6b51-aea5-4bfb-bdff-b6f6b429ec3f req-2c87224d-baa2-438e-bd92-748dfadeb8ed d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Updating instance_info_cache with network_info: [{"id": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "address": "fa:16:3e:31:a5:18", "network": {"id": "325a04dc-c467-4853-b5e6-2fae10dff6bd", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1215879349-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "338177849a8045758e5c446cc24ffaa8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4d92197-21", "ovs_interfaceid": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.941 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.941 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.942 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] No VIF found with MAC fa:16:3e:31:a5:18, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.942 2 INFO nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Using config drive#033[00m
Oct  2 08:45:16 np0005465988 nova_compute[236126]: 2025-10-02 12:45:16.988 2 DEBUG nova.storage.rbd_utils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] rbd image 790d8b15-8028-41cb-9b70-51e04c6ba7ff_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:45:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:16.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:17 np0005465988 nova_compute[236126]: 2025-10-02 12:45:17.116 2 DEBUG oslo_concurrency.lockutils [req-ecce6b51-aea5-4bfb-bdff-b6f6b429ec3f req-2c87224d-baa2-438e-bd92-748dfadeb8ed d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-790d8b15-8028-41cb-9b70-51e04c6ba7ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:45:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:17 np0005465988 nova_compute[236126]: 2025-10-02 12:45:17.604 2 INFO nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Creating config drive at /var/lib/nova/instances/790d8b15-8028-41cb-9b70-51e04c6ba7ff/disk.config#033[00m
Oct  2 08:45:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:45:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:17.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:45:17 np0005465988 nova_compute[236126]: 2025-10-02 12:45:17.609 2 DEBUG oslo_concurrency.processutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/790d8b15-8028-41cb-9b70-51e04c6ba7ff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzxahg8fm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:45:17 np0005465988 nova_compute[236126]: 2025-10-02 12:45:17.759 2 DEBUG oslo_concurrency.processutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/790d8b15-8028-41cb-9b70-51e04c6ba7ff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzxahg8fm" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:45:17 np0005465988 nova_compute[236126]: 2025-10-02 12:45:17.797 2 DEBUG nova.storage.rbd_utils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] rbd image 790d8b15-8028-41cb-9b70-51e04c6ba7ff_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:45:17 np0005465988 nova_compute[236126]: 2025-10-02 12:45:17.802 2 DEBUG oslo_concurrency.processutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/790d8b15-8028-41cb-9b70-51e04c6ba7ff/disk.config 790d8b15-8028-41cb-9b70-51e04c6ba7ff_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:45:18 np0005465988 nova_compute[236126]: 2025-10-02 12:45:18.009 2 DEBUG oslo_concurrency.processutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/790d8b15-8028-41cb-9b70-51e04c6ba7ff/disk.config 790d8b15-8028-41cb-9b70-51e04c6ba7ff_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.207s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:45:18 np0005465988 nova_compute[236126]: 2025-10-02 12:45:18.010 2 INFO nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Deleting local config drive /var/lib/nova/instances/790d8b15-8028-41cb-9b70-51e04c6ba7ff/disk.config because it was imported into RBD.#033[00m
Oct  2 08:45:18 np0005465988 kernel: tapd4d92197-21: entered promiscuous mode
Oct  2 08:45:18 np0005465988 NetworkManager[45041]: <info>  [1759409118.0929] manager: (tapd4d92197-21): new Tun device (/org/freedesktop/NetworkManager/Devices/329)
Oct  2 08:45:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:45:18Z|00731|binding|INFO|Claiming lport d4d92197-21c2-4463-8aed-7ad6d4d075d5 for this chassis.
Oct  2 08:45:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:45:18Z|00732|binding|INFO|d4d92197-21c2-4463-8aed-7ad6d4d075d5: Claiming fa:16:3e:31:a5:18 10.100.0.4
Oct  2 08:45:18 np0005465988 nova_compute[236126]: 2025-10-02 12:45:18.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:45:18Z|00733|binding|INFO|Setting lport d4d92197-21c2-4463-8aed-7ad6d4d075d5 ovn-installed in OVS
Oct  2 08:45:18 np0005465988 nova_compute[236126]: 2025-10-02 12:45:18.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:18 np0005465988 nova_compute[236126]: 2025-10-02 12:45:18.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:18 np0005465988 systemd-udevd[310671]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:45:18 np0005465988 systemd-machined[192594]: New machine qemu-76-instance-000000a3.
Oct  2 08:45:18 np0005465988 NetworkManager[45041]: <info>  [1759409118.1559] device (tapd4d92197-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:45:18 np0005465988 NetworkManager[45041]: <info>  [1759409118.1566] device (tapd4d92197-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:45:18 np0005465988 systemd[1]: Started Virtual Machine qemu-76-instance-000000a3.
Oct  2 08:45:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:45:18Z|00734|binding|INFO|Setting lport d4d92197-21c2-4463-8aed-7ad6d4d075d5 up in Southbound
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.382 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:a5:18 10.100.0.4'], port_security=['fa:16:3e:31:a5:18 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '790d8b15-8028-41cb-9b70-51e04c6ba7ff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-325a04dc-c467-4853-b5e6-2fae10dff6bd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '338177849a8045758e5c446cc24ffaa8', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c8f84547-ac84-4b69-8be1-27561c61a5d3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f97c942-0eab-4909-8652-883a6b8601c4, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=d4d92197-21c2-4463-8aed-7ad6d4d075d5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.385 142124 INFO neutron.agent.ovn.metadata.agent [-] Port d4d92197-21c2-4463-8aed-7ad6d4d075d5 in datapath 325a04dc-c467-4853-b5e6-2fae10dff6bd bound to our chassis#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.389 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 325a04dc-c467-4853-b5e6-2fae10dff6bd#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.403 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0392f6ed-5dc7-487e-9afe-e5af8365f9a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.405 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap325a04dc-c1 in ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.407 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap325a04dc-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.408 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a3fad158-793f-48de-a28d-ed7b8e4dd0a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.409 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[991e310f-4544-424f-86e2-2a535721e72a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.428 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[f3cff1a8-83e5-4f81-b6ee-1f65d8776ceb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.462 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[676b6903-3220-4166-8b3a-a3dad2240791]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 nova_compute[236126]: 2025-10-02 12:45:18.493 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409103.4916532, 38b13275-2908-42f3-bb70-73c050f375ea => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:45:18 np0005465988 nova_compute[236126]: 2025-10-02 12:45:18.494 2 INFO nova.compute.manager [-] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.494 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d7bb3354-0a62-4d4b-ab53-17f64a9ac2b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.505 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3e1abbf8-101a-4228-8f8d-5335a50d344e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 systemd-udevd[310674]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:45:18 np0005465988 NetworkManager[45041]: <info>  [1759409118.5074] manager: (tap325a04dc-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/330)
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.548 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[41b90072-ed9d-4ce8-96bb-77d64aaaa1b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.552 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2abd793b-9c31-4d0c-8228-b021ae6a703f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 nova_compute[236126]: 2025-10-02 12:45:18.561 2 DEBUG nova.compute.manager [None req-9fc174e2-a80b-41c2-b7ce-494f20f83cba - - - - - -] [instance: 38b13275-2908-42f3-bb70-73c050f375ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:45:18 np0005465988 NetworkManager[45041]: <info>  [1759409118.5829] device (tap325a04dc-c0): carrier: link connected
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.592 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e1102a-4258-46f4-a522-9eea0d3ddbf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.615 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6defa037-7a82-484e-b4f7-0e1d791374d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap325a04dc-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:e2:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 216], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 717891, 'reachable_time': 17418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310747, 'error': None, 'target': 'ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.632 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9f9ac947-2217-4144-94a4-cdd7e125b090]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe66:e261'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 717891, 'tstamp': 717891}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 310748, 'error': None, 'target': 'ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.659 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bf18e9f9-0920-4bfe-a5c0-e5456daaad72]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap325a04dc-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:e2:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 216], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 717891, 'reachable_time': 17418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 310749, 'error': None, 'target': 'ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.692 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b1978eea-6c83-4ed3-991e-47f1d59f4ebe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.770 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c2316161-2dfa-434e-85a5-29b3dcf0f9d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.772 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap325a04dc-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.773 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.774 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap325a04dc-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:18 np0005465988 nova_compute[236126]: 2025-10-02 12:45:18.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:18 np0005465988 NetworkManager[45041]: <info>  [1759409118.7777] manager: (tap325a04dc-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/331)
Oct  2 08:45:18 np0005465988 kernel: tap325a04dc-c0: entered promiscuous mode
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.784 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap325a04dc-c0, col_values=(('external_ids', {'iface-id': '49c67fbe-e3d0-41a7-b749-448690427957'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:18 np0005465988 nova_compute[236126]: 2025-10-02 12:45:18.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:45:18Z|00735|binding|INFO|Releasing lport 49c67fbe-e3d0-41a7-b749-448690427957 from this chassis (sb_readonly=0)
Oct  2 08:45:18 np0005465988 nova_compute[236126]: 2025-10-02 12:45:18.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.818 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/325a04dc-c467-4853-b5e6-2fae10dff6bd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/325a04dc-c467-4853-b5e6-2fae10dff6bd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.819 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3cf557ab-6025-4235-800f-4e8f386eac63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.819 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-325a04dc-c467-4853-b5e6-2fae10dff6bd
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/325a04dc-c467-4853-b5e6-2fae10dff6bd.pid.haproxy
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 325a04dc-c467-4853-b5e6-2fae10dff6bd
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:45:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:18.820 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd', 'env', 'PROCESS_TAG=haproxy-325a04dc-c467-4853-b5e6-2fae10dff6bd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/325a04dc-c467-4853-b5e6-2fae10dff6bd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:45:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:45:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:18.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:19 np0005465988 nova_compute[236126]: 2025-10-02 12:45:19.106 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409119.1057615, 790d8b15-8028-41cb-9b70-51e04c6ba7ff => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:45:19 np0005465988 nova_compute[236126]: 2025-10-02 12:45:19.106 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] VM Started (Lifecycle Event)#033[00m
Oct  2 08:45:19 np0005465988 nova_compute[236126]: 2025-10-02 12:45:19.165 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:45:19 np0005465988 nova_compute[236126]: 2025-10-02 12:45:19.170 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409119.1066296, 790d8b15-8028-41cb-9b70-51e04c6ba7ff => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:45:19 np0005465988 nova_compute[236126]: 2025-10-02 12:45:19.170 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:45:19 np0005465988 nova_compute[236126]: 2025-10-02 12:45:19.208 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:45:19 np0005465988 nova_compute[236126]: 2025-10-02 12:45:19.212 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:45:19 np0005465988 nova_compute[236126]: 2025-10-02 12:45:19.236 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:45:19 np0005465988 podman[310782]: 2025-10-02 12:45:19.294112889 +0000 UTC m=+0.086485279 container create ec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:45:19 np0005465988 podman[310782]: 2025-10-02 12:45:19.231638424 +0000 UTC m=+0.024010844 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:45:19 np0005465988 systemd[1]: Started libpod-conmon-ec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631.scope.
Oct  2 08:45:19 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:45:19 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/321ebff70f682b6726364766dfb1befded083af15725f5fab86c983bc0a64c2c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:45:19 np0005465988 podman[310782]: 2025-10-02 12:45:19.402886841 +0000 UTC m=+0.195259341 container init ec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:45:19 np0005465988 podman[310782]: 2025-10-02 12:45:19.410716993 +0000 UTC m=+0.203089433 container start ec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:45:19 np0005465988 neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd[310798]: [NOTICE]   (310802) : New worker (310804) forked
Oct  2 08:45:19 np0005465988 neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd[310798]: [NOTICE]   (310802) : Loading success.
Oct  2 08:45:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:19.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:19 np0005465988 nova_compute[236126]: 2025-10-02 12:45:19.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.476 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.477 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.616 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.617 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.754 2 DEBUG nova.compute.manager [req-2db4c9c8-3664-467d-a898-8f69e8642f93 req-787ad134-df34-4046-8815-90bda7295e94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Received event network-vif-plugged-d4d92197-21c2-4463-8aed-7ad6d4d075d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.754 2 DEBUG oslo_concurrency.lockutils [req-2db4c9c8-3664-467d-a898-8f69e8642f93 req-787ad134-df34-4046-8815-90bda7295e94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.755 2 DEBUG oslo_concurrency.lockutils [req-2db4c9c8-3664-467d-a898-8f69e8642f93 req-787ad134-df34-4046-8815-90bda7295e94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.755 2 DEBUG oslo_concurrency.lockutils [req-2db4c9c8-3664-467d-a898-8f69e8642f93 req-787ad134-df34-4046-8815-90bda7295e94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.755 2 DEBUG nova.compute.manager [req-2db4c9c8-3664-467d-a898-8f69e8642f93 req-787ad134-df34-4046-8815-90bda7295e94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Processing event network-vif-plugged-d4d92197-21c2-4463-8aed-7ad6d4d075d5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.756 2 DEBUG nova.compute.manager [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.764 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.765 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409120.7645495, 790d8b15-8028-41cb-9b70-51e04c6ba7ff => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.765 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.774 2 INFO nova.virt.libvirt.driver [-] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Instance spawned successfully.#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.775 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.807 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.811 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.834 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.835 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.836 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.837 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.838 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.839 2 DEBUG nova.virt.libvirt.driver [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.845 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:45:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:45:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:20.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.998 2 INFO nova.compute.manager [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Took 9.40 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:45:20 np0005465988 nova_compute[236126]: 2025-10-02 12:45:20.998 2 DEBUG nova.compute.manager [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:45:21 np0005465988 nova_compute[236126]: 2025-10-02 12:45:21.139 2 INFO nova.compute.manager [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Took 10.42 seconds to build instance.#033[00m
Oct  2 08:45:21 np0005465988 nova_compute[236126]: 2025-10-02 12:45:21.168 2 DEBUG oslo_concurrency.lockutils [None req-def1a474-7311-4fa1-a99c-90861236790f 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.508s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:21 np0005465988 nova_compute[236126]: 2025-10-02 12:45:21.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:21.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:22.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:23 np0005465988 nova_compute[236126]: 2025-10-02 12:45:23.025 2 DEBUG nova.compute.manager [req-3857691d-a8e1-4fc1-82fd-a87ae48b2fce req-35bc42db-c34b-4d74-83e1-41f6c28867e0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Received event network-vif-plugged-d4d92197-21c2-4463-8aed-7ad6d4d075d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:23 np0005465988 nova_compute[236126]: 2025-10-02 12:45:23.026 2 DEBUG oslo_concurrency.lockutils [req-3857691d-a8e1-4fc1-82fd-a87ae48b2fce req-35bc42db-c34b-4d74-83e1-41f6c28867e0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:23 np0005465988 nova_compute[236126]: 2025-10-02 12:45:23.026 2 DEBUG oslo_concurrency.lockutils [req-3857691d-a8e1-4fc1-82fd-a87ae48b2fce req-35bc42db-c34b-4d74-83e1-41f6c28867e0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:23 np0005465988 nova_compute[236126]: 2025-10-02 12:45:23.026 2 DEBUG oslo_concurrency.lockutils [req-3857691d-a8e1-4fc1-82fd-a87ae48b2fce req-35bc42db-c34b-4d74-83e1-41f6c28867e0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:23 np0005465988 nova_compute[236126]: 2025-10-02 12:45:23.026 2 DEBUG nova.compute.manager [req-3857691d-a8e1-4fc1-82fd-a87ae48b2fce req-35bc42db-c34b-4d74-83e1-41f6c28867e0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] No waiting events found dispatching network-vif-plugged-d4d92197-21c2-4463-8aed-7ad6d4d075d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:45:23 np0005465988 nova_compute[236126]: 2025-10-02 12:45:23.026 2 WARNING nova.compute.manager [req-3857691d-a8e1-4fc1-82fd-a87ae48b2fce req-35bc42db-c34b-4d74-83e1-41f6c28867e0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Received unexpected event network-vif-plugged-d4d92197-21c2-4463-8aed-7ad6d4d075d5 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:45:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:45:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:23.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:45:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:45:24 np0005465988 nova_compute[236126]: 2025-10-02 12:45:24.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:25.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:25.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:26 np0005465988 nova_compute[236126]: 2025-10-02 12:45:26.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:26 np0005465988 podman[310816]: 2025-10-02 12:45:26.557893411 +0000 UTC m=+0.076917018 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:45:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:27.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:27.385 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:27.386 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:27.387 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:27.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:28 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #55. Immutable memtables: 11.
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.615 2 DEBUG oslo_concurrency.lockutils [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Acquiring lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.617 2 DEBUG oslo_concurrency.lockutils [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.618 2 DEBUG oslo_concurrency.lockutils [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Acquiring lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.618 2 DEBUG oslo_concurrency.lockutils [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.618 2 DEBUG oslo_concurrency.lockutils [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.619 2 INFO nova.compute.manager [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Terminating instance#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.621 2 DEBUG nova.compute.manager [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:45:28 np0005465988 kernel: tapd4d92197-21 (unregistering): left promiscuous mode
Oct  2 08:45:28 np0005465988 NetworkManager[45041]: <info>  [1759409128.6674] device (tapd4d92197-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:45:28Z|00736|binding|INFO|Releasing lport d4d92197-21c2-4463-8aed-7ad6d4d075d5 from this chassis (sb_readonly=0)
Oct  2 08:45:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:45:28Z|00737|binding|INFO|Setting lport d4d92197-21c2-4463-8aed-7ad6d4d075d5 down in Southbound
Oct  2 08:45:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:45:28Z|00738|binding|INFO|Removing iface tapd4d92197-21 ovn-installed in OVS
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:28.689 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:a5:18 10.100.0.4'], port_security=['fa:16:3e:31:a5:18 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '790d8b15-8028-41cb-9b70-51e04c6ba7ff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-325a04dc-c467-4853-b5e6-2fae10dff6bd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '338177849a8045758e5c446cc24ffaa8', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c8f84547-ac84-4b69-8be1-27561c61a5d3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f97c942-0eab-4909-8652-883a6b8601c4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=d4d92197-21c2-4463-8aed-7ad6d4d075d5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:45:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:28.690 142124 INFO neutron.agent.ovn.metadata.agent [-] Port d4d92197-21c2-4463-8aed-7ad6d4d075d5 in datapath 325a04dc-c467-4853-b5e6-2fae10dff6bd unbound from our chassis#033[00m
Oct  2 08:45:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:28.692 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 325a04dc-c467-4853-b5e6-2fae10dff6bd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:45:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:28.694 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4f14f36c-7733-4e8c-86b3-efd6abd1b5b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:28.699 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd namespace which is not needed anymore#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:28 np0005465988 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a3.scope: Deactivated successfully.
Oct  2 08:45:28 np0005465988 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a3.scope: Consumed 8.916s CPU time.
Oct  2 08:45:28 np0005465988 systemd-machined[192594]: Machine qemu-76-instance-000000a3 terminated.
Oct  2 08:45:28 np0005465988 neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd[310798]: [NOTICE]   (310802) : haproxy version is 2.8.14-c23fe91
Oct  2 08:45:28 np0005465988 neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd[310798]: [NOTICE]   (310802) : path to executable is /usr/sbin/haproxy
Oct  2 08:45:28 np0005465988 neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd[310798]: [WARNING]  (310802) : Exiting Master process...
Oct  2 08:45:28 np0005465988 neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd[310798]: [ALERT]    (310802) : Current worker (310804) exited with code 143 (Terminated)
Oct  2 08:45:28 np0005465988 neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd[310798]: [WARNING]  (310802) : All workers exited. Exiting... (0)
Oct  2 08:45:28 np0005465988 systemd[1]: libpod-ec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631.scope: Deactivated successfully.
Oct  2 08:45:28 np0005465988 conmon[310798]: conmon ec2c534c0cd786d1286c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631.scope/container/memory.events
Oct  2 08:45:28 np0005465988 podman[310861]: 2025-10-02 12:45:28.85197241 +0000 UTC m=+0.057895277 container died ec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.859 2 INFO nova.virt.libvirt.driver [-] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Instance destroyed successfully.#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.860 2 DEBUG nova.objects.instance [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lazy-loading 'resources' on Instance uuid 790d8b15-8028-41cb-9b70-51e04c6ba7ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:45:28 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631-userdata-shm.mount: Deactivated successfully.
Oct  2 08:45:28 np0005465988 systemd[1]: var-lib-containers-storage-overlay-321ebff70f682b6726364766dfb1befded083af15725f5fab86c983bc0a64c2c-merged.mount: Deactivated successfully.
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.892 2 DEBUG nova.virt.libvirt.vif [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:45:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1393000338',display_name='tempest-ServerMetadataTestJSON-server-1393000338',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1393000338',id=163,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:45:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={key1='alt1',key2='value2',key3='value3'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='338177849a8045758e5c446cc24ffaa8',ramdisk_id='',reservation_id='r-mhzbov3s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataTestJSON-413132607',owner_user_name='tempest-ServerMetadataTestJSON-413132607-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:45:28Z,user_data=None,user_id='3f5203753507439b848f7dd6c0782f0e',uuid=790d8b15-8028-41cb-9b70-51e04c6ba7ff,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "address": "fa:16:3e:31:a5:18", "network": {"id": "325a04dc-c467-4853-b5e6-2fae10dff6bd", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1215879349-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "338177849a8045758e5c446cc24ffaa8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4d92197-21", "ovs_interfaceid": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.893 2 DEBUG nova.network.os_vif_util [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Converting VIF {"id": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "address": "fa:16:3e:31:a5:18", "network": {"id": "325a04dc-c467-4853-b5e6-2fae10dff6bd", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1215879349-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "338177849a8045758e5c446cc24ffaa8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4d92197-21", "ovs_interfaceid": "d4d92197-21c2-4463-8aed-7ad6d4d075d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.893 2 DEBUG nova.network.os_vif_util [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:a5:18,bridge_name='br-int',has_traffic_filtering=True,id=d4d92197-21c2-4463-8aed-7ad6d4d075d5,network=Network(325a04dc-c467-4853-b5e6-2fae10dff6bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4d92197-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.894 2 DEBUG os_vif [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:a5:18,bridge_name='br-int',has_traffic_filtering=True,id=d4d92197-21c2-4463-8aed-7ad6d4d075d5,network=Network(325a04dc-c467-4853-b5e6-2fae10dff6bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4d92197-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.896 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4d92197-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.902 2 INFO os_vif [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:a5:18,bridge_name='br-int',has_traffic_filtering=True,id=d4d92197-21c2-4463-8aed-7ad6d4d075d5,network=Network(325a04dc-c467-4853-b5e6-2fae10dff6bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4d92197-21')#033[00m
Oct  2 08:45:28 np0005465988 podman[310861]: 2025-10-02 12:45:28.909914346 +0000 UTC m=+0.115837213 container cleanup ec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:45:28 np0005465988 systemd[1]: libpod-conmon-ec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631.scope: Deactivated successfully.
Oct  2 08:45:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:45:28 np0005465988 podman[310917]: 2025-10-02 12:45:28.9937797 +0000 UTC m=+0.049861838 container remove ec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.998 2 DEBUG nova.compute.manager [req-53fe0d20-62b6-480b-8606-61b90aa01d0c req-ef6ff292-72b0-4f7d-89ee-3a1aac21004f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Received event network-vif-unplugged-d4d92197-21c2-4463-8aed-7ad6d4d075d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.999 2 DEBUG oslo_concurrency.lockutils [req-53fe0d20-62b6-480b-8606-61b90aa01d0c req-ef6ff292-72b0-4f7d-89ee-3a1aac21004f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:28 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.999 2 DEBUG oslo_concurrency.lockutils [req-53fe0d20-62b6-480b-8606-61b90aa01d0c req-ef6ff292-72b0-4f7d-89ee-3a1aac21004f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:29 np0005465988 nova_compute[236126]: 2025-10-02 12:45:28.999 2 DEBUG oslo_concurrency.lockutils [req-53fe0d20-62b6-480b-8606-61b90aa01d0c req-ef6ff292-72b0-4f7d-89ee-3a1aac21004f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:29 np0005465988 nova_compute[236126]: 2025-10-02 12:45:29.000 2 DEBUG nova.compute.manager [req-53fe0d20-62b6-480b-8606-61b90aa01d0c req-ef6ff292-72b0-4f7d-89ee-3a1aac21004f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] No waiting events found dispatching network-vif-unplugged-d4d92197-21c2-4463-8aed-7ad6d4d075d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:45:29 np0005465988 nova_compute[236126]: 2025-10-02 12:45:29.000 2 DEBUG nova.compute.manager [req-53fe0d20-62b6-480b-8606-61b90aa01d0c req-ef6ff292-72b0-4f7d-89ee-3a1aac21004f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Received event network-vif-unplugged-d4d92197-21c2-4463-8aed-7ad6d4d075d5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:45:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:29.001 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ff5d4c5b-eaa8-43f6-a160-9be604fdedc0]: (4, ('Thu Oct  2 12:45:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd (ec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631)\nec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631\nThu Oct  2 12:45:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd (ec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631)\nec2c534c0cd786d1286cca38942e83e0553adfaa876951ab672dc71e64e55631\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:29.005 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[75278bd5-59f6-43fb-aa35-d187fb081cc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:29.006 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap325a04dc-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:29 np0005465988 kernel: tap325a04dc-c0: left promiscuous mode
Oct  2 08:45:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:29.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:29 np0005465988 nova_compute[236126]: 2025-10-02 12:45:29.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:29.015 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4ec130a8-be49-491c-b0f1-125e93d39917]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:29 np0005465988 nova_compute[236126]: 2025-10-02 12:45:29.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:29.042 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e257d365-1f26-4e2d-8591-c001a2fae522]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:29.043 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[53d196b1-6f7a-4e6f-a0d8-9840dd3d88b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:29.068 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7d65c577-96ef-4552-a096-2522e13ed4e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 717881, 'reachable_time': 37760, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310936, 'error': None, 'target': 'ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:29 np0005465988 systemd[1]: run-netns-ovnmeta\x2d325a04dc\x2dc467\x2d4853\x2db5e6\x2d2fae10dff6bd.mount: Deactivated successfully.
Oct  2 08:45:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:29.074 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-325a04dc-c467-4853-b5e6-2fae10dff6bd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:45:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:29.074 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[2e6f88f7-c767-420d-a698-a92c1dd4d978]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:29 np0005465988 nova_compute[236126]: 2025-10-02 12:45:29.398 2 INFO nova.virt.libvirt.driver [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Deleting instance files /var/lib/nova/instances/790d8b15-8028-41cb-9b70-51e04c6ba7ff_del#033[00m
Oct  2 08:45:29 np0005465988 nova_compute[236126]: 2025-10-02 12:45:29.399 2 INFO nova.virt.libvirt.driver [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Deletion of /var/lib/nova/instances/790d8b15-8028-41cb-9b70-51e04c6ba7ff_del complete#033[00m
Oct  2 08:45:29 np0005465988 nova_compute[236126]: 2025-10-02 12:45:29.467 2 INFO nova.compute.manager [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Took 0.85 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:45:29 np0005465988 nova_compute[236126]: 2025-10-02 12:45:29.468 2 DEBUG oslo.service.loopingcall [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:45:29 np0005465988 nova_compute[236126]: 2025-10-02 12:45:29.468 2 DEBUG nova.compute.manager [-] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:45:29 np0005465988 nova_compute[236126]: 2025-10-02 12:45:29.468 2 DEBUG nova.network.neutron [-] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:45:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:45:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2824413070' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:45:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:45:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2824413070' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:45:29 np0005465988 nova_compute[236126]: 2025-10-02 12:45:29.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:29.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:30 np0005465988 nova_compute[236126]: 2025-10-02 12:45:30.203 2 DEBUG nova.network.neutron [-] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:45:30 np0005465988 nova_compute[236126]: 2025-10-02 12:45:30.237 2 INFO nova.compute.manager [-] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Took 0.77 seconds to deallocate network for instance.#033[00m
Oct  2 08:45:30 np0005465988 nova_compute[236126]: 2025-10-02 12:45:30.315 2 DEBUG oslo_concurrency.lockutils [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:30 np0005465988 nova_compute[236126]: 2025-10-02 12:45:30.316 2 DEBUG oslo_concurrency.lockutils [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:30 np0005465988 nova_compute[236126]: 2025-10-02 12:45:30.334 2 DEBUG nova.compute.manager [req-b178d8fe-9d0a-4bcc-bb03-5f00adc17124 req-1e9332b8-bb6a-4576-822c-b2be6a562692 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Received event network-vif-deleted-d4d92197-21c2-4463-8aed-7ad6d4d075d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:30 np0005465988 nova_compute[236126]: 2025-10-02 12:45:30.386 2 DEBUG nova.scheduler.client.report [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:45:30 np0005465988 nova_compute[236126]: 2025-10-02 12:45:30.435 2 DEBUG nova.scheduler.client.report [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:45:30 np0005465988 nova_compute[236126]: 2025-10-02 12:45:30.436 2 DEBUG nova.compute.provider_tree [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:45:30 np0005465988 nova_compute[236126]: 2025-10-02 12:45:30.453 2 DEBUG nova.scheduler.client.report [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:45:30 np0005465988 nova_compute[236126]: 2025-10-02 12:45:30.492 2 DEBUG nova.scheduler.client.report [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:45:30 np0005465988 nova_compute[236126]: 2025-10-02 12:45:30.586 2 DEBUG oslo_concurrency.processutils [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:45:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:31.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:45:31 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2787986661' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:45:31 np0005465988 nova_compute[236126]: 2025-10-02 12:45:31.044 2 DEBUG oslo_concurrency.processutils [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:45:31 np0005465988 nova_compute[236126]: 2025-10-02 12:45:31.052 2 DEBUG nova.compute.provider_tree [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:45:31 np0005465988 nova_compute[236126]: 2025-10-02 12:45:31.103 2 DEBUG nova.scheduler.client.report [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:45:31 np0005465988 nova_compute[236126]: 2025-10-02 12:45:31.132 2 DEBUG oslo_concurrency.lockutils [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.816s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:31 np0005465988 nova_compute[236126]: 2025-10-02 12:45:31.348 2 DEBUG nova.compute.manager [req-4f6c0af7-139a-4efb-b8e6-0731fcdf43df req-187076b8-f959-4d1b-9db9-2648c38038d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Received event network-vif-plugged-d4d92197-21c2-4463-8aed-7ad6d4d075d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:31 np0005465988 nova_compute[236126]: 2025-10-02 12:45:31.349 2 DEBUG oslo_concurrency.lockutils [req-4f6c0af7-139a-4efb-b8e6-0731fcdf43df req-187076b8-f959-4d1b-9db9-2648c38038d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:31 np0005465988 nova_compute[236126]: 2025-10-02 12:45:31.349 2 DEBUG oslo_concurrency.lockutils [req-4f6c0af7-139a-4efb-b8e6-0731fcdf43df req-187076b8-f959-4d1b-9db9-2648c38038d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:31 np0005465988 nova_compute[236126]: 2025-10-02 12:45:31.349 2 DEBUG oslo_concurrency.lockutils [req-4f6c0af7-139a-4efb-b8e6-0731fcdf43df req-187076b8-f959-4d1b-9db9-2648c38038d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:31 np0005465988 nova_compute[236126]: 2025-10-02 12:45:31.350 2 DEBUG nova.compute.manager [req-4f6c0af7-139a-4efb-b8e6-0731fcdf43df req-187076b8-f959-4d1b-9db9-2648c38038d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] No waiting events found dispatching network-vif-plugged-d4d92197-21c2-4463-8aed-7ad6d4d075d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:45:31 np0005465988 nova_compute[236126]: 2025-10-02 12:45:31.350 2 WARNING nova.compute.manager [req-4f6c0af7-139a-4efb-b8e6-0731fcdf43df req-187076b8-f959-4d1b-9db9-2648c38038d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Received unexpected event network-vif-plugged-d4d92197-21c2-4463-8aed-7ad6d4d075d5 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:45:31 np0005465988 nova_compute[236126]: 2025-10-02 12:45:31.374 2 INFO nova.scheduler.client.report [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Deleted allocations for instance 790d8b15-8028-41cb-9b70-51e04c6ba7ff#033[00m
Oct  2 08:45:31 np0005465988 nova_compute[236126]: 2025-10-02 12:45:31.453 2 DEBUG oslo_concurrency.lockutils [None req-3c2069c8-6591-4c65-86de-cfe8d72d1362 3f5203753507439b848f7dd6c0782f0e 338177849a8045758e5c446cc24ffaa8 - - default default] Lock "790d8b15-8028-41cb-9b70-51e04c6ba7ff" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.835s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:31.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:33.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:33.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:33 np0005465988 nova_compute[236126]: 2025-10-02 12:45:33.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:45:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e357 e357: 3 total, 3 up, 3 in
Oct  2 08:45:34 np0005465988 nova_compute[236126]: 2025-10-02 12:45:34.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:35.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:35.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:37.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:37.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:38 np0005465988 nova_compute[236126]: 2025-10-02 12:45:38.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:38 np0005465988 nova_compute[236126]: 2025-10-02 12:45:38.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e357 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:45:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:39.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:39 np0005465988 nova_compute[236126]: 2025-10-02 12:45:39.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:39 np0005465988 nova_compute[236126]: 2025-10-02 12:45:39.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:39.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:40 np0005465988 podman[311017]: 2025-10-02 12:45:40.545635077 +0000 UTC m=+0.075075255 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:45:40 np0005465988 podman[311018]: 2025-10-02 12:45:40.572437559 +0000 UTC m=+0.090528724 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:45:40 np0005465988 podman[311016]: 2025-10-02 12:45:40.582620018 +0000 UTC m=+0.115359450 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, container_name=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:45:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:41.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:41.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:45:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:43.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:45:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:43.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:43 np0005465988 nova_compute[236126]: 2025-10-02 12:45:43.858 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409128.8567712, 790d8b15-8028-41cb-9b70-51e04c6ba7ff => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:45:43 np0005465988 nova_compute[236126]: 2025-10-02 12:45:43.859 2 INFO nova.compute.manager [-] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:45:43 np0005465988 nova_compute[236126]: 2025-10-02 12:45:43.894 2 DEBUG nova.compute.manager [None req-23bee46e-f2e0-470e-a626-99db8c1f704c - - - - - -] [instance: 790d8b15-8028-41cb-9b70-51e04c6ba7ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:45:43 np0005465988 nova_compute[236126]: 2025-10-02 12:45:43.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 e358: 3 total, 3 up, 3 in
Oct  2 08:45:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:45:44 np0005465988 nova_compute[236126]: 2025-10-02 12:45:44.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:45:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:45:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:45:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:45:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:45.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:45:45 np0005465988 nova_compute[236126]: 2025-10-02 12:45:45.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:45.030 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:45:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:45.032 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:45:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:45.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:47.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:47.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:48 np0005465988 nova_compute[236126]: 2025-10-02 12:45:48.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:45:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:45:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:49.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:45:49 np0005465988 nova_compute[236126]: 2025-10-02 12:45:49.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:45:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:49.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:45:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:45:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:51.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:45:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:45:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:45:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:51.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:53.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:53.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:53 np0005465988 nova_compute[236126]: 2025-10-02 12:45:53.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:45:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:45:54.035 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:54 np0005465988 nova_compute[236126]: 2025-10-02 12:45:54.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:45:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:55.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:45:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:55.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:45:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:57.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:45:57 np0005465988 podman[311318]: 2025-10-02 12:45:57.539211507 +0000 UTC m=+0.064614087 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 08:45:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:57.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:58 np0005465988 nova_compute[236126]: 2025-10-02 12:45:58.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:45:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:45:59.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:45:59 np0005465988 nova_compute[236126]: 2025-10-02 12:45:59.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:45:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:45:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:45:59.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:01.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.428 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Acquiring lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.429 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.470 2 DEBUG nova.compute.manager [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.499 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.499 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.500 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.500 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.500 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.602 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.603 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.610 2 DEBUG nova.virt.hardware [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.610 2 INFO nova.compute.claims [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:46:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:01.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.751 2 DEBUG oslo_concurrency.processutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:46:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2674747343' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:46:01 np0005465988 nova_compute[236126]: 2025-10-02 12:46:01.981 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.179 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.180 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4131MB free_disk=20.942584991455078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.180 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:46:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2659117791' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.228 2 DEBUG oslo_concurrency.processutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.237 2 DEBUG nova.compute.provider_tree [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.295 2 DEBUG nova.scheduler.client.report [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.321 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.323 2 DEBUG nova.compute.manager [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.328 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.414 2 DEBUG nova.compute.manager [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.415 2 DEBUG nova.network.neutron [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.439 2 INFO nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.445 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 10556f01-d62c-45b4-adb3-fa7d9f7d8004 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.446 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.446 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.462 2 DEBUG nova.compute.manager [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.495 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.643 2 DEBUG nova.compute.manager [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.645 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.645 2 INFO nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Creating image(s)#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.669 2 DEBUG nova.storage.rbd_utils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] rbd image 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.692 2 DEBUG nova.storage.rbd_utils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] rbd image 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.714 2 DEBUG nova.storage.rbd_utils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] rbd image 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.717 2 DEBUG oslo_concurrency.processutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.783 2 DEBUG oslo_concurrency.processutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.784 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.784 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.785 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.807 2 DEBUG nova.storage.rbd_utils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] rbd image 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.811 2 DEBUG oslo_concurrency.processutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:46:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/845368689' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.939 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.947 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:46:02 np0005465988 nova_compute[236126]: 2025-10-02 12:46:02.973 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:46:03 np0005465988 nova_compute[236126]: 2025-10-02 12:46:03.006 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:46:03 np0005465988 nova_compute[236126]: 2025-10-02 12:46:03.006 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:03.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:03 np0005465988 nova_compute[236126]: 2025-10-02 12:46:03.048 2 DEBUG nova.policy [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f9c1a967b21e4d05a1e9cb54949a7527', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '508fca18f76a46cba8f3b8b8d8169ef1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:46:03 np0005465988 nova_compute[236126]: 2025-10-02 12:46:03.431 2 DEBUG oslo_concurrency.processutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.620s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:03 np0005465988 nova_compute[236126]: 2025-10-02 12:46:03.524 2 DEBUG nova.storage.rbd_utils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] resizing rbd image 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:46:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:03.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:03 np0005465988 nova_compute[236126]: 2025-10-02 12:46:03.871 2 DEBUG nova.objects.instance [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lazy-loading 'migration_context' on Instance uuid 10556f01-d62c-45b4-adb3-fa7d9f7d8004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:46:03 np0005465988 nova_compute[236126]: 2025-10-02 12:46:03.899 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:46:03 np0005465988 nova_compute[236126]: 2025-10-02 12:46:03.899 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Ensure instance console log exists: /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:46:03 np0005465988 nova_compute[236126]: 2025-10-02 12:46:03.900 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:03 np0005465988 nova_compute[236126]: 2025-10-02 12:46:03.900 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:03 np0005465988 nova_compute[236126]: 2025-10-02 12:46:03.901 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:03 np0005465988 nova_compute[236126]: 2025-10-02 12:46:03.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:46:04 np0005465988 nova_compute[236126]: 2025-10-02 12:46:04.006 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:04 np0005465988 nova_compute[236126]: 2025-10-02 12:46:04.006 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:04 np0005465988 nova_compute[236126]: 2025-10-02 12:46:04.043 2 WARNING nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.#033[00m
Oct  2 08:46:04 np0005465988 nova_compute[236126]: 2025-10-02 12:46:04.043 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Triggering sync for uuid 10556f01-d62c-45b4-adb3-fa7d9f7d8004 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  2 08:46:04 np0005465988 nova_compute[236126]: 2025-10-02 12:46:04.043 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:04 np0005465988 nova_compute[236126]: 2025-10-02 12:46:04.505 2 DEBUG nova.network.neutron [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Successfully created port: d63ba18a-72cf-4a65-b12b-e9ddcba7161b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:46:04 np0005465988 nova_compute[236126]: 2025-10-02 12:46:04.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:05.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:46:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:05.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:46:06 np0005465988 nova_compute[236126]: 2025-10-02 12:46:06.688 2 DEBUG nova.network.neutron [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Successfully updated port: d63ba18a-72cf-4a65-b12b-e9ddcba7161b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:46:06 np0005465988 nova_compute[236126]: 2025-10-02 12:46:06.744 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Acquiring lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:46:06 np0005465988 nova_compute[236126]: 2025-10-02 12:46:06.744 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Acquired lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:46:06 np0005465988 nova_compute[236126]: 2025-10-02 12:46:06.745 2 DEBUG nova.network.neutron [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:46:06 np0005465988 nova_compute[236126]: 2025-10-02 12:46:06.986 2 DEBUG nova.compute.manager [req-d6ea2b64-e329-44c9-be10-0a2197cb2d16 req-c7393d09-c296-464b-9bff-755c01e5f094 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received event network-changed-d63ba18a-72cf-4a65-b12b-e9ddcba7161b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:46:06 np0005465988 nova_compute[236126]: 2025-10-02 12:46:06.987 2 DEBUG nova.compute.manager [req-d6ea2b64-e329-44c9-be10-0a2197cb2d16 req-c7393d09-c296-464b-9bff-755c01e5f094 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Refreshing instance network info cache due to event network-changed-d63ba18a-72cf-4a65-b12b-e9ddcba7161b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:46:06 np0005465988 nova_compute[236126]: 2025-10-02 12:46:06.987 2 DEBUG oslo_concurrency.lockutils [req-d6ea2b64-e329-44c9-be10-0a2197cb2d16 req-c7393d09-c296-464b-9bff-755c01e5f094 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:46:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:46:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:07.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:46:07 np0005465988 nova_compute[236126]: 2025-10-02 12:46:07.591 2 DEBUG nova.network.neutron [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:46:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:07.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:08 np0005465988 nova_compute[236126]: 2025-10-02 12:46:08.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:46:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:46:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:09.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:46:09 np0005465988 nova_compute[236126]: 2025-10-02 12:46:09.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:09.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.437 2 DEBUG nova.network.neutron [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Updating instance_info_cache with network_info: [{"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.511 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.607 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Releasing lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.607 2 DEBUG nova.compute.manager [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Instance network_info: |[{"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.608 2 DEBUG oslo_concurrency.lockutils [req-d6ea2b64-e329-44c9-be10-0a2197cb2d16 req-c7393d09-c296-464b-9bff-755c01e5f094 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.609 2 DEBUG nova.network.neutron [req-d6ea2b64-e329-44c9-be10-0a2197cb2d16 req-c7393d09-c296-464b-9bff-755c01e5f094 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Refreshing network info cache for port d63ba18a-72cf-4a65-b12b-e9ddcba7161b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.614 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Start _get_guest_xml network_info=[{"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.624 2 WARNING nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.633 2 DEBUG nova.virt.libvirt.host [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.634 2 DEBUG nova.virt.libvirt.host [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.637 2 DEBUG nova.virt.libvirt.host [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.638 2 DEBUG nova.virt.libvirt.host [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.640 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.640 2 DEBUG nova.virt.hardware [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.641 2 DEBUG nova.virt.hardware [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.641 2 DEBUG nova.virt.hardware [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.642 2 DEBUG nova.virt.hardware [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.642 2 DEBUG nova.virt.hardware [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.643 2 DEBUG nova.virt.hardware [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.643 2 DEBUG nova.virt.hardware [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.644 2 DEBUG nova.virt.hardware [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.644 2 DEBUG nova.virt.hardware [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.645 2 DEBUG nova.virt.hardware [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.645 2 DEBUG nova.virt.hardware [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:46:10 np0005465988 nova_compute[236126]: 2025-10-02 12:46:10.649 2 DEBUG oslo_concurrency.processutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:11.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:46:11 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1504725169' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.138 2 DEBUG oslo_concurrency.processutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.169 2 DEBUG nova.storage.rbd_utils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] rbd image 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.173 2 DEBUG oslo_concurrency.processutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:11 np0005465988 podman[311639]: 2025-10-02 12:46:11.569234903 +0000 UTC m=+0.086441058 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:46:11 np0005465988 podman[311640]: 2025-10-02 12:46:11.573955247 +0000 UTC m=+0.094040454 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:46:11 np0005465988 podman[311638]: 2025-10-02 12:46:11.604009781 +0000 UTC m=+0.125476697 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  2 08:46:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:46:11 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/662368703' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.647 2 DEBUG oslo_concurrency.processutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.649 2 DEBUG nova.virt.libvirt.vif [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:45:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-996046809',display_name='tempest-ServerRescueTestJSON-server-996046809',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-996046809',id=164,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='508fca18f76a46cba8f3b8b8d8169ef1',ramdisk_id='',reservation_id='r-zf6i9dmz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1772269056',owner_user_name='tempest-ServerRescueTestJSON-177226905
6-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:46:02Z,user_data=None,user_id='f9c1a967b21e4d05a1e9cb54949a7527',uuid=10556f01-d62c-45b4-adb3-fa7d9f7d8004,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.650 2 DEBUG nova.network.os_vif_util [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Converting VIF {"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.650 2 DEBUG nova.network.os_vif_util [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:55:ae,bridge_name='br-int',has_traffic_filtering=True,id=d63ba18a-72cf-4a65-b12b-e9ddcba7161b,network=Network(54eb1883-31f7-40fa-864f-3516c27c1276),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63ba18a-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.652 2 DEBUG nova.objects.instance [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 10556f01-d62c-45b4-adb3-fa7d9f7d8004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:46:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:11.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.779 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  <uuid>10556f01-d62c-45b4-adb3-fa7d9f7d8004</uuid>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  <name>instance-000000a4</name>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerRescueTestJSON-server-996046809</nova:name>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:46:10</nova:creationTime>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <nova:user uuid="f9c1a967b21e4d05a1e9cb54949a7527">tempest-ServerRescueTestJSON-1772269056-project-member</nova:user>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <nova:project uuid="508fca18f76a46cba8f3b8b8d8169ef1">tempest-ServerRescueTestJSON-1772269056</nova:project>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <nova:port uuid="d63ba18a-72cf-4a65-b12b-e9ddcba7161b">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <entry name="serial">10556f01-d62c-45b4-adb3-fa7d9f7d8004</entry>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <entry name="uuid">10556f01-d62c-45b4-adb3-fa7d9f7d8004</entry>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.config">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:03:55:ae"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <target dev="tapd63ba18a-72"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/console.log" append="off"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:46:11 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:46:11 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:46:11 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:46:11 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.780 2 DEBUG nova.compute.manager [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Preparing to wait for external event network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.781 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Acquiring lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.781 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.782 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.783 2 DEBUG nova.virt.libvirt.vif [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:45:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-996046809',display_name='tempest-ServerRescueTestJSON-server-996046809',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-996046809',id=164,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='508fca18f76a46cba8f3b8b8d8169ef1',ramdisk_id='',reservation_id='r-zf6i9dmz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1772269056',owner_user_name='tempest-ServerRescueTestJSON-1772269056-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:46:02Z,user_data=None,user_id='f9c1a967b21e4d05a1e9cb54949a7527',uuid=10556f01-d62c-45b4-adb3-fa7d9f7d8004,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.783 2 DEBUG nova.network.os_vif_util [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Converting VIF {"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.785 2 DEBUG nova.network.os_vif_util [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:55:ae,bridge_name='br-int',has_traffic_filtering=True,id=d63ba18a-72cf-4a65-b12b-e9ddcba7161b,network=Network(54eb1883-31f7-40fa-864f-3516c27c1276),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63ba18a-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.785 2 DEBUG os_vif [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:55:ae,bridge_name='br-int',has_traffic_filtering=True,id=d63ba18a-72cf-4a65-b12b-e9ddcba7161b,network=Network(54eb1883-31f7-40fa-864f-3516c27c1276),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63ba18a-72') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.787 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.788 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.793 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd63ba18a-72, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.794 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd63ba18a-72, col_values=(('external_ids', {'iface-id': 'd63ba18a-72cf-4a65-b12b-e9ddcba7161b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:55:ae', 'vm-uuid': '10556f01-d62c-45b4-adb3-fa7d9f7d8004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:11 np0005465988 NetworkManager[45041]: <info>  [1759409171.7991] manager: (tapd63ba18a-72): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/332)
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:11 np0005465988 nova_compute[236126]: 2025-10-02 12:46:11.809 2 INFO os_vif [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:55:ae,bridge_name='br-int',has_traffic_filtering=True,id=d63ba18a-72cf-4a65-b12b-e9ddcba7161b,network=Network(54eb1883-31f7-40fa-864f-3516c27c1276),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63ba18a-72')#033[00m
Oct  2 08:46:12 np0005465988 nova_compute[236126]: 2025-10-02 12:46:12.006 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:46:12 np0005465988 nova_compute[236126]: 2025-10-02 12:46:12.006 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:46:12 np0005465988 nova_compute[236126]: 2025-10-02 12:46:12.007 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] No VIF found with MAC fa:16:3e:03:55:ae, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:46:12 np0005465988 nova_compute[236126]: 2025-10-02 12:46:12.008 2 INFO nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Using config drive#033[00m
Oct  2 08:46:12 np0005465988 nova_compute[236126]: 2025-10-02 12:46:12.043 2 DEBUG nova.storage.rbd_utils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] rbd image 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:46:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:46:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:13.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:46:13 np0005465988 nova_compute[236126]: 2025-10-02 12:46:13.183 2 INFO nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Creating config drive at /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/disk.config#033[00m
Oct  2 08:46:13 np0005465988 nova_compute[236126]: 2025-10-02 12:46:13.190 2 DEBUG oslo_concurrency.processutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjzh50kak execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:13 np0005465988 nova_compute[236126]: 2025-10-02 12:46:13.333 2 DEBUG oslo_concurrency.processutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjzh50kak" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:13 np0005465988 nova_compute[236126]: 2025-10-02 12:46:13.377 2 DEBUG nova.storage.rbd_utils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] rbd image 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:46:13 np0005465988 nova_compute[236126]: 2025-10-02 12:46:13.382 2 DEBUG oslo_concurrency.processutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/disk.config 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:13 np0005465988 nova_compute[236126]: 2025-10-02 12:46:13.592 2 DEBUG oslo_concurrency.processutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/disk.config 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.210s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:13 np0005465988 nova_compute[236126]: 2025-10-02 12:46:13.593 2 INFO nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Deleting local config drive /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/disk.config because it was imported into RBD.#033[00m
Oct  2 08:46:13 np0005465988 kernel: tapd63ba18a-72: entered promiscuous mode
Oct  2 08:46:13 np0005465988 NetworkManager[45041]: <info>  [1759409173.6588] manager: (tapd63ba18a-72): new Tun device (/org/freedesktop/NetworkManager/Devices/333)
Oct  2 08:46:13 np0005465988 nova_compute[236126]: 2025-10-02 12:46:13.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:13 np0005465988 nova_compute[236126]: 2025-10-02 12:46:13.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:13 np0005465988 ovn_controller[132601]: 2025-10-02T12:46:13Z|00739|binding|INFO|Claiming lport d63ba18a-72cf-4a65-b12b-e9ddcba7161b for this chassis.
Oct  2 08:46:13 np0005465988 ovn_controller[132601]: 2025-10-02T12:46:13Z|00740|binding|INFO|d63ba18a-72cf-4a65-b12b-e9ddcba7161b: Claiming fa:16:3e:03:55:ae 10.100.0.13
Oct  2 08:46:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:13.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:13.693 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:55:ae 10.100.0.13'], port_security=['fa:16:3e:03:55:ae 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '10556f01-d62c-45b4-adb3-fa7d9f7d8004', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54eb1883-31f7-40fa-864f-3516c27c1276', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '508fca18f76a46cba8f3b8b8d8169ef1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1004833a-e427-4539-9aa9-ed5f03a58603', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b082b214-cf6d-4c88-9481-dd3bf0098234, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=d63ba18a-72cf-4a65-b12b-e9ddcba7161b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:46:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:13.695 142124 INFO neutron.agent.ovn.metadata.agent [-] Port d63ba18a-72cf-4a65-b12b-e9ddcba7161b in datapath 54eb1883-31f7-40fa-864f-3516c27c1276 bound to our chassis#033[00m
Oct  2 08:46:13 np0005465988 systemd-udevd[311828]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:46:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:13.697 142124 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 54eb1883-31f7-40fa-864f-3516c27c1276 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:46:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:13.700 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e8dda2b8-c75f-4c34-83df-d11e15742f17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:13 np0005465988 systemd-machined[192594]: New machine qemu-77-instance-000000a4.
Oct  2 08:46:13 np0005465988 NetworkManager[45041]: <info>  [1759409173.7130] device (tapd63ba18a-72): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:46:13 np0005465988 NetworkManager[45041]: <info>  [1759409173.7139] device (tapd63ba18a-72): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:46:13 np0005465988 systemd[1]: Started Virtual Machine qemu-77-instance-000000a4.
Oct  2 08:46:13 np0005465988 nova_compute[236126]: 2025-10-02 12:46:13.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:13 np0005465988 ovn_controller[132601]: 2025-10-02T12:46:13Z|00741|binding|INFO|Setting lport d63ba18a-72cf-4a65-b12b-e9ddcba7161b ovn-installed in OVS
Oct  2 08:46:13 np0005465988 ovn_controller[132601]: 2025-10-02T12:46:13Z|00742|binding|INFO|Setting lport d63ba18a-72cf-4a65-b12b-e9ddcba7161b up in Southbound
Oct  2 08:46:13 np0005465988 nova_compute[236126]: 2025-10-02 12:46:13.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.252 2 DEBUG nova.compute.manager [req-b0dd59a2-916b-4b29-ba5c-b7004f67f438 req-4bcc728d-7aa9-4e05-8a53-91cbb8366344 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received event network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.253 2 DEBUG oslo_concurrency.lockutils [req-b0dd59a2-916b-4b29-ba5c-b7004f67f438 req-4bcc728d-7aa9-4e05-8a53-91cbb8366344 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.253 2 DEBUG oslo_concurrency.lockutils [req-b0dd59a2-916b-4b29-ba5c-b7004f67f438 req-4bcc728d-7aa9-4e05-8a53-91cbb8366344 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.253 2 DEBUG oslo_concurrency.lockutils [req-b0dd59a2-916b-4b29-ba5c-b7004f67f438 req-4bcc728d-7aa9-4e05-8a53-91cbb8366344 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.254 2 DEBUG nova.compute.manager [req-b0dd59a2-916b-4b29-ba5c-b7004f67f438 req-4bcc728d-7aa9-4e05-8a53-91cbb8366344 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Processing event network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.470 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.673 2 DEBUG nova.compute.manager [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.674 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409174.6732514, 10556f01-d62c-45b4-adb3-fa7d9f7d8004 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.674 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] VM Started (Lifecycle Event)#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.677 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.681 2 INFO nova.virt.libvirt.driver [-] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Instance spawned successfully.#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.681 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.778 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.786 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.787 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.788 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.789 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.790 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.791 2 DEBUG nova.virt.libvirt.driver [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.798 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.850 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.851 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409174.6742737, 10556f01-d62c-45b4-adb3-fa7d9f7d8004 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.851 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.930 2 DEBUG nova.network.neutron [req-d6ea2b64-e329-44c9-be10-0a2197cb2d16 req-c7393d09-c296-464b-9bff-755c01e5f094 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Updated VIF entry in instance network info cache for port d63ba18a-72cf-4a65-b12b-e9ddcba7161b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.931 2 DEBUG nova.network.neutron [req-d6ea2b64-e329-44c9-be10-0a2197cb2d16 req-c7393d09-c296-464b-9bff-755c01e5f094 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Updating instance_info_cache with network_info: [{"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.940 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.945 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409174.6780705, 10556f01-d62c-45b4-adb3-fa7d9f7d8004 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.946 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.977 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.978 2 DEBUG oslo_concurrency.lockutils [req-d6ea2b64-e329-44c9-be10-0a2197cb2d16 req-c7393d09-c296-464b-9bff-755c01e5f094 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:46:14 np0005465988 nova_compute[236126]: 2025-10-02 12:46:14.981 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:46:15 np0005465988 nova_compute[236126]: 2025-10-02 12:46:15.055 2 INFO nova.compute.manager [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Took 12.41 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:46:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:15.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:15 np0005465988 nova_compute[236126]: 2025-10-02 12:46:15.056 2 DEBUG nova.compute.manager [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:46:15 np0005465988 nova_compute[236126]: 2025-10-02 12:46:15.084 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:46:15 np0005465988 nova_compute[236126]: 2025-10-02 12:46:15.222 2 INFO nova.compute.manager [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Took 13.66 seconds to build instance.#033[00m
Oct  2 08:46:15 np0005465988 nova_compute[236126]: 2025-10-02 12:46:15.310 2 DEBUG oslo_concurrency.lockutils [None req-735e6632-4850-4d4e-8029-975695b79a9e f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.881s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:15 np0005465988 nova_compute[236126]: 2025-10-02 12:46:15.312 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 11.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:15 np0005465988 nova_compute[236126]: 2025-10-02 12:46:15.312 2 INFO nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:46:15 np0005465988 nova_compute[236126]: 2025-10-02 12:46:15.313 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:15 np0005465988 nova_compute[236126]: 2025-10-02 12:46:15.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:15.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:16 np0005465988 nova_compute[236126]: 2025-10-02 12:46:16.414 2 DEBUG nova.compute.manager [req-1a7dbb98-d41e-4883-9c6e-523a873a6f44 req-7e70a67a-f1ea-42c3-acc9-a7ce4a811f64 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received event network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:46:16 np0005465988 nova_compute[236126]: 2025-10-02 12:46:16.415 2 DEBUG oslo_concurrency.lockutils [req-1a7dbb98-d41e-4883-9c6e-523a873a6f44 req-7e70a67a-f1ea-42c3-acc9-a7ce4a811f64 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:16 np0005465988 nova_compute[236126]: 2025-10-02 12:46:16.416 2 DEBUG oslo_concurrency.lockutils [req-1a7dbb98-d41e-4883-9c6e-523a873a6f44 req-7e70a67a-f1ea-42c3-acc9-a7ce4a811f64 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:16 np0005465988 nova_compute[236126]: 2025-10-02 12:46:16.416 2 DEBUG oslo_concurrency.lockutils [req-1a7dbb98-d41e-4883-9c6e-523a873a6f44 req-7e70a67a-f1ea-42c3-acc9-a7ce4a811f64 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:16 np0005465988 nova_compute[236126]: 2025-10-02 12:46:16.416 2 DEBUG nova.compute.manager [req-1a7dbb98-d41e-4883-9c6e-523a873a6f44 req-7e70a67a-f1ea-42c3-acc9-a7ce4a811f64 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] No waiting events found dispatching network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:46:16 np0005465988 nova_compute[236126]: 2025-10-02 12:46:16.417 2 WARNING nova.compute.manager [req-1a7dbb98-d41e-4883-9c6e-523a873a6f44 req-7e70a67a-f1ea-42c3-acc9-a7ce4a811f64 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received unexpected event network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:46:16 np0005465988 nova_compute[236126]: 2025-10-02 12:46:16.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:16 np0005465988 nova_compute[236126]: 2025-10-02 12:46:16.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:17.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:17 np0005465988 nova_compute[236126]: 2025-10-02 12:46:17.612 2 INFO nova.compute.manager [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Rescuing#033[00m
Oct  2 08:46:17 np0005465988 nova_compute[236126]: 2025-10-02 12:46:17.613 2 DEBUG oslo_concurrency.lockutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Acquiring lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:46:17 np0005465988 nova_compute[236126]: 2025-10-02 12:46:17.614 2 DEBUG oslo_concurrency.lockutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Acquired lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:46:17 np0005465988 nova_compute[236126]: 2025-10-02 12:46:17.614 2 DEBUG nova.network.neutron [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:46:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:17.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:46:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:19.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:19 np0005465988 nova_compute[236126]: 2025-10-02 12:46:19.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:19.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:20 np0005465988 nova_compute[236126]: 2025-10-02 12:46:20.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:20 np0005465988 nova_compute[236126]: 2025-10-02 12:46:20.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:46:20 np0005465988 nova_compute[236126]: 2025-10-02 12:46:20.476 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:46:20 np0005465988 nova_compute[236126]: 2025-10-02 12:46:20.496 2 DEBUG nova.network.neutron [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Updating instance_info_cache with network_info: [{"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:46:20 np0005465988 nova_compute[236126]: 2025-10-02 12:46:20.580 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:46:20 np0005465988 nova_compute[236126]: 2025-10-02 12:46:20.595 2 DEBUG oslo_concurrency.lockutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Releasing lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:46:20 np0005465988 nova_compute[236126]: 2025-10-02 12:46:20.599 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:46:20 np0005465988 nova_compute[236126]: 2025-10-02 12:46:20.599 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:46:20 np0005465988 nova_compute[236126]: 2025-10-02 12:46:20.600 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 10556f01-d62c-45b4-adb3-fa7d9f7d8004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:46:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:21.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:21 np0005465988 nova_compute[236126]: 2025-10-02 12:46:21.078 2 DEBUG nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:46:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:21.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:21 np0005465988 nova_compute[236126]: 2025-10-02 12:46:21.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:22 np0005465988 nova_compute[236126]: 2025-10-02 12:46:22.867 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Updating instance_info_cache with network_info: [{"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:46:22 np0005465988 nova_compute[236126]: 2025-10-02 12:46:22.924 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:46:22 np0005465988 nova_compute[236126]: 2025-10-02 12:46:22.925 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:46:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:46:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:23.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:46:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:23.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:46:24 np0005465988 nova_compute[236126]: 2025-10-02 12:46:24.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:25.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:25.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:26.122 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:46:26 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:26.123 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:46:26 np0005465988 nova_compute[236126]: 2025-10-02 12:46:26.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:26 np0005465988 nova_compute[236126]: 2025-10-02 12:46:26.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:27.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:27.386 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:27.387 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:27.387 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:27.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:28.126 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:46:28 np0005465988 podman[311887]: 2025-10-02 12:46:28.538379688 +0000 UTC m=+0.065141022 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  2 08:46:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:46:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:29.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:29 np0005465988 nova_compute[236126]: 2025-10-02 12:46:29.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:29.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:31.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:31 np0005465988 nova_compute[236126]: 2025-10-02 12:46:31.155 2 DEBUG nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:46:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:31.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:31 np0005465988 nova_compute[236126]: 2025-10-02 12:46:31.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:31 np0005465988 nova_compute[236126]: 2025-10-02 12:46:31.920 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:46:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:33.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:46:33 np0005465988 kernel: tapd63ba18a-72 (unregistering): left promiscuous mode
Oct  2 08:46:33 np0005465988 NetworkManager[45041]: <info>  [1759409193.6428] device (tapd63ba18a-72): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:46:33 np0005465988 nova_compute[236126]: 2025-10-02 12:46:33.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:33 np0005465988 ovn_controller[132601]: 2025-10-02T12:46:33Z|00743|binding|INFO|Releasing lport d63ba18a-72cf-4a65-b12b-e9ddcba7161b from this chassis (sb_readonly=0)
Oct  2 08:46:33 np0005465988 ovn_controller[132601]: 2025-10-02T12:46:33Z|00744|binding|INFO|Setting lport d63ba18a-72cf-4a65-b12b-e9ddcba7161b down in Southbound
Oct  2 08:46:33 np0005465988 ovn_controller[132601]: 2025-10-02T12:46:33Z|00745|binding|INFO|Removing iface tapd63ba18a-72 ovn-installed in OVS
Oct  2 08:46:33 np0005465988 nova_compute[236126]: 2025-10-02 12:46:33.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:33.675 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:55:ae 10.100.0.13'], port_security=['fa:16:3e:03:55:ae 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '10556f01-d62c-45b4-adb3-fa7d9f7d8004', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54eb1883-31f7-40fa-864f-3516c27c1276', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '508fca18f76a46cba8f3b8b8d8169ef1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1004833a-e427-4539-9aa9-ed5f03a58603', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b082b214-cf6d-4c88-9481-dd3bf0098234, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=d63ba18a-72cf-4a65-b12b-e9ddcba7161b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:46:33 np0005465988 nova_compute[236126]: 2025-10-02 12:46:33.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:33.678 142124 INFO neutron.agent.ovn.metadata.agent [-] Port d63ba18a-72cf-4a65-b12b-e9ddcba7161b in datapath 54eb1883-31f7-40fa-864f-3516c27c1276 unbound from our chassis#033[00m
Oct  2 08:46:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:33.679 142124 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 54eb1883-31f7-40fa-864f-3516c27c1276 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:46:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:33.680 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cfed3e21-32c4-4d87-9eb4-60b8da606e21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:33.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:33 np0005465988 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a4.scope: Deactivated successfully.
Oct  2 08:46:33 np0005465988 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a4.scope: Consumed 12.900s CPU time.
Oct  2 08:46:33 np0005465988 systemd-machined[192594]: Machine qemu-77-instance-000000a4 terminated.
Oct  2 08:46:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.051 2 DEBUG nova.compute.manager [req-24781a72-e731-4473-85ad-70a927ac87de req-ba888ffd-a51f-4023-9d86-17b08ba67a01 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received event network-vif-unplugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.052 2 DEBUG oslo_concurrency.lockutils [req-24781a72-e731-4473-85ad-70a927ac87de req-ba888ffd-a51f-4023-9d86-17b08ba67a01 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.052 2 DEBUG oslo_concurrency.lockutils [req-24781a72-e731-4473-85ad-70a927ac87de req-ba888ffd-a51f-4023-9d86-17b08ba67a01 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.052 2 DEBUG oslo_concurrency.lockutils [req-24781a72-e731-4473-85ad-70a927ac87de req-ba888ffd-a51f-4023-9d86-17b08ba67a01 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.053 2 DEBUG nova.compute.manager [req-24781a72-e731-4473-85ad-70a927ac87de req-ba888ffd-a51f-4023-9d86-17b08ba67a01 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] No waiting events found dispatching network-vif-unplugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.053 2 WARNING nova.compute.manager [req-24781a72-e731-4473-85ad-70a927ac87de req-ba888ffd-a51f-4023-9d86-17b08ba67a01 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received unexpected event network-vif-unplugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.170 2 INFO nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.176 2 INFO nova.virt.libvirt.driver [-] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Instance destroyed successfully.#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.177 2 DEBUG nova.objects.instance [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lazy-loading 'numa_topology' on Instance uuid 10556f01-d62c-45b4-adb3-fa7d9f7d8004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.196 2 INFO nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Attempting rescue#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.198 2 DEBUG nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.204 2 DEBUG nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.204 2 INFO nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Creating image(s)#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.243 2 DEBUG nova.storage.rbd_utils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] rbd image 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.247 2 DEBUG nova.objects.instance [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 10556f01-d62c-45b4-adb3-fa7d9f7d8004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.319 2 DEBUG nova.storage.rbd_utils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] rbd image 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.365 2 DEBUG nova.storage.rbd_utils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] rbd image 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.375 2 DEBUG oslo_concurrency.processutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.480 2 DEBUG oslo_concurrency.processutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.482 2 DEBUG oslo_concurrency.lockutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.483 2 DEBUG oslo_concurrency.lockutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.483 2 DEBUG oslo_concurrency.lockutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.524 2 DEBUG nova.storage.rbd_utils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] rbd image 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.531 2 DEBUG oslo_concurrency.processutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.857 2 DEBUG oslo_concurrency.processutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.326s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.860 2 DEBUG nova.objects.instance [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lazy-loading 'migration_context' on Instance uuid 10556f01-d62c-45b4-adb3-fa7d9f7d8004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.885 2 DEBUG nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.886 2 DEBUG nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Start _get_guest_xml network_info=[{"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-717258684-network", "vif_mac": "fa:16:3e:03:55:ae"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.886 2 DEBUG nova.objects.instance [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lazy-loading 'resources' on Instance uuid 10556f01-d62c-45b4-adb3-fa7d9f7d8004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.916 2 WARNING nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.927 2 DEBUG nova.virt.libvirt.host [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.927 2 DEBUG nova.virt.libvirt.host [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.934 2 DEBUG nova.virt.libvirt.host [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.935 2 DEBUG nova.virt.libvirt.host [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.936 2 DEBUG nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.936 2 DEBUG nova.virt.hardware [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.936 2 DEBUG nova.virt.hardware [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.937 2 DEBUG nova.virt.hardware [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.937 2 DEBUG nova.virt.hardware [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.937 2 DEBUG nova.virt.hardware [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.937 2 DEBUG nova.virt.hardware [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.937 2 DEBUG nova.virt.hardware [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.938 2 DEBUG nova.virt.hardware [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.938 2 DEBUG nova.virt.hardware [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.938 2 DEBUG nova.virt.hardware [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.938 2 DEBUG nova.virt.hardware [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.938 2 DEBUG nova.objects.instance [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 10556f01-d62c-45b4-adb3-fa7d9f7d8004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:46:34 np0005465988 nova_compute[236126]: 2025-10-02 12:46:34.987 2 DEBUG oslo_concurrency.processutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:46:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:35.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:46:35 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1827592277' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:46:35 np0005465988 nova_compute[236126]: 2025-10-02 12:46:35.463 2 DEBUG oslo_concurrency.processutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:46:35 np0005465988 nova_compute[236126]: 2025-10-02 12:46:35.465 2 DEBUG oslo_concurrency.processutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:46:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:35.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:46:35 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1679318807' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:46:35 np0005465988 nova_compute[236126]: 2025-10-02 12:46:35.935 2 DEBUG oslo_concurrency.processutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:46:35 np0005465988 nova_compute[236126]: 2025-10-02 12:46:35.938 2 DEBUG oslo_concurrency.processutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.180 2 DEBUG nova.compute.manager [req-ef62b2fb-29dd-4879-8029-f33e52755634 req-bb393d0f-8328-4698-b984-cf69392c7296 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received event network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.182 2 DEBUG oslo_concurrency.lockutils [req-ef62b2fb-29dd-4879-8029-f33e52755634 req-bb393d0f-8328-4698-b984-cf69392c7296 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.183 2 DEBUG oslo_concurrency.lockutils [req-ef62b2fb-29dd-4879-8029-f33e52755634 req-bb393d0f-8328-4698-b984-cf69392c7296 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.183 2 DEBUG oslo_concurrency.lockutils [req-ef62b2fb-29dd-4879-8029-f33e52755634 req-bb393d0f-8328-4698-b984-cf69392c7296 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.184 2 DEBUG nova.compute.manager [req-ef62b2fb-29dd-4879-8029-f33e52755634 req-bb393d0f-8328-4698-b984-cf69392c7296 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] No waiting events found dispatching network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.185 2 WARNING nova.compute.manager [req-ef62b2fb-29dd-4879-8029-f33e52755634 req-bb393d0f-8328-4698-b984-cf69392c7296 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received unexpected event network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b for instance with vm_state active and task_state rescuing.
Oct  2 08:46:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:46:36 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1319475623' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.439 2 DEBUG oslo_concurrency.processutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.441 2 DEBUG nova.virt.libvirt.vif [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:45:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-996046809',display_name='tempest-ServerRescueTestJSON-server-996046809',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-996046809',id=164,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:46:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='508fca18f76a46cba8f3b8b8d8169ef1',ramdisk_id='',reservation_id='r-zf6i9dmz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1772269056',owner_user_name='tempest-ServerRescueTestJSON-1772269056-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:46:15Z,user_data=None,user_id='f9c1a967b21e4d05a1e9cb54949a7527',uuid=10556f01-d62c-45b4-adb3-fa7d9f7d8004,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-717258684-network", "vif_mac": "fa:16:3e:03:55:ae"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.441 2 DEBUG nova.network.os_vif_util [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Converting VIF {"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-717258684-network", "vif_mac": "fa:16:3e:03:55:ae"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.442 2 DEBUG nova.network.os_vif_util [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:03:55:ae,bridge_name='br-int',has_traffic_filtering=True,id=d63ba18a-72cf-4a65-b12b-e9ddcba7161b,network=Network(54eb1883-31f7-40fa-864f-3516c27c1276),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63ba18a-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.444 2 DEBUG nova.objects.instance [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 10556f01-d62c-45b4-adb3-fa7d9f7d8004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.464 2 DEBUG nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  <uuid>10556f01-d62c-45b4-adb3-fa7d9f7d8004</uuid>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  <name>instance-000000a4</name>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerRescueTestJSON-server-996046809</nova:name>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:46:34</nova:creationTime>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <nova:user uuid="f9c1a967b21e4d05a1e9cb54949a7527">tempest-ServerRescueTestJSON-1772269056-project-member</nova:user>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <nova:project uuid="508fca18f76a46cba8f3b8b8d8169ef1">tempest-ServerRescueTestJSON-1772269056</nova:project>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <nova:port uuid="d63ba18a-72cf-4a65-b12b-e9ddcba7161b">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <entry name="serial">10556f01-d62c-45b4-adb3-fa7d9f7d8004</entry>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <entry name="uuid">10556f01-d62c-45b4-adb3-fa7d9f7d8004</entry>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.rescue">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <target dev="vdb" bus="virtio"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.config.rescue">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:03:55:ae"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <target dev="tapd63ba18a-72"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/console.log" append="off"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:46:36 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:46:36 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:46:36 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:46:36 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.475 2 INFO nova.virt.libvirt.driver [-] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Instance destroyed successfully.#033[00m
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.548 2 DEBUG nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.549 2 DEBUG nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.549 2 DEBUG nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.550 2 DEBUG nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] No VIF found with MAC fa:16:3e:03:55:ae, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.550 2 INFO nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Using config drive#033[00m
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.581 2 DEBUG nova.storage.rbd_utils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] rbd image 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.602 2 DEBUG nova.objects.instance [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 10556f01-d62c-45b4-adb3-fa7d9f7d8004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.628 2 DEBUG nova.objects.instance [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lazy-loading 'keypairs' on Instance uuid 10556f01-d62c-45b4-adb3-fa7d9f7d8004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:46:36 np0005465988 nova_compute[236126]: 2025-10-02 12:46:36.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:37.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:37 np0005465988 nova_compute[236126]: 2025-10-02 12:46:37.387 2 INFO nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Creating config drive at /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/disk.config.rescue#033[00m
Oct  2 08:46:37 np0005465988 nova_compute[236126]: 2025-10-02 12:46:37.393 2 DEBUG oslo_concurrency.processutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp23yx9gjs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:37 np0005465988 nova_compute[236126]: 2025-10-02 12:46:37.539 2 DEBUG oslo_concurrency.processutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp23yx9gjs" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:37 np0005465988 nova_compute[236126]: 2025-10-02 12:46:37.576 2 DEBUG nova.storage.rbd_utils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] rbd image 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:46:37 np0005465988 nova_compute[236126]: 2025-10-02 12:46:37.580 2 DEBUG oslo_concurrency.processutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/disk.config.rescue 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:46:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:37.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:46:37 np0005465988 nova_compute[236126]: 2025-10-02 12:46:37.768 2 DEBUG oslo_concurrency.processutils [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/disk.config.rescue 10556f01-d62c-45b4-adb3-fa7d9f7d8004_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:37 np0005465988 nova_compute[236126]: 2025-10-02 12:46:37.769 2 INFO nova.virt.libvirt.driver [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Deleting local config drive /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004/disk.config.rescue because it was imported into RBD.#033[00m
Oct  2 08:46:37 np0005465988 kernel: tapd63ba18a-72: entered promiscuous mode
Oct  2 08:46:37 np0005465988 NetworkManager[45041]: <info>  [1759409197.8249] manager: (tapd63ba18a-72): new Tun device (/org/freedesktop/NetworkManager/Devices/334)
Oct  2 08:46:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:46:37Z|00746|binding|INFO|Claiming lport d63ba18a-72cf-4a65-b12b-e9ddcba7161b for this chassis.
Oct  2 08:46:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:46:37Z|00747|binding|INFO|d63ba18a-72cf-4a65-b12b-e9ddcba7161b: Claiming fa:16:3e:03:55:ae 10.100.0.13
Oct  2 08:46:37 np0005465988 nova_compute[236126]: 2025-10-02 12:46:37.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:37.831 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:55:ae 10.100.0.13'], port_security=['fa:16:3e:03:55:ae 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '10556f01-d62c-45b4-adb3-fa7d9f7d8004', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54eb1883-31f7-40fa-864f-3516c27c1276', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '508fca18f76a46cba8f3b8b8d8169ef1', 'neutron:revision_number': '5', 'neutron:security_group_ids': '1004833a-e427-4539-9aa9-ed5f03a58603', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b082b214-cf6d-4c88-9481-dd3bf0098234, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=d63ba18a-72cf-4a65-b12b-e9ddcba7161b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:46:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:37.832 142124 INFO neutron.agent.ovn.metadata.agent [-] Port d63ba18a-72cf-4a65-b12b-e9ddcba7161b in datapath 54eb1883-31f7-40fa-864f-3516c27c1276 bound to our chassis#033[00m
Oct  2 08:46:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:37.834 142124 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 54eb1883-31f7-40fa-864f-3516c27c1276 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:46:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:46:37.835 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[783173a8-7a9b-448b-96b0-7d72eaf44822]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:46:37Z|00748|binding|INFO|Setting lport d63ba18a-72cf-4a65-b12b-e9ddcba7161b ovn-installed in OVS
Oct  2 08:46:37 np0005465988 ovn_controller[132601]: 2025-10-02T12:46:37Z|00749|binding|INFO|Setting lport d63ba18a-72cf-4a65-b12b-e9ddcba7161b up in Southbound
Oct  2 08:46:37 np0005465988 nova_compute[236126]: 2025-10-02 12:46:37.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:37 np0005465988 systemd-udevd[312212]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:46:37 np0005465988 NetworkManager[45041]: <info>  [1759409197.8674] device (tapd63ba18a-72): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:46:37 np0005465988 NetworkManager[45041]: <info>  [1759409197.8687] device (tapd63ba18a-72): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:46:37 np0005465988 systemd-machined[192594]: New machine qemu-78-instance-000000a4.
Oct  2 08:46:37 np0005465988 systemd[1]: Started Virtual Machine qemu-78-instance-000000a4.
Oct  2 08:46:38 np0005465988 nova_compute[236126]: 2025-10-02 12:46:38.316 2 DEBUG nova.compute.manager [req-d65e3305-78ed-4556-9871-1c0f19825730 req-f5df5fd6-dbe2-4067-bf7c-0ab1186d0900 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received event network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:46:38 np0005465988 nova_compute[236126]: 2025-10-02 12:46:38.317 2 DEBUG oslo_concurrency.lockutils [req-d65e3305-78ed-4556-9871-1c0f19825730 req-f5df5fd6-dbe2-4067-bf7c-0ab1186d0900 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:38 np0005465988 nova_compute[236126]: 2025-10-02 12:46:38.317 2 DEBUG oslo_concurrency.lockutils [req-d65e3305-78ed-4556-9871-1c0f19825730 req-f5df5fd6-dbe2-4067-bf7c-0ab1186d0900 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:38 np0005465988 nova_compute[236126]: 2025-10-02 12:46:38.318 2 DEBUG oslo_concurrency.lockutils [req-d65e3305-78ed-4556-9871-1c0f19825730 req-f5df5fd6-dbe2-4067-bf7c-0ab1186d0900 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:38 np0005465988 nova_compute[236126]: 2025-10-02 12:46:38.318 2 DEBUG nova.compute.manager [req-d65e3305-78ed-4556-9871-1c0f19825730 req-f5df5fd6-dbe2-4067-bf7c-0ab1186d0900 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] No waiting events found dispatching network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:46:38 np0005465988 nova_compute[236126]: 2025-10-02 12:46:38.318 2 WARNING nova.compute.manager [req-d65e3305-78ed-4556-9871-1c0f19825730 req-f5df5fd6-dbe2-4067-bf7c-0ab1186d0900 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received unexpected event network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:46:38 np0005465988 nova_compute[236126]: 2025-10-02 12:46:38.898 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 10556f01-d62c-45b4-adb3-fa7d9f7d8004 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:46:38 np0005465988 nova_compute[236126]: 2025-10-02 12:46:38.898 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409198.8979774, 10556f01-d62c-45b4-adb3-fa7d9f7d8004 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:46:38 np0005465988 nova_compute[236126]: 2025-10-02 12:46:38.899 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:46:38 np0005465988 nova_compute[236126]: 2025-10-02 12:46:38.903 2 DEBUG nova.compute.manager [None req-5f938dbb-cf1b-44b3-81c8-fdd0c4862346 f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:46:38 np0005465988 nova_compute[236126]: 2025-10-02 12:46:38.960 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:46:38 np0005465988 nova_compute[236126]: 2025-10-02 12:46:38.963 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:46:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:46:39 np0005465988 nova_compute[236126]: 2025-10-02 12:46:39.062 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Oct  2 08:46:39 np0005465988 nova_compute[236126]: 2025-10-02 12:46:39.063 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409198.8993144, 10556f01-d62c-45b4-adb3-fa7d9f7d8004 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:46:39 np0005465988 nova_compute[236126]: 2025-10-02 12:46:39.063 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] VM Started (Lifecycle Event)#033[00m
Oct  2 08:46:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:46:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:39.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:46:39 np0005465988 nova_compute[236126]: 2025-10-02 12:46:39.151 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:46:39 np0005465988 nova_compute[236126]: 2025-10-02 12:46:39.159 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:46:39 np0005465988 nova_compute[236126]: 2025-10-02 12:46:39.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:39.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:40 np0005465988 nova_compute[236126]: 2025-10-02 12:46:40.508 2 DEBUG nova.compute.manager [req-5a77109d-162a-4c14-ad52-4b60c3442d40 req-7401b937-7e27-4c57-b179-40425f57ef54 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received event network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:46:40 np0005465988 nova_compute[236126]: 2025-10-02 12:46:40.511 2 DEBUG oslo_concurrency.lockutils [req-5a77109d-162a-4c14-ad52-4b60c3442d40 req-7401b937-7e27-4c57-b179-40425f57ef54 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:40 np0005465988 nova_compute[236126]: 2025-10-02 12:46:40.512 2 DEBUG oslo_concurrency.lockutils [req-5a77109d-162a-4c14-ad52-4b60c3442d40 req-7401b937-7e27-4c57-b179-40425f57ef54 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:40 np0005465988 nova_compute[236126]: 2025-10-02 12:46:40.512 2 DEBUG oslo_concurrency.lockutils [req-5a77109d-162a-4c14-ad52-4b60c3442d40 req-7401b937-7e27-4c57-b179-40425f57ef54 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:40 np0005465988 nova_compute[236126]: 2025-10-02 12:46:40.512 2 DEBUG nova.compute.manager [req-5a77109d-162a-4c14-ad52-4b60c3442d40 req-7401b937-7e27-4c57-b179-40425f57ef54 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] No waiting events found dispatching network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:46:40 np0005465988 nova_compute[236126]: 2025-10-02 12:46:40.513 2 WARNING nova.compute.manager [req-5a77109d-162a-4c14-ad52-4b60c3442d40 req-7401b937-7e27-4c57-b179-40425f57ef54 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received unexpected event network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:46:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:41.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:41.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:41 np0005465988 nova_compute[236126]: 2025-10-02 12:46:41.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:42 np0005465988 podman[312285]: 2025-10-02 12:46:42.546212843 +0000 UTC m=+0.075354123 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3)
Oct  2 08:46:42 np0005465988 podman[312286]: 2025-10-02 12:46:42.550746182 +0000 UTC m=+0.077214536 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:46:42 np0005465988 podman[312284]: 2025-10-02 12:46:42.587464495 +0000 UTC m=+0.116525442 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller)
Oct  2 08:46:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:43.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:43.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:46:44 np0005465988 nova_compute[236126]: 2025-10-02 12:46:44.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:46:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:45.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:46:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:45.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:46 np0005465988 nova_compute[236126]: 2025-10-02 12:46:46.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:46:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:47.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:46:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:47.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:46:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:49.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:49 np0005465988 nova_compute[236126]: 2025-10-02 12:46:49.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:49.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:51.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:51.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:46:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:46:51 np0005465988 nova_compute[236126]: 2025-10-02 12:46:51.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:46:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:46:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:46:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:46:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:46:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:53.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:53.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:46:54 np0005465988 nova_compute[236126]: 2025-10-02 12:46:54.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:55.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:55.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:56 np0005465988 nova_compute[236126]: 2025-10-02 12:46:56.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:57.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:57.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:46:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:46:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:46:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:46:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:46:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:46:59.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:46:59 np0005465988 podman[312592]: 2025-10-02 12:46:59.550742385 +0000 UTC m=+0.083342909 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:46:59 np0005465988 nova_compute[236126]: 2025-10-02 12:46:59.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:46:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:46:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:46:59.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:47:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:01.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:47:01 np0005465988 nova_compute[236126]: 2025-10-02 12:47:01.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:01 np0005465988 nova_compute[236126]: 2025-10-02 12:47:01.500 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:01 np0005465988 nova_compute[236126]: 2025-10-02 12:47:01.500 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:01 np0005465988 nova_compute[236126]: 2025-10-02 12:47:01.501 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:01 np0005465988 nova_compute[236126]: 2025-10-02 12:47:01.501 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:47:01 np0005465988 nova_compute[236126]: 2025-10-02 12:47:01.502 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:01.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:01 np0005465988 nova_compute[236126]: 2025-10-02 12:47:01.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:47:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3752406533' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:47:01 np0005465988 nova_compute[236126]: 2025-10-02 12:47:01.987 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.088 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000a4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.089 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000a4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.089 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000a4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.274 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.276 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3952MB free_disk=20.81344985961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.276 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.276 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.376 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 10556f01-d62c-45b4-adb3-fa7d9f7d8004 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.377 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.377 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.428 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:47:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4123488424' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.886 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.894 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.918 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.944 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:47:02 np0005465988 nova_compute[236126]: 2025-10-02 12:47:02.944 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.668s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:03.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:03.199 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:47:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:03.200 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:47:03 np0005465988 nova_compute[236126]: 2025-10-02 12:47:03.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:47:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:03.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:47:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:47:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:04.204 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:04 np0005465988 nova_compute[236126]: 2025-10-02 12:47:04.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:04 np0005465988 nova_compute[236126]: 2025-10-02 12:47:04.945 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:05.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:05.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:06 np0005465988 nova_compute[236126]: 2025-10-02 12:47:06.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:47:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:07.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:47:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:07.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:08 np0005465988 nova_compute[236126]: 2025-10-02 12:47:08.432 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Acquiring lock "bf9e8de1-5081-4daa-9041-1d329e06be86" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:47:08 np0005465988 nova_compute[236126]: 2025-10-02 12:47:08.433 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:47:08 np0005465988 nova_compute[236126]: 2025-10-02 12:47:08.462 2 DEBUG nova.compute.manager [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:47:08 np0005465988 nova_compute[236126]: 2025-10-02 12:47:08.607 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:47:08 np0005465988 nova_compute[236126]: 2025-10-02 12:47:08.607 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:47:08 np0005465988 nova_compute[236126]: 2025-10-02 12:47:08.615 2 DEBUG nova.virt.hardware [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:47:08 np0005465988 nova_compute[236126]: 2025-10-02 12:47:08.615 2 INFO nova.compute.claims [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:47:08 np0005465988 nova_compute[236126]: 2025-10-02 12:47:08.763 2 DEBUG oslo_concurrency.processutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:47:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:47:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:09.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:47:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2225986146' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.232 2 DEBUG oslo_concurrency.processutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.241 2 DEBUG nova.compute.provider_tree [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.261 2 DEBUG nova.scheduler.client.report [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.290 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.292 2 DEBUG nova.compute.manager [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.351 2 DEBUG nova.compute.manager [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.351 2 DEBUG nova.network.neutron [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.375 2 INFO nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.402 2 DEBUG nova.compute.manager [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.531 2 DEBUG nova.compute.manager [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.532 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.533 2 INFO nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Creating image(s)
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.562 2 DEBUG nova.storage.rbd_utils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] rbd image bf9e8de1-5081-4daa-9041-1d329e06be86_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.595 2 DEBUG nova.storage.rbd_utils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] rbd image bf9e8de1-5081-4daa-9041-1d329e06be86_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.622 2 DEBUG nova.storage.rbd_utils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] rbd image bf9e8de1-5081-4daa-9041-1d329e06be86_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.626 2 DEBUG oslo_concurrency.processutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.696 2 DEBUG oslo_concurrency.processutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.696 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.697 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.697 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.718 2 DEBUG nova.storage.rbd_utils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] rbd image bf9e8de1-5081-4daa-9041-1d329e06be86_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:47:09 np0005465988 nova_compute[236126]: 2025-10-02 12:47:09.721 2 DEBUG oslo_concurrency.processutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 bf9e8de1-5081-4daa-9041-1d329e06be86_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:47:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:09.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:10 np0005465988 nova_compute[236126]: 2025-10-02 12:47:10.034 2 DEBUG nova.policy [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a24a7109471f4d96ad5f11b637fdb8e7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a837417d42da439cb794b4295bca2cee', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:47:10 np0005465988 nova_compute[236126]: 2025-10-02 12:47:10.116 2 DEBUG oslo_concurrency.processutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 bf9e8de1-5081-4daa-9041-1d329e06be86_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.395s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:47:10 np0005465988 nova_compute[236126]: 2025-10-02 12:47:10.202 2 DEBUG nova.storage.rbd_utils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] resizing rbd image bf9e8de1-5081-4daa-9041-1d329e06be86_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct  2 08:47:10 np0005465988 nova_compute[236126]: 2025-10-02 12:47:10.320 2 DEBUG nova.objects.instance [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lazy-loading 'migration_context' on Instance uuid bf9e8de1-5081-4daa-9041-1d329e06be86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:47:10 np0005465988 nova_compute[236126]: 2025-10-02 12:47:10.344 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:47:10 np0005465988 nova_compute[236126]: 2025-10-02 12:47:10.344 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Ensure instance console log exists: /var/lib/nova/instances/bf9e8de1-5081-4daa-9041-1d329e06be86/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:47:10 np0005465988 nova_compute[236126]: 2025-10-02 12:47:10.345 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:47:10 np0005465988 nova_compute[236126]: 2025-10-02 12:47:10.345 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:47:10 np0005465988 nova_compute[236126]: 2025-10-02 12:47:10.346 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:47:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:11.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:11 np0005465988 nova_compute[236126]: 2025-10-02 12:47:11.474 2 DEBUG nova.network.neutron [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Successfully created port: 3ee9f78f-884b-40ae-b226-ed5161be4522 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:47:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:47:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:11.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:47:11 np0005465988 nova_compute[236126]: 2025-10-02 12:47:11.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:12 np0005465988 nova_compute[236126]: 2025-10-02 12:47:12.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:47:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:47:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:13.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:47:13 np0005465988 nova_compute[236126]: 2025-10-02 12:47:13.442 2 DEBUG nova.network.neutron [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Successfully updated port: 3ee9f78f-884b-40ae-b226-ed5161be4522 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:47:13 np0005465988 nova_compute[236126]: 2025-10-02 12:47:13.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:47:13 np0005465988 podman[312853]: 2025-10-02 12:47:13.528257659 +0000 UTC m=+0.063977410 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:47:13 np0005465988 podman[312854]: 2025-10-02 12:47:13.545309003 +0000 UTC m=+0.079527491 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=multipathd, io.buildah.version=1.41.3)
Oct  2 08:47:13 np0005465988 podman[312852]: 2025-10-02 12:47:13.56138746 +0000 UTC m=+0.101913667 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller)
Oct  2 08:47:13 np0005465988 nova_compute[236126]: 2025-10-02 12:47:13.609 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Acquiring lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:47:13 np0005465988 nova_compute[236126]: 2025-10-02 12:47:13.609 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Acquired lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:47:13 np0005465988 nova_compute[236126]: 2025-10-02 12:47:13.610 2 DEBUG nova.network.neutron [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:47:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:13.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:13 np0005465988 nova_compute[236126]: 2025-10-02 12:47:13.865 2 DEBUG nova.compute.manager [req-1fac2f42-9a23-4a87-a52d-96c10a82f31a req-46432806-239c-4fb5-b4e0-f0565bac94d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Received event network-changed-3ee9f78f-884b-40ae-b226-ed5161be4522 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:47:13 np0005465988 nova_compute[236126]: 2025-10-02 12:47:13.866 2 DEBUG nova.compute.manager [req-1fac2f42-9a23-4a87-a52d-96c10a82f31a req-46432806-239c-4fb5-b4e0-f0565bac94d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Refreshing instance network info cache due to event network-changed-3ee9f78f-884b-40ae-b226-ed5161be4522. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:47:13 np0005465988 nova_compute[236126]: 2025-10-02 12:47:13.867 2 DEBUG oslo_concurrency.lockutils [req-1fac2f42-9a23-4a87-a52d-96c10a82f31a req-46432806-239c-4fb5-b4e0-f0565bac94d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:47:13 np0005465988 nova_compute[236126]: 2025-10-02 12:47:13.901 2 DEBUG nova.network.neutron [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:47:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:47:14 np0005465988 nova_compute[236126]: 2025-10-02 12:47:14.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:14 np0005465988 nova_compute[236126]: 2025-10-02 12:47:14.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:47:14 np0005465988 nova_compute[236126]: 2025-10-02 12:47:14.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:15.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.387 2 DEBUG nova.network.neutron [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Updating instance_info_cache with network_info: [{"id": "3ee9f78f-884b-40ae-b226-ed5161be4522", "address": "fa:16:3e:05:51:c5", "network": {"id": "0739dafe-4d9b-4048-ac97-c017fd298447", "bridge": "br-int", "label": "tempest-TestStampPattern-656915325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a837417d42da439cb794b4295bca2cee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ee9f78f-88", "ovs_interfaceid": "3ee9f78f-884b-40ae-b226-ed5161be4522", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.423 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Releasing lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.424 2 DEBUG nova.compute.manager [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Instance network_info: |[{"id": "3ee9f78f-884b-40ae-b226-ed5161be4522", "address": "fa:16:3e:05:51:c5", "network": {"id": "0739dafe-4d9b-4048-ac97-c017fd298447", "bridge": "br-int", "label": "tempest-TestStampPattern-656915325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a837417d42da439cb794b4295bca2cee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ee9f78f-88", "ovs_interfaceid": "3ee9f78f-884b-40ae-b226-ed5161be4522", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.424 2 DEBUG oslo_concurrency.lockutils [req-1fac2f42-9a23-4a87-a52d-96c10a82f31a req-46432806-239c-4fb5-b4e0-f0565bac94d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.424 2 DEBUG nova.network.neutron [req-1fac2f42-9a23-4a87-a52d-96c10a82f31a req-46432806-239c-4fb5-b4e0-f0565bac94d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Refreshing network info cache for port 3ee9f78f-884b-40ae-b226-ed5161be4522 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.428 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Start _get_guest_xml network_info=[{"id": "3ee9f78f-884b-40ae-b226-ed5161be4522", "address": "fa:16:3e:05:51:c5", "network": {"id": "0739dafe-4d9b-4048-ac97-c017fd298447", "bridge": "br-int", "label": "tempest-TestStampPattern-656915325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a837417d42da439cb794b4295bca2cee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ee9f78f-88", "ovs_interfaceid": "3ee9f78f-884b-40ae-b226-ed5161be4522", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.433 2 WARNING nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.442 2 DEBUG nova.virt.libvirt.host [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.443 2 DEBUG nova.virt.libvirt.host [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.449 2 DEBUG nova.virt.libvirt.host [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.450 2 DEBUG nova.virt.libvirt.host [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.451 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.451 2 DEBUG nova.virt.hardware [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.452 2 DEBUG nova.virt.hardware [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.452 2 DEBUG nova.virt.hardware [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.452 2 DEBUG nova.virt.hardware [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.452 2 DEBUG nova.virt.hardware [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.453 2 DEBUG nova.virt.hardware [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.453 2 DEBUG nova.virt.hardware [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.453 2 DEBUG nova.virt.hardware [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.453 2 DEBUG nova.virt.hardware [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.453 2 DEBUG nova.virt.hardware [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.454 2 DEBUG nova.virt.hardware [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.457 2 DEBUG oslo_concurrency.processutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.511 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:47:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:15.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:47:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:47:15 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3699447283' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.940 2 DEBUG oslo_concurrency.processutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.973 2 DEBUG nova.storage.rbd_utils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] rbd image bf9e8de1-5081-4daa-9041-1d329e06be86_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:47:15 np0005465988 nova_compute[236126]: 2025-10-02 12:47:15.976 2 DEBUG oslo_concurrency.processutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:47:16 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/102933903' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.468 2 DEBUG oslo_concurrency.processutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.471 2 DEBUG nova.virt.libvirt.vif [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestStampPattern-server-1078269595',display_name='tempest-TestStampPattern-server-1078269595',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-teststamppattern-server-1078269595',id=168,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCPnME7yj7AGpi8+zR0j6wiXokmK+k5Lh86YSlnXsP0prCCTi2saYZPKg3ZreiW8R+IqbWLBOHnWbtyyC7ToJeWaqTKxTG25O47OUrV5FbVX8vZbUi2AzjxwYa4KvWo3jw==',key_name='tempest-TestStampPattern-1303591770',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a837417d42da439cb794b4295bca2cee',ramdisk_id='',reservation_id='r-qfgfequ1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestStampPattern-901207223',owner_user_name='tempest-TestStampPattern-901207223-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:47:09Z,user_data=None,user_id='a24a7109471f4d96ad5f11b637fdb8e7',uuid=bf9e8de1-5081-4daa-9041-1d329e06be86,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3ee9f78f-884b-40ae-b226-ed5161be4522", "address": "fa:16:3e:05:51:c5", "network": {"id": "0739dafe-4d9b-4048-ac97-c017fd298447", "bridge": "br-int", "label": "tempest-TestStampPattern-656915325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "a837417d42da439cb794b4295bca2cee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ee9f78f-88", "ovs_interfaceid": "3ee9f78f-884b-40ae-b226-ed5161be4522", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.471 2 DEBUG nova.network.os_vif_util [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Converting VIF {"id": "3ee9f78f-884b-40ae-b226-ed5161be4522", "address": "fa:16:3e:05:51:c5", "network": {"id": "0739dafe-4d9b-4048-ac97-c017fd298447", "bridge": "br-int", "label": "tempest-TestStampPattern-656915325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a837417d42da439cb794b4295bca2cee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ee9f78f-88", "ovs_interfaceid": "3ee9f78f-884b-40ae-b226-ed5161be4522", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.472 2 DEBUG nova.network.os_vif_util [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:51:c5,bridge_name='br-int',has_traffic_filtering=True,id=3ee9f78f-884b-40ae-b226-ed5161be4522,network=Network(0739dafe-4d9b-4048-ac97-c017fd298447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ee9f78f-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.473 2 DEBUG nova.objects.instance [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lazy-loading 'pci_devices' on Instance uuid bf9e8de1-5081-4daa-9041-1d329e06be86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.476 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.507 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  <uuid>bf9e8de1-5081-4daa-9041-1d329e06be86</uuid>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  <name>instance-000000a8</name>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestStampPattern-server-1078269595</nova:name>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:47:15</nova:creationTime>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <nova:user uuid="a24a7109471f4d96ad5f11b637fdb8e7">tempest-TestStampPattern-901207223-project-member</nova:user>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <nova:project uuid="a837417d42da439cb794b4295bca2cee">tempest-TestStampPattern-901207223</nova:project>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <nova:port uuid="3ee9f78f-884b-40ae-b226-ed5161be4522">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <entry name="serial">bf9e8de1-5081-4daa-9041-1d329e06be86</entry>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <entry name="uuid">bf9e8de1-5081-4daa-9041-1d329e06be86</entry>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/bf9e8de1-5081-4daa-9041-1d329e06be86_disk">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/bf9e8de1-5081-4daa-9041-1d329e06be86_disk.config">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:05:51:c5"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <target dev="tap3ee9f78f-88"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/bf9e8de1-5081-4daa-9041-1d329e06be86/console.log" append="off"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:47:16 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:47:16 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:47:16 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:47:16 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.507 2 DEBUG nova.compute.manager [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Preparing to wait for external event network-vif-plugged-3ee9f78f-884b-40ae-b226-ed5161be4522 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.508 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Acquiring lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.508 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.508 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.509 2 DEBUG nova.virt.libvirt.vif [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestStampPattern-server-1078269595',display_name='tempest-TestStampPattern-server-1078269595',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-teststamppattern-server-1078269595',id=168,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCPnME7yj7AGpi8+zR0j6wiXokmK+k5Lh86YSlnXsP0prCCTi2saYZPKg3ZreiW8R+IqbWLBOHnWbtyyC7ToJeWaqTKxTG25O47OUrV5FbVX8vZbUi2AzjxwYa4KvWo3jw==',key_name='tempest-TestStampPattern-1303591770',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a837417d42da439cb794b4295bca2cee',ramdisk_id='',reservation_id='r-qfgfequ1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestStampPattern-901207223',owner_user_name='tempest-TestStampPattern-901207223-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:47:09Z,user_data=None,user_id='a24a7109471f4d96ad5f11b637fdb8e7',uuid=bf9e8de1-5081-4daa-9041-1d329e06be86,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3ee9f78f-884b-40ae-b226-ed5161be4522", "address": "fa:16:3e:05:51:c5", "network": {"id": "0739dafe-4d9b-4048-ac97-c017fd298447", "bridge": "br-int", "label": "tempest-TestStampPattern-656915325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "a837417d42da439cb794b4295bca2cee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ee9f78f-88", "ovs_interfaceid": "3ee9f78f-884b-40ae-b226-ed5161be4522", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.509 2 DEBUG nova.network.os_vif_util [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Converting VIF {"id": "3ee9f78f-884b-40ae-b226-ed5161be4522", "address": "fa:16:3e:05:51:c5", "network": {"id": "0739dafe-4d9b-4048-ac97-c017fd298447", "bridge": "br-int", "label": "tempest-TestStampPattern-656915325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a837417d42da439cb794b4295bca2cee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ee9f78f-88", "ovs_interfaceid": "3ee9f78f-884b-40ae-b226-ed5161be4522", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.510 2 DEBUG nova.network.os_vif_util [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:51:c5,bridge_name='br-int',has_traffic_filtering=True,id=3ee9f78f-884b-40ae-b226-ed5161be4522,network=Network(0739dafe-4d9b-4048-ac97-c017fd298447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ee9f78f-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.510 2 DEBUG os_vif [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:51:c5,bridge_name='br-int',has_traffic_filtering=True,id=3ee9f78f-884b-40ae-b226-ed5161be4522,network=Network(0739dafe-4d9b-4048-ac97-c017fd298447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ee9f78f-88') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.511 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.512 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.518 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ee9f78f-88, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.519 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3ee9f78f-88, col_values=(('external_ids', {'iface-id': '3ee9f78f-884b-40ae-b226-ed5161be4522', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:51:c5', 'vm-uuid': 'bf9e8de1-5081-4daa-9041-1d329e06be86'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:16 np0005465988 NetworkManager[45041]: <info>  [1759409236.5221] manager: (tap3ee9f78f-88): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.531 2 INFO os_vif [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:51:c5,bridge_name='br-int',has_traffic_filtering=True,id=3ee9f78f-884b-40ae-b226-ed5161be4522,network=Network(0739dafe-4d9b-4048-ac97-c017fd298447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ee9f78f-88')#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.766 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.767 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.769 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] No VIF found with MAC fa:16:3e:05:51:c5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.770 2 INFO nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Using config drive#033[00m
Oct  2 08:47:16 np0005465988 nova_compute[236126]: 2025-10-02 12:47:16.797 2 DEBUG nova.storage.rbd_utils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] rbd image bf9e8de1-5081-4daa-9041-1d329e06be86_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:47:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:17.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:17.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:18 np0005465988 nova_compute[236126]: 2025-10-02 12:47:18.149 2 INFO nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Creating config drive at /var/lib/nova/instances/bf9e8de1-5081-4daa-9041-1d329e06be86/disk.config#033[00m
Oct  2 08:47:18 np0005465988 nova_compute[236126]: 2025-10-02 12:47:18.157 2 DEBUG oslo_concurrency.processutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bf9e8de1-5081-4daa-9041-1d329e06be86/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfz3mt8c7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:18 np0005465988 nova_compute[236126]: 2025-10-02 12:47:18.254 2 DEBUG nova.network.neutron [req-1fac2f42-9a23-4a87-a52d-96c10a82f31a req-46432806-239c-4fb5-b4e0-f0565bac94d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Updated VIF entry in instance network info cache for port 3ee9f78f-884b-40ae-b226-ed5161be4522. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:47:18 np0005465988 nova_compute[236126]: 2025-10-02 12:47:18.256 2 DEBUG nova.network.neutron [req-1fac2f42-9a23-4a87-a52d-96c10a82f31a req-46432806-239c-4fb5-b4e0-f0565bac94d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Updating instance_info_cache with network_info: [{"id": "3ee9f78f-884b-40ae-b226-ed5161be4522", "address": "fa:16:3e:05:51:c5", "network": {"id": "0739dafe-4d9b-4048-ac97-c017fd298447", "bridge": "br-int", "label": "tempest-TestStampPattern-656915325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a837417d42da439cb794b4295bca2cee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ee9f78f-88", "ovs_interfaceid": "3ee9f78f-884b-40ae-b226-ed5161be4522", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:47:18 np0005465988 nova_compute[236126]: 2025-10-02 12:47:18.310 2 DEBUG oslo_concurrency.processutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bf9e8de1-5081-4daa-9041-1d329e06be86/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfz3mt8c7" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:18 np0005465988 nova_compute[236126]: 2025-10-02 12:47:18.353 2 DEBUG nova.storage.rbd_utils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] rbd image bf9e8de1-5081-4daa-9041-1d329e06be86_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:47:18 np0005465988 nova_compute[236126]: 2025-10-02 12:47:18.360 2 DEBUG oslo_concurrency.processutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bf9e8de1-5081-4daa-9041-1d329e06be86/disk.config bf9e8de1-5081-4daa-9041-1d329e06be86_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:18 np0005465988 nova_compute[236126]: 2025-10-02 12:47:18.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:18 np0005465988 nova_compute[236126]: 2025-10-02 12:47:18.489 2 DEBUG oslo_concurrency.lockutils [req-1fac2f42-9a23-4a87-a52d-96c10a82f31a req-46432806-239c-4fb5-b4e0-f0565bac94d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:47:18 np0005465988 nova_compute[236126]: 2025-10-02 12:47:18.586 2 DEBUG oslo_concurrency.processutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bf9e8de1-5081-4daa-9041-1d329e06be86/disk.config bf9e8de1-5081-4daa-9041-1d329e06be86_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.226s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:18 np0005465988 nova_compute[236126]: 2025-10-02 12:47:18.587 2 INFO nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Deleting local config drive /var/lib/nova/instances/bf9e8de1-5081-4daa-9041-1d329e06be86/disk.config because it was imported into RBD.#033[00m
Oct  2 08:47:18 np0005465988 kernel: tap3ee9f78f-88: entered promiscuous mode
Oct  2 08:47:18 np0005465988 NetworkManager[45041]: <info>  [1759409238.6633] manager: (tap3ee9f78f-88): new Tun device (/org/freedesktop/NetworkManager/Devices/336)
Oct  2 08:47:18 np0005465988 nova_compute[236126]: 2025-10-02 12:47:18.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:47:18Z|00750|binding|INFO|Claiming lport 3ee9f78f-884b-40ae-b226-ed5161be4522 for this chassis.
Oct  2 08:47:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:47:18Z|00751|binding|INFO|3ee9f78f-884b-40ae-b226-ed5161be4522: Claiming fa:16:3e:05:51:c5 10.100.0.10
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.682 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:51:c5 10.100.0.10'], port_security=['fa:16:3e:05:51:c5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'bf9e8de1-5081-4daa-9041-1d329e06be86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0739dafe-4d9b-4048-ac97-c017fd298447', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a837417d42da439cb794b4295bca2cee', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2135b395-ac43-463e-b267-fe36f0a53800', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57c22c18-07d9-4913-840f-1bcc05bb2313, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3ee9f78f-884b-40ae-b226-ed5161be4522) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.683 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3ee9f78f-884b-40ae-b226-ed5161be4522 in datapath 0739dafe-4d9b-4048-ac97-c017fd298447 bound to our chassis#033[00m
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.685 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0739dafe-4d9b-4048-ac97-c017fd298447#033[00m
Oct  2 08:47:18 np0005465988 systemd-udevd[313104]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.702 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8f175543-7498-4f55-a39b-5e095a6b863a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.703 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0739dafe-41 in ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.706 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0739dafe-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.706 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e839d85b-c82d-4b30-b23c-5e7ee033a9f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.707 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dd74af83-3111-4842-a75c-e426cc73d9f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:18 np0005465988 NetworkManager[45041]: <info>  [1759409238.7162] device (tap3ee9f78f-88): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:47:18 np0005465988 NetworkManager[45041]: <info>  [1759409238.7176] device (tap3ee9f78f-88): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.721 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[06115dbe-0259-4fcb-9368-5ce9a97cf98b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:18 np0005465988 systemd-machined[192594]: New machine qemu-79-instance-000000a8.
Oct  2 08:47:18 np0005465988 nova_compute[236126]: 2025-10-02 12:47:18.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.747 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f7472e8b-df95-4467-922d-22ed80b23803]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:47:18Z|00752|binding|INFO|Setting lport 3ee9f78f-884b-40ae-b226-ed5161be4522 ovn-installed in OVS
Oct  2 08:47:18 np0005465988 ovn_controller[132601]: 2025-10-02T12:47:18Z|00753|binding|INFO|Setting lport 3ee9f78f-884b-40ae-b226-ed5161be4522 up in Southbound
Oct  2 08:47:18 np0005465988 systemd[1]: Started Virtual Machine qemu-79-instance-000000a8.
Oct  2 08:47:18 np0005465988 nova_compute[236126]: 2025-10-02 12:47:18.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.786 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[49c7f1ca-ad88-49b7-9398-6264808f2e83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.790 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8b5e6600-ddc9-47bd-9d43-3c0b0a4d8486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:18 np0005465988 NetworkManager[45041]: <info>  [1759409238.7922] manager: (tap0739dafe-40): new Veth device (/org/freedesktop/NetworkManager/Devices/337)
Oct  2 08:47:18 np0005465988 systemd-udevd[313109]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.826 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4435f4bf-cbed-47b7-a978-c70bcc6f731d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.830 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[11a467c4-9710-4736-9b2f-71f849ec69a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:18 np0005465988 NetworkManager[45041]: <info>  [1759409238.8586] device (tap0739dafe-40): carrier: link connected
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.866 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8121045f-e3e1-4951-a963-12f5acb591a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.884 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d8ca6ef7-1624-4a2c-b0d6-5bae77519a95]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0739dafe-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:9f:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 729918, 'reachable_time': 16550, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 313139, 'error': None, 'target': 'ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.902 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[870fd3d1-eccd-49cc-9ead-52eff943ed1a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe53:9fc0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 729918, 'tstamp': 729918}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 313140, 'error': None, 'target': 'ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.924 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a5682c98-fed4-4299-8927-0957c258299d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0739dafe-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:9f:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 729918, 'reachable_time': 16550, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 313141, 'error': None, 'target': 'ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:18.954 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6b1b5b44-8137-445f-91ac-addcd021d342]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:19.021 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba0ba61-7356-465d-83c5-e99cc2626fcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:19.023 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0739dafe-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:19.023 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:19.023 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0739dafe-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:19 np0005465988 kernel: tap0739dafe-40: entered promiscuous mode
Oct  2 08:47:19 np0005465988 NetworkManager[45041]: <info>  [1759409239.0262] manager: (tap0739dafe-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/338)
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:19.029 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0739dafe-40, col_values=(('external_ids', {'iface-id': '28a672bd-7c4d-49bd-8937-0e065b62aa5f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:19 np0005465988 ovn_controller[132601]: 2025-10-02T12:47:19Z|00754|binding|INFO|Releasing lport 28a672bd-7c4d-49bd-8937-0e065b62aa5f from this chassis (sb_readonly=0)
Oct  2 08:47:19 np0005465988 nova_compute[236126]: 2025-10-02 12:47:19.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:19 np0005465988 nova_compute[236126]: 2025-10-02 12:47:19.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:19.049 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0739dafe-4d9b-4048-ac97-c017fd298447.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0739dafe-4d9b-4048-ac97-c017fd298447.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:19.050 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b54416-ae8e-4bc2-9f35-f8b13496f7e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:19.051 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-0739dafe-4d9b-4048-ac97-c017fd298447
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/0739dafe-4d9b-4048-ac97-c017fd298447.pid.haproxy
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 0739dafe-4d9b-4048-ac97-c017fd298447
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:47:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:19.051 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447', 'env', 'PROCESS_TAG=haproxy-0739dafe-4d9b-4048-ac97-c017fd298447', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0739dafe-4d9b-4048-ac97-c017fd298447.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:47:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:47:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:19.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:47:19 np0005465988 podman[313216]: 2025-10-02 12:47:19.443128873 +0000 UTC m=+0.053294296 container create 927dc1bc1049abb4743020a3fa898225f39eae6f80a308677c0cd7c9f06ae42b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:47:19 np0005465988 systemd[1]: Started libpod-conmon-927dc1bc1049abb4743020a3fa898225f39eae6f80a308677c0cd7c9f06ae42b.scope.
Oct  2 08:47:19 np0005465988 podman[313216]: 2025-10-02 12:47:19.413196002 +0000 UTC m=+0.023361405 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:47:19 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:47:19 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b93d2b678e4bc3708eb956b1bc508428e60a43d0a18e51dced4a5bd2aad6f74e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:47:19 np0005465988 podman[313216]: 2025-10-02 12:47:19.544495764 +0000 UTC m=+0.154661177 container init 927dc1bc1049abb4743020a3fa898225f39eae6f80a308677c0cd7c9f06ae42b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 08:47:19 np0005465988 podman[313216]: 2025-10-02 12:47:19.552162552 +0000 UTC m=+0.162327935 container start 927dc1bc1049abb4743020a3fa898225f39eae6f80a308677c0cd7c9f06ae42b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:47:19 np0005465988 neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447[313231]: [NOTICE]   (313235) : New worker (313237) forked
Oct  2 08:47:19 np0005465988 neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447[313231]: [NOTICE]   (313235) : Loading success.
Oct  2 08:47:19 np0005465988 nova_compute[236126]: 2025-10-02 12:47:19.655 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409239.655009, bf9e8de1-5081-4daa-9041-1d329e06be86 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:47:19 np0005465988 nova_compute[236126]: 2025-10-02 12:47:19.656 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] VM Started (Lifecycle Event)#033[00m
Oct  2 08:47:19 np0005465988 nova_compute[236126]: 2025-10-02 12:47:19.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:19 np0005465988 nova_compute[236126]: 2025-10-02 12:47:19.693 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:47:19 np0005465988 nova_compute[236126]: 2025-10-02 12:47:19.697 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409239.6559389, bf9e8de1-5081-4daa-9041-1d329e06be86 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:47:19 np0005465988 nova_compute[236126]: 2025-10-02 12:47:19.698 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:47:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:19.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:19 np0005465988 nova_compute[236126]: 2025-10-02 12:47:19.814 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:47:19 np0005465988 nova_compute[236126]: 2025-10-02 12:47:19.820 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:47:19 np0005465988 nova_compute[236126]: 2025-10-02 12:47:19.931 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.476 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.476 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.636 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.872 2 DEBUG nova.compute.manager [req-dddf38f8-bc1e-4c09-aaaf-0ff644a21c0a req-59de5902-a619-4a9f-a1fe-b40522389ddf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Received event network-vif-plugged-3ee9f78f-884b-40ae-b226-ed5161be4522 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.873 2 DEBUG oslo_concurrency.lockutils [req-dddf38f8-bc1e-4c09-aaaf-0ff644a21c0a req-59de5902-a619-4a9f-a1fe-b40522389ddf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.873 2 DEBUG oslo_concurrency.lockutils [req-dddf38f8-bc1e-4c09-aaaf-0ff644a21c0a req-59de5902-a619-4a9f-a1fe-b40522389ddf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.874 2 DEBUG oslo_concurrency.lockutils [req-dddf38f8-bc1e-4c09-aaaf-0ff644a21c0a req-59de5902-a619-4a9f-a1fe-b40522389ddf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.874 2 DEBUG nova.compute.manager [req-dddf38f8-bc1e-4c09-aaaf-0ff644a21c0a req-59de5902-a619-4a9f-a1fe-b40522389ddf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Processing event network-vif-plugged-3ee9f78f-884b-40ae-b226-ed5161be4522 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.875 2 DEBUG nova.compute.manager [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.878 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409240.8778553, bf9e8de1-5081-4daa-9041-1d329e06be86 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.878 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.881 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.884 2 INFO nova.virt.libvirt.driver [-] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Instance spawned successfully.#033[00m
Oct  2 08:47:20 np0005465988 nova_compute[236126]: 2025-10-02 12:47:20.885 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:47:21 np0005465988 nova_compute[236126]: 2025-10-02 12:47:21.063 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:47:21 np0005465988 nova_compute[236126]: 2025-10-02 12:47:21.069 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:47:21 np0005465988 nova_compute[236126]: 2025-10-02 12:47:21.073 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:47:21 np0005465988 nova_compute[236126]: 2025-10-02 12:47:21.074 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:47:21 np0005465988 nova_compute[236126]: 2025-10-02 12:47:21.074 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:47:21 np0005465988 nova_compute[236126]: 2025-10-02 12:47:21.075 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:47:21 np0005465988 nova_compute[236126]: 2025-10-02 12:47:21.075 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:47:21 np0005465988 nova_compute[236126]: 2025-10-02 12:47:21.076 2 DEBUG nova.virt.libvirt.driver [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:47:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:21.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:21 np0005465988 nova_compute[236126]: 2025-10-02 12:47:21.197 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:47:21 np0005465988 nova_compute[236126]: 2025-10-02 12:47:21.312 2 INFO nova.compute.manager [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Took 11.78 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:47:21 np0005465988 nova_compute[236126]: 2025-10-02 12:47:21.313 2 DEBUG nova.compute.manager [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:47:21 np0005465988 nova_compute[236126]: 2025-10-02 12:47:21.499 2 INFO nova.compute.manager [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Took 12.93 seconds to build instance.#033[00m
Oct  2 08:47:21 np0005465988 nova_compute[236126]: 2025-10-02 12:47:21.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:21 np0005465988 nova_compute[236126]: 2025-10-02 12:47:21.595 2 DEBUG oslo_concurrency.lockutils [None req-134487f1-328d-4d2f-a8a1-30d0175419e1 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:21.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:22 np0005465988 nova_compute[236126]: 2025-10-02 12:47:22.062 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:47:22 np0005465988 nova_compute[236126]: 2025-10-02 12:47:22.063 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:47:22 np0005465988 nova_compute[236126]: 2025-10-02 12:47:22.063 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:47:22 np0005465988 nova_compute[236126]: 2025-10-02 12:47:22.064 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 10556f01-d62c-45b4-adb3-fa7d9f7d8004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:47:23 np0005465988 nova_compute[236126]: 2025-10-02 12:47:23.068 2 DEBUG nova.compute.manager [req-1756c669-6bbe-4ebc-bc9f-461d04d7d6da req-9c6793d9-fbf2-47d6-9251-d65dacc0510e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Received event network-vif-plugged-3ee9f78f-884b-40ae-b226-ed5161be4522 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:47:23 np0005465988 nova_compute[236126]: 2025-10-02 12:47:23.068 2 DEBUG oslo_concurrency.lockutils [req-1756c669-6bbe-4ebc-bc9f-461d04d7d6da req-9c6793d9-fbf2-47d6-9251-d65dacc0510e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:23 np0005465988 nova_compute[236126]: 2025-10-02 12:47:23.068 2 DEBUG oslo_concurrency.lockutils [req-1756c669-6bbe-4ebc-bc9f-461d04d7d6da req-9c6793d9-fbf2-47d6-9251-d65dacc0510e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:23 np0005465988 nova_compute[236126]: 2025-10-02 12:47:23.069 2 DEBUG oslo_concurrency.lockutils [req-1756c669-6bbe-4ebc-bc9f-461d04d7d6da req-9c6793d9-fbf2-47d6-9251-d65dacc0510e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:23 np0005465988 nova_compute[236126]: 2025-10-02 12:47:23.069 2 DEBUG nova.compute.manager [req-1756c669-6bbe-4ebc-bc9f-461d04d7d6da req-9c6793d9-fbf2-47d6-9251-d65dacc0510e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] No waiting events found dispatching network-vif-plugged-3ee9f78f-884b-40ae-b226-ed5161be4522 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:47:23 np0005465988 nova_compute[236126]: 2025-10-02 12:47:23.069 2 WARNING nova.compute.manager [req-1756c669-6bbe-4ebc-bc9f-461d04d7d6da req-9c6793d9-fbf2-47d6-9251-d65dacc0510e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Received unexpected event network-vif-plugged-3ee9f78f-884b-40ae-b226-ed5161be4522 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:47:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:23.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:23.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:47:24 np0005465988 nova_compute[236126]: 2025-10-02 12:47:24.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:47:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:25.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:47:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:25.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:25 np0005465988 nova_compute[236126]: 2025-10-02 12:47:25.880 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Updating instance_info_cache with network_info: [{"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:47:25 np0005465988 nova_compute[236126]: 2025-10-02 12:47:25.936 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-10556f01-d62c-45b4-adb3-fa7d9f7d8004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:47:25 np0005465988 nova_compute[236126]: 2025-10-02 12:47:25.936 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:47:26 np0005465988 nova_compute[236126]: 2025-10-02 12:47:26.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:27.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:27.388 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:27.389 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:27.390 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:27.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:27 np0005465988 nova_compute[236126]: 2025-10-02 12:47:27.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:27 np0005465988 NetworkManager[45041]: <info>  [1759409247.8212] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/339)
Oct  2 08:47:27 np0005465988 NetworkManager[45041]: <info>  [1759409247.8236] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Oct  2 08:47:28 np0005465988 nova_compute[236126]: 2025-10-02 12:47:28.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:47:28Z|00755|binding|INFO|Releasing lport 28a672bd-7c4d-49bd-8937-0e065b62aa5f from this chassis (sb_readonly=0)
Oct  2 08:47:28 np0005465988 nova_compute[236126]: 2025-10-02 12:47:28.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:28 np0005465988 nova_compute[236126]: 2025-10-02 12:47:28.163 2 DEBUG nova.compute.manager [req-c2f13b1a-4563-4c1b-b25d-6244827b1e14 req-3e704c2e-6055-4897-a91f-b0b70f52c483 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Received event network-changed-3ee9f78f-884b-40ae-b226-ed5161be4522 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:47:28 np0005465988 nova_compute[236126]: 2025-10-02 12:47:28.164 2 DEBUG nova.compute.manager [req-c2f13b1a-4563-4c1b-b25d-6244827b1e14 req-3e704c2e-6055-4897-a91f-b0b70f52c483 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Refreshing instance network info cache due to event network-changed-3ee9f78f-884b-40ae-b226-ed5161be4522. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:47:28 np0005465988 nova_compute[236126]: 2025-10-02 12:47:28.164 2 DEBUG oslo_concurrency.lockutils [req-c2f13b1a-4563-4c1b-b25d-6244827b1e14 req-3e704c2e-6055-4897-a91f-b0b70f52c483 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:47:28 np0005465988 nova_compute[236126]: 2025-10-02 12:47:28.164 2 DEBUG oslo_concurrency.lockutils [req-c2f13b1a-4563-4c1b-b25d-6244827b1e14 req-3e704c2e-6055-4897-a91f-b0b70f52c483 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:47:28 np0005465988 nova_compute[236126]: 2025-10-02 12:47:28.165 2 DEBUG nova.network.neutron [req-c2f13b1a-4563-4c1b-b25d-6244827b1e14 req-3e704c2e-6055-4897-a91f-b0b70f52c483 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Refreshing network info cache for port 3ee9f78f-884b-40ae-b226-ed5161be4522 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:47:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:47:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:29.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:29 np0005465988 nova_compute[236126]: 2025-10-02 12:47:29.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:29.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:30 np0005465988 podman[313252]: 2025-10-02 12:47:30.526004819 +0000 UTC m=+0.052926585 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct  2 08:47:30 np0005465988 nova_compute[236126]: 2025-10-02 12:47:30.834 2 DEBUG nova.network.neutron [req-c2f13b1a-4563-4c1b-b25d-6244827b1e14 req-3e704c2e-6055-4897-a91f-b0b70f52c483 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Updated VIF entry in instance network info cache for port 3ee9f78f-884b-40ae-b226-ed5161be4522. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:47:30 np0005465988 nova_compute[236126]: 2025-10-02 12:47:30.835 2 DEBUG nova.network.neutron [req-c2f13b1a-4563-4c1b-b25d-6244827b1e14 req-3e704c2e-6055-4897-a91f-b0b70f52c483 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Updating instance_info_cache with network_info: [{"id": "3ee9f78f-884b-40ae-b226-ed5161be4522", "address": "fa:16:3e:05:51:c5", "network": {"id": "0739dafe-4d9b-4048-ac97-c017fd298447", "bridge": "br-int", "label": "tempest-TestStampPattern-656915325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a837417d42da439cb794b4295bca2cee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ee9f78f-88", "ovs_interfaceid": "3ee9f78f-884b-40ae-b226-ed5161be4522", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:47:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:47:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:31.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:47:31 np0005465988 nova_compute[236126]: 2025-10-02 12:47:31.211 2 DEBUG oslo_concurrency.lockutils [req-c2f13b1a-4563-4c1b-b25d-6244827b1e14 req-3e704c2e-6055-4897-a91f-b0b70f52c483 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:47:31 np0005465988 nova_compute[236126]: 2025-10-02 12:47:31.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:31.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:33 np0005465988 nova_compute[236126]: 2025-10-02 12:47:33.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:33.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:33.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:47:34 np0005465988 nova_compute[236126]: 2025-10-02 12:47:34.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:47:35Z|00096|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:05:51:c5 10.100.0.10
Oct  2 08:47:35 np0005465988 ovn_controller[132601]: 2025-10-02T12:47:35Z|00097|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:05:51:c5 10.100.0.10
Oct  2 08:47:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:35.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:47:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:35.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:47:36 np0005465988 nova_compute[236126]: 2025-10-02 12:47:36.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:47:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:37.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:47:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:37.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:37 np0005465988 nova_compute[236126]: 2025-10-02 12:47:37.969 2 DEBUG oslo_concurrency.lockutils [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Acquiring lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:37 np0005465988 nova_compute[236126]: 2025-10-02 12:47:37.970 2 DEBUG oslo_concurrency.lockutils [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:37 np0005465988 nova_compute[236126]: 2025-10-02 12:47:37.970 2 DEBUG oslo_concurrency.lockutils [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Acquiring lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:37 np0005465988 nova_compute[236126]: 2025-10-02 12:47:37.970 2 DEBUG oslo_concurrency.lockutils [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:37 np0005465988 nova_compute[236126]: 2025-10-02 12:47:37.970 2 DEBUG oslo_concurrency.lockutils [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:37 np0005465988 nova_compute[236126]: 2025-10-02 12:47:37.971 2 INFO nova.compute.manager [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Terminating instance#033[00m
Oct  2 08:47:37 np0005465988 nova_compute[236126]: 2025-10-02 12:47:37.972 2 DEBUG nova.compute.manager [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:47:38 np0005465988 kernel: tapd63ba18a-72 (unregistering): left promiscuous mode
Oct  2 08:47:38 np0005465988 NetworkManager[45041]: <info>  [1759409258.0921] device (tapd63ba18a-72): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:47:38 np0005465988 nova_compute[236126]: 2025-10-02 12:47:38.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:47:38Z|00756|binding|INFO|Releasing lport d63ba18a-72cf-4a65-b12b-e9ddcba7161b from this chassis (sb_readonly=0)
Oct  2 08:47:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:47:38Z|00757|binding|INFO|Setting lport d63ba18a-72cf-4a65-b12b-e9ddcba7161b down in Southbound
Oct  2 08:47:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:47:38Z|00758|binding|INFO|Removing iface tapd63ba18a-72 ovn-installed in OVS
Oct  2 08:47:38 np0005465988 nova_compute[236126]: 2025-10-02 12:47:38.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:38.117 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:55:ae 10.100.0.13'], port_security=['fa:16:3e:03:55:ae 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '10556f01-d62c-45b4-adb3-fa7d9f7d8004', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54eb1883-31f7-40fa-864f-3516c27c1276', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '508fca18f76a46cba8f3b8b8d8169ef1', 'neutron:revision_number': '6', 'neutron:security_group_ids': '1004833a-e427-4539-9aa9-ed5f03a58603', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b082b214-cf6d-4c88-9481-dd3bf0098234, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=d63ba18a-72cf-4a65-b12b-e9ddcba7161b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:47:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:38.118 142124 INFO neutron.agent.ovn.metadata.agent [-] Port d63ba18a-72cf-4a65-b12b-e9ddcba7161b in datapath 54eb1883-31f7-40fa-864f-3516c27c1276 unbound from our chassis#033[00m
Oct  2 08:47:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:38.120 142124 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 54eb1883-31f7-40fa-864f-3516c27c1276 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:47:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:38.123 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d25f0e69-18da-41a7-8297-2953a8688bd5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:38 np0005465988 nova_compute[236126]: 2025-10-02 12:47:38.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:38 np0005465988 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a4.scope: Deactivated successfully.
Oct  2 08:47:38 np0005465988 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a4.scope: Consumed 15.658s CPU time.
Oct  2 08:47:38 np0005465988 systemd-machined[192594]: Machine qemu-78-instance-000000a4 terminated.
Oct  2 08:47:38 np0005465988 nova_compute[236126]: 2025-10-02 12:47:38.211 2 INFO nova.virt.libvirt.driver [-] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Instance destroyed successfully.#033[00m
Oct  2 08:47:38 np0005465988 nova_compute[236126]: 2025-10-02 12:47:38.212 2 DEBUG nova.objects.instance [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lazy-loading 'resources' on Instance uuid 10556f01-d62c-45b4-adb3-fa7d9f7d8004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:47:38 np0005465988 nova_compute[236126]: 2025-10-02 12:47:38.230 2 DEBUG nova.virt.libvirt.vif [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:45:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-996046809',display_name='tempest-ServerRescueTestJSON-server-996046809',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-996046809',id=164,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:46:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='508fca18f76a46cba8f3b8b8d8169ef1',ramdisk_id='',reservation_id='r-zf6i9dmz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1772269056',owner_user_name='tempest-ServerRescueTestJSON-1772269056-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:46:39Z,user_data=None,user_id='f9c1a967b21e4d05a1e9cb54949a7527',uuid=10556f01-d62c-45b4-adb3-fa7d9f7d8004,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:47:38 np0005465988 nova_compute[236126]: 2025-10-02 12:47:38.231 2 DEBUG nova.network.os_vif_util [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Converting VIF {"id": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "address": "fa:16:3e:03:55:ae", "network": {"id": "54eb1883-31f7-40fa-864f-3516c27c1276", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-717258684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "508fca18f76a46cba8f3b8b8d8169ef1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63ba18a-72", "ovs_interfaceid": "d63ba18a-72cf-4a65-b12b-e9ddcba7161b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:47:38 np0005465988 nova_compute[236126]: 2025-10-02 12:47:38.231 2 DEBUG nova.network.os_vif_util [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:03:55:ae,bridge_name='br-int',has_traffic_filtering=True,id=d63ba18a-72cf-4a65-b12b-e9ddcba7161b,network=Network(54eb1883-31f7-40fa-864f-3516c27c1276),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63ba18a-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:47:38 np0005465988 nova_compute[236126]: 2025-10-02 12:47:38.234 2 DEBUG os_vif [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:55:ae,bridge_name='br-int',has_traffic_filtering=True,id=d63ba18a-72cf-4a65-b12b-e9ddcba7161b,network=Network(54eb1883-31f7-40fa-864f-3516c27c1276),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63ba18a-72') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:47:38 np0005465988 nova_compute[236126]: 2025-10-02 12:47:38.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:38 np0005465988 nova_compute[236126]: 2025-10-02 12:47:38.236 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd63ba18a-72, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:38 np0005465988 nova_compute[236126]: 2025-10-02 12:47:38.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:38 np0005465988 nova_compute[236126]: 2025-10-02 12:47:38.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:38 np0005465988 nova_compute[236126]: 2025-10-02 12:47:38.242 2 INFO os_vif [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:55:ae,bridge_name='br-int',has_traffic_filtering=True,id=d63ba18a-72cf-4a65-b12b-e9ddcba7161b,network=Network(54eb1883-31f7-40fa-864f-3516c27c1276),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63ba18a-72')#033[00m
Oct  2 08:47:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:47:39 np0005465988 nova_compute[236126]: 2025-10-02 12:47:39.045 2 DEBUG nova.compute.manager [req-9ca22a1f-c6b6-4429-93a8-d4ecab9cc04b req-34f0ecf0-2c0f-44fa-9f79-b63dd26e1b9d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received event network-vif-unplugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:47:39 np0005465988 nova_compute[236126]: 2025-10-02 12:47:39.045 2 DEBUG oslo_concurrency.lockutils [req-9ca22a1f-c6b6-4429-93a8-d4ecab9cc04b req-34f0ecf0-2c0f-44fa-9f79-b63dd26e1b9d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:39 np0005465988 nova_compute[236126]: 2025-10-02 12:47:39.048 2 DEBUG oslo_concurrency.lockutils [req-9ca22a1f-c6b6-4429-93a8-d4ecab9cc04b req-34f0ecf0-2c0f-44fa-9f79-b63dd26e1b9d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:39 np0005465988 nova_compute[236126]: 2025-10-02 12:47:39.048 2 DEBUG oslo_concurrency.lockutils [req-9ca22a1f-c6b6-4429-93a8-d4ecab9cc04b req-34f0ecf0-2c0f-44fa-9f79-b63dd26e1b9d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:39 np0005465988 nova_compute[236126]: 2025-10-02 12:47:39.048 2 DEBUG nova.compute.manager [req-9ca22a1f-c6b6-4429-93a8-d4ecab9cc04b req-34f0ecf0-2c0f-44fa-9f79-b63dd26e1b9d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] No waiting events found dispatching network-vif-unplugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:47:39 np0005465988 nova_compute[236126]: 2025-10-02 12:47:39.048 2 DEBUG nova.compute.manager [req-9ca22a1f-c6b6-4429-93a8-d4ecab9cc04b req-34f0ecf0-2c0f-44fa-9f79-b63dd26e1b9d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received event network-vif-unplugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:47:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:39.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:39 np0005465988 nova_compute[236126]: 2025-10-02 12:47:39.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:39.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:40 np0005465988 nova_compute[236126]: 2025-10-02 12:47:40.272 2 INFO nova.virt.libvirt.driver [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Deleting instance files /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004_del#033[00m
Oct  2 08:47:40 np0005465988 nova_compute[236126]: 2025-10-02 12:47:40.274 2 INFO nova.virt.libvirt.driver [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Deletion of /var/lib/nova/instances/10556f01-d62c-45b4-adb3-fa7d9f7d8004_del complete#033[00m
Oct  2 08:47:40 np0005465988 nova_compute[236126]: 2025-10-02 12:47:40.373 2 INFO nova.compute.manager [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Took 2.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:47:40 np0005465988 nova_compute[236126]: 2025-10-02 12:47:40.374 2 DEBUG oslo.service.loopingcall [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:47:40 np0005465988 nova_compute[236126]: 2025-10-02 12:47:40.374 2 DEBUG nova.compute.manager [-] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:47:40 np0005465988 nova_compute[236126]: 2025-10-02 12:47:40.375 2 DEBUG nova.network.neutron [-] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:47:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:47:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:41.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:47:41 np0005465988 nova_compute[236126]: 2025-10-02 12:47:41.188 2 DEBUG nova.compute.manager [req-577f8be7-70d9-4dc7-ac0c-46fe461115b7 req-07537d01-c43d-4e28-86f3-b8aa6d158b7e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received event network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:47:41 np0005465988 nova_compute[236126]: 2025-10-02 12:47:41.189 2 DEBUG oslo_concurrency.lockutils [req-577f8be7-70d9-4dc7-ac0c-46fe461115b7 req-07537d01-c43d-4e28-86f3-b8aa6d158b7e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:41 np0005465988 nova_compute[236126]: 2025-10-02 12:47:41.189 2 DEBUG oslo_concurrency.lockutils [req-577f8be7-70d9-4dc7-ac0c-46fe461115b7 req-07537d01-c43d-4e28-86f3-b8aa6d158b7e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:41 np0005465988 nova_compute[236126]: 2025-10-02 12:47:41.189 2 DEBUG oslo_concurrency.lockutils [req-577f8be7-70d9-4dc7-ac0c-46fe461115b7 req-07537d01-c43d-4e28-86f3-b8aa6d158b7e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:41 np0005465988 nova_compute[236126]: 2025-10-02 12:47:41.189 2 DEBUG nova.compute.manager [req-577f8be7-70d9-4dc7-ac0c-46fe461115b7 req-07537d01-c43d-4e28-86f3-b8aa6d158b7e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] No waiting events found dispatching network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:47:41 np0005465988 nova_compute[236126]: 2025-10-02 12:47:41.190 2 WARNING nova.compute.manager [req-577f8be7-70d9-4dc7-ac0c-46fe461115b7 req-07537d01-c43d-4e28-86f3-b8aa6d158b7e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received unexpected event network-vif-plugged-d63ba18a-72cf-4a65-b12b-e9ddcba7161b for instance with vm_state rescued and task_state deleting.#033[00m
Oct  2 08:47:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:47:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:41.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:47:41 np0005465988 nova_compute[236126]: 2025-10-02 12:47:41.850 2 DEBUG nova.network.neutron [-] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:47:42 np0005465988 nova_compute[236126]: 2025-10-02 12:47:42.024 2 DEBUG nova.compute.manager [req-b6296f07-8404-4bad-955a-ac790a120f91 req-be7cc44e-c171-4667-86d0-8135fe4caade d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Received event network-vif-deleted-d63ba18a-72cf-4a65-b12b-e9ddcba7161b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:47:42 np0005465988 nova_compute[236126]: 2025-10-02 12:47:42.024 2 INFO nova.compute.manager [req-b6296f07-8404-4bad-955a-ac790a120f91 req-be7cc44e-c171-4667-86d0-8135fe4caade d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Neutron deleted interface d63ba18a-72cf-4a65-b12b-e9ddcba7161b; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:47:42 np0005465988 nova_compute[236126]: 2025-10-02 12:47:42.025 2 DEBUG nova.network.neutron [req-b6296f07-8404-4bad-955a-ac790a120f91 req-be7cc44e-c171-4667-86d0-8135fe4caade d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:47:42 np0005465988 nova_compute[236126]: 2025-10-02 12:47:42.048 2 INFO nova.compute.manager [-] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Took 1.67 seconds to deallocate network for instance.#033[00m
Oct  2 08:47:42 np0005465988 nova_compute[236126]: 2025-10-02 12:47:42.169 2 DEBUG nova.compute.manager [req-b6296f07-8404-4bad-955a-ac790a120f91 req-be7cc44e-c171-4667-86d0-8135fe4caade d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Detach interface failed, port_id=d63ba18a-72cf-4a65-b12b-e9ddcba7161b, reason: Instance 10556f01-d62c-45b4-adb3-fa7d9f7d8004 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:47:42 np0005465988 nova_compute[236126]: 2025-10-02 12:47:42.217 2 DEBUG oslo_concurrency.lockutils [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:42 np0005465988 nova_compute[236126]: 2025-10-02 12:47:42.218 2 DEBUG oslo_concurrency.lockutils [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:42 np0005465988 nova_compute[236126]: 2025-10-02 12:47:42.346 2 DEBUG oslo_concurrency.processutils [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:47:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1205338556' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:47:42 np0005465988 nova_compute[236126]: 2025-10-02 12:47:42.796 2 DEBUG oslo_concurrency.processutils [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:42 np0005465988 nova_compute[236126]: 2025-10-02 12:47:42.805 2 DEBUG nova.compute.provider_tree [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:47:42 np0005465988 nova_compute[236126]: 2025-10-02 12:47:42.923 2 DEBUG nova.scheduler.client.report [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:47:43 np0005465988 nova_compute[236126]: 2025-10-02 12:47:43.015 2 DEBUG oslo_concurrency.lockutils [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.797s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:43.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:43 np0005465988 nova_compute[236126]: 2025-10-02 12:47:43.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:43 np0005465988 nova_compute[236126]: 2025-10-02 12:47:43.405 2 INFO nova.scheduler.client.report [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Deleted allocations for instance 10556f01-d62c-45b4-adb3-fa7d9f7d8004#033[00m
Oct  2 08:47:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:43.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:47:44 np0005465988 podman[313387]: 2025-10-02 12:47:44.533699378 +0000 UTC m=+0.066588104 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:47:44 np0005465988 podman[313386]: 2025-10-02 12:47:44.533846012 +0000 UTC m=+0.069170907 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:47:44 np0005465988 podman[313385]: 2025-10-02 12:47:44.563301159 +0000 UTC m=+0.102486784 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Oct  2 08:47:44 np0005465988 nova_compute[236126]: 2025-10-02 12:47:44.569 2 DEBUG oslo_concurrency.lockutils [None req-7064f25a-f7b3-4940-b5c9-e42ffd17649c f9c1a967b21e4d05a1e9cb54949a7527 508fca18f76a46cba8f3b8b8d8169ef1 - - default default] Lock "10556f01-d62c-45b4-adb3-fa7d9f7d8004" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:44 np0005465988 nova_compute[236126]: 2025-10-02 12:47:44.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:45.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:45 np0005465988 nova_compute[236126]: 2025-10-02 12:47:45.193 2 DEBUG oslo_concurrency.lockutils [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Acquiring lock "bf9e8de1-5081-4daa-9041-1d329e06be86" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:45 np0005465988 nova_compute[236126]: 2025-10-02 12:47:45.194 2 DEBUG oslo_concurrency.lockutils [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:45 np0005465988 nova_compute[236126]: 2025-10-02 12:47:45.349 2 DEBUG nova.objects.instance [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lazy-loading 'flavor' on Instance uuid bf9e8de1-5081-4daa-9041-1d329e06be86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:47:45 np0005465988 nova_compute[236126]: 2025-10-02 12:47:45.531 2 DEBUG oslo_concurrency.lockutils [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:45.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:46 np0005465988 nova_compute[236126]: 2025-10-02 12:47:46.950 2 DEBUG oslo_concurrency.lockutils [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Acquiring lock "bf9e8de1-5081-4daa-9041-1d329e06be86" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:46 np0005465988 nova_compute[236126]: 2025-10-02 12:47:46.951 2 DEBUG oslo_concurrency.lockutils [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:46 np0005465988 nova_compute[236126]: 2025-10-02 12:47:46.952 2 INFO nova.compute.manager [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Attaching volume 23d0d8ce-0e39-4b00-81b2-fea970998125 to /dev/vdb#033[00m
Oct  2 08:47:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:47.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.725 2 DEBUG os_brick.utils [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.727 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.737 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.738 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[7713345e-db08-4761-bf39-82ae3b5ff345]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.740 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.747 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.748 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[0b3938a5-06b0-4920-a720-95248ae95696]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.750 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.761 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.762 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[8edc6726-58bd-499c-8855-6d5bd6772992]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.764 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e43a8f-d33e-428b-84b3-3b2e5a316b52]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.764 2 DEBUG oslo_concurrency.processutils [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.809 2 DEBUG oslo_concurrency.processutils [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] CMD "nvme version" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.812 2 DEBUG os_brick.initiator.connectors.lightos [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.813 2 DEBUG os_brick.initiator.connectors.lightos [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.813 2 DEBUG os_brick.initiator.connectors.lightos [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.813 2 DEBUG os_brick.utils [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] <== get_connector_properties: return (87ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 08:47:47 np0005465988 nova_compute[236126]: 2025-10-02 12:47:47.814 2 DEBUG nova.virt.block_device [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Updating existing volume attachment record: 5b502dd5-055b-47c7-aea8-4ed473a43f8e _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 08:47:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:47:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:47.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:47:48 np0005465988 nova_compute[236126]: 2025-10-02 12:47:48.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:47:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:47:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:49.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:47:49 np0005465988 nova_compute[236126]: 2025-10-02 12:47:49.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:49.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:50 np0005465988 nova_compute[236126]: 2025-10-02 12:47:50.283 2 DEBUG nova.objects.instance [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lazy-loading 'flavor' on Instance uuid bf9e8de1-5081-4daa-9041-1d329e06be86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:47:50 np0005465988 nova_compute[236126]: 2025-10-02 12:47:50.399 2 DEBUG nova.virt.libvirt.driver [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Attempting to attach volume 23d0d8ce-0e39-4b00-81b2-fea970998125 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Oct  2 08:47:50 np0005465988 nova_compute[236126]: 2025-10-02 12:47:50.403 2 DEBUG nova.virt.libvirt.guest [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] attach device xml: <disk type="network" device="disk">
Oct  2 08:47:50 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:47:50 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-23d0d8ce-0e39-4b00-81b2-fea970998125">
Oct  2 08:47:50 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:47:50 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:47:50 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:47:50 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:47:50 np0005465988 nova_compute[236126]:  <auth username="openstack">
Oct  2 08:47:50 np0005465988 nova_compute[236126]:    <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:47:50 np0005465988 nova_compute[236126]:  </auth>
Oct  2 08:47:50 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:47:50 np0005465988 nova_compute[236126]:  <serial>23d0d8ce-0e39-4b00-81b2-fea970998125</serial>
Oct  2 08:47:50 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:47:50 np0005465988 nova_compute[236126]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:47:50 np0005465988 nova_compute[236126]: 2025-10-02 12:47:50.601 2 DEBUG nova.virt.libvirt.driver [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:47:50 np0005465988 nova_compute[236126]: 2025-10-02 12:47:50.602 2 DEBUG nova.virt.libvirt.driver [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:47:50 np0005465988 nova_compute[236126]: 2025-10-02 12:47:50.603 2 DEBUG nova.virt.libvirt.driver [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:47:50 np0005465988 nova_compute[236126]: 2025-10-02 12:47:50.603 2 DEBUG nova.virt.libvirt.driver [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] No VIF found with MAC fa:16:3e:05:51:c5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:47:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:47:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:51.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:47:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:51.235 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:47:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:51.235 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:47:51 np0005465988 nova_compute[236126]: 2025-10-02 12:47:51.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:51 np0005465988 nova_compute[236126]: 2025-10-02 12:47:51.420 2 DEBUG oslo_concurrency.lockutils [None req-f282dce3-8069-4c39-bcac-f90648a5d397 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 4.468s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:51.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:47:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:53.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:47:53 np0005465988 nova_compute[236126]: 2025-10-02 12:47:53.209 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409258.2079852, 10556f01-d62c-45b4-adb3-fa7d9f7d8004 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:47:53 np0005465988 nova_compute[236126]: 2025-10-02 12:47:53.210 2 INFO nova.compute.manager [-] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:47:53 np0005465988 nova_compute[236126]: 2025-10-02 12:47:53.239 2 DEBUG nova.compute.manager [None req-2c23a3f9-c9f5-4617-8449-1002c92d30d3 - - - - - -] [instance: 10556f01-d62c-45b4-adb3-fa7d9f7d8004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:47:53 np0005465988 nova_compute[236126]: 2025-10-02 12:47:53.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:53.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:47:54 np0005465988 ovn_controller[132601]: 2025-10-02T12:47:54Z|00759|binding|INFO|Releasing lport 28a672bd-7c4d-49bd-8937-0e065b62aa5f from this chassis (sb_readonly=0)
Oct  2 08:47:54 np0005465988 nova_compute[236126]: 2025-10-02 12:47:54.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:54 np0005465988 nova_compute[236126]: 2025-10-02 12:47:54.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:55.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:55.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:56 np0005465988 nova_compute[236126]: 2025-10-02 12:47:56.033 2 DEBUG oslo_concurrency.lockutils [None req-b4f24395-8039-484f-8b00-0b59782ac0ba a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Acquiring lock "bf9e8de1-5081-4daa-9041-1d329e06be86" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:56 np0005465988 nova_compute[236126]: 2025-10-02 12:47:56.033 2 DEBUG oslo_concurrency.lockutils [None req-b4f24395-8039-484f-8b00-0b59782ac0ba a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:56 np0005465988 nova_compute[236126]: 2025-10-02 12:47:56.054 2 INFO nova.compute.manager [None req-b4f24395-8039-484f-8b00-0b59782ac0ba a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Detaching volume 23d0d8ce-0e39-4b00-81b2-fea970998125#033[00m
Oct  2 08:47:56 np0005465988 nova_compute[236126]: 2025-10-02 12:47:56.267 2 INFO nova.virt.block_device [None req-b4f24395-8039-484f-8b00-0b59782ac0ba a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Attempting to driver detach volume 23d0d8ce-0e39-4b00-81b2-fea970998125 from mountpoint /dev/vdb#033[00m
Oct  2 08:47:56 np0005465988 nova_compute[236126]: 2025-10-02 12:47:56.277 2 DEBUG nova.virt.libvirt.driver [None req-b4f24395-8039-484f-8b00-0b59782ac0ba a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Attempting to detach device vdb from instance bf9e8de1-5081-4daa-9041-1d329e06be86 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:47:56 np0005465988 nova_compute[236126]: 2025-10-02 12:47:56.277 2 DEBUG nova.virt.libvirt.guest [None req-b4f24395-8039-484f-8b00-0b59782ac0ba a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:47:56 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:47:56 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-23d0d8ce-0e39-4b00-81b2-fea970998125">
Oct  2 08:47:56 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:47:56 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:47:56 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:47:56 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:47:56 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:47:56 np0005465988 nova_compute[236126]:  <serial>23d0d8ce-0e39-4b00-81b2-fea970998125</serial>
Oct  2 08:47:56 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Oct  2 08:47:56 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:47:56 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:47:56 np0005465988 nova_compute[236126]: 2025-10-02 12:47:56.355 2 INFO nova.virt.libvirt.driver [None req-b4f24395-8039-484f-8b00-0b59782ac0ba a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Successfully detached device vdb from instance bf9e8de1-5081-4daa-9041-1d329e06be86 from the persistent domain config.#033[00m
Oct  2 08:47:56 np0005465988 nova_compute[236126]: 2025-10-02 12:47:56.355 2 DEBUG nova.virt.libvirt.driver [None req-b4f24395-8039-484f-8b00-0b59782ac0ba a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance bf9e8de1-5081-4daa-9041-1d329e06be86 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Oct  2 08:47:56 np0005465988 nova_compute[236126]: 2025-10-02 12:47:56.356 2 DEBUG nova.virt.libvirt.guest [None req-b4f24395-8039-484f-8b00-0b59782ac0ba a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] detach device xml: <disk type="network" device="disk">
Oct  2 08:47:56 np0005465988 nova_compute[236126]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 08:47:56 np0005465988 nova_compute[236126]:  <source protocol="rbd" name="volumes/volume-23d0d8ce-0e39-4b00-81b2-fea970998125">
Oct  2 08:47:56 np0005465988 nova_compute[236126]:    <host name="192.168.122.100" port="6789"/>
Oct  2 08:47:56 np0005465988 nova_compute[236126]:    <host name="192.168.122.102" port="6789"/>
Oct  2 08:47:56 np0005465988 nova_compute[236126]:    <host name="192.168.122.101" port="6789"/>
Oct  2 08:47:56 np0005465988 nova_compute[236126]:  </source>
Oct  2 08:47:56 np0005465988 nova_compute[236126]:  <target dev="vdb" bus="virtio"/>
Oct  2 08:47:56 np0005465988 nova_compute[236126]:  <serial>23d0d8ce-0e39-4b00-81b2-fea970998125</serial>
Oct  2 08:47:56 np0005465988 nova_compute[236126]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Oct  2 08:47:56 np0005465988 nova_compute[236126]: </disk>
Oct  2 08:47:56 np0005465988 nova_compute[236126]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:47:56 np0005465988 nova_compute[236126]: 2025-10-02 12:47:56.422 2 DEBUG nova.virt.libvirt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Received event <DeviceRemovedEvent: 1759409276.4216921, bf9e8de1-5081-4daa-9041-1d329e06be86 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Oct  2 08:47:56 np0005465988 nova_compute[236126]: 2025-10-02 12:47:56.423 2 DEBUG nova.virt.libvirt.driver [None req-b4f24395-8039-484f-8b00-0b59782ac0ba a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance bf9e8de1-5081-4daa-9041-1d329e06be86 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Oct  2 08:47:56 np0005465988 nova_compute[236126]: 2025-10-02 12:47:56.425 2 INFO nova.virt.libvirt.driver [None req-b4f24395-8039-484f-8b00-0b59782ac0ba a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Successfully detached device vdb from instance bf9e8de1-5081-4daa-9041-1d329e06be86 from the live domain config.#033[00m
Oct  2 08:47:56 np0005465988 nova_compute[236126]: 2025-10-02 12:47:56.614 2 DEBUG nova.objects.instance [None req-b4f24395-8039-484f-8b00-0b59782ac0ba a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lazy-loading 'flavor' on Instance uuid bf9e8de1-5081-4daa-9041-1d329e06be86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:47:56 np0005465988 nova_compute[236126]: 2025-10-02 12:47:56.682 2 DEBUG oslo_concurrency.lockutils [None req-b4f24395-8039-484f-8b00-0b59782ac0ba a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:57.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:47:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:57.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:47:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:47:58.237 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:58 np0005465988 nova_compute[236126]: 2025-10-02 12:47:58.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e359 e359: 3 total, 3 up, 3 in
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #121. Immutable memtables: 0.
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.053058) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 75] Flushing memtable with next log file: 121
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409279053123, "job": 75, "event": "flush_started", "num_memtables": 1, "num_entries": 2398, "num_deletes": 253, "total_data_size": 5639396, "memory_usage": 5709632, "flush_reason": "Manual Compaction"}
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 75] Level-0 flush table #122: started
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409279161590, "cf_name": "default", "job": 75, "event": "table_file_creation", "file_number": 122, "file_size": 3685261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 60889, "largest_seqno": 63281, "table_properties": {"data_size": 3675699, "index_size": 5992, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20366, "raw_average_key_size": 20, "raw_value_size": 3656361, "raw_average_value_size": 3693, "num_data_blocks": 261, "num_entries": 990, "num_filter_entries": 990, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759409071, "oldest_key_time": 1759409071, "file_creation_time": 1759409279, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 122, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 75] Flush lasted 108587 microseconds, and 10638 cpu microseconds.
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:47:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:47:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:47:59.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.161647) [db/flush_job.cc:967] [default] [JOB 75] Level-0 flush table #122: 3685261 bytes OK
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.161674) [db/memtable_list.cc:519] [default] Level-0 commit table #122 started
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.217800) [db/memtable_list.cc:722] [default] Level-0 commit table #122: memtable #1 done
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.217847) EVENT_LOG_v1 {"time_micros": 1759409279217834, "job": 75, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.217882) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 75] Try to delete WAL files size 5628921, prev total WAL file size 5634517, number of live WAL files 2.
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000118.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.220591) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035323731' seq:72057594037927935, type:22 .. '7061786F730035353233' seq:0, type:0; will stop at (end)
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 76] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 75 Base level 0, inputs: [122(3598KB)], [120(11MB)]
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409279220645, "job": 76, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [122], "files_L6": [120], "score": -1, "input_data_size": 15780801, "oldest_snapshot_seqno": -1}
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 76] Generated table #123: 8926 keys, 13811339 bytes, temperature: kUnknown
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409279639028, "cf_name": "default", "job": 76, "event": "table_file_creation", "file_number": 123, "file_size": 13811339, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13750351, "index_size": 37580, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22341, "raw_key_size": 230761, "raw_average_key_size": 25, "raw_value_size": 13590545, "raw_average_value_size": 1522, "num_data_blocks": 1473, "num_entries": 8926, "num_filter_entries": 8926, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759409279, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 123, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.639283) [db/compaction/compaction_job.cc:1663] [default] [JOB 76] Compacted 1@0 + 1@6 files to L6 => 13811339 bytes
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.678113) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 37.7 rd, 33.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 11.5 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(8.0) write-amplify(3.7) OK, records in: 9451, records dropped: 525 output_compression: NoCompression
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.678153) EVENT_LOG_v1 {"time_micros": 1759409279678140, "job": 76, "event": "compaction_finished", "compaction_time_micros": 418462, "compaction_time_cpu_micros": 52775, "output_level": 6, "num_output_files": 1, "total_output_size": 13811339, "num_input_records": 9451, "num_output_records": 8926, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000122.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409279679179, "job": 76, "event": "table_file_deletion", "file_number": 122}
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000120.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409279681576, "job": 76, "event": "table_file_deletion", "file_number": 120}
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.220463) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.681722) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.681728) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.681729) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.681731) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:47:59.681732) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:47:59 np0005465988 nova_compute[236126]: 2025-10-02 12:47:59.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:47:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:47:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:47:59.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:47:59 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct  2 08:48:00 np0005465988 nova_compute[236126]: 2025-10-02 12:48:00.688 2 DEBUG nova.compute.manager [None req-c98c8a57-56e9-46b5-a84a-9338affcc218 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:48:00 np0005465988 nova_compute[236126]: 2025-10-02 12:48:00.781 2 INFO nova.compute.manager [None req-c98c8a57-56e9-46b5-a84a-9338affcc218 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] instance snapshotting#033[00m
Oct  2 08:48:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct  2 08:48:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:48:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:48:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:48:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:01.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:01 np0005465988 nova_compute[236126]: 2025-10-02 12:48:01.358 2 INFO nova.virt.libvirt.driver [None req-c98c8a57-56e9-46b5-a84a-9338affcc218 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Beginning live snapshot process#033[00m
Oct  2 08:48:01 np0005465988 nova_compute[236126]: 2025-10-02 12:48:01.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:01 np0005465988 nova_compute[236126]: 2025-10-02 12:48:01.506 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:01 np0005465988 nova_compute[236126]: 2025-10-02 12:48:01.507 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:01 np0005465988 nova_compute[236126]: 2025-10-02 12:48:01.507 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:01 np0005465988 nova_compute[236126]: 2025-10-02 12:48:01.507 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:48:01 np0005465988 nova_compute[236126]: 2025-10-02 12:48:01.508 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:01 np0005465988 podman[313785]: 2025-10-02 12:48:01.543015959 +0000 UTC m=+0.071527304 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  2 08:48:01 np0005465988 nova_compute[236126]: 2025-10-02 12:48:01.654 2 DEBUG nova.virt.libvirt.imagebackend [None req-c98c8a57-56e9-46b5-a84a-9338affcc218 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] No parent info for c2d0c2bc-fe21-4689-86ae-d6728c15874c; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Oct  2 08:48:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:01.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:48:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1578206844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.025 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.125 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000a8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.126 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000a8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.182 2 DEBUG nova.storage.rbd_utils [None req-c98c8a57-56e9-46b5-a84a-9338affcc218 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] creating snapshot(34cc59cb4bbc4155980002c847798341) on rbd image(bf9e8de1-5081-4daa-9041-1d329e06be86_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.317 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.318 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3954MB free_disk=20.853466033935547GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.318 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.319 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.399 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance bf9e8de1-5081-4daa-9041-1d329e06be86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.399 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.399 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.455 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:48:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:48:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3044533313' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.880 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.886 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.915 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.952 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:48:02 np0005465988 nova_compute[236126]: 2025-10-02 12:48:02.953 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:48:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:03.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:03 np0005465988 nova_compute[236126]: 2025-10-02 12:48:03.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e360 e360: 3 total, 3 up, 3 in
Oct  2 08:48:03 np0005465988 nova_compute[236126]: 2025-10-02 12:48:03.467 2 DEBUG nova.storage.rbd_utils [None req-c98c8a57-56e9-46b5-a84a-9338affcc218 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] cloning vms/bf9e8de1-5081-4daa-9041-1d329e06be86_disk@34cc59cb4bbc4155980002c847798341 to images/a62d470f-f755-4a5a-b8e5-0dc1be0600d1 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Oct  2 08:48:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:48:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:03.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:48:03 np0005465988 nova_compute[236126]: 2025-10-02 12:48:03.983 2 DEBUG nova.storage.rbd_utils [None req-c98c8a57-56e9-46b5-a84a-9338affcc218 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] flattening images/a62d470f-f755-4a5a-b8e5-0dc1be0600d1 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Oct  2 08:48:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:48:04 np0005465988 nova_compute[236126]: 2025-10-02 12:48:04.394 2 DEBUG nova.storage.rbd_utils [None req-c98c8a57-56e9-46b5-a84a-9338affcc218 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] removing snapshot(34cc59cb4bbc4155980002c847798341) on rbd image(bf9e8de1-5081-4daa-9041-1d329e06be86_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Oct  2 08:48:04 np0005465988 nova_compute[236126]: 2025-10-02 12:48:04.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:04 np0005465988 nova_compute[236126]: 2025-10-02 12:48:04.953 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:48:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:05.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e361 e361: 3 total, 3 up, 3 in
Oct  2 08:48:05 np0005465988 nova_compute[236126]: 2025-10-02 12:48:05.465 2 DEBUG nova.storage.rbd_utils [None req-c98c8a57-56e9-46b5-a84a-9338affcc218 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] creating snapshot(snap) on rbd image(a62d470f-f755-4a5a-b8e5-0dc1be0600d1) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Oct  2 08:48:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000057s ======
Oct  2 08:48:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:05.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Oct  2 08:48:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e362 e362: 3 total, 3 up, 3 in
Oct  2 08:48:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:07.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:07 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:48:07 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:48:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:48:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:07.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:48:08 np0005465988 nova_compute[236126]: 2025-10-02 12:48:08.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e362 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:48:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:09.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:09 np0005465988 nova_compute[236126]: 2025-10-02 12:48:09.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:09 np0005465988 nova_compute[236126]: 2025-10-02 12:48:09.445 2 INFO nova.virt.libvirt.driver [None req-c98c8a57-56e9-46b5-a84a-9338affcc218 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Snapshot image upload complete
Oct  2 08:48:09 np0005465988 nova_compute[236126]: 2025-10-02 12:48:09.446 2 INFO nova.compute.manager [None req-c98c8a57-56e9-46b5-a84a-9338affcc218 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Took 8.66 seconds to snapshot the instance on the hypervisor.
Oct  2 08:48:09 np0005465988 nova_compute[236126]: 2025-10-02 12:48:09.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:48:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:09.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:48:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:10Z|00760|binding|INFO|Releasing lport 28a672bd-7c4d-49bd-8937-0e065b62aa5f from this chassis (sb_readonly=0)
Oct  2 08:48:10 np0005465988 nova_compute[236126]: 2025-10-02 12:48:10.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:11.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:11.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:12 np0005465988 nova_compute[236126]: 2025-10-02 12:48:12.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:48:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:48:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:13.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:48:13 np0005465988 nova_compute[236126]: 2025-10-02 12:48:13.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:13 np0005465988 nova_compute[236126]: 2025-10-02 12:48:13.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:48:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:13.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:48:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e362 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:48:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e363 e363: 3 total, 3 up, 3 in
Oct  2 08:48:14 np0005465988 nova_compute[236126]: 2025-10-02 12:48:14.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:48:14 np0005465988 nova_compute[236126]: 2025-10-02 12:48:14.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 08:48:14 np0005465988 podman[314070]: 2025-10-02 12:48:14.64790333 +0000 UTC m=+0.069966559 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:48:14 np0005465988 podman[314071]: 2025-10-02 12:48:14.656152005 +0000 UTC m=+0.070870445 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:48:14 np0005465988 podman[314073]: 2025-10-02 12:48:14.695109592 +0000 UTC m=+0.102587017 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  2 08:48:14 np0005465988 nova_compute[236126]: 2025-10-02 12:48:14.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:48:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:15.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:48:15 np0005465988 nova_compute[236126]: 2025-10-02 12:48:15.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:48:15 np0005465988 nova_compute[236126]: 2025-10-02 12:48:15.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:48:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:15.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:16 np0005465988 nova_compute[236126]: 2025-10-02 12:48:16.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:48:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:17.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:48:17 np0005465988 nova_compute[236126]: 2025-10-02 12:48:17.635 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:48:17 np0005465988 nova_compute[236126]: 2025-10-02 12:48:17.635 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:48:17 np0005465988 nova_compute[236126]: 2025-10-02 12:48:17.657 2 DEBUG nova.compute.manager [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:48:17 np0005465988 nova_compute[236126]: 2025-10-02 12:48:17.765 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:48:17 np0005465988 nova_compute[236126]: 2025-10-02 12:48:17.766 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:48:17 np0005465988 nova_compute[236126]: 2025-10-02 12:48:17.777 2 DEBUG nova.virt.hardware [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:48:17 np0005465988 nova_compute[236126]: 2025-10-02 12:48:17.778 2 INFO nova.compute.claims [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:48:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:17.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.024 2 DEBUG oslo_concurrency.processutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:48:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:48:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/949741358' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.536 2 DEBUG oslo_concurrency.processutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.544 2 DEBUG nova.compute.provider_tree [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.565 2 DEBUG nova.scheduler.client.report [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.588 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.822s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.589 2 DEBUG nova.compute.manager [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.649 2 DEBUG nova.compute.manager [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.649 2 DEBUG nova.network.neutron [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.672 2 INFO nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.697 2 DEBUG nova.compute.manager [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.833 2 DEBUG nova.compute.manager [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.834 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.835 2 INFO nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Creating image(s)#033[00m
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.866 2 DEBUG nova.storage.rbd_utils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image 439392e5-66ae-4162-a7e5-077f87ca558b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.899 2 DEBUG nova.storage.rbd_utils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image 439392e5-66ae-4162-a7e5-077f87ca558b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.933 2 DEBUG nova.storage.rbd_utils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image 439392e5-66ae-4162-a7e5-077f87ca558b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.938 2 DEBUG oslo_concurrency.processutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:18 np0005465988 nova_compute[236126]: 2025-10-02 12:48:18.972 2 DEBUG nova.policy [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6785ffe5d6554514b4ed9fd47665eca0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6a442bc513e14406b73e96e70396e6c3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:48:19 np0005465988 nova_compute[236126]: 2025-10-02 12:48:19.008 2 DEBUG oslo_concurrency.processutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
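The `qemu-img info --output=json` call above (run under `oslo_concurrency.prlimit` to cap the child's address space and CPU time) returns a JSON document that Nova parses for the cached base image's format and virtual size. A sketch of that parsing against a hypothetical sample of the output (the values below are illustrative, not taken from this host):

```python
import json

# Hypothetical sample of `qemu-img info --force-share --output=json`
# for a cached base image; "format" and "virtual-size" are the fields
# the image backend cares about.
sample = '''{
  "virtual-size": 117440512,
  "filename": "/var/lib/nova/instances/_base/example-base-image",
  "format": "raw",
  "actual-size": 117440512
}'''

info = json.loads(sample)
fmt = info["format"]               # backend format of the cached base image
virt_bytes = info["virtual-size"]  # guest-visible size, in bytes
```

The `--force-share` flag lets the probe succeed even while another process holds the image open, which matters on a busy compute node.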
Oct  2 08:48:19 np0005465988 nova_compute[236126]: 2025-10-02 12:48:19.009 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:19 np0005465988 nova_compute[236126]: 2025-10-02 12:48:19.010 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:19 np0005465988 nova_compute[236126]: 2025-10-02 12:48:19.011 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
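The acquire/release pair above serializes base-image fetches per cache key (the image's content hash), so two concurrent boots of the same image do not both download it; the "waited"/"held" durations in the lock lines come from timing the acquire and the critical section. A simplified sketch of that pattern, assuming hypothetical names (oslo.concurrency's real implementation lives in `lockutils.py`):

```python
import threading
import time
from collections import defaultdict

# One lock per base-image cache key, created on first use.
_locks = defaultdict(threading.Lock)

def synchronized_fetch(cache_key, fetch_func):
    """Run fetch_func under the per-key lock; report wait and hold times."""
    lock = _locks[cache_key]
    t0 = time.monotonic()
    with lock:
        waited = time.monotonic() - t0   # logged as "waited N.NNNs"
        t1 = time.monotonic()
        result = fetch_func()
        held = time.monotonic() - t1     # logged as "held N.NNNs"
    return result, waited, held
```

Here the lock is released after 0.000s because the base image already exists, so `fetch_func_sync` has nothing to do.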
Oct  2 08:48:19 np0005465988 nova_compute[236126]: 2025-10-02 12:48:19.041 2 DEBUG nova.storage.rbd_utils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image 439392e5-66ae-4162-a7e5-077f87ca558b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:48:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:48:19 np0005465988 nova_compute[236126]: 2025-10-02 12:48:19.046 2 DEBUG oslo_concurrency.processutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 439392e5-66ae-4162-a7e5-077f87ca558b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:48:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:19.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:48:19 np0005465988 nova_compute[236126]: 2025-10-02 12:48:19.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:19 np0005465988 nova_compute[236126]: 2025-10-02 12:48:19.848 2 DEBUG nova.network.neutron [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Successfully created port: 7020ab2d-943e-4985-b442-c6584c56c0d2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:48:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:19.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:20 np0005465988 nova_compute[236126]: 2025-10-02 12:48:20.024 2 DEBUG oslo_concurrency.processutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 439392e5-66ae-4162-a7e5-077f87ca558b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.978s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:20 np0005465988 nova_compute[236126]: 2025-10-02 12:48:20.114 2 DEBUG nova.storage.rbd_utils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] resizing rbd image 439392e5-66ae-4162-a7e5-077f87ca558b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
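The resize target above follows directly from the flavor: `m1.nano` (visible in the `_get_guest_xml` entry further down) has `root_gb=1`, and Nova grows the freshly imported RBD image to the root disk size in bytes. A one-line check, assuming GiB semantics:

```python
root_gb = 1                          # from flavor m1.nano in this log
target_bytes = root_gb * 1024 ** 3   # Nova sizes root disks in GiB
# 1 GiB == 1073741824 bytes, matching the rbd resize target logged above
```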
Oct  2 08:48:20 np0005465988 nova_compute[236126]: 2025-10-02 12:48:20.238 2 DEBUG nova.objects.instance [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'migration_context' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:48:20 np0005465988 nova_compute[236126]: 2025-10-02 12:48:20.258 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:48:20 np0005465988 nova_compute[236126]: 2025-10-02 12:48:20.258 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Ensure instance console log exists: /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:48:20 np0005465988 nova_compute[236126]: 2025-10-02 12:48:20.259 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:20 np0005465988 nova_compute[236126]: 2025-10-02 12:48:20.259 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:20 np0005465988 nova_compute[236126]: 2025-10-02 12:48:20.259 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:21 np0005465988 nova_compute[236126]: 2025-10-02 12:48:21.036 2 DEBUG nova.network.neutron [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Successfully updated port: 7020ab2d-943e-4985-b442-c6584c56c0d2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:48:21 np0005465988 nova_compute[236126]: 2025-10-02 12:48:21.079 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:48:21 np0005465988 nova_compute[236126]: 2025-10-02 12:48:21.080 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquired lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:48:21 np0005465988 nova_compute[236126]: 2025-10-02 12:48:21.080 2 DEBUG nova.network.neutron [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:48:21 np0005465988 nova_compute[236126]: 2025-10-02 12:48:21.177 2 DEBUG nova.compute.manager [req-81e03dd3-0408-4b82-8608-11a637a69d53 req-6a514815-63c8-431f-8451-226a5f0cd276 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-changed-7020ab2d-943e-4985-b442-c6584c56c0d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:48:21 np0005465988 nova_compute[236126]: 2025-10-02 12:48:21.178 2 DEBUG nova.compute.manager [req-81e03dd3-0408-4b82-8608-11a637a69d53 req-6a514815-63c8-431f-8451-226a5f0cd276 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Refreshing instance network info cache due to event network-changed-7020ab2d-943e-4985-b442-c6584c56c0d2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:48:21 np0005465988 nova_compute[236126]: 2025-10-02 12:48:21.178 2 DEBUG oslo_concurrency.lockutils [req-81e03dd3-0408-4b82-8608-11a637a69d53 req-6a514815-63c8-431f-8451-226a5f0cd276 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:48:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:48:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:21.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:48:21 np0005465988 nova_compute[236126]: 2025-10-02 12:48:21.280 2 DEBUG nova.network.neutron [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:48:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:48:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:21.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:48:22 np0005465988 nova_compute[236126]: 2025-10-02 12:48:22.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:22 np0005465988 nova_compute[236126]: 2025-10-02 12:48:22.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:48:22 np0005465988 nova_compute[236126]: 2025-10-02 12:48:22.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:48:22 np0005465988 nova_compute[236126]: 2025-10-02 12:48:22.502 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:48:22 np0005465988 nova_compute[236126]: 2025-10-02 12:48:22.842 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:48:22 np0005465988 nova_compute[236126]: 2025-10-02 12:48:22.843 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:48:22 np0005465988 nova_compute[236126]: 2025-10-02 12:48:22.843 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:48:22 np0005465988 nova_compute[236126]: 2025-10-02 12:48:22.844 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bf9e8de1-5081-4daa-9041-1d329e06be86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:48:23 np0005465988 nova_compute[236126]: 2025-10-02 12:48:23.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:48:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:23.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:48:23 np0005465988 nova_compute[236126]: 2025-10-02 12:48:23.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:23.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.129 2 DEBUG nova.network.neutron [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Updating instance_info_cache with network_info: [{"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
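The `network_info` blob in the cache update above is a list of VIF dicts; the instance's fixed IP is nested under `network.subnets[].ips[]`. A sketch pulling addresses out of such a structure, trimmed to the fields used and populated with the values from this log line:

```python
# One VIF from the instance_info_cache update, reduced to the
# addressing fields.
vif = {
    "id": "7020ab2d-943e-4985-b442-c6584c56c0d2",
    "address": "fa:16:3e:5b:a9:95",
    "network": {"subnets": [{"cidr": "10.100.0.0/28",
                             "ips": [{"address": "10.100.0.9",
                                      "type": "fixed"}]}]},
}

def fixed_ips(network_info):
    """Collect every fixed IP across all VIFs, subnets, and IP entries."""
    return [ip["address"]
            for v in network_info
            for subnet in v["network"]["subnets"]
            for ip in subnet["ips"]
            if ip["type"] == "fixed"]
```

For this instance the list contains just `10.100.0.9`, the address Neutron allocated on the tempest test network.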
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.181 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Releasing lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.182 2 DEBUG nova.compute.manager [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Instance network_info: |[{"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.182 2 DEBUG oslo_concurrency.lockutils [req-81e03dd3-0408-4b82-8608-11a637a69d53 req-6a514815-63c8-431f-8451-226a5f0cd276 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.183 2 DEBUG nova.network.neutron [req-81e03dd3-0408-4b82-8608-11a637a69d53 req-6a514815-63c8-431f-8451-226a5f0cd276 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Refreshing network info cache for port 7020ab2d-943e-4985-b442-c6584c56c0d2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.185 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Start _get_guest_xml network_info=[{"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.191 2 WARNING nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.196 2 DEBUG nova.virt.libvirt.host [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.197 2 DEBUG nova.virt.libvirt.host [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.202 2 DEBUG nova.virt.libvirt.host [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.203 2 DEBUG nova.virt.libvirt.host [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.205 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.205 2 DEBUG nova.virt.hardware [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.206 2 DEBUG nova.virt.hardware [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.206 2 DEBUG nova.virt.hardware [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.206 2 DEBUG nova.virt.hardware [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.207 2 DEBUG nova.virt.hardware [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.207 2 DEBUG nova.virt.hardware [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.207 2 DEBUG nova.virt.hardware [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.208 2 DEBUG nova.virt.hardware [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.208 2 DEBUG nova.virt.hardware [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.208 2 DEBUG nova.virt.hardware [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.208 2 DEBUG nova.virt.hardware [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.212 2 DEBUG oslo_concurrency.processutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:48:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/574974802' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.660 2 DEBUG oslo_concurrency.processutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.704 2 DEBUG nova.storage.rbd_utils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image 439392e5-66ae-4162-a7e5-077f87ca558b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.710 2 DEBUG oslo_concurrency.processutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:24 np0005465988 nova_compute[236126]: 2025-10-02 12:48:24.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:48:25 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3848049764' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:48:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:25.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.371 2 DEBUG oslo_concurrency.processutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.661s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.372 2 DEBUG nova.virt.libvirt.vif [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:48:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-91664647',display_name='tempest-ServerStableDeviceRescueTest-server-91664647',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-91664647',id=173,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6a442bc513e14406b73e96e70396e6c3',ramdisk_id='',reservation_id='r-x1ok7izy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-454391960',owner_user_name='tempest-ServerStableDeviceRescueTest-454391960-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:48:18Z,user_data=None,user_id='6785ffe5d6554514b4ed9fd47665eca0',uuid=439392e5-66ae-4162-a7e5-077f87ca558b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.373 2 DEBUG nova.network.os_vif_util [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converting VIF {"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.374 2 DEBUG nova.network.os_vif_util [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:a9:95,bridge_name='br-int',has_traffic_filtering=True,id=7020ab2d-943e-4985-b442-c6584c56c0d2,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7020ab2d-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.376 2 DEBUG nova.objects.instance [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.561 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  <uuid>439392e5-66ae-4162-a7e5-077f87ca558b</uuid>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  <name>instance-000000ad</name>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-91664647</nova:name>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:48:24</nova:creationTime>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <nova:user uuid="6785ffe5d6554514b4ed9fd47665eca0">tempest-ServerStableDeviceRescueTest-454391960-project-member</nova:user>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <nova:project uuid="6a442bc513e14406b73e96e70396e6c3">tempest-ServerStableDeviceRescueTest-454391960</nova:project>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <nova:port uuid="7020ab2d-943e-4985-b442-c6584c56c0d2">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <entry name="serial">439392e5-66ae-4162-a7e5-077f87ca558b</entry>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <entry name="uuid">439392e5-66ae-4162-a7e5-077f87ca558b</entry>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/439392e5-66ae-4162-a7e5-077f87ca558b_disk">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/439392e5-66ae-4162-a7e5-077f87ca558b_disk.config">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:5b:a9:95"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <target dev="tap7020ab2d-94"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/console.log" append="off"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:48:25 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:48:25 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:48:25 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:48:25 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.561 2 DEBUG nova.compute.manager [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Preparing to wait for external event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.562 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.562 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.562 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.563 2 DEBUG nova.virt.libvirt.vif [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:48:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-91664647',display_name='tempest-ServerStableDeviceRescueTest-server-91664647',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-91664647',id=173,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6a442bc513e14406b73e96e70396e6c3',ramdisk_id='',reservation_id='r-x1ok7izy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-454391960',owner_user_name='tempest-ServerStableDeviceRescueTest-454391960-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:48:18Z,user_data=None,user_id='6785ffe5d6554514b4ed9fd47665eca0',uuid=439392e5-66ae-4162-a7e5-077f87ca558b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.563 2 DEBUG nova.network.os_vif_util [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converting VIF {"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.564 2 DEBUG nova.network.os_vif_util [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:a9:95,bridge_name='br-int',has_traffic_filtering=True,id=7020ab2d-943e-4985-b442-c6584c56c0d2,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7020ab2d-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.565 2 DEBUG os_vif [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:a9:95,bridge_name='br-int',has_traffic_filtering=True,id=7020ab2d-943e-4985-b442-c6584c56c0d2,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7020ab2d-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.566 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.566 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.570 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7020ab2d-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.570 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7020ab2d-94, col_values=(('external_ids', {'iface-id': '7020ab2d-943e-4985-b442-c6584c56c0d2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:a9:95', 'vm-uuid': '439392e5-66ae-4162-a7e5-077f87ca558b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:48:25 np0005465988 NetworkManager[45041]: <info>  [1759409305.5733] manager: (tap7020ab2d-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/341)
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.585 2 INFO os_vif [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:a9:95,bridge_name='br-int',has_traffic_filtering=True,id=7020ab2d-943e-4985-b442-c6584c56c0d2,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7020ab2d-94')
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.821 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.821 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.821 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No VIF found with MAC fa:16:3e:5b:a9:95, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.822 2 INFO nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Using config drive
Oct  2 08:48:25 np0005465988 nova_compute[236126]: 2025-10-02 12:48:25.857 2 DEBUG nova.storage.rbd_utils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image 439392e5-66ae-4162-a7e5-077f87ca558b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:48:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:25.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:26 np0005465988 nova_compute[236126]: 2025-10-02 12:48:26.340 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Updating instance_info_cache with network_info: [{"id": "3ee9f78f-884b-40ae-b226-ed5161be4522", "address": "fa:16:3e:05:51:c5", "network": {"id": "0739dafe-4d9b-4048-ac97-c017fd298447", "bridge": "br-int", "label": "tempest-TestStampPattern-656915325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a837417d42da439cb794b4295bca2cee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ee9f78f-88", "ovs_interfaceid": "3ee9f78f-884b-40ae-b226-ed5161be4522", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:48:26 np0005465988 nova_compute[236126]: 2025-10-02 12:48:26.382 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:48:26 np0005465988 nova_compute[236126]: 2025-10-02 12:48:26.382 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct  2 08:48:26 np0005465988 nova_compute[236126]: 2025-10-02 12:48:26.546 2 INFO nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Creating config drive at /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/disk.config
Oct  2 08:48:26 np0005465988 nova_compute[236126]: 2025-10-02 12:48:26.553 2 DEBUG oslo_concurrency.processutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp27vs4jtg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:48:26 np0005465988 nova_compute[236126]: 2025-10-02 12:48:26.709 2 DEBUG oslo_concurrency.processutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp27vs4jtg" returned: 0 in 0.156s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:48:26 np0005465988 nova_compute[236126]: 2025-10-02 12:48:26.754 2 DEBUG nova.storage.rbd_utils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image 439392e5-66ae-4162-a7e5-077f87ca558b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:48:26 np0005465988 nova_compute[236126]: 2025-10-02 12:48:26.759 2 DEBUG oslo_concurrency.processutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/disk.config 439392e5-66ae-4162-a7e5-077f87ca558b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:48:27 np0005465988 nova_compute[236126]: 2025-10-02 12:48:27.135 2 DEBUG oslo_concurrency.processutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/disk.config 439392e5-66ae-4162-a7e5-077f87ca558b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.376s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:48:27 np0005465988 nova_compute[236126]: 2025-10-02 12:48:27.137 2 INFO nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Deleting local config drive /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/disk.config because it was imported into RBD.
Oct  2 08:48:27 np0005465988 kernel: tap7020ab2d-94: entered promiscuous mode
Oct  2 08:48:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:48:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:27.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:48:27 np0005465988 NetworkManager[45041]: <info>  [1759409307.2085] manager: (tap7020ab2d-94): new Tun device (/org/freedesktop/NetworkManager/Devices/342)
Oct  2 08:48:27 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:27Z|00761|binding|INFO|Claiming lport 7020ab2d-943e-4985-b442-c6584c56c0d2 for this chassis.
Oct  2 08:48:27 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:27Z|00762|binding|INFO|7020ab2d-943e-4985-b442-c6584c56c0d2: Claiming fa:16:3e:5b:a9:95 10.100.0.9
Oct  2 08:48:27 np0005465988 nova_compute[236126]: 2025-10-02 12:48:27.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:27 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:27Z|00763|binding|INFO|Setting lport 7020ab2d-943e-4985-b442-c6584c56c0d2 ovn-installed in OVS
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.231 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:a9:95 10.100.0.9'], port_security=['fa:16:3e:5b:a9:95 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '439392e5-66ae-4162-a7e5-077f87ca558b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7020ab2d-943e-4985-b442-c6584c56c0d2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:48:27 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:27Z|00764|binding|INFO|Setting lport 7020ab2d-943e-4985-b442-c6584c56c0d2 up in Southbound
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.232 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7020ab2d-943e-4985-b442-c6584c56c0d2 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 bound to our chassis
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.234 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e4ff16-1388-40c7-a27a-83a3b4869808
Oct  2 08:48:27 np0005465988 nova_compute[236126]: 2025-10-02 12:48:27.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.250 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[43d13e15-84d0-406a-b1bc-dfdd4074611e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.251 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap48e4ff16-11 in ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.253 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap48e4ff16-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.254 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ef2b505b-10fa-4d94-82df-479a13611d8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.254 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[48443c92-a6ee-4284-b61e-9b28a2abaad4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:48:27 np0005465988 systemd-udevd[314489]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:48:27 np0005465988 systemd-machined[192594]: New machine qemu-80-instance-000000ad.
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.268 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[1f5e5f74-606a-4f5f-9e36-5f5e73470f13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:48:27 np0005465988 systemd[1]: Started Virtual Machine qemu-80-instance-000000ad.
Oct  2 08:48:27 np0005465988 NetworkManager[45041]: <info>  [1759409307.2751] device (tap7020ab2d-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:48:27 np0005465988 NetworkManager[45041]: <info>  [1759409307.2759] device (tap7020ab2d-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.293 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d700685a-9e57-4ce8-91bb-299762040c9e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.324 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[6b413722-fa58-48f7-8e9d-23e7417f45ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.330 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c995c74e-71c0-4838-85b7-fce458e81904]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:48:27 np0005465988 NetworkManager[45041]: <info>  [1759409307.3322] manager: (tap48e4ff16-10): new Veth device (/org/freedesktop/NetworkManager/Devices/343)
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.366 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[1320b782-54d5-4389-92e5-17b0427bb437]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.370 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4201f644-fc19-4a8e-b32a-79bbef362409]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:48:27 np0005465988 NetworkManager[45041]: <info>  [1759409307.3932] device (tap48e4ff16-10): carrier: link connected
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.393 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.393 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.394 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.398 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[7019202a-f63b-4ff5-9dad-8152e091197c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.413 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[77a704e1-ca58-42da-9dc5-5ad87fcf56e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 225], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736772, 'reachable_time': 17395, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 314521, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.429 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f9ca4ecd-d0a8-4531-be2b-376e38c2e937]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8b:53bc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 736772, 'tstamp': 736772}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 314522, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.446 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1c08c054-1309-4589-9b3b-9ad9735e895c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 225], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736772, 'reachable_time': 17395, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 314523, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.485 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d116a5c3-77ef-4558-ad4b-650e76d29b10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.551 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a751a6a2-7b54-43c9-a802-0c7d2f1028cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.553 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.557 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.558 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e4ff16-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:48:27 np0005465988 nova_compute[236126]: 2025-10-02 12:48:27.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:27 np0005465988 NetworkManager[45041]: <info>  [1759409307.5608] manager: (tap48e4ff16-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/344)
Oct  2 08:48:27 np0005465988 kernel: tap48e4ff16-10: entered promiscuous mode
Oct  2 08:48:27 np0005465988 nova_compute[236126]: 2025-10-02 12:48:27.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.567 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e4ff16-10, col_values=(('external_ids', {'iface-id': '1fc80788-89b8-413a-b0b0-d36f1a11a2b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:48:27 np0005465988 nova_compute[236126]: 2025-10-02 12:48:27.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:27 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:27Z|00765|binding|INFO|Releasing lport 1fc80788-89b8-413a-b0b0-d36f1a11a2b1 from this chassis (sb_readonly=0)
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.574 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48e4ff16-1388-40c7-a27a-83a3b4869808.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48e4ff16-1388-40c7-a27a-83a3b4869808.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.575 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[43874005-b07c-4dd0-bea5-142a9ef13782]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.576 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-48e4ff16-1388-40c7-a27a-83a3b4869808
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/48e4ff16-1388-40c7-a27a-83a3b4869808.pid.haproxy
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 48e4ff16-1388-40c7-a27a-83a3b4869808
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:48:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:27.577 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'env', 'PROCESS_TAG=haproxy-48e4ff16-1388-40c7-a27a-83a3b4869808', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/48e4ff16-1388-40c7-a27a-83a3b4869808.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:48:27 np0005465988 nova_compute[236126]: 2025-10-02 12:48:27.581 2 DEBUG nova.network.neutron [req-81e03dd3-0408-4b82-8608-11a637a69d53 req-6a514815-63c8-431f-8451-226a5f0cd276 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Updated VIF entry in instance network info cache for port 7020ab2d-943e-4985-b442-c6584c56c0d2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:48:27 np0005465988 nova_compute[236126]: 2025-10-02 12:48:27.581 2 DEBUG nova.network.neutron [req-81e03dd3-0408-4b82-8608-11a637a69d53 req-6a514815-63c8-431f-8451-226a5f0cd276 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Updating instance_info_cache with network_info: [{"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:48:27 np0005465988 nova_compute[236126]: 2025-10-02 12:48:27.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:27 np0005465988 nova_compute[236126]: 2025-10-02 12:48:27.629 2 DEBUG oslo_concurrency.lockutils [req-81e03dd3-0408-4b82-8608-11a637a69d53 req-6a514815-63c8-431f-8451-226a5f0cd276 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:48:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:48:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:27.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:48:27 np0005465988 podman[314597]: 2025-10-02 12:48:27.958061685 +0000 UTC m=+0.078706938 container create 70b8bdaf3d507610762590f3bf71f57af1524afe0378e0b165af178db970668c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:48:27 np0005465988 podman[314597]: 2025-10-02 12:48:27.905944863 +0000 UTC m=+0.026590136 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:48:28 np0005465988 systemd[1]: Started libpod-conmon-70b8bdaf3d507610762590f3bf71f57af1524afe0378e0b165af178db970668c.scope.
Oct  2 08:48:28 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:48:28 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/435a3bcd75246087a7339e8e9a3725b4ef5e5d242558ecefc251af26fd16ef6e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:48:28 np0005465988 podman[314597]: 2025-10-02 12:48:28.069119481 +0000 UTC m=+0.189764764 container init 70b8bdaf3d507610762590f3bf71f57af1524afe0378e0b165af178db970668c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:48:28 np0005465988 podman[314597]: 2025-10-02 12:48:28.076674476 +0000 UTC m=+0.197319729 container start 70b8bdaf3d507610762590f3bf71f57af1524afe0378e0b165af178db970668c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:48:28 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[314612]: [NOTICE]   (314616) : New worker (314618) forked
Oct  2 08:48:28 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[314612]: [NOTICE]   (314616) : Loading success.
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.171 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409308.1713393, 439392e5-66ae-4162-a7e5-077f87ca558b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.172 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] VM Started (Lifecycle Event)#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.253 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.260 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409308.1715336, 439392e5-66ae-4162-a7e5-077f87ca558b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.260 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.417 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.422 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.631 2 DEBUG nova.compute.manager [req-874ad0af-c02a-44cd-b841-291fb918a869 req-0caa4667-6db8-40d1-b16b-a3637336c005 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.631 2 DEBUG oslo_concurrency.lockutils [req-874ad0af-c02a-44cd-b841-291fb918a869 req-0caa4667-6db8-40d1-b16b-a3637336c005 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.632 2 DEBUG oslo_concurrency.lockutils [req-874ad0af-c02a-44cd-b841-291fb918a869 req-0caa4667-6db8-40d1-b16b-a3637336c005 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.632 2 DEBUG oslo_concurrency.lockutils [req-874ad0af-c02a-44cd-b841-291fb918a869 req-0caa4667-6db8-40d1-b16b-a3637336c005 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.632 2 DEBUG nova.compute.manager [req-874ad0af-c02a-44cd-b841-291fb918a869 req-0caa4667-6db8-40d1-b16b-a3637336c005 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Processing event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.633 2 DEBUG nova.compute.manager [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.633 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.640 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.641 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409308.640423, 439392e5-66ae-4162-a7e5-077f87ca558b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.641 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.645 2 INFO nova.virt.libvirt.driver [-] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Instance spawned successfully.#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.645 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.746 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.751 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:28.888 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:48:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:28.890 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.904 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.905 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.905 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.905 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.906 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.906 2 DEBUG nova.virt.libvirt.driver [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:48:28 np0005465988 nova_compute[236126]: 2025-10-02 12:48:28.966 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:48:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:48:29 np0005465988 nova_compute[236126]: 2025-10-02 12:48:29.126 2 INFO nova.compute.manager [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Took 10.29 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:48:29 np0005465988 nova_compute[236126]: 2025-10-02 12:48:29.127 2 DEBUG nova.compute.manager [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:48:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:29.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:29 np0005465988 nova_compute[236126]: 2025-10-02 12:48:29.373 2 INFO nova.compute.manager [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Took 11.65 seconds to build instance.#033[00m
Oct  2 08:48:29 np0005465988 nova_compute[236126]: 2025-10-02 12:48:29.544 2 DEBUG oslo_concurrency.lockutils [None req-40c55db7-a61e-4f01-9800-ffb2bf422145 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.908s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:48:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:29.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:48:29 np0005465988 nova_compute[236126]: 2025-10-02 12:48:29.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:30 np0005465988 nova_compute[236126]: 2025-10-02 12:48:30.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:30 np0005465988 nova_compute[236126]: 2025-10-02 12:48:30.820 2 DEBUG nova.compute.manager [req-bd269215-06b7-49ef-8403-849e5abd3b92 req-2b23b93e-5413-4f84-9358-48a56c65d4ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:48:30 np0005465988 nova_compute[236126]: 2025-10-02 12:48:30.821 2 DEBUG oslo_concurrency.lockutils [req-bd269215-06b7-49ef-8403-849e5abd3b92 req-2b23b93e-5413-4f84-9358-48a56c65d4ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:30 np0005465988 nova_compute[236126]: 2025-10-02 12:48:30.821 2 DEBUG oslo_concurrency.lockutils [req-bd269215-06b7-49ef-8403-849e5abd3b92 req-2b23b93e-5413-4f84-9358-48a56c65d4ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:30 np0005465988 nova_compute[236126]: 2025-10-02 12:48:30.822 2 DEBUG oslo_concurrency.lockutils [req-bd269215-06b7-49ef-8403-849e5abd3b92 req-2b23b93e-5413-4f84-9358-48a56c65d4ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:30 np0005465988 nova_compute[236126]: 2025-10-02 12:48:30.822 2 DEBUG nova.compute.manager [req-bd269215-06b7-49ef-8403-849e5abd3b92 req-2b23b93e-5413-4f84-9358-48a56c65d4ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] No waiting events found dispatching network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:48:30 np0005465988 nova_compute[236126]: 2025-10-02 12:48:30.823 2 WARNING nova.compute.manager [req-bd269215-06b7-49ef-8403-849e5abd3b92 req-2b23b93e-5413-4f84-9358-48a56c65d4ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received unexpected event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:48:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:48:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:31.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:48:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:31.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:32 np0005465988 nova_compute[236126]: 2025-10-02 12:48:32.134 2 DEBUG nova.compute.manager [None req-862fc640-42a2-47ea-8118-f1f9748d38ed 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:48:32 np0005465988 nova_compute[236126]: 2025-10-02 12:48:32.191 2 INFO nova.compute.manager [None req-862fc640-42a2-47ea-8118-f1f9748d38ed 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] instance snapshotting#033[00m
Oct  2 08:48:32 np0005465988 podman[314629]: 2025-10-02 12:48:32.527999672 +0000 UTC m=+0.061460354 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:48:32 np0005465988 nova_compute[236126]: 2025-10-02 12:48:32.580 2 INFO nova.virt.libvirt.driver [None req-862fc640-42a2-47ea-8118-f1f9748d38ed 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Beginning live snapshot process#033[00m
Oct  2 08:48:32 np0005465988 nova_compute[236126]: 2025-10-02 12:48:32.755 2 DEBUG nova.virt.libvirt.imagebackend [None req-862fc640-42a2-47ea-8118-f1f9748d38ed 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No parent info for c2d0c2bc-fe21-4689-86ae-d6728c15874c; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Oct  2 08:48:33 np0005465988 nova_compute[236126]: 2025-10-02 12:48:33.123 2 DEBUG nova.storage.rbd_utils [None req-862fc640-42a2-47ea-8118-f1f9748d38ed 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] creating snapshot(381b47f42f864890a9a5b2b408d429da) on rbd image(439392e5-66ae-4162-a7e5-077f87ca558b_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:48:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:33.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:33.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e364 e364: 3 total, 3 up, 3 in
Oct  2 08:48:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e364 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:48:34 np0005465988 nova_compute[236126]: 2025-10-02 12:48:34.494 2 DEBUG nova.storage.rbd_utils [None req-862fc640-42a2-47ea-8118-f1f9748d38ed 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] cloning vms/439392e5-66ae-4162-a7e5-077f87ca558b_disk@381b47f42f864890a9a5b2b408d429da to images/5428ba77-2370-48a0-8396-a85c08f86505 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:48:34 np0005465988 nova_compute[236126]: 2025-10-02 12:48:34.646 2 DEBUG nova.storage.rbd_utils [None req-862fc640-42a2-47ea-8118-f1f9748d38ed 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] flattening images/5428ba77-2370-48a0-8396-a85c08f86505 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:48:34 np0005465988 nova_compute[236126]: 2025-10-02 12:48:34.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:35.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:35 np0005465988 nova_compute[236126]: 2025-10-02 12:48:35.492 2 DEBUG nova.storage.rbd_utils [None req-862fc640-42a2-47ea-8118-f1f9748d38ed 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] removing snapshot(381b47f42f864890a9a5b2b408d429da) on rbd image(439392e5-66ae-4162-a7e5-077f87ca558b_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:48:35 np0005465988 nova_compute[236126]: 2025-10-02 12:48:35.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:35.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:35.893 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:48:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e365 e365: 3 total, 3 up, 3 in
Oct  2 08:48:36 np0005465988 nova_compute[236126]: 2025-10-02 12:48:36.325 2 DEBUG nova.storage.rbd_utils [None req-862fc640-42a2-47ea-8118-f1f9748d38ed 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] creating snapshot(snap) on rbd image(5428ba77-2370-48a0-8396-a85c08f86505) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:48:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:48:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:37.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:48:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e366 e366: 3 total, 3 up, 3 in
Oct  2 08:48:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:48:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:37.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:48:38 np0005465988 nova_compute[236126]: 2025-10-02 12:48:38.377 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:48:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:39.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:39 np0005465988 nova_compute[236126]: 2025-10-02 12:48:39.568 2 INFO nova.virt.libvirt.driver [None req-862fc640-42a2-47ea-8118-f1f9748d38ed 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Snapshot image upload complete#033[00m
Oct  2 08:48:39 np0005465988 nova_compute[236126]: 2025-10-02 12:48:39.568 2 INFO nova.compute.manager [None req-862fc640-42a2-47ea-8118-f1f9748d38ed 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Took 7.38 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 08:48:39 np0005465988 nova_compute[236126]: 2025-10-02 12:48:39.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:39.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:40 np0005465988 nova_compute[236126]: 2025-10-02 12:48:40.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:41.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:41.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:42 np0005465988 nova_compute[236126]: 2025-10-02 12:48:42.120 2 INFO nova.compute.manager [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Rescuing#033[00m
Oct  2 08:48:42 np0005465988 nova_compute[236126]: 2025-10-02 12:48:42.120 2 DEBUG oslo_concurrency.lockutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:48:42 np0005465988 nova_compute[236126]: 2025-10-02 12:48:42.121 2 DEBUG oslo_concurrency.lockutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquired lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:48:42 np0005465988 nova_compute[236126]: 2025-10-02 12:48:42.121 2 DEBUG nova.network.neutron [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:48:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:43.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:48:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:43.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e367 e367: 3 total, 3 up, 3 in
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #124. Immutable memtables: 0.
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:44.609211) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 77] Flushing memtable with next log file: 124
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409324609264, "job": 77, "event": "flush_started", "num_memtables": 1, "num_entries": 868, "num_deletes": 253, "total_data_size": 1478368, "memory_usage": 1509216, "flush_reason": "Manual Compaction"}
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 77] Level-0 flush table #125: started
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409324706162, "cf_name": "default", "job": 77, "event": "table_file_creation", "file_number": 125, "file_size": 728548, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 63286, "largest_seqno": 64149, "table_properties": {"data_size": 724805, "index_size": 1459, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9940, "raw_average_key_size": 21, "raw_value_size": 716877, "raw_average_value_size": 1538, "num_data_blocks": 62, "num_entries": 466, "num_filter_entries": 466, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759409279, "oldest_key_time": 1759409279, "file_creation_time": 1759409324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 125, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 77] Flush lasted 96995 microseconds, and 3956 cpu microseconds.
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:44.706206) [db/flush_job.cc:967] [default] [JOB 77] Level-0 flush table #125: 728548 bytes OK
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:44.706231) [db/memtable_list.cc:519] [default] Level-0 commit table #125 started
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:44.784204) [db/memtable_list.cc:722] [default] Level-0 commit table #125: memtable #1 done
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:44.784289) EVENT_LOG_v1 {"time_micros": 1759409324784276, "job": 77, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:44.784324) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 77] Try to delete WAL files size 1473875, prev total WAL file size 1473875, number of live WAL files 2.
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000121.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:44.785447) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032303039' seq:72057594037927935, type:22 .. '6D6772737461740032323631' seq:0, type:0; will stop at (end)
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 78] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 77 Base level 0, inputs: [125(711KB)], [123(13MB)]
Oct  2 08:48:44 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409324785528, "job": 78, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [125], "files_L6": [123], "score": -1, "input_data_size": 14539887, "oldest_snapshot_seqno": -1}
Oct  2 08:48:44 np0005465988 nova_compute[236126]: 2025-10-02 12:48:44.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 78] Generated table #126: 8884 keys, 10918174 bytes, temperature: kUnknown
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409325055543, "cf_name": "default", "job": 78, "event": "table_file_creation", "file_number": 126, "file_size": 10918174, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10861484, "index_size": 33346, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22277, "raw_key_size": 230196, "raw_average_key_size": 25, "raw_value_size": 10706394, "raw_average_value_size": 1205, "num_data_blocks": 1296, "num_entries": 8884, "num_filter_entries": 8884, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759409324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 126, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:45.055962) [db/compaction/compaction_job.cc:1663] [default] [JOB 78] Compacted 1@0 + 1@6 files to L6 => 10918174 bytes
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:45.083320) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 53.8 rd, 40.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 13.2 +0.0 blob) out(10.4 +0.0 blob), read-write-amplify(34.9) write-amplify(15.0) OK, records in: 9392, records dropped: 508 output_compression: NoCompression
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:45.083438) EVENT_LOG_v1 {"time_micros": 1759409325083356, "job": 78, "event": "compaction_finished", "compaction_time_micros": 270124, "compaction_time_cpu_micros": 42786, "output_level": 6, "num_output_files": 1, "total_output_size": 10918174, "num_input_records": 9392, "num_output_records": 8884, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000125.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409325083879, "job": 78, "event": "table_file_deletion", "file_number": 125}
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000123.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409325089095, "job": 78, "event": "table_file_deletion", "file_number": 123}
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:44.785275) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:45.089167) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:45.089172) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:45.089175) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:45.089177) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:48:45 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:48:45.089179) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:48:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:45.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:45 np0005465988 podman[314847]: 2025-10-02 12:48:45.530221662 +0000 UTC m=+0.065275324 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, config_id=iscsid)
Oct  2 08:48:45 np0005465988 podman[314848]: 2025-10-02 12:48:45.557407936 +0000 UTC m=+0.086895828 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:48:45 np0005465988 nova_compute[236126]: 2025-10-02 12:48:45.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:45 np0005465988 podman[314846]: 2025-10-02 12:48:45.58042669 +0000 UTC m=+0.115270436 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:48:45 np0005465988 nova_compute[236126]: 2025-10-02 12:48:45.800 2 DEBUG nova.network.neutron [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Updating instance_info_cache with network_info: [{"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:48:45 np0005465988 nova_compute[236126]: 2025-10-02 12:48:45.826 2 DEBUG oslo_concurrency.lockutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Releasing lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:48:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:45.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:46 np0005465988 nova_compute[236126]: 2025-10-02 12:48:46.169 2 DEBUG nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:48:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:47.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:47Z|00098|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5b:a9:95 10.100.0.9
Oct  2 08:48:47 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:47Z|00099|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:a9:95 10.100.0.9
Oct  2 08:48:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:47.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e367 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:48:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:48:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:49.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:48:49 np0005465988 nova_compute[236126]: 2025-10-02 12:48:49.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:48:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:49.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:48:50 np0005465988 nova_compute[236126]: 2025-10-02 12:48:50.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:51 np0005465988 nova_compute[236126]: 2025-10-02 12:48:51.195 2 INFO nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Instance shutdown successfully after 5 seconds.#033[00m
Oct  2 08:48:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:48:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:51.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:48:51 np0005465988 kernel: tap7020ab2d-94 (unregistering): left promiscuous mode
Oct  2 08:48:51 np0005465988 NetworkManager[45041]: <info>  [1759409331.2793] device (tap7020ab2d-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:48:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:51Z|00766|binding|INFO|Releasing lport 7020ab2d-943e-4985-b442-c6584c56c0d2 from this chassis (sb_readonly=0)
Oct  2 08:48:51 np0005465988 nova_compute[236126]: 2025-10-02 12:48:51.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:51Z|00767|binding|INFO|Setting lport 7020ab2d-943e-4985-b442-c6584c56c0d2 down in Southbound
Oct  2 08:48:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:51Z|00768|binding|INFO|Removing iface tap7020ab2d-94 ovn-installed in OVS
Oct  2 08:48:51 np0005465988 nova_compute[236126]: 2025-10-02 12:48:51.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:51 np0005465988 nova_compute[236126]: 2025-10-02 12:48:51.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:51.346 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:a9:95 10.100.0.9'], port_security=['fa:16:3e:5b:a9:95 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '439392e5-66ae-4162-a7e5-077f87ca558b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7020ab2d-943e-4985-b442-c6584c56c0d2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:48:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:51.348 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7020ab2d-943e-4985-b442-c6584c56c0d2 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 unbound from our chassis#033[00m
Oct  2 08:48:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:51.350 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48e4ff16-1388-40c7-a27a-83a3b4869808, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:48:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:51.352 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4a39ab69-8ea3-4a48-90d8-69b96fd40e8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:51.352 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 namespace which is not needed anymore#033[00m
Oct  2 08:48:51 np0005465988 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000ad.scope: Deactivated successfully.
Oct  2 08:48:51 np0005465988 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000ad.scope: Consumed 14.020s CPU time.
Oct  2 08:48:51 np0005465988 systemd-machined[192594]: Machine qemu-80-instance-000000ad terminated.
Oct  2 08:48:51 np0005465988 nova_compute[236126]: 2025-10-02 12:48:51.438 2 INFO nova.virt.libvirt.driver [-] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Instance destroyed successfully.#033[00m
Oct  2 08:48:51 np0005465988 nova_compute[236126]: 2025-10-02 12:48:51.439 2 DEBUG nova.objects.instance [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:48:51 np0005465988 nova_compute[236126]: 2025-10-02 12:48:51.547 2 INFO nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Attempting a stable device rescue#033[00m
Oct  2 08:48:51 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[314612]: [NOTICE]   (314616) : haproxy version is 2.8.14-c23fe91
Oct  2 08:48:51 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[314612]: [NOTICE]   (314616) : path to executable is /usr/sbin/haproxy
Oct  2 08:48:51 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[314612]: [WARNING]  (314616) : Exiting Master process...
Oct  2 08:48:51 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[314612]: [ALERT]    (314616) : Current worker (314618) exited with code 143 (Terminated)
Oct  2 08:48:51 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[314612]: [WARNING]  (314616) : All workers exited. Exiting... (0)
Oct  2 08:48:51 np0005465988 systemd[1]: libpod-70b8bdaf3d507610762590f3bf71f57af1524afe0378e0b165af178db970668c.scope: Deactivated successfully.
Oct  2 08:48:51 np0005465988 podman[314947]: 2025-10-02 12:48:51.570107362 +0000 UTC m=+0.112841737 container died 70b8bdaf3d507610762590f3bf71f57af1524afe0378e0b165af178db970668c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:48:51 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-70b8bdaf3d507610762590f3bf71f57af1524afe0378e0b165af178db970668c-userdata-shm.mount: Deactivated successfully.
Oct  2 08:48:51 np0005465988 systemd[1]: var-lib-containers-storage-overlay-435a3bcd75246087a7339e8e9a3725b4ef5e5d242558ecefc251af26fd16ef6e-merged.mount: Deactivated successfully.
Oct  2 08:48:51 np0005465988 nova_compute[236126]: 2025-10-02 12:48:51.832 2 DEBUG nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Oct  2 08:48:51 np0005465988 nova_compute[236126]: 2025-10-02 12:48:51.841 2 DEBUG nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:48:51 np0005465988 nova_compute[236126]: 2025-10-02 12:48:51.841 2 INFO nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Creating image(s)#033[00m
Oct  2 08:48:51 np0005465988 nova_compute[236126]: 2025-10-02 12:48:51.879 2 DEBUG nova.storage.rbd_utils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image 439392e5-66ae-4162-a7e5-077f87ca558b_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:48:51 np0005465988 nova_compute[236126]: 2025-10-02 12:48:51.884 2 DEBUG nova.objects.instance [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:48:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:51.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:51 np0005465988 nova_compute[236126]: 2025-10-02 12:48:51.974 2 DEBUG nova.storage.rbd_utils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image 439392e5-66ae-4162-a7e5-077f87ca558b_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.000 2 DEBUG nova.storage.rbd_utils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image 439392e5-66ae-4162-a7e5-077f87ca558b_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.004 2 DEBUG oslo_concurrency.lockutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "372f9397a23cc3596b809ad261ffaaef0a16efbf" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.005 2 DEBUG oslo_concurrency.lockutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "372f9397a23cc3596b809ad261ffaaef0a16efbf" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:52 np0005465988 podman[314947]: 2025-10-02 12:48:52.025701495 +0000 UTC m=+0.568435870 container cleanup 70b8bdaf3d507610762590f3bf71f57af1524afe0378e0b165af178db970668c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:48:52 np0005465988 systemd[1]: libpod-conmon-70b8bdaf3d507610762590f3bf71f57af1524afe0378e0b165af178db970668c.scope: Deactivated successfully.
Oct  2 08:48:52 np0005465988 podman[315032]: 2025-10-02 12:48:52.092518892 +0000 UTC m=+0.043360641 container remove 70b8bdaf3d507610762590f3bf71f57af1524afe0378e0b165af178db970668c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:48:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:52.100 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[481e6d81-fa6d-4158-b0ec-79e4106afe3c]: (4, ('Thu Oct  2 12:48:51 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 (70b8bdaf3d507610762590f3bf71f57af1524afe0378e0b165af178db970668c)\n70b8bdaf3d507610762590f3bf71f57af1524afe0378e0b165af178db970668c\nThu Oct  2 12:48:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 (70b8bdaf3d507610762590f3bf71f57af1524afe0378e0b165af178db970668c)\n70b8bdaf3d507610762590f3bf71f57af1524afe0378e0b165af178db970668c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:52.101 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5d1b55cb-1dd1-4dc2-917d-ab4c3681405c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:52.102 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:52 np0005465988 kernel: tap48e4ff16-10: left promiscuous mode
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:52.126 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[61305089-661d-43d5-9707-220a12522dfe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:52.158 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2e39a0ba-b641-42c4-abe2-c387b03b2073]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:52.160 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[20bd89fc-bbbe-4eab-a90d-9cf01b57ddd3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:52.176 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7e54e7fc-e839-41b1-afdb-169066943a2f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736765, 'reachable_time': 29945, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315050, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:52.178 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:48:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:52.178 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[b48e271f-e5be-4466-948e-f86f17b4db26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:52 np0005465988 systemd[1]: run-netns-ovnmeta\x2d48e4ff16\x2d1388\x2d40c7\x2da27a\x2d83a3b4869808.mount: Deactivated successfully.
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.313 2 DEBUG nova.virt.libvirt.imagebackend [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Image locations are: [{'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/5428ba77-2370-48a0-8396-a85c08f86505/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/5428ba77-2370-48a0-8396-a85c08f86505/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.387 2 DEBUG nova.virt.libvirt.imagebackend [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Selected location: {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/5428ba77-2370-48a0-8396-a85c08f86505/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.388 2 DEBUG nova.storage.rbd_utils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] cloning images/5428ba77-2370-48a0-8396-a85c08f86505@snap to None/439392e5-66ae-4162-a7e5-077f87ca558b_disk.rescue clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.547 2 DEBUG oslo_concurrency.lockutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "372f9397a23cc3596b809ad261ffaaef0a16efbf" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.612 2 DEBUG nova.objects.instance [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'migration_context' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.666 2 DEBUG nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.669 2 DEBUG nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Start _get_guest_xml network_info=[{"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "vif_mac": "fa:16:3e:5b:a9:95"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '5428ba77-2370-48a0-8396-a85c08f86505', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.669 2 DEBUG nova.objects.instance [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'resources' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.813 2 WARNING nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.822 2 DEBUG nova.virt.libvirt.host [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.823 2 DEBUG nova.virt.libvirt.host [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.829 2 DEBUG nova.virt.libvirt.host [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.830 2 DEBUG nova.virt.libvirt.host [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.831 2 DEBUG nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.831 2 DEBUG nova.virt.hardware [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.831 2 DEBUG nova.virt.hardware [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.832 2 DEBUG nova.virt.hardware [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.832 2 DEBUG nova.virt.hardware [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.832 2 DEBUG nova.virt.hardware [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.832 2 DEBUG nova.virt.hardware [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.832 2 DEBUG nova.virt.hardware [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.833 2 DEBUG nova.virt.hardware [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.833 2 DEBUG nova.virt.hardware [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.833 2 DEBUG nova.virt.hardware [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.833 2 DEBUG nova.virt.hardware [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.833 2 DEBUG nova.objects.instance [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:48:52 np0005465988 nova_compute[236126]: 2025-10-02 12:48:52.851 2 DEBUG oslo_concurrency.processutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:53.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:48:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1329233657' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.320 2 DEBUG oslo_concurrency.processutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.358 2 DEBUG oslo_concurrency.processutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:48:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/492698381' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.824 2 DEBUG oslo_concurrency.processutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.825 2 DEBUG oslo_concurrency.processutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:48:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:53.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.914 2 DEBUG nova.compute.manager [req-16fd2866-9065-4e62-818c-9599281d7461 req-b6bfbc7f-d0a7-46a0-9fe7-e7e4b7691929 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-vif-unplugged-7020ab2d-943e-4985-b442-c6584c56c0d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.915 2 DEBUG oslo_concurrency.lockutils [req-16fd2866-9065-4e62-818c-9599281d7461 req-b6bfbc7f-d0a7-46a0-9fe7-e7e4b7691929 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.915 2 DEBUG oslo_concurrency.lockutils [req-16fd2866-9065-4e62-818c-9599281d7461 req-b6bfbc7f-d0a7-46a0-9fe7-e7e4b7691929 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.916 2 DEBUG oslo_concurrency.lockutils [req-16fd2866-9065-4e62-818c-9599281d7461 req-b6bfbc7f-d0a7-46a0-9fe7-e7e4b7691929 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.916 2 DEBUG nova.compute.manager [req-16fd2866-9065-4e62-818c-9599281d7461 req-b6bfbc7f-d0a7-46a0-9fe7-e7e4b7691929 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] No waiting events found dispatching network-vif-unplugged-7020ab2d-943e-4985-b442-c6584c56c0d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.916 2 WARNING nova.compute.manager [req-16fd2866-9065-4e62-818c-9599281d7461 req-b6bfbc7f-d0a7-46a0-9fe7-e7e4b7691929 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received unexpected event network-vif-unplugged-7020ab2d-943e-4985-b442-c6584c56c0d2 for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.917 2 DEBUG nova.compute.manager [req-16fd2866-9065-4e62-818c-9599281d7461 req-b6bfbc7f-d0a7-46a0-9fe7-e7e4b7691929 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.917 2 DEBUG oslo_concurrency.lockutils [req-16fd2866-9065-4e62-818c-9599281d7461 req-b6bfbc7f-d0a7-46a0-9fe7-e7e4b7691929 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.918 2 DEBUG oslo_concurrency.lockutils [req-16fd2866-9065-4e62-818c-9599281d7461 req-b6bfbc7f-d0a7-46a0-9fe7-e7e4b7691929 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.918 2 DEBUG oslo_concurrency.lockutils [req-16fd2866-9065-4e62-818c-9599281d7461 req-b6bfbc7f-d0a7-46a0-9fe7-e7e4b7691929 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.918 2 DEBUG nova.compute.manager [req-16fd2866-9065-4e62-818c-9599281d7461 req-b6bfbc7f-d0a7-46a0-9fe7-e7e4b7691929 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] No waiting events found dispatching network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:48:53 np0005465988 nova_compute[236126]: 2025-10-02 12:48:53.918 2 WARNING nova.compute.manager [req-16fd2866-9065-4e62-818c-9599281d7461 req-b6bfbc7f-d0a7-46a0-9fe7-e7e4b7691929 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received unexpected event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:48:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e367 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:48:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:48:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/770388318' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.310 2 DEBUG oslo_concurrency.processutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.311 2 DEBUG nova.virt.libvirt.vif [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:48:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-91664647',display_name='tempest-ServerStableDeviceRescueTest-server-91664647',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-91664647',id=173,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:48:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6a442bc513e14406b73e96e70396e6c3',ramdisk_id='',reservation_id='r-x1ok7izy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='vi
rtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-454391960',owner_user_name='tempest-ServerStableDeviceRescueTest-454391960-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:48:39Z,user_data=None,user_id='6785ffe5d6554514b4ed9fd47665eca0',uuid=439392e5-66ae-4162-a7e5-077f87ca558b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "vif_mac": "fa:16:3e:5b:a9:95"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.311 2 DEBUG nova.network.os_vif_util [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converting VIF {"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "vif_mac": "fa:16:3e:5b:a9:95"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.312 2 DEBUG nova.network.os_vif_util [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:a9:95,bridge_name='br-int',has_traffic_filtering=True,id=7020ab2d-943e-4985-b442-c6584c56c0d2,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7020ab2d-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.313 2 DEBUG nova.objects.instance [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.335 2 DEBUG nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  <uuid>439392e5-66ae-4162-a7e5-077f87ca558b</uuid>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  <name>instance-000000ad</name>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-91664647</nova:name>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:48:52</nova:creationTime>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <nova:user uuid="6785ffe5d6554514b4ed9fd47665eca0">tempest-ServerStableDeviceRescueTest-454391960-project-member</nova:user>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <nova:project uuid="6a442bc513e14406b73e96e70396e6c3">tempest-ServerStableDeviceRescueTest-454391960</nova:project>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <nova:port uuid="7020ab2d-943e-4985-b442-c6584c56c0d2">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <entry name="serial">439392e5-66ae-4162-a7e5-077f87ca558b</entry>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <entry name="uuid">439392e5-66ae-4162-a7e5-077f87ca558b</entry>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/439392e5-66ae-4162-a7e5-077f87ca558b_disk">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/439392e5-66ae-4162-a7e5-077f87ca558b_disk.config">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/439392e5-66ae-4162-a7e5-077f87ca558b_disk.rescue">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <target dev="sdb" bus="scsi"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <boot order="1"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:5b:a9:95"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <target dev="tap7020ab2d-94"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/console.log" append="off"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:48:54 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:48:54 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:48:54 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:48:54 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.343 2 INFO nova.virt.libvirt.driver [-] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Instance destroyed successfully.#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.392 2 DEBUG nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.392 2 DEBUG nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.393 2 DEBUG nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.393 2 DEBUG nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No VIF found with MAC fa:16:3e:5b:a9:95, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.393 2 INFO nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Using config drive#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.417 2 DEBUG nova.storage.rbd_utils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image 439392e5-66ae-4162-a7e5-077f87ca558b_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.436 2 DEBUG nova.objects.instance [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.468 2 DEBUG nova.objects.instance [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'keypairs' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:48:54 np0005465988 nova_compute[236126]: 2025-10-02 12:48:54.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.012 2 INFO nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Creating config drive at /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/disk.config.rescue#033[00m
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.018 2 DEBUG oslo_concurrency.processutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz9zff04w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.166 2 DEBUG oslo_concurrency.processutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz9zff04w" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.200 2 DEBUG nova.storage.rbd_utils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image 439392e5-66ae-4162-a7e5-077f87ca558b_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.204 2 DEBUG oslo_concurrency.processutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/disk.config.rescue 439392e5-66ae-4162-a7e5-077f87ca558b_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:55.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.389 2 DEBUG oslo_concurrency.processutils [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/disk.config.rescue 439392e5-66ae-4162-a7e5-077f87ca558b_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.185s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.390 2 INFO nova.virt.libvirt.driver [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Deleting local config drive /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b/disk.config.rescue because it was imported into RBD.#033[00m
Oct  2 08:48:55 np0005465988 kernel: tap7020ab2d-94: entered promiscuous mode
Oct  2 08:48:55 np0005465988 NetworkManager[45041]: <info>  [1759409335.4577] manager: (tap7020ab2d-94): new Tun device (/org/freedesktop/NetworkManager/Devices/345)
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:55 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:55Z|00769|binding|INFO|Claiming lport 7020ab2d-943e-4985-b442-c6584c56c0d2 for this chassis.
Oct  2 08:48:55 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:55Z|00770|binding|INFO|7020ab2d-943e-4985-b442-c6584c56c0d2: Claiming fa:16:3e:5b:a9:95 10.100.0.9
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.473 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:a9:95 10.100.0.9'], port_security=['fa:16:3e:5b:a9:95 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '439392e5-66ae-4162-a7e5-077f87ca558b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7020ab2d-943e-4985-b442-c6584c56c0d2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.474 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7020ab2d-943e-4985-b442-c6584c56c0d2 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 bound to our chassis#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.476 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e4ff16-1388-40c7-a27a-83a3b4869808#033[00m
Oct  2 08:48:55 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:55Z|00771|binding|INFO|Setting lport 7020ab2d-943e-4985-b442-c6584c56c0d2 ovn-installed in OVS
Oct  2 08:48:55 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:55Z|00772|binding|INFO|Setting lport 7020ab2d-943e-4985-b442-c6584c56c0d2 up in Southbound
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:55 np0005465988 systemd-udevd[315344]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.495 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ee5e6a07-7fe2-4e99-bb82-56fddcecae8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.496 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap48e4ff16-11 in ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:48:55 np0005465988 NetworkManager[45041]: <info>  [1759409335.4989] device (tap7020ab2d-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.498 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap48e4ff16-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.498 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b65a26ab-2324-452d-850e-389e291871d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 NetworkManager[45041]: <info>  [1759409335.4997] device (tap7020ab2d-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.500 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0e67a7c3-096c-4adc-a4b2-dcf45d366cd7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 systemd-machined[192594]: New machine qemu-81-instance-000000ad.
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.512 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[099de972-6bc6-4b97-aea8-9c065de6ccd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 systemd[1]: Started Virtual Machine qemu-81-instance-000000ad.
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.529 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a7bdd5af-4200-4af7-a5e0-598d70e91243]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.557 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[81be7383-97f8-48ae-a325-2cb7f91cc720]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 NetworkManager[45041]: <info>  [1759409335.5639] manager: (tap48e4ff16-10): new Veth device (/org/freedesktop/NetworkManager/Devices/346)
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.564 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f7066fd7-9459-4c67-8053-6d15c420ea39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.607 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[38a9bc2e-5b9f-4185-b69f-1793b9eb862e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.611 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9837f27f-73d2-4e6e-af4a-063a8961b47d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:55 np0005465988 NetworkManager[45041]: <info>  [1759409335.6345] device (tap48e4ff16-10): carrier: link connected
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.644 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5776e69a-f191-4536-a05c-cc0eeb06611c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.659 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bd5448cb-9d4b-4d51-93da-1c183678ab4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 228], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 739596, 'reachable_time': 44073, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315379, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.673 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f03d1854-2497-46ae-b96d-00409b07968b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8b:53bc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 739596, 'tstamp': 739596}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 315380, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.686 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a3951c2a-c667-4688-a86d-7972e8647508]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 228], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 739596, 'reachable_time': 44073, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 315381, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.713 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b2e6fc43-0707-4d55-b556-a3b0bf83c0d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.865 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6d38bebf-39ea-4212-afa2-b0a9d720b633]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.867 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.867 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.868 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e4ff16-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:55 np0005465988 NetworkManager[45041]: <info>  [1759409335.8705] manager: (tap48e4ff16-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/347)
Oct  2 08:48:55 np0005465988 kernel: tap48e4ff16-10: entered promiscuous mode
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.876 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e4ff16-10, col_values=(('external_ids', {'iface-id': '1fc80788-89b8-413a-b0b0-d36f1a11a2b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:48:55 np0005465988 ovn_controller[132601]: 2025-10-02T12:48:55Z|00773|binding|INFO|Releasing lport 1fc80788-89b8-413a-b0b0-d36f1a11a2b1 from this chassis (sb_readonly=0)
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:55 np0005465988 nova_compute[236126]: 2025-10-02 12:48:55.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.892 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48e4ff16-1388-40c7-a27a-83a3b4869808.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48e4ff16-1388-40c7-a27a-83a3b4869808.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.893 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3371212a-0518-4f8c-acf1-4091478b23cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.894 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-48e4ff16-1388-40c7-a27a-83a3b4869808
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/48e4ff16-1388-40c7-a27a-83a3b4869808.pid.haproxy
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 48e4ff16-1388-40c7-a27a-83a3b4869808
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:48:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:48:55.895 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'env', 'PROCESS_TAG=haproxy-48e4ff16-1388-40c7-a27a-83a3b4869808', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/48e4ff16-1388-40c7-a27a-83a3b4869808.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:48:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:55.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.330 2 DEBUG nova.compute.manager [req-f5a8279f-f217-4b09-88a8-0508ccf5404f req-00c09c42-7bfb-42d8-a719-6bbd847a2d2b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.330 2 DEBUG oslo_concurrency.lockutils [req-f5a8279f-f217-4b09-88a8-0508ccf5404f req-00c09c42-7bfb-42d8-a719-6bbd847a2d2b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.331 2 DEBUG oslo_concurrency.lockutils [req-f5a8279f-f217-4b09-88a8-0508ccf5404f req-00c09c42-7bfb-42d8-a719-6bbd847a2d2b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.331 2 DEBUG oslo_concurrency.lockutils [req-f5a8279f-f217-4b09-88a8-0508ccf5404f req-00c09c42-7bfb-42d8-a719-6bbd847a2d2b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.331 2 DEBUG nova.compute.manager [req-f5a8279f-f217-4b09-88a8-0508ccf5404f req-00c09c42-7bfb-42d8-a719-6bbd847a2d2b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] No waiting events found dispatching network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.331 2 WARNING nova.compute.manager [req-f5a8279f-f217-4b09-88a8-0508ccf5404f req-00c09c42-7bfb-42d8-a719-6bbd847a2d2b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received unexpected event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:48:56 np0005465988 podman[315473]: 2025-10-02 12:48:56.268935183 +0000 UTC m=+0.029854172 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:48:56 np0005465988 podman[315473]: 2025-10-02 12:48:56.406254845 +0000 UTC m=+0.167173794 container create 5603748d1ef035a18431f3d16ffaa6f3ed6d4a878a4e5faa6a0e845b487b820c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:48:56 np0005465988 systemd[1]: Started libpod-conmon-5603748d1ef035a18431f3d16ffaa6f3ed6d4a878a4e5faa6a0e845b487b820c.scope.
Oct  2 08:48:56 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:48:56 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe6f8ff7f2f10075a90e9172945b71cebc33dac4e522f77c81ba4da4553fa4d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:48:56 np0005465988 podman[315473]: 2025-10-02 12:48:56.549800296 +0000 UTC m=+0.310719285 container init 5603748d1ef035a18431f3d16ffaa6f3ed6d4a878a4e5faa6a0e845b487b820c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:48:56 np0005465988 podman[315473]: 2025-10-02 12:48:56.556255172 +0000 UTC m=+0.317174141 container start 5603748d1ef035a18431f3d16ffaa6f3ed6d4a878a4e5faa6a0e845b487b820c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.557 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 439392e5-66ae-4162-a7e5-077f87ca558b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.558 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409336.5572534, 439392e5-66ae-4162-a7e5-077f87ca558b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.558 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.563 2 DEBUG nova.compute.manager [None req-048bceef-ace3-45af-8494-133091771294 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:48:56 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315489]: [NOTICE]   (315493) : New worker (315495) forked
Oct  2 08:48:56 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315489]: [NOTICE]   (315493) : Loading success.
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.645 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.648 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.708 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.709 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409336.5638368, 439392e5-66ae-4162-a7e5-077f87ca558b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.709 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] VM Started (Lifecycle Event)#033[00m
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.740 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:48:56 np0005465988 nova_compute[236126]: 2025-10-02 12:48:56.745 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:48:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:48:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:57.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:48:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:57.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:58 np0005465988 nova_compute[236126]: 2025-10-02 12:48:58.483 2 INFO nova.compute.manager [None req-5649fa65-c318-4451-8e30-d718a9f906dd 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Unrescuing#033[00m
Oct  2 08:48:58 np0005465988 nova_compute[236126]: 2025-10-02 12:48:58.484 2 DEBUG oslo_concurrency.lockutils [None req-5649fa65-c318-4451-8e30-d718a9f906dd 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:48:58 np0005465988 nova_compute[236126]: 2025-10-02 12:48:58.485 2 DEBUG oslo_concurrency.lockutils [None req-5649fa65-c318-4451-8e30-d718a9f906dd 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquired lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:48:58 np0005465988 nova_compute[236126]: 2025-10-02 12:48:58.485 2 DEBUG nova.network.neutron [None req-5649fa65-c318-4451-8e30-d718a9f906dd 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:48:58 np0005465988 nova_compute[236126]: 2025-10-02 12:48:58.570 2 DEBUG nova.compute.manager [req-a92cdbbf-1d50-4bc4-8576-8c814163e8bb req-c7b93e62-6561-4e5f-a626-e39818f5a64a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:48:58 np0005465988 nova_compute[236126]: 2025-10-02 12:48:58.571 2 DEBUG oslo_concurrency.lockutils [req-a92cdbbf-1d50-4bc4-8576-8c814163e8bb req-c7b93e62-6561-4e5f-a626-e39818f5a64a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:58 np0005465988 nova_compute[236126]: 2025-10-02 12:48:58.571 2 DEBUG oslo_concurrency.lockutils [req-a92cdbbf-1d50-4bc4-8576-8c814163e8bb req-c7b93e62-6561-4e5f-a626-e39818f5a64a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:58 np0005465988 nova_compute[236126]: 2025-10-02 12:48:58.572 2 DEBUG oslo_concurrency.lockutils [req-a92cdbbf-1d50-4bc4-8576-8c814163e8bb req-c7b93e62-6561-4e5f-a626-e39818f5a64a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:58 np0005465988 nova_compute[236126]: 2025-10-02 12:48:58.572 2 DEBUG nova.compute.manager [req-a92cdbbf-1d50-4bc4-8576-8c814163e8bb req-c7b93e62-6561-4e5f-a626-e39818f5a64a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] No waiting events found dispatching network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:48:58 np0005465988 nova_compute[236126]: 2025-10-02 12:48:58.573 2 WARNING nova.compute.manager [req-a92cdbbf-1d50-4bc4-8576-8c814163e8bb req-c7b93e62-6561-4e5f-a626-e39818f5a64a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received unexpected event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 for instance with vm_state rescued and task_state unrescuing.#033[00m
Oct  2 08:48:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e367 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:48:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:48:59.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:48:59 np0005465988 nova_compute[236126]: 2025-10-02 12:48:59.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:48:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:48:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:48:59.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:00 np0005465988 nova_compute[236126]: 2025-10-02 12:49:00.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:00 np0005465988 nova_compute[236126]: 2025-10-02 12:49:00.874 2 DEBUG nova.network.neutron [None req-5649fa65-c318-4451-8e30-d718a9f906dd 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Updating instance_info_cache with network_info: [{"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:49:00 np0005465988 nova_compute[236126]: 2025-10-02 12:49:00.979 2 DEBUG oslo_concurrency.lockutils [None req-5649fa65-c318-4451-8e30-d718a9f906dd 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Releasing lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:49:00 np0005465988 nova_compute[236126]: 2025-10-02 12:49:00.980 2 DEBUG nova.objects.instance [None req-5649fa65-c318-4451-8e30-d718a9f906dd 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'flavor' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:01 np0005465988 kernel: tap7020ab2d-94 (unregistering): left promiscuous mode
Oct  2 08:49:01 np0005465988 NetworkManager[45041]: <info>  [1759409341.0843] device (tap7020ab2d-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00774|binding|INFO|Releasing lport 7020ab2d-943e-4985-b442-c6584c56c0d2 from this chassis (sb_readonly=0)
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00775|binding|INFO|Setting lport 7020ab2d-943e-4985-b442-c6584c56c0d2 down in Southbound
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00776|binding|INFO|Removing iface tap7020ab2d-94 ovn-installed in OVS
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.142 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:a9:95 10.100.0.9'], port_security=['fa:16:3e:5b:a9:95 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '439392e5-66ae-4162-a7e5-077f87ca558b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7020ab2d-943e-4985-b442-c6584c56c0d2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.144 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7020ab2d-943e-4985-b442-c6584c56c0d2 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 unbound from our chassis#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.146 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48e4ff16-1388-40c7-a27a-83a3b4869808, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.148 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6670548a-1d5e-40bc-8856-96e87cb40a70]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.148 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 namespace which is not needed anymore#033[00m
Oct  2 08:49:01 np0005465988 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000ad.scope: Deactivated successfully.
Oct  2 08:49:01 np0005465988 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000ad.scope: Consumed 5.489s CPU time.
Oct  2 08:49:01 np0005465988 systemd-machined[192594]: Machine qemu-81-instance-000000ad terminated.
Oct  2 08:49:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:01.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:01 np0005465988 kernel: tap7020ab2d-94: entered promiscuous mode
Oct  2 08:49:01 np0005465988 NetworkManager[45041]: <info>  [1759409341.2540] manager: (tap7020ab2d-94): new Tun device (/org/freedesktop/NetworkManager/Devices/348)
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005465988 kernel: tap7020ab2d-94 (unregistering): left promiscuous mode
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00777|binding|INFO|Claiming lport 7020ab2d-943e-4985-b442-c6584c56c0d2 for this chassis.
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00778|binding|INFO|7020ab2d-943e-4985-b442-c6584c56c0d2: Claiming fa:16:3e:5b:a9:95 10.100.0.9
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.278 2 INFO nova.virt.libvirt.driver [-] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Instance destroyed successfully.#033[00m
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.278 2 DEBUG nova.objects.instance [None req-5649fa65-c318-4451-8e30-d718a9f906dd 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00779|binding|INFO|Setting lport 7020ab2d-943e-4985-b442-c6584c56c0d2 ovn-installed in OVS
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00780|if_status|INFO|Dropped 2 log messages in last 1090 seconds (most recently, 1090 seconds ago) due to excessive rate
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00781|if_status|INFO|Not setting lport 7020ab2d-943e-4985-b442-c6584c56c0d2 down as sb is readonly
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315489]: [NOTICE]   (315493) : haproxy version is 2.8.14-c23fe91
Oct  2 08:49:01 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315489]: [NOTICE]   (315493) : path to executable is /usr/sbin/haproxy
Oct  2 08:49:01 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315489]: [WARNING]  (315493) : Exiting Master process...
Oct  2 08:49:01 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315489]: [WARNING]  (315493) : Exiting Master process...
Oct  2 08:49:01 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315489]: [ALERT]    (315493) : Current worker (315495) exited with code 143 (Terminated)
Oct  2 08:49:01 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315489]: [WARNING]  (315493) : All workers exited. Exiting... (0)
Oct  2 08:49:01 np0005465988 systemd[1]: libpod-5603748d1ef035a18431f3d16ffaa6f3ed6d4a878a4e5faa6a0e845b487b820c.scope: Deactivated successfully.
Oct  2 08:49:01 np0005465988 podman[315530]: 2025-10-02 12:49:01.324062995 +0000 UTC m=+0.052544087 container died 5603748d1ef035a18431f3d16ffaa6f3ed6d4a878a4e5faa6a0e845b487b820c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00782|binding|INFO|Releasing lport 7020ab2d-943e-4985-b442-c6584c56c0d2 from this chassis (sb_readonly=0)
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.329 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:a9:95 10.100.0.9'], port_security=['fa:16:3e:5b:a9:95 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '439392e5-66ae-4162-a7e5-077f87ca558b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7020ab2d-943e-4985-b442-c6584c56c0d2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5603748d1ef035a18431f3d16ffaa6f3ed6d4a878a4e5faa6a0e845b487b820c-userdata-shm.mount: Deactivated successfully.
Oct  2 08:49:01 np0005465988 systemd[1]: var-lib-containers-storage-overlay-fe6f8ff7f2f10075a90e9172945b71cebc33dac4e522f77c81ba4da4553fa4d9-merged.mount: Deactivated successfully.
Oct  2 08:49:01 np0005465988 podman[315530]: 2025-10-02 12:49:01.37796803 +0000 UTC m=+0.106449122 container cleanup 5603748d1ef035a18431f3d16ffaa6f3ed6d4a878a4e5faa6a0e845b487b820c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:49:01 np0005465988 systemd[1]: libpod-conmon-5603748d1ef035a18431f3d16ffaa6f3ed6d4a878a4e5faa6a0e845b487b820c.scope: Deactivated successfully.
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.418 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:a9:95 10.100.0.9'], port_security=['fa:16:3e:5b:a9:95 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '439392e5-66ae-4162-a7e5-077f87ca558b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7020ab2d-943e-4985-b442-c6584c56c0d2) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:49:01 np0005465988 podman[315564]: 2025-10-02 12:49:01.449973237 +0000 UTC m=+0.047398148 container remove 5603748d1ef035a18431f3d16ffaa6f3ed6d4a878a4e5faa6a0e845b487b820c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.460 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae447a3-74ad-463a-a616-cf9c118a2d3e]: (4, ('Thu Oct  2 12:49:01 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 (5603748d1ef035a18431f3d16ffaa6f3ed6d4a878a4e5faa6a0e845b487b820c)\n5603748d1ef035a18431f3d16ffaa6f3ed6d4a878a4e5faa6a0e845b487b820c\nThu Oct  2 12:49:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 (5603748d1ef035a18431f3d16ffaa6f3ed6d4a878a4e5faa6a0e845b487b820c)\n5603748d1ef035a18431f3d16ffaa6f3ed6d4a878a4e5faa6a0e845b487b820c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.464 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d7ba18-49c3-43fd-b557-27715ac8536c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.465 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005465988 kernel: tap48e4ff16-10: left promiscuous mode
Oct  2 08:49:01 np0005465988 NetworkManager[45041]: <info>  [1759409341.5210] manager: (tap7020ab2d-94): new Tun device (/org/freedesktop/NetworkManager/Devices/349)
Oct  2 08:49:01 np0005465988 systemd-udevd[315510]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005465988 kernel: tap7020ab2d-94: entered promiscuous mode
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00783|binding|INFO|Claiming lport 7020ab2d-943e-4985-b442-c6584c56c0d2 for this chassis.
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00784|binding|INFO|7020ab2d-943e-4985-b442-c6584c56c0d2: Claiming fa:16:3e:5b:a9:95 10.100.0.9
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00785|binding|INFO|Removing lport 7020ab2d-943e-4985-b442-c6584c56c0d2 ovn-installed in OVS
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.527 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba6b8c0-52cf-4cc8-ae66-3f29961fdee1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 NetworkManager[45041]: <info>  [1759409341.5392] device (tap7020ab2d-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:49:01 np0005465988 NetworkManager[45041]: <info>  [1759409341.5400] device (tap7020ab2d-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00786|binding|INFO|Setting lport 7020ab2d-943e-4985-b442-c6584c56c0d2 ovn-installed in OVS
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005465988 systemd-machined[192594]: New machine qemu-82-instance-000000ad.
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.559 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[78644f47-fed6-4aa5-b86f-82e4f6fa52f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00787|binding|INFO|Setting lport 7020ab2d-943e-4985-b442-c6584c56c0d2 up in Southbound
Oct  2 08:49:01 np0005465988 systemd[1]: Started Virtual Machine qemu-82-instance-000000ad.
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.560 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6c3a7aea-af59-414d-af22-53bd9bc3054d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.564 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:a9:95 10.100.0.9'], port_security=['fa:16:3e:5b:a9:95 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '439392e5-66ae-4162-a7e5-077f87ca558b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7020ab2d-943e-4985-b442-c6584c56c0d2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.587 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[07dd443b-1b2b-4763-8be2-2c2ff4957bca]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 739588, 'reachable_time': 32010, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315594, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 systemd[1]: run-netns-ovnmeta\x2d48e4ff16\x2d1388\x2d40c7\x2da27a\x2d83a3b4869808.mount: Deactivated successfully.
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.593 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.593 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[f1a2662e-0ef1-47ce-8a16-cf3d362738e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.595 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7020ab2d-943e-4985-b442-c6584c56c0d2 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 unbound from our chassis#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.596 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e4ff16-1388-40c7-a27a-83a3b4869808#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.609 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[413bbb2f-0c04-4ea1-8ead-24fc22db87d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.610 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap48e4ff16-11 in ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.612 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap48e4ff16-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.612 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cba23b59-dcc8-459c-9f4d-d3e41e205345]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.613 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b4080b60-0044-4afa-be8e-5fee3fd04cbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.628 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[6236ec10-d569-4efd-89a2-097223908688]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.644 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[40d361df-4e09-4ac6-886e-0df772e877f4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.672 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[0293c919-085b-4f45-b918-0b471cdd02af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 NetworkManager[45041]: <info>  [1759409341.6808] manager: (tap48e4ff16-10): new Veth device (/org/freedesktop/NetworkManager/Devices/350)
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.679 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8ab0ab44-c565-48fa-9705-a34e3da71820]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.718 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[29f2b1d0-be15-4bd8-b050-624fc5410377]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.722 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[57ef502e-bbc6-4740-9e7c-862971a7c3dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 NetworkManager[45041]: <info>  [1759409341.7502] device (tap48e4ff16-10): carrier: link connected
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.761 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[a176d934-22a0-4136-b2d2-f1bcfea6a0ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.782 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[efdd0d56-5e24-4271-a340-d4c67f8390bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 740208, 'reachable_time': 35722, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315625, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.806 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[407e70ab-81aa-497f-91f9-4252c3642e72]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8b:53bc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740208, 'tstamp': 740208}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 315626, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.828 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2e835502-d485-47c0-a543-f7c6a18fa13e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 740208, 'reachable_time': 35722, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 315627, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.867 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[570ee309-94e1-440b-b419-db6791c12e10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.902 2 DEBUG nova.compute.manager [req-39a1590f-cc6d-4e96-aac9-f3ae2dc344fc req-79ea15f0-6f6a-487b-bf8d-69721b844cc2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-vif-unplugged-7020ab2d-943e-4985-b442-c6584c56c0d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.902 2 DEBUG oslo_concurrency.lockutils [req-39a1590f-cc6d-4e96-aac9-f3ae2dc344fc req-79ea15f0-6f6a-487b-bf8d-69721b844cc2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.903 2 DEBUG oslo_concurrency.lockutils [req-39a1590f-cc6d-4e96-aac9-f3ae2dc344fc req-79ea15f0-6f6a-487b-bf8d-69721b844cc2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.903 2 DEBUG oslo_concurrency.lockutils [req-39a1590f-cc6d-4e96-aac9-f3ae2dc344fc req-79ea15f0-6f6a-487b-bf8d-69721b844cc2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.903 2 DEBUG nova.compute.manager [req-39a1590f-cc6d-4e96-aac9-f3ae2dc344fc req-79ea15f0-6f6a-487b-bf8d-69721b844cc2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] No waiting events found dispatching network-vif-unplugged-7020ab2d-943e-4985-b442-c6584c56c0d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.903 2 WARNING nova.compute.manager [req-39a1590f-cc6d-4e96-aac9-f3ae2dc344fc req-79ea15f0-6f6a-487b-bf8d-69721b844cc2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received unexpected event network-vif-unplugged-7020ab2d-943e-4985-b442-c6584c56c0d2 for instance with vm_state rescued and task_state unrescuing.#033[00m
Oct  2 08:49:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:01.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.930 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[28c61f04-c714-4e6f-85ca-a96419beb30f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.932 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.932 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.933 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e4ff16-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005465988 NetworkManager[45041]: <info>  [1759409341.9356] manager: (tap48e4ff16-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/351)
Oct  2 08:49:01 np0005465988 kernel: tap48e4ff16-10: entered promiscuous mode
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.945 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e4ff16-10, col_values=(('external_ids', {'iface-id': '1fc80788-89b8-413a-b0b0-d36f1a11a2b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:01Z|00788|binding|INFO|Releasing lport 1fc80788-89b8-413a-b0b0-d36f1a11a2b1 from this chassis (sb_readonly=0)
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.958 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48e4ff16-1388-40c7-a27a-83a3b4869808.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48e4ff16-1388-40c7-a27a-83a3b4869808.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.959 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[003a9c05-d4c8-4823-9451-0fe252d28859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.960 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-48e4ff16-1388-40c7-a27a-83a3b4869808
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/48e4ff16-1388-40c7-a27a-83a3b4869808.pid.haproxy
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 48e4ff16-1388-40c7-a27a-83a3b4869808
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:49:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:01.960 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'env', 'PROCESS_TAG=haproxy-48e4ff16-1388-40c7-a27a-83a3b4869808', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/48e4ff16-1388-40c7-a27a-83a3b4869808.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:49:01 np0005465988 nova_compute[236126]: 2025-10-02 12:49:01.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:02Z|00789|binding|INFO|Releasing lport 28a672bd-7c4d-49bd-8937-0e065b62aa5f from this chassis (sb_readonly=0)
Oct  2 08:49:02 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:02Z|00790|binding|INFO|Releasing lport 1fc80788-89b8-413a-b0b0-d36f1a11a2b1 from this chassis (sb_readonly=0)
Oct  2 08:49:02 np0005465988 podman[315701]: 2025-10-02 12:49:02.308244057 +0000 UTC m=+0.022149760 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:02 np0005465988 podman[315701]: 2025-10-02 12:49:02.496150128 +0000 UTC m=+0.210055861 container create bf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.496 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.496 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.496 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.496 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.497 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.585 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 439392e5-66ae-4162-a7e5-077f87ca558b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.586 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409342.5842495, 439392e5-66ae-4162-a7e5-077f87ca558b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.587 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:49:02 np0005465988 systemd[1]: Started libpod-conmon-bf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb.scope.
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.630 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:02 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.638 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:49:02 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0755f3ea1419a1dbb47ea63f096e47b6843023923230b44feb353f6e5fd40b4e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.666 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.667 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409342.5862744, 439392e5-66ae-4162-a7e5-077f87ca558b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.667 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] VM Started (Lifecycle Event)#033[00m
Oct  2 08:49:02 np0005465988 podman[315701]: 2025-10-02 12:49:02.668433788 +0000 UTC m=+0.382339501 container init bf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:49:02 np0005465988 podman[315701]: 2025-10-02 12:49:02.674679238 +0000 UTC m=+0.388584931 container start bf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.696 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:02 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315718]: [NOTICE]   (315769) : New worker (315775) forked
Oct  2 08:49:02 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315718]: [NOTICE]   (315769) : Loading success.
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.706 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.728 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Oct  2 08:49:02 np0005465988 podman[315717]: 2025-10-02 12:49:02.742954157 +0000 UTC m=+0.121935828 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.763 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7020ab2d-943e-4985-b442-c6584c56c0d2 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 unbound from our chassis#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.766 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e4ff16-1388-40c7-a27a-83a3b4869808#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.781 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7fca1e5c-fb39-4ab6-863c-21261cf07003]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.825 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[e252ac0a-c19d-41bc-b084-eb5c6bf8cf1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.828 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4353441b-5d3f-4dd4-9cf2-c5da07cb956e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:02 np0005465988 ceph-mgr[76715]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3158772141
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.863 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[cbfe8553-7d09-4fd5-b8c6-d9088371a7ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.880 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d5b5a151-0963-4bea-b55e-73e053d085c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 6, 'rx_bytes': 306, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 6, 'rx_bytes': 306, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 740208, 'reachable_time': 35722, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 264, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 264, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315792, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.899 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8f00bf1f-deef-43d7-ae7f-69c68944d9f7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740222, 'tstamp': 740222}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 315793, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740225, 'tstamp': 740225}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 315793, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.902 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:02 np0005465988 nova_compute[236126]: 2025-10-02 12:49:02.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.905 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e4ff16-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.905 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.906 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e4ff16-10, col_values=(('external_ids', {'iface-id': '1fc80788-89b8-413a-b0b0-d36f1a11a2b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.906 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.907 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7020ab2d-943e-4985-b442-c6584c56c0d2 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 unbound from our chassis#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.909 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e4ff16-1388-40c7-a27a-83a3b4869808#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.927 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7905eb4b-292c-4afb-ac57-719010ef8681]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.961 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[feb210cf-6d3d-4b93-86cc-94de238831d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.965 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ef56edbf-38ba-4814-a586-fee515b708b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:02.995 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[be88ef0e-1186-4d5d-aff6-2b665b2fa23d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:03.013 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bf2adab3-13a3-4e41-bb9c-ed05271d3de7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 8, 'rx_bytes': 306, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 8, 'rx_bytes': 306, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 740208, 'reachable_time': 35722, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 264, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 264, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315799, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:49:03 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2645208543' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:49:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:03.034 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dd2d219a-0412-42fd-9908-ececef804aa7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740222, 'tstamp': 740222}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 315802, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740225, 'tstamp': 740225}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 315802, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:03.036 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:03.039 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e4ff16-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:03.039 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:03.040 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e4ff16-10, col_values=(('external_ids', {'iface-id': '1fc80788-89b8-413a-b0b0-d36f1a11a2b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:03.040 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.068 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.571s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.095 2 DEBUG nova.compute.manager [None req-5649fa65-c318-4451-8e30-d718a9f906dd 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.192 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.192 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.197 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000a8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.197 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000a8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:49:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:03.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.408 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.409 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3807MB free_disk=20.86111068725586GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.409 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.409 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.617 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance bf9e8de1-5081-4daa-9041-1d329e06be86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.617 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 439392e5-66ae-4162-a7e5-077f87ca558b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.618 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.620 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:49:03 np0005465988 nova_compute[236126]: 2025-10-02 12:49:03.752 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:49:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:03.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.080 2 DEBUG nova.compute.manager [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.081 2 DEBUG oslo_concurrency.lockutils [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.081 2 DEBUG oslo_concurrency.lockutils [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.081 2 DEBUG oslo_concurrency.lockutils [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.082 2 DEBUG nova.compute.manager [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] No waiting events found dispatching network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.082 2 WARNING nova.compute.manager [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received unexpected event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.082 2 DEBUG nova.compute.manager [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.083 2 DEBUG oslo_concurrency.lockutils [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.083 2 DEBUG oslo_concurrency.lockutils [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.083 2 DEBUG oslo_concurrency.lockutils [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.084 2 DEBUG nova.compute.manager [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] No waiting events found dispatching network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.085 2 WARNING nova.compute.manager [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received unexpected event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.085 2 DEBUG nova.compute.manager [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.085 2 DEBUG oslo_concurrency.lockutils [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.086 2 DEBUG oslo_concurrency.lockutils [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.086 2 DEBUG oslo_concurrency.lockutils [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.086 2 DEBUG nova.compute.manager [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] No waiting events found dispatching network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.086 2 WARNING nova.compute.manager [req-d8dd7b0d-bd88-4504-86d9-4949dfecb644 req-3d30ca4a-1312-4340-9784-92ec05c94731 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received unexpected event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:49:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:49:04 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2762881637' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.203 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.208 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:49:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e367 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.229 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.269 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.269 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:49:04 np0005465988 nova_compute[236126]: 2025-10-02 12:49:04.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:49:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:05.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:05 np0005465988 nova_compute[236126]: 2025-10-02 12:49:05.270 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:49:05 np0005465988 nova_compute[236126]: 2025-10-02 12:49:05.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:49:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:05.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:06 np0005465988 nova_compute[236126]: 2025-10-02 12:49:06.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:49:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:07.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:07.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e367 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:49:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:09.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:49:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:49:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:49:09 np0005465988 nova_compute[236126]: 2025-10-02 12:49:09.386 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:49:09 np0005465988 nova_compute[236126]: 2025-10-02 12:49:09.387 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:49:09 np0005465988 nova_compute[236126]: 2025-10-02 12:49:09.479 2 DEBUG nova.compute.manager [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:49:09 np0005465988 nova_compute[236126]: 2025-10-02 12:49:09.733 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:49:09 np0005465988 nova_compute[236126]: 2025-10-02 12:49:09.734 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:49:09 np0005465988 nova_compute[236126]: 2025-10-02 12:49:09.746 2 DEBUG nova.virt.hardware [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:49:09 np0005465988 nova_compute[236126]: 2025-10-02 12:49:09.747 2 INFO nova.compute.claims [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:49:09 np0005465988 nova_compute[236126]: 2025-10-02 12:49:09.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:49:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:09.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:09 np0005465988 nova_compute[236126]: 2025-10-02 12:49:09.983 2 DEBUG oslo_concurrency.processutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:49:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:49:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3025005931' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:49:10 np0005465988 nova_compute[236126]: 2025-10-02 12:49:10.486 2 DEBUG oslo_concurrency.processutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:49:10 np0005465988 nova_compute[236126]: 2025-10-02 12:49:10.494 2 DEBUG nova.compute.provider_tree [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:49:10 np0005465988 nova_compute[236126]: 2025-10-02 12:49:10.538 2 DEBUG nova.scheduler.client.report [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:49:10 np0005465988 nova_compute[236126]: 2025-10-02 12:49:10.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:49:10 np0005465988 nova_compute[236126]: 2025-10-02 12:49:10.793 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:49:10 np0005465988 nova_compute[236126]: 2025-10-02 12:49:10.794 2 DEBUG nova.compute.manager [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:49:10 np0005465988 nova_compute[236126]: 2025-10-02 12:49:10.970 2 DEBUG nova.compute.manager [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:49:10 np0005465988 nova_compute[236126]: 2025-10-02 12:49:10.971 2 DEBUG nova.network.neutron [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.045 2 INFO nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.133 2 DEBUG nova.compute.manager [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:49:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:49:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:11.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.745 2 DEBUG nova.compute.manager [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.746 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.747 2 INFO nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Creating image(s)
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.770 2 DEBUG nova.storage.rbd_utils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.794 2 DEBUG nova.storage.rbd_utils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.821 2 DEBUG nova.storage.rbd_utils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.826 2 DEBUG oslo_concurrency.processutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.902 2 DEBUG oslo_concurrency.processutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.903 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.904 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.904 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.932 2 DEBUG nova.storage.rbd_utils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.935 2 DEBUG oslo_concurrency.processutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:49:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:49:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:11.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:49:11 np0005465988 nova_compute[236126]: 2025-10-02 12:49:11.969 2 DEBUG nova.policy [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6785ffe5d6554514b4ed9fd47665eca0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6a442bc513e14406b73e96e70396e6c3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:49:12 np0005465988 nova_compute[236126]: 2025-10-02 12:49:12.353 2 DEBUG oslo_concurrency.processutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:49:12 np0005465988 nova_compute[236126]: 2025-10-02 12:49:12.434 2 DEBUG nova.storage.rbd_utils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] resizing rbd image ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct  2 08:49:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:12.677 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:49:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:12.679 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 08:49:12 np0005465988 nova_compute[236126]: 2025-10-02 12:49:12.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:49:12 np0005465988 nova_compute[236126]: 2025-10-02 12:49:12.711 2 DEBUG nova.objects.instance [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'migration_context' on Instance uuid ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:49:12 np0005465988 nova_compute[236126]: 2025-10-02 12:49:12.901 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:49:12 np0005465988 nova_compute[236126]: 2025-10-02 12:49:12.901 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Ensure instance console log exists: /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:49:12 np0005465988 nova_compute[236126]: 2025-10-02 12:49:12.902 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:49:12 np0005465988 nova_compute[236126]: 2025-10-02 12:49:12.902 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:49:12 np0005465988 nova_compute[236126]: 2025-10-02 12:49:12.903 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:49:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:13.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:13 np0005465988 nova_compute[236126]: 2025-10-02 12:49:13.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:49:13 np0005465988 nova_compute[236126]: 2025-10-02 12:49:13.940 2 DEBUG nova.network.neutron [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Successfully created port: eccb499c-961f-4ee4-9995-578966625db6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:49:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:49:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:13.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:49:14 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #56. Immutable memtables: 12.
Oct  2 08:49:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e367 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:49:14 np0005465988 nova_compute[236126]: 2025-10-02 12:49:14.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:49:14 np0005465988 nova_compute[236126]: 2025-10-02 12:49:14.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:49:14 np0005465988 nova_compute[236126]: 2025-10-02 12:49:14.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 08:49:14 np0005465988 nova_compute[236126]: 2025-10-02 12:49:14.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:49:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:49:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:15.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:49:15 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:15Z|00100|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5b:a9:95 10.100.0.9
Oct  2 08:49:15 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:15Z|00101|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:a9:95 10.100.0.9
Oct  2 08:49:15 np0005465988 nova_compute[236126]: 2025-10-02 12:49:15.651 2 DEBUG nova.network.neutron [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Successfully updated port: eccb499c-961f-4ee4-9995-578966625db6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:49:15 np0005465988 nova_compute[236126]: 2025-10-02 12:49:15.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:49:15 np0005465988 nova_compute[236126]: 2025-10-02 12:49:15.813 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:49:15 np0005465988 nova_compute[236126]: 2025-10-02 12:49:15.813 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquired lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:49:15 np0005465988 nova_compute[236126]: 2025-10-02 12:49:15.813 2 DEBUG nova.network.neutron [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:49:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:49:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:15.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:49:16 np0005465988 podman[316226]: 2025-10-02 12:49:16.110147406 +0000 UTC m=+0.073336027 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible)
Oct  2 08:49:16 np0005465988 podman[316227]: 2025-10-02 12:49:16.14254444 +0000 UTC m=+0.098864753 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:49:16 np0005465988 podman[316225]: 2025-10-02 12:49:16.160870559 +0000 UTC m=+0.117119390 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:49:16 np0005465988 nova_compute[236126]: 2025-10-02 12:49:16.187 2 DEBUG nova.network.neutron [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:49:16 np0005465988 nova_compute[236126]: 2025-10-02 12:49:16.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:16 np0005465988 nova_compute[236126]: 2025-10-02 12:49:16.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:49:16 np0005465988 nova_compute[236126]: 2025-10-02 12:49:16.510 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:49:16 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:16.682 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:49:16 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:49:17 np0005465988 nova_compute[236126]: 2025-10-02 12:49:17.060 2 DEBUG nova.compute.manager [req-0e5be19f-bf6a-47fd-8364-2c7b58f434b2 req-e0736331-96d1-44d6-8ea7-b88ab6589141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received event network-changed-eccb499c-961f-4ee4-9995-578966625db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:17 np0005465988 nova_compute[236126]: 2025-10-02 12:49:17.060 2 DEBUG nova.compute.manager [req-0e5be19f-bf6a-47fd-8364-2c7b58f434b2 req-e0736331-96d1-44d6-8ea7-b88ab6589141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Refreshing instance network info cache due to event network-changed-eccb499c-961f-4ee4-9995-578966625db6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:49:17 np0005465988 nova_compute[236126]: 2025-10-02 12:49:17.061 2 DEBUG oslo_concurrency.lockutils [req-0e5be19f-bf6a-47fd-8364-2c7b58f434b2 req-e0736331-96d1-44d6-8ea7-b88ab6589141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:49:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:17.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:17 np0005465988 nova_compute[236126]: 2025-10-02 12:49:17.509 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:17 np0005465988 nova_compute[236126]: 2025-10-02 12:49:17.510 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:17.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.013 2 DEBUG nova.network.neutron [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Updating instance_info_cache with network_info: [{"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.041 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Releasing lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.042 2 DEBUG nova.compute.manager [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Instance network_info: |[{"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.042 2 DEBUG oslo_concurrency.lockutils [req-0e5be19f-bf6a-47fd-8364-2c7b58f434b2 req-e0736331-96d1-44d6-8ea7-b88ab6589141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.042 2 DEBUG nova.network.neutron [req-0e5be19f-bf6a-47fd-8364-2c7b58f434b2 req-e0736331-96d1-44d6-8ea7-b88ab6589141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Refreshing network info cache for port eccb499c-961f-4ee4-9995-578966625db6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.046 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Start _get_guest_xml network_info=[{"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.052 2 WARNING nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.062 2 DEBUG nova.virt.libvirt.host [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.063 2 DEBUG nova.virt.libvirt.host [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.074 2 DEBUG nova.virt.libvirt.host [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.075 2 DEBUG nova.virt.libvirt.host [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.076 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.077 2 DEBUG nova.virt.hardware [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.077 2 DEBUG nova.virt.hardware [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.077 2 DEBUG nova.virt.hardware [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.078 2 DEBUG nova.virt.hardware [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.078 2 DEBUG nova.virt.hardware [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.078 2 DEBUG nova.virt.hardware [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.078 2 DEBUG nova.virt.hardware [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.079 2 DEBUG nova.virt.hardware [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.079 2 DEBUG nova.virt.hardware [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.079 2 DEBUG nova.virt.hardware [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.079 2 DEBUG nova.virt.hardware [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.083 2 DEBUG oslo_concurrency.processutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:49:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3336780980' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.562 2 DEBUG oslo_concurrency.processutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.590 2 DEBUG nova.storage.rbd_utils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:49:18 np0005465988 nova_compute[236126]: 2025-10-02 12:49:18.593 2 DEBUG oslo_concurrency.processutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:49:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:49:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2461171271' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.053 2 DEBUG oslo_concurrency.processutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.055 2 DEBUG nova.virt.libvirt.vif [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:49:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-926303208',display_name='tempest-ServerStableDeviceRescueTest-server-926303208',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-926303208',id=175,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6a442bc513e14406b73e96e70396e6c3',ramdisk_id='',reservation_id='r-s2ebpg4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-454391960',owner_user_name='tempest
-ServerStableDeviceRescueTest-454391960-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:49:11Z,user_data=None,user_id='6785ffe5d6554514b4ed9fd47665eca0',uuid=ac6724c1-4d98-45f7-8e2b-dfac55d9cb13,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.056 2 DEBUG nova.network.os_vif_util [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converting VIF {"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.057 2 DEBUG nova.network.os_vif_util [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:45:eb,bridge_name='br-int',has_traffic_filtering=True,id=eccb499c-961f-4ee4-9995-578966625db6,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeccb499c-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.058 2 DEBUG nova.objects.instance [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'pci_devices' on Instance uuid ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.084 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  <uuid>ac6724c1-4d98-45f7-8e2b-dfac55d9cb13</uuid>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  <name>instance-000000af</name>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-926303208</nova:name>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:49:18</nova:creationTime>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <nova:user uuid="6785ffe5d6554514b4ed9fd47665eca0">tempest-ServerStableDeviceRescueTest-454391960-project-member</nova:user>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <nova:project uuid="6a442bc513e14406b73e96e70396e6c3">tempest-ServerStableDeviceRescueTest-454391960</nova:project>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <nova:port uuid="eccb499c-961f-4ee4-9995-578966625db6">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <entry name="serial">ac6724c1-4d98-45f7-8e2b-dfac55d9cb13</entry>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <entry name="uuid">ac6724c1-4d98-45f7-8e2b-dfac55d9cb13</entry>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.config">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:39:45:eb"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <target dev="tapeccb499c-96"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/console.log" append="off"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:49:19 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:49:19 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:49:19 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:49:19 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.086 2 DEBUG nova.compute.manager [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Preparing to wait for external event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.086 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.086 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.087 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.088 2 DEBUG nova.virt.libvirt.vif [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:49:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-926303208',display_name='tempest-ServerStableDeviceRescueTest-server-926303208',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-926303208',id=175,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6a442bc513e14406b73e96e70396e6c3',ramdisk_id='',reservation_id='r-s2ebpg4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-454391960',owner_user_name='tempest-ServerStableDeviceRescueTest-454391960-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:49:11Z,user_data=None,user_id='6785ffe5d6554514b4ed9fd47665eca0',uuid=ac6724c1-4d98-45f7-8e2b-dfac55d9cb13,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.088 2 DEBUG nova.network.os_vif_util [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converting VIF {"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.089 2 DEBUG nova.network.os_vif_util [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:45:eb,bridge_name='br-int',has_traffic_filtering=True,id=eccb499c-961f-4ee4-9995-578966625db6,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeccb499c-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.089 2 DEBUG os_vif [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:45:eb,bridge_name='br-int',has_traffic_filtering=True,id=eccb499c-961f-4ee4-9995-578966625db6,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeccb499c-96') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.090 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.091 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.094 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeccb499c-96, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.095 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeccb499c-96, col_values=(('external_ids', {'iface-id': 'eccb499c-961f-4ee4-9995-578966625db6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:45:eb', 'vm-uuid': 'ac6724c1-4d98-45f7-8e2b-dfac55d9cb13'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:19 np0005465988 NetworkManager[45041]: <info>  [1759409359.0981] manager: (tapeccb499c-96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/352)
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.106 2 INFO os_vif [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:45:eb,bridge_name='br-int',has_traffic_filtering=True,id=eccb499c-961f-4ee4-9995-578966625db6,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeccb499c-96')#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.164 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.166 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.166 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No VIF found with MAC fa:16:3e:39:45:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.166 2 INFO nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Using config drive#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.194 2 DEBUG nova.storage.rbd_utils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:49:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e367 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:49:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:19.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.766 2 INFO nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Creating config drive at /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/disk.config#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.772 2 DEBUG oslo_concurrency.processutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiumomjre execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.916 2 DEBUG oslo_concurrency.processutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiumomjre" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.948 2 DEBUG nova.storage.rbd_utils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:49:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:19.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.952 2 DEBUG oslo_concurrency.processutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/disk.config ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:49:19 np0005465988 nova_compute[236126]: 2025-10-02 12:49:19.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:20 np0005465988 nova_compute[236126]: 2025-10-02 12:49:20.301 2 DEBUG oslo_concurrency.processutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/disk.config ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.349s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:49:20 np0005465988 nova_compute[236126]: 2025-10-02 12:49:20.301 2 INFO nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Deleting local config drive /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/disk.config because it was imported into RBD.#033[00m
Oct  2 08:49:20 np0005465988 kernel: tapeccb499c-96: entered promiscuous mode
Oct  2 08:49:20 np0005465988 NetworkManager[45041]: <info>  [1759409360.3509] manager: (tapeccb499c-96): new Tun device (/org/freedesktop/NetworkManager/Devices/353)
Oct  2 08:49:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:20Z|00791|binding|INFO|Claiming lport eccb499c-961f-4ee4-9995-578966625db6 for this chassis.
Oct  2 08:49:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:20Z|00792|binding|INFO|eccb499c-961f-4ee4-9995-578966625db6: Claiming fa:16:3e:39:45:eb 10.100.0.6
Oct  2 08:49:20 np0005465988 nova_compute[236126]: 2025-10-02 12:49:20.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:20.357 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:45:eb 10.100.0.6'], port_security=['fa:16:3e:39:45:eb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ac6724c1-4d98-45f7-8e2b-dfac55d9cb13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=eccb499c-961f-4ee4-9995-578966625db6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:49:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:20.359 142124 INFO neutron.agent.ovn.metadata.agent [-] Port eccb499c-961f-4ee4-9995-578966625db6 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 bound to our chassis#033[00m
Oct  2 08:49:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:20.361 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e4ff16-1388-40c7-a27a-83a3b4869808#033[00m
Oct  2 08:49:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:20Z|00793|binding|INFO|Setting lport eccb499c-961f-4ee4-9995-578966625db6 ovn-installed in OVS
Oct  2 08:49:20 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:20Z|00794|binding|INFO|Setting lport eccb499c-961f-4ee4-9995-578966625db6 up in Southbound
Oct  2 08:49:20 np0005465988 nova_compute[236126]: 2025-10-02 12:49:20.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:20 np0005465988 nova_compute[236126]: 2025-10-02 12:49:20.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:20.381 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4ca3e810-c070-4a23-94ac-656b7049d4c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:20 np0005465988 systemd-udevd[316455]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:49:20 np0005465988 systemd-machined[192594]: New machine qemu-83-instance-000000af.
Oct  2 08:49:20 np0005465988 NetworkManager[45041]: <info>  [1759409360.4037] device (tapeccb499c-96): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:49:20 np0005465988 NetworkManager[45041]: <info>  [1759409360.4046] device (tapeccb499c-96): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:49:20 np0005465988 systemd[1]: Started Virtual Machine qemu-83-instance-000000af.
Oct  2 08:49:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:20.413 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c1f7e4f4-ae0f-43bf-b95d-bd7bd42ec2b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:20.416 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d1dcd530-9d6f-479f-a67b-20cb9029c40d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:20 np0005465988 nova_compute[236126]: 2025-10-02 12:49:20.424 2 DEBUG nova.network.neutron [req-0e5be19f-bf6a-47fd-8364-2c7b58f434b2 req-e0736331-96d1-44d6-8ea7-b88ab6589141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Updated VIF entry in instance network info cache for port eccb499c-961f-4ee4-9995-578966625db6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:49:20 np0005465988 nova_compute[236126]: 2025-10-02 12:49:20.424 2 DEBUG nova.network.neutron [req-0e5be19f-bf6a-47fd-8364-2c7b58f434b2 req-e0736331-96d1-44d6-8ea7-b88ab6589141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Updating instance_info_cache with network_info: [{"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:49:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:20.451 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[1956066c-5a11-483b-a6b1-8edb1da0e772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:20 np0005465988 nova_compute[236126]: 2025-10-02 12:49:20.461 2 DEBUG oslo_concurrency.lockutils [req-0e5be19f-bf6a-47fd-8364-2c7b58f434b2 req-e0736331-96d1-44d6-8ea7-b88ab6589141 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:49:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:20.473 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3607f47c-27b9-4def-b0b9-1e597b0c779d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 10, 'rx_bytes': 874, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 10, 'rx_bytes': 874, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 740208, 'reachable_time': 35722, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 316465, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:20.492 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8e11f706-1885-47fd-9486-e06adf6b2e87]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740222, 'tstamp': 740222}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 316469, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740225, 'tstamp': 740225}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 316469, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:20.494 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:20 np0005465988 nova_compute[236126]: 2025-10-02 12:49:20.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:20 np0005465988 nova_compute[236126]: 2025-10-02 12:49:20.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:20.505 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e4ff16-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:20.505 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:20.506 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e4ff16-10, col_values=(('external_ids', {'iface-id': '1fc80788-89b8-413a-b0b0-d36f1a11a2b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:20.506 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:21.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.394 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409361.394161, ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.395 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] VM Started (Lifecycle Event)#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.425 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.429 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409361.3958578, ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.429 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.454 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.458 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.487 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.493 2 DEBUG nova.compute.manager [req-f57cab6c-8edd-46cc-92d5-3fefcb3dc1fb req-a6d75fb7-6b25-454f-89de-927815909ffd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.493 2 DEBUG oslo_concurrency.lockutils [req-f57cab6c-8edd-46cc-92d5-3fefcb3dc1fb req-a6d75fb7-6b25-454f-89de-927815909ffd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.493 2 DEBUG oslo_concurrency.lockutils [req-f57cab6c-8edd-46cc-92d5-3fefcb3dc1fb req-a6d75fb7-6b25-454f-89de-927815909ffd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.493 2 DEBUG oslo_concurrency.lockutils [req-f57cab6c-8edd-46cc-92d5-3fefcb3dc1fb req-a6d75fb7-6b25-454f-89de-927815909ffd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.494 2 DEBUG nova.compute.manager [req-f57cab6c-8edd-46cc-92d5-3fefcb3dc1fb req-a6d75fb7-6b25-454f-89de-927815909ffd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Processing event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.494 2 DEBUG nova.compute.manager [req-f57cab6c-8edd-46cc-92d5-3fefcb3dc1fb req-a6d75fb7-6b25-454f-89de-927815909ffd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.494 2 DEBUG oslo_concurrency.lockutils [req-f57cab6c-8edd-46cc-92d5-3fefcb3dc1fb req-a6d75fb7-6b25-454f-89de-927815909ffd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.494 2 DEBUG oslo_concurrency.lockutils [req-f57cab6c-8edd-46cc-92d5-3fefcb3dc1fb req-a6d75fb7-6b25-454f-89de-927815909ffd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.494 2 DEBUG oslo_concurrency.lockutils [req-f57cab6c-8edd-46cc-92d5-3fefcb3dc1fb req-a6d75fb7-6b25-454f-89de-927815909ffd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.495 2 DEBUG nova.compute.manager [req-f57cab6c-8edd-46cc-92d5-3fefcb3dc1fb req-a6d75fb7-6b25-454f-89de-927815909ffd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] No waiting events found dispatching network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.495 2 WARNING nova.compute.manager [req-f57cab6c-8edd-46cc-92d5-3fefcb3dc1fb req-a6d75fb7-6b25-454f-89de-927815909ffd d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received unexpected event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.495 2 DEBUG nova.compute.manager [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.499 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409361.498969, ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.499 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.500 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.504 2 INFO nova.virt.libvirt.driver [-] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Instance spawned successfully.#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.504 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.534 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.542 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.547 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.548 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.548 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.549 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.549 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.550 2 DEBUG nova.virt.libvirt.driver [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.593 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.632 2 INFO nova.compute.manager [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Took 9.89 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.633 2 DEBUG nova.compute.manager [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.721 2 INFO nova.compute.manager [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Took 12.02 seconds to build instance.#033[00m
Oct  2 08:49:21 np0005465988 nova_compute[236126]: 2025-10-02 12:49:21.752 2 DEBUG oslo_concurrency.lockutils [None req-0d1725aa-3f2e-41dc-bb45-00d098416c68 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.365s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:21.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:22 np0005465988 nova_compute[236126]: 2025-10-02 12:49:22.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:22 np0005465988 nova_compute[236126]: 2025-10-02 12:49:22.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:49:22 np0005465988 nova_compute[236126]: 2025-10-02 12:49:22.476 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:49:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e368 e368: 3 total, 3 up, 3 in
Oct  2 08:49:22 np0005465988 nova_compute[236126]: 2025-10-02 12:49:22.852 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:49:22 np0005465988 nova_compute[236126]: 2025-10-02 12:49:22.853 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:49:22 np0005465988 nova_compute[236126]: 2025-10-02 12:49:22.853 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:49:22 np0005465988 nova_compute[236126]: 2025-10-02 12:49:22.853 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bf9e8de1-5081-4daa-9041-1d329e06be86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:23.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:23 np0005465988 nova_compute[236126]: 2025-10-02 12:49:23.955 2 DEBUG nova.compute.manager [None req-d0553b69-c504-43dc-973e-0977ce64c8a9 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:49:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:23.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:49:24 np0005465988 nova_compute[236126]: 2025-10-02 12:49:24.071 2 INFO nova.compute.manager [None req-d0553b69-c504-43dc-973e-0977ce64c8a9 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] instance snapshotting#033[00m
Oct  2 08:49:24 np0005465988 nova_compute[236126]: 2025-10-02 12:49:24.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e368 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:49:24 np0005465988 nova_compute[236126]: 2025-10-02 12:49:24.443 2 INFO nova.virt.libvirt.driver [None req-d0553b69-c504-43dc-973e-0977ce64c8a9 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Beginning live snapshot process#033[00m
Oct  2 08:49:24 np0005465988 nova_compute[236126]: 2025-10-02 12:49:24.692 2 DEBUG nova.virt.libvirt.imagebackend [None req-d0553b69-c504-43dc-973e-0977ce64c8a9 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No parent info for c2d0c2bc-fe21-4689-86ae-d6728c15874c; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Oct  2 08:49:24 np0005465988 nova_compute[236126]: 2025-10-02 12:49:24.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:25 np0005465988 nova_compute[236126]: 2025-10-02 12:49:25.147 2 DEBUG nova.storage.rbd_utils [None req-d0553b69-c504-43dc-973e-0977ce64c8a9 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] creating snapshot(5ea1588124584a6c87c93f3d3459cfa2) on rbd image(ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:49:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:25.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:25 np0005465988 nova_compute[236126]: 2025-10-02 12:49:25.286 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Updating instance_info_cache with network_info: [{"id": "3ee9f78f-884b-40ae-b226-ed5161be4522", "address": "fa:16:3e:05:51:c5", "network": {"id": "0739dafe-4d9b-4048-ac97-c017fd298447", "bridge": "br-int", "label": "tempest-TestStampPattern-656915325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a837417d42da439cb794b4295bca2cee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ee9f78f-88", "ovs_interfaceid": "3ee9f78f-884b-40ae-b226-ed5161be4522", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:49:25 np0005465988 nova_compute[236126]: 2025-10-02 12:49:25.314 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:49:25 np0005465988 nova_compute[236126]: 2025-10-02 12:49:25.315 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:49:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e369 e369: 3 total, 3 up, 3 in
Oct  2 08:49:25 np0005465988 nova_compute[236126]: 2025-10-02 12:49:25.632 2 DEBUG nova.storage.rbd_utils [None req-d0553b69-c504-43dc-973e-0977ce64c8a9 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] cloning vms/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk@5ea1588124584a6c87c93f3d3459cfa2 to images/b25d743c-aae4-4c6f-a9cb-a03c63b08b2c clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:49:25 np0005465988 nova_compute[236126]: 2025-10-02 12:49:25.950 2 DEBUG nova.storage.rbd_utils [None req-d0553b69-c504-43dc-973e-0977ce64c8a9 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] flattening images/b25d743c-aae4-4c6f-a9cb-a03c63b08b2c flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 08:49:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:25.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:26 np0005465988 nova_compute[236126]: 2025-10-02 12:49:26.998 2 DEBUG nova.storage.rbd_utils [None req-d0553b69-c504-43dc-973e-0977ce64c8a9 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] removing snapshot(5ea1588124584a6c87c93f3d3459cfa2) on rbd image(ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 08:49:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:27.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:27.394 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:27.394 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:27.395 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e370 e370: 3 total, 3 up, 3 in
Oct  2 08:49:27 np0005465988 nova_compute[236126]: 2025-10-02 12:49:27.936 2 DEBUG nova.storage.rbd_utils [None req-d0553b69-c504-43dc-973e-0977ce64c8a9 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] creating snapshot(snap) on rbd image(b25d743c-aae4-4c6f-a9cb-a03c63b08b2c) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 08:49:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:27.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.057 2 DEBUG oslo_concurrency.lockutils [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Acquiring lock "bf9e8de1-5081-4daa-9041-1d329e06be86" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.058 2 DEBUG oslo_concurrency.lockutils [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.059 2 DEBUG oslo_concurrency.lockutils [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Acquiring lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.059 2 DEBUG oslo_concurrency.lockutils [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.060 2 DEBUG oslo_concurrency.lockutils [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.061 2 INFO nova.compute.manager [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Terminating instance#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.062 2 DEBUG nova.compute.manager [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:49:28 np0005465988 kernel: tap3ee9f78f-88 (unregistering): left promiscuous mode
Oct  2 08:49:28 np0005465988 NetworkManager[45041]: <info>  [1759409368.1248] device (tap3ee9f78f-88): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:49:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:28Z|00795|binding|INFO|Releasing lport 3ee9f78f-884b-40ae-b226-ed5161be4522 from this chassis (sb_readonly=0)
Oct  2 08:49:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:28Z|00796|binding|INFO|Setting lport 3ee9f78f-884b-40ae-b226-ed5161be4522 down in Southbound
Oct  2 08:49:28 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:28Z|00797|binding|INFO|Removing iface tap3ee9f78f-88 ovn-installed in OVS
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:28.178 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:51:c5 10.100.0.10'], port_security=['fa:16:3e:05:51:c5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'bf9e8de1-5081-4daa-9041-1d329e06be86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0739dafe-4d9b-4048-ac97-c017fd298447', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a837417d42da439cb794b4295bca2cee', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2135b395-ac43-463e-b267-fe36f0a53800', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57c22c18-07d9-4913-840f-1bcc05bb2313, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=3ee9f78f-884b-40ae-b226-ed5161be4522) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:49:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:28.179 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 3ee9f78f-884b-40ae-b226-ed5161be4522 in datapath 0739dafe-4d9b-4048-ac97-c017fd298447 unbound from our chassis#033[00m
Oct  2 08:49:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:28.181 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0739dafe-4d9b-4048-ac97-c017fd298447, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:49:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:28.182 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bc78b009-492a-44a1-a88b-1975b49f3001]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:28.184 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447 namespace which is not needed anymore#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:28 np0005465988 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000a8.scope: Deactivated successfully.
Oct  2 08:49:28 np0005465988 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000a8.scope: Consumed 19.087s CPU time.
Oct  2 08:49:28 np0005465988 systemd-machined[192594]: Machine qemu-79-instance-000000a8 terminated.
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.315 2 INFO nova.virt.libvirt.driver [-] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Instance destroyed successfully.#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.318 2 DEBUG nova.objects.instance [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lazy-loading 'resources' on Instance uuid bf9e8de1-5081-4daa-9041-1d329e06be86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.347 2 DEBUG nova.virt.libvirt.vif [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestStampPattern-server-1078269595',display_name='tempest-TestStampPattern-server-1078269595',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-teststamppattern-server-1078269595',id=168,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCPnME7yj7AGpi8+zR0j6wiXokmK+k5Lh86YSlnXsP0prCCTi2saYZPKg3ZreiW8R+IqbWLBOHnWbtyyC7ToJeWaqTKxTG25O47OUrV5FbVX8vZbUi2AzjxwYa4KvWo3jw==',key_name='tempest-TestStampPattern-1303591770',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:47:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a837417d42da439cb794b4295bca2cee',ramdisk_id='',reservation_id='r-qfgfequ1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestStampPattern-901207223',owner_user_name='tempest-TestStampPattern-901207223-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:48:09Z,user_data=None,user_id='a24a7109471f4d96ad5f11b637fdb8e7',uuid=bf9e8de1-5081-4daa-9041-1d329e06be86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3ee9f78f-884b-40ae-b226-ed5161be4522", "address": "fa:16:3e:05:51:c5", "network": {"id": "0739dafe-4d9b-4048-ac97-c017fd298447", "bridge": "br-int", "label": "tempest-TestStampPattern-656915325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a837417d42da439cb794b4295bca2cee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ee9f78f-88", "ovs_interfaceid": "3ee9f78f-884b-40ae-b226-ed5161be4522", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.348 2 DEBUG nova.network.os_vif_util [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Converting VIF {"id": "3ee9f78f-884b-40ae-b226-ed5161be4522", "address": "fa:16:3e:05:51:c5", "network": {"id": "0739dafe-4d9b-4048-ac97-c017fd298447", "bridge": "br-int", "label": "tempest-TestStampPattern-656915325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a837417d42da439cb794b4295bca2cee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ee9f78f-88", "ovs_interfaceid": "3ee9f78f-884b-40ae-b226-ed5161be4522", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.349 2 DEBUG nova.network.os_vif_util [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:05:51:c5,bridge_name='br-int',has_traffic_filtering=True,id=3ee9f78f-884b-40ae-b226-ed5161be4522,network=Network(0739dafe-4d9b-4048-ac97-c017fd298447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ee9f78f-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.350 2 DEBUG os_vif [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:51:c5,bridge_name='br-int',has_traffic_filtering=True,id=3ee9f78f-884b-40ae-b226-ed5161be4522,network=Network(0739dafe-4d9b-4048-ac97-c017fd298447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ee9f78f-88') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.352 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ee9f78f-88, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.359 2 INFO os_vif [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:51:c5,bridge_name='br-int',has_traffic_filtering=True,id=3ee9f78f-884b-40ae-b226-ed5161be4522,network=Network(0739dafe-4d9b-4048-ac97-c017fd298447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ee9f78f-88')#033[00m
Oct  2 08:49:28 np0005465988 neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447[313231]: [NOTICE]   (313235) : haproxy version is 2.8.14-c23fe91
Oct  2 08:49:28 np0005465988 neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447[313231]: [NOTICE]   (313235) : path to executable is /usr/sbin/haproxy
Oct  2 08:49:28 np0005465988 neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447[313231]: [WARNING]  (313235) : Exiting Master process...
Oct  2 08:49:28 np0005465988 neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447[313231]: [WARNING]  (313235) : Exiting Master process...
Oct  2 08:49:28 np0005465988 neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447[313231]: [ALERT]    (313235) : Current worker (313237) exited with code 143 (Terminated)
Oct  2 08:49:28 np0005465988 neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447[313231]: [WARNING]  (313235) : All workers exited. Exiting... (0)
Oct  2 08:49:28 np0005465988 systemd[1]: libpod-927dc1bc1049abb4743020a3fa898225f39eae6f80a308677c0cd7c9f06ae42b.scope: Deactivated successfully.
Oct  2 08:49:28 np0005465988 podman[316686]: 2025-10-02 12:49:28.499198496 +0000 UTC m=+0.202002968 container died 927dc1bc1049abb4743020a3fa898225f39eae6f80a308677c0cd7c9f06ae42b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.964 2 DEBUG nova.compute.manager [req-76c2fe68-7950-4fc4-94ec-d4c4e54ef364 req-20c9205a-0512-4856-817f-dd73d933f37b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Received event network-vif-unplugged-3ee9f78f-884b-40ae-b226-ed5161be4522 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.966 2 DEBUG oslo_concurrency.lockutils [req-76c2fe68-7950-4fc4-94ec-d4c4e54ef364 req-20c9205a-0512-4856-817f-dd73d933f37b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.967 2 DEBUG oslo_concurrency.lockutils [req-76c2fe68-7950-4fc4-94ec-d4c4e54ef364 req-20c9205a-0512-4856-817f-dd73d933f37b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.967 2 DEBUG oslo_concurrency.lockutils [req-76c2fe68-7950-4fc4-94ec-d4c4e54ef364 req-20c9205a-0512-4856-817f-dd73d933f37b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.967 2 DEBUG nova.compute.manager [req-76c2fe68-7950-4fc4-94ec-d4c4e54ef364 req-20c9205a-0512-4856-817f-dd73d933f37b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] No waiting events found dispatching network-vif-unplugged-3ee9f78f-884b-40ae-b226-ed5161be4522 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:28 np0005465988 nova_compute[236126]: 2025-10-02 12:49:28.968 2 DEBUG nova.compute.manager [req-76c2fe68-7950-4fc4-94ec-d4c4e54ef364 req-20c9205a-0512-4856-817f-dd73d933f37b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Received event network-vif-unplugged-3ee9f78f-884b-40ae-b226-ed5161be4522 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:49:29 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-927dc1bc1049abb4743020a3fa898225f39eae6f80a308677c0cd7c9f06ae42b-userdata-shm.mount: Deactivated successfully.
Oct  2 08:49:29 np0005465988 systemd[1]: var-lib-containers-storage-overlay-b93d2b678e4bc3708eb956b1bc508428e60a43d0a18e51dced4a5bd2aad6f74e-merged.mount: Deactivated successfully.
Oct  2 08:49:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e371 e371: 3 total, 3 up, 3 in
Oct  2 08:49:29 np0005465988 podman[316686]: 2025-10-02 12:49:29.124634779 +0000 UTC m=+0.827439261 container cleanup 927dc1bc1049abb4743020a3fa898225f39eae6f80a308677c0cd7c9f06ae42b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:49:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e372 e372: 3 total, 3 up, 3 in
Oct  2 08:49:29 np0005465988 systemd[1]: libpod-conmon-927dc1bc1049abb4743020a3fa898225f39eae6f80a308677c0cd7c9f06ae42b.scope: Deactivated successfully.
Oct  2 08:49:29 np0005465988 podman[316742]: 2025-10-02 12:49:29.209406174 +0000 UTC m=+0.051667671 container remove 927dc1bc1049abb4743020a3fa898225f39eae6f80a308677c0cd7c9f06ae42b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:49:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:29.217 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fb836335-c442-4860-bedc-db93e864e7c2]: (4, ('Thu Oct  2 12:49:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447 (927dc1bc1049abb4743020a3fa898225f39eae6f80a308677c0cd7c9f06ae42b)\n927dc1bc1049abb4743020a3fa898225f39eae6f80a308677c0cd7c9f06ae42b\nThu Oct  2 12:49:29 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447 (927dc1bc1049abb4743020a3fa898225f39eae6f80a308677c0cd7c9f06ae42b)\n927dc1bc1049abb4743020a3fa898225f39eae6f80a308677c0cd7c9f06ae42b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:29.220 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[774d036f-1b50-4aaa-bac8-6b7911c91a7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:29.221 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0739dafe-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:29 np0005465988 kernel: tap0739dafe-40: left promiscuous mode
Oct  2 08:49:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e372 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:49:29 np0005465988 nova_compute[236126]: 2025-10-02 12:49:29.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:29 np0005465988 nova_compute[236126]: 2025-10-02 12:49:29.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:29.244 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fee237b9-1383-4c7d-82a2-65ee15e5897b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:29.270 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[60985e28-3920-4b26-82e6-af1a0427250c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:29.272 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c513a9ef-eb47-4b52-8172-6e83da171f26]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:29.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:29.290 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2c0151cb-cd0d-41fa-9968-74bae33d0fb1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 729910, 'reachable_time': 37911, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 316757, 'error': None, 'target': 'ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:29 np0005465988 systemd[1]: run-netns-ovnmeta\x2d0739dafe\x2d4d9b\x2d4048\x2dac97\x2dc017fd298447.mount: Deactivated successfully.
Oct  2 08:49:29 np0005465988 systemd[1]: run-netns-ovnmeta\x2d0739dafe\x2d4d9b\x2d4048\x2dac97\x2dc017fd298447.mount: Deactivated successfully.
Oct  2 08:49:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:29.296 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0739dafe-4d9b-4048-ac97-c017fd298447 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:49:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:29.296 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[0e22f01c-f2fb-4209-bfc9-0c8ec0cec7ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:29 np0005465988 nova_compute[236126]: 2025-10-02 12:49:29.358 2 DEBUG nova.compute.manager [req-d0b4dd48-1027-4608-8549-439434284002 req-875b7db6-51a7-44a5-b005-44a4bbdc2493 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Received event network-changed-3ee9f78f-884b-40ae-b226-ed5161be4522 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:29 np0005465988 nova_compute[236126]: 2025-10-02 12:49:29.358 2 DEBUG nova.compute.manager [req-d0b4dd48-1027-4608-8549-439434284002 req-875b7db6-51a7-44a5-b005-44a4bbdc2493 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Refreshing instance network info cache due to event network-changed-3ee9f78f-884b-40ae-b226-ed5161be4522. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:49:29 np0005465988 nova_compute[236126]: 2025-10-02 12:49:29.359 2 DEBUG oslo_concurrency.lockutils [req-d0b4dd48-1027-4608-8549-439434284002 req-875b7db6-51a7-44a5-b005-44a4bbdc2493 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:49:29 np0005465988 nova_compute[236126]: 2025-10-02 12:49:29.359 2 DEBUG oslo_concurrency.lockutils [req-d0b4dd48-1027-4608-8549-439434284002 req-875b7db6-51a7-44a5-b005-44a4bbdc2493 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:49:29 np0005465988 nova_compute[236126]: 2025-10-02 12:49:29.360 2 DEBUG nova.network.neutron [req-d0b4dd48-1027-4608-8549-439434284002 req-875b7db6-51a7-44a5-b005-44a4bbdc2493 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Refreshing network info cache for port 3ee9f78f-884b-40ae-b226-ed5161be4522 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:49:29 np0005465988 nova_compute[236126]: 2025-10-02 12:49:29.650 2 INFO nova.virt.libvirt.driver [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Deleting instance files /var/lib/nova/instances/bf9e8de1-5081-4daa-9041-1d329e06be86_del#033[00m
Oct  2 08:49:29 np0005465988 nova_compute[236126]: 2025-10-02 12:49:29.651 2 INFO nova.virt.libvirt.driver [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Deletion of /var/lib/nova/instances/bf9e8de1-5081-4daa-9041-1d329e06be86_del complete#033[00m
Oct  2 08:49:29 np0005465988 nova_compute[236126]: 2025-10-02 12:49:29.716 2 INFO nova.compute.manager [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Took 1.65 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:49:29 np0005465988 nova_compute[236126]: 2025-10-02 12:49:29.717 2 DEBUG oslo.service.loopingcall [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:49:29 np0005465988 nova_compute[236126]: 2025-10-02 12:49:29.717 2 DEBUG nova.compute.manager [-] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:49:29 np0005465988 nova_compute[236126]: 2025-10-02 12:49:29.718 2 DEBUG nova.network.neutron [-] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:49:29 np0005465988 nova_compute[236126]: 2025-10-02 12:49:29.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:29.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:31 np0005465988 nova_compute[236126]: 2025-10-02 12:49:31.254 2 DEBUG nova.compute.manager [req-1daaacdb-fef0-4f91-a50f-87c5198fe3e0 req-934717ff-b68e-4312-b2eb-58d39b4e1a32 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Received event network-vif-plugged-3ee9f78f-884b-40ae-b226-ed5161be4522 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:31 np0005465988 nova_compute[236126]: 2025-10-02 12:49:31.255 2 DEBUG oslo_concurrency.lockutils [req-1daaacdb-fef0-4f91-a50f-87c5198fe3e0 req-934717ff-b68e-4312-b2eb-58d39b4e1a32 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:31 np0005465988 nova_compute[236126]: 2025-10-02 12:49:31.255 2 DEBUG oslo_concurrency.lockutils [req-1daaacdb-fef0-4f91-a50f-87c5198fe3e0 req-934717ff-b68e-4312-b2eb-58d39b4e1a32 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:31 np0005465988 nova_compute[236126]: 2025-10-02 12:49:31.256 2 DEBUG oslo_concurrency.lockutils [req-1daaacdb-fef0-4f91-a50f-87c5198fe3e0 req-934717ff-b68e-4312-b2eb-58d39b4e1a32 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:31 np0005465988 nova_compute[236126]: 2025-10-02 12:49:31.256 2 DEBUG nova.compute.manager [req-1daaacdb-fef0-4f91-a50f-87c5198fe3e0 req-934717ff-b68e-4312-b2eb-58d39b4e1a32 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] No waiting events found dispatching network-vif-plugged-3ee9f78f-884b-40ae-b226-ed5161be4522 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:31 np0005465988 nova_compute[236126]: 2025-10-02 12:49:31.257 2 WARNING nova.compute.manager [req-1daaacdb-fef0-4f91-a50f-87c5198fe3e0 req-934717ff-b68e-4312-b2eb-58d39b4e1a32 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Received unexpected event network-vif-plugged-3ee9f78f-884b-40ae-b226-ed5161be4522 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:49:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:49:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:31.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:49:31 np0005465988 nova_compute[236126]: 2025-10-02 12:49:31.468 2 INFO nova.virt.libvirt.driver [None req-d0553b69-c504-43dc-973e-0977ce64c8a9 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Snapshot image upload complete#033[00m
Oct  2 08:49:31 np0005465988 nova_compute[236126]: 2025-10-02 12:49:31.469 2 INFO nova.compute.manager [None req-d0553b69-c504-43dc-973e-0977ce64c8a9 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Took 7.40 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 08:49:31 np0005465988 nova_compute[236126]: 2025-10-02 12:49:31.675 2 DEBUG nova.network.neutron [-] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:49:31 np0005465988 nova_compute[236126]: 2025-10-02 12:49:31.824 2 INFO nova.compute.manager [-] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Took 2.11 seconds to deallocate network for instance.#033[00m
Oct  2 08:49:31 np0005465988 nova_compute[236126]: 2025-10-02 12:49:31.966 2 DEBUG oslo_concurrency.lockutils [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:31 np0005465988 nova_compute[236126]: 2025-10-02 12:49:31.967 2 DEBUG oslo_concurrency.lockutils [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:49:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:31.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:49:32 np0005465988 nova_compute[236126]: 2025-10-02 12:49:32.063 2 DEBUG oslo_concurrency.processutils [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:49:32 np0005465988 nova_compute[236126]: 2025-10-02 12:49:32.290 2 DEBUG nova.network.neutron [req-d0b4dd48-1027-4608-8549-439434284002 req-875b7db6-51a7-44a5-b005-44a4bbdc2493 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Updated VIF entry in instance network info cache for port 3ee9f78f-884b-40ae-b226-ed5161be4522. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:49:32 np0005465988 nova_compute[236126]: 2025-10-02 12:49:32.292 2 DEBUG nova.network.neutron [req-d0b4dd48-1027-4608-8549-439434284002 req-875b7db6-51a7-44a5-b005-44a4bbdc2493 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Updating instance_info_cache with network_info: [{"id": "3ee9f78f-884b-40ae-b226-ed5161be4522", "address": "fa:16:3e:05:51:c5", "network": {"id": "0739dafe-4d9b-4048-ac97-c017fd298447", "bridge": "br-int", "label": "tempest-TestStampPattern-656915325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a837417d42da439cb794b4295bca2cee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ee9f78f-88", "ovs_interfaceid": "3ee9f78f-884b-40ae-b226-ed5161be4522", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:49:32 np0005465988 nova_compute[236126]: 2025-10-02 12:49:32.462 2 DEBUG oslo_concurrency.lockutils [req-d0b4dd48-1027-4608-8549-439434284002 req-875b7db6-51a7-44a5-b005-44a4bbdc2493 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-bf9e8de1-5081-4daa-9041-1d329e06be86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:49:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:49:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1493815161' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:49:32 np0005465988 nova_compute[236126]: 2025-10-02 12:49:32.516 2 DEBUG oslo_concurrency.processutils [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:49:32 np0005465988 nova_compute[236126]: 2025-10-02 12:49:32.525 2 DEBUG nova.compute.provider_tree [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:49:32 np0005465988 nova_compute[236126]: 2025-10-02 12:49:32.567 2 DEBUG nova.scheduler.client.report [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:49:32 np0005465988 nova_compute[236126]: 2025-10-02 12:49:32.647 2 DEBUG oslo_concurrency.lockutils [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:32 np0005465988 nova_compute[236126]: 2025-10-02 12:49:32.717 2 INFO nova.scheduler.client.report [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Deleted allocations for instance bf9e8de1-5081-4daa-9041-1d329e06be86#033[00m
Oct  2 08:49:33 np0005465988 nova_compute[236126]: 2025-10-02 12:49:33.000 2 DEBUG oslo_concurrency.lockutils [None req-ece87bf2-bf7c-41b9-82d7-d6416e5d6305 a24a7109471f4d96ad5f11b637fdb8e7 a837417d42da439cb794b4295bca2cee - - default default] Lock "bf9e8de1-5081-4daa-9041-1d329e06be86" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.942s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:49:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:33.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:49:33 np0005465988 nova_compute[236126]: 2025-10-02 12:49:33.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:33 np0005465988 podman[316783]: 2025-10-02 12:49:33.519350278 +0000 UTC m=+0.060339531 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:49:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:33.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:34 np0005465988 nova_compute[236126]: 2025-10-02 12:49:34.094 2 DEBUG nova.compute.manager [req-0a9584d9-2e90-4a40-9d3f-32eaea00c1f6 req-47025d7a-96e7-4818-a006-67eaa4db2d7c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Received event network-vif-deleted-3ee9f78f-884b-40ae-b226-ed5161be4522 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:34 np0005465988 nova_compute[236126]: 2025-10-02 12:49:34.095 2 INFO nova.compute.manager [req-0a9584d9-2e90-4a40-9d3f-32eaea00c1f6 req-47025d7a-96e7-4818-a006-67eaa4db2d7c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Neutron deleted interface 3ee9f78f-884b-40ae-b226-ed5161be4522; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:49:34 np0005465988 nova_compute[236126]: 2025-10-02 12:49:34.095 2 DEBUG nova.network.neutron [req-0a9584d9-2e90-4a40-9d3f-32eaea00c1f6 req-47025d7a-96e7-4818-a006-67eaa4db2d7c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Oct  2 08:49:34 np0005465988 nova_compute[236126]: 2025-10-02 12:49:34.098 2 DEBUG nova.compute.manager [req-0a9584d9-2e90-4a40-9d3f-32eaea00c1f6 req-47025d7a-96e7-4818-a006-67eaa4db2d7c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Detach interface failed, port_id=3ee9f78f-884b-40ae-b226-ed5161be4522, reason: Instance bf9e8de1-5081-4daa-9041-1d329e06be86 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:49:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e372 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:49:34 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:34Z|00102|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:39:45:eb 10.100.0.6
Oct  2 08:49:34 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:34Z|00103|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:39:45:eb 10.100.0.6
Oct  2 08:49:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e373 e373: 3 total, 3 up, 3 in
Oct  2 08:49:34 np0005465988 nova_compute[236126]: 2025-10-02 12:49:34.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:35.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:35 np0005465988 nova_compute[236126]: 2025-10-02 12:49:35.392 2 INFO nova.compute.manager [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Rescuing#033[00m
Oct  2 08:49:35 np0005465988 nova_compute[236126]: 2025-10-02 12:49:35.393 2 DEBUG oslo_concurrency.lockutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:49:35 np0005465988 nova_compute[236126]: 2025-10-02 12:49:35.394 2 DEBUG oslo_concurrency.lockutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquired lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:49:35 np0005465988 nova_compute[236126]: 2025-10-02 12:49:35.394 2 DEBUG nova.network.neutron [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:49:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:35.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:37 np0005465988 nova_compute[236126]: 2025-10-02 12:49:37.252 2 DEBUG nova.network.neutron [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Updating instance_info_cache with network_info: [{"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:49:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:37.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:37 np0005465988 nova_compute[236126]: 2025-10-02 12:49:37.631 2 DEBUG oslo_concurrency.lockutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Releasing lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:49:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:37.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:38 np0005465988 nova_compute[236126]: 2025-10-02 12:49:38.015 2 DEBUG nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:49:38 np0005465988 nova_compute[236126]: 2025-10-02 12:49:38.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:38Z|00798|binding|INFO|Releasing lport 1fc80788-89b8-413a-b0b0-d36f1a11a2b1 from this chassis (sb_readonly=0)
Oct  2 08:49:38 np0005465988 nova_compute[236126]: 2025-10-02 12:49:38.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:39 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:39Z|00799|binding|INFO|Releasing lport 1fc80788-89b8-413a-b0b0-d36f1a11a2b1 from this chassis (sb_readonly=0)
Oct  2 08:49:39 np0005465988 nova_compute[236126]: 2025-10-02 12:49:39.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e374 e374: 3 total, 3 up, 3 in
Oct  2 08:49:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:49:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:39.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:39 np0005465988 nova_compute[236126]: 2025-10-02 12:49:39.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:39 np0005465988 nova_compute[236126]: 2025-10-02 12:49:39.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:39.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:40 np0005465988 kernel: tapeccb499c-96 (unregistering): left promiscuous mode
Oct  2 08:49:40 np0005465988 NetworkManager[45041]: <info>  [1759409380.3188] device (tapeccb499c-96): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:49:40 np0005465988 nova_compute[236126]: 2025-10-02 12:49:40.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:40 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:40Z|00800|binding|INFO|Releasing lport eccb499c-961f-4ee4-9995-578966625db6 from this chassis (sb_readonly=0)
Oct  2 08:49:40 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:40Z|00801|binding|INFO|Setting lport eccb499c-961f-4ee4-9995-578966625db6 down in Southbound
Oct  2 08:49:40 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:40Z|00802|binding|INFO|Removing iface tapeccb499c-96 ovn-installed in OVS
Oct  2 08:49:40 np0005465988 nova_compute[236126]: 2025-10-02 12:49:40.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:40 np0005465988 nova_compute[236126]: 2025-10-02 12:49:40.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:40.377 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:45:eb 10.100.0.6'], port_security=['fa:16:3e:39:45:eb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ac6724c1-4d98-45f7-8e2b-dfac55d9cb13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=eccb499c-961f-4ee4-9995-578966625db6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:49:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:40.378 142124 INFO neutron.agent.ovn.metadata.agent [-] Port eccb499c-961f-4ee4-9995-578966625db6 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 unbound from our chassis#033[00m
Oct  2 08:49:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:40.380 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e4ff16-1388-40c7-a27a-83a3b4869808#033[00m
Oct  2 08:49:40 np0005465988 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000af.scope: Deactivated successfully.
Oct  2 08:49:40 np0005465988 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000af.scope: Consumed 14.353s CPU time.
Oct  2 08:49:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:40.396 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[28f51ae8-83b0-46dc-86f3-5550928898a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:40 np0005465988 systemd-machined[192594]: Machine qemu-83-instance-000000af terminated.
Oct  2 08:49:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:40.426 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[66e026b2-443a-4927-bd2e-49e9f85847a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:40.430 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[83e8615a-e129-40bd-871d-5289dff5e49f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:40.462 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[85c64171-89be-4227-a9f2-ac5a5370e0fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:40.485 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[19d560b7-7139-40bb-8b0e-e5c01c76957a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 12, 'rx_bytes': 916, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 12, 'rx_bytes': 916, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 740208, 'reachable_time': 35722, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 316867, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:40.504 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f1143468-b355-4a59-a3a4-66ca8855e170]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740222, 'tstamp': 740222}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 316868, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740225, 'tstamp': 740225}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 316868, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:40.506 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:40 np0005465988 nova_compute[236126]: 2025-10-02 12:49:40.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:40 np0005465988 nova_compute[236126]: 2025-10-02 12:49:40.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:40.513 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e4ff16-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:40.514 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:40.514 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e4ff16-10, col_values=(('external_ids', {'iface-id': '1fc80788-89b8-413a-b0b0-d36f1a11a2b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:40.515 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:40 np0005465988 nova_compute[236126]: 2025-10-02 12:49:40.761 2 DEBUG nova.compute.manager [req-a86e4cb3-5315-47ec-b8bc-afd201f0ca2b req-346745cf-9212-414b-a4a4-f2b8f75a3025 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received event network-vif-unplugged-eccb499c-961f-4ee4-9995-578966625db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:40 np0005465988 nova_compute[236126]: 2025-10-02 12:49:40.762 2 DEBUG oslo_concurrency.lockutils [req-a86e4cb3-5315-47ec-b8bc-afd201f0ca2b req-346745cf-9212-414b-a4a4-f2b8f75a3025 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:40 np0005465988 nova_compute[236126]: 2025-10-02 12:49:40.762 2 DEBUG oslo_concurrency.lockutils [req-a86e4cb3-5315-47ec-b8bc-afd201f0ca2b req-346745cf-9212-414b-a4a4-f2b8f75a3025 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:40 np0005465988 nova_compute[236126]: 2025-10-02 12:49:40.763 2 DEBUG oslo_concurrency.lockutils [req-a86e4cb3-5315-47ec-b8bc-afd201f0ca2b req-346745cf-9212-414b-a4a4-f2b8f75a3025 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:40 np0005465988 nova_compute[236126]: 2025-10-02 12:49:40.763 2 DEBUG nova.compute.manager [req-a86e4cb3-5315-47ec-b8bc-afd201f0ca2b req-346745cf-9212-414b-a4a4-f2b8f75a3025 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] No waiting events found dispatching network-vif-unplugged-eccb499c-961f-4ee4-9995-578966625db6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:40 np0005465988 nova_compute[236126]: 2025-10-02 12:49:40.763 2 WARNING nova.compute.manager [req-a86e4cb3-5315-47ec-b8bc-afd201f0ca2b req-346745cf-9212-414b-a4a4-f2b8f75a3025 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received unexpected event network-vif-unplugged-eccb499c-961f-4ee4-9995-578966625db6 for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:49:41 np0005465988 nova_compute[236126]: 2025-10-02 12:49:41.034 2 INFO nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:49:41 np0005465988 nova_compute[236126]: 2025-10-02 12:49:41.041 2 INFO nova.virt.libvirt.driver [-] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Instance destroyed successfully.#033[00m
Oct  2 08:49:41 np0005465988 nova_compute[236126]: 2025-10-02 12:49:41.041 2 DEBUG nova.objects.instance [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'numa_topology' on Instance uuid ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:41 np0005465988 nova_compute[236126]: 2025-10-02 12:49:41.174 2 INFO nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Attempting a stable device rescue#033[00m
Oct  2 08:49:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:41.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:41 np0005465988 nova_compute[236126]: 2025-10-02 12:49:41.466 2 DEBUG nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Oct  2 08:49:41 np0005465988 nova_compute[236126]: 2025-10-02 12:49:41.472 2 DEBUG nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:49:41 np0005465988 nova_compute[236126]: 2025-10-02 12:49:41.472 2 INFO nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Creating image(s)#033[00m
Oct  2 08:49:41 np0005465988 nova_compute[236126]: 2025-10-02 12:49:41.506 2 DEBUG nova.storage.rbd_utils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:49:41 np0005465988 nova_compute[236126]: 2025-10-02 12:49:41.512 2 DEBUG nova.objects.instance [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:41 np0005465988 nova_compute[236126]: 2025-10-02 12:49:41.612 2 DEBUG nova.storage.rbd_utils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:49:41 np0005465988 nova_compute[236126]: 2025-10-02 12:49:41.642 2 DEBUG nova.storage.rbd_utils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:49:41 np0005465988 nova_compute[236126]: 2025-10-02 12:49:41.645 2 DEBUG oslo_concurrency.lockutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "899c09e5126e93fd1b41daa0b1a98086a6462b56" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:41 np0005465988 nova_compute[236126]: 2025-10-02 12:49:41.647 2 DEBUG oslo_concurrency.lockutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "899c09e5126e93fd1b41daa0b1a98086a6462b56" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:41 np0005465988 nova_compute[236126]: 2025-10-02 12:49:41.966 2 DEBUG nova.virt.libvirt.imagebackend [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Image locations are: [{'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/b25d743c-aae4-4c6f-a9cb-a03c63b08b2c/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/b25d743c-aae4-4c6f-a9cb-a03c63b08b2c/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Oct  2 08:49:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:41.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.040 2 DEBUG nova.virt.libvirt.imagebackend [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Selected location: {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/b25d743c-aae4-4c6f-a9cb-a03c63b08b2c/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.041 2 DEBUG nova.storage.rbd_utils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] cloning images/b25d743c-aae4-4c6f-a9cb-a03c63b08b2c@snap to None/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.rescue clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.501 2 DEBUG oslo_concurrency.lockutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "899c09e5126e93fd1b41daa0b1a98086a6462b56" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.854s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.561 2 DEBUG nova.objects.instance [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'migration_context' on Instance uuid ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.675 2 DEBUG nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.678 2 DEBUG nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Start _get_guest_xml network_info=[{"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "vif_mac": "fa:16:3e:39:45:eb"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'b25d743c-aae4-4c6f-a9cb-a03c63b08b2c', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.678 2 DEBUG nova.objects.instance [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'resources' on Instance uuid ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.785 2 WARNING nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.791 2 DEBUG nova.virt.libvirt.host [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.793 2 DEBUG nova.virt.libvirt.host [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.798 2 DEBUG nova.virt.libvirt.host [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.799 2 DEBUG nova.virt.libvirt.host [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.801 2 DEBUG nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.802 2 DEBUG nova.virt.hardware [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.803 2 DEBUG nova.virt.hardware [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.803 2 DEBUG nova.virt.hardware [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.804 2 DEBUG nova.virt.hardware [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.804 2 DEBUG nova.virt.hardware [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.804 2 DEBUG nova.virt.hardware [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.805 2 DEBUG nova.virt.hardware [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.806 2 DEBUG nova.virt.hardware [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.806 2 DEBUG nova.virt.hardware [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.807 2 DEBUG nova.virt.hardware [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.807 2 DEBUG nova.virt.hardware [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.808 2 DEBUG nova.objects.instance [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:42 np0005465988 nova_compute[236126]: 2025-10-02 12:49:42.927 2 DEBUG oslo_concurrency.processutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:49:43 np0005465988 nova_compute[236126]: 2025-10-02 12:49:43.044 2 DEBUG nova.compute.manager [req-cbc3e379-830a-4e3e-b841-beb4431fb98a req-3977007a-5b24-405b-bd83-fff89777564a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:43 np0005465988 nova_compute[236126]: 2025-10-02 12:49:43.044 2 DEBUG oslo_concurrency.lockutils [req-cbc3e379-830a-4e3e-b841-beb4431fb98a req-3977007a-5b24-405b-bd83-fff89777564a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:43 np0005465988 nova_compute[236126]: 2025-10-02 12:49:43.044 2 DEBUG oslo_concurrency.lockutils [req-cbc3e379-830a-4e3e-b841-beb4431fb98a req-3977007a-5b24-405b-bd83-fff89777564a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:43 np0005465988 nova_compute[236126]: 2025-10-02 12:49:43.045 2 DEBUG oslo_concurrency.lockutils [req-cbc3e379-830a-4e3e-b841-beb4431fb98a req-3977007a-5b24-405b-bd83-fff89777564a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:43 np0005465988 nova_compute[236126]: 2025-10-02 12:49:43.045 2 DEBUG nova.compute.manager [req-cbc3e379-830a-4e3e-b841-beb4431fb98a req-3977007a-5b24-405b-bd83-fff89777564a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] No waiting events found dispatching network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:43 np0005465988 nova_compute[236126]: 2025-10-02 12:49:43.045 2 WARNING nova.compute.manager [req-cbc3e379-830a-4e3e-b841-beb4431fb98a req-3977007a-5b24-405b-bd83-fff89777564a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received unexpected event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:49:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.004000116s ======
Oct  2 08:49:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:43.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000116s
Oct  2 08:49:43 np0005465988 nova_compute[236126]: 2025-10-02 12:49:43.303 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409368.3026347, bf9e8de1-5081-4daa-9041-1d329e06be86 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:49:43 np0005465988 nova_compute[236126]: 2025-10-02 12:49:43.304 2 INFO nova.compute.manager [-] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:49:43 np0005465988 nova_compute[236126]: 2025-10-02 12:49:43.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:43 np0005465988 nova_compute[236126]: 2025-10-02 12:49:43.369 2 DEBUG nova.compute.manager [None req-20eb25ec-561a-46ce-b3e1-67fbecec25ef - - - - - -] [instance: bf9e8de1-5081-4daa-9041-1d329e06be86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:49:43 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4100157186' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:49:43 np0005465988 nova_compute[236126]: 2025-10-02 12:49:43.414 2 DEBUG oslo_concurrency.processutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:49:43 np0005465988 nova_compute[236126]: 2025-10-02 12:49:43.468 2 DEBUG oslo_concurrency.processutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:49:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:49:43 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3741594729' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:49:43 np0005465988 nova_compute[236126]: 2025-10-02 12:49:43.965 2 DEBUG oslo_concurrency.processutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:49:43 np0005465988 nova_compute[236126]: 2025-10-02 12:49:43.968 2 DEBUG oslo_concurrency.processutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:49:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:43.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #127. Immutable memtables: 0.
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.141984) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 79] Flushing memtable with next log file: 127
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409384142011, "job": 79, "event": "flush_started", "num_memtables": 1, "num_entries": 964, "num_deletes": 258, "total_data_size": 1789683, "memory_usage": 1809312, "flush_reason": "Manual Compaction"}
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 79] Level-0 flush table #128: started
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409384148501, "cf_name": "default", "job": 79, "event": "table_file_creation", "file_number": 128, "file_size": 1179227, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 64154, "largest_seqno": 65113, "table_properties": {"data_size": 1174768, "index_size": 2046, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10372, "raw_average_key_size": 19, "raw_value_size": 1165541, "raw_average_value_size": 2245, "num_data_blocks": 89, "num_entries": 519, "num_filter_entries": 519, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759409325, "oldest_key_time": 1759409325, "file_creation_time": 1759409384, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 128, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 79] Flush lasted 6551 microseconds, and 2882 cpu microseconds.
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.148533) [db/flush_job.cc:967] [default] [JOB 79] Level-0 flush table #128: 1179227 bytes OK
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.148550) [db/memtable_list.cc:519] [default] Level-0 commit table #128 started
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.150677) [db/memtable_list.cc:722] [default] Level-0 commit table #128: memtable #1 done
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.150691) EVENT_LOG_v1 {"time_micros": 1759409384150686, "job": 79, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.150706) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 79] Try to delete WAL files size 1784777, prev total WAL file size 1784777, number of live WAL files 2.
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000124.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.151386) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032323536' seq:72057594037927935, type:22 .. '6C6F676D0032353037' seq:0, type:0; will stop at (end)
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 80] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 79 Base level 0, inputs: [128(1151KB)], [126(10MB)]
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409384151422, "job": 80, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [128], "files_L6": [126], "score": -1, "input_data_size": 12097401, "oldest_snapshot_seqno": -1}
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 80] Generated table #129: 8871 keys, 11946059 bytes, temperature: kUnknown
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409384231266, "cf_name": "default", "job": 80, "event": "table_file_creation", "file_number": 129, "file_size": 11946059, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11888123, "index_size": 34655, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22213, "raw_key_size": 231057, "raw_average_key_size": 26, "raw_value_size": 11731870, "raw_average_value_size": 1322, "num_data_blocks": 1349, "num_entries": 8871, "num_filter_entries": 8871, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759409384, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 129, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.232037) [db/compaction/compaction_job.cc:1663] [default] [JOB 80] Compacted 1@0 + 1@6 files to L6 => 11946059 bytes
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.234556) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.4 rd, 149.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 10.4 +0.0 blob) out(11.4 +0.0 blob), read-write-amplify(20.4) write-amplify(10.1) OK, records in: 9403, records dropped: 532 output_compression: NoCompression
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.234574) EVENT_LOG_v1 {"time_micros": 1759409384234566, "job": 80, "event": "compaction_finished", "compaction_time_micros": 79881, "compaction_time_cpu_micros": 25311, "output_level": 6, "num_output_files": 1, "total_output_size": 11946059, "num_input_records": 9403, "num_output_records": 8871, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000128.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409384234851, "job": 80, "event": "table_file_deletion", "file_number": 128}
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000126.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409384236540, "job": 80, "event": "table_file_deletion", "file_number": 126}
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.151240) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.236628) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.236634) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.236637) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.236640) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:49:44.236646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:49:44 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3022817253' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.422 2 DEBUG oslo_concurrency.processutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.424 2 DEBUG nova.virt.libvirt.vif [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:49:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-926303208',display_name='tempest-ServerStableDeviceRescueTest-server-926303208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-926303208',id=175,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:49:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6a442bc513e14406b73e96e70396e6c3',ramdisk_id='',reservation_id='r-s2ebpg4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-454391960',owner_user_name='tempest-ServerStableDeviceRescueTest-454391960-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:49:31Z,user_data=None,user_id='6785ffe5d6554514b4ed9fd47665eca0',uuid=ac6724c1-4d98-45f7-8e2b-dfac55d9cb13,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "vif_mac": "fa:16:3e:39:45:eb"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.424 2 DEBUG nova.network.os_vif_util [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converting VIF {"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "vif_mac": "fa:16:3e:39:45:eb"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.425 2 DEBUG nova.network.os_vif_util [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:45:eb,bridge_name='br-int',has_traffic_filtering=True,id=eccb499c-961f-4ee4-9995-578966625db6,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeccb499c-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.427 2 DEBUG nova.objects.instance [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'pci_devices' on Instance uuid ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.545 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.546 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.627 2 DEBUG nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  <uuid>ac6724c1-4d98-45f7-8e2b-dfac55d9cb13</uuid>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  <name>instance-000000af</name>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-926303208</nova:name>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:49:42</nova:creationTime>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <nova:user uuid="6785ffe5d6554514b4ed9fd47665eca0">tempest-ServerStableDeviceRescueTest-454391960-project-member</nova:user>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <nova:project uuid="6a442bc513e14406b73e96e70396e6c3">tempest-ServerStableDeviceRescueTest-454391960</nova:project>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <nova:port uuid="eccb499c-961f-4ee4-9995-578966625db6">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <entry name="serial">ac6724c1-4d98-45f7-8e2b-dfac55d9cb13</entry>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <entry name="uuid">ac6724c1-4d98-45f7-8e2b-dfac55d9cb13</entry>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.config">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.rescue">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <target dev="sdb" bus="usb"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <boot order="1"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:39:45:eb"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <target dev="tapeccb499c-96"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/console.log" append="off"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:49:44 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:49:44 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:49:44 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:49:44 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.638 2 INFO nova.virt.libvirt.driver [-] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Instance destroyed successfully.#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.788 2 DEBUG nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.788 2 DEBUG nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.788 2 DEBUG nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.788 2 DEBUG nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] No VIF found with MAC fa:16:3e:39:45:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.789 2 INFO nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Using config drive#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.821 2 DEBUG nova.storage.rbd_utils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:49:44 np0005465988 nova_compute[236126]: 2025-10-02 12:49:44.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:45 np0005465988 nova_compute[236126]: 2025-10-02 12:49:45.121 2 DEBUG nova.objects.instance [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'ec2_ids' on Instance uuid ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:45 np0005465988 nova_compute[236126]: 2025-10-02 12:49:45.167 2 DEBUG nova.objects.instance [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'keypairs' on Instance uuid ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:49:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:45.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:49:45 np0005465988 nova_compute[236126]: 2025-10-02 12:49:45.665 2 INFO nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Creating config drive at /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/disk.config.rescue#033[00m
Oct  2 08:49:45 np0005465988 nova_compute[236126]: 2025-10-02 12:49:45.670 2 DEBUG oslo_concurrency.processutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppgv67xke execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:49:45 np0005465988 nova_compute[236126]: 2025-10-02 12:49:45.819 2 DEBUG oslo_concurrency.processutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppgv67xke" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:49:45 np0005465988 nova_compute[236126]: 2025-10-02 12:49:45.852 2 DEBUG nova.storage.rbd_utils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] rbd image ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:49:45 np0005465988 nova_compute[236126]: 2025-10-02 12:49:45.857 2 DEBUG oslo_concurrency.processutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/disk.config.rescue ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:49:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:45.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:46 np0005465988 nova_compute[236126]: 2025-10-02 12:49:46.232 2 DEBUG oslo_concurrency.processutils [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/disk.config.rescue ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.375s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:49:46 np0005465988 nova_compute[236126]: 2025-10-02 12:49:46.235 2 INFO nova.virt.libvirt.driver [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Deleting local config drive /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13/disk.config.rescue because it was imported into RBD.#033[00m
Oct  2 08:49:46 np0005465988 kernel: tapeccb499c-96: entered promiscuous mode
Oct  2 08:49:46 np0005465988 NetworkManager[45041]: <info>  [1759409386.2881] manager: (tapeccb499c-96): new Tun device (/org/freedesktop/NetworkManager/Devices/354)
Oct  2 08:49:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:46Z|00803|binding|INFO|Claiming lport eccb499c-961f-4ee4-9995-578966625db6 for this chassis.
Oct  2 08:49:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:46Z|00804|binding|INFO|eccb499c-961f-4ee4-9995-578966625db6: Claiming fa:16:3e:39:45:eb 10.100.0.6
Oct  2 08:49:46 np0005465988 nova_compute[236126]: 2025-10-02 12:49:46.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:46.304 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:45:eb 10.100.0.6'], port_security=['fa:16:3e:39:45:eb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ac6724c1-4d98-45f7-8e2b-dfac55d9cb13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=eccb499c-961f-4ee4-9995-578966625db6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:49:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:46.305 142124 INFO neutron.agent.ovn.metadata.agent [-] Port eccb499c-961f-4ee4-9995-578966625db6 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 bound to our chassis#033[00m
Oct  2 08:49:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:46.306 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e4ff16-1388-40c7-a27a-83a3b4869808#033[00m
Oct  2 08:49:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:46Z|00805|binding|INFO|Setting lport eccb499c-961f-4ee4-9995-578966625db6 ovn-installed in OVS
Oct  2 08:49:46 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:46Z|00806|binding|INFO|Setting lport eccb499c-961f-4ee4-9995-578966625db6 up in Southbound
Oct  2 08:49:46 np0005465988 nova_compute[236126]: 2025-10-02 12:49:46.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:46 np0005465988 nova_compute[236126]: 2025-10-02 12:49:46.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:46.326 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[de397bf0-0fea-4d98-b47b-2e89867654d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:46 np0005465988 systemd-udevd[317209]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:49:46 np0005465988 NetworkManager[45041]: <info>  [1759409386.3617] device (tapeccb499c-96): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:49:46 np0005465988 NetworkManager[45041]: <info>  [1759409386.3626] device (tapeccb499c-96): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:49:46 np0005465988 systemd-machined[192594]: New machine qemu-84-instance-000000af.
Oct  2 08:49:46 np0005465988 systemd[1]: Started Virtual Machine qemu-84-instance-000000af.
Oct  2 08:49:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:46.369 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[07fae305-e875-4bd1-a6ad-9f297665366d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:46.374 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[7c4d92a3-1a84-4296-b9d4-8be89f132656]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:46.409 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c33d26aa-5f92-45cf-a4ba-c0d639d15d87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:46 np0005465988 podman[317180]: 2025-10-02 12:49:46.42217072 +0000 UTC m=+0.097463993 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:49:46 np0005465988 podman[317177]: 2025-10-02 12:49:46.422778458 +0000 UTC m=+0.099726808 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:49:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:46.432 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7e5639f3-fdb7-4e8c-b483-e894d09f7c55]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 14, 'rx_bytes': 916, 'tx_bytes': 780, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 14, 'rx_bytes': 916, 'tx_bytes': 780, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 740208, 'reachable_time': 35722, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 317253, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:46 np0005465988 podman[317179]: 2025-10-02 12:49:46.446748909 +0000 UTC m=+0.122865405 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:49:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:46.452 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b2269096-270b-40c5-91d9-45ee4fadb7a8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740222, 'tstamp': 740222}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 317257, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740225, 'tstamp': 740225}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 317257, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:46.453 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:46 np0005465988 nova_compute[236126]: 2025-10-02 12:49:46.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:46 np0005465988 nova_compute[236126]: 2025-10-02 12:49:46.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:46.456 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e4ff16-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:46.456 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:46.457 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e4ff16-10, col_values=(('external_ids', {'iface-id': '1fc80788-89b8-413a-b0b0-d36f1a11a2b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:46.457 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.067 2 DEBUG nova.compute.manager [req-ec9cc1f8-bd81-447a-bf42-991cdaa980f4 req-555fa76b-779c-46cb-8354-eeb8a4d53e33 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.068 2 DEBUG oslo_concurrency.lockutils [req-ec9cc1f8-bd81-447a-bf42-991cdaa980f4 req-555fa76b-779c-46cb-8354-eeb8a4d53e33 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.069 2 DEBUG oslo_concurrency.lockutils [req-ec9cc1f8-bd81-447a-bf42-991cdaa980f4 req-555fa76b-779c-46cb-8354-eeb8a4d53e33 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.069 2 DEBUG oslo_concurrency.lockutils [req-ec9cc1f8-bd81-447a-bf42-991cdaa980f4 req-555fa76b-779c-46cb-8354-eeb8a4d53e33 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.070 2 DEBUG nova.compute.manager [req-ec9cc1f8-bd81-447a-bf42-991cdaa980f4 req-555fa76b-779c-46cb-8354-eeb8a4d53e33 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] No waiting events found dispatching network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.070 2 WARNING nova.compute.manager [req-ec9cc1f8-bd81-447a-bf42-991cdaa980f4 req-555fa76b-779c-46cb-8354-eeb8a4d53e33 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received unexpected event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:49:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:49:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:47.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.839 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.840 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409387.838943, ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.840 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.844 2 DEBUG nova.compute.manager [None req-11b3c23c-9f8f-4395-9c84-5eca8dfb6b99 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.899 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.904 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:49:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:47.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.990 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.991 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409387.841898, ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:49:47 np0005465988 nova_compute[236126]: 2025-10-02 12:49:47.992 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] VM Started (Lifecycle Event)#033[00m
Oct  2 08:49:48 np0005465988 nova_compute[236126]: 2025-10-02 12:49:48.027 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:48 np0005465988 nova_compute[236126]: 2025-10-02 12:49:48.037 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:49:48 np0005465988 nova_compute[236126]: 2025-10-02 12:49:48.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:49:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:49:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:49.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:49:49 np0005465988 nova_compute[236126]: 2025-10-02 12:49:49.356 2 DEBUG nova.compute.manager [req-26a2cf8e-7fa8-4945-b0a5-7c4205274a56 req-b8dce6b6-9c2c-4386-abe7-0f7668ed038f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:49 np0005465988 nova_compute[236126]: 2025-10-02 12:49:49.357 2 DEBUG oslo_concurrency.lockutils [req-26a2cf8e-7fa8-4945-b0a5-7c4205274a56 req-b8dce6b6-9c2c-4386-abe7-0f7668ed038f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:49 np0005465988 nova_compute[236126]: 2025-10-02 12:49:49.358 2 DEBUG oslo_concurrency.lockutils [req-26a2cf8e-7fa8-4945-b0a5-7c4205274a56 req-b8dce6b6-9c2c-4386-abe7-0f7668ed038f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:49 np0005465988 nova_compute[236126]: 2025-10-02 12:49:49.358 2 DEBUG oslo_concurrency.lockutils [req-26a2cf8e-7fa8-4945-b0a5-7c4205274a56 req-b8dce6b6-9c2c-4386-abe7-0f7668ed038f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:49 np0005465988 nova_compute[236126]: 2025-10-02 12:49:49.358 2 DEBUG nova.compute.manager [req-26a2cf8e-7fa8-4945-b0a5-7c4205274a56 req-b8dce6b6-9c2c-4386-abe7-0f7668ed038f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] No waiting events found dispatching network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:49 np0005465988 nova_compute[236126]: 2025-10-02 12:49:49.359 2 WARNING nova.compute.manager [req-26a2cf8e-7fa8-4945-b0a5-7c4205274a56 req-b8dce6b6-9c2c-4386-abe7-0f7668ed038f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received unexpected event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 for instance with vm_state rescued and task_state unrescuing.#033[00m
Oct  2 08:49:49 np0005465988 nova_compute[236126]: 2025-10-02 12:49:49.391 2 INFO nova.compute.manager [None req-7869d8d9-ba12-4825-ac15-e7ccc7fa91e3 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Unrescuing#033[00m
Oct  2 08:49:49 np0005465988 nova_compute[236126]: 2025-10-02 12:49:49.392 2 DEBUG oslo_concurrency.lockutils [None req-7869d8d9-ba12-4825-ac15-e7ccc7fa91e3 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:49:49 np0005465988 nova_compute[236126]: 2025-10-02 12:49:49.392 2 DEBUG oslo_concurrency.lockutils [None req-7869d8d9-ba12-4825-ac15-e7ccc7fa91e3 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquired lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:49:49 np0005465988 nova_compute[236126]: 2025-10-02 12:49:49.392 2 DEBUG nova.network.neutron [None req-7869d8d9-ba12-4825-ac15-e7ccc7fa91e3 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:49:49 np0005465988 nova_compute[236126]: 2025-10-02 12:49:49.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:49.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:50 np0005465988 nova_compute[236126]: 2025-10-02 12:49:50.988 2 DEBUG nova.network.neutron [None req-7869d8d9-ba12-4825-ac15-e7ccc7fa91e3 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Updating instance_info_cache with network_info: [{"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.011 2 DEBUG oslo_concurrency.lockutils [None req-7869d8d9-ba12-4825-ac15-e7ccc7fa91e3 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Releasing lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.013 2 DEBUG nova.objects.instance [None req-7869d8d9-ba12-4825-ac15-e7ccc7fa91e3 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'flavor' on Instance uuid ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:51 np0005465988 kernel: tapeccb499c-96 (unregistering): left promiscuous mode
Oct  2 08:49:51 np0005465988 NetworkManager[45041]: <info>  [1759409391.1013] device (tapeccb499c-96): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:49:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:51Z|00807|binding|INFO|Releasing lport eccb499c-961f-4ee4-9995-578966625db6 from this chassis (sb_readonly=0)
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:51Z|00808|binding|INFO|Setting lport eccb499c-961f-4ee4-9995-578966625db6 down in Southbound
Oct  2 08:49:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:51Z|00809|binding|INFO|Removing iface tapeccb499c-96 ovn-installed in OVS
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.119 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:45:eb 10.100.0.6'], port_security=['fa:16:3e:39:45:eb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ac6724c1-4d98-45f7-8e2b-dfac55d9cb13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=eccb499c-961f-4ee4-9995-578966625db6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.120 142124 INFO neutron.agent.ovn.metadata.agent [-] Port eccb499c-961f-4ee4-9995-578966625db6 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 unbound from our chassis#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.121 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e4ff16-1388-40c7-a27a-83a3b4869808#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.143 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d7d54c0e-2278-4c8a-a28f-f69fd3cabc4d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.177 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[40cba82e-fd5a-4273-b207-0c4fecb89591]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:51 np0005465988 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000af.scope: Deactivated successfully.
Oct  2 08:49:51 np0005465988 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000af.scope: Consumed 4.567s CPU time.
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.181 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f1c4d6-4d0d-4f1e-83dc-d108e99ed65e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:51 np0005465988 systemd-machined[192594]: Machine qemu-84-instance-000000af terminated.
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.216 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ea493c97-9b52-4c81-83a1-d4efa1986190]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.235 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fff39178-89a4-44da-b348-ea24d280b3c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 16, 'rx_bytes': 916, 'tx_bytes': 864, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 16, 'rx_bytes': 916, 'tx_bytes': 864, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 740208, 'reachable_time': 35722, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 317335, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.258 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cee5c9ef-89da-44fc-b4b3-7989f3d5cf47]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740222, 'tstamp': 740222}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 317336, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740225, 'tstamp': 740225}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 317336, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.260 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.270 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e4ff16-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.270 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.271 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e4ff16-10, col_values=(('external_ids', {'iface-id': '1fc80788-89b8-413a-b0b0-d36f1a11a2b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.271 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.287 2 INFO nova.virt.libvirt.driver [-] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Instance destroyed successfully.#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.288 2 DEBUG nova.objects.instance [None req-7869d8d9-ba12-4825-ac15-e7ccc7fa91e3 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'numa_topology' on Instance uuid ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:49:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:51.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:49:51 np0005465988 kernel: tapeccb499c-96: entered promiscuous mode
Oct  2 08:49:51 np0005465988 systemd-udevd[317327]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:49:51 np0005465988 NetworkManager[45041]: <info>  [1759409391.4053] manager: (tapeccb499c-96): new Tun device (/org/freedesktop/NetworkManager/Devices/355)
Oct  2 08:49:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:51Z|00810|binding|INFO|Claiming lport eccb499c-961f-4ee4-9995-578966625db6 for this chassis.
Oct  2 08:49:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:51Z|00811|binding|INFO|eccb499c-961f-4ee4-9995-578966625db6: Claiming fa:16:3e:39:45:eb 10.100.0.6
Oct  2 08:49:51 np0005465988 NetworkManager[45041]: <info>  [1759409391.4411] device (tapeccb499c-96): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:51 np0005465988 NetworkManager[45041]: <info>  [1759409391.4422] device (tapeccb499c-96): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.447 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:45:eb 10.100.0.6'], port_security=['fa:16:3e:39:45:eb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ac6724c1-4d98-45f7-8e2b-dfac55d9cb13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=eccb499c-961f-4ee4-9995-578966625db6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.449 142124 INFO neutron.agent.ovn.metadata.agent [-] Port eccb499c-961f-4ee4-9995-578966625db6 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 bound to our chassis#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.450 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e4ff16-1388-40c7-a27a-83a3b4869808#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.451 2 DEBUG nova.compute.manager [req-ee5347a7-7a98-4af3-9bd1-97247019bc0c req-4d8d0547-c4b3-4201-92dc-765340eb8e9e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received event network-vif-unplugged-eccb499c-961f-4ee4-9995-578966625db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.452 2 DEBUG oslo_concurrency.lockutils [req-ee5347a7-7a98-4af3-9bd1-97247019bc0c req-4d8d0547-c4b3-4201-92dc-765340eb8e9e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.452 2 DEBUG oslo_concurrency.lockutils [req-ee5347a7-7a98-4af3-9bd1-97247019bc0c req-4d8d0547-c4b3-4201-92dc-765340eb8e9e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.452 2 DEBUG oslo_concurrency.lockutils [req-ee5347a7-7a98-4af3-9bd1-97247019bc0c req-4d8d0547-c4b3-4201-92dc-765340eb8e9e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.452 2 DEBUG nova.compute.manager [req-ee5347a7-7a98-4af3-9bd1-97247019bc0c req-4d8d0547-c4b3-4201-92dc-765340eb8e9e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] No waiting events found dispatching network-vif-unplugged-eccb499c-961f-4ee4-9995-578966625db6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.452 2 WARNING nova.compute.manager [req-ee5347a7-7a98-4af3-9bd1-97247019bc0c req-4d8d0547-c4b3-4201-92dc-765340eb8e9e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received unexpected event network-vif-unplugged-eccb499c-961f-4ee4-9995-578966625db6 for instance with vm_state rescued and task_state unrescuing.#033[00m
Oct  2 08:49:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:51Z|00812|binding|INFO|Setting lport eccb499c-961f-4ee4-9995-578966625db6 ovn-installed in OVS
Oct  2 08:49:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:49:51Z|00813|binding|INFO|Setting lport eccb499c-961f-4ee4-9995-578966625db6 up in Southbound
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:51 np0005465988 systemd-machined[192594]: New machine qemu-85-instance-000000af.
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.467 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[64b2651c-8545-42a0-bc8e-49228e1c8c8b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:51 np0005465988 systemd[1]: Started Virtual Machine qemu-85-instance-000000af.
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.499 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4dbf0ac4-9bbe-4bbd-8d0c-7152767c7ae1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.502 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[1a6db2c3-19f1-4325-95c8-3326399c37f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.534 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[7c97894b-823e-4ff1-a48a-0f450104e680]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.555 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[57df9179-9d08-4fc5-a827-157a78210c60]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 18, 'rx_bytes': 916, 'tx_bytes': 948, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 18, 'rx_bytes': 916, 'tx_bytes': 948, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 740208, 'reachable_time': 35722, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 317373, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.571 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed81903-3e0b-492e-a671-8396bb04ba9d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740222, 'tstamp': 740222}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 317374, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740225, 'tstamp': 740225}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 317374, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.573 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:51 np0005465988 nova_compute[236126]: 2025-10-02 12:49:51.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.577 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e4ff16-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.577 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.577 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e4ff16-10, col_values=(('external_ids', {'iface-id': '1fc80788-89b8-413a-b0b0-d36f1a11a2b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:51.577 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:49:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:51.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:52 np0005465988 nova_compute[236126]: 2025-10-02 12:49:52.603 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:49:52 np0005465988 nova_compute[236126]: 2025-10-02 12:49:52.604 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409392.603272, ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:49:52 np0005465988 nova_compute[236126]: 2025-10-02 12:49:52.604 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:49:52 np0005465988 nova_compute[236126]: 2025-10-02 12:49:52.641 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:52 np0005465988 nova_compute[236126]: 2025-10-02 12:49:52.646 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:49:52 np0005465988 nova_compute[236126]: 2025-10-02 12:49:52.697 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Oct  2 08:49:52 np0005465988 nova_compute[236126]: 2025-10-02 12:49:52.698 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409392.6054397, ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:49:52 np0005465988 nova_compute[236126]: 2025-10-02 12:49:52.698 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] VM Started (Lifecycle Event)#033[00m
Oct  2 08:49:52 np0005465988 nova_compute[236126]: 2025-10-02 12:49:52.727 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:52 np0005465988 nova_compute[236126]: 2025-10-02 12:49:52.732 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:49:52 np0005465988 nova_compute[236126]: 2025-10-02 12:49:52.770 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Oct  2 08:49:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:49:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:53.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.691 2 DEBUG nova.compute.manager [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.692 2 DEBUG oslo_concurrency.lockutils [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.693 2 DEBUG oslo_concurrency.lockutils [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.693 2 DEBUG oslo_concurrency.lockutils [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.693 2 DEBUG nova.compute.manager [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] No waiting events found dispatching network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.694 2 WARNING nova.compute.manager [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received unexpected event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 for instance with vm_state rescued and task_state unrescuing.#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.694 2 DEBUG nova.compute.manager [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.694 2 DEBUG oslo_concurrency.lockutils [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.695 2 DEBUG oslo_concurrency.lockutils [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.695 2 DEBUG oslo_concurrency.lockutils [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.695 2 DEBUG nova.compute.manager [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] No waiting events found dispatching network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.695 2 WARNING nova.compute.manager [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received unexpected event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 for instance with vm_state rescued and task_state unrescuing.#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.696 2 DEBUG nova.compute.manager [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.696 2 DEBUG oslo_concurrency.lockutils [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.696 2 DEBUG oslo_concurrency.lockutils [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.697 2 DEBUG oslo_concurrency.lockutils [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.697 2 DEBUG nova.compute.manager [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] No waiting events found dispatching network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.697 2 WARNING nova.compute.manager [req-256e42af-eaca-4995-82a0-aca90b80226f req-f15c6696-39a0-4bca-a1d8-d5a95d2b38f4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received unexpected event network-vif-plugged-eccb499c-961f-4ee4-9995-578966625db6 for instance with vm_state rescued and task_state unrescuing.#033[00m
Oct  2 08:49:53 np0005465988 nova_compute[236126]: 2025-10-02 12:49:53.920 2 DEBUG nova.compute.manager [None req-7869d8d9-ba12-4825-ac15-e7ccc7fa91e3 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:53.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:54.033 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:49:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:49:54.034 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:49:54 np0005465988 nova_compute[236126]: 2025-10-02 12:49:54.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:49:54 np0005465988 nova_compute[236126]: 2025-10-02 12:49:54.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:49:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3406662994' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:49:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:49:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3406662994' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:49:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:49:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:55.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:49:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:49:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:55.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:49:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:57.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:49:58.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:58 np0005465988 nova_compute[236126]: 2025-10-02 12:49:58.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:49:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:49:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:49:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:49:59.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:49:59 np0005465988 nova_compute[236126]: 2025-10-02 12:49:59.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:00.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:00 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 08:50:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:01.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:02.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:02 np0005465988 nova_compute[236126]: 2025-10-02 12:50:02.600 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:02 np0005465988 nova_compute[236126]: 2025-10-02 12:50:02.662 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:50:02 np0005465988 nova_compute[236126]: 2025-10-02 12:50:02.663 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:50:02 np0005465988 nova_compute[236126]: 2025-10-02 12:50:02.663 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:50:02 np0005465988 nova_compute[236126]: 2025-10-02 12:50:02.663 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:50:02 np0005465988 nova_compute[236126]: 2025-10-02 12:50:02.664 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:50:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:50:03 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/812729193' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:50:03 np0005465988 nova_compute[236126]: 2025-10-02 12:50:03.240 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.576s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:50:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:03.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:03 np0005465988 nova_compute[236126]: 2025-10-02 12:50:03.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:03 np0005465988 nova_compute[236126]: 2025-10-02 12:50:03.693 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:50:03 np0005465988 nova_compute[236126]: 2025-10-02 12:50:03.694 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:50:03 np0005465988 nova_compute[236126]: 2025-10-02 12:50:03.698 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:50:03 np0005465988 nova_compute[236126]: 2025-10-02 12:50:03.698 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:50:03 np0005465988 nova_compute[236126]: 2025-10-02 12:50:03.880 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:50:03 np0005465988 nova_compute[236126]: 2025-10-02 12:50:03.881 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3682MB free_disk=20.85171890258789GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:50:03 np0005465988 nova_compute[236126]: 2025-10-02 12:50:03.882 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:50:03 np0005465988 nova_compute[236126]: 2025-10-02 12:50:03.882 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:50:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:50:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:04.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:50:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:50:04.036 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:50:04 np0005465988 nova_compute[236126]: 2025-10-02 12:50:04.209 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 439392e5-66ae-4162-a7e5-077f87ca558b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:50:04 np0005465988 nova_compute[236126]: 2025-10-02 12:50:04.210 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:50:04 np0005465988 nova_compute[236126]: 2025-10-02 12:50:04.210 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:50:04 np0005465988 nova_compute[236126]: 2025-10-02 12:50:04.210 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:50:04 np0005465988 nova_compute[236126]: 2025-10-02 12:50:04.256 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:50:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:50:04 np0005465988 podman[317535]: 2025-10-02 12:50:04.528232716 +0000 UTC m=+0.056931603 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  2 08:50:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:50:04 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2818901054' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:50:04 np0005465988 nova_compute[236126]: 2025-10-02 12:50:04.755 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:50:04 np0005465988 nova_compute[236126]: 2025-10-02 12:50:04.762 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:50:04 np0005465988 nova_compute[236126]: 2025-10-02 12:50:04.869 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:50:04 np0005465988 nova_compute[236126]: 2025-10-02 12:50:04.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:05 np0005465988 nova_compute[236126]: 2025-10-02 12:50:05.022 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:50:05 np0005465988 nova_compute[236126]: 2025-10-02 12:50:05.022 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:50:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:05.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:06.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:06 np0005465988 ovn_controller[132601]: 2025-10-02T12:50:06Z|00104|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:39:45:eb 10.100.0.6
Oct  2 08:50:06 np0005465988 ovn_controller[132601]: 2025-10-02T12:50:06Z|00105|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:39:45:eb 10.100.0.6
Oct  2 08:50:06 np0005465988 nova_compute[236126]: 2025-10-02 12:50:06.896 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:07.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:08.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:08 np0005465988 nova_compute[236126]: 2025-10-02 12:50:08.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:50:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:09.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:09 np0005465988 nova_compute[236126]: 2025-10-02 12:50:09.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:10.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:50:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/112898182' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:50:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:11.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:12.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:13.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:13 np0005465988 nova_compute[236126]: 2025-10-02 12:50:13.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:13 np0005465988 nova_compute[236126]: 2025-10-02 12:50:13.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:14.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:50:14 np0005465988 nova_compute[236126]: 2025-10-02 12:50:14.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:50:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:15.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:50:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:50:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:16.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:50:16 np0005465988 nova_compute[236126]: 2025-10-02 12:50:16.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:16 np0005465988 nova_compute[236126]: 2025-10-02 12:50:16.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:50:16 np0005465988 podman[317715]: 2025-10-02 12:50:16.539323064 +0000 UTC m=+0.070556747 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:50:16 np0005465988 podman[317716]: 2025-10-02 12:50:16.542729882 +0000 UTC m=+0.068005693 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:50:16 np0005465988 podman[317714]: 2025-10-02 12:50:16.557183769 +0000 UTC m=+0.095636570 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:50:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:17.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:18.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:50:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:50:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:50:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e375 e375: 3 total, 3 up, 3 in
Oct  2 08:50:18 np0005465988 nova_compute[236126]: 2025-10-02 12:50:18.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:18 np0005465988 nova_compute[236126]: 2025-10-02 12:50:18.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:18 np0005465988 nova_compute[236126]: 2025-10-02 12:50:18.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:18 np0005465988 nova_compute[236126]: 2025-10-02 12:50:18.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e375 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:50:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:19.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:19 np0005465988 nova_compute[236126]: 2025-10-02 12:50:19.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:20.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e376 e376: 3 total, 3 up, 3 in
Oct  2 08:50:20 np0005465988 nova_compute[236126]: 2025-10-02 12:50:20.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:50:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:21.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:50:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e377 e377: 3 total, 3 up, 3 in
Oct  2 08:50:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:22.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:22 np0005465988 nova_compute[236126]: 2025-10-02 12:50:22.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:22 np0005465988 nova_compute[236126]: 2025-10-02 12:50:22.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:50:22 np0005465988 nova_compute[236126]: 2025-10-02 12:50:22.867 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:50:22 np0005465988 nova_compute[236126]: 2025-10-02 12:50:22.867 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:50:22 np0005465988 nova_compute[236126]: 2025-10-02 12:50:22.867 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:50:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:23.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:23 np0005465988 nova_compute[236126]: 2025-10-02 12:50:23.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:23 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:50:23 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:50:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:24.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #130. Immutable memtables: 0.
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.186143) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 81] Flushing memtable with next log file: 130
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409424186196, "job": 81, "event": "flush_started", "num_memtables": 1, "num_entries": 721, "num_deletes": 250, "total_data_size": 1183116, "memory_usage": 1198296, "flush_reason": "Manual Compaction"}
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 81] Level-0 flush table #131: started
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409424196215, "cf_name": "default", "job": 81, "event": "table_file_creation", "file_number": 131, "file_size": 780614, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 65118, "largest_seqno": 65834, "table_properties": {"data_size": 777105, "index_size": 1352, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7440, "raw_average_key_size": 17, "raw_value_size": 769916, "raw_average_value_size": 1803, "num_data_blocks": 59, "num_entries": 427, "num_filter_entries": 427, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759409384, "oldest_key_time": 1759409384, "file_creation_time": 1759409424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 131, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 81] Flush lasted 10128 microseconds, and 3624 cpu microseconds.
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.196272) [db/flush_job.cc:967] [default] [JOB 81] Level-0 flush table #131: 780614 bytes OK
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.196293) [db/memtable_list.cc:519] [default] Level-0 commit table #131 started
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.203692) [db/memtable_list.cc:722] [default] Level-0 commit table #131: memtable #1 done
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.203723) EVENT_LOG_v1 {"time_micros": 1759409424203714, "job": 81, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.203745) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 81] Try to delete WAL files size 1179247, prev total WAL file size 1179247, number of live WAL files 2.
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000127.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.204433) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323530' seq:72057594037927935, type:22 .. '6B7600353031' seq:0, type:0; will stop at (end)
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 82] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 81 Base level 0, inputs: [131(762KB)], [129(11MB)]
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409424204720, "job": 82, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [131], "files_L6": [129], "score": -1, "input_data_size": 12726673, "oldest_snapshot_seqno": -1}
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 82] Generated table #132: 8781 keys, 11639114 bytes, temperature: kUnknown
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409424289313, "cf_name": "default", "job": 82, "event": "table_file_creation", "file_number": 132, "file_size": 11639114, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11581956, "index_size": 34069, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22021, "raw_key_size": 230966, "raw_average_key_size": 26, "raw_value_size": 11427318, "raw_average_value_size": 1301, "num_data_blocks": 1306, "num_entries": 8781, "num_filter_entries": 8781, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759409424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 132, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.289595) [db/compaction/compaction_job.cc:1663] [default] [JOB 82] Compacted 1@0 + 1@6 files to L6 => 11639114 bytes
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.292740) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 150.4 rd, 137.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 11.4 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(31.2) write-amplify(14.9) OK, records in: 9298, records dropped: 517 output_compression: NoCompression
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.292758) EVENT_LOG_v1 {"time_micros": 1759409424292750, "job": 82, "event": "compaction_finished", "compaction_time_micros": 84608, "compaction_time_cpu_micros": 28348, "output_level": 6, "num_output_files": 1, "total_output_size": 11639114, "num_input_records": 9298, "num_output_records": 8781, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000131.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409424293027, "job": 82, "event": "table_file_deletion", "file_number": 131}
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000129.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409424295483, "job": 82, "event": "table_file_deletion", "file_number": 129}
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.204297) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.295545) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.295559) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.295560) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.295562) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:50:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:50:24.295563) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:50:24 np0005465988 nova_compute[236126]: 2025-10-02 12:50:24.530 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Updating instance_info_cache with network_info: [{"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:50:24 np0005465988 nova_compute[236126]: 2025-10-02 12:50:24.553 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:50:24 np0005465988 nova_compute[236126]: 2025-10-02 12:50:24.554 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:50:24 np0005465988 nova_compute[236126]: 2025-10-02 12:50:24.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:50:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:25.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:50:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:26.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:27.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:50:27.395 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:50:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:50:27.395 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:50:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:50:27.396 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:50:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:28.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:28 np0005465988 nova_compute[236126]: 2025-10-02 12:50:28.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:50:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:29.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 e378: 3 total, 3 up, 3 in
Oct  2 08:50:29 np0005465988 nova_compute[236126]: 2025-10-02 12:50:29.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:30.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:31.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:32.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:33.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:33 np0005465988 nova_compute[236126]: 2025-10-02 12:50:33.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:34.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:50:34 np0005465988 nova_compute[236126]: 2025-10-02 12:50:34.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:35.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:35 np0005465988 systemd[1]: Starting dnf makecache...
Oct  2 08:50:35 np0005465988 podman[317869]: 2025-10-02 12:50:35.543249322 +0000 UTC m=+0.084943412 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 08:50:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:36.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:36 np0005465988 dnf[317870]: Metadata cache refreshed recently.
Oct  2 08:50:36 np0005465988 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct  2 08:50:36 np0005465988 systemd[1]: Finished dnf makecache.
Oct  2 08:50:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:50:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:37.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:50:37 np0005465988 nova_compute[236126]: 2025-10-02 12:50:37.549 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:38.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:38 np0005465988 nova_compute[236126]: 2025-10-02 12:50:38.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:50:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:39.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:39 np0005465988 nova_compute[236126]: 2025-10-02 12:50:39.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:40.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:41.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:42.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:50:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:43.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:50:43 np0005465988 nova_compute[236126]: 2025-10-02 12:50:43.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:44.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:50:44 np0005465988 nova_compute[236126]: 2025-10-02 12:50:44.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:50:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:45.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:50:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:46.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:50:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:47.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:50:47 np0005465988 podman[317947]: 2025-10-02 12:50:47.533810487 +0000 UTC m=+0.064718209 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct  2 08:50:47 np0005465988 podman[317948]: 2025-10-02 12:50:47.567077606 +0000 UTC m=+0.086339902 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible)
Oct  2 08:50:47 np0005465988 podman[317946]: 2025-10-02 12:50:47.584617042 +0000 UTC m=+0.119517959 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:50:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:48.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:48 np0005465988 nova_compute[236126]: 2025-10-02 12:50:48.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:50:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:49.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:50 np0005465988 nova_compute[236126]: 2025-10-02 12:50:50.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:50.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:51.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:52.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:50:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:53.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:50:53 np0005465988 nova_compute[236126]: 2025-10-02 12:50:53.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:54.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:50:55 np0005465988 nova_compute[236126]: 2025-10-02 12:50:55.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:50:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:55.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:50:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:50:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2167788464' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:50:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:50:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2167788464' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:50:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:56.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:57.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:50:58.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:58 np0005465988 nova_compute[236126]: 2025-10-02 12:50:58.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:50:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:50:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:50:59.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:50:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:51:00 np0005465988 nova_compute[236126]: 2025-10-02 12:51:00.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:00.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:01 np0005465988 ovn_controller[132601]: 2025-10-02T12:51:01Z|00814|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Oct  2 08:51:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:01.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:02.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:02 np0005465988 nova_compute[236126]: 2025-10-02 12:51:02.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:02 np0005465988 nova_compute[236126]: 2025-10-02 12:51:02.541 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:02 np0005465988 nova_compute[236126]: 2025-10-02 12:51:02.542 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:02 np0005465988 nova_compute[236126]: 2025-10-02 12:51:02.542 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:02 np0005465988 nova_compute[236126]: 2025-10-02 12:51:02.542 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:51:02 np0005465988 nova_compute[236126]: 2025-10-02 12:51:02.543 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:51:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:51:03 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3015363131' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.046 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.182 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.183 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.186 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.186 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.328 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.329 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3650MB free_disk=20.806690216064453GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.330 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.330 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:03.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.544 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 439392e5-66ae-4162-a7e5-077f87ca558b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.545 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.545 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.545 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.657 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.685 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.685 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.705 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.728 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:51:03 np0005465988 nova_compute[236126]: 2025-10-02 12:51:03.794 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:51:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:04.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:51:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:51:04 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2674474749' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:51:04 np0005465988 nova_compute[236126]: 2025-10-02 12:51:04.736 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.942s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:51:04 np0005465988 nova_compute[236126]: 2025-10-02 12:51:04.743 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:51:05 np0005465988 nova_compute[236126]: 2025-10-02 12:51:05.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:05 np0005465988 nova_compute[236126]: 2025-10-02 12:51:05.379 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:51:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:05.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:05.424 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:51:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:05.425 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:51:05 np0005465988 nova_compute[236126]: 2025-10-02 12:51:05.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:05 np0005465988 nova_compute[236126]: 2025-10-02 12:51:05.566 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:51:05 np0005465988 nova_compute[236126]: 2025-10-02 12:51:05.566 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:06.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:06.428 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:06 np0005465988 podman[318115]: 2025-10-02 12:51:06.507030459 +0000 UTC m=+0.049409897 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  2 08:51:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:07.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #133. Immutable memtables: 0.
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.697497) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 83] Flushing memtable with next log file: 133
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409467697537, "job": 83, "event": "flush_started", "num_memtables": 1, "num_entries": 682, "num_deletes": 252, "total_data_size": 1137358, "memory_usage": 1164720, "flush_reason": "Manual Compaction"}
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 83] Level-0 flush table #134: started
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409467731574, "cf_name": "default", "job": 83, "event": "table_file_creation", "file_number": 134, "file_size": 749791, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 65839, "largest_seqno": 66516, "table_properties": {"data_size": 746508, "index_size": 1190, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7898, "raw_average_key_size": 19, "raw_value_size": 739770, "raw_average_value_size": 1822, "num_data_blocks": 53, "num_entries": 406, "num_filter_entries": 406, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759409424, "oldest_key_time": 1759409424, "file_creation_time": 1759409467, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 134, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 83] Flush lasted 34137 microseconds, and 2862 cpu microseconds.
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.731628) [db/flush_job.cc:967] [default] [JOB 83] Level-0 flush table #134: 749791 bytes OK
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.731653) [db/memtable_list.cc:519] [default] Level-0 commit table #134 started
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.762256) [db/memtable_list.cc:722] [default] Level-0 commit table #134: memtable #1 done
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.762302) EVENT_LOG_v1 {"time_micros": 1759409467762292, "job": 83, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.762326) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 83] Try to delete WAL files size 1133664, prev total WAL file size 1133664, number of live WAL files 2.
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000130.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.762954) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035353232' seq:72057594037927935, type:22 .. '7061786F730035373734' seq:0, type:0; will stop at (end)
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 84] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 83 Base level 0, inputs: [134(732KB)], [132(11MB)]
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409467762976, "job": 84, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [134], "files_L6": [132], "score": -1, "input_data_size": 12388905, "oldest_snapshot_seqno": -1}
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 84] Generated table #135: 8669 keys, 10503175 bytes, temperature: kUnknown
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409467920939, "cf_name": "default", "job": 84, "event": "table_file_creation", "file_number": 135, "file_size": 10503175, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10447815, "index_size": 32591, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21701, "raw_key_size": 229337, "raw_average_key_size": 26, "raw_value_size": 10296000, "raw_average_value_size": 1187, "num_data_blocks": 1237, "num_entries": 8669, "num_filter_entries": 8669, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759409467, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 135, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.921192) [db/compaction/compaction_job.cc:1663] [default] [JOB 84] Compacted 1@0 + 1@6 files to L6 => 10503175 bytes
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.972096) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 78.4 rd, 66.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 11.1 +0.0 blob) out(10.0 +0.0 blob), read-write-amplify(30.5) write-amplify(14.0) OK, records in: 9187, records dropped: 518 output_compression: NoCompression
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.972143) EVENT_LOG_v1 {"time_micros": 1759409467972126, "job": 84, "event": "compaction_finished", "compaction_time_micros": 158035, "compaction_time_cpu_micros": 25015, "output_level": 6, "num_output_files": 1, "total_output_size": 10503175, "num_input_records": 9187, "num_output_records": 8669, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000134.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409467973213, "job": 84, "event": "table_file_deletion", "file_number": 134}
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000132.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409467975352, "job": 84, "event": "table_file_deletion", "file_number": 132}
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.762889) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.975497) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.975501) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.975503) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.975505) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:51:07 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:51:07.975506) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:51:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:08.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:08 np0005465988 nova_compute[236126]: 2025-10-02 12:51:08.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:09.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:09 np0005465988 nova_compute[236126]: 2025-10-02 12:51:09.568 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:51:10 np0005465988 nova_compute[236126]: 2025-10-02 12:51:10.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:10.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:11.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:12.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:12 np0005465988 nova_compute[236126]: 2025-10-02 12:51:12.312 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "c1ab1282-e440-44ee-9a39-d59322467ce9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:12 np0005465988 nova_compute[236126]: 2025-10-02 12:51:12.313 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:12 np0005465988 nova_compute[236126]: 2025-10-02 12:51:12.403 2 DEBUG nova.compute.manager [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:51:12 np0005465988 nova_compute[236126]: 2025-10-02 12:51:12.716 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:12 np0005465988 nova_compute[236126]: 2025-10-02 12:51:12.716 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:12 np0005465988 nova_compute[236126]: 2025-10-02 12:51:12.727 2 DEBUG nova.virt.hardware [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:51:12 np0005465988 nova_compute[236126]: 2025-10-02 12:51:12.728 2 INFO nova.compute.claims [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:51:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:13.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:13 np0005465988 nova_compute[236126]: 2025-10-02 12:51:13.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:51:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:14.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:51:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:51:15 np0005465988 nova_compute[236126]: 2025-10-02 12:51:15.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:15 np0005465988 nova_compute[236126]: 2025-10-02 12:51:15.222 2 DEBUG oslo_concurrency.processutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:51:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:15.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:15 np0005465988 nova_compute[236126]: 2025-10-02 12:51:15.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:51:15 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/113032608' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:51:15 np0005465988 nova_compute[236126]: 2025-10-02 12:51:15.713 2 DEBUG oslo_concurrency.processutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:51:15 np0005465988 nova_compute[236126]: 2025-10-02 12:51:15.720 2 DEBUG nova.compute.provider_tree [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:51:16 np0005465988 nova_compute[236126]: 2025-10-02 12:51:16.077 2 DEBUG nova.scheduler.client.report [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:51:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:16.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:16 np0005465988 nova_compute[236126]: 2025-10-02 12:51:16.783 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 4.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:16 np0005465988 nova_compute[236126]: 2025-10-02 12:51:16.783 2 DEBUG nova.compute.manager [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:51:16 np0005465988 nova_compute[236126]: 2025-10-02 12:51:16.962 2 DEBUG nova.compute.manager [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:51:16 np0005465988 nova_compute[236126]: 2025-10-02 12:51:16.962 2 DEBUG nova.network.neutron [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:51:17 np0005465988 nova_compute[236126]: 2025-10-02 12:51:17.283 2 INFO nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:51:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:51:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:17.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:51:17 np0005465988 nova_compute[236126]: 2025-10-02 12:51:17.415 2 DEBUG nova.compute.manager [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:51:17 np0005465988 nova_compute[236126]: 2025-10-02 12:51:17.431 2 DEBUG nova.policy [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '16730f38111542e58a05fb4deb2b3914', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5ade962c517a483dbfe4bb13386f0006', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:51:17 np0005465988 nova_compute[236126]: 2025-10-02 12:51:17.832 2 DEBUG nova.compute.manager [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:51:17 np0005465988 nova_compute[236126]: 2025-10-02 12:51:17.834 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:51:17 np0005465988 nova_compute[236126]: 2025-10-02 12:51:17.834 2 INFO nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Creating image(s)#033[00m
Oct  2 08:51:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:18.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:18 np0005465988 nova_compute[236126]: 2025-10-02 12:51:18.154 2 DEBUG nova.storage.rbd_utils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] rbd image c1ab1282-e440-44ee-9a39-d59322467ce9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:51:18 np0005465988 podman[318238]: 2025-10-02 12:51:18.550522971 +0000 UTC m=+0.076278461 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:51:18 np0005465988 podman[318239]: 2025-10-02 12:51:18.550538711 +0000 UTC m=+0.068328632 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:51:18 np0005465988 podman[318237]: 2025-10-02 12:51:18.576242293 +0000 UTC m=+0.105701030 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:51:18 np0005465988 nova_compute[236126]: 2025-10-02 12:51:18.727 2 DEBUG nova.storage.rbd_utils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] rbd image c1ab1282-e440-44ee-9a39-d59322467ce9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:51:18 np0005465988 nova_compute[236126]: 2025-10-02 12:51:18.768 2 DEBUG nova.storage.rbd_utils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] rbd image c1ab1282-e440-44ee-9a39-d59322467ce9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:51:18 np0005465988 nova_compute[236126]: 2025-10-02 12:51:18.775 2 DEBUG oslo_concurrency.processutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:51:18 np0005465988 nova_compute[236126]: 2025-10-02 12:51:18.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:18 np0005465988 nova_compute[236126]: 2025-10-02 12:51:18.828 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:18 np0005465988 nova_compute[236126]: 2025-10-02 12:51:18.829 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:51:18 np0005465988 nova_compute[236126]: 2025-10-02 12:51:18.880 2 DEBUG oslo_concurrency.processutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:51:18 np0005465988 nova_compute[236126]: 2025-10-02 12:51:18.881 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:18 np0005465988 nova_compute[236126]: 2025-10-02 12:51:18.882 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:18 np0005465988 nova_compute[236126]: 2025-10-02 12:51:18.882 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:18 np0005465988 nova_compute[236126]: 2025-10-02 12:51:18.920 2 DEBUG nova.storage.rbd_utils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] rbd image c1ab1282-e440-44ee-9a39-d59322467ce9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:51:18 np0005465988 nova_compute[236126]: 2025-10-02 12:51:18.925 2 DEBUG oslo_concurrency.processutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 c1ab1282-e440-44ee-9a39-d59322467ce9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:51:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:19.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:19 np0005465988 nova_compute[236126]: 2025-10-02 12:51:19.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:19 np0005465988 nova_compute[236126]: 2025-10-02 12:51:19.570 2 DEBUG nova.network.neutron [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Successfully created port: e248b8bd-d715-4187-bc4a-f4451f9237fd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:51:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:51:20 np0005465988 nova_compute[236126]: 2025-10-02 12:51:20.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:20.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:20 np0005465988 nova_compute[236126]: 2025-10-02 12:51:20.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:20 np0005465988 nova_compute[236126]: 2025-10-02 12:51:20.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:20 np0005465988 nova_compute[236126]: 2025-10-02 12:51:20.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:21.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:21 np0005465988 nova_compute[236126]: 2025-10-02 12:51:21.428 2 DEBUG oslo_concurrency.processutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 c1ab1282-e440-44ee-9a39-d59322467ce9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:51:21 np0005465988 nova_compute[236126]: 2025-10-02 12:51:21.551 2 DEBUG nova.storage.rbd_utils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] resizing rbd image c1ab1282-e440-44ee-9a39-d59322467ce9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:51:21 np0005465988 nova_compute[236126]: 2025-10-02 12:51:21.794 2 DEBUG nova.network.neutron [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Successfully updated port: e248b8bd-d715-4187-bc4a-f4451f9237fd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:51:21 np0005465988 nova_compute[236126]: 2025-10-02 12:51:21.811 2 DEBUG nova.compute.manager [req-bff4fbcd-aa18-4573-9edd-d9fcf6d4220d req-b21371e2-b056-44d6-a544-01898ed41b65 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Received event network-changed-e248b8bd-d715-4187-bc4a-f4451f9237fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:51:21 np0005465988 nova_compute[236126]: 2025-10-02 12:51:21.812 2 DEBUG nova.compute.manager [req-bff4fbcd-aa18-4573-9edd-d9fcf6d4220d req-b21371e2-b056-44d6-a544-01898ed41b65 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Refreshing instance network info cache due to event network-changed-e248b8bd-d715-4187-bc4a-f4451f9237fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:51:21 np0005465988 nova_compute[236126]: 2025-10-02 12:51:21.812 2 DEBUG oslo_concurrency.lockutils [req-bff4fbcd-aa18-4573-9edd-d9fcf6d4220d req-b21371e2-b056-44d6-a544-01898ed41b65 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-c1ab1282-e440-44ee-9a39-d59322467ce9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:51:21 np0005465988 nova_compute[236126]: 2025-10-02 12:51:21.812 2 DEBUG oslo_concurrency.lockutils [req-bff4fbcd-aa18-4573-9edd-d9fcf6d4220d req-b21371e2-b056-44d6-a544-01898ed41b65 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-c1ab1282-e440-44ee-9a39-d59322467ce9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:51:21 np0005465988 nova_compute[236126]: 2025-10-02 12:51:21.812 2 DEBUG nova.network.neutron [req-bff4fbcd-aa18-4573-9edd-d9fcf6d4220d req-b21371e2-b056-44d6-a544-01898ed41b65 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Refreshing network info cache for port e248b8bd-d715-4187-bc4a-f4451f9237fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:51:21 np0005465988 nova_compute[236126]: 2025-10-02 12:51:21.905 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "refresh_cache-c1ab1282-e440-44ee-9a39-d59322467ce9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:51:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:22.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:22 np0005465988 nova_compute[236126]: 2025-10-02 12:51:22.274 2 DEBUG nova.network.neutron [req-bff4fbcd-aa18-4573-9edd-d9fcf6d4220d req-b21371e2-b056-44d6-a544-01898ed41b65 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:51:22 np0005465988 nova_compute[236126]: 2025-10-02 12:51:22.736 2 DEBUG nova.network.neutron [req-bff4fbcd-aa18-4573-9edd-d9fcf6d4220d req-b21371e2-b056-44d6-a544-01898ed41b65 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:51:22 np0005465988 nova_compute[236126]: 2025-10-02 12:51:22.851 2 DEBUG oslo_concurrency.lockutils [req-bff4fbcd-aa18-4573-9edd-d9fcf6d4220d req-b21371e2-b056-44d6-a544-01898ed41b65 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-c1ab1282-e440-44ee-9a39-d59322467ce9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:51:22 np0005465988 nova_compute[236126]: 2025-10-02 12:51:22.852 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquired lock "refresh_cache-c1ab1282-e440-44ee-9a39-d59322467ce9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:51:22 np0005465988 nova_compute[236126]: 2025-10-02 12:51:22.853 2 DEBUG nova.network.neutron [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:51:23 np0005465988 nova_compute[236126]: 2025-10-02 12:51:23.226 2 DEBUG nova.network.neutron [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:51:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:23.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:23 np0005465988 nova_compute[236126]: 2025-10-02 12:51:23.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:23 np0005465988 nova_compute[236126]: 2025-10-02 12:51:23.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:51:23 np0005465988 nova_compute[236126]: 2025-10-02 12:51:23.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:23 np0005465988 nova_compute[236126]: 2025-10-02 12:51:23.842 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:51:23 np0005465988 nova_compute[236126]: 2025-10-02 12:51:23.842 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:51:23 np0005465988 nova_compute[236126]: 2025-10-02 12:51:23.842 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:51:23 np0005465988 nova_compute[236126]: 2025-10-02 12:51:23.849 2 DEBUG nova.objects.instance [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lazy-loading 'migration_context' on Instance uuid c1ab1282-e440-44ee-9a39-d59322467ce9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:51:23 np0005465988 nova_compute[236126]: 2025-10-02 12:51:23.896 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:51:23 np0005465988 nova_compute[236126]: 2025-10-02 12:51:23.897 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Ensure instance console log exists: /var/lib/nova/instances/c1ab1282-e440-44ee-9a39-d59322467ce9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:51:23 np0005465988 nova_compute[236126]: 2025-10-02 12:51:23.898 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:23 np0005465988 nova_compute[236126]: 2025-10-02 12:51:23.898 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:23 np0005465988 nova_compute[236126]: 2025-10-02 12:51:23.898 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:24.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.395 2 DEBUG nova.network.neutron [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Updating instance_info_cache with network_info: [{"id": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "address": "fa:16:3e:2b:05:b5", "network": {"id": "9da548d8-bbbf-425f-9d2c-5b231ba235e1", "bridge": "br-int", "label": "tempest-network-smoke--587721749", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape248b8bd-d7", "ovs_interfaceid": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.732 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Releasing lock "refresh_cache-c1ab1282-e440-44ee-9a39-d59322467ce9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.733 2 DEBUG nova.compute.manager [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Instance network_info: |[{"id": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "address": "fa:16:3e:2b:05:b5", "network": {"id": "9da548d8-bbbf-425f-9d2c-5b231ba235e1", "bridge": "br-int", "label": "tempest-network-smoke--587721749", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape248b8bd-d7", "ovs_interfaceid": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.736 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Start _get_guest_xml network_info=[{"id": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "address": "fa:16:3e:2b:05:b5", "network": {"id": "9da548d8-bbbf-425f-9d2c-5b231ba235e1", "bridge": "br-int", "label": "tempest-network-smoke--587721749", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape248b8bd-d7", "ovs_interfaceid": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.743 2 WARNING nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.749 2 DEBUG nova.virt.libvirt.host [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.750 2 DEBUG nova.virt.libvirt.host [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.754 2 DEBUG nova.virt.libvirt.host [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.754 2 DEBUG nova.virt.libvirt.host [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.756 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.756 2 DEBUG nova.virt.hardware [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.756 2 DEBUG nova.virt.hardware [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.757 2 DEBUG nova.virt.hardware [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.757 2 DEBUG nova.virt.hardware [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.757 2 DEBUG nova.virt.hardware [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.757 2 DEBUG nova.virt.hardware [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.757 2 DEBUG nova.virt.hardware [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.758 2 DEBUG nova.virt.hardware [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.758 2 DEBUG nova.virt.hardware [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.758 2 DEBUG nova.virt.hardware [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.758 2 DEBUG nova.virt.hardware [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:51:24 np0005465988 nova_compute[236126]: 2025-10-02 12:51:24.761 2 DEBUG oslo_concurrency.processutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:51:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:51:25 np0005465988 nova_compute[236126]: 2025-10-02 12:51:25.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:51:25 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/453716225' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:51:25 np0005465988 nova_compute[236126]: 2025-10-02 12:51:25.365 2 DEBUG oslo_concurrency.processutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.605s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:51:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:25.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:25 np0005465988 nova_compute[236126]: 2025-10-02 12:51:25.421 2 DEBUG nova.storage.rbd_utils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] rbd image c1ab1282-e440-44ee-9a39-d59322467ce9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:51:25 np0005465988 nova_compute[236126]: 2025-10-02 12:51:25.425 2 DEBUG oslo_concurrency.processutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:51:25 np0005465988 nova_compute[236126]: 2025-10-02 12:51:25.567 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Updating instance_info_cache with network_info: [{"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:51:25 np0005465988 nova_compute[236126]: 2025-10-02 12:51:25.600 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:51:25 np0005465988 nova_compute[236126]: 2025-10-02 12:51:25.601 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:51:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:26.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:51:26 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4176606406' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.264 2 DEBUG oslo_concurrency.processutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.839s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.266 2 DEBUG nova.virt.libvirt.vif [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:51:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1031871880-gen-1-1731550910',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1031871880-gen-1-1731550910',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1031871880-ge',id=180,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNswVb0qzE60u4BIwTdoxN1wCq1mjTg0LI0f0dRfL9Dy5OWE5514P4onN+QOd1Rw6rOzpCQmFmdmXCTuqTp4M4QoUJGeM1vbbJ7M/8I3p45f/7+fxTSWHkhjk1j/aCR05g==',key_name='tempest-TestSecurityGroupsBasicOps-605389685',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5ade962c517a483dbfe4bb13386f0006',ramdisk_id='',reservation_id='r-v3dk7951',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1031871880',owner_user_name='tempest-TestSecurityGroupsBasicOps-1031871880-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:51:17Z,user_data=None,user_id='16730f38111542e58a05fb4deb2b3914',uuid=c1ab1282-e440-44ee-9a39-d59322467ce9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "address": "fa:16:3e:2b:05:b5", "network": {"id": "9da548d8-bbbf-425f-9d2c-5b231ba235e1", "bridge": "br-int", "label": "tempest-network-smoke--587721749", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape248b8bd-d7", "ovs_interfaceid": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.266 2 DEBUG nova.network.os_vif_util [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Converting VIF {"id": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "address": "fa:16:3e:2b:05:b5", "network": {"id": "9da548d8-bbbf-425f-9d2c-5b231ba235e1", "bridge": "br-int", "label": "tempest-network-smoke--587721749", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape248b8bd-d7", "ovs_interfaceid": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.267 2 DEBUG nova.network.os_vif_util [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:05:b5,bridge_name='br-int',has_traffic_filtering=True,id=e248b8bd-d715-4187-bc4a-f4451f9237fd,network=Network(9da548d8-bbbf-425f-9d2c-5b231ba235e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape248b8bd-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.269 2 DEBUG nova.objects.instance [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lazy-loading 'pci_devices' on Instance uuid c1ab1282-e440-44ee-9a39-d59322467ce9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.323 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  <uuid>c1ab1282-e440-44ee-9a39-d59322467ce9</uuid>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  <name>instance-000000b4</name>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1031871880-gen-1-1731550910</nova:name>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:51:24</nova:creationTime>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <nova:user uuid="16730f38111542e58a05fb4deb2b3914">tempest-TestSecurityGroupsBasicOps-1031871880-project-member</nova:user>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <nova:project uuid="5ade962c517a483dbfe4bb13386f0006">tempest-TestSecurityGroupsBasicOps-1031871880</nova:project>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <nova:port uuid="e248b8bd-d715-4187-bc4a-f4451f9237fd">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <entry name="serial">c1ab1282-e440-44ee-9a39-d59322467ce9</entry>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <entry name="uuid">c1ab1282-e440-44ee-9a39-d59322467ce9</entry>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/c1ab1282-e440-44ee-9a39-d59322467ce9_disk">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/c1ab1282-e440-44ee-9a39-d59322467ce9_disk.config">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:2b:05:b5"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <target dev="tape248b8bd-d7"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/c1ab1282-e440-44ee-9a39-d59322467ce9/console.log" append="off"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:51:26 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:51:26 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:51:26 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:51:26 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.325 2 DEBUG nova.compute.manager [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Preparing to wait for external event network-vif-plugged-e248b8bd-d715-4187-bc4a-f4451f9237fd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.325 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.325 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.326 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.326 2 DEBUG nova.virt.libvirt.vif [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:51:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1031871880-gen-1-1731550910',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1031871880-gen-1-1731550910',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1031871880-ge',id=180,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNswVb0qzE60u4BIwTdoxN1wCq1mjTg0LI0f0dRfL9Dy5OWE5514P4onN+QOd1Rw6rOzpCQmFmdmXCTuqTp4M4QoUJGeM1vbbJ7M/8I3p45f/7+fxTSWHkhjk1j/aCR05g==',key_name='tempest-TestSecurityGroupsBasicOps-605389685',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5ade962c517a483dbfe4bb13386f0006',ramdisk_id='',reservation_id='r-v3dk7951',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1031871880',owner_user_name='tempest-TestSecurityGroupsBasicOps-1031871880-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:51:17Z,user_data=None,user_id='16730f38111542e58a05fb4deb2b3914',uuid=c1ab1282-e440-44ee-9a39-d59322467ce9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "address": "fa:16:3e:2b:05:b5", "network": {"id": "9da548d8-bbbf-425f-9d2c-5b231ba235e1", "bridge": "br-int", "label": "tempest-network-smoke--587721749", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape248b8bd-d7", "ovs_interfaceid": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.327 2 DEBUG nova.network.os_vif_util [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Converting VIF {"id": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "address": "fa:16:3e:2b:05:b5", "network": {"id": "9da548d8-bbbf-425f-9d2c-5b231ba235e1", "bridge": "br-int", "label": "tempest-network-smoke--587721749", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape248b8bd-d7", "ovs_interfaceid": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.327 2 DEBUG nova.network.os_vif_util [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:05:b5,bridge_name='br-int',has_traffic_filtering=True,id=e248b8bd-d715-4187-bc4a-f4451f9237fd,network=Network(9da548d8-bbbf-425f-9d2c-5b231ba235e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape248b8bd-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.328 2 DEBUG os_vif [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:05:b5,bridge_name='br-int',has_traffic_filtering=True,id=e248b8bd-d715-4187-bc4a-f4451f9237fd,network=Network(9da548d8-bbbf-425f-9d2c-5b231ba235e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape248b8bd-d7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.329 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.329 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.333 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape248b8bd-d7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.334 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape248b8bd-d7, col_values=(('external_ids', {'iface-id': 'e248b8bd-d715-4187-bc4a-f4451f9237fd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:05:b5', 'vm-uuid': 'c1ab1282-e440-44ee-9a39-d59322467ce9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:26 np0005465988 NetworkManager[45041]: <info>  [1759409486.3369] manager: (tape248b8bd-d7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/356)
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.347 2 INFO os_vif [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:05:b5,bridge_name='br-int',has_traffic_filtering=True,id=e248b8bd-d715-4187-bc4a-f4451f9237fd,network=Network(9da548d8-bbbf-425f-9d2c-5b231ba235e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape248b8bd-d7')#033[00m
Oct  2 08:51:26 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:51:26 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.729 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.730 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.730 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] No VIF found with MAC fa:16:3e:2b:05:b5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.731 2 INFO nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Using config drive#033[00m
Oct  2 08:51:26 np0005465988 nova_compute[236126]: 2025-10-02 12:51:26.927 2 DEBUG nova.storage.rbd_utils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] rbd image c1ab1282-e440-44ee-9a39-d59322467ce9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:51:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:27.397 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:27.398 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:27.398 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:27.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:51:27 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:51:28 np0005465988 nova_compute[236126]: 2025-10-02 12:51:28.050 2 INFO nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Creating config drive at /var/lib/nova/instances/c1ab1282-e440-44ee-9a39-d59322467ce9/disk.config#033[00m
Oct  2 08:51:28 np0005465988 nova_compute[236126]: 2025-10-02 12:51:28.056 2 DEBUG oslo_concurrency.processutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c1ab1282-e440-44ee-9a39-d59322467ce9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw030l29h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:51:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:28.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:28 np0005465988 nova_compute[236126]: 2025-10-02 12:51:28.207 2 DEBUG oslo_concurrency.processutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c1ab1282-e440-44ee-9a39-d59322467ce9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw030l29h" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:51:28 np0005465988 nova_compute[236126]: 2025-10-02 12:51:28.251 2 DEBUG nova.storage.rbd_utils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] rbd image c1ab1282-e440-44ee-9a39-d59322467ce9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:51:28 np0005465988 nova_compute[236126]: 2025-10-02 12:51:28.257 2 DEBUG oslo_concurrency.processutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c1ab1282-e440-44ee-9a39-d59322467ce9/disk.config c1ab1282-e440-44ee-9a39-d59322467ce9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:51:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:51:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:51:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:51:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:29.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:51:30 np0005465988 nova_compute[236126]: 2025-10-02 12:51:30.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:51:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:30.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:51:31 np0005465988 nova_compute[236126]: 2025-10-02 12:51:31.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:31.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:32.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:32 np0005465988 nova_compute[236126]: 2025-10-02 12:51:32.193 2 DEBUG oslo_concurrency.processutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c1ab1282-e440-44ee-9a39-d59322467ce9/disk.config c1ab1282-e440-44ee-9a39-d59322467ce9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.936s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:51:32 np0005465988 nova_compute[236126]: 2025-10-02 12:51:32.194 2 INFO nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Deleting local config drive /var/lib/nova/instances/c1ab1282-e440-44ee-9a39-d59322467ce9/disk.config because it was imported into RBD.#033[00m
Oct  2 08:51:32 np0005465988 kernel: tape248b8bd-d7: entered promiscuous mode
Oct  2 08:51:32 np0005465988 NetworkManager[45041]: <info>  [1759409492.2658] manager: (tape248b8bd-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/357)
Oct  2 08:51:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:51:32Z|00815|binding|INFO|Claiming lport e248b8bd-d715-4187-bc4a-f4451f9237fd for this chassis.
Oct  2 08:51:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:51:32Z|00816|binding|INFO|e248b8bd-d715-4187-bc4a-f4451f9237fd: Claiming fa:16:3e:2b:05:b5 10.100.0.7
Oct  2 08:51:32 np0005465988 nova_compute[236126]: 2025-10-02 12:51:32.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:32 np0005465988 nova_compute[236126]: 2025-10-02 12:51:32.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:32 np0005465988 NetworkManager[45041]: <info>  [1759409492.2808] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/358)
Oct  2 08:51:32 np0005465988 NetworkManager[45041]: <info>  [1759409492.2814] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/359)
Oct  2 08:51:32 np0005465988 nova_compute[236126]: 2025-10-02 12:51:32.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.292 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:05:b5 10.100.0.7'], port_security=['fa:16:3e:2b:05:b5 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c1ab1282-e440-44ee-9a39-d59322467ce9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9da548d8-bbbf-425f-9d2c-5b231ba235e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ade962c517a483dbfe4bb13386f0006', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'eab0f3a3-d674-4d86-8dcb-4153be335100', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eb6054ec-09a9-4745-8f7e-27b822d71fe5, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=e248b8bd-d715-4187-bc4a-f4451f9237fd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.293 142124 INFO neutron.agent.ovn.metadata.agent [-] Port e248b8bd-d715-4187-bc4a-f4451f9237fd in datapath 9da548d8-bbbf-425f-9d2c-5b231ba235e1 bound to our chassis#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.296 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9da548d8-bbbf-425f-9d2c-5b231ba235e1#033[00m
Oct  2 08:51:32 np0005465988 systemd-udevd[318713]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:51:32 np0005465988 systemd-machined[192594]: New machine qemu-86-instance-000000b4.
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.310 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fc33a66a-ccdb-443a-a5c0-e3a4e224e523]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.311 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9da548d8-b1 in ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.313 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9da548d8-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.313 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a500b426-d435-4859-9239-9d8842def35e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.314 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d2b98f83-c285-4c8a-8e51-67432a1607e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 NetworkManager[45041]: <info>  [1759409492.3219] device (tape248b8bd-d7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:51:32 np0005465988 NetworkManager[45041]: <info>  [1759409492.3228] device (tape248b8bd-d7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.327 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[de9090a1-fe4c-4f2d-aabb-5af1d129b15d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.357 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd4515e-dce1-436d-bd00-82c1db90c863]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.391 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4b33982f-80f0-4563-8fc2-ad612b908c1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 systemd[1]: Started Virtual Machine qemu-86-instance-000000b4.
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.401 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e761da0b-5aac-49f5-a3ef-47cd29d91f81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 NetworkManager[45041]: <info>  [1759409492.4037] manager: (tap9da548d8-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/360)
Oct  2 08:51:32 np0005465988 nova_compute[236126]: 2025-10-02 12:51:32.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:51:32Z|00817|binding|INFO|Releasing lport 1fc80788-89b8-413a-b0b0-d36f1a11a2b1 from this chassis (sb_readonly=0)
Oct  2 08:51:32 np0005465988 nova_compute[236126]: 2025-10-02 12:51:32.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.436 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[789fb8a0-33cd-45ee-aa01-a1a53cb1ef32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:51:32Z|00818|binding|INFO|Setting lport e248b8bd-d715-4187-bc4a-f4451f9237fd ovn-installed in OVS
Oct  2 08:51:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:51:32Z|00819|binding|INFO|Setting lport e248b8bd-d715-4187-bc4a-f4451f9237fd up in Southbound
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.440 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[1a5f9489-96fc-4cd1-bf5e-6c9742a55cab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 nova_compute[236126]: 2025-10-02 12:51:32.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:32 np0005465988 NetworkManager[45041]: <info>  [1759409492.4658] device (tap9da548d8-b0): carrier: link connected
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.470 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[0cd608db-c46e-4c66-830a-c6ab696c90ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.492 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e0731cdf-3343-4630-b9c8-9733b8b51f8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9da548d8-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b4:aa:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 239], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 755279, 'reachable_time': 27334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 318745, 'error': None, 'target': 'ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.510 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[262e2257-67b0-413a-bedd-5b6c34e51069]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb4:aa7c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 755279, 'tstamp': 755279}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 318747, 'error': None, 'target': 'ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.529 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[db404edc-b58f-47d7-8e0b-2fb1bc6fec44]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9da548d8-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b4:aa:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 239], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 755279, 'reachable_time': 27334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 318749, 'error': None, 'target': 'ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.564 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[07d033fa-4712-45c6-860c-69dd69352495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.625 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7ccde81f-87f2-4118-98b8-4841d54bb5f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.627 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9da548d8-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.627 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.628 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9da548d8-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:32 np0005465988 NetworkManager[45041]: <info>  [1759409492.6307] manager: (tap9da548d8-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/361)
Oct  2 08:51:32 np0005465988 nova_compute[236126]: 2025-10-02 12:51:32.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:32 np0005465988 kernel: tap9da548d8-b0: entered promiscuous mode
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.633 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9da548d8-b0, col_values=(('external_ids', {'iface-id': '82fa0cfe-92b9-4165-b82b-8f8d8f846f87'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:32 np0005465988 nova_compute[236126]: 2025-10-02 12:51:32.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:32 np0005465988 nova_compute[236126]: 2025-10-02 12:51:32.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:32 np0005465988 ovn_controller[132601]: 2025-10-02T12:51:32Z|00820|binding|INFO|Releasing lport 82fa0cfe-92b9-4165-b82b-8f8d8f846f87 from this chassis (sb_readonly=0)
Oct  2 08:51:32 np0005465988 nova_compute[236126]: 2025-10-02 12:51:32.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.637 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9da548d8-bbbf-425f-9d2c-5b231ba235e1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9da548d8-bbbf-425f-9d2c-5b231ba235e1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.653 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[94a814a0-d4d0-4884-ae8d-bfa1783d8822]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.654 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-9da548d8-bbbf-425f-9d2c-5b231ba235e1
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/9da548d8-bbbf-425f-9d2c-5b231ba235e1.pid.haproxy
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 9da548d8-bbbf-425f-9d2c-5b231ba235e1
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:51:32 np0005465988 nova_compute[236126]: 2025-10-02 12:51:32.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:51:32.655 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1', 'env', 'PROCESS_TAG=haproxy-9da548d8-bbbf-425f-9d2c-5b231ba235e1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9da548d8-bbbf-425f-9d2c-5b231ba235e1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:51:33 np0005465988 podman[318815]: 2025-10-02 12:51:33.039283835 +0000 UTC m=+0.020025549 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:51:33 np0005465988 podman[318815]: 2025-10-02 12:51:33.162649923 +0000 UTC m=+0.143391617 container create 264dc9f85800adb8bf8aa59f83794324d577ad4997a3e34e3b63f544d5a4ceb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:51:33 np0005465988 systemd[1]: Started libpod-conmon-264dc9f85800adb8bf8aa59f83794324d577ad4997a3e34e3b63f544d5a4ceb6.scope.
Oct  2 08:51:33 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:51:33 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1589b9c3a7d7669038d52257e882fe11c00d099fae243c34765482940e289885/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:51:33 np0005465988 podman[318815]: 2025-10-02 12:51:33.331778552 +0000 UTC m=+0.312520276 container init 264dc9f85800adb8bf8aa59f83794324d577ad4997a3e34e3b63f544d5a4ceb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:51:33 np0005465988 podman[318815]: 2025-10-02 12:51:33.339761942 +0000 UTC m=+0.320503646 container start 264dc9f85800adb8bf8aa59f83794324d577ad4997a3e34e3b63f544d5a4ceb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:51:33 np0005465988 neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1[318839]: [NOTICE]   (318843) : New worker (318845) forked
Oct  2 08:51:33 np0005465988 neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1[318839]: [NOTICE]   (318843) : Loading success.
Oct  2 08:51:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:33.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:33 np0005465988 nova_compute[236126]: 2025-10-02 12:51:33.631 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409493.631131, c1ab1282-e440-44ee-9a39-d59322467ce9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:51:33 np0005465988 nova_compute[236126]: 2025-10-02 12:51:33.632 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] VM Started (Lifecycle Event)#033[00m
Oct  2 08:51:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:51:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:34.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.437 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.444 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409493.6319697, c1ab1282-e440-44ee-9a39-d59322467ce9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.445 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:51:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.796 2 DEBUG nova.compute.manager [req-6a0ef22e-2486-4fcb-816a-8cf74c7240a3 req-3e987771-85c7-42d3-a2bf-006e05456c79 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Received event network-vif-plugged-e248b8bd-d715-4187-bc4a-f4451f9237fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.796 2 DEBUG oslo_concurrency.lockutils [req-6a0ef22e-2486-4fcb-816a-8cf74c7240a3 req-3e987771-85c7-42d3-a2bf-006e05456c79 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.796 2 DEBUG oslo_concurrency.lockutils [req-6a0ef22e-2486-4fcb-816a-8cf74c7240a3 req-3e987771-85c7-42d3-a2bf-006e05456c79 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.796 2 DEBUG oslo_concurrency.lockutils [req-6a0ef22e-2486-4fcb-816a-8cf74c7240a3 req-3e987771-85c7-42d3-a2bf-006e05456c79 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.797 2 DEBUG nova.compute.manager [req-6a0ef22e-2486-4fcb-816a-8cf74c7240a3 req-3e987771-85c7-42d3-a2bf-006e05456c79 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Processing event network-vif-plugged-e248b8bd-d715-4187-bc4a-f4451f9237fd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.797 2 DEBUG nova.compute.manager [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.801 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.807 2 INFO nova.virt.libvirt.driver [-] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Instance spawned successfully.#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.808 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.852 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.862 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409494.8009984, c1ab1282-e440-44ee-9a39-d59322467ce9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.863 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.867 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.867 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.868 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.868 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.869 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:51:34 np0005465988 nova_compute[236126]: 2025-10-02 12:51:34.869 2 DEBUG nova.virt.libvirt.driver [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:51:35 np0005465988 nova_compute[236126]: 2025-10-02 12:51:35.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:35 np0005465988 nova_compute[236126]: 2025-10-02 12:51:35.041 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:51:35 np0005465988 nova_compute[236126]: 2025-10-02 12:51:35.044 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:51:35 np0005465988 nova_compute[236126]: 2025-10-02 12:51:35.134 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:51:35 np0005465988 nova_compute[236126]: 2025-10-02 12:51:35.177 2 INFO nova.compute.manager [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Took 17.34 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:51:35 np0005465988 nova_compute[236126]: 2025-10-02 12:51:35.177 2 DEBUG nova.compute.manager [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:51:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:51:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:51:35 np0005465988 nova_compute[236126]: 2025-10-02 12:51:35.350 2 INFO nova.compute.manager [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Took 22.67 seconds to build instance.#033[00m
Oct  2 08:51:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:35.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:35 np0005465988 nova_compute[236126]: 2025-10-02 12:51:35.461 2 DEBUG oslo_concurrency.lockutils [None req-779fdfc6-0155-4cc5-9e99-d93d0bd5ba03 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:36.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:36 np0005465988 nova_compute[236126]: 2025-10-02 12:51:36.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:37 np0005465988 nova_compute[236126]: 2025-10-02 12:51:37.111 2 DEBUG nova.compute.manager [req-465fc611-5d9a-40b1-8633-ee57376450b9 req-068d6f14-a547-4bda-a5b8-5fbcd42fc6af d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Received event network-vif-plugged-e248b8bd-d715-4187-bc4a-f4451f9237fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:51:37 np0005465988 nova_compute[236126]: 2025-10-02 12:51:37.112 2 DEBUG oslo_concurrency.lockutils [req-465fc611-5d9a-40b1-8633-ee57376450b9 req-068d6f14-a547-4bda-a5b8-5fbcd42fc6af d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:37 np0005465988 nova_compute[236126]: 2025-10-02 12:51:37.112 2 DEBUG oslo_concurrency.lockutils [req-465fc611-5d9a-40b1-8633-ee57376450b9 req-068d6f14-a547-4bda-a5b8-5fbcd42fc6af d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:37 np0005465988 nova_compute[236126]: 2025-10-02 12:51:37.112 2 DEBUG oslo_concurrency.lockutils [req-465fc611-5d9a-40b1-8633-ee57376450b9 req-068d6f14-a547-4bda-a5b8-5fbcd42fc6af d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:37 np0005465988 nova_compute[236126]: 2025-10-02 12:51:37.113 2 DEBUG nova.compute.manager [req-465fc611-5d9a-40b1-8633-ee57376450b9 req-068d6f14-a547-4bda-a5b8-5fbcd42fc6af d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] No waiting events found dispatching network-vif-plugged-e248b8bd-d715-4187-bc4a-f4451f9237fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:51:37 np0005465988 nova_compute[236126]: 2025-10-02 12:51:37.113 2 WARNING nova.compute.manager [req-465fc611-5d9a-40b1-8633-ee57376450b9 req-068d6f14-a547-4bda-a5b8-5fbcd42fc6af d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Received unexpected event network-vif-plugged-e248b8bd-d715-4187-bc4a-f4451f9237fd for instance with vm_state active and task_state None.#033[00m
Oct  2 08:51:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:37.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:37 np0005465988 podman[318956]: 2025-10-02 12:51:37.544352217 +0000 UTC m=+0.075909911 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:51:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:38.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:39.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:51:40 np0005465988 nova_compute[236126]: 2025-10-02 12:51:40.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:40.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:41 np0005465988 nova_compute[236126]: 2025-10-02 12:51:41.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:41.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:41 np0005465988 nova_compute[236126]: 2025-10-02 12:51:41.459 2 DEBUG nova.compute.manager [req-fef40b79-c4eb-46e6-992f-4a048ce195b6 req-dbbb8eb9-6c5f-4bfc-a718-30339a0deb7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Received event network-changed-e248b8bd-d715-4187-bc4a-f4451f9237fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:51:41 np0005465988 nova_compute[236126]: 2025-10-02 12:51:41.459 2 DEBUG nova.compute.manager [req-fef40b79-c4eb-46e6-992f-4a048ce195b6 req-dbbb8eb9-6c5f-4bfc-a718-30339a0deb7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Refreshing instance network info cache due to event network-changed-e248b8bd-d715-4187-bc4a-f4451f9237fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:51:41 np0005465988 nova_compute[236126]: 2025-10-02 12:51:41.460 2 DEBUG oslo_concurrency.lockutils [req-fef40b79-c4eb-46e6-992f-4a048ce195b6 req-dbbb8eb9-6c5f-4bfc-a718-30339a0deb7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-c1ab1282-e440-44ee-9a39-d59322467ce9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:51:41 np0005465988 nova_compute[236126]: 2025-10-02 12:51:41.460 2 DEBUG oslo_concurrency.lockutils [req-fef40b79-c4eb-46e6-992f-4a048ce195b6 req-dbbb8eb9-6c5f-4bfc-a718-30339a0deb7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-c1ab1282-e440-44ee-9a39-d59322467ce9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:51:41 np0005465988 nova_compute[236126]: 2025-10-02 12:51:41.460 2 DEBUG nova.network.neutron [req-fef40b79-c4eb-46e6-992f-4a048ce195b6 req-dbbb8eb9-6c5f-4bfc-a718-30339a0deb7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Refreshing network info cache for port e248b8bd-d715-4187-bc4a-f4451f9237fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:51:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:42.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:43.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:43 np0005465988 nova_compute[236126]: 2025-10-02 12:51:43.779 2 DEBUG nova.compute.manager [req-e9065eb2-e480-41d4-b280-0194eb376ff3 req-2070001f-d704-45a6-84bb-9790ac9c3e0d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Received event network-changed-e248b8bd-d715-4187-bc4a-f4451f9237fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:51:43 np0005465988 nova_compute[236126]: 2025-10-02 12:51:43.779 2 DEBUG nova.compute.manager [req-e9065eb2-e480-41d4-b280-0194eb376ff3 req-2070001f-d704-45a6-84bb-9790ac9c3e0d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Refreshing instance network info cache due to event network-changed-e248b8bd-d715-4187-bc4a-f4451f9237fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:51:43 np0005465988 nova_compute[236126]: 2025-10-02 12:51:43.780 2 DEBUG oslo_concurrency.lockutils [req-e9065eb2-e480-41d4-b280-0194eb376ff3 req-2070001f-d704-45a6-84bb-9790ac9c3e0d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-c1ab1282-e440-44ee-9a39-d59322467ce9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:51:44 np0005465988 nova_compute[236126]: 2025-10-02 12:51:44.060 2 DEBUG nova.network.neutron [req-fef40b79-c4eb-46e6-992f-4a048ce195b6 req-dbbb8eb9-6c5f-4bfc-a718-30339a0deb7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Updated VIF entry in instance network info cache for port e248b8bd-d715-4187-bc4a-f4451f9237fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:51:44 np0005465988 nova_compute[236126]: 2025-10-02 12:51:44.060 2 DEBUG nova.network.neutron [req-fef40b79-c4eb-46e6-992f-4a048ce195b6 req-dbbb8eb9-6c5f-4bfc-a718-30339a0deb7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Updating instance_info_cache with network_info: [{"id": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "address": "fa:16:3e:2b:05:b5", "network": {"id": "9da548d8-bbbf-425f-9d2c-5b231ba235e1", "bridge": "br-int", "label": "tempest-network-smoke--587721749", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape248b8bd-d7", "ovs_interfaceid": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:51:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:44.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:44 np0005465988 nova_compute[236126]: 2025-10-02 12:51:44.215 2 DEBUG oslo_concurrency.lockutils [req-fef40b79-c4eb-46e6-992f-4a048ce195b6 req-dbbb8eb9-6c5f-4bfc-a718-30339a0deb7d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-c1ab1282-e440-44ee-9a39-d59322467ce9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:51:44 np0005465988 nova_compute[236126]: 2025-10-02 12:51:44.216 2 DEBUG oslo_concurrency.lockutils [req-e9065eb2-e480-41d4-b280-0194eb376ff3 req-2070001f-d704-45a6-84bb-9790ac9c3e0d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-c1ab1282-e440-44ee-9a39-d59322467ce9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:51:44 np0005465988 nova_compute[236126]: 2025-10-02 12:51:44.216 2 DEBUG nova.network.neutron [req-e9065eb2-e480-41d4-b280-0194eb376ff3 req-2070001f-d704-45a6-84bb-9790ac9c3e0d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Refreshing network info cache for port e248b8bd-d715-4187-bc4a-f4451f9237fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:51:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:51:45 np0005465988 nova_compute[236126]: 2025-10-02 12:51:45.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:45.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:46.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:46 np0005465988 nova_compute[236126]: 2025-10-02 12:51:46.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:47 np0005465988 nova_compute[236126]: 2025-10-02 12:51:47.022 2 DEBUG nova.network.neutron [req-e9065eb2-e480-41d4-b280-0194eb376ff3 req-2070001f-d704-45a6-84bb-9790ac9c3e0d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Updated VIF entry in instance network info cache for port e248b8bd-d715-4187-bc4a-f4451f9237fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:51:47 np0005465988 nova_compute[236126]: 2025-10-02 12:51:47.023 2 DEBUG nova.network.neutron [req-e9065eb2-e480-41d4-b280-0194eb376ff3 req-2070001f-d704-45a6-84bb-9790ac9c3e0d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Updating instance_info_cache with network_info: [{"id": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "address": "fa:16:3e:2b:05:b5", "network": {"id": "9da548d8-bbbf-425f-9d2c-5b231ba235e1", "bridge": "br-int", "label": "tempest-network-smoke--587721749", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape248b8bd-d7", "ovs_interfaceid": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:51:47 np0005465988 nova_compute[236126]: 2025-10-02 12:51:47.115 2 DEBUG oslo_concurrency.lockutils [req-e9065eb2-e480-41d4-b280-0194eb376ff3 req-2070001f-d704-45a6-84bb-9790ac9c3e0d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-c1ab1282-e440-44ee-9a39-d59322467ce9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:51:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:47.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:48.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:49 np0005465988 ovn_controller[132601]: 2025-10-02T12:51:49Z|00106|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2b:05:b5 10.100.0.7
Oct  2 08:51:49 np0005465988 ovn_controller[132601]: 2025-10-02T12:51:49Z|00107|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2b:05:b5 10.100.0.7
Oct  2 08:51:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:49.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:49 np0005465988 podman[318985]: 2025-10-02 12:51:49.549498073 +0000 UTC m=+0.073629465 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:51:49 np0005465988 podman[318986]: 2025-10-02 12:51:49.550865772 +0000 UTC m=+0.073221003 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:51:49 np0005465988 podman[318984]: 2025-10-02 12:51:49.574234456 +0000 UTC m=+0.104008391 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:51:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:51:50 np0005465988 nova_compute[236126]: 2025-10-02 12:51:50.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:50.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:51 np0005465988 nova_compute[236126]: 2025-10-02 12:51:51.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:51.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:52.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:53.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:54.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:51:55 np0005465988 nova_compute[236126]: 2025-10-02 12:51:55.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:55.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:51:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:56.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:51:56 np0005465988 nova_compute[236126]: 2025-10-02 12:51:56.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:57.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:51:58.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:51:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:51:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:51:59.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:51:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:52:00 np0005465988 nova_compute[236126]: 2025-10-02 12:52:00.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:00.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:00 np0005465988 nova_compute[236126]: 2025-10-02 12:52:00.424 2 DEBUG oslo_concurrency.lockutils [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "c1ab1282-e440-44ee-9a39-d59322467ce9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:52:00 np0005465988 nova_compute[236126]: 2025-10-02 12:52:00.425 2 DEBUG oslo_concurrency.lockutils [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:52:00 np0005465988 nova_compute[236126]: 2025-10-02 12:52:00.425 2 DEBUG oslo_concurrency.lockutils [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:52:00 np0005465988 nova_compute[236126]: 2025-10-02 12:52:00.425 2 DEBUG oslo_concurrency.lockutils [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:52:00 np0005465988 nova_compute[236126]: 2025-10-02 12:52:00.426 2 DEBUG oslo_concurrency.lockutils [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:52:00 np0005465988 nova_compute[236126]: 2025-10-02 12:52:00.427 2 INFO nova.compute.manager [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Terminating instance#033[00m
Oct  2 08:52:00 np0005465988 nova_compute[236126]: 2025-10-02 12:52:00.428 2 DEBUG nova.compute.manager [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:52:00 np0005465988 kernel: tape248b8bd-d7 (unregistering): left promiscuous mode
Oct  2 08:52:00 np0005465988 NetworkManager[45041]: <info>  [1759409520.8806] device (tape248b8bd-d7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:52:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:52:00Z|00821|binding|INFO|Releasing lport e248b8bd-d715-4187-bc4a-f4451f9237fd from this chassis (sb_readonly=0)
Oct  2 08:52:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:52:00Z|00822|binding|INFO|Setting lport e248b8bd-d715-4187-bc4a-f4451f9237fd down in Southbound
Oct  2 08:52:00 np0005465988 ovn_controller[132601]: 2025-10-02T12:52:00Z|00823|binding|INFO|Removing iface tape248b8bd-d7 ovn-installed in OVS
Oct  2 08:52:00 np0005465988 nova_compute[236126]: 2025-10-02 12:52:00.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:00 np0005465988 nova_compute[236126]: 2025-10-02 12:52:00.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:01 np0005465988 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b4.scope: Deactivated successfully.
Oct  2 08:52:01 np0005465988 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b4.scope: Consumed 14.239s CPU time.
Oct  2 08:52:01 np0005465988 systemd-machined[192594]: Machine qemu-86-instance-000000b4 terminated.
Oct  2 08:52:01 np0005465988 nova_compute[236126]: 2025-10-02 12:52:01.072 2 INFO nova.virt.libvirt.driver [-] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Instance destroyed successfully.#033[00m
Oct  2 08:52:01 np0005465988 nova_compute[236126]: 2025-10-02 12:52:01.073 2 DEBUG nova.objects.instance [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lazy-loading 'resources' on Instance uuid c1ab1282-e440-44ee-9a39-d59322467ce9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:52:01 np0005465988 nova_compute[236126]: 2025-10-02 12:52:01.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:01.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:02.159 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:05:b5 10.100.0.7', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c1ab1282-e440-44ee-9a39-d59322467ce9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9da548d8-bbbf-425f-9d2c-5b231ba235e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ade962c517a483dbfe4bb13386f0006', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eb6054ec-09a9-4745-8f7e-27b822d71fe5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=e248b8bd-d715-4187-bc4a-f4451f9237fd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:52:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:02.161 142124 INFO neutron.agent.ovn.metadata.agent [-] Port e248b8bd-d715-4187-bc4a-f4451f9237fd in datapath 9da548d8-bbbf-425f-9d2c-5b231ba235e1 unbound from our chassis#033[00m
Oct  2 08:52:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:02.163 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9da548d8-bbbf-425f-9d2c-5b231ba235e1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:52:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:02.164 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4520caa9-4375-4341-b9b1-4e9b873a2837]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:52:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:02.165 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1 namespace which is not needed anymore#033[00m
Oct  2 08:52:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:02.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:02 np0005465988 nova_compute[236126]: 2025-10-02 12:52:02.328 2 DEBUG nova.virt.libvirt.vif [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:51:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1031871880-gen-1-1731550910',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1031871880-gen-1-1731550910',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1031871880-ge',id=180,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNswVb0qzE60u4BIwTdoxN1wCq1mjTg0LI0f0dRfL9Dy5OWE5514P4onN+QOd1Rw6rOzpCQmFmdmXCTuqTp4M4QoUJGeM1vbbJ7M/8I3p45f/7+fxTSWHkhjk1j/aCR05g==',key_name='tempest-TestSecurityGroupsBasicOps-605389685',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:51:35Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5ade962c517a483dbfe4bb13386f0006',ramdisk_id='',reservation_id='r-v3dk7951',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1031871880',owner_user_name='tempest-TestSecurityGroupsBasicOps-1031871880-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:51:35Z,user_data=None,user_id='16730f38111542e58a05fb4deb2b3914',uuid=c1ab1282-e440-44ee-9a39-d59322467ce9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "address": "fa:16:3e:2b:05:b5", "network": {"id": "9da548d8-bbbf-425f-9d2c-5b231ba235e1", "bridge": "br-int", "label": "tempest-network-smoke--587721749", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape248b8bd-d7", "ovs_interfaceid": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:52:02 np0005465988 nova_compute[236126]: 2025-10-02 12:52:02.329 2 DEBUG nova.network.os_vif_util [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Converting VIF {"id": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "address": "fa:16:3e:2b:05:b5", "network": {"id": "9da548d8-bbbf-425f-9d2c-5b231ba235e1", "bridge": "br-int", "label": "tempest-network-smoke--587721749", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape248b8bd-d7", "ovs_interfaceid": "e248b8bd-d715-4187-bc4a-f4451f9237fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:52:02 np0005465988 nova_compute[236126]: 2025-10-02 12:52:02.330 2 DEBUG nova.network.os_vif_util [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2b:05:b5,bridge_name='br-int',has_traffic_filtering=True,id=e248b8bd-d715-4187-bc4a-f4451f9237fd,network=Network(9da548d8-bbbf-425f-9d2c-5b231ba235e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape248b8bd-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:52:02 np0005465988 nova_compute[236126]: 2025-10-02 12:52:02.331 2 DEBUG os_vif [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:05:b5,bridge_name='br-int',has_traffic_filtering=True,id=e248b8bd-d715-4187-bc4a-f4451f9237fd,network=Network(9da548d8-bbbf-425f-9d2c-5b231ba235e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape248b8bd-d7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:52:02 np0005465988 nova_compute[236126]: 2025-10-02 12:52:02.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:02 np0005465988 nova_compute[236126]: 2025-10-02 12:52:02.335 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape248b8bd-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:52:02 np0005465988 nova_compute[236126]: 2025-10-02 12:52:02.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:02 np0005465988 nova_compute[236126]: 2025-10-02 12:52:02.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:02 np0005465988 nova_compute[236126]: 2025-10-02 12:52:02.342 2 INFO os_vif [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:05:b5,bridge_name='br-int',has_traffic_filtering=True,id=e248b8bd-d715-4187-bc4a-f4451f9237fd,network=Network(9da548d8-bbbf-425f-9d2c-5b231ba235e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape248b8bd-d7')#033[00m
Oct  2 08:52:02 np0005465988 nova_compute[236126]: 2025-10-02 12:52:02.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:02 np0005465988 neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1[318839]: [NOTICE]   (318843) : haproxy version is 2.8.14-c23fe91
Oct  2 08:52:02 np0005465988 neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1[318839]: [NOTICE]   (318843) : path to executable is /usr/sbin/haproxy
Oct  2 08:52:02 np0005465988 neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1[318839]: [WARNING]  (318843) : Exiting Master process...
Oct  2 08:52:02 np0005465988 neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1[318839]: [ALERT]    (318843) : Current worker (318845) exited with code 143 (Terminated)
Oct  2 08:52:02 np0005465988 neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1[318839]: [WARNING]  (318843) : All workers exited. Exiting... (0)
Oct  2 08:52:02 np0005465988 systemd[1]: libpod-264dc9f85800adb8bf8aa59f83794324d577ad4997a3e34e3b63f544d5a4ceb6.scope: Deactivated successfully.
Oct  2 08:52:02 np0005465988 podman[319135]: 2025-10-02 12:52:02.564901123 +0000 UTC m=+0.287598008 container died 264dc9f85800adb8bf8aa59f83794324d577ad4997a3e34e3b63f544d5a4ceb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:52:03 np0005465988 nova_compute[236126]: 2025-10-02 12:52:03.366 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:52:03 np0005465988 nova_compute[236126]: 2025-10-02 12:52:03.367 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:52:03 np0005465988 nova_compute[236126]: 2025-10-02 12:52:03.367 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:52:03 np0005465988 nova_compute[236126]: 2025-10-02 12:52:03.367 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:52:03 np0005465988 nova_compute[236126]: 2025-10-02 12:52:03.367 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:52:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:03.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:04.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:04 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-264dc9f85800adb8bf8aa59f83794324d577ad4997a3e34e3b63f544d5a4ceb6-userdata-shm.mount: Deactivated successfully.
Oct  2 08:52:04 np0005465988 systemd[1]: var-lib-containers-storage-overlay-1589b9c3a7d7669038d52257e882fe11c00d099fae243c34765482940e289885-merged.mount: Deactivated successfully.
Oct  2 08:52:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:52:04 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1114481088' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:52:04 np0005465988 nova_compute[236126]: 2025-10-02 12:52:04.686 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.319s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:52:05 np0005465988 nova_compute[236126]: 2025-10-02 12:52:05.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:52:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:05.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:05 np0005465988 podman[319135]: 2025-10-02 12:52:05.565549715 +0000 UTC m=+3.288246570 container cleanup 264dc9f85800adb8bf8aa59f83794324d577ad4997a3e34e3b63f544d5a4ceb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:52:05 np0005465988 systemd[1]: libpod-conmon-264dc9f85800adb8bf8aa59f83794324d577ad4997a3e34e3b63f544d5a4ceb6.scope: Deactivated successfully.
Oct  2 08:52:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:06.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:06 np0005465988 podman[319207]: 2025-10-02 12:52:06.541889581 +0000 UTC m=+0.947076042 container remove 264dc9f85800adb8bf8aa59f83794324d577ad4997a3e34e3b63f544d5a4ceb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:52:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:06.553 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[227ab9d2-286c-4551-96f9-3a71c4dd9f86]: (4, ('Thu Oct  2 12:52:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1 (264dc9f85800adb8bf8aa59f83794324d577ad4997a3e34e3b63f544d5a4ceb6)\n264dc9f85800adb8bf8aa59f83794324d577ad4997a3e34e3b63f544d5a4ceb6\nThu Oct  2 12:52:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1 (264dc9f85800adb8bf8aa59f83794324d577ad4997a3e34e3b63f544d5a4ceb6)\n264dc9f85800adb8bf8aa59f83794324d577ad4997a3e34e3b63f544d5a4ceb6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:52:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:06.555 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[188678f7-4fb3-4e98-bcf4-deb1939f1382]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:52:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:06.557 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9da548d8-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:52:06 np0005465988 nova_compute[236126]: 2025-10-02 12:52:06.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:06 np0005465988 kernel: tap9da548d8-b0: left promiscuous mode
Oct  2 08:52:06 np0005465988 nova_compute[236126]: 2025-10-02 12:52:06.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:06.589 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c81cf2-e457-4387-bc1f-49e5fac57440]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:52:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:06.626 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1321b627-e590-4ff4-9bed-87333057dfaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:52:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:06.628 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[65e4a897-e490-4dd4-8400-5202cabbae1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:52:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:06.651 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9c247a16-9728-42d1-a586-8d06a1709147]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 755271, 'reachable_time': 39380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 319224, 'error': None, 'target': 'ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:52:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:06.655 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9da548d8-bbbf-425f-9d2c-5b231ba235e1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:52:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:06.655 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[4c50fc41-f54b-4547-939c-338505b05f6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:52:06 np0005465988 systemd[1]: run-netns-ovnmeta\x2d9da548d8\x2dbbbf\x2d425f\x2d9d2c\x2d5b231ba235e1.mount: Deactivated successfully.
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.042 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.043 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.048 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.048 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.053 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000b4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.054 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000b4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.290 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.292 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3612MB free_disk=20.694046020507812GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.292 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.292 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:07.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.507 2 DEBUG nova.compute.manager [req-738a2bd3-b24b-467d-901f-66c28a42205f req-7a6da6fd-50b3-4d95-b372-d63b325fafba d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Received event network-vif-unplugged-e248b8bd-d715-4187-bc4a-f4451f9237fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.508 2 DEBUG oslo_concurrency.lockutils [req-738a2bd3-b24b-467d-901f-66c28a42205f req-7a6da6fd-50b3-4d95-b372-d63b325fafba d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.508 2 DEBUG oslo_concurrency.lockutils [req-738a2bd3-b24b-467d-901f-66c28a42205f req-7a6da6fd-50b3-4d95-b372-d63b325fafba d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.509 2 DEBUG oslo_concurrency.lockutils [req-738a2bd3-b24b-467d-901f-66c28a42205f req-7a6da6fd-50b3-4d95-b372-d63b325fafba d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.509 2 DEBUG nova.compute.manager [req-738a2bd3-b24b-467d-901f-66c28a42205f req-7a6da6fd-50b3-4d95-b372-d63b325fafba d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] No waiting events found dispatching network-vif-unplugged-e248b8bd-d715-4187-bc4a-f4451f9237fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:52:07 np0005465988 nova_compute[236126]: 2025-10-02 12:52:07.509 2 DEBUG nova.compute.manager [req-738a2bd3-b24b-467d-901f-66c28a42205f req-7a6da6fd-50b3-4d95-b372-d63b325fafba d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Received event network-vif-unplugged-e248b8bd-d715-4187-bc4a-f4451f9237fd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:52:08 np0005465988 nova_compute[236126]: 2025-10-02 12:52:08.045 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 439392e5-66ae-4162-a7e5-077f87ca558b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:52:08 np0005465988 nova_compute[236126]: 2025-10-02 12:52:08.046 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:52:08 np0005465988 nova_compute[236126]: 2025-10-02 12:52:08.046 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance c1ab1282-e440-44ee-9a39-d59322467ce9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:52:08 np0005465988 nova_compute[236126]: 2025-10-02 12:52:08.046 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:52:08 np0005465988 nova_compute[236126]: 2025-10-02 12:52:08.046 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:52:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:08.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:08 np0005465988 nova_compute[236126]: 2025-10-02 12:52:08.246 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:52:08 np0005465988 podman[319248]: 2025-10-02 12:52:08.530204321 +0000 UTC m=+0.074161431 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:52:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:52:08 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2235344967' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:52:08 np0005465988 nova_compute[236126]: 2025-10-02 12:52:08.718 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:52:08 np0005465988 nova_compute[236126]: 2025-10-02 12:52:08.725 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:52:09 np0005465988 nova_compute[236126]: 2025-10-02 12:52:09.164 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:52:09 np0005465988 nova_compute[236126]: 2025-10-02 12:52:09.295 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:52:09 np0005465988 nova_compute[236126]: 2025-10-02 12:52:09.296 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:52:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:09.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:09 np0005465988 nova_compute[236126]: 2025-10-02 12:52:09.839 2 DEBUG nova.compute.manager [req-2646d91f-2151-470f-b15d-4570ecaea33d req-5f038397-5cb7-4790-9224-daabdd88ee74 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Received event network-vif-plugged-e248b8bd-d715-4187-bc4a-f4451f9237fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:52:09 np0005465988 nova_compute[236126]: 2025-10-02 12:52:09.840 2 DEBUG oslo_concurrency.lockutils [req-2646d91f-2151-470f-b15d-4570ecaea33d req-5f038397-5cb7-4790-9224-daabdd88ee74 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:52:09 np0005465988 nova_compute[236126]: 2025-10-02 12:52:09.841 2 DEBUG oslo_concurrency.lockutils [req-2646d91f-2151-470f-b15d-4570ecaea33d req-5f038397-5cb7-4790-9224-daabdd88ee74 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:52:09 np0005465988 nova_compute[236126]: 2025-10-02 12:52:09.842 2 DEBUG oslo_concurrency.lockutils [req-2646d91f-2151-470f-b15d-4570ecaea33d req-5f038397-5cb7-4790-9224-daabdd88ee74 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:52:09 np0005465988 nova_compute[236126]: 2025-10-02 12:52:09.842 2 DEBUG nova.compute.manager [req-2646d91f-2151-470f-b15d-4570ecaea33d req-5f038397-5cb7-4790-9224-daabdd88ee74 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] No waiting events found dispatching network-vif-plugged-e248b8bd-d715-4187-bc4a-f4451f9237fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:52:09 np0005465988 nova_compute[236126]: 2025-10-02 12:52:09.843 2 WARNING nova.compute.manager [req-2646d91f-2151-470f-b15d-4570ecaea33d req-5f038397-5cb7-4790-9224-daabdd88ee74 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Received unexpected event network-vif-plugged-e248b8bd-d715-4187-bc4a-f4451f9237fd for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:52:10 np0005465988 nova_compute[236126]: 2025-10-02 12:52:10.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:52:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:10.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:11.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:12.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:12 np0005465988 nova_compute[236126]: 2025-10-02 12:52:12.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:13.145 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:52:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:13.146 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:52:13 np0005465988 nova_compute[236126]: 2025-10-02 12:52:13.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:13 np0005465988 nova_compute[236126]: 2025-10-02 12:52:13.297 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:13.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:14.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:15 np0005465988 nova_compute[236126]: 2025-10-02 12:52:15.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:15 np0005465988 nova_compute[236126]: 2025-10-02 12:52:15.154 2 INFO nova.virt.libvirt.driver [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Deleting instance files /var/lib/nova/instances/c1ab1282-e440-44ee-9a39-d59322467ce9_del#033[00m
Oct  2 08:52:15 np0005465988 nova_compute[236126]: 2025-10-02 12:52:15.155 2 INFO nova.virt.libvirt.driver [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Deletion of /var/lib/nova/instances/c1ab1282-e440-44ee-9a39-d59322467ce9_del complete#033[00m
Oct  2 08:52:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:52:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:15.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:16 np0005465988 nova_compute[236126]: 2025-10-02 12:52:16.070 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409521.0690436, c1ab1282-e440-44ee-9a39-d59322467ce9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:52:16 np0005465988 nova_compute[236126]: 2025-10-02 12:52:16.070 2 INFO nova.compute.manager [-] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:52:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:16.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:16 np0005465988 nova_compute[236126]: 2025-10-02 12:52:16.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:17 np0005465988 nova_compute[236126]: 2025-10-02 12:52:17.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:17.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e379 e379: 3 total, 3 up, 3 in
Oct  2 08:52:18 np0005465988 nova_compute[236126]: 2025-10-02 12:52:18.168 2 INFO nova.compute.manager [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Took 17.74 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:52:18 np0005465988 nova_compute[236126]: 2025-10-02 12:52:18.169 2 DEBUG oslo.service.loopingcall [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:52:18 np0005465988 nova_compute[236126]: 2025-10-02 12:52:18.170 2 DEBUG nova.compute.manager [-] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:52:18 np0005465988 nova_compute[236126]: 2025-10-02 12:52:18.170 2 DEBUG nova.network.neutron [-] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:52:18 np0005465988 nova_compute[236126]: 2025-10-02 12:52:18.193 2 DEBUG nova.compute.manager [None req-2664fa1c-6225-4ada-b30d-61fb51061251 - - - - - -] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:52:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:18.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:18 np0005465988 nova_compute[236126]: 2025-10-02 12:52:18.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:18 np0005465988 nova_compute[236126]: 2025-10-02 12:52:18.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:52:19 np0005465988 nova_compute[236126]: 2025-10-02 12:52:19.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:52:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:19.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:52:20 np0005465988 nova_compute[236126]: 2025-10-02 12:52:20.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:52:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:20.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:20 np0005465988 nova_compute[236126]: 2025-10-02 12:52:20.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:20 np0005465988 podman[319331]: 2025-10-02 12:52:20.553493139 +0000 UTC m=+0.076582020 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:52:20 np0005465988 podman[319332]: 2025-10-02 12:52:20.564014583 +0000 UTC m=+0.074762038 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:52:20 np0005465988 podman[319330]: 2025-10-02 12:52:20.586293075 +0000 UTC m=+0.113739592 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:52:21 np0005465988 nova_compute[236126]: 2025-10-02 12:52:21.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:21.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:22.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:22 np0005465988 nova_compute[236126]: 2025-10-02 12:52:22.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:22 np0005465988 nova_compute[236126]: 2025-10-02 12:52:22.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:23.148 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:52:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e380 e380: 3 total, 3 up, 3 in
Oct  2 08:52:23 np0005465988 nova_compute[236126]: 2025-10-02 12:52:23.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:23 np0005465988 nova_compute[236126]: 2025-10-02 12:52:23.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:52:23 np0005465988 nova_compute[236126]: 2025-10-02 12:52:23.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:52:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:52:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:23.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:52:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:24.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:24 np0005465988 nova_compute[236126]: 2025-10-02 12:52:24.717 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Oct  2 08:52:25 np0005465988 nova_compute[236126]: 2025-10-02 12:52:25.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:52:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:25.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e381 e381: 3 total, 3 up, 3 in
Oct  2 08:52:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:26.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:27.399 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:52:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:27.399 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:52:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:52:27.400 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:52:27 np0005465988 nova_compute[236126]: 2025-10-02 12:52:27.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:27.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:28.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:29.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 e382: 3 total, 3 up, 3 in
Oct  2 08:52:30 np0005465988 nova_compute[236126]: 2025-10-02 12:52:30.037 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:52:30 np0005465988 nova_compute[236126]: 2025-10-02 12:52:30.037 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:52:30 np0005465988 nova_compute[236126]: 2025-10-02 12:52:30.038 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:52:30 np0005465988 nova_compute[236126]: 2025-10-02 12:52:30.038 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:52:30 np0005465988 nova_compute[236126]: 2025-10-02 12:52:30.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:30.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:52:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:31.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:31 np0005465988 nova_compute[236126]: 2025-10-02 12:52:31.645 2 DEBUG nova.network.neutron [-] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:52:31 np0005465988 nova_compute[236126]: 2025-10-02 12:52:31.744 2 DEBUG nova.compute.manager [req-91193925-1f1a-4180-8c8d-e09e768bf558 req-a6e75fe1-5ee8-475d-93ce-48e41471af09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Received event network-vif-deleted-e248b8bd-d715-4187-bc4a-f4451f9237fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:52:31 np0005465988 nova_compute[236126]: 2025-10-02 12:52:31.744 2 INFO nova.compute.manager [req-91193925-1f1a-4180-8c8d-e09e768bf558 req-a6e75fe1-5ee8-475d-93ce-48e41471af09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Neutron deleted interface e248b8bd-d715-4187-bc4a-f4451f9237fd; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:52:31 np0005465988 nova_compute[236126]: 2025-10-02 12:52:31.745 2 DEBUG nova.network.neutron [req-91193925-1f1a-4180-8c8d-e09e768bf558 req-a6e75fe1-5ee8-475d-93ce-48e41471af09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:52:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:32.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:32 np0005465988 nova_compute[236126]: 2025-10-02 12:52:32.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:33 np0005465988 nova_compute[236126]: 2025-10-02 12:52:33.073 2 INFO nova.compute.manager [-] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Took 14.90 seconds to deallocate network for instance.#033[00m
Oct  2 08:52:33 np0005465988 nova_compute[236126]: 2025-10-02 12:52:33.082 2 DEBUG nova.compute.manager [req-91193925-1f1a-4180-8c8d-e09e768bf558 req-a6e75fe1-5ee8-475d-93ce-48e41471af09 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: c1ab1282-e440-44ee-9a39-d59322467ce9] Detach interface failed, port_id=e248b8bd-d715-4187-bc4a-f4451f9237fd, reason: Instance c1ab1282-e440-44ee-9a39-d59322467ce9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:52:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:33.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:33 np0005465988 nova_compute[236126]: 2025-10-02 12:52:33.962 2 DEBUG oslo_concurrency.lockutils [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:52:33 np0005465988 nova_compute[236126]: 2025-10-02 12:52:33.963 2 DEBUG oslo_concurrency.lockutils [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:52:34 np0005465988 nova_compute[236126]: 2025-10-02 12:52:34.116 2 DEBUG oslo_concurrency.processutils [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:52:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:34.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:52:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/653364688' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:52:34 np0005465988 nova_compute[236126]: 2025-10-02 12:52:34.598 2 DEBUG oslo_concurrency.processutils [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:52:34 np0005465988 nova_compute[236126]: 2025-10-02 12:52:34.605 2 DEBUG nova.compute.provider_tree [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:52:34 np0005465988 nova_compute[236126]: 2025-10-02 12:52:34.912 2 DEBUG nova.scheduler.client.report [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:52:35 np0005465988 nova_compute[236126]: 2025-10-02 12:52:35.041 2 DEBUG oslo_concurrency.lockutils [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:52:35 np0005465988 nova_compute[236126]: 2025-10-02 12:52:35.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:35 np0005465988 nova_compute[236126]: 2025-10-02 12:52:35.073 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Updating instance_info_cache with network_info: [{"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:52:35 np0005465988 nova_compute[236126]: 2025-10-02 12:52:35.204 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:52:35 np0005465988 nova_compute[236126]: 2025-10-02 12:52:35.205 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:52:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:52:35 np0005465988 nova_compute[236126]: 2025-10-02 12:52:35.419 2 INFO nova.scheduler.client.report [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Deleted allocations for instance c1ab1282-e440-44ee-9a39-d59322467ce9#033[00m
Oct  2 08:52:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:35.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:36 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:52:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:36.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:37 np0005465988 nova_compute[236126]: 2025-10-02 12:52:37.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:37.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:37 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:52:37 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:52:37 np0005465988 nova_compute[236126]: 2025-10-02 12:52:37.747 2 DEBUG oslo_concurrency.lockutils [None req-d1622e3d-4bb5-490c-a2f7-ffbb015ee983 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "c1ab1282-e440-44ee-9a39-d59322467ce9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 37.322s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:52:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:38.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:39.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:39 np0005465988 podman[319606]: 2025-10-02 12:52:39.55302177 +0000 UTC m=+0.093806647 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  2 08:52:40 np0005465988 nova_compute[236126]: 2025-10-02 12:52:40.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:40.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:52:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:41.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:42.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:42 np0005465988 nova_compute[236126]: 2025-10-02 12:52:42.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:43 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:52:43 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:52:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:43.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:44.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:45 np0005465988 nova_compute[236126]: 2025-10-02 12:52:45.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:52:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:52:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:45.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:52:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:46.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:47 np0005465988 nova_compute[236126]: 2025-10-02 12:52:47.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:47.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:52:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:48.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:52:49 np0005465988 nova_compute[236126]: 2025-10-02 12:52:49.201 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:49.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:50 np0005465988 nova_compute[236126]: 2025-10-02 12:52:50.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:50.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:52:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:51.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:51 np0005465988 podman[319683]: 2025-10-02 12:52:51.537075647 +0000 UTC m=+0.065748427 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:52:51 np0005465988 podman[319682]: 2025-10-02 12:52:51.572448868 +0000 UTC m=+0.107642416 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:52:51 np0005465988 podman[319684]: 2025-10-02 12:52:51.576276778 +0000 UTC m=+0.102700603 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, org.label-schema.build-date=20251001)
Oct  2 08:52:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:52.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:52 np0005465988 nova_compute[236126]: 2025-10-02 12:52:52.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:53.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:54.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:55 np0005465988 nova_compute[236126]: 2025-10-02 12:52:55.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:52:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:55.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:52:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:56.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:52:57 np0005465988 nova_compute[236126]: 2025-10-02 12:52:57.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:57.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:52:58.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:52:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:52:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:52:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:52:59.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:00 np0005465988 nova_compute[236126]: 2025-10-02 12:53:00.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:00.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:53:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:01.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:02.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:02 np0005465988 nova_compute[236126]: 2025-10-02 12:53:02.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:53:03.348 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:53:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:53:03.349 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:53:03 np0005465988 nova_compute[236126]: 2025-10-02 12:53:03.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:03 np0005465988 nova_compute[236126]: 2025-10-02 12:53:03.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:03.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:04.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:05 np0005465988 nova_compute[236126]: 2025-10-02 12:53:05.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:53:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:05.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:06.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:07 np0005465988 nova_compute[236126]: 2025-10-02 12:53:07.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:07.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:08 np0005465988 nova_compute[236126]: 2025-10-02 12:53:08.072 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:08 np0005465988 nova_compute[236126]: 2025-10-02 12:53:08.072 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:08 np0005465988 nova_compute[236126]: 2025-10-02 12:53:08.073 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:08 np0005465988 nova_compute[236126]: 2025-10-02 12:53:08.073 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:53:08 np0005465988 nova_compute[236126]: 2025-10-02 12:53:08.073 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:53:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:08.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:53:08.352 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:53:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:53:08 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1299686118' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:53:08 np0005465988 nova_compute[236126]: 2025-10-02 12:53:08.549 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:53:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:09.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:10 np0005465988 nova_compute[236126]: 2025-10-02 12:53:10.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:10.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:53:10 np0005465988 podman[319827]: 2025-10-02 12:53:10.542099936 +0000 UTC m=+0.086059604 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:53:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:53:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:11.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:53:11 np0005465988 nova_compute[236126]: 2025-10-02 12:53:11.783 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:53:11 np0005465988 nova_compute[236126]: 2025-10-02 12:53:11.784 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:53:11 np0005465988 nova_compute[236126]: 2025-10-02 12:53:11.787 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:53:11 np0005465988 nova_compute[236126]: 2025-10-02 12:53:11.788 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:53:11 np0005465988 nova_compute[236126]: 2025-10-02 12:53:11.964 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:53:11 np0005465988 nova_compute[236126]: 2025-10-02 12:53:11.965 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3657MB free_disk=20.760520935058594GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:53:11 np0005465988 nova_compute[236126]: 2025-10-02 12:53:11.966 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:11 np0005465988 nova_compute[236126]: 2025-10-02 12:53:11.966 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:12.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:12 np0005465988 nova_compute[236126]: 2025-10-02 12:53:12.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:13.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:14.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:15 np0005465988 nova_compute[236126]: 2025-10-02 12:53:15.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:53:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:15.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:15 np0005465988 nova_compute[236126]: 2025-10-02 12:53:15.849 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 439392e5-66ae-4162-a7e5-077f87ca558b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:53:15 np0005465988 nova_compute[236126]: 2025-10-02 12:53:15.850 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:53:15 np0005465988 nova_compute[236126]: 2025-10-02 12:53:15.850 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:53:15 np0005465988 nova_compute[236126]: 2025-10-02 12:53:15.850 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:53:15 np0005465988 nova_compute[236126]: 2025-10-02 12:53:15.923 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:53:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:16.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:53:16 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1538834424' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:53:16 np0005465988 nova_compute[236126]: 2025-10-02 12:53:16.370 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:53:16 np0005465988 nova_compute[236126]: 2025-10-02 12:53:16.378 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:53:17 np0005465988 nova_compute[236126]: 2025-10-02 12:53:17.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:17.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:18.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:18 np0005465988 nova_compute[236126]: 2025-10-02 12:53:18.746 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:53:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:19.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:20 np0005465988 nova_compute[236126]: 2025-10-02 12:53:20.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:20.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:53:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:21.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:22.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:22 np0005465988 nova_compute[236126]: 2025-10-02 12:53:22.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:22 np0005465988 podman[319927]: 2025-10-02 12:53:22.552896195 +0000 UTC m=+0.069240019 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  2 08:53:22 np0005465988 podman[319926]: 2025-10-02 12:53:22.559266179 +0000 UTC m=+0.086227379 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller)
Oct  2 08:53:22 np0005465988 podman[319928]: 2025-10-02 12:53:22.565106907 +0000 UTC m=+0.075852139 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible)
Oct  2 08:53:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:53:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:23.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:53:23 np0005465988 nova_compute[236126]: 2025-10-02 12:53:23.639 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:53:23 np0005465988 nova_compute[236126]: 2025-10-02 12:53:23.639 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 11.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:24.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:25 np0005465988 nova_compute[236126]: 2025-10-02 12:53:25.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:53:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:25.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:26.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:53:27.400 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:53:27.401 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:53:27.401 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:27 np0005465988 nova_compute[236126]: 2025-10-02 12:53:27.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:27.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:27 np0005465988 nova_compute[236126]: 2025-10-02 12:53:27.639 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:27 np0005465988 nova_compute[236126]: 2025-10-02 12:53:27.640 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:27 np0005465988 nova_compute[236126]: 2025-10-02 12:53:27.640 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:53:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:28.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 08:53:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.0 total, 600.0 interval#012Cumulative writes: 13K writes, 67K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.13 GB, 0.03 MB/s#012Cumulative WAL: 13K writes, 13K syncs, 1.00 writes per sync, written: 0.13 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1644 writes, 8007 keys, 1644 commit groups, 1.0 writes per commit group, ingest: 16.07 MB, 0.03 MB/s#012Interval WAL: 1644 writes, 1644 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     54.7      1.47              0.28        42    0.035       0      0       0.0       0.0#012  L6      1/0   10.02 MB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   5.0    110.8     94.3      4.28              1.47        41    0.104    279K    22K       0.0       0.0#012 Sum      1/0   10.02 MB   0.0      0.5     0.1      0.4       0.5      0.1       0.0   6.0     82.5     84.2      5.76              1.75        83    0.069    279K    22K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   9.4     56.8     55.2      1.37              0.24        12    0.114     55K   3123       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) 
Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   0.0    110.8     94.3      4.28              1.47        41    0.104    279K    22K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     54.8      1.47              0.28        41    0.036       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4800.0 total, 600.0 interval#012Flush(GB): cumulative 0.079, interval 0.008#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.47 GB write, 0.10 MB/s write, 0.46 GB read, 0.10 MB/s read, 5.8 seconds#012Interval compaction: 0.07 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 1.4 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b3e97ad1f0#2 capacity: 304.00 MB usage: 51.81 MB table_size: 0 occupancy: 18446744073709551615 collections: 9 last_copies: 0 last_secs: 0.000422 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(2987,49.73 MB,16.3595%) FilterBlock(83,787.42 KB,0.25295%) IndexBlock(83,1.31 MB,0.42954%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct  2 08:53:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:29.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:30 np0005465988 nova_compute[236126]: 2025-10-02 12:53:30.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:30.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:53:31 np0005465988 ovn_controller[132601]: 2025-10-02T12:53:31Z|00824|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Oct  2 08:53:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:31.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:32.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:32 np0005465988 nova_compute[236126]: 2025-10-02 12:53:32.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:33 np0005465988 nova_compute[236126]: 2025-10-02 12:53:33.546 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:53:33 np0005465988 nova_compute[236126]: 2025-10-02 12:53:33.547 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:53:33 np0005465988 nova_compute[236126]: 2025-10-02 12:53:33.547 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:53:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:33.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:34.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:35 np0005465988 nova_compute[236126]: 2025-10-02 12:53:35.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:53:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:53:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:35.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:53:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:53:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:36.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:53:37 np0005465988 nova_compute[236126]: 2025-10-02 12:53:37.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:37.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:38.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:39.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:40 np0005465988 nova_compute[236126]: 2025-10-02 12:53:40.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:40.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:53:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:53:40.865 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:53:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:53:40.866 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:53:40 np0005465988 nova_compute[236126]: 2025-10-02 12:53:40.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:41 np0005465988 nova_compute[236126]: 2025-10-02 12:53:41.256 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Updating instance_info_cache with network_info: [{"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:53:41 np0005465988 nova_compute[236126]: 2025-10-02 12:53:41.285 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:53:41 np0005465988 nova_compute[236126]: 2025-10-02 12:53:41.286 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:53:41 np0005465988 nova_compute[236126]: 2025-10-02 12:53:41.286 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:41 np0005465988 nova_compute[236126]: 2025-10-02 12:53:41.286 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:41 np0005465988 nova_compute[236126]: 2025-10-02 12:53:41.287 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:41 np0005465988 nova_compute[236126]: 2025-10-02 12:53:41.287 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:41 np0005465988 nova_compute[236126]: 2025-10-02 12:53:41.287 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:41 np0005465988 nova_compute[236126]: 2025-10-02 12:53:41.287 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:41 np0005465988 nova_compute[236126]: 2025-10-02 12:53:41.287 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:41 np0005465988 nova_compute[236126]: 2025-10-02 12:53:41.288 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:53:41 np0005465988 podman[320049]: 2025-10-02 12:53:41.526964892 +0000 UTC m=+0.071022760 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:53:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:41.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:53:41.869 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:53:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:42.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:42 np0005465988 nova_compute[236126]: 2025-10-02 12:53:42.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:43.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:43 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:53:43 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:53:43 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:53:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:44.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:45 np0005465988 nova_compute[236126]: 2025-10-02 12:53:45.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:53:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:45.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:46.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:47 np0005465988 nova_compute[236126]: 2025-10-02 12:53:47.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:47.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:53:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:53:48Z|00825|binding|INFO|Releasing lport 1fc80788-89b8-413a-b0b0-d36f1a11a2b1 from this chassis (sb_readonly=0)
Oct  2 08:53:48 np0005465988 nova_compute[236126]: 2025-10-02 12:53:48.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:48.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:49.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:50 np0005465988 nova_compute[236126]: 2025-10-02 12:53:50.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:50.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:53:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:53:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:53:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:51.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:52.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:52 np0005465988 nova_compute[236126]: 2025-10-02 12:53:52.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:53 np0005465988 podman[320257]: 2025-10-02 12:53:53.519179154 +0000 UTC m=+0.051706843 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:53:53 np0005465988 podman[320258]: 2025-10-02 12:53:53.524385824 +0000 UTC m=+0.052615089 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:53:53 np0005465988 podman[320256]: 2025-10-02 12:53:53.543062403 +0000 UTC m=+0.079278568 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:53:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:53.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:54.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:55 np0005465988 nova_compute[236126]: 2025-10-02 12:53:55.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:53:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3629671726' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:53:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:53:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3629671726' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:53:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:53:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:55.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:56.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:57 np0005465988 nova_compute[236126]: 2025-10-02 12:53:57.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:53:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:57.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:53:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:53:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:53:58.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:53:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:53:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:53:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:53:59.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:00 np0005465988 nova_compute[236126]: 2025-10-02 12:54:00.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:00.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:54:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:01.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:02.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:02 np0005465988 nova_compute[236126]: 2025-10-02 12:54:02.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:03 np0005465988 nova_compute[236126]: 2025-10-02 12:54:03.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:03 np0005465988 nova_compute[236126]: 2025-10-02 12:54:03.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:03.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:03 np0005465988 nova_compute[236126]: 2025-10-02 12:54:03.741 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:03 np0005465988 nova_compute[236126]: 2025-10-02 12:54:03.744 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:03 np0005465988 nova_compute[236126]: 2025-10-02 12:54:03.744 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:03 np0005465988 nova_compute[236126]: 2025-10-02 12:54:03.745 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:54:03 np0005465988 nova_compute[236126]: 2025-10-02 12:54:03.746 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:54:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:54:04 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/816787724' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:54:04 np0005465988 nova_compute[236126]: 2025-10-02 12:54:04.183 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:54:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:04.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:04 np0005465988 nova_compute[236126]: 2025-10-02 12:54:04.931 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:54:04 np0005465988 nova_compute[236126]: 2025-10-02 12:54:04.932 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:54:04 np0005465988 nova_compute[236126]: 2025-10-02 12:54:04.939 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:54:04 np0005465988 nova_compute[236126]: 2025-10-02 12:54:04.939 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:54:05 np0005465988 nova_compute[236126]: 2025-10-02 12:54:05.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:05 np0005465988 nova_compute[236126]: 2025-10-02 12:54:05.160 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:54:05 np0005465988 nova_compute[236126]: 2025-10-02 12:54:05.162 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3666MB free_disk=20.739540100097656GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:54:05 np0005465988 nova_compute[236126]: 2025-10-02 12:54:05.163 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:05 np0005465988 nova_compute[236126]: 2025-10-02 12:54:05.164 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:54:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:05.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:06.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 08:54:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4801.0 total, 600.0 interval#012Cumulative writes: 66K writes, 268K keys, 66K commit groups, 1.0 writes per commit group, ingest: 0.27 GB, 0.06 MB/s#012Cumulative WAL: 66K writes, 24K syncs, 2.72 writes per sync, written: 0.27 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 8463 writes, 34K keys, 8463 commit groups, 1.0 writes per commit group, ingest: 40.77 MB, 0.07 MB/s#012Interval WAL: 8463 writes, 3047 syncs, 2.78 writes per sync, written: 0.04 GB, 0.07 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct  2 08:54:07 np0005465988 nova_compute[236126]: 2025-10-02 12:54:07.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:07.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:08.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:09.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:09 np0005465988 nova_compute[236126]: 2025-10-02 12:54:09.638 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 439392e5-66ae-4162-a7e5-077f87ca558b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:54:09 np0005465988 nova_compute[236126]: 2025-10-02 12:54:09.639 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:54:09 np0005465988 nova_compute[236126]: 2025-10-02 12:54:09.639 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:54:09 np0005465988 nova_compute[236126]: 2025-10-02 12:54:09.639 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:54:09 np0005465988 nova_compute[236126]: 2025-10-02 12:54:09.744 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:54:10 np0005465988 nova_compute[236126]: 2025-10-02 12:54:10.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:54:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3838244575' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:54:10 np0005465988 nova_compute[236126]: 2025-10-02 12:54:10.212 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:54:10 np0005465988 nova_compute[236126]: 2025-10-02 12:54:10.217 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:54:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:54:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:10.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:10 np0005465988 nova_compute[236126]: 2025-10-02 12:54:10.480 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:54:10 np0005465988 nova_compute[236126]: 2025-10-02 12:54:10.482 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:54:10 np0005465988 nova_compute[236126]: 2025-10-02 12:54:10.482 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 5.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:11.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:12.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:12 np0005465988 nova_compute[236126]: 2025-10-02 12:54:12.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:12 np0005465988 podman[320422]: 2025-10-02 12:54:12.52305316 +0000 UTC m=+0.062881295 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:54:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:13.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:14.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:14 np0005465988 nova_compute[236126]: 2025-10-02 12:54:14.482 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:15 np0005465988 nova_compute[236126]: 2025-10-02 12:54:15.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:54:15 np0005465988 nova_compute[236126]: 2025-10-02 12:54:15.436 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:15.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:16.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:17 np0005465988 nova_compute[236126]: 2025-10-02 12:54:17.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:17.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:18.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:18 np0005465988 nova_compute[236126]: 2025-10-02 12:54:18.746 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:19.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:19 np0005465988 nova_compute[236126]: 2025-10-02 12:54:19.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:20 np0005465988 nova_compute[236126]: 2025-10-02 12:54:20.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:54:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:20.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:20 np0005465988 nova_compute[236126]: 2025-10-02 12:54:20.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:20 np0005465988 nova_compute[236126]: 2025-10-02 12:54:20.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:54:21 np0005465988 nova_compute[236126]: 2025-10-02 12:54:21.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:21.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:22.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:22 np0005465988 nova_compute[236126]: 2025-10-02 12:54:22.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:22 np0005465988 nova_compute[236126]: 2025-10-02 12:54:22.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:23 np0005465988 nova_compute[236126]: 2025-10-02 12:54:23.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:23.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:24.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:24 np0005465988 nova_compute[236126]: 2025-10-02 12:54:24.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:24 np0005465988 nova_compute[236126]: 2025-10-02 12:54:24.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:24 np0005465988 nova_compute[236126]: 2025-10-02 12:54:24.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:54:24 np0005465988 nova_compute[236126]: 2025-10-02 12:54:24.500 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:54:24 np0005465988 podman[320500]: 2025-10-02 12:54:24.575938083 +0000 UTC m=+0.101291583 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:54:24 np0005465988 podman[320501]: 2025-10-02 12:54:24.579611429 +0000 UTC m=+0.098317897 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  2 08:54:24 np0005465988 podman[320499]: 2025-10-02 12:54:24.581935546 +0000 UTC m=+0.106611146 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:54:25 np0005465988 nova_compute[236126]: 2025-10-02 12:54:25.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:54:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:25.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:26.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:26 np0005465988 nova_compute[236126]: 2025-10-02 12:54:26.501 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:26 np0005465988 nova_compute[236126]: 2025-10-02 12:54:26.502 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:54:26 np0005465988 nova_compute[236126]: 2025-10-02 12:54:26.502 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:54:26 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #57. Immutable memtables: 13.
Oct  2 08:54:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:27.401 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:27.402 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:27.402 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:27 np0005465988 nova_compute[236126]: 2025-10-02 12:54:27.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:27.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:27 np0005465988 nova_compute[236126]: 2025-10-02 12:54:27.657 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:54:27 np0005465988 nova_compute[236126]: 2025-10-02 12:54:27.657 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:54:27 np0005465988 nova_compute[236126]: 2025-10-02 12:54:27.657 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:54:27 np0005465988 nova_compute[236126]: 2025-10-02 12:54:27.658 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:54:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:28.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:28 np0005465988 nova_compute[236126]: 2025-10-02 12:54:28.882 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "bb29dcd8-7156-4124-be08-2a85be9287f7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:28 np0005465988 nova_compute[236126]: 2025-10-02 12:54:28.882 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:28 np0005465988 nova_compute[236126]: 2025-10-02 12:54:28.909 2 DEBUG nova.compute.manager [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:54:29 np0005465988 nova_compute[236126]: 2025-10-02 12:54:29.161 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:29 np0005465988 nova_compute[236126]: 2025-10-02 12:54:29.161 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:29 np0005465988 nova_compute[236126]: 2025-10-02 12:54:29.178 2 DEBUG nova.virt.hardware [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:54:29 np0005465988 nova_compute[236126]: 2025-10-02 12:54:29.179 2 INFO nova.compute.claims [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:54:29 np0005465988 nova_compute[236126]: 2025-10-02 12:54:29.327 2 DEBUG oslo_concurrency.processutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:54:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:29.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:54:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2011315154' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:54:29 np0005465988 nova_compute[236126]: 2025-10-02 12:54:29.779 2 DEBUG oslo_concurrency.processutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:54:29 np0005465988 nova_compute[236126]: 2025-10-02 12:54:29.786 2 DEBUG nova.compute.provider_tree [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:54:29 np0005465988 nova_compute[236126]: 2025-10-02 12:54:29.847 2 DEBUG nova.scheduler.client.report [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:54:29 np0005465988 nova_compute[236126]: 2025-10-02 12:54:29.884 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.723s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:29 np0005465988 nova_compute[236126]: 2025-10-02 12:54:29.885 2 DEBUG nova.compute.manager [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:54:29 np0005465988 nova_compute[236126]: 2025-10-02 12:54:29.933 2 DEBUG nova.compute.manager [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:54:29 np0005465988 nova_compute[236126]: 2025-10-02 12:54:29.934 2 DEBUG nova.network.neutron [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:54:29 np0005465988 nova_compute[236126]: 2025-10-02 12:54:29.962 2 INFO nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.007 2 DEBUG nova.compute.manager [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.112 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Updating instance_info_cache with network_info: [{"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.116 2 DEBUG nova.compute.manager [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.117 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.117 2 INFO nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Creating image(s)#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.143 2 DEBUG nova.storage.rbd_utils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] rbd image bb29dcd8-7156-4124-be08-2a85be9287f7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.175 2 DEBUG nova.storage.rbd_utils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] rbd image bb29dcd8-7156-4124-be08-2a85be9287f7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.204 2 DEBUG nova.storage.rbd_utils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] rbd image bb29dcd8-7156-4124-be08-2a85be9287f7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.208 2 DEBUG oslo_concurrency.processutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.242 2 DEBUG nova.policy [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '16730f38111542e58a05fb4deb2b3914', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5ade962c517a483dbfe4bb13386f0006', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.247 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-439392e5-66ae-4162-a7e5-077f87ca558b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.247 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.278 2 DEBUG oslo_concurrency.processutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.278 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.279 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.279 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.303 2 DEBUG nova.storage.rbd_utils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] rbd image bb29dcd8-7156-4124-be08-2a85be9287f7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.307 2 DEBUG oslo_concurrency.processutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 bb29dcd8-7156-4124-be08-2a85be9287f7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:54:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:54:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:30.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.712 2 DEBUG oslo_concurrency.processutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 bb29dcd8-7156-4124-be08-2a85be9287f7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.802 2 DEBUG nova.storage.rbd_utils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] resizing rbd image bb29dcd8-7156-4124-be08-2a85be9287f7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:54:30 np0005465988 nova_compute[236126]: 2025-10-02 12:54:30.911 2 DEBUG nova.objects.instance [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lazy-loading 'migration_context' on Instance uuid bb29dcd8-7156-4124-be08-2a85be9287f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:54:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:31.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:31 np0005465988 nova_compute[236126]: 2025-10-02 12:54:31.840 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:54:31 np0005465988 nova_compute[236126]: 2025-10-02 12:54:31.841 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Ensure instance console log exists: /var/lib/nova/instances/bb29dcd8-7156-4124-be08-2a85be9287f7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:54:31 np0005465988 nova_compute[236126]: 2025-10-02 12:54:31.842 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:31 np0005465988 nova_compute[236126]: 2025-10-02 12:54:31.842 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:31 np0005465988 nova_compute[236126]: 2025-10-02 12:54:31.842 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:32 np0005465988 nova_compute[236126]: 2025-10-02 12:54:32.089 2 DEBUG nova.network.neutron [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Successfully created port: f25f9de7-55b4-47c3-8367-c6e83c489ca1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:54:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:32.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:32 np0005465988 nova_compute[236126]: 2025-10-02 12:54:32.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:32 np0005465988 nova_compute[236126]: 2025-10-02 12:54:32.845 2 DEBUG nova.network.neutron [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Successfully updated port: f25f9de7-55b4-47c3-8367-c6e83c489ca1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:54:32 np0005465988 nova_compute[236126]: 2025-10-02 12:54:32.895 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "refresh_cache-bb29dcd8-7156-4124-be08-2a85be9287f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:54:32 np0005465988 nova_compute[236126]: 2025-10-02 12:54:32.896 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquired lock "refresh_cache-bb29dcd8-7156-4124-be08-2a85be9287f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:54:32 np0005465988 nova_compute[236126]: 2025-10-02 12:54:32.896 2 DEBUG nova.network.neutron [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.046 2 DEBUG nova.compute.manager [req-49470870-f66b-45db-9dfa-a0664af390b2 req-fd969347-e779-4493-a8bb-dc72b65721a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Received event network-changed-f25f9de7-55b4-47c3-8367-c6e83c489ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.047 2 DEBUG nova.compute.manager [req-49470870-f66b-45db-9dfa-a0664af390b2 req-fd969347-e779-4493-a8bb-dc72b65721a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Refreshing instance network info cache due to event network-changed-f25f9de7-55b4-47c3-8367-c6e83c489ca1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.047 2 DEBUG oslo_concurrency.lockutils [req-49470870-f66b-45db-9dfa-a0664af390b2 req-fd969347-e779-4493-a8bb-dc72b65721a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-bb29dcd8-7156-4124-be08-2a85be9287f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.099 2 DEBUG nova.network.neutron [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:54:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:33.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.924 2 DEBUG nova.network.neutron [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Updating instance_info_cache with network_info: [{"id": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "address": "fa:16:3e:1e:9d:a7", "network": {"id": "5d708a73-9d9d-419e-a932-76b92db27fe0", "bridge": "br-int", "label": "tempest-network-smoke--582150062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25f9de7-55", "ovs_interfaceid": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.962 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Releasing lock "refresh_cache-bb29dcd8-7156-4124-be08-2a85be9287f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.963 2 DEBUG nova.compute.manager [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Instance network_info: |[{"id": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "address": "fa:16:3e:1e:9d:a7", "network": {"id": "5d708a73-9d9d-419e-a932-76b92db27fe0", "bridge": "br-int", "label": "tempest-network-smoke--582150062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25f9de7-55", "ovs_interfaceid": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.964 2 DEBUG oslo_concurrency.lockutils [req-49470870-f66b-45db-9dfa-a0664af390b2 req-fd969347-e779-4493-a8bb-dc72b65721a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-bb29dcd8-7156-4124-be08-2a85be9287f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.964 2 DEBUG nova.network.neutron [req-49470870-f66b-45db-9dfa-a0664af390b2 req-fd969347-e779-4493-a8bb-dc72b65721a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Refreshing network info cache for port f25f9de7-55b4-47c3-8367-c6e83c489ca1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.967 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Start _get_guest_xml network_info=[{"id": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "address": "fa:16:3e:1e:9d:a7", "network": {"id": "5d708a73-9d9d-419e-a932-76b92db27fe0", "bridge": "br-int", "label": "tempest-network-smoke--582150062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25f9de7-55", "ovs_interfaceid": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.974 2 WARNING nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.980 2 DEBUG nova.virt.libvirt.host [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.981 2 DEBUG nova.virt.libvirt.host [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.987 2 DEBUG nova.virt.libvirt.host [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.987 2 DEBUG nova.virt.libvirt.host [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.989 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.989 2 DEBUG nova.virt.hardware [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.989 2 DEBUG nova.virt.hardware [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.990 2 DEBUG nova.virt.hardware [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.990 2 DEBUG nova.virt.hardware [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.990 2 DEBUG nova.virt.hardware [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.990 2 DEBUG nova.virt.hardware [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.990 2 DEBUG nova.virt.hardware [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.991 2 DEBUG nova.virt.hardware [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.991 2 DEBUG nova.virt.hardware [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.991 2 DEBUG nova.virt.hardware [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.991 2 DEBUG nova.virt.hardware [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:54:33 np0005465988 nova_compute[236126]: 2025-10-02 12:54:33.994 2 DEBUG oslo_concurrency.processutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:54:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:54:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3383468913' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:54:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:34.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.443 2 DEBUG oslo_concurrency.processutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.477 2 DEBUG nova.storage.rbd_utils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] rbd image bb29dcd8-7156-4124-be08-2a85be9287f7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.483 2 DEBUG oslo_concurrency.processutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:54:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:54:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2340281099' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.924 2 DEBUG oslo_concurrency.processutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.926 2 DEBUG nova.virt.libvirt.vif [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:54:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1031871880-access_point-778253998',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1031871880-access_point-778253998',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1031871880-ac',id=184,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCNXoUpB5vnXEqEgNwYIzijuoYmVb3l9JNigmHJaX5u9xC3Mh8lXczeWl2u7dkH6lLuwxSlvCC37ZyW8ZGUIpj45HvvaKOGWejz8IKI5Q3A25a49idjq6IkqaQpM0SMq/w==',key_name='tempest-TestSecurityGroupsBasicOps-487542131',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5ade962c517a483dbfe4bb13386f0006',ramdisk_id='',reservation_id='r-bkn3fqep',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1031871880',owner_user_name='tempest-TestSecurityGroupsBasicOps-1031871880-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:54:30Z,user_data=None,user_id='16730f38111542e58a05fb4deb2b3914',uuid=bb29dcd8-7156-4124-be08-2a85be9287f7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "address": "fa:16:3e:1e:9d:a7", "network": {"id": "5d708a73-9d9d-419e-a932-76b92db27fe0", "bridge": "br-int", "label": "tempest-network-smoke--582150062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25f9de7-55", "ovs_interfaceid": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.926 2 DEBUG nova.network.os_vif_util [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Converting VIF {"id": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "address": "fa:16:3e:1e:9d:a7", "network": {"id": "5d708a73-9d9d-419e-a932-76b92db27fe0", "bridge": "br-int", "label": "tempest-network-smoke--582150062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25f9de7-55", "ovs_interfaceid": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.927 2 DEBUG nova.network.os_vif_util [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:9d:a7,bridge_name='br-int',has_traffic_filtering=True,id=f25f9de7-55b4-47c3-8367-c6e83c489ca1,network=Network(5d708a73-9d9d-419e-a932-76b92db27fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf25f9de7-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.928 2 DEBUG nova.objects.instance [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lazy-loading 'pci_devices' on Instance uuid bb29dcd8-7156-4124-be08-2a85be9287f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.958 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  <uuid>bb29dcd8-7156-4124-be08-2a85be9287f7</uuid>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  <name>instance-000000b8</name>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1031871880-access_point-778253998</nova:name>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:54:33</nova:creationTime>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <nova:user uuid="16730f38111542e58a05fb4deb2b3914">tempest-TestSecurityGroupsBasicOps-1031871880-project-member</nova:user>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <nova:project uuid="5ade962c517a483dbfe4bb13386f0006">tempest-TestSecurityGroupsBasicOps-1031871880</nova:project>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <nova:port uuid="f25f9de7-55b4-47c3-8367-c6e83c489ca1">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <entry name="serial">bb29dcd8-7156-4124-be08-2a85be9287f7</entry>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <entry name="uuid">bb29dcd8-7156-4124-be08-2a85be9287f7</entry>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/bb29dcd8-7156-4124-be08-2a85be9287f7_disk">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/bb29dcd8-7156-4124-be08-2a85be9287f7_disk.config">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:1e:9d:a7"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <target dev="tapf25f9de7-55"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/bb29dcd8-7156-4124-be08-2a85be9287f7/console.log" append="off"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:54:34 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:54:34 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:54:34 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:54:34 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.960 2 DEBUG nova.compute.manager [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Preparing to wait for external event network-vif-plugged-f25f9de7-55b4-47c3-8367-c6e83c489ca1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.960 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.961 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.961 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.962 2 DEBUG nova.virt.libvirt.vif [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:54:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1031871880-access_point-778253998',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1031871880-access_point-778253998',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1031871880-ac',id=184,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCNXoUpB5vnXEqEgNwYIzijuoYmVb3l9JNigmHJaX5u9xC3Mh8lXczeWl2u7dkH6lLuwxSlvCC37ZyW8ZGUIpj45HvvaKOGWejz8IKI5Q3A25a49idjq6IkqaQpM0SMq/w==',key_name='tempest-TestSecurityGroupsBasicOps-487542131',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5ade962c517a483dbfe4bb13386f0006',ramdisk_id='',reservation_id='r-bkn3fqep',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1031871880',owner_user_name='tempest-TestSecurityGroupsBasicOps-1031871880-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:54:30Z,user_data=None,user_id='16730f38111542e58a05fb4deb2b3914',uuid=bb29dcd8-7156-4124-be08-2a85be9287f7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "address": "fa:16:3e:1e:9d:a7", "network": {"id": "5d708a73-9d9d-419e-a932-76b92db27fe0", "bridge": "br-int", "label": "tempest-network-smoke--582150062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25f9de7-55", "ovs_interfaceid": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.962 2 DEBUG nova.network.os_vif_util [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Converting VIF {"id": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "address": "fa:16:3e:1e:9d:a7", "network": {"id": "5d708a73-9d9d-419e-a932-76b92db27fe0", "bridge": "br-int", "label": "tempest-network-smoke--582150062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25f9de7-55", "ovs_interfaceid": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.963 2 DEBUG nova.network.os_vif_util [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:9d:a7,bridge_name='br-int',has_traffic_filtering=True,id=f25f9de7-55b4-47c3-8367-c6e83c489ca1,network=Network(5d708a73-9d9d-419e-a932-76b92db27fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf25f9de7-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.963 2 DEBUG os_vif [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:9d:a7,bridge_name='br-int',has_traffic_filtering=True,id=f25f9de7-55b4-47c3-8367-c6e83c489ca1,network=Network(5d708a73-9d9d-419e-a932-76b92db27fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf25f9de7-55') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.965 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.965 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.971 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf25f9de7-55, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.971 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf25f9de7-55, col_values=(('external_ids', {'iface-id': 'f25f9de7-55b4-47c3-8367-c6e83c489ca1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:9d:a7', 'vm-uuid': 'bb29dcd8-7156-4124-be08-2a85be9287f7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:34 np0005465988 NetworkManager[45041]: <info>  [1759409674.9744] manager: (tapf25f9de7-55): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/362)
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:34 np0005465988 nova_compute[236126]: 2025-10-02 12:54:34.981 2 INFO os_vif [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:9d:a7,bridge_name='br-int',has_traffic_filtering=True,id=f25f9de7-55b4-47c3-8367-c6e83c489ca1,network=Network(5d708a73-9d9d-419e-a932-76b92db27fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf25f9de7-55')#033[00m
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.101 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.102 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.102 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] No VIF found with MAC fa:16:3e:1e:9d:a7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.103 2 INFO nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Using config drive#033[00m
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.133 2 DEBUG nova.storage.rbd_utils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] rbd image bb29dcd8-7156-4124-be08-2a85be9287f7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.153 2 DEBUG nova.network.neutron [req-49470870-f66b-45db-9dfa-a0664af390b2 req-fd969347-e779-4493-a8bb-dc72b65721a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Updated VIF entry in instance network info cache for port f25f9de7-55b4-47c3-8367-c6e83c489ca1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.154 2 DEBUG nova.network.neutron [req-49470870-f66b-45db-9dfa-a0664af390b2 req-fd969347-e779-4493-a8bb-dc72b65721a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Updating instance_info_cache with network_info: [{"id": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "address": "fa:16:3e:1e:9d:a7", "network": {"id": "5d708a73-9d9d-419e-a932-76b92db27fe0", "bridge": "br-int", "label": "tempest-network-smoke--582150062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25f9de7-55", "ovs_interfaceid": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.235 2 DEBUG oslo_concurrency.lockutils [req-49470870-f66b-45db-9dfa-a0664af390b2 req-fd969347-e779-4493-a8bb-dc72b65721a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-bb29dcd8-7156-4124-be08-2a85be9287f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:54:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.529 2 INFO nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Creating config drive at /var/lib/nova/instances/bb29dcd8-7156-4124-be08-2a85be9287f7/disk.config#033[00m
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.535 2 DEBUG oslo_concurrency.processutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bb29dcd8-7156-4124-be08-2a85be9287f7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2q1h9hf_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:54:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:35.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.671 2 DEBUG oslo_concurrency.processutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bb29dcd8-7156-4124-be08-2a85be9287f7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2q1h9hf_" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.709 2 DEBUG nova.storage.rbd_utils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] rbd image bb29dcd8-7156-4124-be08-2a85be9287f7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.714 2 DEBUG oslo_concurrency.processutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bb29dcd8-7156-4124-be08-2a85be9287f7/disk.config bb29dcd8-7156-4124-be08-2a85be9287f7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.990 2 DEBUG oslo_concurrency.processutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bb29dcd8-7156-4124-be08-2a85be9287f7/disk.config bb29dcd8-7156-4124-be08-2a85be9287f7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.276s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:54:35 np0005465988 nova_compute[236126]: 2025-10-02 12:54:35.991 2 INFO nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Deleting local config drive /var/lib/nova/instances/bb29dcd8-7156-4124-be08-2a85be9287f7/disk.config because it was imported into RBD.#033[00m
Oct  2 08:54:36 np0005465988 kernel: tapf25f9de7-55: entered promiscuous mode
Oct  2 08:54:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:54:36Z|00826|binding|INFO|Claiming lport f25f9de7-55b4-47c3-8367-c6e83c489ca1 for this chassis.
Oct  2 08:54:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:54:36Z|00827|binding|INFO|f25f9de7-55b4-47c3-8367-c6e83c489ca1: Claiming fa:16:3e:1e:9d:a7 10.100.0.13
Oct  2 08:54:36 np0005465988 NetworkManager[45041]: <info>  [1759409676.0643] manager: (tapf25f9de7-55): new Tun device (/org/freedesktop/NetworkManager/Devices/363)
Oct  2 08:54:36 np0005465988 nova_compute[236126]: 2025-10-02 12:54:36.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:36 np0005465988 ovn_controller[132601]: 2025-10-02T12:54:36Z|00828|binding|INFO|Setting lport f25f9de7-55b4-47c3-8367-c6e83c489ca1 ovn-installed in OVS
Oct  2 08:54:36 np0005465988 nova_compute[236126]: 2025-10-02 12:54:36.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:36 np0005465988 systemd-machined[192594]: New machine qemu-87-instance-000000b8.
Oct  2 08:54:36 np0005465988 systemd[1]: Started Virtual Machine qemu-87-instance-000000b8.
Oct  2 08:54:36 np0005465988 systemd-udevd[320893]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:54:36 np0005465988 NetworkManager[45041]: <info>  [1759409676.1312] device (tapf25f9de7-55): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:54:36 np0005465988 NetworkManager[45041]: <info>  [1759409676.1320] device (tapf25f9de7-55): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:54:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:36.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:36 np0005465988 nova_compute[236126]: 2025-10-02 12:54:36.972 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409676.970686, bb29dcd8-7156-4124-be08-2a85be9287f7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:54:36 np0005465988 nova_compute[236126]: 2025-10-02 12:54:36.973 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] VM Started (Lifecycle Event)#033[00m
Oct  2 08:54:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:37.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:38 np0005465988 nova_compute[236126]: 2025-10-02 12:54:38.295 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:54:38 np0005465988 nova_compute[236126]: 2025-10-02 12:54:38.302 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409676.9727726, bb29dcd8-7156-4124-be08-2a85be9287f7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:54:38 np0005465988 nova_compute[236126]: 2025-10-02 12:54:38.302 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.439 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:9d:a7 10.100.0.13'], port_security=['fa:16:3e:1e:9d:a7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'bb29dcd8-7156-4124-be08-2a85be9287f7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d708a73-9d9d-419e-a932-76b92db27fe0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ade962c517a483dbfe4bb13386f0006', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e8da7b2-0043-4cd4-a44e-7049b2d7d14e 8feb20e9-648a-477c-af80-8cb6c84d6497', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d0c1625-3ae7-4b72-a555-f31f6d4351fb, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f25f9de7-55b4-47c3-8367-c6e83c489ca1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:54:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:54:38Z|00829|binding|INFO|Setting lport f25f9de7-55b4-47c3-8367-c6e83c489ca1 up in Southbound
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.440 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f25f9de7-55b4-47c3-8367-c6e83c489ca1 in datapath 5d708a73-9d9d-419e-a932-76b92db27fe0 bound to our chassis#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.442 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5d708a73-9d9d-419e-a932-76b92db27fe0#033[00m
Oct  2 08:54:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:38.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.454 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8e5d546a-a243-4b11-9ad8-da9cc2382c90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.456 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5d708a73-91 in ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.458 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5d708a73-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.458 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d606c2-13de-49d1-ab4d-1bdc45107b18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.459 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[64b8a67a-633a-4c0a-9c6d-089e39882083]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.473 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[0736a51b-8254-4efa-af92-91be1f2ad637]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.498 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a8fed56d-b744-42ec-a18d-7bfb6da3e018]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.528 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f5ba0ebe-898a-41e7-bbec-1c23f0d791fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.534 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7d39b132-07c2-4d56-957e-6ac4b15a7bb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 NetworkManager[45041]: <info>  [1759409678.5355] manager: (tap5d708a73-90): new Veth device (/org/freedesktop/NetworkManager/Devices/364)
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.567 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[75212ea1-c95a-4e57-9f1b-6651a7977944]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.570 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[173a3732-556f-4f10-9b45-17542e7d5edd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 NetworkManager[45041]: <info>  [1759409678.5950] device (tap5d708a73-90): carrier: link connected
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.600 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c71c0004-2cda-46c4-9cef-b6681242af25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.620 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c593e2d6-4ce1-4943-af6a-0687495a6aa2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d708a73-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:db:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 242], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 773892, 'reachable_time': 33302, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 321019, 'error': None, 'target': 'ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.639 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9b1dabd3-3167-4694-bb48-170c21933a1b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1a:dbdb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 773892, 'tstamp': 773892}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 321020, 'error': None, 'target': 'ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.657 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[32d121d6-fb44-49b9-a985-4f8fc1550692]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d708a73-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:db:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 242], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 773892, 'reachable_time': 33302, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 321021, 'error': None, 'target': 'ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.693 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e49668dd-df11-4014-8117-a5f0383f8d06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.765 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fecf30aa-b763-4289-b9fe-e44bb2863644]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.767 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d708a73-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.767 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.768 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d708a73-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:54:38 np0005465988 NetworkManager[45041]: <info>  [1759409678.7709] manager: (tap5d708a73-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/365)
Oct  2 08:54:38 np0005465988 kernel: tap5d708a73-90: entered promiscuous mode
Oct  2 08:54:38 np0005465988 nova_compute[236126]: 2025-10-02 12:54:38.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.774 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5d708a73-90, col_values=(('external_ids', {'iface-id': 'fe52387d-636e-4544-9cfa-1db45f861a05'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:54:38 np0005465988 nova_compute[236126]: 2025-10-02 12:54:38.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:38 np0005465988 ovn_controller[132601]: 2025-10-02T12:54:38Z|00830|binding|INFO|Releasing lport fe52387d-636e-4544-9cfa-1db45f861a05 from this chassis (sb_readonly=1)
Oct  2 08:54:38 np0005465988 nova_compute[236126]: 2025-10-02 12:54:38.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.794 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5d708a73-9d9d-419e-a932-76b92db27fe0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5d708a73-9d9d-419e-a932-76b92db27fe0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.795 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3c951d9f-c633-4272-8d10-ff22c0f07cb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.796 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-5d708a73-9d9d-419e-a932-76b92db27fe0
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/5d708a73-9d9d-419e-a932-76b92db27fe0.pid.haproxy
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 5d708a73-9d9d-419e-a932-76b92db27fe0
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:54:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:38.798 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0', 'env', 'PROCESS_TAG=haproxy-5d708a73-9d9d-419e-a932-76b92db27fe0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5d708a73-9d9d-419e-a932-76b92db27fe0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:54:39 np0005465988 podman[321055]: 2025-10-02 12:54:39.23604287 +0000 UTC m=+0.069708802 container create 5f597a47608980ca07bb49df6d9171bedb215a7e44c2a1fddd82c5bba9283d7a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:54:39 np0005465988 podman[321055]: 2025-10-02 12:54:39.191951668 +0000 UTC m=+0.025617620 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:54:39 np0005465988 systemd[1]: Started libpod-conmon-5f597a47608980ca07bb49df6d9171bedb215a7e44c2a1fddd82c5bba9283d7a.scope.
Oct  2 08:54:39 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:54:39 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/471817c2b0b08d34116d2724cc88b4597c3964ff185d0d0d44967042bf0fdb93/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:54:39 np0005465988 podman[321055]: 2025-10-02 12:54:39.370736175 +0000 UTC m=+0.204402107 container init 5f597a47608980ca07bb49df6d9171bedb215a7e44c2a1fddd82c5bba9283d7a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:54:39 np0005465988 podman[321055]: 2025-10-02 12:54:39.378101507 +0000 UTC m=+0.211767419 container start 5f597a47608980ca07bb49df6d9171bedb215a7e44c2a1fddd82c5bba9283d7a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0)
Oct  2 08:54:39 np0005465988 neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0[321070]: [NOTICE]   (321074) : New worker (321076) forked
Oct  2 08:54:39 np0005465988 neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0[321070]: [NOTICE]   (321074) : Loading success.
Oct  2 08:54:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:39.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:39 np0005465988 nova_compute[236126]: 2025-10-02 12:54:39.795 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:54:39 np0005465988 nova_compute[236126]: 2025-10-02 12:54:39.801 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:54:39 np0005465988 nova_compute[236126]: 2025-10-02 12:54:39.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:40 np0005465988 nova_compute[236126]: 2025-10-02 12:54:40.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:40 np0005465988 nova_compute[236126]: 2025-10-02 12:54:40.215 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:54:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:54:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:54:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:40.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:54:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:40.974 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:54:40 np0005465988 nova_compute[236126]: 2025-10-02 12:54:40.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:40 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:40.975 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:54:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:41.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:54:41.977 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.216 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:42.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.841 2 DEBUG nova.compute.manager [req-9e1e88a6-da68-45ed-8120-ecb7e7a20436 req-d6f8df5a-cbb6-4199-bc05-6b39f97e7264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Received event network-vif-plugged-f25f9de7-55b4-47c3-8367-c6e83c489ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.841 2 DEBUG oslo_concurrency.lockutils [req-9e1e88a6-da68-45ed-8120-ecb7e7a20436 req-d6f8df5a-cbb6-4199-bc05-6b39f97e7264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.842 2 DEBUG oslo_concurrency.lockutils [req-9e1e88a6-da68-45ed-8120-ecb7e7a20436 req-d6f8df5a-cbb6-4199-bc05-6b39f97e7264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.842 2 DEBUG oslo_concurrency.lockutils [req-9e1e88a6-da68-45ed-8120-ecb7e7a20436 req-d6f8df5a-cbb6-4199-bc05-6b39f97e7264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.842 2 DEBUG nova.compute.manager [req-9e1e88a6-da68-45ed-8120-ecb7e7a20436 req-d6f8df5a-cbb6-4199-bc05-6b39f97e7264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Processing event network-vif-plugged-f25f9de7-55b4-47c3-8367-c6e83c489ca1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.842 2 DEBUG nova.compute.manager [req-9e1e88a6-da68-45ed-8120-ecb7e7a20436 req-d6f8df5a-cbb6-4199-bc05-6b39f97e7264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Received event network-vif-plugged-f25f9de7-55b4-47c3-8367-c6e83c489ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.842 2 DEBUG oslo_concurrency.lockutils [req-9e1e88a6-da68-45ed-8120-ecb7e7a20436 req-d6f8df5a-cbb6-4199-bc05-6b39f97e7264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.842 2 DEBUG oslo_concurrency.lockutils [req-9e1e88a6-da68-45ed-8120-ecb7e7a20436 req-d6f8df5a-cbb6-4199-bc05-6b39f97e7264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.843 2 DEBUG oslo_concurrency.lockutils [req-9e1e88a6-da68-45ed-8120-ecb7e7a20436 req-d6f8df5a-cbb6-4199-bc05-6b39f97e7264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.843 2 DEBUG nova.compute.manager [req-9e1e88a6-da68-45ed-8120-ecb7e7a20436 req-d6f8df5a-cbb6-4199-bc05-6b39f97e7264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] No waiting events found dispatching network-vif-plugged-f25f9de7-55b4-47c3-8367-c6e83c489ca1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.843 2 WARNING nova.compute.manager [req-9e1e88a6-da68-45ed-8120-ecb7e7a20436 req-d6f8df5a-cbb6-4199-bc05-6b39f97e7264 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Received unexpected event network-vif-plugged-f25f9de7-55b4-47c3-8367-c6e83c489ca1 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.844 2 DEBUG nova.compute.manager [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.850 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409682.8501318, bb29dcd8-7156-4124-be08-2a85be9287f7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.850 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.852 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.856 2 INFO nova.virt.libvirt.driver [-] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Instance spawned successfully.#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.856 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.887 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.894 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.900 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.901 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.901 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.902 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.902 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.903 2 DEBUG nova.virt.libvirt.driver [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:54:42 np0005465988 nova_compute[236126]: 2025-10-02 12:54:42.916 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:54:43 np0005465988 nova_compute[236126]: 2025-10-02 12:54:43.028 2 INFO nova.compute.manager [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Took 12.91 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:54:43 np0005465988 nova_compute[236126]: 2025-10-02 12:54:43.029 2 DEBUG nova.compute.manager [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:54:43 np0005465988 nova_compute[236126]: 2025-10-02 12:54:43.142 2 INFO nova.compute.manager [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Took 14.10 seconds to build instance.#033[00m
Oct  2 08:54:43 np0005465988 nova_compute[236126]: 2025-10-02 12:54:43.167 2 DEBUG oslo_concurrency.lockutils [None req-192c84e2-dc05-408b-ad4e-d5df2f730783 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:43 np0005465988 podman[321087]: 2025-10-02 12:54:43.528058136 +0000 UTC m=+0.060886048 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #136. Immutable memtables: 0.
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.635130) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 85] Flushing memtable with next log file: 136
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409683635167, "job": 85, "event": "flush_started", "num_memtables": 1, "num_entries": 2389, "num_deletes": 252, "total_data_size": 5807881, "memory_usage": 5902208, "flush_reason": "Manual Compaction"}
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 85] Level-0 flush table #137: started
Oct  2 08:54:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:43.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409683654109, "cf_name": "default", "job": 85, "event": "table_file_creation", "file_number": 137, "file_size": 3797588, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 66522, "largest_seqno": 68905, "table_properties": {"data_size": 3787963, "index_size": 6118, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19873, "raw_average_key_size": 20, "raw_value_size": 3768675, "raw_average_value_size": 3877, "num_data_blocks": 267, "num_entries": 972, "num_filter_entries": 972, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759409468, "oldest_key_time": 1759409468, "file_creation_time": 1759409683, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 137, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 85] Flush lasted 19056 microseconds, and 7507 cpu microseconds.
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.654180) [db/flush_job.cc:967] [default] [JOB 85] Level-0 flush table #137: 3797588 bytes OK
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.654202) [db/memtable_list.cc:519] [default] Level-0 commit table #137 started
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.655715) [db/memtable_list.cc:722] [default] Level-0 commit table #137: memtable #1 done
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.655731) EVENT_LOG_v1 {"time_micros": 1759409683655726, "job": 85, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.655747) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 85] Try to delete WAL files size 5797494, prev total WAL file size 5797494, number of live WAL files 2.
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000133.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.656920) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035373733' seq:72057594037927935, type:22 .. '7061786F730036303235' seq:0, type:0; will stop at (end)
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 86] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 85 Base level 0, inputs: [137(3708KB)], [135(10MB)]
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409683656987, "job": 86, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [137], "files_L6": [135], "score": -1, "input_data_size": 14300763, "oldest_snapshot_seqno": -1}
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 86] Generated table #138: 9118 keys, 12395472 bytes, temperature: kUnknown
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409683730084, "cf_name": "default", "job": 86, "event": "table_file_creation", "file_number": 138, "file_size": 12395472, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12335414, "index_size": 36137, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22853, "raw_key_size": 239438, "raw_average_key_size": 26, "raw_value_size": 12174124, "raw_average_value_size": 1335, "num_data_blocks": 1384, "num_entries": 9118, "num_filter_entries": 9118, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759409683, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 138, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.730626) [db/compaction/compaction_job.cc:1663] [default] [JOB 86] Compacted 1@0 + 1@6 files to L6 => 12395472 bytes
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.731852) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 195.4 rd, 169.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 10.0 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(7.0) write-amplify(3.3) OK, records in: 9641, records dropped: 523 output_compression: NoCompression
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.731875) EVENT_LOG_v1 {"time_micros": 1759409683731864, "job": 86, "event": "compaction_finished", "compaction_time_micros": 73183, "compaction_time_cpu_micros": 31705, "output_level": 6, "num_output_files": 1, "total_output_size": 12395472, "num_input_records": 9641, "num_output_records": 9118, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000137.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409683732975, "job": 86, "event": "table_file_deletion", "file_number": 137}
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000135.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409683735609, "job": 86, "event": "table_file_deletion", "file_number": 135}
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.656824) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.735665) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.735670) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.735672) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.735674) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:54:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:54:43.735676) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:54:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:54:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:44.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:54:44 np0005465988 nova_compute[236126]: 2025-10-02 12:54:44.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:45 np0005465988 nova_compute[236126]: 2025-10-02 12:54:45.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:54:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:45.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:46.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e383 e383: 3 total, 3 up, 3 in
Oct  2 08:54:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:47.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:48.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:54:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:49.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:54:49 np0005465988 nova_compute[236126]: 2025-10-02 12:54:49.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:50 np0005465988 nova_compute[236126]: 2025-10-02 12:54:50.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e383 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:54:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:50.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:50 np0005465988 nova_compute[236126]: 2025-10-02 12:54:50.808 2 DEBUG nova.compute.manager [req-f14aba3d-530a-4fed-b874-b89942128cec req-0b88c10a-5e7b-4904-8ca6-f586b8bd0cf8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Received event network-changed-f25f9de7-55b4-47c3-8367-c6e83c489ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:54:50 np0005465988 nova_compute[236126]: 2025-10-02 12:54:50.808 2 DEBUG nova.compute.manager [req-f14aba3d-530a-4fed-b874-b89942128cec req-0b88c10a-5e7b-4904-8ca6-f586b8bd0cf8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Refreshing instance network info cache due to event network-changed-f25f9de7-55b4-47c3-8367-c6e83c489ca1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:54:50 np0005465988 nova_compute[236126]: 2025-10-02 12:54:50.809 2 DEBUG oslo_concurrency.lockutils [req-f14aba3d-530a-4fed-b874-b89942128cec req-0b88c10a-5e7b-4904-8ca6-f586b8bd0cf8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-bb29dcd8-7156-4124-be08-2a85be9287f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:54:50 np0005465988 nova_compute[236126]: 2025-10-02 12:54:50.809 2 DEBUG oslo_concurrency.lockutils [req-f14aba3d-530a-4fed-b874-b89942128cec req-0b88c10a-5e7b-4904-8ca6-f586b8bd0cf8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-bb29dcd8-7156-4124-be08-2a85be9287f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:54:50 np0005465988 nova_compute[236126]: 2025-10-02 12:54:50.810 2 DEBUG nova.network.neutron [req-f14aba3d-530a-4fed-b874-b89942128cec req-0b88c10a-5e7b-4904-8ca6-f586b8bd0cf8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Refreshing network info cache for port f25f9de7-55b4-47c3-8367-c6e83c489ca1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:54:51 np0005465988 nova_compute[236126]: 2025-10-02 12:54:51.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:51.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:52 np0005465988 nova_compute[236126]: 2025-10-02 12:54:52.331 2 DEBUG nova.network.neutron [req-f14aba3d-530a-4fed-b874-b89942128cec req-0b88c10a-5e7b-4904-8ca6-f586b8bd0cf8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Updated VIF entry in instance network info cache for port f25f9de7-55b4-47c3-8367-c6e83c489ca1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:54:52 np0005465988 nova_compute[236126]: 2025-10-02 12:54:52.333 2 DEBUG nova.network.neutron [req-f14aba3d-530a-4fed-b874-b89942128cec req-0b88c10a-5e7b-4904-8ca6-f586b8bd0cf8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Updating instance_info_cache with network_info: [{"id": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "address": "fa:16:3e:1e:9d:a7", "network": {"id": "5d708a73-9d9d-419e-a932-76b92db27fe0", "bridge": "br-int", "label": "tempest-network-smoke--582150062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25f9de7-55", "ovs_interfaceid": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:54:52 np0005465988 nova_compute[236126]: 2025-10-02 12:54:52.369 2 DEBUG oslo_concurrency.lockutils [req-f14aba3d-530a-4fed-b874-b89942128cec req-0b88c10a-5e7b-4904-8ca6-f586b8bd0cf8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-bb29dcd8-7156-4124-be08-2a85be9287f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:54:52 np0005465988 podman[321284]: 2025-10-02 12:54:52.393534948 +0000 UTC m=+0.759513751 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct  2 08:54:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:52.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:52 np0005465988 podman[321284]: 2025-10-02 12:54:52.570069121 +0000 UTC m=+0.936047934 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct  2 08:54:53 np0005465988 nova_compute[236126]: 2025-10-02 12:54:53.493 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:53 np0005465988 nova_compute[236126]: 2025-10-02 12:54:53.493 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:54:53 np0005465988 podman[321420]: 2025-10-02 12:54:53.538026605 +0000 UTC m=+0.059557919 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 08:54:53 np0005465988 podman[321420]: 2025-10-02 12:54:53.551646318 +0000 UTC m=+0.073177602 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 08:54:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:53.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:53 np0005465988 podman[321485]: 2025-10-02 12:54:53.752020477 +0000 UTC m=+0.049078816 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, version=2.2.4, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, vcs-type=git, release=1793, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, io.openshift.expose-services=, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:54:53 np0005465988 podman[321485]: 2025-10-02 12:54:53.764976761 +0000 UTC m=+0.062035080 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, name=keepalived, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Oct  2 08:54:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e384 e384: 3 total, 3 up, 3 in
Oct  2 08:54:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:54.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:54 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:54:54 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:54:54 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:54:54 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:54:54 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:54:54 np0005465988 nova_compute[236126]: 2025-10-02 12:54:54.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:55 np0005465988 nova_compute[236126]: 2025-10-02 12:54:55.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e384 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:54:55 np0005465988 podman[321671]: 2025-10-02 12:54:55.537593868 +0000 UTC m=+0.065985494 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:54:55 np0005465988 podman[321670]: 2025-10-02 12:54:55.54526286 +0000 UTC m=+0.072218805 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:54:55 np0005465988 podman[321669]: 2025-10-02 12:54:55.571413344 +0000 UTC m=+0.101493459 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:54:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:55.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:54:56Z|00108|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1e:9d:a7 10.100.0.13
Oct  2 08:54:56 np0005465988 ovn_controller[132601]: 2025-10-02T12:54:56Z|00109|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:9d:a7 10.100.0.13
Oct  2 08:54:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:56.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:57 np0005465988 ovn_controller[132601]: 2025-10-02T12:54:57Z|00831|binding|INFO|Releasing lport 1fc80788-89b8-413a-b0b0-d36f1a11a2b1 from this chassis (sb_readonly=0)
Oct  2 08:54:57 np0005465988 ovn_controller[132601]: 2025-10-02T12:54:57Z|00832|binding|INFO|Releasing lport fe52387d-636e-4544-9cfa-1db45f861a05 from this chassis (sb_readonly=0)
Oct  2 08:54:57 np0005465988 nova_compute[236126]: 2025-10-02 12:54:57.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:54:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:57.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:54:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:54:58.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:54:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:54:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:54:59.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:54:59 np0005465988 nova_compute[236126]: 2025-10-02 12:54:59.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:59 np0005465988 nova_compute[236126]: 2025-10-02 12:54:59.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:00 np0005465988 nova_compute[236126]: 2025-10-02 12:55:00.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e384 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:55:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:00.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e385 e385: 3 total, 3 up, 3 in
Oct  2 08:55:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:55:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:55:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:01.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:02.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:03.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:04.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:04 np0005465988 nova_compute[236126]: 2025-10-02 12:55:04.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:05 np0005465988 nova_compute[236126]: 2025-10-02 12:55:05.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:55:05 np0005465988 nova_compute[236126]: 2025-10-02 12:55:05.517 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:05 np0005465988 nova_compute[236126]: 2025-10-02 12:55:05.551 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:05 np0005465988 nova_compute[236126]: 2025-10-02 12:55:05.552 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:05 np0005465988 nova_compute[236126]: 2025-10-02 12:55:05.552 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:05 np0005465988 nova_compute[236126]: 2025-10-02 12:55:05.553 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:55:05 np0005465988 nova_compute[236126]: 2025-10-02 12:55:05.553 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:55:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:05.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:55:06 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/860078894' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.049 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.140 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000b8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.140 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000b8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.144 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.144 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.148 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.148 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.337 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.338 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3453MB free_disk=20.80630111694336GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.338 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.339 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.435 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 439392e5-66ae-4162-a7e5-077f87ca558b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.436 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.436 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance bb29dcd8-7156-4124-be08-2a85be9287f7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.436 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.437 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:55:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:06.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.540 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:55:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e386 e386: 3 total, 3 up, 3 in
Oct  2 08:55:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:55:06 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/889387731' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:55:06 np0005465988 nova_compute[236126]: 2025-10-02 12:55:06.998 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.005 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.023 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.051 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.052 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.713s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.231 2 DEBUG oslo_concurrency.lockutils [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.232 2 DEBUG oslo_concurrency.lockutils [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.233 2 DEBUG oslo_concurrency.lockutils [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.233 2 DEBUG oslo_concurrency.lockutils [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.234 2 DEBUG oslo_concurrency.lockutils [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.236 2 INFO nova.compute.manager [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Terminating instance#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.237 2 DEBUG nova.compute.manager [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:55:07 np0005465988 kernel: tapeccb499c-96 (unregistering): left promiscuous mode
Oct  2 08:55:07 np0005465988 NetworkManager[45041]: <info>  [1759409707.3074] device (tapeccb499c-96): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:07Z|00833|binding|INFO|Releasing lport eccb499c-961f-4ee4-9995-578966625db6 from this chassis (sb_readonly=0)
Oct  2 08:55:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:07Z|00834|binding|INFO|Setting lport eccb499c-961f-4ee4-9995-578966625db6 down in Southbound
Oct  2 08:55:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:07Z|00835|binding|INFO|Removing iface tapeccb499c-96 ovn-installed in OVS
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.329 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:45:eb 10.100.0.6'], port_security=['fa:16:3e:39:45:eb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ac6724c1-4d98-45f7-8e2b-dfac55d9cb13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '8', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=eccb499c-961f-4ee4-9995-578966625db6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.331 142124 INFO neutron.agent.ovn.metadata.agent [-] Port eccb499c-961f-4ee4-9995-578966625db6 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 unbound from our chassis#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.332 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e4ff16-1388-40c7-a27a-83a3b4869808#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:07 np0005465988 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000af.scope: Deactivated successfully.
Oct  2 08:55:07 np0005465988 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000af.scope: Consumed 27.910s CPU time.
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.351 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[adbf74a9-a4ac-46d2-a6d1-4a0f6e9302ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 systemd-machined[192594]: Machine qemu-85-instance-000000af terminated.
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.384 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[91286c5e-a462-4dd1-944d-58e0b6073379]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.387 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2e915334-0fd6-478b-8ce7-229f7e49953c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.418 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[74c9d146-1625-464c-8776-e2582b43ee15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.436 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[31302c18-13b5-49cc-a6ca-a4b7c4aef4ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 20, 'rx_bytes': 1000, 'tx_bytes': 1032, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 20, 'rx_bytes': 1000, 'tx_bytes': 1032, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 740208, 'reachable_time': 23866, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 321897, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.452 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[af1a1f14-3a8e-4fd2-8f36-f4aee4da41d9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740222, 'tstamp': 740222}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 321898, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740225, 'tstamp': 740225}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 321898, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.454 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:07 np0005465988 kernel: tapeccb499c-96: entered promiscuous mode
Oct  2 08:55:07 np0005465988 NetworkManager[45041]: <info>  [1759409707.4576] manager: (tapeccb499c-96): new Tun device (/org/freedesktop/NetworkManager/Devices/366)
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:07 np0005465988 kernel: tapeccb499c-96 (unregistering): left promiscuous mode
Oct  2 08:55:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:07Z|00836|binding|INFO|Claiming lport eccb499c-961f-4ee4-9995-578966625db6 for this chassis.
Oct  2 08:55:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:07Z|00837|binding|INFO|eccb499c-961f-4ee4-9995-578966625db6: Claiming fa:16:3e:39:45:eb 10.100.0.6
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.467 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:45:eb 10.100.0.6'], port_security=['fa:16:3e:39:45:eb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ac6724c1-4d98-45f7-8e2b-dfac55d9cb13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '8', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=eccb499c-961f-4ee4-9995-578966625db6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:55:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:07Z|00838|binding|INFO|Setting lport eccb499c-961f-4ee4-9995-578966625db6 ovn-installed in OVS
Oct  2 08:55:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:07Z|00839|binding|INFO|Setting lport eccb499c-961f-4ee4-9995-578966625db6 up in Southbound
Oct  2 08:55:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:07Z|00840|binding|INFO|Releasing lport eccb499c-961f-4ee4-9995-578966625db6 from this chassis (sb_readonly=1)
Oct  2 08:55:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:07Z|00841|if_status|INFO|Dropped 2 log messages in last 366 seconds (most recently, 366 seconds ago) due to excessive rate
Oct  2 08:55:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:07Z|00842|if_status|INFO|Not setting lport eccb499c-961f-4ee4-9995-578966625db6 down as sb is readonly
Oct  2 08:55:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:07Z|00843|binding|INFO|Removing iface tapeccb499c-96 ovn-installed in OVS
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.483 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e4ff16-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.483 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.483 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e4ff16-10, col_values=(('external_ids', {'iface-id': '1fc80788-89b8-413a-b0b0-d36f1a11a2b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.484 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.485 142124 INFO neutron.agent.ovn.metadata.agent [-] Port eccb499c-961f-4ee4-9995-578966625db6 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 bound to our chassis#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.486 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e4ff16-1388-40c7-a27a-83a3b4869808#033[00m
Oct  2 08:55:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:07Z|00844|binding|INFO|Releasing lport eccb499c-961f-4ee4-9995-578966625db6 from this chassis (sb_readonly=0)
Oct  2 08:55:07 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:07Z|00845|binding|INFO|Setting lport eccb499c-961f-4ee4-9995-578966625db6 down in Southbound
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.506 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:45:eb 10.100.0.6'], port_security=['fa:16:3e:39:45:eb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ac6724c1-4d98-45f7-8e2b-dfac55d9cb13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '8', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=eccb499c-961f-4ee4-9995-578966625db6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.507 2 INFO nova.virt.libvirt.driver [-] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Instance destroyed successfully.#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.507 2 DEBUG nova.objects.instance [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'resources' on Instance uuid ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.509 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b662e2fb-7fea-48e7-8055-2be2f55d2755]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.527 2 DEBUG nova.virt.libvirt.vif [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:49:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-926303208',display_name='tempest-ServerStableDeviceRescueTest-server-926303208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-926303208',id=175,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:49:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6a442bc513e14406b73e96e70396e6c3',ramdisk_id='',reservation_id='r-s2ebpg4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-454391960',owner_user_name='tempest-ServerStableDeviceRescueTest-454391960-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:49:53Z,user_data=None,user_id='6785ffe5d6554514b4ed9fd47665eca0',uuid=ac6724c1-4d98-45f7-8e2b-dfac55d9cb13,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.528 2 DEBUG nova.network.os_vif_util [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converting VIF {"id": "eccb499c-961f-4ee4-9995-578966625db6", "address": "fa:16:3e:39:45:eb", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccb499c-96", "ovs_interfaceid": "eccb499c-961f-4ee4-9995-578966625db6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.528 2 DEBUG nova.network.os_vif_util [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:45:eb,bridge_name='br-int',has_traffic_filtering=True,id=eccb499c-961f-4ee4-9995-578966625db6,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeccb499c-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.529 2 DEBUG os_vif [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:45:eb,bridge_name='br-int',has_traffic_filtering=True,id=eccb499c-961f-4ee4-9995-578966625db6,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeccb499c-96') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.530 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeccb499c-96, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.535 2 INFO os_vif [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:45:eb,bridge_name='br-int',has_traffic_filtering=True,id=eccb499c-961f-4ee4-9995-578966625db6,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeccb499c-96')#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.535 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[7b37b517-680c-4d97-85c3-9fe07f3c784f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.537 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[6aa9fc69-4c4e-4209-8855-aa57f240c5e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.567 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[35d42ca6-f40d-4322-9b03-542b4cc92e28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.584 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4ff28c66-66a4-477d-a21c-5f11757c6c4d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 22, 'rx_bytes': 1000, 'tx_bytes': 1116, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 22, 'rx_bytes': 1000, 'tx_bytes': 1116, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 740208, 'reachable_time': 23866, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 321931, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.599 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[38ff0ff6-f820-4b87-9539-80ec4222281a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740222, 'tstamp': 740222}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 321934, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740225, 'tstamp': 740225}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 321934, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.600 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.603 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e4ff16-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.603 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.603 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e4ff16-10, col_values=(('external_ids', {'iface-id': '1fc80788-89b8-413a-b0b0-d36f1a11a2b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.604 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.604 142124 INFO neutron.agent.ovn.metadata.agent [-] Port eccb499c-961f-4ee4-9995-578966625db6 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 unbound from our chassis#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.606 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e4ff16-1388-40c7-a27a-83a3b4869808#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.619 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[151b13b9-6905-444a-a474-36b0901dcfef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.645 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[db6db59e-ade7-4f1f-8265-ee2c46a8be94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.651 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[97f31967-b770-4713-b183-89845138d353]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:07.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.682 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[37fa3893-13e0-461b-8e01-7d2c33765ea1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.704 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b4eeaa3f-1114-466c-aab7-9c9016ed5129]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e4ff16-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:53:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 24, 'rx_bytes': 1000, 'tx_bytes': 1200, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 24, 'rx_bytes': 1000, 'tx_bytes': 1200, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 740208, 'reachable_time': 23866, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 321940, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.720 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0402309a-9718-484e-96a5-1fadecd8e9e0]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740222, 'tstamp': 740222}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 321941, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap48e4ff16-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 740225, 'tstamp': 740225}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 321941, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.723 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:07 np0005465988 nova_compute[236126]: 2025-10-02 12:55:07.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.726 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e4ff16-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.726 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.727 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e4ff16-10, col_values=(('external_ids', {'iface-id': '1fc80788-89b8-413a-b0b0-d36f1a11a2b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:07.727 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:55:08 np0005465988 nova_compute[236126]: 2025-10-02 12:55:08.265 2 INFO nova.virt.libvirt.driver [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Deleting instance files /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_del#033[00m
Oct  2 08:55:08 np0005465988 nova_compute[236126]: 2025-10-02 12:55:08.266 2 INFO nova.virt.libvirt.driver [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Deletion of /var/lib/nova/instances/ac6724c1-4d98-45f7-8e2b-dfac55d9cb13_del complete#033[00m
Oct  2 08:55:08 np0005465988 nova_compute[236126]: 2025-10-02 12:55:08.362 2 INFO nova.compute.manager [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Took 1.12 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:55:08 np0005465988 nova_compute[236126]: 2025-10-02 12:55:08.364 2 DEBUG oslo.service.loopingcall [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:55:08 np0005465988 nova_compute[236126]: 2025-10-02 12:55:08.365 2 DEBUG nova.compute.manager [-] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:55:08 np0005465988 nova_compute[236126]: 2025-10-02 12:55:08.365 2 DEBUG nova.network.neutron [-] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:55:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:08.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e387 e387: 3 total, 3 up, 3 in
Oct  2 08:55:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:09.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:09 np0005465988 nova_compute[236126]: 2025-10-02 12:55:09.884 2 DEBUG nova.network.neutron [-] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:55:09 np0005465988 nova_compute[236126]: 2025-10-02 12:55:09.928 2 INFO nova.compute.manager [-] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Took 1.56 seconds to deallocate network for instance.#033[00m
Oct  2 08:55:10 np0005465988 nova_compute[236126]: 2025-10-02 12:55:10.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:10 np0005465988 nova_compute[236126]: 2025-10-02 12:55:10.211 2 DEBUG nova.compute.manager [req-622c49b5-c315-4f18-9ed0-e1bc78588366 req-9191735f-d692-478f-972a-043842704ef2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Received event network-vif-deleted-eccb499c-961f-4ee4-9995-578966625db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:55:10 np0005465988 nova_compute[236126]: 2025-10-02 12:55:10.255 2 DEBUG oslo_concurrency.lockutils [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:10 np0005465988 nova_compute[236126]: 2025-10-02 12:55:10.256 2 DEBUG oslo_concurrency.lockutils [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:10 np0005465988 nova_compute[236126]: 2025-10-02 12:55:10.350 2 DEBUG oslo_concurrency.processutils [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:55:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e387 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:55:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:10.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:55:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1890250216' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:55:10 np0005465988 nova_compute[236126]: 2025-10-02 12:55:10.827 2 DEBUG oslo_concurrency.processutils [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:55:10 np0005465988 nova_compute[236126]: 2025-10-02 12:55:10.834 2 DEBUG nova.compute.provider_tree [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:55:10 np0005465988 nova_compute[236126]: 2025-10-02 12:55:10.895 2 DEBUG nova.scheduler.client.report [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:55:11 np0005465988 nova_compute[236126]: 2025-10-02 12:55:11.008 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:11 np0005465988 nova_compute[236126]: 2025-10-02 12:55:11.156 2 DEBUG oslo_concurrency.lockutils [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.900s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:11 np0005465988 nova_compute[236126]: 2025-10-02 12:55:11.225 2 INFO nova.scheduler.client.report [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Deleted allocations for instance ac6724c1-4d98-45f7-8e2b-dfac55d9cb13#033[00m
Oct  2 08:55:11 np0005465988 nova_compute[236126]: 2025-10-02 12:55:11.408 2 DEBUG oslo_concurrency.lockutils [None req-e531574f-d177-450c-9417-a4507d2d2f58 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "ac6724c1-4d98-45f7-8e2b-dfac55d9cb13" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:11.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:12.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:12 np0005465988 nova_compute[236126]: 2025-10-02 12:55:12.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e388 e388: 3 total, 3 up, 3 in
Oct  2 08:55:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:55:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:13.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.381 2 DEBUG oslo_concurrency.lockutils [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.382 2 DEBUG oslo_concurrency.lockutils [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.382 2 DEBUG oslo_concurrency.lockutils [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.382 2 DEBUG oslo_concurrency.lockutils [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.383 2 DEBUG oslo_concurrency.lockutils [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.384 2 INFO nova.compute.manager [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Terminating instance#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.386 2 DEBUG nova.compute.manager [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:55:14 np0005465988 kernel: tap7020ab2d-94 (unregistering): left promiscuous mode
Oct  2 08:55:14 np0005465988 NetworkManager[45041]: <info>  [1759409714.4504] device (tap7020ab2d-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:14Z|00846|binding|INFO|Releasing lport 7020ab2d-943e-4985-b442-c6584c56c0d2 from this chassis (sb_readonly=0)
Oct  2 08:55:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:14Z|00847|binding|INFO|Setting lport 7020ab2d-943e-4985-b442-c6584c56c0d2 down in Southbound
Oct  2 08:55:14 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:14Z|00848|binding|INFO|Removing iface tap7020ab2d-94 ovn-installed in OVS
Oct  2 08:55:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:14.471 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:a9:95 10.100.0.9'], port_security=['fa:16:3e:5b:a9:95 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '439392e5-66ae-4162-a7e5-077f87ca558b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e4ff16-1388-40c7-a27a-83a3b4869808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a442bc513e14406b73e96e70396e6c3', 'neutron:revision_number': '8', 'neutron:security_group_ids': '55b83463-a692-41fe-aa59-8c6f6a3385f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd2acef0-eb35-44b4-ad52-c0266ea4784a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7020ab2d-943e-4985-b442-c6584c56c0d2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:55:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:14.472 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7020ab2d-943e-4985-b442-c6584c56c0d2 in datapath 48e4ff16-1388-40c7-a27a-83a3b4869808 unbound from our chassis#033[00m
Oct  2 08:55:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:14.474 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48e4ff16-1388-40c7-a27a-83a3b4869808, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:55:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:14.475 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[18faa477-296d-482e-a2d0-8c3acbea734a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:14.475 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 namespace which is not needed anymore#033[00m
Oct  2 08:55:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e389 e389: 3 total, 3 up, 3 in
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:14.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:14 np0005465988 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000ad.scope: Deactivated successfully.
Oct  2 08:55:14 np0005465988 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000ad.scope: Consumed 29.279s CPU time.
Oct  2 08:55:14 np0005465988 systemd-machined[192594]: Machine qemu-82-instance-000000ad terminated.
Oct  2 08:55:14 np0005465988 podman[321969]: 2025-10-02 12:55:14.536234344 +0000 UTC m=+0.071525524 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:55:14 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315718]: [NOTICE]   (315769) : haproxy version is 2.8.14-c23fe91
Oct  2 08:55:14 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315718]: [NOTICE]   (315769) : path to executable is /usr/sbin/haproxy
Oct  2 08:55:14 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315718]: [WARNING]  (315769) : Exiting Master process...
Oct  2 08:55:14 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315718]: [ALERT]    (315769) : Current worker (315775) exited with code 143 (Terminated)
Oct  2 08:55:14 np0005465988 neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808[315718]: [WARNING]  (315769) : All workers exited. Exiting... (0)
Oct  2 08:55:14 np0005465988 systemd[1]: libpod-bf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb.scope: Deactivated successfully.
Oct  2 08:55:14 np0005465988 conmon[315718]: conmon bf4656b4122ddd404065 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-bf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb.scope/container/memory.events
Oct  2 08:55:14 np0005465988 podman[322009]: 2025-10-02 12:55:14.634622972 +0000 UTC m=+0.061801794 container died bf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.634 2 INFO nova.virt.libvirt.driver [-] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Instance destroyed successfully.#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.636 2 DEBUG nova.objects.instance [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lazy-loading 'resources' on Instance uuid 439392e5-66ae-4162-a7e5-077f87ca558b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.658 2 DEBUG nova.virt.libvirt.vif [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:48:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-91664647',display_name='tempest-ServerStableDeviceRescueTest-server-91664647',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-91664647',id=173,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:48:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6a442bc513e14406b73e96e70396e6c3',ramdisk_id='',reservation_id='r-x1ok7izy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='vir
tio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-454391960',owner_user_name='tempest-ServerStableDeviceRescueTest-454391960-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:49:03Z,user_data=None,user_id='6785ffe5d6554514b4ed9fd47665eca0',uuid=439392e5-66ae-4162-a7e5-077f87ca558b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.659 2 DEBUG nova.network.os_vif_util [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converting VIF {"id": "7020ab2d-943e-4985-b442-c6584c56c0d2", "address": "fa:16:3e:5b:a9:95", "network": {"id": "48e4ff16-1388-40c7-a27a-83a3b4869808", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-271672558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a442bc513e14406b73e96e70396e6c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7020ab2d-94", "ovs_interfaceid": "7020ab2d-943e-4985-b442-c6584c56c0d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.660 2 DEBUG nova.network.os_vif_util [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:a9:95,bridge_name='br-int',has_traffic_filtering=True,id=7020ab2d-943e-4985-b442-c6584c56c0d2,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7020ab2d-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.660 2 DEBUG os_vif [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:a9:95,bridge_name='br-int',has_traffic_filtering=True,id=7020ab2d-943e-4985-b442-c6584c56c0d2,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7020ab2d-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.662 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7020ab2d-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:14 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb-userdata-shm.mount: Deactivated successfully.
Oct  2 08:55:14 np0005465988 systemd[1]: var-lib-containers-storage-overlay-0755f3ea1419a1dbb47ea63f096e47b6843023923230b44feb353f6e5fd40b4e-merged.mount: Deactivated successfully.
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.668 2 INFO os_vif [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:a9:95,bridge_name='br-int',has_traffic_filtering=True,id=7020ab2d-943e-4985-b442-c6584c56c0d2,network=Network(48e4ff16-1388-40c7-a27a-83a3b4869808),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7020ab2d-94')#033[00m
Oct  2 08:55:14 np0005465988 podman[322009]: 2025-10-02 12:55:14.693848491 +0000 UTC m=+0.121027313 container cleanup bf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:55:14 np0005465988 systemd[1]: libpod-conmon-bf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb.scope: Deactivated successfully.
Oct  2 08:55:14 np0005465988 podman[322069]: 2025-10-02 12:55:14.78844332 +0000 UTC m=+0.067738185 container remove bf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:55:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:14.796 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4a11141c-4651-4dec-808d-f85a11c2c940]: (4, ('Thu Oct  2 12:55:14 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 (bf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb)\nbf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb\nThu Oct  2 12:55:14 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 (bf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb)\nbf4656b4122ddd4040654dd6d4e435f9549fcf463df7f5f73cf0be47fa80d9fb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:14.799 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b3060e7b-22ad-4282-958a-1dd4b3970c36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:14.800 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e4ff16-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:14 np0005465988 kernel: tap48e4ff16-10: left promiscuous mode
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:14.818 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8ebe2bf9-5e5f-424d-bb18-4de0babe111e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:14.855 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[63b40486-e88f-4b5d-984a-f375ec45f2a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:14.856 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[63c0f4f7-25e1-4fb3-901b-9e6cc4cda74c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:14.875 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[756900b7-585d-4fe2-823d-315baa8c0cd7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 740199, 'reachable_time': 23328, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 322087, 'error': None, 'target': 'ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:14.877 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-48e4ff16-1388-40c7-a27a-83a3b4869808 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:55:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:14.878 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[c8513e59-7c83-47b6-89a2-04912a062ed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:14 np0005465988 systemd[1]: run-netns-ovnmeta\x2d48e4ff16\x2d1388\x2d40c7\x2da27a\x2d83a3b4869808.mount: Deactivated successfully.
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.978 2 DEBUG nova.compute.manager [req-a51a7eca-8915-4cf2-8dd8-ff7b4e9c14de req-6e4336b4-8c91-4b01-b28f-5e43a8256523 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-vif-unplugged-7020ab2d-943e-4985-b442-c6584c56c0d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.979 2 DEBUG oslo_concurrency.lockutils [req-a51a7eca-8915-4cf2-8dd8-ff7b4e9c14de req-6e4336b4-8c91-4b01-b28f-5e43a8256523 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.979 2 DEBUG oslo_concurrency.lockutils [req-a51a7eca-8915-4cf2-8dd8-ff7b4e9c14de req-6e4336b4-8c91-4b01-b28f-5e43a8256523 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.979 2 DEBUG oslo_concurrency.lockutils [req-a51a7eca-8915-4cf2-8dd8-ff7b4e9c14de req-6e4336b4-8c91-4b01-b28f-5e43a8256523 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.979 2 DEBUG nova.compute.manager [req-a51a7eca-8915-4cf2-8dd8-ff7b4e9c14de req-6e4336b4-8c91-4b01-b28f-5e43a8256523 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] No waiting events found dispatching network-vif-unplugged-7020ab2d-943e-4985-b442-c6584c56c0d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:55:14 np0005465988 nova_compute[236126]: 2025-10-02 12:55:14.980 2 DEBUG nova.compute.manager [req-a51a7eca-8915-4cf2-8dd8-ff7b4e9c14de req-6e4336b4-8c91-4b01-b28f-5e43a8256523 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-vif-unplugged-7020ab2d-943e-4985-b442-c6584c56c0d2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:55:15 np0005465988 nova_compute[236126]: 2025-10-02 12:55:15.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:15 np0005465988 nova_compute[236126]: 2025-10-02 12:55:15.158 2 INFO nova.virt.libvirt.driver [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Deleting instance files /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b_del#033[00m
Oct  2 08:55:15 np0005465988 nova_compute[236126]: 2025-10-02 12:55:15.158 2 INFO nova.virt.libvirt.driver [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Deletion of /var/lib/nova/instances/439392e5-66ae-4162-a7e5-077f87ca558b_del complete#033[00m
Oct  2 08:55:15 np0005465988 nova_compute[236126]: 2025-10-02 12:55:15.384 2 INFO nova.compute.manager [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Took 1.00 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:55:15 np0005465988 nova_compute[236126]: 2025-10-02 12:55:15.384 2 DEBUG oslo.service.loopingcall [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:55:15 np0005465988 nova_compute[236126]: 2025-10-02 12:55:15.385 2 DEBUG nova.compute.manager [-] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:55:15 np0005465988 nova_compute[236126]: 2025-10-02 12:55:15.385 2 DEBUG nova.network.neutron [-] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:55:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e389 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:55:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:15.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:16.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:17 np0005465988 nova_compute[236126]: 2025-10-02 12:55:17.144 2 DEBUG nova.compute.manager [req-1d2166f6-5e3b-4e2d-9aad-3115a07be48a req-045c5ab6-254a-4015-b562-574218056b77 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:55:17 np0005465988 nova_compute[236126]: 2025-10-02 12:55:17.145 2 DEBUG oslo_concurrency.lockutils [req-1d2166f6-5e3b-4e2d-9aad-3115a07be48a req-045c5ab6-254a-4015-b562-574218056b77 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:17 np0005465988 nova_compute[236126]: 2025-10-02 12:55:17.145 2 DEBUG oslo_concurrency.lockutils [req-1d2166f6-5e3b-4e2d-9aad-3115a07be48a req-045c5ab6-254a-4015-b562-574218056b77 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:17 np0005465988 nova_compute[236126]: 2025-10-02 12:55:17.145 2 DEBUG oslo_concurrency.lockutils [req-1d2166f6-5e3b-4e2d-9aad-3115a07be48a req-045c5ab6-254a-4015-b562-574218056b77 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:17 np0005465988 nova_compute[236126]: 2025-10-02 12:55:17.145 2 DEBUG nova.compute.manager [req-1d2166f6-5e3b-4e2d-9aad-3115a07be48a req-045c5ab6-254a-4015-b562-574218056b77 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] No waiting events found dispatching network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:55:17 np0005465988 nova_compute[236126]: 2025-10-02 12:55:17.145 2 WARNING nova.compute.manager [req-1d2166f6-5e3b-4e2d-9aad-3115a07be48a req-045c5ab6-254a-4015-b562-574218056b77 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received unexpected event network-vif-plugged-7020ab2d-943e-4985-b442-c6584c56c0d2 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:55:17 np0005465988 nova_compute[236126]: 2025-10-02 12:55:17.193 2 DEBUG nova.network.neutron [-] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:55:17 np0005465988 nova_compute[236126]: 2025-10-02 12:55:17.213 2 INFO nova.compute.manager [-] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Took 1.83 seconds to deallocate network for instance.#033[00m
Oct  2 08:55:17 np0005465988 nova_compute[236126]: 2025-10-02 12:55:17.262 2 DEBUG oslo_concurrency.lockutils [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:17 np0005465988 nova_compute[236126]: 2025-10-02 12:55:17.263 2 DEBUG oslo_concurrency.lockutils [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:17 np0005465988 nova_compute[236126]: 2025-10-02 12:55:17.398 2 DEBUG oslo_concurrency.processutils [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:55:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:17.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:55:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/627310823' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:55:17 np0005465988 nova_compute[236126]: 2025-10-02 12:55:17.900 2 DEBUG oslo_concurrency.processutils [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:55:17 np0005465988 nova_compute[236126]: 2025-10-02 12:55:17.908 2 DEBUG nova.compute.provider_tree [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:55:18 np0005465988 nova_compute[236126]: 2025-10-02 12:55:18.348 2 DEBUG nova.scheduler.client.report [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:55:18 np0005465988 nova_compute[236126]: 2025-10-02 12:55:18.457 2 DEBUG oslo_concurrency.lockutils [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:18.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:18 np0005465988 nova_compute[236126]: 2025-10-02 12:55:18.604 2 INFO nova.scheduler.client.report [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Deleted allocations for instance 439392e5-66ae-4162-a7e5-077f87ca558b#033[00m
Oct  2 08:55:19 np0005465988 nova_compute[236126]: 2025-10-02 12:55:19.403 2 DEBUG oslo_concurrency.lockutils [None req-38e441fe-ab1c-4153-ae66-07e4c49bae33 6785ffe5d6554514b4ed9fd47665eca0 6a442bc513e14406b73e96e70396e6c3 - - default default] Lock "439392e5-66ae-4162-a7e5-077f87ca558b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.022s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:19 np0005465988 nova_compute[236126]: 2025-10-02 12:55:19.411 2 DEBUG nova.compute.manager [req-a44f1525-f4d7-4f50-bfdb-7a08779330db req-8057fe8f-f133-4e8e-9fec-355390e453c4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Received event network-vif-deleted-7020ab2d-943e-4985-b442-c6584c56c0d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:55:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 e390: 3 total, 3 up, 3 in
Oct  2 08:55:19 np0005465988 nova_compute[236126]: 2025-10-02 12:55:19.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:19 np0005465988 nova_compute[236126]: 2025-10-02 12:55:19.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:19.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:20 np0005465988 nova_compute[236126]: 2025-10-02 12:55:20.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:55:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:20.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:21 np0005465988 nova_compute[236126]: 2025-10-02 12:55:21.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:21.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:22 np0005465988 nova_compute[236126]: 2025-10-02 12:55:22.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:22 np0005465988 nova_compute[236126]: 2025-10-02 12:55:22.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:55:22 np0005465988 nova_compute[236126]: 2025-10-02 12:55:22.505 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409707.5049062, ac6724c1-4d98-45f7-8e2b-dfac55d9cb13 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:55:22 np0005465988 nova_compute[236126]: 2025-10-02 12:55:22.505 2 INFO nova.compute.manager [-] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:55:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:22.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:22 np0005465988 nova_compute[236126]: 2025-10-02 12:55:22.951 2 DEBUG nova.compute.manager [None req-3f510e48-7f73-4bd9-82a5-e20afdd5840b - - - - - -] [instance: ac6724c1-4d98-45f7-8e2b-dfac55d9cb13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:55:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:23.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:24 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:24Z|00849|binding|INFO|Releasing lport fe52387d-636e-4544-9cfa-1db45f861a05 from this chassis (sb_readonly=0)
Oct  2 08:55:24 np0005465988 nova_compute[236126]: 2025-10-02 12:55:24.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:24 np0005465988 nova_compute[236126]: 2025-10-02 12:55:24.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:24.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:24 np0005465988 nova_compute[236126]: 2025-10-02 12:55:24.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:25 np0005465988 nova_compute[236126]: 2025-10-02 12:55:25.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:55:25 np0005465988 nova_compute[236126]: 2025-10-02 12:55:25.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:25 np0005465988 nova_compute[236126]: 2025-10-02 12:55:25.472 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:25.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:26.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:26 np0005465988 podman[322168]: 2025-10-02 12:55:26.529817746 +0000 UTC m=+0.067045866 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  2 08:55:26 np0005465988 podman[322169]: 2025-10-02 12:55:26.547718442 +0000 UTC m=+0.076299032 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:55:26 np0005465988 podman[322167]: 2025-10-02 12:55:26.558918605 +0000 UTC m=+0.096166315 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:55:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:27.403 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:27.403 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:27.404 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:27 np0005465988 nova_compute[236126]: 2025-10-02 12:55:27.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:27 np0005465988 nova_compute[236126]: 2025-10-02 12:55:27.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:55:27 np0005465988 nova_compute[236126]: 2025-10-02 12:55:27.578 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:55:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:27.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:28.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:29 np0005465988 nova_compute[236126]: 2025-10-02 12:55:29.632 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409714.6308882, 439392e5-66ae-4162-a7e5-077f87ca558b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:55:29 np0005465988 nova_compute[236126]: 2025-10-02 12:55:29.633 2 INFO nova.compute.manager [-] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:55:29 np0005465988 nova_compute[236126]: 2025-10-02 12:55:29.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:29.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:29 np0005465988 nova_compute[236126]: 2025-10-02 12:55:29.726 2 DEBUG nova.compute.manager [None req-0e1ca85a-2eda-41d5-b2de-18a375960fe6 - - - - - -] [instance: 439392e5-66ae-4162-a7e5-077f87ca558b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:55:30 np0005465988 nova_compute[236126]: 2025-10-02 12:55:30.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:55:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:30.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:31.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:32.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:33.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:34.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:34 np0005465988 nova_compute[236126]: 2025-10-02 12:55:34.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:35 np0005465988 nova_compute[236126]: 2025-10-02 12:55:35.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:55:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:35.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:36.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:37.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:38.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:39 np0005465988 nova_compute[236126]: 2025-10-02 12:55:39.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:39.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:40 np0005465988 nova_compute[236126]: 2025-10-02 12:55:40.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:55:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:40.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:41.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:42 np0005465988 nova_compute[236126]: 2025-10-02 12:55:42.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:42.335 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:55:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:42.337 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:55:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:42.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:43 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:43.340 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:55:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:43.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:55:44 np0005465988 nova_compute[236126]: 2025-10-02 12:55:44.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:44.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:44 np0005465988 nova_compute[236126]: 2025-10-02 12:55:44.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:45 np0005465988 nova_compute[236126]: 2025-10-02 12:55:45.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:55:45 np0005465988 podman[322292]: 2025-10-02 12:55:45.532292781 +0000 UTC m=+0.061996239 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:55:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:45.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:46.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:47.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 08:55:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:48.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 08:55:49 np0005465988 nova_compute[236126]: 2025-10-02 12:55:49.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:49.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:50 np0005465988 nova_compute[236126]: 2025-10-02 12:55:50.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:55:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:50.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:51.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:52.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:53 np0005465988 nova_compute[236126]: 2025-10-02 12:55:53.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:53.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:54.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:54 np0005465988 nova_compute[236126]: 2025-10-02 12:55:54.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:55 np0005465988 nova_compute[236126]: 2025-10-02 12:55:55.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:55:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:55.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:56.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:56 np0005465988 nova_compute[236126]: 2025-10-02 12:55:56.573 2 DEBUG nova.compute.manager [req-de0860a7-ecf6-49d9-b106-23516f7fca4e req-25196b31-1522-4913-8110-615611d5aebb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Received event network-changed-f25f9de7-55b4-47c3-8367-c6e83c489ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:55:56 np0005465988 nova_compute[236126]: 2025-10-02 12:55:56.574 2 DEBUG nova.compute.manager [req-de0860a7-ecf6-49d9-b106-23516f7fca4e req-25196b31-1522-4913-8110-615611d5aebb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Refreshing instance network info cache due to event network-changed-f25f9de7-55b4-47c3-8367-c6e83c489ca1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:55:56 np0005465988 nova_compute[236126]: 2025-10-02 12:55:56.574 2 DEBUG oslo_concurrency.lockutils [req-de0860a7-ecf6-49d9-b106-23516f7fca4e req-25196b31-1522-4913-8110-615611d5aebb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-bb29dcd8-7156-4124-be08-2a85be9287f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:55:56 np0005465988 nova_compute[236126]: 2025-10-02 12:55:56.574 2 DEBUG oslo_concurrency.lockutils [req-de0860a7-ecf6-49d9-b106-23516f7fca4e req-25196b31-1522-4913-8110-615611d5aebb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-bb29dcd8-7156-4124-be08-2a85be9287f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:55:56 np0005465988 nova_compute[236126]: 2025-10-02 12:55:56.575 2 DEBUG nova.network.neutron [req-de0860a7-ecf6-49d9-b106-23516f7fca4e req-25196b31-1522-4913-8110-615611d5aebb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Refreshing network info cache for port f25f9de7-55b4-47c3-8367-c6e83c489ca1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:55:57 np0005465988 nova_compute[236126]: 2025-10-02 12:55:57.119 2 DEBUG oslo_concurrency.lockutils [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "bb29dcd8-7156-4124-be08-2a85be9287f7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:57 np0005465988 nova_compute[236126]: 2025-10-02 12:55:57.120 2 DEBUG oslo_concurrency.lockutils [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:57 np0005465988 nova_compute[236126]: 2025-10-02 12:55:57.120 2 DEBUG oslo_concurrency.lockutils [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:57 np0005465988 nova_compute[236126]: 2025-10-02 12:55:57.120 2 DEBUG oslo_concurrency.lockutils [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:57 np0005465988 nova_compute[236126]: 2025-10-02 12:55:57.120 2 DEBUG oslo_concurrency.lockutils [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:57 np0005465988 nova_compute[236126]: 2025-10-02 12:55:57.121 2 INFO nova.compute.manager [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Terminating instance#033[00m
Oct  2 08:55:57 np0005465988 nova_compute[236126]: 2025-10-02 12:55:57.122 2 DEBUG nova.compute.manager [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:55:57 np0005465988 kernel: tapf25f9de7-55 (unregistering): left promiscuous mode
Oct  2 08:55:57 np0005465988 NetworkManager[45041]: <info>  [1759409757.5257] device (tapf25f9de7-55): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:55:57 np0005465988 podman[322320]: 2025-10-02 12:55:57.538396415 +0000 UTC m=+0.069289580 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:55:57 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:57Z|00850|binding|INFO|Releasing lport f25f9de7-55b4-47c3-8367-c6e83c489ca1 from this chassis (sb_readonly=0)
Oct  2 08:55:57 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:57Z|00851|binding|INFO|Setting lport f25f9de7-55b4-47c3-8367-c6e83c489ca1 down in Southbound
Oct  2 08:55:57 np0005465988 ovn_controller[132601]: 2025-10-02T12:55:57Z|00852|binding|INFO|Removing iface tapf25f9de7-55 ovn-installed in OVS
Oct  2 08:55:57 np0005465988 nova_compute[236126]: 2025-10-02 12:55:57.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:57 np0005465988 podman[322321]: 2025-10-02 12:55:57.564470377 +0000 UTC m=+0.089264436 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:55:57 np0005465988 nova_compute[236126]: 2025-10-02 12:55:57.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:57 np0005465988 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b8.scope: Deactivated successfully.
Oct  2 08:55:57 np0005465988 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b8.scope: Consumed 16.330s CPU time.
Oct  2 08:55:57 np0005465988 systemd-machined[192594]: Machine qemu-87-instance-000000b8 terminated.
Oct  2 08:55:57 np0005465988 podman[322319]: 2025-10-02 12:55:57.606084068 +0000 UTC m=+0.138734204 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:55:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:57.660 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:9d:a7 10.100.0.13'], port_security=['fa:16:3e:1e:9d:a7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'bb29dcd8-7156-4124-be08-2a85be9287f7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d708a73-9d9d-419e-a932-76b92db27fe0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ade962c517a483dbfe4bb13386f0006', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e8da7b2-0043-4cd4-a44e-7049b2d7d14e 8feb20e9-648a-477c-af80-8cb6c84d6497', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d0c1625-3ae7-4b72-a555-f31f6d4351fb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f25f9de7-55b4-47c3-8367-c6e83c489ca1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:55:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:57.661 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f25f9de7-55b4-47c3-8367-c6e83c489ca1 in datapath 5d708a73-9d9d-419e-a932-76b92db27fe0 unbound from our chassis#033[00m
Oct  2 08:55:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:57.663 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d708a73-9d9d-419e-a932-76b92db27fe0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:55:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:57.665 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b1dd4319-7513-4831-b58b-66a464f61a85]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:57.666 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0 namespace which is not needed anymore#033[00m
Oct  2 08:55:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:57.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:57 np0005465988 nova_compute[236126]: 2025-10-02 12:55:57.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:57 np0005465988 nova_compute[236126]: 2025-10-02 12:55:57.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:57 np0005465988 nova_compute[236126]: 2025-10-02 12:55:57.767 2 INFO nova.virt.libvirt.driver [-] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Instance destroyed successfully.#033[00m
Oct  2 08:55:57 np0005465988 nova_compute[236126]: 2025-10-02 12:55:57.768 2 DEBUG nova.objects.instance [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lazy-loading 'resources' on Instance uuid bb29dcd8-7156-4124-be08-2a85be9287f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:55:57 np0005465988 neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0[321070]: [NOTICE]   (321074) : haproxy version is 2.8.14-c23fe91
Oct  2 08:55:57 np0005465988 neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0[321070]: [NOTICE]   (321074) : path to executable is /usr/sbin/haproxy
Oct  2 08:55:57 np0005465988 neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0[321070]: [WARNING]  (321074) : Exiting Master process...
Oct  2 08:55:57 np0005465988 neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0[321070]: [ALERT]    (321074) : Current worker (321076) exited with code 143 (Terminated)
Oct  2 08:55:57 np0005465988 neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0[321070]: [WARNING]  (321074) : All workers exited. Exiting... (0)
Oct  2 08:55:57 np0005465988 systemd[1]: libpod-5f597a47608980ca07bb49df6d9171bedb215a7e44c2a1fddd82c5bba9283d7a.scope: Deactivated successfully.
Oct  2 08:55:57 np0005465988 podman[322407]: 2025-10-02 12:55:57.848914893 +0000 UTC m=+0.089003189 container died 5f597a47608980ca07bb49df6d9171bedb215a7e44c2a1fddd82c5bba9283d7a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:55:57 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5f597a47608980ca07bb49df6d9171bedb215a7e44c2a1fddd82c5bba9283d7a-userdata-shm.mount: Deactivated successfully.
Oct  2 08:55:57 np0005465988 systemd[1]: var-lib-containers-storage-overlay-471817c2b0b08d34116d2724cc88b4597c3964ff185d0d0d44967042bf0fdb93-merged.mount: Deactivated successfully.
Oct  2 08:55:57 np0005465988 podman[322407]: 2025-10-02 12:55:57.928441197 +0000 UTC m=+0.168529493 container cleanup 5f597a47608980ca07bb49df6d9171bedb215a7e44c2a1fddd82c5bba9283d7a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:55:57 np0005465988 systemd[1]: libpod-conmon-5f597a47608980ca07bb49df6d9171bedb215a7e44c2a1fddd82c5bba9283d7a.scope: Deactivated successfully.
Oct  2 08:55:58 np0005465988 podman[322443]: 2025-10-02 12:55:58.011848592 +0000 UTC m=+0.058994442 container remove 5f597a47608980ca07bb49df6d9171bedb215a7e44c2a1fddd82c5bba9283d7a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:55:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:58.022 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[02603dfb-c266-4faa-b128-822ba0e78db8]: (4, ('Thu Oct  2 12:55:57 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0 (5f597a47608980ca07bb49df6d9171bedb215a7e44c2a1fddd82c5bba9283d7a)\n5f597a47608980ca07bb49df6d9171bedb215a7e44c2a1fddd82c5bba9283d7a\nThu Oct  2 12:55:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0 (5f597a47608980ca07bb49df6d9171bedb215a7e44c2a1fddd82c5bba9283d7a)\n5f597a47608980ca07bb49df6d9171bedb215a7e44c2a1fddd82c5bba9283d7a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:58.025 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[74c23fc2-274a-45e6-85cc-1658047e7482]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:58.026 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d708a73-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:58 np0005465988 kernel: tap5d708a73-90: left promiscuous mode
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:58.064 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[81baa780-97b7-4234-b45f-f04478295aef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:58.093 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8a9887fa-401a-446d-84a7-a607b842d72a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:58.095 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[571d6404-dcd6-4f02-a56f-58a428dc53ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:58.112 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9e0f665a-5034-4567-928c-337952666d53]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 773885, 'reachable_time': 39993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 322462, 'error': None, 'target': 'ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:58 np0005465988 systemd[1]: run-netns-ovnmeta\x2d5d708a73\x2d9d9d\x2d419e\x2da932\x2d76b92db27fe0.mount: Deactivated successfully.
Oct  2 08:55:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:58.117 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5d708a73-9d9d-419e-a932-76b92db27fe0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:55:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:55:58.117 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[157c2cf5-093c-48a6-982b-8a2f9cbd2640]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.141 2 DEBUG nova.virt.libvirt.vif [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:54:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1031871880-access_point-778253998',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1031871880-access_point-778253998',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1031871880-ac',id=184,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCNXoUpB5vnXEqEgNwYIzijuoYmVb3l9JNigmHJaX5u9xC3Mh8lXczeWl2u7dkH6lLuwxSlvCC37ZyW8ZGUIpj45HvvaKOGWejz8IKI5Q3A25a49idjq6IkqaQpM0SMq/w==',key_name='tempest-TestSecurityGroupsBasicOps-487542131',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:54:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5ade962c517a483dbfe4bb13386f0006',ramdisk_id='',reservation_id='r-bkn3fqep',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1031871880',owner_user_name='tempest-TestSecurityGroupsBasicOps-1031871880-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:54:43Z,user_data=None,user_id='16730f38111542e58a05fb4deb2b3914',uuid=bb29dcd8-7156-4124-be08-2a85be9287f7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "address": "fa:16:3e:1e:9d:a7", "network": {"id": "5d708a73-9d9d-419e-a932-76b92db27fe0", "bridge": "br-int", "label": "tempest-network-smoke--582150062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25f9de7-55", "ovs_interfaceid": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.141 2 DEBUG nova.network.os_vif_util [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Converting VIF {"id": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "address": "fa:16:3e:1e:9d:a7", "network": {"id": "5d708a73-9d9d-419e-a932-76b92db27fe0", "bridge": "br-int", "label": "tempest-network-smoke--582150062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25f9de7-55", "ovs_interfaceid": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.142 2 DEBUG nova.network.os_vif_util [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:9d:a7,bridge_name='br-int',has_traffic_filtering=True,id=f25f9de7-55b4-47c3-8367-c6e83c489ca1,network=Network(5d708a73-9d9d-419e-a932-76b92db27fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf25f9de7-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.143 2 DEBUG os_vif [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:9d:a7,bridge_name='br-int',has_traffic_filtering=True,id=f25f9de7-55b4-47c3-8367-c6e83c489ca1,network=Network(5d708a73-9d9d-419e-a932-76b92db27fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf25f9de7-55') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.145 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf25f9de7-55, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.151 2 INFO os_vif [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:9d:a7,bridge_name='br-int',has_traffic_filtering=True,id=f25f9de7-55b4-47c3-8367-c6e83c489ca1,network=Network(5d708a73-9d9d-419e-a932-76b92db27fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf25f9de7-55')#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.303 2 DEBUG nova.network.neutron [req-de0860a7-ecf6-49d9-b106-23516f7fca4e req-25196b31-1522-4913-8110-615611d5aebb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Updated VIF entry in instance network info cache for port f25f9de7-55b4-47c3-8367-c6e83c489ca1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.304 2 DEBUG nova.network.neutron [req-de0860a7-ecf6-49d9-b106-23516f7fca4e req-25196b31-1522-4913-8110-615611d5aebb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Updating instance_info_cache with network_info: [{"id": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "address": "fa:16:3e:1e:9d:a7", "network": {"id": "5d708a73-9d9d-419e-a932-76b92db27fe0", "bridge": "br-int", "label": "tempest-network-smoke--582150062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ade962c517a483dbfe4bb13386f0006", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25f9de7-55", "ovs_interfaceid": "f25f9de7-55b4-47c3-8367-c6e83c489ca1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:55:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:55:58.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.690 2 DEBUG oslo_concurrency.lockutils [req-de0860a7-ecf6-49d9-b106-23516f7fca4e req-25196b31-1522-4913-8110-615611d5aebb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-bb29dcd8-7156-4124-be08-2a85be9287f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.695 2 DEBUG nova.compute.manager [req-f8609fec-e42a-483d-94ea-67376851400c req-98da8684-8986-4889-84e6-f7d8130f72fa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Received event network-vif-unplugged-f25f9de7-55b4-47c3-8367-c6e83c489ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.696 2 DEBUG oslo_concurrency.lockutils [req-f8609fec-e42a-483d-94ea-67376851400c req-98da8684-8986-4889-84e6-f7d8130f72fa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.696 2 DEBUG oslo_concurrency.lockutils [req-f8609fec-e42a-483d-94ea-67376851400c req-98da8684-8986-4889-84e6-f7d8130f72fa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.697 2 DEBUG oslo_concurrency.lockutils [req-f8609fec-e42a-483d-94ea-67376851400c req-98da8684-8986-4889-84e6-f7d8130f72fa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.697 2 DEBUG nova.compute.manager [req-f8609fec-e42a-483d-94ea-67376851400c req-98da8684-8986-4889-84e6-f7d8130f72fa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] No waiting events found dispatching network-vif-unplugged-f25f9de7-55b4-47c3-8367-c6e83c489ca1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:55:58 np0005465988 nova_compute[236126]: 2025-10-02 12:55:58.698 2 DEBUG nova.compute.manager [req-f8609fec-e42a-483d-94ea-67376851400c req-98da8684-8986-4889-84e6-f7d8130f72fa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Received event network-vif-unplugged-f25f9de7-55b4-47c3-8367-c6e83c489ca1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:55:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:55:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:55:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:55:59.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:00 np0005465988 nova_compute[236126]: 2025-10-02 12:56:00.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:56:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:00.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:00 np0005465988 nova_compute[236126]: 2025-10-02 12:56:00.691 2 INFO nova.virt.libvirt.driver [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Deleting instance files /var/lib/nova/instances/bb29dcd8-7156-4124-be08-2a85be9287f7_del#033[00m
Oct  2 08:56:00 np0005465988 nova_compute[236126]: 2025-10-02 12:56:00.693 2 INFO nova.virt.libvirt.driver [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Deletion of /var/lib/nova/instances/bb29dcd8-7156-4124-be08-2a85be9287f7_del complete#033[00m
Oct  2 08:56:01 np0005465988 nova_compute[236126]: 2025-10-02 12:56:01.046 2 DEBUG nova.compute.manager [req-5dbf0f6c-4fd9-4d90-8249-00d40bf6e932 req-5ad7bd96-6ea0-4bc4-8a62-594e3ea5a5d9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Received event network-vif-plugged-f25f9de7-55b4-47c3-8367-c6e83c489ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:56:01 np0005465988 nova_compute[236126]: 2025-10-02 12:56:01.047 2 DEBUG oslo_concurrency.lockutils [req-5dbf0f6c-4fd9-4d90-8249-00d40bf6e932 req-5ad7bd96-6ea0-4bc4-8a62-594e3ea5a5d9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:56:01 np0005465988 nova_compute[236126]: 2025-10-02 12:56:01.047 2 DEBUG oslo_concurrency.lockutils [req-5dbf0f6c-4fd9-4d90-8249-00d40bf6e932 req-5ad7bd96-6ea0-4bc4-8a62-594e3ea5a5d9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:56:01 np0005465988 nova_compute[236126]: 2025-10-02 12:56:01.048 2 DEBUG oslo_concurrency.lockutils [req-5dbf0f6c-4fd9-4d90-8249-00d40bf6e932 req-5ad7bd96-6ea0-4bc4-8a62-594e3ea5a5d9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:56:01 np0005465988 nova_compute[236126]: 2025-10-02 12:56:01.048 2 DEBUG nova.compute.manager [req-5dbf0f6c-4fd9-4d90-8249-00d40bf6e932 req-5ad7bd96-6ea0-4bc4-8a62-594e3ea5a5d9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] No waiting events found dispatching network-vif-plugged-f25f9de7-55b4-47c3-8367-c6e83c489ca1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:56:01 np0005465988 nova_compute[236126]: 2025-10-02 12:56:01.048 2 WARNING nova.compute.manager [req-5dbf0f6c-4fd9-4d90-8249-00d40bf6e932 req-5ad7bd96-6ea0-4bc4-8a62-594e3ea5a5d9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Received unexpected event network-vif-plugged-f25f9de7-55b4-47c3-8367-c6e83c489ca1 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:56:01 np0005465988 nova_compute[236126]: 2025-10-02 12:56:01.078 2 INFO nova.compute.manager [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Took 3.96 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:56:01 np0005465988 nova_compute[236126]: 2025-10-02 12:56:01.079 2 DEBUG oslo.service.loopingcall [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:56:01 np0005465988 nova_compute[236126]: 2025-10-02 12:56:01.080 2 DEBUG nova.compute.manager [-] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:56:01 np0005465988 nova_compute[236126]: 2025-10-02 12:56:01.080 2 DEBUG nova.network.neutron [-] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:56:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:01.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:56:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:56:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:56:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:02.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:02 np0005465988 nova_compute[236126]: 2025-10-02 12:56:02.826 2 DEBUG nova.network.neutron [-] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:56:02 np0005465988 nova_compute[236126]: 2025-10-02 12:56:02.858 2 DEBUG nova.compute.manager [req-f95ef628-b9bb-4b14-905b-6cae291f9f4a req-c2e87328-e4b4-4ecd-a350-3089b0f1b746 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Received event network-vif-deleted-f25f9de7-55b4-47c3-8367-c6e83c489ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:56:02 np0005465988 nova_compute[236126]: 2025-10-02 12:56:02.858 2 INFO nova.compute.manager [req-f95ef628-b9bb-4b14-905b-6cae291f9f4a req-c2e87328-e4b4-4ecd-a350-3089b0f1b746 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Neutron deleted interface f25f9de7-55b4-47c3-8367-c6e83c489ca1; detaching it from the instance and deleting it from the info cache
Oct  2 08:56:02 np0005465988 nova_compute[236126]: 2025-10-02 12:56:02.859 2 DEBUG nova.network.neutron [req-f95ef628-b9bb-4b14-905b-6cae291f9f4a req-c2e87328-e4b4-4ecd-a350-3089b0f1b746 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:56:03 np0005465988 nova_compute[236126]: 2025-10-02 12:56:03.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:03 np0005465988 nova_compute[236126]: 2025-10-02 12:56:03.161 2 DEBUG nova.compute.manager [req-f95ef628-b9bb-4b14-905b-6cae291f9f4a req-c2e87328-e4b4-4ecd-a350-3089b0f1b746 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Detach interface failed, port_id=f25f9de7-55b4-47c3-8367-c6e83c489ca1, reason: Instance bb29dcd8-7156-4124-be08-2a85be9287f7 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Oct  2 08:56:03 np0005465988 nova_compute[236126]: 2025-10-02 12:56:03.162 2 INFO nova.compute.manager [-] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Took 2.08 seconds to deallocate network for instance.
Oct  2 08:56:03 np0005465988 nova_compute[236126]: 2025-10-02 12:56:03.565 2 DEBUG oslo_concurrency.lockutils [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:56:03 np0005465988 nova_compute[236126]: 2025-10-02 12:56:03.566 2 DEBUG oslo_concurrency.lockutils [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:56:03 np0005465988 nova_compute[236126]: 2025-10-02 12:56:03.709 2 DEBUG oslo_concurrency.processutils [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:56:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:56:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:03.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:56:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:56:04 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/495128654' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:56:04 np0005465988 nova_compute[236126]: 2025-10-02 12:56:04.197 2 DEBUG oslo_concurrency.processutils [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:56:04 np0005465988 nova_compute[236126]: 2025-10-02 12:56:04.204 2 DEBUG nova.compute.provider_tree [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:56:04 np0005465988 nova_compute[236126]: 2025-10-02 12:56:04.336 2 DEBUG nova.scheduler.client.report [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:56:04 np0005465988 nova_compute[236126]: 2025-10-02 12:56:04.493 2 DEBUG oslo_concurrency.lockutils [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.927s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:56:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:04.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:04 np0005465988 nova_compute[236126]: 2025-10-02 12:56:04.990 2 INFO nova.scheduler.client.report [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Deleted allocations for instance bb29dcd8-7156-4124-be08-2a85be9287f7
Oct  2 08:56:05 np0005465988 nova_compute[236126]: 2025-10-02 12:56:05.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:56:05 np0005465988 nova_compute[236126]: 2025-10-02 12:56:05.725 2 DEBUG oslo_concurrency.lockutils [None req-8a04fd5e-1751-41b3-86f6-32ea9bcf425b 16730f38111542e58a05fb4deb2b3914 5ade962c517a483dbfe4bb13386f0006 - - default default] Lock "bb29dcd8-7156-4124-be08-2a85be9287f7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.605s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:56:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:05.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:06.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:07 np0005465988 nova_compute[236126]: 2025-10-02 12:56:07.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:56:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:56:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:07.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:56:07 np0005465988 nova_compute[236126]: 2025-10-02 12:56:07.865 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:56:07 np0005465988 nova_compute[236126]: 2025-10-02 12:56:07.866 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:56:07 np0005465988 nova_compute[236126]: 2025-10-02 12:56:07.866 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:56:07 np0005465988 nova_compute[236126]: 2025-10-02 12:56:07.867 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  2 08:56:07 np0005465988 nova_compute[236126]: 2025-10-02 12:56:07.867 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:56:08 np0005465988 nova_compute[236126]: 2025-10-02 12:56:08.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:56:08 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1707496210' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:56:08 np0005465988 nova_compute[236126]: 2025-10-02 12:56:08.336 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:56:08 np0005465988 nova_compute[236126]: 2025-10-02 12:56:08.532 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:56:08 np0005465988 nova_compute[236126]: 2025-10-02 12:56:08.534 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4057MB free_disk=20.941638946533203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  2 08:56:08 np0005465988 nova_compute[236126]: 2025-10-02 12:56:08.534 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:56:08 np0005465988 nova_compute[236126]: 2025-10-02 12:56:08.534 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:56:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:08.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:09 np0005465988 nova_compute[236126]: 2025-10-02 12:56:09.174 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 08:56:09 np0005465988 nova_compute[236126]: 2025-10-02 12:56:09.175 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:56:09 np0005465988 nova_compute[236126]: 2025-10-02 12:56:09.218 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct  2 08:56:09 np0005465988 nova_compute[236126]: 2025-10-02 12:56:09.273 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct  2 08:56:09 np0005465988 nova_compute[236126]: 2025-10-02 12:56:09.274 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct  2 08:56:09 np0005465988 nova_compute[236126]: 2025-10-02 12:56:09.303 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct  2 08:56:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:56:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:56:09 np0005465988 nova_compute[236126]: 2025-10-02 12:56:09.348 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct  2 08:56:09 np0005465988 nova_compute[236126]: 2025-10-02 12:56:09.368 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:56:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:56:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:09.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:56:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:56:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/596234105' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:56:09 np0005465988 nova_compute[236126]: 2025-10-02 12:56:09.840 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:56:09 np0005465988 nova_compute[236126]: 2025-10-02 12:56:09.847 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:56:10 np0005465988 nova_compute[236126]: 2025-10-02 12:56:10.017 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:56:10 np0005465988 nova_compute[236126]: 2025-10-02 12:56:10.065 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:56:10 np0005465988 nova_compute[236126]: 2025-10-02 12:56:10.066 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.532s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:56:10 np0005465988 nova_compute[236126]: 2025-10-02 12:56:10.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:56:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:56:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:10.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:56:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:11.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:12 np0005465988 nova_compute[236126]: 2025-10-02 12:56:12.067 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:56:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:12.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:12 np0005465988 nova_compute[236126]: 2025-10-02 12:56:12.766 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409757.7648578, bb29dcd8-7156-4124-be08-2a85be9287f7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:56:12 np0005465988 nova_compute[236126]: 2025-10-02 12:56:12.767 2 INFO nova.compute.manager [-] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] VM Stopped (Lifecycle Event)
Oct  2 08:56:12 np0005465988 nova_compute[236126]: 2025-10-02 12:56:12.870 2 DEBUG nova.compute.manager [None req-24a9d589-85f5-40ab-9d2d-f5389bf552c1 - - - - - -] [instance: bb29dcd8-7156-4124-be08-2a85be9287f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:56:13 np0005465988 nova_compute[236126]: 2025-10-02 12:56:13.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:13.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:56:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:14.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:56:15 np0005465988 nova_compute[236126]: 2025-10-02 12:56:15.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:56:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:56:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:15.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:56:16 np0005465988 podman[322789]: 2025-10-02 12:56:16.554801042 +0000 UTC m=+0.079988849 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:56:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:16.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:17.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:18 np0005465988 nova_compute[236126]: 2025-10-02 12:56:18.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:18.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:56:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:19.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:56:20 np0005465988 nova_compute[236126]: 2025-10-02 12:56:20.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:56:20 np0005465988 nova_compute[236126]: 2025-10-02 12:56:20.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:56:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:20.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:56:21 np0005465988 nova_compute[236126]: 2025-10-02 12:56:21.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:21 np0005465988 nova_compute[236126]: 2025-10-02 12:56:21.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:56:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:21.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:56:22 np0005465988 nova_compute[236126]: 2025-10-02 12:56:22.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:22 np0005465988 nova_compute[236126]: 2025-10-02 12:56:22.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:56:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:56:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:22.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:56:23 np0005465988 nova_compute[236126]: 2025-10-02 12:56:23.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:23 np0005465988 nova_compute[236126]: 2025-10-02 12:56:23.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:23.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:24 np0005465988 nova_compute[236126]: 2025-10-02 12:56:24.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:56:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:24.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:56:25 np0005465988 nova_compute[236126]: 2025-10-02 12:56:25.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:56:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:25.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:26 np0005465988 nova_compute[236126]: 2025-10-02 12:56:26.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:26.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:27.403 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:56:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:27.404 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:56:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:27.404 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:56:27 np0005465988 nova_compute[236126]: 2025-10-02 12:56:27.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:56:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:27.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:56:28 np0005465988 nova_compute[236126]: 2025-10-02 12:56:28.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:28 np0005465988 nova_compute[236126]: 2025-10-02 12:56:28.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:28 np0005465988 nova_compute[236126]: 2025-10-02 12:56:28.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:56:28 np0005465988 nova_compute[236126]: 2025-10-02 12:56:28.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:56:28 np0005465988 nova_compute[236126]: 2025-10-02 12:56:28.499 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:56:28 np0005465988 podman[322867]: 2025-10-02 12:56:28.5544658 +0000 UTC m=+0.071839914 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible)
Oct  2 08:56:28 np0005465988 podman[322868]: 2025-10-02 12:56:28.565731895 +0000 UTC m=+0.079444383 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct  2 08:56:28 np0005465988 podman[322866]: 2025-10-02 12:56:28.598816199 +0000 UTC m=+0.116172532 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:56:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:28.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:29.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:29 np0005465988 nova_compute[236126]: 2025-10-02 12:56:29.856 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:30 np0005465988 nova_compute[236126]: 2025-10-02 12:56:30.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:56:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:30.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:31.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:32.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:33 np0005465988 nova_compute[236126]: 2025-10-02 12:56:33.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:33.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:34.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:35 np0005465988 nova_compute[236126]: 2025-10-02 12:56:35.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:35 np0005465988 nova_compute[236126]: 2025-10-02 12:56:35.253 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "bcc081e3-b47c-4963-b0e1-1aff13e929de" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:56:35 np0005465988 nova_compute[236126]: 2025-10-02 12:56:35.253 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:56:35 np0005465988 nova_compute[236126]: 2025-10-02 12:56:35.296 2 DEBUG nova.compute.manager [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:56:35 np0005465988 nova_compute[236126]: 2025-10-02 12:56:35.399 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:56:35 np0005465988 nova_compute[236126]: 2025-10-02 12:56:35.400 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:56:35 np0005465988 nova_compute[236126]: 2025-10-02 12:56:35.409 2 DEBUG nova.virt.hardware [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:56:35 np0005465988 nova_compute[236126]: 2025-10-02 12:56:35.409 2 INFO nova.compute.claims [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:56:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:56:35 np0005465988 nova_compute[236126]: 2025-10-02 12:56:35.612 2 DEBUG oslo_concurrency.processutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:56:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:35.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:56:36 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1642685684' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.050 2 DEBUG oslo_concurrency.processutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.057 2 DEBUG nova.compute.provider_tree [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.081 2 DEBUG nova.scheduler.client.report [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.135 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.137 2 DEBUG nova.compute.manager [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.210 2 DEBUG nova.compute.manager [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.211 2 DEBUG nova.network.neutron [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.238 2 INFO nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.271 2 DEBUG nova.compute.manager [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.512 2 DEBUG nova.compute.manager [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.514 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.514 2 INFO nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Creating image(s)#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.562 2 DEBUG nova.storage.rbd_utils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image bcc081e3-b47c-4963-b0e1-1aff13e929de_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.596 2 DEBUG nova.storage.rbd_utils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image bcc081e3-b47c-4963-b0e1-1aff13e929de_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.629 2 DEBUG nova.storage.rbd_utils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image bcc081e3-b47c-4963-b0e1-1aff13e929de_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.633 2 DEBUG oslo_concurrency.processutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:56:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:56:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:36.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.734 2 DEBUG oslo_concurrency.processutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.735 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.735 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.736 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.767 2 DEBUG nova.storage.rbd_utils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image bcc081e3-b47c-4963-b0e1-1aff13e929de_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:56:36 np0005465988 nova_compute[236126]: 2025-10-02 12:56:36.772 2 DEBUG oslo_concurrency.processutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 bcc081e3-b47c-4963-b0e1-1aff13e929de_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:56:37 np0005465988 nova_compute[236126]: 2025-10-02 12:56:37.153 2 DEBUG nova.policy [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fb366465e6154871b8a53c9f500105ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:56:37 np0005465988 nova_compute[236126]: 2025-10-02 12:56:37.414 2 DEBUG oslo_concurrency.processutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 bcc081e3-b47c-4963-b0e1-1aff13e929de_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.642s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:56:37 np0005465988 nova_compute[236126]: 2025-10-02 12:56:37.500 2 DEBUG nova.storage.rbd_utils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] resizing rbd image bcc081e3-b47c-4963-b0e1-1aff13e929de_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:56:37 np0005465988 nova_compute[236126]: 2025-10-02 12:56:37.624 2 DEBUG nova.objects.instance [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'migration_context' on Instance uuid bcc081e3-b47c-4963-b0e1-1aff13e929de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:56:37 np0005465988 nova_compute[236126]: 2025-10-02 12:56:37.645 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:56:37 np0005465988 nova_compute[236126]: 2025-10-02 12:56:37.646 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Ensure instance console log exists: /var/lib/nova/instances/bcc081e3-b47c-4963-b0e1-1aff13e929de/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:56:37 np0005465988 nova_compute[236126]: 2025-10-02 12:56:37.646 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:56:37 np0005465988 nova_compute[236126]: 2025-10-02 12:56:37.647 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:56:37 np0005465988 nova_compute[236126]: 2025-10-02 12:56:37.647 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:56:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:37.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:38 np0005465988 nova_compute[236126]: 2025-10-02 12:56:38.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:38.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #139. Immutable memtables: 0.
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.562722) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 87] Flushing memtable with next log file: 139
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409799562755, "job": 87, "event": "flush_started", "num_memtables": 1, "num_entries": 1535, "num_deletes": 263, "total_data_size": 3321189, "memory_usage": 3358368, "flush_reason": "Manual Compaction"}
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 87] Level-0 flush table #140: started
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409799592714, "cf_name": "default", "job": 87, "event": "table_file_creation", "file_number": 140, "file_size": 2178036, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 68910, "largest_seqno": 70440, "table_properties": {"data_size": 2171518, "index_size": 3652, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14285, "raw_average_key_size": 20, "raw_value_size": 2158198, "raw_average_value_size": 3056, "num_data_blocks": 160, "num_entries": 706, "num_filter_entries": 706, "num_deletions": 263, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759409684, "oldest_key_time": 1759409684, "file_creation_time": 1759409799, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 140, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 87] Flush lasted 30048 microseconds, and 5177 cpu microseconds.
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.592764) [db/flush_job.cc:967] [default] [JOB 87] Level-0 flush table #140: 2178036 bytes OK
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.592786) [db/memtable_list.cc:519] [default] Level-0 commit table #140 started
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.606207) [db/memtable_list.cc:722] [default] Level-0 commit table #140: memtable #1 done
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.606262) EVENT_LOG_v1 {"time_micros": 1759409799606250, "job": 87, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.606288) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 87] Try to delete WAL files size 3314033, prev total WAL file size 3314033, number of live WAL files 2.
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000136.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.607433) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032353036' seq:72057594037927935, type:22 .. '6C6F676D0032373630' seq:0, type:0; will stop at (end)
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 88] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 87 Base level 0, inputs: [140(2126KB)], [138(11MB)]
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409799607464, "job": 88, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [140], "files_L6": [138], "score": -1, "input_data_size": 14573508, "oldest_snapshot_seqno": -1}
Oct  2 08:56:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:39.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 88] Generated table #141: 9282 keys, 14414018 bytes, temperature: kUnknown
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409799804226, "cf_name": "default", "job": 88, "event": "table_file_creation", "file_number": 141, "file_size": 14414018, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14350570, "index_size": 39143, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23237, "raw_key_size": 243931, "raw_average_key_size": 26, "raw_value_size": 14184230, "raw_average_value_size": 1528, "num_data_blocks": 1509, "num_entries": 9282, "num_filter_entries": 9282, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759409799, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 141, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.804612) [db/compaction/compaction_job.cc:1663] [default] [JOB 88] Compacted 1@0 + 1@6 files to L6 => 14414018 bytes
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.811931) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 74.0 rd, 73.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 11.8 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(13.3) write-amplify(6.6) OK, records in: 9824, records dropped: 542 output_compression: NoCompression
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.811960) EVENT_LOG_v1 {"time_micros": 1759409799811946, "job": 88, "event": "compaction_finished", "compaction_time_micros": 196880, "compaction_time_cpu_micros": 35424, "output_level": 6, "num_output_files": 1, "total_output_size": 14414018, "num_input_records": 9824, "num_output_records": 9282, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000140.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409799812819, "job": 88, "event": "table_file_deletion", "file_number": 140}
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000138.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409799816037, "job": 88, "event": "table_file_deletion", "file_number": 138}
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.607340) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.816117) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.816122) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.816124) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.816126) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:56:39 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:56:39.816128) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:56:40 np0005465988 nova_compute[236126]: 2025-10-02 12:56:40.153 2 DEBUG nova.network.neutron [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Successfully created port: dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:56:40 np0005465988 nova_compute[236126]: 2025-10-02 12:56:40.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:56:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:40.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:41 np0005465988 nova_compute[236126]: 2025-10-02 12:56:41.462 2 DEBUG nova.network.neutron [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Successfully updated port: dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:56:41 np0005465988 nova_compute[236126]: 2025-10-02 12:56:41.495 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:41 np0005465988 nova_compute[236126]: 2025-10-02 12:56:41.499 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "refresh_cache-bcc081e3-b47c-4963-b0e1-1aff13e929de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:56:41 np0005465988 nova_compute[236126]: 2025-10-02 12:56:41.500 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquired lock "refresh_cache-bcc081e3-b47c-4963-b0e1-1aff13e929de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:56:41 np0005465988 nova_compute[236126]: 2025-10-02 12:56:41.500 2 DEBUG nova.network.neutron [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:56:41 np0005465988 nova_compute[236126]: 2025-10-02 12:56:41.583 2 DEBUG nova.compute.manager [req-295c5d2b-08ab-441b-b2a8-c60aabd33fc9 req-1e5e833e-7115-474d-adff-eb20c86b696e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Received event network-changed-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:56:41 np0005465988 nova_compute[236126]: 2025-10-02 12:56:41.584 2 DEBUG nova.compute.manager [req-295c5d2b-08ab-441b-b2a8-c60aabd33fc9 req-1e5e833e-7115-474d-adff-eb20c86b696e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Refreshing instance network info cache due to event network-changed-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:56:41 np0005465988 nova_compute[236126]: 2025-10-02 12:56:41.584 2 DEBUG oslo_concurrency.lockutils [req-295c5d2b-08ab-441b-b2a8-c60aabd33fc9 req-1e5e833e-7115-474d-adff-eb20c86b696e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-bcc081e3-b47c-4963-b0e1-1aff13e929de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:56:41 np0005465988 nova_compute[236126]: 2025-10-02 12:56:41.702 2 DEBUG nova.network.neutron [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:56:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:41.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:42.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:43 np0005465988 nova_compute[236126]: 2025-10-02 12:56:43.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:43.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:56:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:44.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:56:45 np0005465988 nova_compute[236126]: 2025-10-02 12:56:45.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:56:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:45.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.191 2 DEBUG nova.network.neutron [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Updating instance_info_cache with network_info: [{"id": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "address": "fa:16:3e:f1:70:1b", "network": {"id": "48b23f60-a626-4a95-b154-a764454c451b", "bridge": "br-int", "label": "tempest-network-smoke--1549158146", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd846c20-9d", "ovs_interfaceid": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.628 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Releasing lock "refresh_cache-bcc081e3-b47c-4963-b0e1-1aff13e929de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.629 2 DEBUG nova.compute.manager [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Instance network_info: |[{"id": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "address": "fa:16:3e:f1:70:1b", "network": {"id": "48b23f60-a626-4a95-b154-a764454c451b", "bridge": "br-int", "label": "tempest-network-smoke--1549158146", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd846c20-9d", "ovs_interfaceid": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.630 2 DEBUG oslo_concurrency.lockutils [req-295c5d2b-08ab-441b-b2a8-c60aabd33fc9 req-1e5e833e-7115-474d-adff-eb20c86b696e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-bcc081e3-b47c-4963-b0e1-1aff13e929de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.630 2 DEBUG nova.network.neutron [req-295c5d2b-08ab-441b-b2a8-c60aabd33fc9 req-1e5e833e-7115-474d-adff-eb20c86b696e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Refreshing network info cache for port dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.635 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Start _get_guest_xml network_info=[{"id": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "address": "fa:16:3e:f1:70:1b", "network": {"id": "48b23f60-a626-4a95-b154-a764454c451b", "bridge": "br-int", "label": "tempest-network-smoke--1549158146", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd846c20-9d", "ovs_interfaceid": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.643 2 WARNING nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:56:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:46.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.652 2 DEBUG nova.virt.libvirt.host [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.653 2 DEBUG nova.virt.libvirt.host [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.657 2 DEBUG nova.virt.libvirt.host [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.658 2 DEBUG nova.virt.libvirt.host [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.660 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.660 2 DEBUG nova.virt.hardware [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.661 2 DEBUG nova.virt.hardware [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.662 2 DEBUG nova.virt.hardware [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.662 2 DEBUG nova.virt.hardware [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.663 2 DEBUG nova.virt.hardware [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.663 2 DEBUG nova.virt.hardware [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.663 2 DEBUG nova.virt.hardware [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.664 2 DEBUG nova.virt.hardware [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.664 2 DEBUG nova.virt.hardware [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.665 2 DEBUG nova.virt.hardware [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.665 2 DEBUG nova.virt.hardware [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:56:46 np0005465988 nova_compute[236126]: 2025-10-02 12:56:46.670 2 DEBUG oslo_concurrency.processutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:56:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:56:47 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4243037233' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.150 2 DEBUG oslo_concurrency.processutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.181 2 DEBUG nova.storage.rbd_utils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image bcc081e3-b47c-4963-b0e1-1aff13e929de_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.185 2 DEBUG oslo_concurrency.processutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:56:47 np0005465988 podman[323237]: 2025-10-02 12:56:47.561839937 +0000 UTC m=+0.097666179 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:56:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:56:47 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2108423254' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.643 2 DEBUG oslo_concurrency.processutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.646 2 DEBUG nova.virt.libvirt.vif [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:56:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-887371889',display_name='tempest-TestNetworkBasicOps-server-887371889',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-887371889',id=188,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGS+YU6IfWR2i2vh5DtnBd4STMtZO0onP6GxQ1zL0ghp75MvMRBnuNZKmJn1MIJQkRA89BhjwEmKYW0uPAQZ1TvPU+xFcNfvjFSUcigDldMVaKBEnBnxeHvqd6H4SH+Ywg==',key_name='tempest-TestNetworkBasicOps-528737445',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-2bmykuc2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:56:36Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=bcc081e3-b47c-4963-b0e1-1aff13e929de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "address": "fa:16:3e:f1:70:1b", "network": {"id": "48b23f60-a626-4a95-b154-a764454c451b", "bridge": "br-int", "label": "tempest-network-smoke--1549158146", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd846c20-9d", "ovs_interfaceid": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.647 2 DEBUG nova.network.os_vif_util [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "address": "fa:16:3e:f1:70:1b", "network": {"id": "48b23f60-a626-4a95-b154-a764454c451b", "bridge": "br-int", "label": "tempest-network-smoke--1549158146", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd846c20-9d", "ovs_interfaceid": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.648 2 DEBUG nova.network.os_vif_util [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:70:1b,bridge_name='br-int',has_traffic_filtering=True,id=dd846c20-9d18-4e15-a8a8-4cccbf14b8a4,network=Network(48b23f60-a626-4a95-b154-a764454c451b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd846c20-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.651 2 DEBUG nova.objects.instance [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'pci_devices' on Instance uuid bcc081e3-b47c-4963-b0e1-1aff13e929de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:56:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:56:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:47.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.832 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  <uuid>bcc081e3-b47c-4963-b0e1-1aff13e929de</uuid>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  <name>instance-000000bc</name>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestNetworkBasicOps-server-887371889</nova:name>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:56:46</nova:creationTime>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <nova:user uuid="fb366465e6154871b8a53c9f500105ce">tempest-TestNetworkBasicOps-1692262680-project-member</nova:user>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <nova:project uuid="ce2ca82c03554560b55ed747ae63f1fb">tempest-TestNetworkBasicOps-1692262680</nova:project>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <nova:port uuid="dd846c20-9d18-4e15-a8a8-4cccbf14b8a4">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <entry name="serial">bcc081e3-b47c-4963-b0e1-1aff13e929de</entry>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <entry name="uuid">bcc081e3-b47c-4963-b0e1-1aff13e929de</entry>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/bcc081e3-b47c-4963-b0e1-1aff13e929de_disk">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/bcc081e3-b47c-4963-b0e1-1aff13e929de_disk.config">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:f1:70:1b"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <target dev="tapdd846c20-9d"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/bcc081e3-b47c-4963-b0e1-1aff13e929de/console.log" append="off"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:56:47 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:56:47 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:56:47 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:56:47 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.833 2 DEBUG nova.compute.manager [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Preparing to wait for external event network-vif-plugged-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.834 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.834 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.834 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.836 2 DEBUG nova.virt.libvirt.vif [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:56:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-887371889',display_name='tempest-TestNetworkBasicOps-server-887371889',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-887371889',id=188,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGS+YU6IfWR2i2vh5DtnBd4STMtZO0onP6GxQ1zL0ghp75MvMRBnuNZKmJn1MIJQkRA89BhjwEmKYW0uPAQZ1TvPU+xFcNfvjFSUcigDldMVaKBEnBnxeHvqd6H4SH+Ywg==',key_name='tempest-TestNetworkBasicOps-528737445',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-2bmykuc2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:56:36Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=bcc081e3-b47c-4963-b0e1-1aff13e929de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "address": "fa:16:3e:f1:70:1b", "network": {"id": "48b23f60-a626-4a95-b154-a764454c451b", "bridge": "br-int", "label": "tempest-network-smoke--1549158146", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd846c20-9d", "ovs_interfaceid": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.836 2 DEBUG nova.network.os_vif_util [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "address": "fa:16:3e:f1:70:1b", "network": {"id": "48b23f60-a626-4a95-b154-a764454c451b", "bridge": "br-int", "label": "tempest-network-smoke--1549158146", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd846c20-9d", "ovs_interfaceid": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.838 2 DEBUG nova.network.os_vif_util [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:70:1b,bridge_name='br-int',has_traffic_filtering=True,id=dd846c20-9d18-4e15-a8a8-4cccbf14b8a4,network=Network(48b23f60-a626-4a95-b154-a764454c451b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd846c20-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.838 2 DEBUG os_vif [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:70:1b,bridge_name='br-int',has_traffic_filtering=True,id=dd846c20-9d18-4e15-a8a8-4cccbf14b8a4,network=Network(48b23f60-a626-4a95-b154-a764454c451b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd846c20-9d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.840 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.842 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.845 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd846c20-9d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.846 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdd846c20-9d, col_values=(('external_ids', {'iface-id': 'dd846c20-9d18-4e15-a8a8-4cccbf14b8a4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f1:70:1b', 'vm-uuid': 'bcc081e3-b47c-4963-b0e1-1aff13e929de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:47 np0005465988 NetworkManager[45041]: <info>  [1759409807.8820] manager: (tapdd846c20-9d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/367)
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:47 np0005465988 nova_compute[236126]: 2025-10-02 12:56:47.889 2 INFO os_vif [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:70:1b,bridge_name='br-int',has_traffic_filtering=True,id=dd846c20-9d18-4e15-a8a8-4cccbf14b8a4,network=Network(48b23f60-a626-4a95-b154-a764454c451b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd846c20-9d')#033[00m
Oct  2 08:56:48 np0005465988 nova_compute[236126]: 2025-10-02 12:56:48.130 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:56:48 np0005465988 nova_compute[236126]: 2025-10-02 12:56:48.130 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:56:48 np0005465988 nova_compute[236126]: 2025-10-02 12:56:48.133 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No VIF found with MAC fa:16:3e:f1:70:1b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:56:48 np0005465988 nova_compute[236126]: 2025-10-02 12:56:48.134 2 INFO nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Using config drive#033[00m
Oct  2 08:56:48 np0005465988 nova_compute[236126]: 2025-10-02 12:56:48.171 2 DEBUG nova.storage.rbd_utils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image bcc081e3-b47c-4963-b0e1-1aff13e929de_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:56:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:48.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:49.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:50 np0005465988 nova_compute[236126]: 2025-10-02 12:56:50.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:56:50 np0005465988 nova_compute[236126]: 2025-10-02 12:56:50.564 2 DEBUG nova.network.neutron [req-295c5d2b-08ab-441b-b2a8-c60aabd33fc9 req-1e5e833e-7115-474d-adff-eb20c86b696e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Updated VIF entry in instance network info cache for port dd846c20-9d18-4e15-a8a8-4cccbf14b8a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:56:50 np0005465988 nova_compute[236126]: 2025-10-02 12:56:50.565 2 DEBUG nova.network.neutron [req-295c5d2b-08ab-441b-b2a8-c60aabd33fc9 req-1e5e833e-7115-474d-adff-eb20c86b696e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Updating instance_info_cache with network_info: [{"id": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "address": "fa:16:3e:f1:70:1b", "network": {"id": "48b23f60-a626-4a95-b154-a764454c451b", "bridge": "br-int", "label": "tempest-network-smoke--1549158146", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd846c20-9d", "ovs_interfaceid": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:56:50 np0005465988 nova_compute[236126]: 2025-10-02 12:56:50.626 2 DEBUG oslo_concurrency.lockutils [req-295c5d2b-08ab-441b-b2a8-c60aabd33fc9 req-1e5e833e-7115-474d-adff-eb20c86b696e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-bcc081e3-b47c-4963-b0e1-1aff13e929de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:56:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:56:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:50.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:56:50 np0005465988 nova_compute[236126]: 2025-10-02 12:56:50.671 2 INFO nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Creating config drive at /var/lib/nova/instances/bcc081e3-b47c-4963-b0e1-1aff13e929de/disk.config#033[00m
Oct  2 08:56:50 np0005465988 nova_compute[236126]: 2025-10-02 12:56:50.676 2 DEBUG oslo_concurrency.processutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bcc081e3-b47c-4963-b0e1-1aff13e929de/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvcr1qfj5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:56:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:50.690 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:56:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:50.691 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:56:50 np0005465988 nova_compute[236126]: 2025-10-02 12:56:50.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:50 np0005465988 nova_compute[236126]: 2025-10-02 12:56:50.825 2 DEBUG oslo_concurrency.processutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bcc081e3-b47c-4963-b0e1-1aff13e929de/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvcr1qfj5" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:56:50 np0005465988 nova_compute[236126]: 2025-10-02 12:56:50.855 2 DEBUG nova.storage.rbd_utils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image bcc081e3-b47c-4963-b0e1-1aff13e929de_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:56:50 np0005465988 nova_compute[236126]: 2025-10-02 12:56:50.859 2 DEBUG oslo_concurrency.processutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bcc081e3-b47c-4963-b0e1-1aff13e929de/disk.config bcc081e3-b47c-4963-b0e1-1aff13e929de_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:56:51 np0005465988 nova_compute[236126]: 2025-10-02 12:56:51.095 2 DEBUG oslo_concurrency.processutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bcc081e3-b47c-4963-b0e1-1aff13e929de/disk.config bcc081e3-b47c-4963-b0e1-1aff13e929de_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.236s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:56:51 np0005465988 nova_compute[236126]: 2025-10-02 12:56:51.097 2 INFO nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Deleting local config drive /var/lib/nova/instances/bcc081e3-b47c-4963-b0e1-1aff13e929de/disk.config because it was imported into RBD.#033[00m
Oct  2 08:56:51 np0005465988 kernel: tapdd846c20-9d: entered promiscuous mode
Oct  2 08:56:51 np0005465988 NetworkManager[45041]: <info>  [1759409811.1613] manager: (tapdd846c20-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/368)
Oct  2 08:56:51 np0005465988 systemd-udevd[323331]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:56:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:56:51Z|00853|binding|INFO|Claiming lport dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 for this chassis.
Oct  2 08:56:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:56:51Z|00854|binding|INFO|dd846c20-9d18-4e15-a8a8-4cccbf14b8a4: Claiming fa:16:3e:f1:70:1b 10.100.0.18
Oct  2 08:56:51 np0005465988 nova_compute[236126]: 2025-10-02 12:56:51.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:51 np0005465988 NetworkManager[45041]: <info>  [1759409811.2215] device (tapdd846c20-9d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:56:51 np0005465988 NetworkManager[45041]: <info>  [1759409811.2225] device (tapdd846c20-9d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.230 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:70:1b 10.100.0.18'], port_security=['fa:16:3e:f1:70:1b 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'bcc081e3-b47c-4963-b0e1-1aff13e929de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48b23f60-a626-4a95-b154-a764454c451b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '12f3d19b-01fe-42e0-ac19-e732ef6c9e33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3dbfe445-7f25-42ca-8688-0a8d6c43ed3f, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=dd846c20-9d18-4e15-a8a8-4cccbf14b8a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.231 142124 INFO neutron.agent.ovn.metadata.agent [-] Port dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 in datapath 48b23f60-a626-4a95-b154-a764454c451b bound to our chassis#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.233 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48b23f60-a626-4a95-b154-a764454c451b#033[00m
Oct  2 08:56:51 np0005465988 systemd-machined[192594]: New machine qemu-88-instance-000000bc.
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.244 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[652ddf17-cd5d-4350-8918-d82e212429d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.244 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap48b23f60-a1 in ovnmeta-48b23f60-a626-4a95-b154-a764454c451b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.246 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap48b23f60-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.246 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[07c91d45-6037-40f3-8994-7fc9948fe3a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.247 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1d4a6cc9-505d-4f83-87e5-124db5b39f15]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 nova_compute[236126]: 2025-10-02 12:56:51.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:56:51Z|00855|binding|INFO|Setting lport dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 ovn-installed in OVS
Oct  2 08:56:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:56:51Z|00856|binding|INFO|Setting lport dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 up in Southbound
Oct  2 08:56:51 np0005465988 nova_compute[236126]: 2025-10-02 12:56:51.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:51 np0005465988 systemd[1]: Started Virtual Machine qemu-88-instance-000000bc.
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.259 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[e38f0036-246e-4530-89da-bdcacb93721f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.275 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[af086c65-1804-4bf6-bffe-687e4e3aa372]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.311 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ed00520c-e30f-44ba-a742-92dca3784a67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 NetworkManager[45041]: <info>  [1759409811.3207] manager: (tap48b23f60-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/369)
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.319 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2ebbd34c-92e3-44a0-83cb-3467b2df358a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.360 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[05d3bc24-d08c-4a49-832a-c62b36a7b958]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.366 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9ce079c5-ed20-44c8-9595-971fa44045a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 NetworkManager[45041]: <info>  [1759409811.4015] device (tap48b23f60-a0): carrier: link connected
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.412 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4516217f-fe48-4a08-a0c7-177141c50c7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.440 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a46f6861-bdd3-4cbe-8d13-e850cfb30f92]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48b23f60-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:39:da:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 247], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 787173, 'reachable_time': 17138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 323367, 'error': None, 'target': 'ovnmeta-48b23f60-a626-4a95-b154-a764454c451b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.464 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[787adbae-130c-44a1-9f5d-682f30665e53]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe39:da6c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 787173, 'tstamp': 787173}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 323368, 'error': None, 'target': 'ovnmeta-48b23f60-a626-4a95-b154-a764454c451b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.488 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f26b7c91-05ce-4460-bd71-481860b4ae8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48b23f60-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:39:da:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 247], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 787173, 'reachable_time': 17138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 323369, 'error': None, 'target': 'ovnmeta-48b23f60-a626-4a95-b154-a764454c451b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.535 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[621f5d44-93c3-43df-b4c1-7914051fe05c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.632 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e661f1-f57d-4df5-894a-8ed3a6b7b7c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.634 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48b23f60-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.634 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.635 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48b23f60-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:56:51 np0005465988 NetworkManager[45041]: <info>  [1759409811.6378] manager: (tap48b23f60-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/370)
Oct  2 08:56:51 np0005465988 kernel: tap48b23f60-a0: entered promiscuous mode
Oct  2 08:56:51 np0005465988 nova_compute[236126]: 2025-10-02 12:56:51.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.657 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48b23f60-a0, col_values=(('external_ids', {'iface-id': '72e5faac-9d73-42b2-89ed-1f386e556cc7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:56:51 np0005465988 ovn_controller[132601]: 2025-10-02T12:56:51Z|00857|binding|INFO|Releasing lport 72e5faac-9d73-42b2-89ed-1f386e556cc7 from this chassis (sb_readonly=0)
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.662 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48b23f60-a626-4a95-b154-a764454c451b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48b23f60-a626-4a95-b154-a764454c451b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.663 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd27aa7-ea8a-4966-8e47-f59b56517f81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.664 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-48b23f60-a626-4a95-b154-a764454c451b
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/48b23f60-a626-4a95-b154-a764454c451b.pid.haproxy
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 48b23f60-a626-4a95-b154-a764454c451b
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:56:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:51.665 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-48b23f60-a626-4a95-b154-a764454c451b', 'env', 'PROCESS_TAG=haproxy-48b23f60-a626-4a95-b154-a764454c451b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/48b23f60-a626-4a95-b154-a764454c451b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:56:51 np0005465988 nova_compute[236126]: 2025-10-02 12:56:51.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:51 np0005465988 nova_compute[236126]: 2025-10-02 12:56:51.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:51.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:52 np0005465988 podman[323443]: 2025-10-02 12:56:52.047681924 +0000 UTC m=+0.064455850 container create 2c9ccfe520a85fbdc49dd4d7ff7b40559598d4b9278c7715f788586678b58500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:56:52 np0005465988 systemd[1]: Started libpod-conmon-2c9ccfe520a85fbdc49dd4d7ff7b40559598d4b9278c7715f788586678b58500.scope.
Oct  2 08:56:52 np0005465988 podman[323443]: 2025-10-02 12:56:52.010255115 +0000 UTC m=+0.027029121 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:56:52 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:56:52 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5bf6f0f26bc4a6548834d57a70c3aed01462db796ddb27a07850978c901c6fa1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:56:52 np0005465988 podman[323443]: 2025-10-02 12:56:52.158069449 +0000 UTC m=+0.174843405 container init 2c9ccfe520a85fbdc49dd4d7ff7b40559598d4b9278c7715f788586678b58500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:56:52 np0005465988 podman[323443]: 2025-10-02 12:56:52.164740141 +0000 UTC m=+0.181514067 container start 2c9ccfe520a85fbdc49dd4d7ff7b40559598d4b9278c7715f788586678b58500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:56:52 np0005465988 neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b[323459]: [NOTICE]   (323463) : New worker (323465) forked
Oct  2 08:56:52 np0005465988 neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b[323459]: [NOTICE]   (323463) : Loading success.
Oct  2 08:56:52 np0005465988 nova_compute[236126]: 2025-10-02 12:56:52.345 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409812.3449848, bcc081e3-b47c-4963-b0e1-1aff13e929de => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:56:52 np0005465988 nova_compute[236126]: 2025-10-02 12:56:52.345 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] VM Started (Lifecycle Event)#033[00m
Oct  2 08:56:52 np0005465988 nova_compute[236126]: 2025-10-02 12:56:52.379 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:56:52 np0005465988 nova_compute[236126]: 2025-10-02 12:56:52.384 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409812.3451676, bcc081e3-b47c-4963-b0e1-1aff13e929de => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:56:52 np0005465988 nova_compute[236126]: 2025-10-02 12:56:52.384 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:56:52 np0005465988 nova_compute[236126]: 2025-10-02 12:56:52.410 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:56:52 np0005465988 nova_compute[236126]: 2025-10-02 12:56:52.413 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:56:52 np0005465988 nova_compute[236126]: 2025-10-02 12:56:52.441 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:56:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:52.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:52 np0005465988 nova_compute[236126]: 2025-10-02 12:56:52.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:53.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:54.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:56:54.694 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:56:55 np0005465988 nova_compute[236126]: 2025-10-02 12:56:55.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:56:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:56:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:55.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:56:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:56:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:56.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:56:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:57.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.863 2 DEBUG nova.compute.manager [req-3d98f80a-31bc-4728-bf46-8323fcb9060e req-771d967e-67dd-4d4b-b5aa-16ea58be56a9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Received event network-vif-plugged-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.863 2 DEBUG oslo_concurrency.lockutils [req-3d98f80a-31bc-4728-bf46-8323fcb9060e req-771d967e-67dd-4d4b-b5aa-16ea58be56a9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.864 2 DEBUG oslo_concurrency.lockutils [req-3d98f80a-31bc-4728-bf46-8323fcb9060e req-771d967e-67dd-4d4b-b5aa-16ea58be56a9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.864 2 DEBUG oslo_concurrency.lockutils [req-3d98f80a-31bc-4728-bf46-8323fcb9060e req-771d967e-67dd-4d4b-b5aa-16ea58be56a9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.864 2 DEBUG nova.compute.manager [req-3d98f80a-31bc-4728-bf46-8323fcb9060e req-771d967e-67dd-4d4b-b5aa-16ea58be56a9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Processing event network-vif-plugged-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.865 2 DEBUG nova.compute.manager [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.869 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409817.8696277, bcc081e3-b47c-4963-b0e1-1aff13e929de => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.870 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.873 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.876 2 INFO nova.virt.libvirt.driver [-] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Instance spawned successfully.#033[00m
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.877 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.896 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.902 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.923 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.947 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.947 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.948 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.949 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.949 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:56:57 np0005465988 nova_compute[236126]: 2025-10-02 12:56:57.950 2 DEBUG nova.virt.libvirt.driver [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:56:58 np0005465988 nova_compute[236126]: 2025-10-02 12:56:58.035 2 INFO nova.compute.manager [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Took 21.52 seconds to spawn the instance on the hypervisor.
Oct  2 08:56:58 np0005465988 nova_compute[236126]: 2025-10-02 12:56:58.036 2 DEBUG nova.compute.manager [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:56:58 np0005465988 nova_compute[236126]: 2025-10-02 12:56:58.201 2 INFO nova.compute.manager [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Took 22.84 seconds to build instance.
Oct  2 08:56:58 np0005465988 nova_compute[236126]: 2025-10-02 12:56:58.242 2 DEBUG oslo_concurrency.lockutils [None req-b21572b5-e63e-4870-8c22-219834b5a70c fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.989s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:56:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:56:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:56:58.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:56:58 np0005465988 podman[323502]: 2025-10-02 12:56:58.788647049 +0000 UTC m=+0.089161783 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid)
Oct  2 08:56:58 np0005465988 podman[323503]: 2025-10-02 12:56:58.792510741 +0000 UTC m=+0.088919946 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:56:58 np0005465988 podman[323501]: 2025-10-02 12:56:58.79281646 +0000 UTC m=+0.093819228 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:56:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:56:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:56:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:56:59.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:00 np0005465988 nova_compute[236126]: 2025-10-02 12:57:00.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:00 np0005465988 nova_compute[236126]: 2025-10-02 12:57:00.348 2 DEBUG nova.compute.manager [req-6efb44e5-927b-4550-8bf7-a5f6ad3956af req-bb55d648-948c-4977-96d4-807f4af302a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Received event network-vif-plugged-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:57:00 np0005465988 nova_compute[236126]: 2025-10-02 12:57:00.349 2 DEBUG oslo_concurrency.lockutils [req-6efb44e5-927b-4550-8bf7-a5f6ad3956af req-bb55d648-948c-4977-96d4-807f4af302a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:57:00 np0005465988 nova_compute[236126]: 2025-10-02 12:57:00.349 2 DEBUG oslo_concurrency.lockutils [req-6efb44e5-927b-4550-8bf7-a5f6ad3956af req-bb55d648-948c-4977-96d4-807f4af302a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:57:00 np0005465988 nova_compute[236126]: 2025-10-02 12:57:00.349 2 DEBUG oslo_concurrency.lockutils [req-6efb44e5-927b-4550-8bf7-a5f6ad3956af req-bb55d648-948c-4977-96d4-807f4af302a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:57:00 np0005465988 nova_compute[236126]: 2025-10-02 12:57:00.349 2 DEBUG nova.compute.manager [req-6efb44e5-927b-4550-8bf7-a5f6ad3956af req-bb55d648-948c-4977-96d4-807f4af302a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] No waiting events found dispatching network-vif-plugged-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:57:00 np0005465988 nova_compute[236126]: 2025-10-02 12:57:00.349 2 WARNING nova.compute.manager [req-6efb44e5-927b-4550-8bf7-a5f6ad3956af req-bb55d648-948c-4977-96d4-807f4af302a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Received unexpected event network-vif-plugged-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 for instance with vm_state active and task_state None.
Oct  2 08:57:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:57:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:00.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:01.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:57:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:02.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:57:02 np0005465988 nova_compute[236126]: 2025-10-02 12:57:02.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:03.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:04.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #142. Immutable memtables: 0.
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:04.749488) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 89] Flushing memtable with next log file: 142
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409824749593, "job": 89, "event": "flush_started", "num_memtables": 1, "num_entries": 474, "num_deletes": 250, "total_data_size": 616470, "memory_usage": 625632, "flush_reason": "Manual Compaction"}
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 89] Level-0 flush table #143: started
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409824813623, "cf_name": "default", "job": 89, "event": "table_file_creation", "file_number": 143, "file_size": 313927, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 70445, "largest_seqno": 70914, "table_properties": {"data_size": 311514, "index_size": 512, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6595, "raw_average_key_size": 20, "raw_value_size": 306608, "raw_average_value_size": 946, "num_data_blocks": 23, "num_entries": 324, "num_filter_entries": 324, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759409799, "oldest_key_time": 1759409799, "file_creation_time": 1759409824, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 143, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 89] Flush lasted 64188 microseconds, and 2647 cpu microseconds.
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:04.813689) [db/flush_job.cc:967] [default] [JOB 89] Level-0 flush table #143: 313927 bytes OK
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:04.813721) [db/memtable_list.cc:519] [default] Level-0 commit table #143 started
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:04.817714) [db/memtable_list.cc:722] [default] Level-0 commit table #143: memtable #1 done
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:04.817735) EVENT_LOG_v1 {"time_micros": 1759409824817727, "job": 89, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:04.817763) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 89] Try to delete WAL files size 613610, prev total WAL file size 629053, number of live WAL files 2.
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000139.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:04.818542) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032323630' seq:72057594037927935, type:22 .. '6D6772737461740032353131' seq:0, type:0; will stop at (end)
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 90] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 89 Base level 0, inputs: [143(306KB)], [141(13MB)]
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409824818612, "job": 90, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [143], "files_L6": [141], "score": -1, "input_data_size": 14727945, "oldest_snapshot_seqno": -1}
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 90] Generated table #144: 9103 keys, 10984950 bytes, temperature: kUnknown
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409824970846, "cf_name": "default", "job": 90, "event": "table_file_creation", "file_number": 144, "file_size": 10984950, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10927318, "index_size": 33775, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22789, "raw_key_size": 240402, "raw_average_key_size": 26, "raw_value_size": 10768672, "raw_average_value_size": 1182, "num_data_blocks": 1286, "num_entries": 9103, "num_filter_entries": 9103, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759409824, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 144, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:57:04 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:57:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:04.971118) [db/compaction/compaction_job.cc:1663] [default] [JOB 90] Compacted 1@0 + 1@6 files to L6 => 10984950 bytes
Oct  2 08:57:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:05.175413) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 96.7 rd, 72.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 13.7 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(81.9) write-amplify(35.0) OK, records in: 9606, records dropped: 503 output_compression: NoCompression
Oct  2 08:57:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:05.175456) EVENT_LOG_v1 {"time_micros": 1759409825175437, "job": 90, "event": "compaction_finished", "compaction_time_micros": 152306, "compaction_time_cpu_micros": 28315, "output_level": 6, "num_output_files": 1, "total_output_size": 10984950, "num_input_records": 9606, "num_output_records": 9103, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:57:05 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000143.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:57:05 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409825175727, "job": 90, "event": "table_file_deletion", "file_number": 143}
Oct  2 08:57:05 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000141.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:57:05 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409825178073, "job": 90, "event": "table_file_deletion", "file_number": 141}
Oct  2 08:57:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:04.818410) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:57:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:05.178167) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:57:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:05.178173) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:57:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:05.178175) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:57:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:05.178177) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:57:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:57:05.178179) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:57:05 np0005465988 nova_compute[236126]: 2025-10-02 12:57:05.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:57:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:05.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:06.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:07.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:07 np0005465988 nova_compute[236126]: 2025-10-02 12:57:07.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:08.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:09 np0005465988 nova_compute[236126]: 2025-10-02 12:57:09.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:57:09 np0005465988 nova_compute[236126]: 2025-10-02 12:57:09.511 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:57:09 np0005465988 nova_compute[236126]: 2025-10-02 12:57:09.512 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:57:09 np0005465988 nova_compute[236126]: 2025-10-02 12:57:09.513 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:57:09 np0005465988 nova_compute[236126]: 2025-10-02 12:57:09.513 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  2 08:57:09 np0005465988 nova_compute[236126]: 2025-10-02 12:57:09.513 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:57:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:09.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:57:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1205470845' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:57:10 np0005465988 nova_compute[236126]: 2025-10-02 12:57:10.090 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.576s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:57:10 np0005465988 nova_compute[236126]: 2025-10-02 12:57:10.185 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000bc as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Oct  2 08:57:10 np0005465988 nova_compute[236126]: 2025-10-02 12:57:10.186 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000bc as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Oct  2 08:57:10 np0005465988 nova_compute[236126]: 2025-10-02 12:57:10.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:57:10Z|00110|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f1:70:1b 10.100.0.18
Oct  2 08:57:10 np0005465988 ovn_controller[132601]: 2025-10-02T12:57:10Z|00111|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f1:70:1b 10.100.0.18
Oct  2 08:57:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:57:10 np0005465988 nova_compute[236126]: 2025-10-02 12:57:10.453 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:57:10 np0005465988 nova_compute[236126]: 2025-10-02 12:57:10.454 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3863MB free_disk=20.92172622680664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:57:10 np0005465988 nova_compute[236126]: 2025-10-02 12:57:10.455 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:10 np0005465988 nova_compute[236126]: 2025-10-02 12:57:10.456 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:57:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:57:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:57:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:57:10 np0005465988 nova_compute[236126]: 2025-10-02 12:57:10.664 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance bcc081e3-b47c-4963-b0e1-1aff13e929de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:57:10 np0005465988 nova_compute[236126]: 2025-10-02 12:57:10.665 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:57:10 np0005465988 nova_compute[236126]: 2025-10-02 12:57:10.665 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:57:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:10.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:10 np0005465988 nova_compute[236126]: 2025-10-02 12:57:10.895 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:57:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:57:11 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/751274490' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:57:11 np0005465988 nova_compute[236126]: 2025-10-02 12:57:11.415 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:57:11 np0005465988 nova_compute[236126]: 2025-10-02 12:57:11.421 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:57:11 np0005465988 nova_compute[236126]: 2025-10-02 12:57:11.442 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:57:11 np0005465988 nova_compute[236126]: 2025-10-02 12:57:11.469 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:57:11 np0005465988 nova_compute[236126]: 2025-10-02 12:57:11.470 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.014s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:57:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:57:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:57:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:11.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:12.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:12 np0005465988 nova_compute[236126]: 2025-10-02 12:57:12.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:13 np0005465988 nova_compute[236126]: 2025-10-02 12:57:13.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:13.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:14.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:15 np0005465988 nova_compute[236126]: 2025-10-02 12:57:15.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:57:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:15.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:16.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:57:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:57:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:17.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:17 np0005465988 nova_compute[236126]: 2025-10-02 12:57:17.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:18 np0005465988 podman[323825]: 2025-10-02 12:57:18.53384235 +0000 UTC m=+0.061428383 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 08:57:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:18.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:19.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:20 np0005465988 nova_compute[236126]: 2025-10-02 12:57:20.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:57:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:20.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:21 np0005465988 nova_compute[236126]: 2025-10-02 12:57:21.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:57:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:21.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:57:22 np0005465988 nova_compute[236126]: 2025-10-02 12:57:22.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:22 np0005465988 nova_compute[236126]: 2025-10-02 12:57:22.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:57:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:57:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:22.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:57:22 np0005465988 nova_compute[236126]: 2025-10-02 12:57:22.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:23.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:24.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:25 np0005465988 nova_compute[236126]: 2025-10-02 12:57:25.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:57:25 np0005465988 nova_compute[236126]: 2025-10-02 12:57:25.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:25 np0005465988 nova_compute[236126]: 2025-10-02 12:57:25.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:25.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:26 np0005465988 nova_compute[236126]: 2025-10-02 12:57:26.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:26.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:27.404 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:27.405 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:27.405 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:27.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:27 np0005465988 nova_compute[236126]: 2025-10-02 12:57:27.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:28 np0005465988 nova_compute[236126]: 2025-10-02 12:57:28.106 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Acquiring lock "2a8ebb32-a776-4437-8643-49acdc43be2f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:28 np0005465988 nova_compute[236126]: 2025-10-02 12:57:28.107 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:28 np0005465988 nova_compute[236126]: 2025-10-02 12:57:28.135 2 DEBUG nova.compute.manager [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:57:28 np0005465988 nova_compute[236126]: 2025-10-02 12:57:28.258 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:28 np0005465988 nova_compute[236126]: 2025-10-02 12:57:28.259 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:28 np0005465988 nova_compute[236126]: 2025-10-02 12:57:28.269 2 DEBUG nova.virt.hardware [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:57:28 np0005465988 nova_compute[236126]: 2025-10-02 12:57:28.269 2 INFO nova.compute.claims [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:57:28 np0005465988 nova_compute[236126]: 2025-10-02 12:57:28.496 2 DEBUG oslo_concurrency.processutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:57:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:28.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:57:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/754908909' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:57:28 np0005465988 nova_compute[236126]: 2025-10-02 12:57:28.974 2 DEBUG oslo_concurrency.processutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:57:28 np0005465988 nova_compute[236126]: 2025-10-02 12:57:28.983 2 DEBUG nova.compute.provider_tree [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.025 2 DEBUG nova.scheduler.client.report [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.172 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.913s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.172 2 DEBUG nova.compute.manager [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.472 2 DEBUG nova.compute.manager [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.472 2 DEBUG nova.network.neutron [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.482 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.482 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.483 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.528 2 INFO nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.532 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:57:29 np0005465988 podman[323924]: 2025-10-02 12:57:29.553089095 +0000 UTC m=+0.072583975 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=iscsid, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:57:29 np0005465988 podman[323923]: 2025-10-02 12:57:29.565011529 +0000 UTC m=+0.098396209 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:57:29 np0005465988 podman[323925]: 2025-10-02 12:57:29.584167182 +0000 UTC m=+0.108454590 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.617 2 DEBUG nova.compute.manager [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.724 2 DEBUG nova.compute.manager [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.725 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.725 2 INFO nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Creating image(s)#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.751 2 DEBUG nova.storage.rbd_utils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] rbd image 2a8ebb32-a776-4437-8643-49acdc43be2f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.778 2 DEBUG nova.storage.rbd_utils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] rbd image 2a8ebb32-a776-4437-8643-49acdc43be2f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.805 2 DEBUG nova.storage.rbd_utils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] rbd image 2a8ebb32-a776-4437-8643-49acdc43be2f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.809 2 DEBUG oslo_concurrency.processutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:57:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:29.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.880 2 DEBUG oslo_concurrency.processutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.881 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.881 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.881 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.906 2 DEBUG nova.storage.rbd_utils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] rbd image 2a8ebb32-a776-4437-8643-49acdc43be2f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:57:29 np0005465988 nova_compute[236126]: 2025-10-02 12:57:29.910 2 DEBUG oslo_concurrency.processutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 2a8ebb32-a776-4437-8643-49acdc43be2f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:57:30 np0005465988 nova_compute[236126]: 2025-10-02 12:57:30.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:30 np0005465988 nova_compute[236126]: 2025-10-02 12:57:30.368 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-bcc081e3-b47c-4963-b0e1-1aff13e929de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:57:30 np0005465988 nova_compute[236126]: 2025-10-02 12:57:30.368 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-bcc081e3-b47c-4963-b0e1-1aff13e929de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:57:30 np0005465988 nova_compute[236126]: 2025-10-02 12:57:30.369 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:57:30 np0005465988 nova_compute[236126]: 2025-10-02 12:57:30.369 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bcc081e3-b47c-4963-b0e1-1aff13e929de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:57:30 np0005465988 nova_compute[236126]: 2025-10-02 12:57:30.399 2 DEBUG oslo_concurrency.processutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 2a8ebb32-a776-4437-8643-49acdc43be2f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:57:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:57:30 np0005465988 nova_compute[236126]: 2025-10-02 12:57:30.483 2 DEBUG nova.storage.rbd_utils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] resizing rbd image 2a8ebb32-a776-4437-8643-49acdc43be2f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:57:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:30.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:30 np0005465988 nova_compute[236126]: 2025-10-02 12:57:30.732 2 DEBUG nova.objects.instance [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lazy-loading 'migration_context' on Instance uuid 2a8ebb32-a776-4437-8643-49acdc43be2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:57:30 np0005465988 nova_compute[236126]: 2025-10-02 12:57:30.749 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:57:30 np0005465988 nova_compute[236126]: 2025-10-02 12:57:30.749 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Ensure instance console log exists: /var/lib/nova/instances/2a8ebb32-a776-4437-8643-49acdc43be2f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:57:30 np0005465988 nova_compute[236126]: 2025-10-02 12:57:30.749 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:30 np0005465988 nova_compute[236126]: 2025-10-02 12:57:30.750 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:30 np0005465988 nova_compute[236126]: 2025-10-02 12:57:30.750 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:31 np0005465988 nova_compute[236126]: 2025-10-02 12:57:31.809 2 DEBUG nova.network.neutron [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Successfully created port: 63749ef2-bb12-4521-93f2-5314940b99f4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:57:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:31.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:32.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:32 np0005465988 nova_compute[236126]: 2025-10-02 12:57:32.731 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Updating instance_info_cache with network_info: [{"id": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "address": "fa:16:3e:f1:70:1b", "network": {"id": "48b23f60-a626-4a95-b154-a764454c451b", "bridge": "br-int", "label": "tempest-network-smoke--1549158146", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd846c20-9d", "ovs_interfaceid": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:57:32 np0005465988 nova_compute[236126]: 2025-10-02 12:57:32.769 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-bcc081e3-b47c-4963-b0e1-1aff13e929de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:57:32 np0005465988 nova_compute[236126]: 2025-10-02 12:57:32.770 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:57:32 np0005465988 nova_compute[236126]: 2025-10-02 12:57:32.771 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:32 np0005465988 nova_compute[236126]: 2025-10-02 12:57:32.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:33.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:34 np0005465988 nova_compute[236126]: 2025-10-02 12:57:34.382 2 DEBUG nova.network.neutron [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Successfully updated port: 63749ef2-bb12-4521-93f2-5314940b99f4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:57:34 np0005465988 nova_compute[236126]: 2025-10-02 12:57:34.438 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Acquiring lock "refresh_cache-2a8ebb32-a776-4437-8643-49acdc43be2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:57:34 np0005465988 nova_compute[236126]: 2025-10-02 12:57:34.438 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Acquired lock "refresh_cache-2a8ebb32-a776-4437-8643-49acdc43be2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:57:34 np0005465988 nova_compute[236126]: 2025-10-02 12:57:34.439 2 DEBUG nova.network.neutron [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:57:34 np0005465988 nova_compute[236126]: 2025-10-02 12:57:34.515 2 DEBUG nova.compute.manager [req-a34f0363-aa6e-4805-bfbd-9b94516e2cdf req-733a046e-b550-4840-b416-537ba54403c7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Received event network-changed-63749ef2-bb12-4521-93f2-5314940b99f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:57:34 np0005465988 nova_compute[236126]: 2025-10-02 12:57:34.516 2 DEBUG nova.compute.manager [req-a34f0363-aa6e-4805-bfbd-9b94516e2cdf req-733a046e-b550-4840-b416-537ba54403c7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Refreshing instance network info cache due to event network-changed-63749ef2-bb12-4521-93f2-5314940b99f4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:57:34 np0005465988 nova_compute[236126]: 2025-10-02 12:57:34.516 2 DEBUG oslo_concurrency.lockutils [req-a34f0363-aa6e-4805-bfbd-9b94516e2cdf req-733a046e-b550-4840-b416-537ba54403c7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-2a8ebb32-a776-4437-8643-49acdc43be2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:57:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:34.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:34 np0005465988 nova_compute[236126]: 2025-10-02 12:57:34.771 2 DEBUG nova.network.neutron [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:57:35 np0005465988 nova_compute[236126]: 2025-10-02 12:57:35.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:57:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:35.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.461 2 DEBUG nova.network.neutron [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Updating instance_info_cache with network_info: [{"id": "63749ef2-bb12-4521-93f2-5314940b99f4", "address": "fa:16:3e:80:a6:b6", "network": {"id": "9b3e5364-0567-4be5-b771-728ed7dd0ab7", "bridge": "br-int", "label": "tempest-TestServerMultinode-1015877619-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e08031c685814456aa6c43bcc8f98574", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63749ef2-bb", "ovs_interfaceid": "63749ef2-bb12-4521-93f2-5314940b99f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:57:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:36.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.789 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Releasing lock "refresh_cache-2a8ebb32-a776-4437-8643-49acdc43be2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.789 2 DEBUG nova.compute.manager [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Instance network_info: |[{"id": "63749ef2-bb12-4521-93f2-5314940b99f4", "address": "fa:16:3e:80:a6:b6", "network": {"id": "9b3e5364-0567-4be5-b771-728ed7dd0ab7", "bridge": "br-int", "label": "tempest-TestServerMultinode-1015877619-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e08031c685814456aa6c43bcc8f98574", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63749ef2-bb", "ovs_interfaceid": "63749ef2-bb12-4521-93f2-5314940b99f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.790 2 DEBUG oslo_concurrency.lockutils [req-a34f0363-aa6e-4805-bfbd-9b94516e2cdf req-733a046e-b550-4840-b416-537ba54403c7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-2a8ebb32-a776-4437-8643-49acdc43be2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.790 2 DEBUG nova.network.neutron [req-a34f0363-aa6e-4805-bfbd-9b94516e2cdf req-733a046e-b550-4840-b416-537ba54403c7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Refreshing network info cache for port 63749ef2-bb12-4521-93f2-5314940b99f4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.793 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Start _get_guest_xml network_info=[{"id": "63749ef2-bb12-4521-93f2-5314940b99f4", "address": "fa:16:3e:80:a6:b6", "network": {"id": "9b3e5364-0567-4be5-b771-728ed7dd0ab7", "bridge": "br-int", "label": "tempest-TestServerMultinode-1015877619-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e08031c685814456aa6c43bcc8f98574", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63749ef2-bb", "ovs_interfaceid": "63749ef2-bb12-4521-93f2-5314940b99f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.797 2 WARNING nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.802 2 DEBUG nova.virt.libvirt.host [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.803 2 DEBUG nova.virt.libvirt.host [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.806 2 DEBUG nova.virt.libvirt.host [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.807 2 DEBUG nova.virt.libvirt.host [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.808 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.808 2 DEBUG nova.virt.hardware [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.808 2 DEBUG nova.virt.hardware [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.809 2 DEBUG nova.virt.hardware [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.809 2 DEBUG nova.virt.hardware [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.809 2 DEBUG nova.virt.hardware [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.809 2 DEBUG nova.virt.hardware [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.810 2 DEBUG nova.virt.hardware [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.810 2 DEBUG nova.virt.hardware [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.810 2 DEBUG nova.virt.hardware [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.810 2 DEBUG nova.virt.hardware [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.811 2 DEBUG nova.virt.hardware [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:57:36 np0005465988 nova_compute[236126]: 2025-10-02 12:57:36.813 2 DEBUG oslo_concurrency.processutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:57:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:57:37 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/886648033' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:57:37 np0005465988 nova_compute[236126]: 2025-10-02 12:57:37.277 2 DEBUG oslo_concurrency.processutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:57:37 np0005465988 nova_compute[236126]: 2025-10-02 12:57:37.307 2 DEBUG nova.storage.rbd_utils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] rbd image 2a8ebb32-a776-4437-8643-49acdc43be2f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:57:37 np0005465988 nova_compute[236126]: 2025-10-02 12:57:37.312 2 DEBUG oslo_concurrency.processutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:57:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:37.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:37 np0005465988 nova_compute[236126]: 2025-10-02 12:57:37.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:57:37 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2248987543' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:57:37 np0005465988 nova_compute[236126]: 2025-10-02 12:57:37.953 2 DEBUG oslo_concurrency.processutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.641s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:57:37 np0005465988 nova_compute[236126]: 2025-10-02 12:57:37.955 2 DEBUG nova.virt.libvirt.vif [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:57:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-2106045067',display_name='tempest-TestServerMultinode-server-2106045067',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-2106045067',id=192,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8668725b86704fdcacbb467738b51154',ramdisk_id='',reservation_id='r-f0fjq1jc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1785572191',owner_user_name='tempest-TestServerMultinode-17855
72191-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:57:29Z,user_data=None,user_id='de066041e985417da95924c04915bd11',uuid=2a8ebb32-a776-4437-8643-49acdc43be2f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63749ef2-bb12-4521-93f2-5314940b99f4", "address": "fa:16:3e:80:a6:b6", "network": {"id": "9b3e5364-0567-4be5-b771-728ed7dd0ab7", "bridge": "br-int", "label": "tempest-TestServerMultinode-1015877619-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e08031c685814456aa6c43bcc8f98574", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63749ef2-bb", "ovs_interfaceid": "63749ef2-bb12-4521-93f2-5314940b99f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:57:37 np0005465988 nova_compute[236126]: 2025-10-02 12:57:37.955 2 DEBUG nova.network.os_vif_util [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Converting VIF {"id": "63749ef2-bb12-4521-93f2-5314940b99f4", "address": "fa:16:3e:80:a6:b6", "network": {"id": "9b3e5364-0567-4be5-b771-728ed7dd0ab7", "bridge": "br-int", "label": "tempest-TestServerMultinode-1015877619-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e08031c685814456aa6c43bcc8f98574", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63749ef2-bb", "ovs_interfaceid": "63749ef2-bb12-4521-93f2-5314940b99f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:57:37 np0005465988 nova_compute[236126]: 2025-10-02 12:57:37.956 2 DEBUG nova.network.os_vif_util [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:a6:b6,bridge_name='br-int',has_traffic_filtering=True,id=63749ef2-bb12-4521-93f2-5314940b99f4,network=Network(9b3e5364-0567-4be5-b771-728ed7dd0ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63749ef2-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:57:37 np0005465988 nova_compute[236126]: 2025-10-02 12:57:37.957 2 DEBUG nova.objects.instance [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2a8ebb32-a776-4437-8643-49acdc43be2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.021 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  <uuid>2a8ebb32-a776-4437-8643-49acdc43be2f</uuid>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  <name>instance-000000c0</name>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestServerMultinode-server-2106045067</nova:name>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:57:36</nova:creationTime>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <nova:user uuid="de066041e985417da95924c04915bd11">tempest-TestServerMultinode-1785572191-project-admin</nova:user>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <nova:project uuid="8668725b86704fdcacbb467738b51154">tempest-TestServerMultinode-1785572191</nova:project>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <nova:port uuid="63749ef2-bb12-4521-93f2-5314940b99f4">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <entry name="serial">2a8ebb32-a776-4437-8643-49acdc43be2f</entry>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <entry name="uuid">2a8ebb32-a776-4437-8643-49acdc43be2f</entry>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/2a8ebb32-a776-4437-8643-49acdc43be2f_disk">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/2a8ebb32-a776-4437-8643-49acdc43be2f_disk.config">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:80:a6:b6"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <target dev="tap63749ef2-bb"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/2a8ebb32-a776-4437-8643-49acdc43be2f/console.log" append="off"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:57:38 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:57:38 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:57:38 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:57:38 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.023 2 DEBUG nova.compute.manager [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Preparing to wait for external event network-vif-plugged-63749ef2-bb12-4521-93f2-5314940b99f4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.023 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Acquiring lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.023 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.024 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.024 2 DEBUG nova.virt.libvirt.vif [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:57:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-2106045067',display_name='tempest-TestServerMultinode-server-2106045067',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-2106045067',id=192,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8668725b86704fdcacbb467738b51154',ramdisk_id='',reservation_id='r-f0fjq1jc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1785572191',owner_user_name='tempest-TestServerMulti
node-1785572191-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:57:29Z,user_data=None,user_id='de066041e985417da95924c04915bd11',uuid=2a8ebb32-a776-4437-8643-49acdc43be2f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63749ef2-bb12-4521-93f2-5314940b99f4", "address": "fa:16:3e:80:a6:b6", "network": {"id": "9b3e5364-0567-4be5-b771-728ed7dd0ab7", "bridge": "br-int", "label": "tempest-TestServerMultinode-1015877619-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e08031c685814456aa6c43bcc8f98574", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63749ef2-bb", "ovs_interfaceid": "63749ef2-bb12-4521-93f2-5314940b99f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.025 2 DEBUG nova.network.os_vif_util [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Converting VIF {"id": "63749ef2-bb12-4521-93f2-5314940b99f4", "address": "fa:16:3e:80:a6:b6", "network": {"id": "9b3e5364-0567-4be5-b771-728ed7dd0ab7", "bridge": "br-int", "label": "tempest-TestServerMultinode-1015877619-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e08031c685814456aa6c43bcc8f98574", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63749ef2-bb", "ovs_interfaceid": "63749ef2-bb12-4521-93f2-5314940b99f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.025 2 DEBUG nova.network.os_vif_util [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:a6:b6,bridge_name='br-int',has_traffic_filtering=True,id=63749ef2-bb12-4521-93f2-5314940b99f4,network=Network(9b3e5364-0567-4be5-b771-728ed7dd0ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63749ef2-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.026 2 DEBUG os_vif [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:a6:b6,bridge_name='br-int',has_traffic_filtering=True,id=63749ef2-bb12-4521-93f2-5314940b99f4,network=Network(9b3e5364-0567-4be5-b771-728ed7dd0ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63749ef2-bb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.027 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.027 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.032 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63749ef2-bb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.032 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap63749ef2-bb, col_values=(('external_ids', {'iface-id': '63749ef2-bb12-4521-93f2-5314940b99f4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:a6:b6', 'vm-uuid': '2a8ebb32-a776-4437-8643-49acdc43be2f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:38 np0005465988 NetworkManager[45041]: <info>  [1759409858.0360] manager: (tap63749ef2-bb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/371)
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.043 2 INFO os_vif [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:a6:b6,bridge_name='br-int',has_traffic_filtering=True,id=63749ef2-bb12-4521-93f2-5314940b99f4,network=Network(9b3e5364-0567-4be5-b771-728ed7dd0ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63749ef2-bb')#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.168 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.170 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.172 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] No VIF found with MAC fa:16:3e:80:a6:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.172 2 INFO nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Using config drive#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.198 2 DEBUG nova.storage.rbd_utils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] rbd image 2a8ebb32-a776-4437-8643-49acdc43be2f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:57:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:38.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.786 2 INFO nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Creating config drive at /var/lib/nova/instances/2a8ebb32-a776-4437-8643-49acdc43be2f/disk.config#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.791 2 DEBUG oslo_concurrency.processutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2a8ebb32-a776-4437-8643-49acdc43be2f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnsq4yww5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.951 2 DEBUG oslo_concurrency.processutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2a8ebb32-a776-4437-8643-49acdc43be2f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnsq4yww5" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.986 2 DEBUG nova.storage.rbd_utils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] rbd image 2a8ebb32-a776-4437-8643-49acdc43be2f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:57:38 np0005465988 nova_compute[236126]: 2025-10-02 12:57:38.990 2 DEBUG oslo_concurrency.processutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2a8ebb32-a776-4437-8643-49acdc43be2f/disk.config 2a8ebb32-a776-4437-8643-49acdc43be2f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:57:39 np0005465988 nova_compute[236126]: 2025-10-02 12:57:39.314 2 DEBUG oslo_concurrency.processutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2a8ebb32-a776-4437-8643-49acdc43be2f/disk.config 2a8ebb32-a776-4437-8643-49acdc43be2f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.324s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:57:39 np0005465988 nova_compute[236126]: 2025-10-02 12:57:39.316 2 INFO nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Deleting local config drive /var/lib/nova/instances/2a8ebb32-a776-4437-8643-49acdc43be2f/disk.config because it was imported into RBD.#033[00m
Oct  2 08:57:39 np0005465988 NetworkManager[45041]: <info>  [1759409859.3635] manager: (tap63749ef2-bb): new Tun device (/org/freedesktop/NetworkManager/Devices/372)
Oct  2 08:57:39 np0005465988 kernel: tap63749ef2-bb: entered promiscuous mode
Oct  2 08:57:39 np0005465988 nova_compute[236126]: 2025-10-02 12:57:39.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:39 np0005465988 ovn_controller[132601]: 2025-10-02T12:57:39Z|00858|binding|INFO|Claiming lport 63749ef2-bb12-4521-93f2-5314940b99f4 for this chassis.
Oct  2 08:57:39 np0005465988 ovn_controller[132601]: 2025-10-02T12:57:39Z|00859|binding|INFO|63749ef2-bb12-4521-93f2-5314940b99f4: Claiming fa:16:3e:80:a6:b6 10.100.0.5
Oct  2 08:57:39 np0005465988 systemd-udevd[324343]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:57:39 np0005465988 NetworkManager[45041]: <info>  [1759409859.4070] device (tap63749ef2-bb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:57:39 np0005465988 NetworkManager[45041]: <info>  [1759409859.4085] device (tap63749ef2-bb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:57:39 np0005465988 systemd-machined[192594]: New machine qemu-89-instance-000000c0.
Oct  2 08:57:39 np0005465988 systemd[1]: Started Virtual Machine qemu-89-instance-000000c0.
Oct  2 08:57:39 np0005465988 nova_compute[236126]: 2025-10-02 12:57:39.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:39 np0005465988 ovn_controller[132601]: 2025-10-02T12:57:39Z|00860|binding|INFO|Setting lport 63749ef2-bb12-4521-93f2-5314940b99f4 ovn-installed in OVS
Oct  2 08:57:39 np0005465988 ovn_controller[132601]: 2025-10-02T12:57:39Z|00861|binding|INFO|Setting lport 63749ef2-bb12-4521-93f2-5314940b99f4 up in Southbound
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.462 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:a6:b6 10.100.0.5'], port_security=['fa:16:3e:80:a6:b6 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2a8ebb32-a776-4437-8643-49acdc43be2f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b3e5364-0567-4be5-b771-728ed7dd0ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8668725b86704fdcacbb467738b51154', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9426c89d-e30e-4342-a8bd-1975c70a0c71', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e55754b-f304-4904-b3bf-7f80f94cdc02, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=63749ef2-bb12-4521-93f2-5314940b99f4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.465 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 63749ef2-bb12-4521-93f2-5314940b99f4 in datapath 9b3e5364-0567-4be5-b771-728ed7dd0ab7 bound to our chassis#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.467 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9b3e5364-0567-4be5-b771-728ed7dd0ab7#033[00m
Oct  2 08:57:39 np0005465988 nova_compute[236126]: 2025-10-02 12:57:39.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.484 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[61bb23c9-c71a-47f7-b9bd-b694c72b253a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.485 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9b3e5364-01 in ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.487 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9b3e5364-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.487 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2e422d9a-ff28-45fa-86c5-3ce7ede52baf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.488 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[460a3ef1-5f60-45e2-a427-210d489c856e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.500 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[4efba8ae-84b0-4aab-bcc2-4fca798f9c6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.518 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[18470478-f839-4bb9-99ce-253e52fc853e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.550 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5f9f955b-9dbe-43ed-bf35-a573814be414]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.557 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b1e44a7b-b586-4394-ba89-d9d3c586f5e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 NetworkManager[45041]: <info>  [1759409859.5585] manager: (tap9b3e5364-00): new Veth device (/org/freedesktop/NetworkManager/Devices/373)
Oct  2 08:57:39 np0005465988 systemd-udevd[324347]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.592 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[1da7df93-da39-4e96-a29b-24279a3dab70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.596 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[bc178ca7-38a0-45c6-93c4-76abbfe45c6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 NetworkManager[45041]: <info>  [1759409859.6180] device (tap9b3e5364-00): carrier: link connected
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.623 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[6f1ca70a-7fa5-48fc-b4fd-1702dbd0b22b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.644 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c7f070a8-d4fb-4c8d-8d36-6551f86ed9c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b3e5364-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:74:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 249], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 791994, 'reachable_time': 43758, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 324378, 'error': None, 'target': 'ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.666 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[769a65c8-0006-4b33-9c90-f116d48e82ca]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef6:7417'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 791994, 'tstamp': 791994}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 324379, 'error': None, 'target': 'ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.682 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d3498a82-2c35-4a81-b986-a7d1355fb0dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b3e5364-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:74:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 249], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 791994, 'reachable_time': 43758, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 324380, 'error': None, 'target': 'ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.716 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[182d7acf-1995-48f2-91a1-95335a08ce71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 nova_compute[236126]: 2025-10-02 12:57:39.768 2 DEBUG nova.network.neutron [req-a34f0363-aa6e-4805-bfbd-9b94516e2cdf req-733a046e-b550-4840-b416-537ba54403c7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Updated VIF entry in instance network info cache for port 63749ef2-bb12-4521-93f2-5314940b99f4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:57:39 np0005465988 nova_compute[236126]: 2025-10-02 12:57:39.769 2 DEBUG nova.network.neutron [req-a34f0363-aa6e-4805-bfbd-9b94516e2cdf req-733a046e-b550-4840-b416-537ba54403c7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Updating instance_info_cache with network_info: [{"id": "63749ef2-bb12-4521-93f2-5314940b99f4", "address": "fa:16:3e:80:a6:b6", "network": {"id": "9b3e5364-0567-4be5-b771-728ed7dd0ab7", "bridge": "br-int", "label": "tempest-TestServerMultinode-1015877619-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e08031c685814456aa6c43bcc8f98574", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63749ef2-bb", "ovs_interfaceid": "63749ef2-bb12-4521-93f2-5314940b99f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.794 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8c06a02a-31c3-46bf-b3bd-dfcf7a20f32a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.797 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b3e5364-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.797 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.798 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b3e5364-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:57:39 np0005465988 NetworkManager[45041]: <info>  [1759409859.8018] manager: (tap9b3e5364-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/374)
Oct  2 08:57:39 np0005465988 kernel: tap9b3e5364-00: entered promiscuous mode
Oct  2 08:57:39 np0005465988 nova_compute[236126]: 2025-10-02 12:57:39.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.804 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9b3e5364-00, col_values=(('external_ids', {'iface-id': '6e4127f6-98a7-4fa7-9ba7-1b632af1bcf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:57:39 np0005465988 nova_compute[236126]: 2025-10-02 12:57:39.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:39 np0005465988 ovn_controller[132601]: 2025-10-02T12:57:39Z|00862|binding|INFO|Releasing lport 6e4127f6-98a7-4fa7-9ba7-1b632af1bcf6 from this chassis (sb_readonly=0)
Oct  2 08:57:39 np0005465988 nova_compute[236126]: 2025-10-02 12:57:39.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.821 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9b3e5364-0567-4be5-b771-728ed7dd0ab7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9b3e5364-0567-4be5-b771-728ed7dd0ab7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:57:39 np0005465988 nova_compute[236126]: 2025-10-02 12:57:39.822 2 DEBUG oslo_concurrency.lockutils [req-a34f0363-aa6e-4805-bfbd-9b94516e2cdf req-733a046e-b550-4840-b416-537ba54403c7 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-2a8ebb32-a776-4437-8643-49acdc43be2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.823 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7045c832-63c3-4c94-8ffa-5abb9fded79f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.825 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-9b3e5364-0567-4be5-b771-728ed7dd0ab7
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/9b3e5364-0567-4be5-b771-728ed7dd0ab7.pid.haproxy
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 9b3e5364-0567-4be5-b771-728ed7dd0ab7
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:57:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:39.828 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7', 'env', 'PROCESS_TAG=haproxy-9b3e5364-0567-4be5-b771-728ed7dd0ab7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9b3e5364-0567-4be5-b771-728ed7dd0ab7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:57:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:57:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:39.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.042 2 DEBUG nova.compute.manager [req-0b089136-bb69-4c61-ba52-50818267f9a5 req-107905bb-22f7-41b2-a3d1-d4c92797900e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Received event network-vif-plugged-63749ef2-bb12-4521-93f2-5314940b99f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.043 2 DEBUG oslo_concurrency.lockutils [req-0b089136-bb69-4c61-ba52-50818267f9a5 req-107905bb-22f7-41b2-a3d1-d4c92797900e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.043 2 DEBUG oslo_concurrency.lockutils [req-0b089136-bb69-4c61-ba52-50818267f9a5 req-107905bb-22f7-41b2-a3d1-d4c92797900e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.044 2 DEBUG oslo_concurrency.lockutils [req-0b089136-bb69-4c61-ba52-50818267f9a5 req-107905bb-22f7-41b2-a3d1-d4c92797900e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.044 2 DEBUG nova.compute.manager [req-0b089136-bb69-4c61-ba52-50818267f9a5 req-107905bb-22f7-41b2-a3d1-d4c92797900e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Processing event network-vif-plugged-63749ef2-bb12-4521-93f2-5314940b99f4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:40 np0005465988 podman[324445]: 2025-10-02 12:57:40.195220611 +0000 UTC m=+0.024406324 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:57:40 np0005465988 podman[324445]: 2025-10-02 12:57:40.29727203 +0000 UTC m=+0.126457713 container create 97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:57:40 np0005465988 systemd[1]: Started libpod-conmon-97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed.scope.
Oct  2 08:57:40 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:57:40 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f452be26c069ba1aedb321a24f96d68dfb922b9ced1fde552e7c260f658cd8df/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:57:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:57:40 np0005465988 podman[324445]: 2025-10-02 12:57:40.546964252 +0000 UTC m=+0.376150025 container init 97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:57:40 np0005465988 podman[324445]: 2025-10-02 12:57:40.5580044 +0000 UTC m=+0.387190113 container start 97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  2 08:57:40 np0005465988 neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7[324466]: [NOTICE]   (324470) : New worker (324472) forked
Oct  2 08:57:40 np0005465988 neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7[324466]: [NOTICE]   (324470) : Loading success.
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.725 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409860.7246702, 2a8ebb32-a776-4437-8643-49acdc43be2f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.725 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] VM Started (Lifecycle Event)#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.728 2 DEBUG nova.compute.manager [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.733 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:57:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:40.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.738 2 INFO nova.virt.libvirt.driver [-] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Instance spawned successfully.#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.739 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.747 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.767 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.772 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.772 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.773 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.773 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.774 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.774 2 DEBUG nova.virt.libvirt.driver [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.808 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.809 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409860.725735, 2a8ebb32-a776-4437-8643-49acdc43be2f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.809 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.836 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.840 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409860.7306693, 2a8ebb32-a776-4437-8643-49acdc43be2f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.841 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.852 2 INFO nova.compute.manager [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Took 11.13 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.852 2 DEBUG nova.compute.manager [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.860 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.863 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.885 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.938 2 INFO nova.compute.manager [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Took 12.72 seconds to build instance.#033[00m
Oct  2 08:57:40 np0005465988 nova_compute[236126]: 2025-10-02 12:57:40.966 2 DEBUG oslo_concurrency.lockutils [None req-064146b3-645b-4147-acd1-73b7a0044efb de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.859s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:41.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:42 np0005465988 nova_compute[236126]: 2025-10-02 12:57:42.174 2 DEBUG nova.compute.manager [req-3525c8c9-3951-41d5-a33b-53176bf8fbe5 req-8611cc9a-db0f-4268-9fa8-322d43c11465 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Received event network-vif-plugged-63749ef2-bb12-4521-93f2-5314940b99f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:57:42 np0005465988 nova_compute[236126]: 2025-10-02 12:57:42.175 2 DEBUG oslo_concurrency.lockutils [req-3525c8c9-3951-41d5-a33b-53176bf8fbe5 req-8611cc9a-db0f-4268-9fa8-322d43c11465 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:42 np0005465988 nova_compute[236126]: 2025-10-02 12:57:42.176 2 DEBUG oslo_concurrency.lockutils [req-3525c8c9-3951-41d5-a33b-53176bf8fbe5 req-8611cc9a-db0f-4268-9fa8-322d43c11465 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:42 np0005465988 nova_compute[236126]: 2025-10-02 12:57:42.176 2 DEBUG oslo_concurrency.lockutils [req-3525c8c9-3951-41d5-a33b-53176bf8fbe5 req-8611cc9a-db0f-4268-9fa8-322d43c11465 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:42 np0005465988 nova_compute[236126]: 2025-10-02 12:57:42.176 2 DEBUG nova.compute.manager [req-3525c8c9-3951-41d5-a33b-53176bf8fbe5 req-8611cc9a-db0f-4268-9fa8-322d43c11465 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] No waiting events found dispatching network-vif-plugged-63749ef2-bb12-4521-93f2-5314940b99f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:57:42 np0005465988 nova_compute[236126]: 2025-10-02 12:57:42.177 2 WARNING nova.compute.manager [req-3525c8c9-3951-41d5-a33b-53176bf8fbe5 req-8611cc9a-db0f-4268-9fa8-322d43c11465 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Received unexpected event network-vif-plugged-63749ef2-bb12-4521-93f2-5314940b99f4 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:57:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:42.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:43 np0005465988 nova_compute[236126]: 2025-10-02 12:57:43.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:43.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:44.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:44 np0005465988 nova_compute[236126]: 2025-10-02 12:57:44.897 2 DEBUG oslo_concurrency.lockutils [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Acquiring lock "2a8ebb32-a776-4437-8643-49acdc43be2f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:44 np0005465988 nova_compute[236126]: 2025-10-02 12:57:44.898 2 DEBUG oslo_concurrency.lockutils [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:44 np0005465988 nova_compute[236126]: 2025-10-02 12:57:44.899 2 DEBUG oslo_concurrency.lockutils [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Acquiring lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:44 np0005465988 nova_compute[236126]: 2025-10-02 12:57:44.899 2 DEBUG oslo_concurrency.lockutils [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:44 np0005465988 nova_compute[236126]: 2025-10-02 12:57:44.899 2 DEBUG oslo_concurrency.lockutils [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:44 np0005465988 nova_compute[236126]: 2025-10-02 12:57:44.901 2 INFO nova.compute.manager [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Terminating instance#033[00m
Oct  2 08:57:44 np0005465988 nova_compute[236126]: 2025-10-02 12:57:44.902 2 DEBUG nova.compute.manager [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:57:45 np0005465988 kernel: tap63749ef2-bb (unregistering): left promiscuous mode
Oct  2 08:57:45 np0005465988 NetworkManager[45041]: <info>  [1759409865.0140] device (tap63749ef2-bb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:57:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:57:45Z|00863|binding|INFO|Releasing lport 63749ef2-bb12-4521-93f2-5314940b99f4 from this chassis (sb_readonly=0)
Oct  2 08:57:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:57:45Z|00864|binding|INFO|Setting lport 63749ef2-bb12-4521-93f2-5314940b99f4 down in Southbound
Oct  2 08:57:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:57:45Z|00865|binding|INFO|Removing iface tap63749ef2-bb ovn-installed in OVS
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:45.033 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:a6:b6 10.100.0.5'], port_security=['fa:16:3e:80:a6:b6 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2a8ebb32-a776-4437-8643-49acdc43be2f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b3e5364-0567-4be5-b771-728ed7dd0ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8668725b86704fdcacbb467738b51154', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9426c89d-e30e-4342-a8bd-1975c70a0c71', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e55754b-f304-4904-b3bf-7f80f94cdc02, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=63749ef2-bb12-4521-93f2-5314940b99f4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:57:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:45.036 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 63749ef2-bb12-4521-93f2-5314940b99f4 in datapath 9b3e5364-0567-4be5-b771-728ed7dd0ab7 unbound from our chassis#033[00m
Oct  2 08:57:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:45.040 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b3e5364-0567-4be5-b771-728ed7dd0ab7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:57:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:45.044 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[38556b81-d75e-43c8-bde9-ff55bf2a078c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:45.046 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7 namespace which is not needed anymore#033[00m
Oct  2 08:57:45 np0005465988 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000c0.scope: Deactivated successfully.
Oct  2 08:57:45 np0005465988 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000c0.scope: Consumed 5.486s CPU time.
Oct  2 08:57:45 np0005465988 systemd-machined[192594]: Machine qemu-89-instance-000000c0 terminated.
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.141 2 INFO nova.virt.libvirt.driver [-] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Instance destroyed successfully.#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.142 2 DEBUG nova.objects.instance [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lazy-loading 'resources' on Instance uuid 2a8ebb32-a776-4437-8643-49acdc43be2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.163 2 DEBUG nova.virt.libvirt.vif [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:57:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-2106045067',display_name='tempest-TestServerMultinode-server-2106045067',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-2106045067',id=192,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:57:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8668725b86704fdcacbb467738b51154',ramdisk_id='',reservation_id='r-f0fjq1jc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1785572191',owner_user_name='tempest-TestServerMultinode-1785572191-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:57:40Z,user_data=None,user_id='de066041e985417da95924c04915bd11',uuid=2a8ebb32-a776-4437-8643-49acdc43be2f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63749ef2-bb12-4521-93f2-5314940b99f4", "address": "fa:16:3e:80:a6:b6", "network": {"id": "9b3e5364-0567-4be5-b771-728ed7dd0ab7", "bridge": "br-int", "label": "tempest-TestServerMultinode-1015877619-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e08031c685814456aa6c43bcc8f98574", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63749ef2-bb", "ovs_interfaceid": "63749ef2-bb12-4521-93f2-5314940b99f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.164 2 DEBUG nova.network.os_vif_util [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Converting VIF {"id": "63749ef2-bb12-4521-93f2-5314940b99f4", "address": "fa:16:3e:80:a6:b6", "network": {"id": "9b3e5364-0567-4be5-b771-728ed7dd0ab7", "bridge": "br-int", "label": "tempest-TestServerMultinode-1015877619-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e08031c685814456aa6c43bcc8f98574", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63749ef2-bb", "ovs_interfaceid": "63749ef2-bb12-4521-93f2-5314940b99f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.165 2 DEBUG nova.network.os_vif_util [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:a6:b6,bridge_name='br-int',has_traffic_filtering=True,id=63749ef2-bb12-4521-93f2-5314940b99f4,network=Network(9b3e5364-0567-4be5-b771-728ed7dd0ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63749ef2-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.166 2 DEBUG os_vif [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:a6:b6,bridge_name='br-int',has_traffic_filtering=True,id=63749ef2-bb12-4521-93f2-5314940b99f4,network=Network(9b3e5364-0567-4be5-b771-728ed7dd0ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63749ef2-bb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.168 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63749ef2-bb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.175 2 INFO os_vif [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:a6:b6,bridge_name='br-int',has_traffic_filtering=True,id=63749ef2-bb12-4521-93f2-5314940b99f4,network=Network(9b3e5364-0567-4be5-b771-728ed7dd0ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63749ef2-bb')#033[00m
Oct  2 08:57:45 np0005465988 neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7[324466]: [NOTICE]   (324470) : haproxy version is 2.8.14-c23fe91
Oct  2 08:57:45 np0005465988 neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7[324466]: [NOTICE]   (324470) : path to executable is /usr/sbin/haproxy
Oct  2 08:57:45 np0005465988 neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7[324466]: [WARNING]  (324470) : Exiting Master process...
Oct  2 08:57:45 np0005465988 neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7[324466]: [WARNING]  (324470) : Exiting Master process...
Oct  2 08:57:45 np0005465988 neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7[324466]: [ALERT]    (324470) : Current worker (324472) exited with code 143 (Terminated)
Oct  2 08:57:45 np0005465988 neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7[324466]: [WARNING]  (324470) : All workers exited. Exiting... (0)
Oct  2 08:57:45 np0005465988 systemd[1]: libpod-97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed.scope: Deactivated successfully.
Oct  2 08:57:45 np0005465988 conmon[324466]: conmon 97f08659a0143766064f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed.scope/container/memory.events
Oct  2 08:57:45 np0005465988 podman[324518]: 2025-10-02 12:57:45.220261468 +0000 UTC m=+0.062247503 container died 97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:45 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed-userdata-shm.mount: Deactivated successfully.
Oct  2 08:57:45 np0005465988 systemd[1]: var-lib-containers-storage-overlay-f452be26c069ba1aedb321a24f96d68dfb922b9ced1fde552e7c260f658cd8df-merged.mount: Deactivated successfully.
Oct  2 08:57:45 np0005465988 podman[324518]: 2025-10-02 12:57:45.281351678 +0000 UTC m=+0.123337723 container cleanup 97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:57:45 np0005465988 systemd[1]: libpod-conmon-97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed.scope: Deactivated successfully.
Oct  2 08:57:45 np0005465988 podman[324566]: 2025-10-02 12:57:45.382206652 +0000 UTC m=+0.065940540 container remove 97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:57:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:45.389 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a84184f6-0727-48ab-add2-3f986bfc2334]: (4, ('Thu Oct  2 12:57:45 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7 (97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed)\n97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed\nThu Oct  2 12:57:45 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7 (97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed)\n97f08659a0143766064f3c9b38449607ea8a23ff61399a7bfb551436933df9ed\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:45.391 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1044bb1e-9322-41d8-9cd0-00fe2e91dd77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:45.392 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b3e5364-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:45 np0005465988 kernel: tap9b3e5364-00: left promiscuous mode
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:45.401 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bb4ef764-5ea8-4733-9a76-0b191b2bf606]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:45.423 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[592209b8-1ba2-4d06-91a5-fa68b4697f64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:45.425 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c74dbaba-a22d-4538-9435-ea9f9807a0e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:45.441 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fc932906-26bb-49b3-9bde-caf660c0210c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 791987, 'reachable_time': 41585, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 324581, 'error': None, 'target': 'ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:45 np0005465988 systemd[1]: run-netns-ovnmeta\x2d9b3e5364\x2d0567\x2d4be5\x2db771\x2d728ed7dd0ab7.mount: Deactivated successfully.
Oct  2 08:57:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:45.446 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9b3e5364-0567-4be5-b771-728ed7dd0ab7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:57:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:45.446 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[db2cb46c-ce81-4534-a4b1-d7a760307b22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:57:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.579 2 DEBUG nova.compute.manager [req-2edd49f6-a6ce-4a6e-a7fc-e9af2063c20b req-d0d39d88-8b64-4b40-b181-1d6e1b329f24 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Received event network-vif-unplugged-63749ef2-bb12-4521-93f2-5314940b99f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.580 2 DEBUG oslo_concurrency.lockutils [req-2edd49f6-a6ce-4a6e-a7fc-e9af2063c20b req-d0d39d88-8b64-4b40-b181-1d6e1b329f24 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.581 2 DEBUG oslo_concurrency.lockutils [req-2edd49f6-a6ce-4a6e-a7fc-e9af2063c20b req-d0d39d88-8b64-4b40-b181-1d6e1b329f24 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.582 2 DEBUG oslo_concurrency.lockutils [req-2edd49f6-a6ce-4a6e-a7fc-e9af2063c20b req-d0d39d88-8b64-4b40-b181-1d6e1b329f24 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.582 2 DEBUG nova.compute.manager [req-2edd49f6-a6ce-4a6e-a7fc-e9af2063c20b req-d0d39d88-8b64-4b40-b181-1d6e1b329f24 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] No waiting events found dispatching network-vif-unplugged-63749ef2-bb12-4521-93f2-5314940b99f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.583 2 DEBUG nova.compute.manager [req-2edd49f6-a6ce-4a6e-a7fc-e9af2063c20b req-d0d39d88-8b64-4b40-b181-1d6e1b329f24 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Received event network-vif-unplugged-63749ef2-bb12-4521-93f2-5314940b99f4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.626 2 INFO nova.virt.libvirt.driver [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Deleting instance files /var/lib/nova/instances/2a8ebb32-a776-4437-8643-49acdc43be2f_del#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.627 2 INFO nova.virt.libvirt.driver [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Deletion of /var/lib/nova/instances/2a8ebb32-a776-4437-8643-49acdc43be2f_del complete#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.701 2 INFO nova.compute.manager [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Took 0.80 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.702 2 DEBUG oslo.service.loopingcall [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.703 2 DEBUG nova.compute.manager [-] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:57:45 np0005465988 nova_compute[236126]: 2025-10-02 12:57:45.703 2 DEBUG nova.network.neutron [-] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:57:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:45.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:46 np0005465988 nova_compute[236126]: 2025-10-02 12:57:46.610 2 DEBUG nova.network.neutron [-] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:57:46 np0005465988 nova_compute[236126]: 2025-10-02 12:57:46.629 2 INFO nova.compute.manager [-] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Took 0.93 seconds to deallocate network for instance.#033[00m
Oct  2 08:57:46 np0005465988 nova_compute[236126]: 2025-10-02 12:57:46.699 2 DEBUG oslo_concurrency.lockutils [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:46 np0005465988 nova_compute[236126]: 2025-10-02 12:57:46.700 2 DEBUG oslo_concurrency.lockutils [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:46.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:46 np0005465988 nova_compute[236126]: 2025-10-02 12:57:46.770 2 DEBUG oslo_concurrency.processutils [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:57:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:57:47 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1917175509' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:57:47 np0005465988 nova_compute[236126]: 2025-10-02 12:57:47.270 2 DEBUG oslo_concurrency.processutils [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:57:47 np0005465988 nova_compute[236126]: 2025-10-02 12:57:47.279 2 DEBUG nova.compute.provider_tree [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:57:47 np0005465988 nova_compute[236126]: 2025-10-02 12:57:47.305 2 DEBUG nova.scheduler.client.report [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:57:47 np0005465988 nova_compute[236126]: 2025-10-02 12:57:47.340 2 DEBUG oslo_concurrency.lockutils [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:47 np0005465988 nova_compute[236126]: 2025-10-02 12:57:47.384 2 INFO nova.scheduler.client.report [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Deleted allocations for instance 2a8ebb32-a776-4437-8643-49acdc43be2f#033[00m
Oct  2 08:57:47 np0005465988 nova_compute[236126]: 2025-10-02 12:57:47.453 2 DEBUG oslo_concurrency.lockutils [None req-8e8881a9-9f0d-4cc1-9bba-3d1dc914ec72 de066041e985417da95924c04915bd11 8668725b86704fdcacbb467738b51154 - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.555s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:47 np0005465988 nova_compute[236126]: 2025-10-02 12:57:47.687 2 DEBUG nova.compute.manager [req-1f27b394-657e-45bd-aa18-cc12ede2052b req-7cec72da-0919-4d3f-940d-85bb7b525344 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Received event network-vif-plugged-63749ef2-bb12-4521-93f2-5314940b99f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:57:47 np0005465988 nova_compute[236126]: 2025-10-02 12:57:47.688 2 DEBUG oslo_concurrency.lockutils [req-1f27b394-657e-45bd-aa18-cc12ede2052b req-7cec72da-0919-4d3f-940d-85bb7b525344 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:47 np0005465988 nova_compute[236126]: 2025-10-02 12:57:47.689 2 DEBUG oslo_concurrency.lockutils [req-1f27b394-657e-45bd-aa18-cc12ede2052b req-7cec72da-0919-4d3f-940d-85bb7b525344 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:47 np0005465988 nova_compute[236126]: 2025-10-02 12:57:47.689 2 DEBUG oslo_concurrency.lockutils [req-1f27b394-657e-45bd-aa18-cc12ede2052b req-7cec72da-0919-4d3f-940d-85bb7b525344 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "2a8ebb32-a776-4437-8643-49acdc43be2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:47 np0005465988 nova_compute[236126]: 2025-10-02 12:57:47.689 2 DEBUG nova.compute.manager [req-1f27b394-657e-45bd-aa18-cc12ede2052b req-7cec72da-0919-4d3f-940d-85bb7b525344 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] No waiting events found dispatching network-vif-plugged-63749ef2-bb12-4521-93f2-5314940b99f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:57:47 np0005465988 nova_compute[236126]: 2025-10-02 12:57:47.690 2 WARNING nova.compute.manager [req-1f27b394-657e-45bd-aa18-cc12ede2052b req-7cec72da-0919-4d3f-940d-85bb7b525344 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Received unexpected event network-vif-plugged-63749ef2-bb12-4521-93f2-5314940b99f4 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:57:47 np0005465988 nova_compute[236126]: 2025-10-02 12:57:47.690 2 DEBUG nova.compute.manager [req-1f27b394-657e-45bd-aa18-cc12ede2052b req-7cec72da-0919-4d3f-940d-85bb7b525344 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Received event network-vif-deleted-63749ef2-bb12-4521-93f2-5314940b99f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:57:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:47.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:48.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:49 np0005465988 podman[324607]: 2025-10-02 12:57:49.525828912 +0000 UTC m=+0.066812605 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  2 08:57:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:57:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:49.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:57:50 np0005465988 nova_compute[236126]: 2025-10-02 12:57:50.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:50 np0005465988 nova_compute[236126]: 2025-10-02 12:57:50.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:57:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:50.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:51.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:52.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:57:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:53.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:57:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:54.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:55 np0005465988 nova_compute[236126]: 2025-10-02 12:57:55.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:55 np0005465988 nova_compute[236126]: 2025-10-02 12:57:55.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:57:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:57:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:55.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:57:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:56.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:57.673 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:57:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:57:57.675 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:57:57 np0005465988 nova_compute[236126]: 2025-10-02 12:57:57.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:57:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:57.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:57:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:57:58.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:57:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:57:59 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/266658596' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:57:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:57:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:57:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:57:59.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:58:00 np0005465988 nova_compute[236126]: 2025-10-02 12:58:00.141 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409865.1383517, 2a8ebb32-a776-4437-8643-49acdc43be2f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:58:00 np0005465988 nova_compute[236126]: 2025-10-02 12:58:00.141 2 INFO nova.compute.manager [-] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:58:00 np0005465988 nova_compute[236126]: 2025-10-02 12:58:00.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:00 np0005465988 nova_compute[236126]: 2025-10-02 12:58:00.205 2 DEBUG nova.compute.manager [None req-163033ee-fdd5-4785-949a-13094847d5d5 - - - - - -] [instance: 2a8ebb32-a776-4437-8643-49acdc43be2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:58:00 np0005465988 nova_compute[236126]: 2025-10-02 12:58:00.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:58:00 np0005465988 podman[324684]: 2025-10-02 12:58:00.538603532 +0000 UTC m=+0.068750771 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:58:00 np0005465988 podman[324685]: 2025-10-02 12:58:00.564477048 +0000 UTC m=+0.092005791 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:58:00 np0005465988 podman[324683]: 2025-10-02 12:58:00.591601809 +0000 UTC m=+0.127610806 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:58:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:00.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:01.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:02.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:58:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:03.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:58:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:04.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:05 np0005465988 nova_compute[236126]: 2025-10-02 12:58:05.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:05 np0005465988 nova_compute[236126]: 2025-10-02 12:58:05.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #145. Immutable memtables: 0.
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.625823) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 91] Flushing memtable with next log file: 145
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409885625919, "job": 91, "event": "flush_started", "num_memtables": 1, "num_entries": 920, "num_deletes": 251, "total_data_size": 1773220, "memory_usage": 1792944, "flush_reason": "Manual Compaction"}
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 91] Level-0 flush table #146: started
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409885641148, "cf_name": "default", "job": 91, "event": "table_file_creation", "file_number": 146, "file_size": 1158332, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 70919, "largest_seqno": 71834, "table_properties": {"data_size": 1154084, "index_size": 1899, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 9719, "raw_average_key_size": 19, "raw_value_size": 1145551, "raw_average_value_size": 2342, "num_data_blocks": 84, "num_entries": 489, "num_filter_entries": 489, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759409824, "oldest_key_time": 1759409824, "file_creation_time": 1759409885, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 146, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 91] Flush lasted 15372 microseconds, and 6843 cpu microseconds.
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.641206) [db/flush_job.cc:967] [default] [JOB 91] Level-0 flush table #146: 1158332 bytes OK
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.641233) [db/memtable_list.cc:519] [default] Level-0 commit table #146 started
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.648179) [db/memtable_list.cc:722] [default] Level-0 commit table #146: memtable #1 done
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.648207) EVENT_LOG_v1 {"time_micros": 1759409885648199, "job": 91, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.648231) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 91] Try to delete WAL files size 1768569, prev total WAL file size 1768569, number of live WAL files 2.
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000142.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.649106) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036303234' seq:72057594037927935, type:22 .. '7061786F730036323736' seq:0, type:0; will stop at (end)
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 92] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 91 Base level 0, inputs: [146(1131KB)], [144(10MB)]
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409885649181, "job": 92, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [146], "files_L6": [144], "score": -1, "input_data_size": 12143282, "oldest_snapshot_seqno": -1}
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 92] Generated table #147: 9076 keys, 10251387 bytes, temperature: kUnknown
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409885751727, "cf_name": "default", "job": 92, "event": "table_file_creation", "file_number": 147, "file_size": 10251387, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10194666, "index_size": 32928, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22725, "raw_key_size": 240575, "raw_average_key_size": 26, "raw_value_size": 10037151, "raw_average_value_size": 1105, "num_data_blocks": 1244, "num_entries": 9076, "num_filter_entries": 9076, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759409885, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 147, "seqno_to_time_mapping": "N/A"}}
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.751981) [db/compaction/compaction_job.cc:1663] [default] [JOB 92] Compacted 1@0 + 1@6 files to L6 => 10251387 bytes
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.754268) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 118.3 rd, 99.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 10.5 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(19.3) write-amplify(8.9) OK, records in: 9592, records dropped: 516 output_compression: NoCompression
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.754307) EVENT_LOG_v1 {"time_micros": 1759409885754289, "job": 92, "event": "compaction_finished", "compaction_time_micros": 102614, "compaction_time_cpu_micros": 33690, "output_level": 6, "num_output_files": 1, "total_output_size": 10251387, "num_input_records": 9592, "num_output_records": 9076, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000146.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409885754924, "job": 92, "event": "table_file_deletion", "file_number": 146}
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000144.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759409885759253, "job": 92, "event": "table_file_deletion", "file_number": 144}
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.648964) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.759516) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.759525) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.759527) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.759530) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:58:05 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-12:58:05.759531) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 08:58:05 np0005465988 ovn_controller[132601]: 2025-10-02T12:58:05Z|00866|binding|INFO|Releasing lport 72e5faac-9d73-42b2-89ed-1f386e556cc7 from this chassis (sb_readonly=0)
Oct  2 08:58:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:05.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:05 np0005465988 nova_compute[236126]: 2025-10-02 12:58:05.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:06.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:07.677 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:58:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:07.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:08.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:09 np0005465988 nova_compute[236126]: 2025-10-02 12:58:09.817 2 DEBUG oslo_concurrency.lockutils [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "bcc081e3-b47c-4963-b0e1-1aff13e929de" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:58:09 np0005465988 nova_compute[236126]: 2025-10-02 12:58:09.818 2 DEBUG oslo_concurrency.lockutils [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:58:09 np0005465988 nova_compute[236126]: 2025-10-02 12:58:09.818 2 DEBUG oslo_concurrency.lockutils [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:58:09 np0005465988 nova_compute[236126]: 2025-10-02 12:58:09.819 2 DEBUG oslo_concurrency.lockutils [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:58:09 np0005465988 nova_compute[236126]: 2025-10-02 12:58:09.819 2 DEBUG oslo_concurrency.lockutils [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:58:09 np0005465988 nova_compute[236126]: 2025-10-02 12:58:09.820 2 INFO nova.compute.manager [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Terminating instance#033[00m
Oct  2 08:58:09 np0005465988 nova_compute[236126]: 2025-10-02 12:58:09.821 2 DEBUG nova.compute.manager [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:58:09 np0005465988 kernel: tapdd846c20-9d (unregistering): left promiscuous mode
Oct  2 08:58:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:09.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:09 np0005465988 NetworkManager[45041]: <info>  [1759409889.9053] device (tapdd846c20-9d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:58:09 np0005465988 nova_compute[236126]: 2025-10-02 12:58:09.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:58:09Z|00867|binding|INFO|Releasing lport dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 from this chassis (sb_readonly=0)
Oct  2 08:58:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:58:09Z|00868|binding|INFO|Setting lport dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 down in Southbound
Oct  2 08:58:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:58:09Z|00869|binding|INFO|Removing iface tapdd846c20-9d ovn-installed in OVS
Oct  2 08:58:09 np0005465988 nova_compute[236126]: 2025-10-02 12:58:09.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:09.925 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:70:1b 10.100.0.18'], port_security=['fa:16:3e:f1:70:1b 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'bcc081e3-b47c-4963-b0e1-1aff13e929de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48b23f60-a626-4a95-b154-a764454c451b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'neutron:revision_number': '4', 'neutron:security_group_ids': '12f3d19b-01fe-42e0-ac19-e732ef6c9e33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3dbfe445-7f25-42ca-8688-0a8d6c43ed3f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=dd846c20-9d18-4e15-a8a8-4cccbf14b8a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:58:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:09.926 142124 INFO neutron.agent.ovn.metadata.agent [-] Port dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 in datapath 48b23f60-a626-4a95-b154-a764454c451b unbound from our chassis#033[00m
Oct  2 08:58:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:09.929 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48b23f60-a626-4a95-b154-a764454c451b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:58:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:09.930 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[554e15d2-0eb5-461a-b848-7d7690d518ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:58:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:09.931 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-48b23f60-a626-4a95-b154-a764454c451b namespace which is not needed anymore#033[00m
Oct  2 08:58:09 np0005465988 nova_compute[236126]: 2025-10-02 12:58:09.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:09 np0005465988 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000bc.scope: Deactivated successfully.
Oct  2 08:58:09 np0005465988 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000bc.scope: Consumed 16.346s CPU time.
Oct  2 08:58:09 np0005465988 systemd-machined[192594]: Machine qemu-88-instance-000000bc terminated.
Oct  2 08:58:10 np0005465988 NetworkManager[45041]: <info>  [1759409890.0466] manager: (tapdd846c20-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/375)
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.063 2 INFO nova.virt.libvirt.driver [-] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Instance destroyed successfully.#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.065 2 DEBUG nova.objects.instance [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'resources' on Instance uuid bcc081e3-b47c-4963-b0e1-1aff13e929de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.078 2 DEBUG nova.virt.libvirt.vif [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:56:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-887371889',display_name='tempest-TestNetworkBasicOps-server-887371889',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-887371889',id=188,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGS+YU6IfWR2i2vh5DtnBd4STMtZO0onP6GxQ1zL0ghp75MvMRBnuNZKmJn1MIJQkRA89BhjwEmKYW0uPAQZ1TvPU+xFcNfvjFSUcigDldMVaKBEnBnxeHvqd6H4SH+Ywg==',key_name='tempest-TestNetworkBasicOps-528737445',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:56:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-2bmykuc2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:56:58Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=bcc081e3-b47c-4963-b0e1-1aff13e929de,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "address": "fa:16:3e:f1:70:1b", "network": {"id": "48b23f60-a626-4a95-b154-a764454c451b", "bridge": "br-int", "label": "tempest-network-smoke--1549158146", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd846c20-9d", "ovs_interfaceid": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.079 2 DEBUG nova.network.os_vif_util [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "address": "fa:16:3e:f1:70:1b", "network": {"id": "48b23f60-a626-4a95-b154-a764454c451b", "bridge": "br-int", "label": "tempest-network-smoke--1549158146", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd846c20-9d", "ovs_interfaceid": "dd846c20-9d18-4e15-a8a8-4cccbf14b8a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.080 2 DEBUG nova.network.os_vif_util [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f1:70:1b,bridge_name='br-int',has_traffic_filtering=True,id=dd846c20-9d18-4e15-a8a8-4cccbf14b8a4,network=Network(48b23f60-a626-4a95-b154-a764454c451b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd846c20-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.081 2 DEBUG os_vif [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f1:70:1b,bridge_name='br-int',has_traffic_filtering=True,id=dd846c20-9d18-4e15-a8a8-4cccbf14b8a4,network=Network(48b23f60-a626-4a95-b154-a764454c451b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd846c20-9d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.084 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd846c20-9d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:10 np0005465988 neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b[323459]: [NOTICE]   (323463) : haproxy version is 2.8.14-c23fe91
Oct  2 08:58:10 np0005465988 neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b[323459]: [NOTICE]   (323463) : path to executable is /usr/sbin/haproxy
Oct  2 08:58:10 np0005465988 neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b[323459]: [WARNING]  (323463) : Exiting Master process...
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:10 np0005465988 neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b[323459]: [ALERT]    (323463) : Current worker (323465) exited with code 143 (Terminated)
Oct  2 08:58:10 np0005465988 neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b[323459]: [WARNING]  (323463) : All workers exited. Exiting... (0)
Oct  2 08:58:10 np0005465988 systemd[1]: libpod-2c9ccfe520a85fbdc49dd4d7ff7b40559598d4b9278c7715f788586678b58500.scope: Deactivated successfully.
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.134 2 INFO os_vif [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f1:70:1b,bridge_name='br-int',has_traffic_filtering=True,id=dd846c20-9d18-4e15-a8a8-4cccbf14b8a4,network=Network(48b23f60-a626-4a95-b154-a764454c451b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd846c20-9d')#033[00m
Oct  2 08:58:10 np0005465988 podman[324775]: 2025-10-02 12:58:10.140528379 +0000 UTC m=+0.110641768 container died 2c9ccfe520a85fbdc49dd4d7ff7b40559598d4b9278c7715f788586678b58500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:58:10 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2c9ccfe520a85fbdc49dd4d7ff7b40559598d4b9278c7715f788586678b58500-userdata-shm.mount: Deactivated successfully.
Oct  2 08:58:10 np0005465988 systemd[1]: var-lib-containers-storage-overlay-5bf6f0f26bc4a6548834d57a70c3aed01462db796ddb27a07850978c901c6fa1-merged.mount: Deactivated successfully.
Oct  2 08:58:10 np0005465988 podman[324775]: 2025-10-02 12:58:10.222691865 +0000 UTC m=+0.192805224 container cleanup 2c9ccfe520a85fbdc49dd4d7ff7b40559598d4b9278c7715f788586678b58500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:58:10 np0005465988 systemd[1]: libpod-conmon-2c9ccfe520a85fbdc49dd4d7ff7b40559598d4b9278c7715f788586678b58500.scope: Deactivated successfully.
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:10 np0005465988 podman[324835]: 2025-10-02 12:58:10.357091036 +0000 UTC m=+0.099999241 container remove 2c9ccfe520a85fbdc49dd4d7ff7b40559598d4b9278c7715f788586678b58500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:58:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:10.363 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe2e6e3-bc65-4c6e-b22a-a453d6fac996]: (4, ('Thu Oct  2 12:58:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b (2c9ccfe520a85fbdc49dd4d7ff7b40559598d4b9278c7715f788586678b58500)\n2c9ccfe520a85fbdc49dd4d7ff7b40559598d4b9278c7715f788586678b58500\nThu Oct  2 12:58:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-48b23f60-a626-4a95-b154-a764454c451b (2c9ccfe520a85fbdc49dd4d7ff7b40559598d4b9278c7715f788586678b58500)\n2c9ccfe520a85fbdc49dd4d7ff7b40559598d4b9278c7715f788586678b58500\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:58:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:10.365 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0cd414be-9a68-40c8-9297-a5cfe62ac4ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:58:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:10.366 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48b23f60-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:10 np0005465988 kernel: tap48b23f60-a0: left promiscuous mode
Oct  2 08:58:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:10.373 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7c8509f1-3dcb-4dfc-8022-e64638a273e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:10.402 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[afc0f7c1-070c-4e39-b604-eb6127a93594]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:58:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:10.404 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6f060f02-03df-47f6-9dfe-3566aaf99f9c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:58:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:10.423 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9cb48313-6c5d-4a75-8eef-7647584156b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 787163, 'reachable_time': 43063, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 324850, 'error': None, 'target': 'ovnmeta-48b23f60-a626-4a95-b154-a764454c451b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:58:10 np0005465988 systemd[1]: run-netns-ovnmeta\x2d48b23f60\x2da626\x2d4a95\x2db154\x2da764454c451b.mount: Deactivated successfully.
Oct  2 08:58:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:10.428 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-48b23f60-a626-4a95-b154-a764454c451b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:58:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:10.428 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[8da5916f-d9ac-4edc-b487-64f19e0f1c4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:58:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.529 2 DEBUG nova.compute.manager [req-3532b233-63b7-42f4-ba3a-324147cf98eb req-c489db40-6eda-45d5-b5a3-2aa805ed3591 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Received event network-vif-unplugged-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.529 2 DEBUG oslo_concurrency.lockutils [req-3532b233-63b7-42f4-ba3a-324147cf98eb req-c489db40-6eda-45d5-b5a3-2aa805ed3591 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.529 2 DEBUG oslo_concurrency.lockutils [req-3532b233-63b7-42f4-ba3a-324147cf98eb req-c489db40-6eda-45d5-b5a3-2aa805ed3591 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.530 2 DEBUG oslo_concurrency.lockutils [req-3532b233-63b7-42f4-ba3a-324147cf98eb req-c489db40-6eda-45d5-b5a3-2aa805ed3591 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.530 2 DEBUG nova.compute.manager [req-3532b233-63b7-42f4-ba3a-324147cf98eb req-c489db40-6eda-45d5-b5a3-2aa805ed3591 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] No waiting events found dispatching network-vif-unplugged-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.530 2 DEBUG nova.compute.manager [req-3532b233-63b7-42f4-ba3a-324147cf98eb req-c489db40-6eda-45d5-b5a3-2aa805ed3591 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Received event network-vif-unplugged-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.751 2 INFO nova.virt.libvirt.driver [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Deleting instance files /var/lib/nova/instances/bcc081e3-b47c-4963-b0e1-1aff13e929de_del#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.754 2 INFO nova.virt.libvirt.driver [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Deletion of /var/lib/nova/instances/bcc081e3-b47c-4963-b0e1-1aff13e929de_del complete#033[00m
Oct  2 08:58:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:10.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.824 2 INFO nova.compute.manager [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Took 1.00 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.825 2 DEBUG oslo.service.loopingcall [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.826 2 DEBUG nova.compute.manager [-] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:58:10 np0005465988 nova_compute[236126]: 2025-10-02 12:58:10.826 2 DEBUG nova.network.neutron [-] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:58:11 np0005465988 nova_compute[236126]: 2025-10-02 12:58:11.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:11 np0005465988 nova_compute[236126]: 2025-10-02 12:58:11.510 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:58:11 np0005465988 nova_compute[236126]: 2025-10-02 12:58:11.511 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:58:11 np0005465988 nova_compute[236126]: 2025-10-02 12:58:11.511 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:58:11 np0005465988 nova_compute[236126]: 2025-10-02 12:58:11.511 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:58:11 np0005465988 nova_compute[236126]: 2025-10-02 12:58:11.512 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:58:11 np0005465988 nova_compute[236126]: 2025-10-02 12:58:11.708 2 DEBUG nova.network.neutron [-] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:58:11 np0005465988 nova_compute[236126]: 2025-10-02 12:58:11.745 2 INFO nova.compute.manager [-] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Took 0.92 seconds to deallocate network for instance.#033[00m
Oct  2 08:58:11 np0005465988 nova_compute[236126]: 2025-10-02 12:58:11.821 2 DEBUG nova.compute.manager [req-5e249da4-bbdd-4375-9b30-e42982e5392d req-abf5dd1c-16fb-4432-b654-6604f0ba401d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Received event network-vif-deleted-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:58:11 np0005465988 nova_compute[236126]: 2025-10-02 12:58:11.844 2 DEBUG oslo_concurrency.lockutils [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:58:11 np0005465988 nova_compute[236126]: 2025-10-02 12:58:11.845 2 DEBUG oslo_concurrency.lockutils [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:58:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:11.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:11 np0005465988 nova_compute[236126]: 2025-10-02 12:58:11.909 2 DEBUG oslo_concurrency.processutils [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:58:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:58:11 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2614947171' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:58:11 np0005465988 nova_compute[236126]: 2025-10-02 12:58:11.979 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.179 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.182 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4067MB free_disk=20.851272583007812GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.182 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:58:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:58:12 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2101575353' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.414 2 DEBUG oslo_concurrency.processutils [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.420 2 DEBUG nova.compute.provider_tree [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.440 2 DEBUG nova.scheduler.client.report [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.466 2 DEBUG oslo_concurrency.lockutils [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.470 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.502 2 INFO nova.scheduler.client.report [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Deleted allocations for instance bcc081e3-b47c-4963-b0e1-1aff13e929de#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.554 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.554 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.574 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.614 2 DEBUG oslo_concurrency.lockutils [None req-b1f54195-3e6b-48b5-8de3-4bb315209491 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.628 2 DEBUG nova.compute.manager [req-6fb5eb03-ab3e-40be-9a0b-e9f45d1baf16 req-4fb52d13-73ff-4c84-8429-4ec548a7a4f0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Received event network-vif-plugged-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.628 2 DEBUG oslo_concurrency.lockutils [req-6fb5eb03-ab3e-40be-9a0b-e9f45d1baf16 req-4fb52d13-73ff-4c84-8429-4ec548a7a4f0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.629 2 DEBUG oslo_concurrency.lockutils [req-6fb5eb03-ab3e-40be-9a0b-e9f45d1baf16 req-4fb52d13-73ff-4c84-8429-4ec548a7a4f0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.629 2 DEBUG oslo_concurrency.lockutils [req-6fb5eb03-ab3e-40be-9a0b-e9f45d1baf16 req-4fb52d13-73ff-4c84-8429-4ec548a7a4f0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "bcc081e3-b47c-4963-b0e1-1aff13e929de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.629 2 DEBUG nova.compute.manager [req-6fb5eb03-ab3e-40be-9a0b-e9f45d1baf16 req-4fb52d13-73ff-4c84-8429-4ec548a7a4f0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] No waiting events found dispatching network-vif-plugged-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:58:12 np0005465988 nova_compute[236126]: 2025-10-02 12:58:12.629 2 WARNING nova.compute.manager [req-6fb5eb03-ab3e-40be-9a0b-e9f45d1baf16 req-4fb52d13-73ff-4c84-8429-4ec548a7a4f0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Received unexpected event network-vif-plugged-dd846c20-9d18-4e15-a8a8-4cccbf14b8a4 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:58:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:12.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:58:13 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3201776990' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:58:13 np0005465988 nova_compute[236126]: 2025-10-02 12:58:13.033 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:58:13 np0005465988 nova_compute[236126]: 2025-10-02 12:58:13.040 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:58:13 np0005465988 nova_compute[236126]: 2025-10-02 12:58:13.055 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:58:13 np0005465988 nova_compute[236126]: 2025-10-02 12:58:13.086 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:58:13 np0005465988 nova_compute[236126]: 2025-10-02 12:58:13.086 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:58:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:58:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:13.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:58:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:14.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:15 np0005465988 nova_compute[236126]: 2025-10-02 12:58:15.087 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:15 np0005465988 nova_compute[236126]: 2025-10-02 12:58:15.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:15 np0005465988 nova_compute[236126]: 2025-10-02 12:58:15.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:58:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:58:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:15.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:58:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:16.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:58:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:17.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:58:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:58:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:18.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:58:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:58:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:58:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct  2 08:58:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct  2 08:58:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct  2 08:58:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:58:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:58:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:58:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:19.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:20 np0005465988 nova_compute[236126]: 2025-10-02 12:58:20.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:20 np0005465988 nova_compute[236126]: 2025-10-02 12:58:20.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:58:20 np0005465988 podman[325225]: 2025-10-02 12:58:20.531329708 +0000 UTC m=+0.066558248 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:58:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:58:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:20.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:58:21 np0005465988 nova_compute[236126]: 2025-10-02 12:58:21.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:21.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:22 np0005465988 nova_compute[236126]: 2025-10-02 12:58:22.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:22 np0005465988 nova_compute[236126]: 2025-10-02 12:58:22.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:58:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:22.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:23 np0005465988 nova_compute[236126]: 2025-10-02 12:58:23.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:58:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:23.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:58:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:58:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:24.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:58:25 np0005465988 nova_compute[236126]: 2025-10-02 12:58:25.062 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409890.0605352, bcc081e3-b47c-4963-b0e1-1aff13e929de => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:58:25 np0005465988 nova_compute[236126]: 2025-10-02 12:58:25.062 2 INFO nova.compute.manager [-] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:58:25 np0005465988 nova_compute[236126]: 2025-10-02 12:58:25.081 2 DEBUG nova.compute.manager [None req-12946f28-6710-4cbb-a155-814efa7ee125 - - - - - -] [instance: bcc081e3-b47c-4963-b0e1-1aff13e929de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:58:25 np0005465988 nova_compute[236126]: 2025-10-02 12:58:25.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:25 np0005465988 nova_compute[236126]: 2025-10-02 12:58:25.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:58:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:25.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:26 np0005465988 nova_compute[236126]: 2025-10-02 12:58:26.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:26.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:27.405 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:58:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:27.405 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:58:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:58:27.406 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:58:27 np0005465988 nova_compute[236126]: 2025-10-02 12:58:27.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:27.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:28 np0005465988 nova_compute[236126]: 2025-10-02 12:58:28.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:28.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:58:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:58:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:29.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:30 np0005465988 nova_compute[236126]: 2025-10-02 12:58:30.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:30 np0005465988 nova_compute[236126]: 2025-10-02 12:58:30.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:58:30 np0005465988 nova_compute[236126]: 2025-10-02 12:58:30.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:30 np0005465988 nova_compute[236126]: 2025-10-02 12:58:30.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:58:30 np0005465988 nova_compute[236126]: 2025-10-02 12:58:30.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:58:30 np0005465988 nova_compute[236126]: 2025-10-02 12:58:30.494 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:58:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:30.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:31 np0005465988 nova_compute[236126]: 2025-10-02 12:58:31.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:31 np0005465988 podman[325302]: 2025-10-02 12:58:31.55633007 +0000 UTC m=+0.067490265 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:58:31 np0005465988 podman[325301]: 2025-10-02 12:58:31.564455514 +0000 UTC m=+0.089821608 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid)
Oct  2 08:58:31 np0005465988 podman[325300]: 2025-10-02 12:58:31.58028439 +0000 UTC m=+0.109047012 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:58:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:58:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:31.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:58:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:32.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:58:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:33.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:58:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:34.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:35 np0005465988 nova_compute[236126]: 2025-10-02 12:58:35.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:35 np0005465988 nova_compute[236126]: 2025-10-02 12:58:35.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:58:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:58:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:35.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:58:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:36.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:37.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:38.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:58:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:39.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:58:40 np0005465988 nova_compute[236126]: 2025-10-02 12:58:40.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:40 np0005465988 nova_compute[236126]: 2025-10-02 12:58:40.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:58:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:40.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:41 np0005465988 nova_compute[236126]: 2025-10-02 12:58:41.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:58:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:41.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:58:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:58:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:42.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:58:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:43.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:44.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:45 np0005465988 nova_compute[236126]: 2025-10-02 12:58:45.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:45 np0005465988 nova_compute[236126]: 2025-10-02 12:58:45.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:58:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:45.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:46.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:47.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:48.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:58:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:49.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:58:50 np0005465988 nova_compute[236126]: 2025-10-02 12:58:50.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:50 np0005465988 nova_compute[236126]: 2025-10-02 12:58:50.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:58:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:50.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:51 np0005465988 podman[325425]: 2025-10-02 12:58:51.522247171 +0000 UTC m=+0.064306353 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS)
Oct  2 08:58:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:58:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:51.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:58:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:52.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:53.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:54.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:55 np0005465988 nova_compute[236126]: 2025-10-02 12:58:55.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:55 np0005465988 nova_compute[236126]: 2025-10-02 12:58:55.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:58:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:55.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:56.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:57.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:58:58.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:58:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:58:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:58:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:58:59.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:00 np0005465988 nova_compute[236126]: 2025-10-02 12:59:00.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:00 np0005465988 nova_compute[236126]: 2025-10-02 12:59:00.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:00 np0005465988 nova_compute[236126]: 2025-10-02 12:59:00.342 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "e9f655a6-48fa-4199-8652-5c5b47440055" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:00 np0005465988 nova_compute[236126]: 2025-10-02 12:59:00.342 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:00 np0005465988 nova_compute[236126]: 2025-10-02 12:59:00.430 2 DEBUG nova.compute.manager [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:59:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:59:00 np0005465988 nova_compute[236126]: 2025-10-02 12:59:00.604 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:00 np0005465988 nova_compute[236126]: 2025-10-02 12:59:00.605 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:00 np0005465988 nova_compute[236126]: 2025-10-02 12:59:00.614 2 DEBUG nova.virt.hardware [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:59:00 np0005465988 nova_compute[236126]: 2025-10-02 12:59:00.614 2 INFO nova.compute.claims [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:59:00 np0005465988 nova_compute[236126]: 2025-10-02 12:59:00.817 2 DEBUG oslo_concurrency.processutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:00.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:59:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2434650181' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.310 2 DEBUG oslo_concurrency.processutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.317 2 DEBUG nova.compute.provider_tree [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.400 2 DEBUG nova.scheduler.client.report [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.484 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.879s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.485 2 DEBUG nova.compute.manager [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.604 2 DEBUG nova.compute.manager [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.605 2 DEBUG nova.network.neutron [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.649 2 INFO nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.679 2 DEBUG nova.compute.manager [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.795 2 DEBUG nova.compute.manager [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.797 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.797 2 INFO nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Creating image(s)#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.829 2 DEBUG nova.storage.rbd_utils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image e9f655a6-48fa-4199-8652-5c5b47440055_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:59:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e391 e391: 3 total, 3 up, 3 in
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.864 2 DEBUG nova.storage.rbd_utils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image e9f655a6-48fa-4199-8652-5c5b47440055_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.901 2 DEBUG nova.storage.rbd_utils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image e9f655a6-48fa-4199-8652-5c5b47440055_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.906 2 DEBUG oslo_concurrency.processutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:01.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.985 2 DEBUG oslo_concurrency.processutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.986 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.986 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:01 np0005465988 nova_compute[236126]: 2025-10-02 12:59:01.987 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:02 np0005465988 nova_compute[236126]: 2025-10-02 12:59:02.013 2 DEBUG nova.storage.rbd_utils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image e9f655a6-48fa-4199-8652-5c5b47440055_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:59:02 np0005465988 nova_compute[236126]: 2025-10-02 12:59:02.018 2 DEBUG oslo_concurrency.processutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 e9f655a6-48fa-4199-8652-5c5b47440055_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:02 np0005465988 nova_compute[236126]: 2025-10-02 12:59:02.443 2 DEBUG oslo_concurrency.processutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 e9f655a6-48fa-4199-8652-5c5b47440055_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:02 np0005465988 podman[325624]: 2025-10-02 12:59:02.536355263 +0000 UTC m=+0.068969477 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:59:02 np0005465988 nova_compute[236126]: 2025-10-02 12:59:02.555 2 DEBUG nova.storage.rbd_utils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] resizing rbd image e9f655a6-48fa-4199-8652-5c5b47440055_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:59:02 np0005465988 podman[325632]: 2025-10-02 12:59:02.570902028 +0000 UTC m=+0.084300479 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:59:02 np0005465988 podman[325616]: 2025-10-02 12:59:02.585131998 +0000 UTC m=+0.113513130 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:59:02 np0005465988 nova_compute[236126]: 2025-10-02 12:59:02.684 2 DEBUG nova.objects.instance [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'migration_context' on Instance uuid e9f655a6-48fa-4199-8652-5c5b47440055 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:59:02 np0005465988 nova_compute[236126]: 2025-10-02 12:59:02.746 2 DEBUG nova.policy [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fb366465e6154871b8a53c9f500105ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:59:02 np0005465988 nova_compute[236126]: 2025-10-02 12:59:02.749 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:59:02 np0005465988 nova_compute[236126]: 2025-10-02 12:59:02.749 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Ensure instance console log exists: /var/lib/nova/instances/e9f655a6-48fa-4199-8652-5c5b47440055/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:59:02 np0005465988 nova_compute[236126]: 2025-10-02 12:59:02.749 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:02 np0005465988 nova_compute[236126]: 2025-10-02 12:59:02.750 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:02 np0005465988 nova_compute[236126]: 2025-10-02 12:59:02.750 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:02.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e392 e392: 3 total, 3 up, 3 in
Oct  2 08:59:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:03.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e393 e393: 3 total, 3 up, 3 in
Oct  2 08:59:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:04.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e394 e394: 3 total, 3 up, 3 in
Oct  2 08:59:05 np0005465988 nova_compute[236126]: 2025-10-02 12:59:05.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:05 np0005465988 nova_compute[236126]: 2025-10-02 12:59:05.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:59:05 np0005465988 nova_compute[236126]: 2025-10-02 12:59:05.492 2 DEBUG nova.network.neutron [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Successfully updated port: 74de555c-711e-4e21-a6f4-89c8289aae93 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:59:05 np0005465988 nova_compute[236126]: 2025-10-02 12:59:05.526 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "refresh_cache-e9f655a6-48fa-4199-8652-5c5b47440055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:59:05 np0005465988 nova_compute[236126]: 2025-10-02 12:59:05.527 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquired lock "refresh_cache-e9f655a6-48fa-4199-8652-5c5b47440055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:59:05 np0005465988 nova_compute[236126]: 2025-10-02 12:59:05.527 2 DEBUG nova.network.neutron [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:59:05 np0005465988 nova_compute[236126]: 2025-10-02 12:59:05.597 2 DEBUG nova.compute.manager [req-f55e008b-51bd-46f1-a0ca-83755cf82160 req-74fc9679-a0e1-4020-914c-083a7942637b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Received event network-changed-74de555c-711e-4e21-a6f4-89c8289aae93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:59:05 np0005465988 nova_compute[236126]: 2025-10-02 12:59:05.598 2 DEBUG nova.compute.manager [req-f55e008b-51bd-46f1-a0ca-83755cf82160 req-74fc9679-a0e1-4020-914c-083a7942637b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Refreshing instance network info cache due to event network-changed-74de555c-711e-4e21-a6f4-89c8289aae93. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:59:05 np0005465988 nova_compute[236126]: 2025-10-02 12:59:05.599 2 DEBUG oslo_concurrency.lockutils [req-f55e008b-51bd-46f1-a0ca-83755cf82160 req-74fc9679-a0e1-4020-914c-083a7942637b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-e9f655a6-48fa-4199-8652-5c5b47440055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:59:05 np0005465988 nova_compute[236126]: 2025-10-02 12:59:05.713 2 DEBUG nova.network.neutron [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:59:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:59:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:05.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.332 2 DEBUG nova.network.neutron [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Updating instance_info_cache with network_info: [{"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.362 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Releasing lock "refresh_cache-e9f655a6-48fa-4199-8652-5c5b47440055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.363 2 DEBUG nova.compute.manager [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Instance network_info: |[{"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.363 2 DEBUG oslo_concurrency.lockutils [req-f55e008b-51bd-46f1-a0ca-83755cf82160 req-74fc9679-a0e1-4020-914c-083a7942637b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-e9f655a6-48fa-4199-8652-5c5b47440055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.364 2 DEBUG nova.network.neutron [req-f55e008b-51bd-46f1-a0ca-83755cf82160 req-74fc9679-a0e1-4020-914c-083a7942637b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Refreshing network info cache for port 74de555c-711e-4e21-a6f4-89c8289aae93 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.367 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Start _get_guest_xml network_info=[{"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.371 2 WARNING nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.376 2 DEBUG nova.virt.libvirt.host [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.377 2 DEBUG nova.virt.libvirt.host [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.379 2 DEBUG nova.virt.libvirt.host [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.380 2 DEBUG nova.virt.libvirt.host [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.381 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.382 2 DEBUG nova.virt.hardware [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.382 2 DEBUG nova.virt.hardware [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.383 2 DEBUG nova.virt.hardware [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.383 2 DEBUG nova.virt.hardware [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.383 2 DEBUG nova.virt.hardware [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.384 2 DEBUG nova.virt.hardware [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.384 2 DEBUG nova.virt.hardware [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.385 2 DEBUG nova.virt.hardware [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.385 2 DEBUG nova.virt.hardware [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.385 2 DEBUG nova.virt.hardware [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.386 2 DEBUG nova.virt.hardware [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.390 2 DEBUG oslo_concurrency.processutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:59:06 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2653659783' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.832 2 DEBUG oslo_concurrency.processutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:06.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.873 2 DEBUG nova.storage.rbd_utils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image e9f655a6-48fa-4199-8652-5c5b47440055_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:59:06 np0005465988 nova_compute[236126]: 2025-10-02 12:59:06.878 2 DEBUG oslo_concurrency.processutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:59:07 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3003324144' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.369 2 DEBUG oslo_concurrency.processutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.371 2 DEBUG nova.virt.libvirt.vif [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:58:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1546173084',display_name='tempest-TestNetworkBasicOps-server-1546173084',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1546173084',id=193,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOPOsj0pxh52taYqW5zA4gKjjHmyewWOEGLCwLWJ6CevctDo6M7sV0+kJbHre7Znry9axU8SnqeKBbCrrcdDwloF7+x3JIq882zktzE6ifusqsHta6salUdSLjgl9p6bKw==',key_name='tempest-TestNetworkBasicOps-1245745945',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-yn0o1vv6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:59:01Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=e9f655a6-48fa-4199-8652-5c5b47440055,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.371 2 DEBUG nova.network.os_vif_util [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.372 2 DEBUG nova.network.os_vif_util [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:d3,bridge_name='br-int',has_traffic_filtering=True,id=74de555c-711e-4e21-a6f4-89c8289aae93,network=Network(cf9dc276-03fd-47d7-92fb-6f6d94b7d169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap74de555c-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.373 2 DEBUG nova.objects.instance [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'pci_devices' on Instance uuid e9f655a6-48fa-4199-8652-5c5b47440055 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:59:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:59:07 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/757562566' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.488 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  <uuid>e9f655a6-48fa-4199-8652-5c5b47440055</uuid>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  <name>instance-000000c1</name>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestNetworkBasicOps-server-1546173084</nova:name>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:59:06</nova:creationTime>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <nova:user uuid="fb366465e6154871b8a53c9f500105ce">tempest-TestNetworkBasicOps-1692262680-project-member</nova:user>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <nova:project uuid="ce2ca82c03554560b55ed747ae63f1fb">tempest-TestNetworkBasicOps-1692262680</nova:project>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <nova:port uuid="74de555c-711e-4e21-a6f4-89c8289aae93">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <entry name="serial">e9f655a6-48fa-4199-8652-5c5b47440055</entry>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <entry name="uuid">e9f655a6-48fa-4199-8652-5c5b47440055</entry>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/e9f655a6-48fa-4199-8652-5c5b47440055_disk">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/e9f655a6-48fa-4199-8652-5c5b47440055_disk.config">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:fa:91:d3"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <target dev="tap74de555c-71"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/e9f655a6-48fa-4199-8652-5c5b47440055/console.log" append="off"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:59:07 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:59:07 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:59:07 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:59:07 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.489 2 DEBUG nova.compute.manager [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Preparing to wait for external event network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.490 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.490 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.491 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.492 2 DEBUG nova.virt.libvirt.vif [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:58:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1546173084',display_name='tempest-TestNetworkBasicOps-server-1546173084',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1546173084',id=193,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOPOsj0pxh52taYqW5zA4gKjjHmyewWOEGLCwLWJ6CevctDo6M7sV0+kJbHre7Znry9axU8SnqeKBbCrrcdDwloF7+x3JIq882zktzE6ifusqsHta6salUdSLjgl9p6bKw==',key_name='tempest-TestNetworkBasicOps-1245745945',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-yn0o1vv6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:59:01Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=e9f655a6-48fa-4199-8652-5c5b47440055,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.492 2 DEBUG nova.network.os_vif_util [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.493 2 DEBUG nova.network.os_vif_util [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:d3,bridge_name='br-int',has_traffic_filtering=True,id=74de555c-711e-4e21-a6f4-89c8289aae93,network=Network(cf9dc276-03fd-47d7-92fb-6f6d94b7d169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap74de555c-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.494 2 DEBUG os_vif [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:d3,bridge_name='br-int',has_traffic_filtering=True,id=74de555c-711e-4e21-a6f4-89c8289aae93,network=Network(cf9dc276-03fd-47d7-92fb-6f6d94b7d169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap74de555c-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.495 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.495 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.498 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74de555c-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.499 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap74de555c-71, col_values=(('external_ids', {'iface-id': '74de555c-711e-4e21-a6f4-89c8289aae93', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:91:d3', 'vm-uuid': 'e9f655a6-48fa-4199-8652-5c5b47440055'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:07 np0005465988 NetworkManager[45041]: <info>  [1759409947.5024] manager: (tap74de555c-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/376)
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.510 2 INFO os_vif [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:d3,bridge_name='br-int',has_traffic_filtering=True,id=74de555c-711e-4e21-a6f4-89c8289aae93,network=Network(cf9dc276-03fd-47d7-92fb-6f6d94b7d169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap74de555c-71')#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.577 2 DEBUG nova.network.neutron [req-f55e008b-51bd-46f1-a0ca-83755cf82160 req-74fc9679-a0e1-4020-914c-083a7942637b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Updated VIF entry in instance network info cache for port 74de555c-711e-4e21-a6f4-89c8289aae93. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.578 2 DEBUG nova.network.neutron [req-f55e008b-51bd-46f1-a0ca-83755cf82160 req-74fc9679-a0e1-4020-914c-083a7942637b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Updating instance_info_cache with network_info: [{"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.608 2 DEBUG oslo_concurrency.lockutils [req-f55e008b-51bd-46f1-a0ca-83755cf82160 req-74fc9679-a0e1-4020-914c-083a7942637b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-e9f655a6-48fa-4199-8652-5c5b47440055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.802 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.803 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.803 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No VIF found with MAC fa:16:3e:fa:91:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.804 2 INFO nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Using config drive#033[00m
Oct  2 08:59:07 np0005465988 nova_compute[236126]: 2025-10-02 12:59:07.848 2 DEBUG nova.storage.rbd_utils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image e9f655a6-48fa-4199-8652-5c5b47440055_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:59:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:07.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:08 np0005465988 nova_compute[236126]: 2025-10-02 12:59:08.658 2 INFO nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Creating config drive at /var/lib/nova/instances/e9f655a6-48fa-4199-8652-5c5b47440055/disk.config#033[00m
Oct  2 08:59:08 np0005465988 nova_compute[236126]: 2025-10-02 12:59:08.668 2 DEBUG oslo_concurrency.processutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e9f655a6-48fa-4199-8652-5c5b47440055/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptp7dhbcw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:08 np0005465988 nova_compute[236126]: 2025-10-02 12:59:08.821 2 DEBUG oslo_concurrency.processutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e9f655a6-48fa-4199-8652-5c5b47440055/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptp7dhbcw" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:08 np0005465988 nova_compute[236126]: 2025-10-02 12:59:08.848 2 DEBUG nova.storage.rbd_utils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image e9f655a6-48fa-4199-8652-5c5b47440055_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:59:08 np0005465988 nova_compute[236126]: 2025-10-02 12:59:08.852 2 DEBUG oslo_concurrency.processutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e9f655a6-48fa-4199-8652-5c5b47440055/disk.config e9f655a6-48fa-4199-8652-5c5b47440055_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:59:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:08.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:59:09 np0005465988 nova_compute[236126]: 2025-10-02 12:59:09.139 2 DEBUG oslo_concurrency.processutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e9f655a6-48fa-4199-8652-5c5b47440055/disk.config e9f655a6-48fa-4199-8652-5c5b47440055_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.286s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:09 np0005465988 nova_compute[236126]: 2025-10-02 12:59:09.139 2 INFO nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Deleting local config drive /var/lib/nova/instances/e9f655a6-48fa-4199-8652-5c5b47440055/disk.config because it was imported into RBD.#033[00m
Oct  2 08:59:09 np0005465988 kernel: tap74de555c-71: entered promiscuous mode
Oct  2 08:59:09 np0005465988 NetworkManager[45041]: <info>  [1759409949.1987] manager: (tap74de555c-71): new Tun device (/org/freedesktop/NetworkManager/Devices/377)
Oct  2 08:59:09 np0005465988 nova_compute[236126]: 2025-10-02 12:59:09.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:09Z|00870|binding|INFO|Claiming lport 74de555c-711e-4e21-a6f4-89c8289aae93 for this chassis.
Oct  2 08:59:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:09Z|00871|binding|INFO|74de555c-711e-4e21-a6f4-89c8289aae93: Claiming fa:16:3e:fa:91:d3 10.100.0.6
Oct  2 08:59:09 np0005465988 nova_compute[236126]: 2025-10-02 12:59:09.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:09 np0005465988 nova_compute[236126]: 2025-10-02 12:59:09.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:09 np0005465988 nova_compute[236126]: 2025-10-02 12:59:09.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:09 np0005465988 systemd-machined[192594]: New machine qemu-90-instance-000000c1.
Oct  2 08:59:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:09Z|00872|binding|INFO|Setting lport 74de555c-711e-4e21-a6f4-89c8289aae93 ovn-installed in OVS
Oct  2 08:59:09 np0005465988 nova_compute[236126]: 2025-10-02 12:59:09.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:09 np0005465988 systemd[1]: Started Virtual Machine qemu-90-instance-000000c1.
Oct  2 08:59:09 np0005465988 systemd-udevd[325891]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:59:09 np0005465988 NetworkManager[45041]: <info>  [1759409949.3034] device (tap74de555c-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:59:09 np0005465988 NetworkManager[45041]: <info>  [1759409949.3045] device (tap74de555c-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.343 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:91:d3 10.100.0.6'], port_security=['fa:16:3e:fa:91:d3 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1996543852', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e9f655a6-48fa-4199-8652-5c5b47440055', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1996543852', 'neutron:project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c114e1a6-21d7-49a2-a13f-595584b99547', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09ec6ed0-28d8-4666-8087-300b86d2afe4, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=74de555c-711e-4e21-a6f4-89c8289aae93) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.344 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 74de555c-711e-4e21-a6f4-89c8289aae93 in datapath cf9dc276-03fd-47d7-92fb-6f6d94b7d169 bound to our chassis#033[00m
Oct  2 08:59:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:09Z|00873|binding|INFO|Setting lport 74de555c-711e-4e21-a6f4-89c8289aae93 up in Southbound
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.346 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cf9dc276-03fd-47d7-92fb-6f6d94b7d169#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.356 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[16dc2062-f5d6-4910-87d0-1d9b075c3c6d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.357 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcf9dc276-01 in ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.358 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcf9dc276-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.358 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[94a239ae-0a73-498c-bc2f-255ca87fecbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.359 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1057e521-166d-4d9a-9dbd-8605b09f14e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.370 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[e89878b2-973a-43d4-b0b6-469d4c509090]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.392 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c6094036-7bc2-4928-abca-3f13caa6a9e9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.435 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f95400c9-1a65-4dce-ae9e-8d86e76368c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.441 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f52f1804-415e-4eda-ab20-e8fdcce051e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 NetworkManager[45041]: <info>  [1759409949.4439] manager: (tapcf9dc276-00): new Veth device (/org/freedesktop/NetworkManager/Devices/378)
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.489 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[553cb9f2-ec4c-4787-a23a-a193a191db70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.493 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4cf090c1-b508-4439-84c4-9cb25afaad3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e395 e395: 3 total, 3 up, 3 in
Oct  2 08:59:09 np0005465988 NetworkManager[45041]: <info>  [1759409949.5219] device (tapcf9dc276-00): carrier: link connected
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.530 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[178cf3f8-9051-47ca-8bda-bfb12125d9ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.553 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[69e0a846-709e-43ca-973a-c652de1cd283]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcf9dc276-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:71:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 800985, 'reachable_time': 18524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 325949, 'error': None, 'target': 'ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.572 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8fb5c0b9-a500-4033-af34-7bd6daafaba7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:713d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 800985, 'tstamp': 800985}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 325950, 'error': None, 'target': 'ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.595 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dae040b8-4a41-47f4-9890-5ca20b930ada]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcf9dc276-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:71:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 800985, 'reachable_time': 18524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 325951, 'error': None, 'target': 'ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e396 e396: 3 total, 3 up, 3 in
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.634 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf2c8ae-ce11-46af-8fae-edb9212a9dad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.721 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6d64547e-621d-4604-a0e0-c9c45c704a24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.723 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcf9dc276-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.724 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.725 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcf9dc276-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:09 np0005465988 NetworkManager[45041]: <info>  [1759409949.7286] manager: (tapcf9dc276-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/379)
Oct  2 08:59:09 np0005465988 kernel: tapcf9dc276-00: entered promiscuous mode
Oct  2 08:59:09 np0005465988 nova_compute[236126]: 2025-10-02 12:59:09.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.734 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcf9dc276-00, col_values=(('external_ids', {'iface-id': 'd5135195-c335-444d-8d55-41c04f12d49b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:09 np0005465988 nova_compute[236126]: 2025-10-02 12:59:09.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:09 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:09Z|00874|binding|INFO|Releasing lport d5135195-c335-444d-8d55-41c04f12d49b from this chassis (sb_readonly=0)
Oct  2 08:59:09 np0005465988 nova_compute[236126]: 2025-10-02 12:59:09.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.738 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cf9dc276-03fd-47d7-92fb-6f6d94b7d169.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cf9dc276-03fd-47d7-92fb-6f6d94b7d169.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.739 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ee33ab82-3351-403d-ab20-0dddea85d8bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.740 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-cf9dc276-03fd-47d7-92fb-6f6d94b7d169
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/cf9dc276-03fd-47d7-92fb-6f6d94b7d169.pid.haproxy
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID cf9dc276-03fd-47d7-92fb-6f6d94b7d169
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:59:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:09.741 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'env', 'PROCESS_TAG=haproxy-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cf9dc276-03fd-47d7-92fb-6f6d94b7d169.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:59:09 np0005465988 nova_compute[236126]: 2025-10-02 12:59:09.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:09.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:10 np0005465988 podman[325999]: 2025-10-02 12:59:10.153696243 +0000 UTC m=+0.096546772 container create 86202e51860a723013f91b494fb8f184fc588097af5f2bd3f6425fbbb274b33d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.176 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409950.1754794, e9f655a6-48fa-4199-8652-5c5b47440055 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.176 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] VM Started (Lifecycle Event)#033[00m
Oct  2 08:59:10 np0005465988 podman[325999]: 2025-10-02 12:59:10.085570091 +0000 UTC m=+0.028420710 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:59:10 np0005465988 systemd[1]: Started libpod-conmon-86202e51860a723013f91b494fb8f184fc588097af5f2bd3f6425fbbb274b33d.scope.
Oct  2 08:59:10 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:59:10 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b09e083b847f1f7c59faa93fb523f50c45965c4119512e73607386d22a9eaf8b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:59:10 np0005465988 podman[325999]: 2025-10-02 12:59:10.231849844 +0000 UTC m=+0.174700403 container init 86202e51860a723013f91b494fb8f184fc588097af5f2bd3f6425fbbb274b33d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:59:10 np0005465988 podman[325999]: 2025-10-02 12:59:10.242253264 +0000 UTC m=+0.185103793 container start 86202e51860a723013f91b494fb8f184fc588097af5f2bd3f6425fbbb274b33d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:59:10 np0005465988 neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169[326015]: [NOTICE]   (326019) : New worker (326021) forked
Oct  2 08:59:10 np0005465988 neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169[326015]: [NOTICE]   (326019) : Loading success.
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.431 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.438 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409950.1757162, e9f655a6-48fa-4199-8652-5c5b47440055 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.439 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:59:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.720 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.724 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:59:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:10.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.957 2 DEBUG nova.compute.manager [req-7e1db057-f6be-460e-bca5-53149b743d6d req-8c603c92-795b-47be-9101-4de58e9bcdc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Received event network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.957 2 DEBUG oslo_concurrency.lockutils [req-7e1db057-f6be-460e-bca5-53149b743d6d req-8c603c92-795b-47be-9101-4de58e9bcdc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.957 2 DEBUG oslo_concurrency.lockutils [req-7e1db057-f6be-460e-bca5-53149b743d6d req-8c603c92-795b-47be-9101-4de58e9bcdc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.958 2 DEBUG oslo_concurrency.lockutils [req-7e1db057-f6be-460e-bca5-53149b743d6d req-8c603c92-795b-47be-9101-4de58e9bcdc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.958 2 DEBUG nova.compute.manager [req-7e1db057-f6be-460e-bca5-53149b743d6d req-8c603c92-795b-47be-9101-4de58e9bcdc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Processing event network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.959 2 DEBUG nova.compute.manager [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.963 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.966 2 INFO nova.virt.libvirt.driver [-] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Instance spawned successfully.#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.966 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.988 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.989 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409950.9621015, e9f655a6-48fa-4199-8652-5c5b47440055 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:59:10 np0005465988 nova_compute[236126]: 2025-10-02 12:59:10.989 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:59:11 np0005465988 nova_compute[236126]: 2025-10-02 12:59:11.002 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:59:11 np0005465988 nova_compute[236126]: 2025-10-02 12:59:11.002 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:59:11 np0005465988 nova_compute[236126]: 2025-10-02 12:59:11.003 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:59:11 np0005465988 nova_compute[236126]: 2025-10-02 12:59:11.003 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:59:11 np0005465988 nova_compute[236126]: 2025-10-02 12:59:11.004 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:59:11 np0005465988 nova_compute[236126]: 2025-10-02 12:59:11.004 2 DEBUG nova.virt.libvirt.driver [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:59:11 np0005465988 nova_compute[236126]: 2025-10-02 12:59:11.092 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:59:11 np0005465988 nova_compute[236126]: 2025-10-02 12:59:11.095 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:59:11 np0005465988 nova_compute[236126]: 2025-10-02 12:59:11.258 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:59:11 np0005465988 nova_compute[236126]: 2025-10-02 12:59:11.385 2 INFO nova.compute.manager [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Took 9.59 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:59:11 np0005465988 nova_compute[236126]: 2025-10-02 12:59:11.385 2 DEBUG nova.compute.manager [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:59:11 np0005465988 nova_compute[236126]: 2025-10-02 12:59:11.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:11.729 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:59:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:11.731 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:59:11 np0005465988 nova_compute[236126]: 2025-10-02 12:59:11.741 2 INFO nova.compute.manager [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Took 11.16 seconds to build instance.#033[00m
Oct  2 08:59:11 np0005465988 nova_compute[236126]: 2025-10-02 12:59:11.976 2 DEBUG oslo_concurrency.lockutils [None req-f95ce8cc-5d57-40f2-88ef-ee143d9a4d33 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:11.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:12 np0005465988 nova_compute[236126]: 2025-10-02 12:59:12.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:12.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:13 np0005465988 nova_compute[236126]: 2025-10-02 12:59:13.108 2 DEBUG nova.compute.manager [req-624913a2-d4ba-49ce-bb93-cca830c6c49e req-5066b838-2bc3-4fc5-9482-8af8d449c060 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Received event network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:59:13 np0005465988 nova_compute[236126]: 2025-10-02 12:59:13.109 2 DEBUG oslo_concurrency.lockutils [req-624913a2-d4ba-49ce-bb93-cca830c6c49e req-5066b838-2bc3-4fc5-9482-8af8d449c060 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:13 np0005465988 nova_compute[236126]: 2025-10-02 12:59:13.109 2 DEBUG oslo_concurrency.lockutils [req-624913a2-d4ba-49ce-bb93-cca830c6c49e req-5066b838-2bc3-4fc5-9482-8af8d449c060 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:13 np0005465988 nova_compute[236126]: 2025-10-02 12:59:13.109 2 DEBUG oslo_concurrency.lockutils [req-624913a2-d4ba-49ce-bb93-cca830c6c49e req-5066b838-2bc3-4fc5-9482-8af8d449c060 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:13 np0005465988 nova_compute[236126]: 2025-10-02 12:59:13.109 2 DEBUG nova.compute.manager [req-624913a2-d4ba-49ce-bb93-cca830c6c49e req-5066b838-2bc3-4fc5-9482-8af8d449c060 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] No waiting events found dispatching network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:59:13 np0005465988 nova_compute[236126]: 2025-10-02 12:59:13.110 2 WARNING nova.compute.manager [req-624913a2-d4ba-49ce-bb93-cca830c6c49e req-5066b838-2bc3-4fc5-9482-8af8d449c060 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Received unexpected event network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:59:13 np0005465988 nova_compute[236126]: 2025-10-02 12:59:13.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:13 np0005465988 nova_compute[236126]: 2025-10-02 12:59:13.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:13 np0005465988 nova_compute[236126]: 2025-10-02 12:59:13.629 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:13 np0005465988 nova_compute[236126]: 2025-10-02 12:59:13.629 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:13 np0005465988 nova_compute[236126]: 2025-10-02 12:59:13.630 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:13 np0005465988 nova_compute[236126]: 2025-10-02 12:59:13.630 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:59:13 np0005465988 nova_compute[236126]: 2025-10-02 12:59:13.631 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:13.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:59:14 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3794624728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:59:14 np0005465988 nova_compute[236126]: 2025-10-02 12:59:14.094 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:14 np0005465988 nova_compute[236126]: 2025-10-02 12:59:14.440 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:59:14 np0005465988 nova_compute[236126]: 2025-10-02 12:59:14.441 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 08:59:14 np0005465988 nova_compute[236126]: 2025-10-02 12:59:14.614 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:59:14 np0005465988 nova_compute[236126]: 2025-10-02 12:59:14.616 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3859MB free_disk=20.921710968017578GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:59:14 np0005465988 nova_compute[236126]: 2025-10-02 12:59:14.616 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:14 np0005465988 nova_compute[236126]: 2025-10-02 12:59:14.617 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:14 np0005465988 nova_compute[236126]: 2025-10-02 12:59:14.691 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance e9f655a6-48fa-4199-8652-5c5b47440055 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:59:14 np0005465988 nova_compute[236126]: 2025-10-02 12:59:14.691 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:59:14 np0005465988 nova_compute[236126]: 2025-10-02 12:59:14.692 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:59:14 np0005465988 nova_compute[236126]: 2025-10-02 12:59:14.725 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:14.734 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:14.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:59:15 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2952988850' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:59:15 np0005465988 nova_compute[236126]: 2025-10-02 12:59:15.169 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:15 np0005465988 nova_compute[236126]: 2025-10-02 12:59:15.175 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:59:15 np0005465988 nova_compute[236126]: 2025-10-02 12:59:15.199 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:59:15 np0005465988 nova_compute[236126]: 2025-10-02 12:59:15.267 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:59:15 np0005465988 nova_compute[236126]: 2025-10-02 12:59:15.268 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:15 np0005465988 nova_compute[236126]: 2025-10-02 12:59:15.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:59:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:15.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:16 np0005465988 nova_compute[236126]: 2025-10-02 12:59:16.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:16 np0005465988 NetworkManager[45041]: <info>  [1759409956.6932] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/380)
Oct  2 08:59:16 np0005465988 NetworkManager[45041]: <info>  [1759409956.6943] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/381)
Oct  2 08:59:16 np0005465988 nova_compute[236126]: 2025-10-02 12:59:16.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:16 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:16Z|00875|binding|INFO|Releasing lport d5135195-c335-444d-8d55-41c04f12d49b from this chassis (sb_readonly=0)
Oct  2 08:59:16 np0005465988 nova_compute[236126]: 2025-10-02 12:59:16.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:16.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.075 2 DEBUG nova.compute.manager [req-bc50dccf-5a96-4934-b95a-2d7ff5d3bf47 req-0434b1ef-c070-4ae1-856c-05e396336126 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Received event network-changed-74de555c-711e-4e21-a6f4-89c8289aae93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.076 2 DEBUG nova.compute.manager [req-bc50dccf-5a96-4934-b95a-2d7ff5d3bf47 req-0434b1ef-c070-4ae1-856c-05e396336126 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Refreshing instance network info cache due to event network-changed-74de555c-711e-4e21-a6f4-89c8289aae93. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.076 2 DEBUG oslo_concurrency.lockutils [req-bc50dccf-5a96-4934-b95a-2d7ff5d3bf47 req-0434b1ef-c070-4ae1-856c-05e396336126 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-e9f655a6-48fa-4199-8652-5c5b47440055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.077 2 DEBUG oslo_concurrency.lockutils [req-bc50dccf-5a96-4934-b95a-2d7ff5d3bf47 req-0434b1ef-c070-4ae1-856c-05e396336126 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-e9f655a6-48fa-4199-8652-5c5b47440055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.077 2 DEBUG nova.network.neutron [req-bc50dccf-5a96-4934-b95a-2d7ff5d3bf47 req-0434b1ef-c070-4ae1-856c-05e396336126 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Refreshing network info cache for port 74de555c-711e-4e21-a6f4-89c8289aae93 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.708 2 DEBUG oslo_concurrency.lockutils [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "e9f655a6-48fa-4199-8652-5c5b47440055" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.708 2 DEBUG oslo_concurrency.lockutils [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.709 2 DEBUG oslo_concurrency.lockutils [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.709 2 DEBUG oslo_concurrency.lockutils [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.709 2 DEBUG oslo_concurrency.lockutils [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.711 2 INFO nova.compute.manager [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Terminating instance#033[00m
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.712 2 DEBUG nova.compute.manager [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:59:17 np0005465988 kernel: tap74de555c-71 (unregistering): left promiscuous mode
Oct  2 08:59:17 np0005465988 NetworkManager[45041]: <info>  [1759409957.8425] device (tap74de555c-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:59:17 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:17Z|00876|binding|INFO|Releasing lport 74de555c-711e-4e21-a6f4-89c8289aae93 from this chassis (sb_readonly=0)
Oct  2 08:59:17 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:17Z|00877|binding|INFO|Setting lport 74de555c-711e-4e21-a6f4-89c8289aae93 down in Southbound
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:17 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:17Z|00878|binding|INFO|Removing iface tap74de555c-71 ovn-installed in OVS
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:17 np0005465988 nova_compute[236126]: 2025-10-02 12:59:17.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:17.921 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:91:d3 10.100.0.6'], port_security=['fa:16:3e:fa:91:d3 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1996543852', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e9f655a6-48fa-4199-8652-5c5b47440055', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1996543852', 'neutron:project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c114e1a6-21d7-49a2-a13f-595584b99547', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09ec6ed0-28d8-4666-8087-300b86d2afe4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=74de555c-711e-4e21-a6f4-89c8289aae93) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:59:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:17.923 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 74de555c-711e-4e21-a6f4-89c8289aae93 in datapath cf9dc276-03fd-47d7-92fb-6f6d94b7d169 unbound from our chassis#033[00m
Oct  2 08:59:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:17.925 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cf9dc276-03fd-47d7-92fb-6f6d94b7d169, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:59:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:17.927 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a0f14507-161d-41f2-8ab5-f7b429e2b0d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:17.927 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169 namespace which is not needed anymore#033[00m
Oct  2 08:59:17 np0005465988 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000c1.scope: Deactivated successfully.
Oct  2 08:59:17 np0005465988 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000c1.scope: Consumed 7.734s CPU time.
Oct  2 08:59:17 np0005465988 systemd-machined[192594]: Machine qemu-90-instance-000000c1 terminated.
Oct  2 08:59:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:17.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:18 np0005465988 neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169[326015]: [NOTICE]   (326019) : haproxy version is 2.8.14-c23fe91
Oct  2 08:59:18 np0005465988 neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169[326015]: [NOTICE]   (326019) : path to executable is /usr/sbin/haproxy
Oct  2 08:59:18 np0005465988 neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169[326015]: [WARNING]  (326019) : Exiting Master process...
Oct  2 08:59:18 np0005465988 neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169[326015]: [ALERT]    (326019) : Current worker (326021) exited with code 143 (Terminated)
Oct  2 08:59:18 np0005465988 neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169[326015]: [WARNING]  (326019) : All workers exited. Exiting... (0)
Oct  2 08:59:18 np0005465988 systemd[1]: libpod-86202e51860a723013f91b494fb8f184fc588097af5f2bd3f6425fbbb274b33d.scope: Deactivated successfully.
Oct  2 08:59:18 np0005465988 podman[326105]: 2025-10-02 12:59:18.064713651 +0000 UTC m=+0.049187347 container died 86202e51860a723013f91b494fb8f184fc588097af5f2bd3f6425fbbb274b33d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:59:18 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-86202e51860a723013f91b494fb8f184fc588097af5f2bd3f6425fbbb274b33d-userdata-shm.mount: Deactivated successfully.
Oct  2 08:59:18 np0005465988 systemd[1]: var-lib-containers-storage-overlay-b09e083b847f1f7c59faa93fb523f50c45965c4119512e73607386d22a9eaf8b-merged.mount: Deactivated successfully.
Oct  2 08:59:18 np0005465988 podman[326105]: 2025-10-02 12:59:18.107054491 +0000 UTC m=+0.091528177 container cleanup 86202e51860a723013f91b494fb8f184fc588097af5f2bd3f6425fbbb274b33d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:59:18 np0005465988 systemd[1]: libpod-conmon-86202e51860a723013f91b494fb8f184fc588097af5f2bd3f6425fbbb274b33d.scope: Deactivated successfully.
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.155 2 INFO nova.virt.libvirt.driver [-] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Instance destroyed successfully.#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.156 2 DEBUG nova.objects.instance [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'resources' on Instance uuid e9f655a6-48fa-4199-8652-5c5b47440055 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.171 2 DEBUG nova.virt.libvirt.vif [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:58:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1546173084',display_name='tempest-TestNetworkBasicOps-server-1546173084',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1546173084',id=193,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOPOsj0pxh52taYqW5zA4gKjjHmyewWOEGLCwLWJ6CevctDo6M7sV0+kJbHre7Znry9axU8SnqeKBbCrrcdDwloF7+x3JIq882zktzE6ifusqsHta6salUdSLjgl9p6bKw==',key_name='tempest-TestNetworkBasicOps-1245745945',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:59:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-yn0o1vv6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:59:11Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=e9f655a6-48fa-4199-8652-5c5b47440055,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.173 2 DEBUG nova.network.os_vif_util [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.173 2 DEBUG nova.network.os_vif_util [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:d3,bridge_name='br-int',has_traffic_filtering=True,id=74de555c-711e-4e21-a6f4-89c8289aae93,network=Network(cf9dc276-03fd-47d7-92fb-6f6d94b7d169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap74de555c-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.174 2 DEBUG os_vif [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:d3,bridge_name='br-int',has_traffic_filtering=True,id=74de555c-711e-4e21-a6f4-89c8289aae93,network=Network(cf9dc276-03fd-47d7-92fb-6f6d94b7d169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap74de555c-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.176 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74de555c-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:18 np0005465988 podman[326137]: 2025-10-02 12:59:18.191002449 +0000 UTC m=+0.054484741 container remove 86202e51860a723013f91b494fb8f184fc588097af5f2bd3f6425fbbb274b33d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.231 2 INFO os_vif [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:d3,bridge_name='br-int',has_traffic_filtering=True,id=74de555c-711e-4e21-a6f4-89c8289aae93,network=Network(cf9dc276-03fd-47d7-92fb-6f6d94b7d169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap74de555c-71')#033[00m
Oct  2 08:59:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:18.233 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[574e323d-ede7-43ac-ab81-795458f73700]: (4, ('Thu Oct  2 12:59:18 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169 (86202e51860a723013f91b494fb8f184fc588097af5f2bd3f6425fbbb274b33d)\n86202e51860a723013f91b494fb8f184fc588097af5f2bd3f6425fbbb274b33d\nThu Oct  2 12:59:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169 (86202e51860a723013f91b494fb8f184fc588097af5f2bd3f6425fbbb274b33d)\n86202e51860a723013f91b494fb8f184fc588097af5f2bd3f6425fbbb274b33d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:18.235 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[24e7a6a4-bd99-4973-9226-f5053e80715e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:18.236 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcf9dc276-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:18 np0005465988 kernel: tapcf9dc276-00: left promiscuous mode
Oct  2 08:59:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:18.256 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1a0b59bd-b6f3-4aaa-bbd1-8b8a454fb176]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.262 2 DEBUG nova.compute.manager [req-05ce0d4a-2dba-4799-b93d-b22db3759331 req-d5cecddf-2ee6-42f3-a67d-ebb7b7b80620 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Received event network-vif-unplugged-74de555c-711e-4e21-a6f4-89c8289aae93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.263 2 DEBUG oslo_concurrency.lockutils [req-05ce0d4a-2dba-4799-b93d-b22db3759331 req-d5cecddf-2ee6-42f3-a67d-ebb7b7b80620 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.264 2 DEBUG oslo_concurrency.lockutils [req-05ce0d4a-2dba-4799-b93d-b22db3759331 req-d5cecddf-2ee6-42f3-a67d-ebb7b7b80620 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.264 2 DEBUG oslo_concurrency.lockutils [req-05ce0d4a-2dba-4799-b93d-b22db3759331 req-d5cecddf-2ee6-42f3-a67d-ebb7b7b80620 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.265 2 DEBUG nova.compute.manager [req-05ce0d4a-2dba-4799-b93d-b22db3759331 req-d5cecddf-2ee6-42f3-a67d-ebb7b7b80620 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] No waiting events found dispatching network-vif-unplugged-74de555c-711e-4e21-a6f4-89c8289aae93 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.265 2 DEBUG nova.compute.manager [req-05ce0d4a-2dba-4799-b93d-b22db3759331 req-d5cecddf-2ee6-42f3-a67d-ebb7b7b80620 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Received event network-vif-unplugged-74de555c-711e-4e21-a6f4-89c8289aae93 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:59:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:18.290 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[686cb6cc-bd5e-44a6-9634-fa7516d9d1a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:18.293 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[86cb957c-e0d2-42a7-9319-fff5091bc274]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:18.308 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f787c460-aa67-4c37-a8e6-6991fdc2fac4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 800976, 'reachable_time': 29642, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 326176, 'error': None, 'target': 'ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:18.311 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:59:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:18.311 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[0896f102-fa2d-4858-9095-c691a9d60fe7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:18 np0005465988 systemd[1]: run-netns-ovnmeta\x2dcf9dc276\x2d03fd\x2d47d7\x2d92fb\x2d6f6d94b7d169.mount: Deactivated successfully.
Oct  2 08:59:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:18.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.946 2 DEBUG nova.network.neutron [req-bc50dccf-5a96-4934-b95a-2d7ff5d3bf47 req-0434b1ef-c070-4ae1-856c-05e396336126 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Updated VIF entry in instance network info cache for port 74de555c-711e-4e21-a6f4-89c8289aae93. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.947 2 DEBUG nova.network.neutron [req-bc50dccf-5a96-4934-b95a-2d7ff5d3bf47 req-0434b1ef-c070-4ae1-856c-05e396336126 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Updating instance_info_cache with network_info: [{"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:59:18 np0005465988 nova_compute[236126]: 2025-10-02 12:59:18.966 2 DEBUG oslo_concurrency.lockutils [req-bc50dccf-5a96-4934-b95a-2d7ff5d3bf47 req-0434b1ef-c070-4ae1-856c-05e396336126 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-e9f655a6-48fa-4199-8652-5c5b47440055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:59:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:19.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:20 np0005465988 nova_compute[236126]: 2025-10-02 12:59:20.100 2 INFO nova.virt.libvirt.driver [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Deleting instance files /var/lib/nova/instances/e9f655a6-48fa-4199-8652-5c5b47440055_del#033[00m
Oct  2 08:59:20 np0005465988 nova_compute[236126]: 2025-10-02 12:59:20.102 2 INFO nova.virt.libvirt.driver [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Deletion of /var/lib/nova/instances/e9f655a6-48fa-4199-8652-5c5b47440055_del complete#033[00m
Oct  2 08:59:20 np0005465988 nova_compute[236126]: 2025-10-02 12:59:20.169 2 INFO nova.compute.manager [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Took 2.46 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:59:20 np0005465988 nova_compute[236126]: 2025-10-02 12:59:20.170 2 DEBUG oslo.service.loopingcall [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:59:20 np0005465988 nova_compute[236126]: 2025-10-02 12:59:20.171 2 DEBUG nova.compute.manager [-] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:59:20 np0005465988 nova_compute[236126]: 2025-10-02 12:59:20.171 2 DEBUG nova.network.neutron [-] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:59:20 np0005465988 nova_compute[236126]: 2025-10-02 12:59:20.346 2 DEBUG nova.compute.manager [req-1df0795b-bc11-4a9c-8198-d6b4dcd1ca4a req-62b4d9a5-ed41-4fd3-b5dc-3afb6e3923d3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Received event network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:59:20 np0005465988 nova_compute[236126]: 2025-10-02 12:59:20.347 2 DEBUG oslo_concurrency.lockutils [req-1df0795b-bc11-4a9c-8198-d6b4dcd1ca4a req-62b4d9a5-ed41-4fd3-b5dc-3afb6e3923d3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:20 np0005465988 nova_compute[236126]: 2025-10-02 12:59:20.347 2 DEBUG oslo_concurrency.lockutils [req-1df0795b-bc11-4a9c-8198-d6b4dcd1ca4a req-62b4d9a5-ed41-4fd3-b5dc-3afb6e3923d3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:20 np0005465988 nova_compute[236126]: 2025-10-02 12:59:20.347 2 DEBUG oslo_concurrency.lockutils [req-1df0795b-bc11-4a9c-8198-d6b4dcd1ca4a req-62b4d9a5-ed41-4fd3-b5dc-3afb6e3923d3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:20 np0005465988 nova_compute[236126]: 2025-10-02 12:59:20.348 2 DEBUG nova.compute.manager [req-1df0795b-bc11-4a9c-8198-d6b4dcd1ca4a req-62b4d9a5-ed41-4fd3-b5dc-3afb6e3923d3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] No waiting events found dispatching network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:59:20 np0005465988 nova_compute[236126]: 2025-10-02 12:59:20.348 2 WARNING nova.compute.manager [req-1df0795b-bc11-4a9c-8198-d6b4dcd1ca4a req-62b4d9a5-ed41-4fd3-b5dc-3afb6e3923d3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Received unexpected event network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:59:20 np0005465988 nova_compute[236126]: 2025-10-02 12:59:20.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:59:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:20.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:21.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:22 np0005465988 nova_compute[236126]: 2025-10-02 12:59:22.208 2 DEBUG nova.network.neutron [-] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:59:22 np0005465988 nova_compute[236126]: 2025-10-02 12:59:22.232 2 INFO nova.compute.manager [-] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Took 2.06 seconds to deallocate network for instance.#033[00m
Oct  2 08:59:22 np0005465988 nova_compute[236126]: 2025-10-02 12:59:22.287 2 DEBUG oslo_concurrency.lockutils [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:22 np0005465988 nova_compute[236126]: 2025-10-02 12:59:22.288 2 DEBUG oslo_concurrency.lockutils [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:22 np0005465988 nova_compute[236126]: 2025-10-02 12:59:22.367 2 DEBUG oslo_concurrency.processutils [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:22 np0005465988 podman[326231]: 2025-10-02 12:59:22.564718186 +0000 UTC m=+0.080771718 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:59:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:59:22 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/166007406' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:59:22 np0005465988 nova_compute[236126]: 2025-10-02 12:59:22.868 2 DEBUG oslo_concurrency.processutils [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:22 np0005465988 nova_compute[236126]: 2025-10-02 12:59:22.875 2 DEBUG nova.compute.provider_tree [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:59:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:22.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:22 np0005465988 nova_compute[236126]: 2025-10-02 12:59:22.908 2 DEBUG nova.scheduler.client.report [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:59:22 np0005465988 nova_compute[236126]: 2025-10-02 12:59:22.949 2 DEBUG oslo_concurrency.lockutils [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.661s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:23 np0005465988 nova_compute[236126]: 2025-10-02 12:59:23.212 2 INFO nova.scheduler.client.report [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Deleted allocations for instance e9f655a6-48fa-4199-8652-5c5b47440055#033[00m
Oct  2 08:59:23 np0005465988 nova_compute[236126]: 2025-10-02 12:59:23.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:23 np0005465988 nova_compute[236126]: 2025-10-02 12:59:23.364 2 DEBUG oslo_concurrency.lockutils [None req-b950ec86-2c06-4b47-87b1-a3ee3f24d293 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "e9f655a6-48fa-4199-8652-5c5b47440055" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:23.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:24 np0005465988 nova_compute[236126]: 2025-10-02 12:59:24.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:24 np0005465988 nova_compute[236126]: 2025-10-02 12:59:24.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:24 np0005465988 nova_compute[236126]: 2025-10-02 12:59:24.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:59:24 np0005465988 nova_compute[236126]: 2025-10-02 12:59:24.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:24 np0005465988 nova_compute[236126]: 2025-10-02 12:59:24.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:59:24 np0005465988 nova_compute[236126]: 2025-10-02 12:59:24.499 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:59:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:24.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:25 np0005465988 nova_compute[236126]: 2025-10-02 12:59:25.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:59:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:25.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:26.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:27.406 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:27.407 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:27.407 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:27 np0005465988 nova_compute[236126]: 2025-10-02 12:59:27.500 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:27.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:28 np0005465988 nova_compute[236126]: 2025-10-02 12:59:28.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:28 np0005465988 nova_compute[236126]: 2025-10-02 12:59:28.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:28 np0005465988 nova_compute[236126]: 2025-10-02 12:59:28.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:28.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 08:59:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:59:29 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 08:59:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:29.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:30 np0005465988 nova_compute[236126]: 2025-10-02 12:59:30.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:59:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:30.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:31 np0005465988 nova_compute[236126]: 2025-10-02 12:59:31.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:31.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:32 np0005465988 nova_compute[236126]: 2025-10-02 12:59:32.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:32 np0005465988 nova_compute[236126]: 2025-10-02 12:59:32.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:59:32 np0005465988 nova_compute[236126]: 2025-10-02 12:59:32.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:59:32 np0005465988 nova_compute[236126]: 2025-10-02 12:59:32.512 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:59:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:32.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:33 np0005465988 nova_compute[236126]: 2025-10-02 12:59:33.155 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409958.1536925, e9f655a6-48fa-4199-8652-5c5b47440055 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:59:33 np0005465988 nova_compute[236126]: 2025-10-02 12:59:33.155 2 INFO nova.compute.manager [-] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:59:33 np0005465988 nova_compute[236126]: 2025-10-02 12:59:33.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:33 np0005465988 podman[326408]: 2025-10-02 12:59:33.537983969 +0000 UTC m=+0.070410689 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:59:33 np0005465988 podman[326409]: 2025-10-02 12:59:33.542122688 +0000 UTC m=+0.070791100 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:59:33 np0005465988 podman[326407]: 2025-10-02 12:59:33.573252804 +0000 UTC m=+0.107604610 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct  2 08:59:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:34.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:34 np0005465988 nova_compute[236126]: 2025-10-02 12:59:34.210 2 DEBUG nova.compute.manager [None req-8f76fbb8-a1fa-4380-93c6-c3f6534a16ec - - - - - -] [instance: e9f655a6-48fa-4199-8652-5c5b47440055] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:59:34 np0005465988 nova_compute[236126]: 2025-10-02 12:59:34.631 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "5fca9509-b756-4d01-a533-1f53ccd1c749" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:34 np0005465988 nova_compute[236126]: 2025-10-02 12:59:34.632 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:34 np0005465988 nova_compute[236126]: 2025-10-02 12:59:34.666 2 DEBUG nova.compute.manager [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:59:34 np0005465988 nova_compute[236126]: 2025-10-02 12:59:34.772 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:34 np0005465988 nova_compute[236126]: 2025-10-02 12:59:34.773 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:34 np0005465988 nova_compute[236126]: 2025-10-02 12:59:34.779 2 DEBUG nova.virt.hardware [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:59:34 np0005465988 nova_compute[236126]: 2025-10-02 12:59:34.780 2 INFO nova.compute.claims [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:59:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:34.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:34 np0005465988 nova_compute[236126]: 2025-10-02 12:59:34.975 2 DEBUG oslo_concurrency.processutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:59:35 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/783575176' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:59:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.485 2 DEBUG oslo_concurrency.processutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.491 2 DEBUG nova.compute.provider_tree [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.538 2 DEBUG nova.scheduler.client.report [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.589 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.816s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.590 2 DEBUG nova.compute.manager [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.649 2 DEBUG nova.compute.manager [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.650 2 DEBUG nova.network.neutron [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.676 2 INFO nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.702 2 DEBUG nova.compute.manager [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.812 2 DEBUG nova.compute.manager [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.813 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.814 2 INFO nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Creating image(s)#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.837 2 DEBUG nova.storage.rbd_utils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image 5fca9509-b756-4d01-a533-1f53ccd1c749_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.937 2 DEBUG nova.storage.rbd_utils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image 5fca9509-b756-4d01-a533-1f53ccd1c749_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.969 2 DEBUG nova.storage.rbd_utils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image 5fca9509-b756-4d01-a533-1f53ccd1c749_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:59:35 np0005465988 nova_compute[236126]: 2025-10-02 12:59:35.975 2 DEBUG oslo_concurrency.processutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:36.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:36 np0005465988 nova_compute[236126]: 2025-10-02 12:59:36.069 2 DEBUG oslo_concurrency.processutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:36 np0005465988 nova_compute[236126]: 2025-10-02 12:59:36.069 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:36 np0005465988 nova_compute[236126]: 2025-10-02 12:59:36.070 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:36 np0005465988 nova_compute[236126]: 2025-10-02 12:59:36.070 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:36 np0005465988 nova_compute[236126]: 2025-10-02 12:59:36.095 2 DEBUG nova.storage.rbd_utils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image 5fca9509-b756-4d01-a533-1f53ccd1c749_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:59:36 np0005465988 nova_compute[236126]: 2025-10-02 12:59:36.103 2 DEBUG oslo_concurrency.processutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 5fca9509-b756-4d01-a533-1f53ccd1c749_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:36 np0005465988 nova_compute[236126]: 2025-10-02 12:59:36.246 2 DEBUG nova.policy [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fb366465e6154871b8a53c9f500105ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:59:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:36.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:37 np0005465988 nova_compute[236126]: 2025-10-02 12:59:37.064 2 DEBUG oslo_concurrency.processutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 5fca9509-b756-4d01-a533-1f53ccd1c749_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.962s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:37 np0005465988 nova_compute[236126]: 2025-10-02 12:59:37.157 2 DEBUG nova.storage.rbd_utils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] resizing rbd image 5fca9509-b756-4d01-a533-1f53ccd1c749_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 08:59:37 np0005465988 nova_compute[236126]: 2025-10-02 12:59:37.304 2 DEBUG nova.objects.instance [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'migration_context' on Instance uuid 5fca9509-b756-4d01-a533-1f53ccd1c749 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:59:37 np0005465988 nova_compute[236126]: 2025-10-02 12:59:37.426 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:59:37 np0005465988 nova_compute[236126]: 2025-10-02 12:59:37.428 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Ensure instance console log exists: /var/lib/nova/instances/5fca9509-b756-4d01-a533-1f53ccd1c749/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:59:37 np0005465988 nova_compute[236126]: 2025-10-02 12:59:37.429 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:37 np0005465988 nova_compute[236126]: 2025-10-02 12:59:37.430 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:37 np0005465988 nova_compute[236126]: 2025-10-02 12:59:37.430 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:37 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:59:37 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 08:59:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:38.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:38 np0005465988 nova_compute[236126]: 2025-10-02 12:59:38.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:38.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:40.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:40 np0005465988 nova_compute[236126]: 2025-10-02 12:59:40.082 2 DEBUG nova.network.neutron [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Successfully updated port: 74de555c-711e-4e21-a6f4-89c8289aae93 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:59:40 np0005465988 nova_compute[236126]: 2025-10-02 12:59:40.101 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "refresh_cache-5fca9509-b756-4d01-a533-1f53ccd1c749" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:59:40 np0005465988 nova_compute[236126]: 2025-10-02 12:59:40.102 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquired lock "refresh_cache-5fca9509-b756-4d01-a533-1f53ccd1c749" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:59:40 np0005465988 nova_compute[236126]: 2025-10-02 12:59:40.102 2 DEBUG nova.network.neutron [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:59:40 np0005465988 nova_compute[236126]: 2025-10-02 12:59:40.251 2 DEBUG nova.network.neutron [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:59:40 np0005465988 nova_compute[236126]: 2025-10-02 12:59:40.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:59:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:40.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.399 2 DEBUG nova.network.neutron [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Updating instance_info_cache with network_info: [{"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.433 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Releasing lock "refresh_cache-5fca9509-b756-4d01-a533-1f53ccd1c749" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.433 2 DEBUG nova.compute.manager [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Instance network_info: |[{"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.438 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Start _get_guest_xml network_info=[{"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.447 2 WARNING nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.453 2 DEBUG nova.virt.libvirt.host [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.454 2 DEBUG nova.virt.libvirt.host [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.460 2 DEBUG nova.virt.libvirt.host [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.460 2 DEBUG nova.virt.libvirt.host [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.462 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.463 2 DEBUG nova.virt.hardware [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.464 2 DEBUG nova.virt.hardware [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.464 2 DEBUG nova.virt.hardware [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.465 2 DEBUG nova.virt.hardware [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.465 2 DEBUG nova.virt.hardware [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.466 2 DEBUG nova.virt.hardware [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.466 2 DEBUG nova.virt.hardware [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.467 2 DEBUG nova.virt.hardware [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.467 2 DEBUG nova.virt.hardware [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.468 2 DEBUG nova.virt.hardware [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.468 2 DEBUG nova.virt.hardware [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.474 2 DEBUG oslo_concurrency.processutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.746 2 DEBUG nova.compute.manager [req-b6454312-750b-48f3-959f-e3a2b2fc5827 req-7b73d44d-2374-46fe-ab76-5f33c4b9dae1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Received event network-changed-74de555c-711e-4e21-a6f4-89c8289aae93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.747 2 DEBUG nova.compute.manager [req-b6454312-750b-48f3-959f-e3a2b2fc5827 req-7b73d44d-2374-46fe-ab76-5f33c4b9dae1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Refreshing instance network info cache due to event network-changed-74de555c-711e-4e21-a6f4-89c8289aae93. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.748 2 DEBUG oslo_concurrency.lockutils [req-b6454312-750b-48f3-959f-e3a2b2fc5827 req-7b73d44d-2374-46fe-ab76-5f33c4b9dae1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-5fca9509-b756-4d01-a533-1f53ccd1c749" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.748 2 DEBUG oslo_concurrency.lockutils [req-b6454312-750b-48f3-959f-e3a2b2fc5827 req-7b73d44d-2374-46fe-ab76-5f33c4b9dae1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-5fca9509-b756-4d01-a533-1f53ccd1c749" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.749 2 DEBUG nova.network.neutron [req-b6454312-750b-48f3-959f-e3a2b2fc5827 req-7b73d44d-2374-46fe-ab76-5f33c4b9dae1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Refreshing network info cache for port 74de555c-711e-4e21-a6f4-89c8289aae93 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:59:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:59:41 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3336498066' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:59:41 np0005465988 nova_compute[236126]: 2025-10-02 12:59:41.968 2 DEBUG oslo_concurrency.processutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.003 2 DEBUG nova.storage.rbd_utils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image 5fca9509-b756-4d01-a533-1f53ccd1c749_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.009 2 DEBUG oslo_concurrency.processutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:42.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 08:59:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3263233194' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.480 2 DEBUG oslo_concurrency.processutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.483 2 DEBUG nova.virt.libvirt.vif [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:59:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-416029129',display_name='tempest-TestNetworkBasicOps-server-416029129',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-416029129',id=195,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEX0Mk0a5E5v3x8m2u9I/6ViGfoOZFYOxEtTP/WXi7vw2UKM5WOROIAQ6lmEigioWie1J23wHKcklEZulWTABVkRNG/2t5U0nSLlttJPO0YfDfbgNp43IExtvZ93fzfmWA==',key_name='tempest-TestNetworkBasicOps-725216195',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-qu85rrbh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:59:35Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=5fca9509-b756-4d01-a533-1f53ccd1c749,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.484 2 DEBUG nova.network.os_vif_util [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.485 2 DEBUG nova.network.os_vif_util [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:d3,bridge_name='br-int',has_traffic_filtering=True,id=74de555c-711e-4e21-a6f4-89c8289aae93,network=Network(cf9dc276-03fd-47d7-92fb-6f6d94b7d169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap74de555c-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.488 2 DEBUG nova.objects.instance [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'pci_devices' on Instance uuid 5fca9509-b756-4d01-a533-1f53ccd1c749 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.535 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  <uuid>5fca9509-b756-4d01-a533-1f53ccd1c749</uuid>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  <name>instance-000000c3</name>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestNetworkBasicOps-server-416029129</nova:name>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 12:59:41</nova:creationTime>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <nova:user uuid="fb366465e6154871b8a53c9f500105ce">tempest-TestNetworkBasicOps-1692262680-project-member</nova:user>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <nova:project uuid="ce2ca82c03554560b55ed747ae63f1fb">tempest-TestNetworkBasicOps-1692262680</nova:project>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <nova:port uuid="74de555c-711e-4e21-a6f4-89c8289aae93">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <system>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <entry name="serial">5fca9509-b756-4d01-a533-1f53ccd1c749</entry>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <entry name="uuid">5fca9509-b756-4d01-a533-1f53ccd1c749</entry>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    </system>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  <os>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  </os>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  <features>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  </features>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  </clock>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  <devices>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/5fca9509-b756-4d01-a533-1f53ccd1c749_disk">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/5fca9509-b756-4d01-a533-1f53ccd1c749_disk.config">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      </source>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      </auth>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    </disk>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:fa:91:d3"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <target dev="tap74de555c-71"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    </interface>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/5fca9509-b756-4d01-a533-1f53ccd1c749/console.log" append="off"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    </serial>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <video>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    </video>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    </rng>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 08:59:42 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 08:59:42 np0005465988 nova_compute[236126]:  </devices>
Oct  2 08:59:42 np0005465988 nova_compute[236126]: </domain>
Oct  2 08:59:42 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.535 2 DEBUG nova.compute.manager [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Preparing to wait for external event network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.536 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.536 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.537 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.538 2 DEBUG nova.virt.libvirt.vif [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:59:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-416029129',display_name='tempest-TestNetworkBasicOps-server-416029129',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-416029129',id=195,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEX0Mk0a5E5v3x8m2u9I/6ViGfoOZFYOxEtTP/WXi7vw2UKM5WOROIAQ6lmEigioWie1J23wHKcklEZulWTABVkRNG/2t5U0nSLlttJPO0YfDfbgNp43IExtvZ93fzfmWA==',key_name='tempest-TestNetworkBasicOps-725216195',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-qu85rrbh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:59:35Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=5fca9509-b756-4d01-a533-1f53ccd1c749,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.538 2 DEBUG nova.network.os_vif_util [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.539 2 DEBUG nova.network.os_vif_util [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:d3,bridge_name='br-int',has_traffic_filtering=True,id=74de555c-711e-4e21-a6f4-89c8289aae93,network=Network(cf9dc276-03fd-47d7-92fb-6f6d94b7d169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap74de555c-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.540 2 DEBUG os_vif [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:d3,bridge_name='br-int',has_traffic_filtering=True,id=74de555c-711e-4e21-a6f4-89c8289aae93,network=Network(cf9dc276-03fd-47d7-92fb-6f6d94b7d169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap74de555c-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:42 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.542 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.542 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74de555c-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.548 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap74de555c-71, col_values=(('external_ids', {'iface-id': '74de555c-711e-4e21-a6f4-89c8289aae93', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:91:d3', 'vm-uuid': '5fca9509-b756-4d01-a533-1f53ccd1c749'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:42 np0005465988 NetworkManager[45041]: <info>  [1759409982.5519] manager: (tap74de555c-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/382)
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.561 2 INFO os_vif [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:d3,bridge_name='br-int',has_traffic_filtering=True,id=74de555c-711e-4e21-a6f4-89c8289aae93,network=Network(cf9dc276-03fd-47d7-92fb-6f6d94b7d169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap74de555c-71')#033[00m
Oct  2 08:59:42 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.630 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.631 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.631 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No VIF found with MAC fa:16:3e:fa:91:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.631 2 INFO nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Using config drive#033[00m
Oct  2 08:59:42 np0005465988 nova_compute[236126]: 2025-10-02 12:59:42.662 2 DEBUG nova.storage.rbd_utils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image 5fca9509-b756-4d01-a533-1f53ccd1c749_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:59:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 08:59:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:42.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 08:59:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:44.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:44 np0005465988 nova_compute[236126]: 2025-10-02 12:59:44.534 2 INFO nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Creating config drive at /var/lib/nova/instances/5fca9509-b756-4d01-a533-1f53ccd1c749/disk.config#033[00m
Oct  2 08:59:44 np0005465988 nova_compute[236126]: 2025-10-02 12:59:44.539 2 DEBUG oslo_concurrency.processutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5fca9509-b756-4d01-a533-1f53ccd1c749/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa4s98qdi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:44 np0005465988 nova_compute[236126]: 2025-10-02 12:59:44.696 2 DEBUG oslo_concurrency.processutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5fca9509-b756-4d01-a533-1f53ccd1c749/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa4s98qdi" returned: 0 in 0.157s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:44 np0005465988 nova_compute[236126]: 2025-10-02 12:59:44.730 2 DEBUG nova.storage.rbd_utils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image 5fca9509-b756-4d01-a533-1f53ccd1c749_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 08:59:44 np0005465988 nova_compute[236126]: 2025-10-02 12:59:44.735 2 DEBUG oslo_concurrency.processutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5fca9509-b756-4d01-a533-1f53ccd1c749/disk.config 5fca9509-b756-4d01-a533-1f53ccd1c749_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:44.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:44 np0005465988 nova_compute[236126]: 2025-10-02 12:59:44.993 2 DEBUG oslo_concurrency.processutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5fca9509-b756-4d01-a533-1f53ccd1c749/disk.config 5fca9509-b756-4d01-a533-1f53ccd1c749_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.258s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:44 np0005465988 nova_compute[236126]: 2025-10-02 12:59:44.995 2 INFO nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Deleting local config drive /var/lib/nova/instances/5fca9509-b756-4d01-a533-1f53ccd1c749/disk.config because it was imported into RBD.#033[00m
Oct  2 08:59:45 np0005465988 kernel: tap74de555c-71: entered promiscuous mode
Oct  2 08:59:45 np0005465988 NetworkManager[45041]: <info>  [1759409985.0554] manager: (tap74de555c-71): new Tun device (/org/freedesktop/NetworkManager/Devices/383)
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:45Z|00879|binding|INFO|Claiming lport 74de555c-711e-4e21-a6f4-89c8289aae93 for this chassis.
Oct  2 08:59:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:45Z|00880|binding|INFO|74de555c-711e-4e21-a6f4-89c8289aae93: Claiming fa:16:3e:fa:91:d3 10.100.0.6
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.064 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:91:d3 10.100.0.6'], port_security=['fa:16:3e:fa:91:d3 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1996543852', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5fca9509-b756-4d01-a533-1f53ccd1c749', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1996543852', 'neutron:project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'c114e1a6-21d7-49a2-a13f-595584b99547', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09ec6ed0-28d8-4666-8087-300b86d2afe4, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=74de555c-711e-4e21-a6f4-89c8289aae93) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.065 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 74de555c-711e-4e21-a6f4-89c8289aae93 in datapath cf9dc276-03fd-47d7-92fb-6f6d94b7d169 bound to our chassis#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.067 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cf9dc276-03fd-47d7-92fb-6f6d94b7d169#033[00m
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:45Z|00881|binding|INFO|Setting lport 74de555c-711e-4e21-a6f4-89c8289aae93 ovn-installed in OVS
Oct  2 08:59:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:45Z|00882|binding|INFO|Setting lport 74de555c-711e-4e21-a6f4-89c8289aae93 up in Southbound
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.085 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7662de53-c719-4ee9-a185-004f9cfa7cfb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.086 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcf9dc276-01 in ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.088 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcf9dc276-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.089 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6641bf17-5d71-4388-ba0b-19f165f86d6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.090 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[32bab29b-cc70-43a9-bc0f-974129291340]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 systemd-udevd[326899]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:59:45 np0005465988 NetworkManager[45041]: <info>  [1759409985.1041] device (tap74de555c-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:59:45 np0005465988 NetworkManager[45041]: <info>  [1759409985.1061] device (tap74de555c-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:59:45 np0005465988 systemd-machined[192594]: New machine qemu-91-instance-000000c3.
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.109 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[406cf209-b111-42cb-89a9-7dcd55de3e54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 systemd[1]: Started Virtual Machine qemu-91-instance-000000c3.
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.134 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4d8cb2be-f1b6-4b62-bc78-09f7901081c1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.174 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[43cd08d3-02b7-4bcf-8efa-525a03023a49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.182 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[38868800-3c6f-40a6-aa69-94710e142248]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 NetworkManager[45041]: <info>  [1759409985.1843] manager: (tapcf9dc276-00): new Veth device (/org/freedesktop/NetworkManager/Devices/384)
Oct  2 08:59:45 np0005465988 systemd-udevd[326903]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.234 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[a5c36e56-a938-44f7-823d-b6f0f1f95ab9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.240 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ec64bb1a-b8be-401b-816b-b6572955bf84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 NetworkManager[45041]: <info>  [1759409985.2708] device (tapcf9dc276-00): carrier: link connected
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.280 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[51bc3c8b-cb8c-4174-b209-8e603f9995b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.300 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[52275e48-7861-4395-853d-3a4e9bb79cfd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcf9dc276-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:71:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 256], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 804560, 'reachable_time': 37322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 326932, 'error': None, 'target': 'ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.321 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0bf4fa1e-8b36-49cb-8230-1a2f7b8e7d5e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:713d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 804560, 'tstamp': 804560}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 326933, 'error': None, 'target': 'ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.341 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3d331e38-99fb-4c39-a4b2-0d28427cf3a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcf9dc276-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:71:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 256], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 804560, 'reachable_time': 37322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 326934, 'error': None, 'target': 'ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.373 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d562b670-10d8-4000-96f0-a33b4c2ecb38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.428 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d3a57564-a7d4-4d4e-84e2-a7b841235582]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.430 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcf9dc276-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.430 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.431 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcf9dc276-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:45 np0005465988 NetworkManager[45041]: <info>  [1759409985.4338] manager: (tapcf9dc276-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/385)
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:45 np0005465988 kernel: tapcf9dc276-00: entered promiscuous mode
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.437 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcf9dc276-00, col_values=(('external_ids', {'iface-id': 'd5135195-c335-444d-8d55-41c04f12d49b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:45 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:45Z|00883|binding|INFO|Releasing lport d5135195-c335-444d-8d55-41c04f12d49b from this chassis (sb_readonly=0)
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.455 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cf9dc276-03fd-47d7-92fb-6f6d94b7d169.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cf9dc276-03fd-47d7-92fb-6f6d94b7d169.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.456 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[347b0c4e-7028-4a47-a618-6162dfb67d82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.457 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-cf9dc276-03fd-47d7-92fb-6f6d94b7d169
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/cf9dc276-03fd-47d7-92fb-6f6d94b7d169.pid.haproxy
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID cf9dc276-03fd-47d7-92fb-6f6d94b7d169
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:59:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:45.458 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'env', 'PROCESS_TAG=haproxy-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cf9dc276-03fd-47d7-92fb-6f6d94b7d169.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.578 2 DEBUG nova.compute.manager [req-d9d6e775-30b8-4ded-b66c-2d94c011c934 req-d2c78e1c-838a-4cd8-a16b-8f010820b582 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Received event network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.579 2 DEBUG oslo_concurrency.lockutils [req-d9d6e775-30b8-4ded-b66c-2d94c011c934 req-d2c78e1c-838a-4cd8-a16b-8f010820b582 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.579 2 DEBUG oslo_concurrency.lockutils [req-d9d6e775-30b8-4ded-b66c-2d94c011c934 req-d2c78e1c-838a-4cd8-a16b-8f010820b582 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.580 2 DEBUG oslo_concurrency.lockutils [req-d9d6e775-30b8-4ded-b66c-2d94c011c934 req-d2c78e1c-838a-4cd8-a16b-8f010820b582 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.580 2 DEBUG nova.compute.manager [req-d9d6e775-30b8-4ded-b66c-2d94c011c934 req-d2c78e1c-838a-4cd8-a16b-8f010820b582 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Processing event network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.677 2 DEBUG nova.network.neutron [req-b6454312-750b-48f3-959f-e3a2b2fc5827 req-7b73d44d-2374-46fe-ab76-5f33c4b9dae1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Updated VIF entry in instance network info cache for port 74de555c-711e-4e21-a6f4-89c8289aae93. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.678 2 DEBUG nova.network.neutron [req-b6454312-750b-48f3-959f-e3a2b2fc5827 req-7b73d44d-2374-46fe-ab76-5f33c4b9dae1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Updating instance_info_cache with network_info: [{"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:59:45 np0005465988 nova_compute[236126]: 2025-10-02 12:59:45.695 2 DEBUG oslo_concurrency.lockutils [req-b6454312-750b-48f3-959f-e3a2b2fc5827 req-7b73d44d-2374-46fe-ab76-5f33c4b9dae1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-5fca9509-b756-4d01-a533-1f53ccd1c749" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:59:45 np0005465988 podman[326966]: 2025-10-02 12:59:45.875889987 +0000 UTC m=+0.062404069 container create ea5522cc49501cf364e3559ad71eec7b1f905d4edbfb1a943f316c3d955cb141 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:59:45 np0005465988 systemd[1]: Started libpod-conmon-ea5522cc49501cf364e3559ad71eec7b1f905d4edbfb1a943f316c3d955cb141.scope.
Oct  2 08:59:45 np0005465988 podman[326966]: 2025-10-02 12:59:45.846290614 +0000 UTC m=+0.032804736 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:59:45 np0005465988 systemd[1]: Started libcrun container.
Oct  2 08:59:45 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dda1142f7e5575feca4c414258a2149f49d0c05d9be5f308ee3888e653219224/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:59:45 np0005465988 podman[326966]: 2025-10-02 12:59:45.979319686 +0000 UTC m=+0.165833798 container init ea5522cc49501cf364e3559ad71eec7b1f905d4edbfb1a943f316c3d955cb141 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:59:45 np0005465988 podman[326966]: 2025-10-02 12:59:45.985829453 +0000 UTC m=+0.172343535 container start ea5522cc49501cf364e3559ad71eec7b1f905d4edbfb1a943f316c3d955cb141 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:59:46 np0005465988 neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169[327013]: [NOTICE]   (327025) : New worker (327027) forked
Oct  2 08:59:46 np0005465988 neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169[327013]: [NOTICE]   (327025) : Loading success.
Oct  2 08:59:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:46.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.479 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409986.4790351, 5fca9509-b756-4d01-a533-1f53ccd1c749 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.480 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] VM Started (Lifecycle Event)#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.483 2 DEBUG nova.compute.manager [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.487 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.491 2 INFO nova.virt.libvirt.driver [-] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Instance spawned successfully.#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.491 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.597 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.605 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.606 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.607 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.608 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.609 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.609 2 DEBUG nova.virt.libvirt.driver [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.618 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.744 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.745 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409986.479288, 5fca9509-b756-4d01-a533-1f53ccd1c749 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.745 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.796 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.802 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759409986.48644, 5fca9509-b756-4d01-a533-1f53ccd1c749 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.802 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.869 2 INFO nova.compute.manager [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Took 11.06 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.869 2 DEBUG nova.compute.manager [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.886 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:59:46 np0005465988 nova_compute[236126]: 2025-10-02 12:59:46.888 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:59:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:46.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:47 np0005465988 nova_compute[236126]: 2025-10-02 12:59:47.002 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:59:47 np0005465988 nova_compute[236126]: 2025-10-02 12:59:47.078 2 INFO nova.compute.manager [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Took 12.33 seconds to build instance.#033[00m
Oct  2 08:59:47 np0005465988 nova_compute[236126]: 2025-10-02 12:59:47.210 2 DEBUG oslo_concurrency.lockutils [None req-e1716614-6fe3-4589-af66-5048ce5e33ec fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.578s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:47 np0005465988 nova_compute[236126]: 2025-10-02 12:59:47.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:47 np0005465988 nova_compute[236126]: 2025-10-02 12:59:47.717 2 DEBUG nova.compute.manager [req-579437d1-963f-45ab-b93e-5c03cad39eba req-4ecdb7bd-1091-44f7-a703-9d20bf419544 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Received event network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:59:47 np0005465988 nova_compute[236126]: 2025-10-02 12:59:47.717 2 DEBUG oslo_concurrency.lockutils [req-579437d1-963f-45ab-b93e-5c03cad39eba req-4ecdb7bd-1091-44f7-a703-9d20bf419544 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:47 np0005465988 nova_compute[236126]: 2025-10-02 12:59:47.717 2 DEBUG oslo_concurrency.lockutils [req-579437d1-963f-45ab-b93e-5c03cad39eba req-4ecdb7bd-1091-44f7-a703-9d20bf419544 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:47 np0005465988 nova_compute[236126]: 2025-10-02 12:59:47.718 2 DEBUG oslo_concurrency.lockutils [req-579437d1-963f-45ab-b93e-5c03cad39eba req-4ecdb7bd-1091-44f7-a703-9d20bf419544 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:47 np0005465988 nova_compute[236126]: 2025-10-02 12:59:47.718 2 DEBUG nova.compute.manager [req-579437d1-963f-45ab-b93e-5c03cad39eba req-4ecdb7bd-1091-44f7-a703-9d20bf419544 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] No waiting events found dispatching network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:59:47 np0005465988 nova_compute[236126]: 2025-10-02 12:59:47.718 2 WARNING nova.compute.manager [req-579437d1-963f-45ab-b93e-5c03cad39eba req-4ecdb7bd-1091-44f7-a703-9d20bf419544 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Received unexpected event network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:59:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:48.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:48 np0005465988 nova_compute[236126]: 2025-10-02 12:59:48.763 2 DEBUG oslo_concurrency.lockutils [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "5fca9509-b756-4d01-a533-1f53ccd1c749" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:48 np0005465988 nova_compute[236126]: 2025-10-02 12:59:48.764 2 DEBUG oslo_concurrency.lockutils [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:48 np0005465988 nova_compute[236126]: 2025-10-02 12:59:48.764 2 DEBUG oslo_concurrency.lockutils [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:48 np0005465988 nova_compute[236126]: 2025-10-02 12:59:48.765 2 DEBUG oslo_concurrency.lockutils [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:48 np0005465988 nova_compute[236126]: 2025-10-02 12:59:48.765 2 DEBUG oslo_concurrency.lockutils [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:48 np0005465988 nova_compute[236126]: 2025-10-02 12:59:48.767 2 INFO nova.compute.manager [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Terminating instance#033[00m
Oct  2 08:59:48 np0005465988 nova_compute[236126]: 2025-10-02 12:59:48.768 2 DEBUG nova.compute.manager [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:59:48 np0005465988 kernel: tap74de555c-71 (unregistering): left promiscuous mode
Oct  2 08:59:48 np0005465988 NetworkManager[45041]: <info>  [1759409988.8184] device (tap74de555c-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:59:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:48Z|00884|binding|INFO|Releasing lport 74de555c-711e-4e21-a6f4-89c8289aae93 from this chassis (sb_readonly=0)
Oct  2 08:59:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:48Z|00885|binding|INFO|Setting lport 74de555c-711e-4e21-a6f4-89c8289aae93 down in Southbound
Oct  2 08:59:48 np0005465988 nova_compute[236126]: 2025-10-02 12:59:48.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:48 np0005465988 ovn_controller[132601]: 2025-10-02T12:59:48Z|00886|binding|INFO|Removing iface tap74de555c-71 ovn-installed in OVS
Oct  2 08:59:48 np0005465988 nova_compute[236126]: 2025-10-02 12:59:48.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:48 np0005465988 nova_compute[236126]: 2025-10-02 12:59:48.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:48 np0005465988 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000c3.scope: Deactivated successfully.
Oct  2 08:59:48 np0005465988 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000c3.scope: Consumed 3.741s CPU time.
Oct  2 08:59:48 np0005465988 systemd-machined[192594]: Machine qemu-91-instance-000000c3 terminated.
Oct  2 08:59:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:48.884 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:91:d3 10.100.0.6'], port_security=['fa:16:3e:fa:91:d3 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1996543852', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5fca9509-b756-4d01-a533-1f53ccd1c749', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1996543852', 'neutron:project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'c114e1a6-21d7-49a2-a13f-595584b99547', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.230', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09ec6ed0-28d8-4666-8087-300b86d2afe4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=74de555c-711e-4e21-a6f4-89c8289aae93) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:59:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:48.885 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 74de555c-711e-4e21-a6f4-89c8289aae93 in datapath cf9dc276-03fd-47d7-92fb-6f6d94b7d169 unbound from our chassis#033[00m
Oct  2 08:59:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:48.887 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cf9dc276-03fd-47d7-92fb-6f6d94b7d169, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:59:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:48.888 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[73fbe10d-0c1c-4636-8430-b9d0c0428270]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:48.889 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169 namespace which is not needed anymore#033[00m
Oct  2 08:59:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:48.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:48 np0005465988 nova_compute[236126]: 2025-10-02 12:59:48.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:48 np0005465988 nova_compute[236126]: 2025-10-02 12:59:48.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.008 2 INFO nova.virt.libvirt.driver [-] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Instance destroyed successfully.#033[00m
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.009 2 DEBUG nova.objects.instance [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'resources' on Instance uuid 5fca9509-b756-4d01-a533-1f53ccd1c749 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:59:49 np0005465988 neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169[327013]: [NOTICE]   (327025) : haproxy version is 2.8.14-c23fe91
Oct  2 08:59:49 np0005465988 neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169[327013]: [NOTICE]   (327025) : path to executable is /usr/sbin/haproxy
Oct  2 08:59:49 np0005465988 neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169[327013]: [WARNING]  (327025) : Exiting Master process...
Oct  2 08:59:49 np0005465988 neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169[327013]: [ALERT]    (327025) : Current worker (327027) exited with code 143 (Terminated)
Oct  2 08:59:49 np0005465988 neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169[327013]: [WARNING]  (327025) : All workers exited. Exiting... (0)
Oct  2 08:59:49 np0005465988 systemd[1]: libpod-ea5522cc49501cf364e3559ad71eec7b1f905d4edbfb1a943f316c3d955cb141.scope: Deactivated successfully.
Oct  2 08:59:49 np0005465988 podman[327063]: 2025-10-02 12:59:49.026731705 +0000 UTC m=+0.054346877 container died ea5522cc49501cf364e3559ad71eec7b1f905d4edbfb1a943f316c3d955cb141 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.045 2 DEBUG nova.virt.libvirt.vif [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:59:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-416029129',display_name='tempest-TestNetworkBasicOps-server-416029129',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-416029129',id=195,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEX0Mk0a5E5v3x8m2u9I/6ViGfoOZFYOxEtTP/WXi7vw2UKM5WOROIAQ6lmEigioWie1J23wHKcklEZulWTABVkRNG/2t5U0nSLlttJPO0YfDfbgNp43IExtvZ93fzfmWA==',key_name='tempest-TestNetworkBasicOps-725216195',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:59:46Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-qu85rrbh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:59:47Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=5fca9509-b756-4d01-a533-1f53ccd1c749,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.047 2 DEBUG nova.network.os_vif_util [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "74de555c-711e-4e21-a6f4-89c8289aae93", "address": "fa:16:3e:fa:91:d3", "network": {"id": "cf9dc276-03fd-47d7-92fb-6f6d94b7d169", "bridge": "br-int", "label": "tempest-network-smoke--763808298", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74de555c-71", "ovs_interfaceid": "74de555c-711e-4e21-a6f4-89c8289aae93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.049 2 DEBUG nova.network.os_vif_util [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:d3,bridge_name='br-int',has_traffic_filtering=True,id=74de555c-711e-4e21-a6f4-89c8289aae93,network=Network(cf9dc276-03fd-47d7-92fb-6f6d94b7d169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap74de555c-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.049 2 DEBUG os_vif [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:d3,bridge_name='br-int',has_traffic_filtering=True,id=74de555c-711e-4e21-a6f4-89c8289aae93,network=Network(cf9dc276-03fd-47d7-92fb-6f6d94b7d169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap74de555c-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.051 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74de555c-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:59:49 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ea5522cc49501cf364e3559ad71eec7b1f905d4edbfb1a943f316c3d955cb141-userdata-shm.mount: Deactivated successfully.
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:49 np0005465988 systemd[1]: var-lib-containers-storage-overlay-dda1142f7e5575feca4c414258a2149f49d0c05d9be5f308ee3888e653219224-merged.mount: Deactivated successfully.
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.061 2 INFO os_vif [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:91:d3,bridge_name='br-int',has_traffic_filtering=True,id=74de555c-711e-4e21-a6f4-89c8289aae93,network=Network(cf9dc276-03fd-47d7-92fb-6f6d94b7d169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap74de555c-71')#033[00m
Oct  2 08:59:49 np0005465988 podman[327063]: 2025-10-02 12:59:49.071574776 +0000 UTC m=+0.099189948 container cleanup ea5522cc49501cf364e3559ad71eec7b1f905d4edbfb1a943f316c3d955cb141 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:59:49 np0005465988 systemd[1]: libpod-conmon-ea5522cc49501cf364e3559ad71eec7b1f905d4edbfb1a943f316c3d955cb141.scope: Deactivated successfully.
Oct  2 08:59:49 np0005465988 podman[327114]: 2025-10-02 12:59:49.147856233 +0000 UTC m=+0.050010921 container remove ea5522cc49501cf364e3559ad71eec7b1f905d4edbfb1a943f316c3d955cb141 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:59:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:49.156 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2679f0f7-5148-4130-8938-784c0c274b01]: (4, ('Thu Oct  2 12:59:48 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169 (ea5522cc49501cf364e3559ad71eec7b1f905d4edbfb1a943f316c3d955cb141)\nea5522cc49501cf364e3559ad71eec7b1f905d4edbfb1a943f316c3d955cb141\nThu Oct  2 12:59:49 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169 (ea5522cc49501cf364e3559ad71eec7b1f905d4edbfb1a943f316c3d955cb141)\nea5522cc49501cf364e3559ad71eec7b1f905d4edbfb1a943f316c3d955cb141\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:49.159 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9c276800-07e0-4593-aeb1-2f19797803cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:49.160 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcf9dc276-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:59:49 np0005465988 kernel: tapcf9dc276-00: left promiscuous mode
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:49.182 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[086cb6f5-0e9d-4a5b-a5d1-88a2e3d2361e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:49.209 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[801fb379-0f5a-4898-ab7b-a39aef97ed1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:49.211 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[80fdcc04-a879-4fbe-941c-166ea03cee7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:49.231 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[40890ef7-8046-42d0-a63b-3686a5020761]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 804549, 'reachable_time': 22421, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 327134, 'error': None, 'target': 'ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:49 np0005465988 systemd[1]: run-netns-ovnmeta\x2dcf9dc276\x2d03fd\x2d47d7\x2d92fb\x2d6f6d94b7d169.mount: Deactivated successfully.
Oct  2 08:59:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:49.235 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cf9dc276-03fd-47d7-92fb-6f6d94b7d169 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:59:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 12:59:49.235 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[e60c0c01-4180-4a94-b062-bdcc5b928241]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.731 2 INFO nova.virt.libvirt.driver [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Deleting instance files /var/lib/nova/instances/5fca9509-b756-4d01-a533-1f53ccd1c749_del#033[00m
Oct  2 08:59:49 np0005465988 nova_compute[236126]: 2025-10-02 12:59:49.733 2 INFO nova.virt.libvirt.driver [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Deletion of /var/lib/nova/instances/5fca9509-b756-4d01-a533-1f53ccd1c749_del complete#033[00m
Oct  2 08:59:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:50.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:50 np0005465988 nova_compute[236126]: 2025-10-02 12:59:50.031 2 DEBUG nova.compute.manager [req-dcfc8fe3-c0d8-4050-863f-ff56734f0b9d req-0713219b-0f1f-4770-b979-603634556dc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Received event network-vif-unplugged-74de555c-711e-4e21-a6f4-89c8289aae93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:59:50 np0005465988 nova_compute[236126]: 2025-10-02 12:59:50.033 2 DEBUG oslo_concurrency.lockutils [req-dcfc8fe3-c0d8-4050-863f-ff56734f0b9d req-0713219b-0f1f-4770-b979-603634556dc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:50 np0005465988 nova_compute[236126]: 2025-10-02 12:59:50.033 2 DEBUG oslo_concurrency.lockutils [req-dcfc8fe3-c0d8-4050-863f-ff56734f0b9d req-0713219b-0f1f-4770-b979-603634556dc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:50 np0005465988 nova_compute[236126]: 2025-10-02 12:59:50.034 2 DEBUG oslo_concurrency.lockutils [req-dcfc8fe3-c0d8-4050-863f-ff56734f0b9d req-0713219b-0f1f-4770-b979-603634556dc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:50 np0005465988 nova_compute[236126]: 2025-10-02 12:59:50.034 2 DEBUG nova.compute.manager [req-dcfc8fe3-c0d8-4050-863f-ff56734f0b9d req-0713219b-0f1f-4770-b979-603634556dc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] No waiting events found dispatching network-vif-unplugged-74de555c-711e-4e21-a6f4-89c8289aae93 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:59:50 np0005465988 nova_compute[236126]: 2025-10-02 12:59:50.034 2 DEBUG nova.compute.manager [req-dcfc8fe3-c0d8-4050-863f-ff56734f0b9d req-0713219b-0f1f-4770-b979-603634556dc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Received event network-vif-unplugged-74de555c-711e-4e21-a6f4-89c8289aae93 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:59:50 np0005465988 nova_compute[236126]: 2025-10-02 12:59:50.070 2 INFO nova.compute.manager [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Took 1.30 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:59:50 np0005465988 nova_compute[236126]: 2025-10-02 12:59:50.073 2 DEBUG oslo.service.loopingcall [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:59:50 np0005465988 nova_compute[236126]: 2025-10-02 12:59:50.074 2 DEBUG nova.compute.manager [-] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:59:50 np0005465988 nova_compute[236126]: 2025-10-02 12:59:50.075 2 DEBUG nova.network.neutron [-] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:59:50 np0005465988 nova_compute[236126]: 2025-10-02 12:59:50.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:59:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:50.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:52.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.112 2 DEBUG nova.compute.manager [req-eaeef2be-7b56-4659-a778-a1e964f5ee2d req-f028fc2a-aa68-4bf9-ab2c-087723117398 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Received event network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.113 2 DEBUG oslo_concurrency.lockutils [req-eaeef2be-7b56-4659-a778-a1e964f5ee2d req-f028fc2a-aa68-4bf9-ab2c-087723117398 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.113 2 DEBUG oslo_concurrency.lockutils [req-eaeef2be-7b56-4659-a778-a1e964f5ee2d req-f028fc2a-aa68-4bf9-ab2c-087723117398 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.114 2 DEBUG oslo_concurrency.lockutils [req-eaeef2be-7b56-4659-a778-a1e964f5ee2d req-f028fc2a-aa68-4bf9-ab2c-087723117398 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.114 2 DEBUG nova.compute.manager [req-eaeef2be-7b56-4659-a778-a1e964f5ee2d req-f028fc2a-aa68-4bf9-ab2c-087723117398 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] No waiting events found dispatching network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.114 2 WARNING nova.compute.manager [req-eaeef2be-7b56-4659-a778-a1e964f5ee2d req-f028fc2a-aa68-4bf9-ab2c-087723117398 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Received unexpected event network-vif-plugged-74de555c-711e-4e21-a6f4-89c8289aae93 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.248 2 DEBUG nova.network.neutron [-] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.268 2 INFO nova.compute.manager [-] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Took 2.19 seconds to deallocate network for instance.#033[00m
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.336 2 DEBUG oslo_concurrency.lockutils [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.339 2 DEBUG oslo_concurrency.lockutils [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.422 2 DEBUG oslo_concurrency.processutils [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 08:59:52 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4141547658' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 08:59:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:52.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.933 2 DEBUG oslo_concurrency.processutils [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.941 2 DEBUG nova.compute.provider_tree [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.967 2 DEBUG nova.scheduler.client.report [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:59:52 np0005465988 nova_compute[236126]: 2025-10-02 12:59:52.994 2 DEBUG oslo_concurrency.lockutils [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:53 np0005465988 nova_compute[236126]: 2025-10-02 12:59:53.031 2 INFO nova.scheduler.client.report [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Deleted allocations for instance 5fca9509-b756-4d01-a533-1f53ccd1c749#033[00m
Oct  2 08:59:53 np0005465988 nova_compute[236126]: 2025-10-02 12:59:53.128 2 DEBUG oslo_concurrency.lockutils [None req-278d9a44-62a3-4e53-be9e-d44db16e0da2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "5fca9509-b756-4d01-a533-1f53ccd1c749" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.364s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:53 np0005465988 podman[327160]: 2025-10-02 12:59:53.55932851 +0000 UTC m=+0.090449437 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:59:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:54.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:54 np0005465988 nova_compute[236126]: 2025-10-02 12:59:54.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:54.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:55 np0005465988 nova_compute[236126]: 2025-10-02 12:59:55.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 08:59:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:56.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e397 e397: 3 total, 3 up, 3 in
Oct  2 08:59:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:56.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e398 e398: 3 total, 3 up, 3 in
Oct  2 08:59:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 08:59:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:12:59:58.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 08:59:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 08:59:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/764651198' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 08:59:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 08:59:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/764651198' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 08:59:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 08:59:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 08:59:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:12:59:58.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 08:59:58 np0005465988 nova_compute[236126]: 2025-10-02 12:59:58.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:59 np0005465988 nova_compute[236126]: 2025-10-02 12:59:59.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:59 np0005465988 nova_compute[236126]: 2025-10-02 12:59:59.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:00.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:00 np0005465988 nova_compute[236126]: 2025-10-02 13:00:00.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:00:00 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 09:00:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:00.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:01 np0005465988 nova_compute[236126]: 2025-10-02 13:00:01.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:00:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:02.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:00:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:00:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:02.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:00:03 np0005465988 nova_compute[236126]: 2025-10-02 13:00:03.499 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:03 np0005465988 nova_compute[236126]: 2025-10-02 13:00:03.500 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 09:00:04 np0005465988 nova_compute[236126]: 2025-10-02 13:00:04.007 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409989.004838, 5fca9509-b756-4d01-a533-1f53ccd1c749 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:00:04 np0005465988 nova_compute[236126]: 2025-10-02 13:00:04.007 2 INFO nova.compute.manager [-] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] VM Stopped (Lifecycle Event)#033[00m
Oct  2 09:00:04 np0005465988 nova_compute[236126]: 2025-10-02 13:00:04.033 2 DEBUG nova.compute.manager [None req-ef740905-1dc2-47d8-a8cd-ebca10302cb3 - - - - - -] [instance: 5fca9509-b756-4d01-a533-1f53ccd1c749] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:00:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:04.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:04 np0005465988 nova_compute[236126]: 2025-10-02 13:00:04.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:04 np0005465988 podman[327237]: 2025-10-02 13:00:04.539426311 +0000 UTC m=+0.071926333 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 09:00:04 np0005465988 podman[327238]: 2025-10-02 13:00:04.548476221 +0000 UTC m=+0.077202374 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, config_id=multipathd)
Oct  2 09:00:04 np0005465988 podman[327236]: 2025-10-02 13:00:04.56718005 +0000 UTC m=+0.102176174 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 09:00:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 e399: 3 total, 3 up, 3 in
Oct  2 09:00:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:04.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:05 np0005465988 nova_compute[236126]: 2025-10-02 13:00:05.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:00:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:06.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:06.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:08.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:08.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:09 np0005465988 nova_compute[236126]: 2025-10-02 13:00:09.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:10.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:00:10 np0005465988 nova_compute[236126]: 2025-10-02 13:00:10.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:10.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:12.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:12.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:13 np0005465988 nova_compute[236126]: 2025-10-02 13:00:13.487 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:14.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:14 np0005465988 nova_compute[236126]: 2025-10-02 13:00:14.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:14.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.476 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.509 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.510 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.510 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.510 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.511 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.570 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "6575cd5d-2f78-4718-b6b8-16026aec208c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.570 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.591 2 DEBUG nova.compute.manager [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.663 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.664 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.670 2 DEBUG nova.virt.hardware [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.671 2 INFO nova.compute.claims [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 09:00:15 np0005465988 nova_compute[236126]: 2025-10-02 13:00:15.795 2 DEBUG oslo_concurrency.processutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:00:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:00:16 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/462770064' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:00:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:16.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.066 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.555s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.240 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.241 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4023MB free_disk=20.942848205566406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.242 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:00:16 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2036417450' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.298 2 DEBUG oslo_concurrency.processutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.303 2 DEBUG nova.compute.provider_tree [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.324 2 DEBUG nova.scheduler.client.report [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.421 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.757s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.422 2 DEBUG nova.compute.manager [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.425 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.524 2 DEBUG nova.compute.manager [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.524 2 DEBUG nova.network.neutron [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.546 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 6575cd5d-2f78-4718-b6b8-16026aec208c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.546 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.547 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.552 2 INFO nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.579 2 DEBUG nova.compute.manager [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.594 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.703 2 DEBUG nova.compute.manager [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.704 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.705 2 INFO nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Creating image(s)#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.736 2 DEBUG nova.storage.rbd_utils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image 6575cd5d-2f78-4718-b6b8-16026aec208c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.781 2 DEBUG nova.storage.rbd_utils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image 6575cd5d-2f78-4718-b6b8-16026aec208c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.809 2 DEBUG nova.storage.rbd_utils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image 6575cd5d-2f78-4718-b6b8-16026aec208c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.816 2 DEBUG oslo_concurrency.processutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.857 2 DEBUG nova.policy [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fb366465e6154871b8a53c9f500105ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.900 2 DEBUG oslo_concurrency.processutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.901 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.902 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.902 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.937 2 DEBUG nova.storage.rbd_utils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image 6575cd5d-2f78-4718-b6b8-16026aec208c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:00:16 np0005465988 nova_compute[236126]: 2025-10-02 13:00:16.943 2 DEBUG oslo_concurrency.processutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 6575cd5d-2f78-4718-b6b8-16026aec208c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:00:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:16.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:00:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2996359736' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.094 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.102 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.123 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.284 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.285 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.395 2 DEBUG oslo_concurrency.processutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 6575cd5d-2f78-4718-b6b8-16026aec208c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.473 2 DEBUG nova.storage.rbd_utils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] resizing rbd image 6575cd5d-2f78-4718-b6b8-16026aec208c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.605 2 DEBUG nova.objects.instance [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'migration_context' on Instance uuid 6575cd5d-2f78-4718-b6b8-16026aec208c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.626 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.627 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Ensure instance console log exists: /var/lib/nova/instances/6575cd5d-2f78-4718-b6b8-16026aec208c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.628 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.628 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.629 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:17.752 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:00:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:17.753 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:17 np0005465988 nova_compute[236126]: 2025-10-02 13:00:17.957 2 DEBUG nova.network.neutron [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Successfully created port: 9221552d-e239-4ec0-9a6e-28c040fcba23 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 09:00:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:18.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:18 np0005465988 nova_compute[236126]: 2025-10-02 13:00:18.748 2 DEBUG nova.network.neutron [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Successfully updated port: 9221552d-e239-4ec0-9a6e-28c040fcba23 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 09:00:18 np0005465988 nova_compute[236126]: 2025-10-02 13:00:18.767 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:00:18 np0005465988 nova_compute[236126]: 2025-10-02 13:00:18.767 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquired lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:00:18 np0005465988 nova_compute[236126]: 2025-10-02 13:00:18.767 2 DEBUG nova.network.neutron [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 09:00:18 np0005465988 nova_compute[236126]: 2025-10-02 13:00:18.881 2 DEBUG nova.compute.manager [req-c43e9521-5b1c-41db-b073-91a201fc183d req-a9f52e0b-70cf-4797-abc1-2350c0e5841a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Received event network-changed-9221552d-e239-4ec0-9a6e-28c040fcba23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:00:18 np0005465988 nova_compute[236126]: 2025-10-02 13:00:18.882 2 DEBUG nova.compute.manager [req-c43e9521-5b1c-41db-b073-91a201fc183d req-a9f52e0b-70cf-4797-abc1-2350c0e5841a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Refreshing instance network info cache due to event network-changed-9221552d-e239-4ec0-9a6e-28c040fcba23. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:00:18 np0005465988 nova_compute[236126]: 2025-10-02 13:00:18.882 2 DEBUG oslo_concurrency.lockutils [req-c43e9521-5b1c-41db-b073-91a201fc183d req-a9f52e0b-70cf-4797-abc1-2350c0e5841a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:00:18 np0005465988 nova_compute[236126]: 2025-10-02 13:00:18.961 2 DEBUG nova.network.neutron [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 09:00:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:18.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.775 2 DEBUG nova.network.neutron [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Updating instance_info_cache with network_info: [{"id": "9221552d-e239-4ec0-9a6e-28c040fcba23", "address": "fa:16:3e:ed:97:e8", "network": {"id": "7de325ef-ce0b-4c58-9e5a-e7fd1d046659", "bridge": "br-int", "label": "tempest-network-smoke--2119894440", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9221552d-e2", "ovs_interfaceid": "9221552d-e239-4ec0-9a6e-28c040fcba23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.794 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Releasing lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.795 2 DEBUG nova.compute.manager [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Instance network_info: |[{"id": "9221552d-e239-4ec0-9a6e-28c040fcba23", "address": "fa:16:3e:ed:97:e8", "network": {"id": "7de325ef-ce0b-4c58-9e5a-e7fd1d046659", "bridge": "br-int", "label": "tempest-network-smoke--2119894440", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9221552d-e2", "ovs_interfaceid": "9221552d-e239-4ec0-9a6e-28c040fcba23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.795 2 DEBUG oslo_concurrency.lockutils [req-c43e9521-5b1c-41db-b073-91a201fc183d req-a9f52e0b-70cf-4797-abc1-2350c0e5841a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.796 2 DEBUG nova.network.neutron [req-c43e9521-5b1c-41db-b073-91a201fc183d req-a9f52e0b-70cf-4797-abc1-2350c0e5841a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Refreshing network info cache for port 9221552d-e239-4ec0-9a6e-28c040fcba23 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.800 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Start _get_guest_xml network_info=[{"id": "9221552d-e239-4ec0-9a6e-28c040fcba23", "address": "fa:16:3e:ed:97:e8", "network": {"id": "7de325ef-ce0b-4c58-9e5a-e7fd1d046659", "bridge": "br-int", "label": "tempest-network-smoke--2119894440", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9221552d-e2", "ovs_interfaceid": "9221552d-e239-4ec0-9a6e-28c040fcba23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.805 2 WARNING nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.809 2 DEBUG nova.virt.libvirt.host [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.810 2 DEBUG nova.virt.libvirt.host [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.815 2 DEBUG nova.virt.libvirt.host [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.815 2 DEBUG nova.virt.libvirt.host [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.817 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.817 2 DEBUG nova.virt.hardware [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.817 2 DEBUG nova.virt.hardware [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.817 2 DEBUG nova.virt.hardware [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.818 2 DEBUG nova.virt.hardware [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.818 2 DEBUG nova.virt.hardware [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.818 2 DEBUG nova.virt.hardware [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.818 2 DEBUG nova.virt.hardware [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.818 2 DEBUG nova.virt.hardware [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.819 2 DEBUG nova.virt.hardware [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.819 2 DEBUG nova.virt.hardware [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.819 2 DEBUG nova.virt.hardware [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 09:00:19 np0005465988 nova_compute[236126]: 2025-10-02 13:00:19.822 2 DEBUG oslo_concurrency.processutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:00:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:20.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:00:20 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2740930688' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.294 2 DEBUG oslo_concurrency.processutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.329 2 DEBUG nova.storage.rbd_utils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image 6575cd5d-2f78-4718-b6b8-16026aec208c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.336 2 DEBUG oslo_concurrency.processutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:00:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:00:20 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3618510218' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.779 2 DEBUG oslo_concurrency.processutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.781 2 DEBUG nova.virt.libvirt.vif [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:00:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-48146842',display_name='tempest-TestNetworkBasicOps-server-48146842',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-48146842',id=197,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFTzsnbn+W8rzAT2a52k99bGMwHywSGrM7Q6kP25/NGhEia97iCwvjkAY/jK0D4sgOXB4OPIUZM/wSRzdRxmyFxJk6uXMlT+HX2/sfp2LZC/CCn6EmQRg3Oz7tf1Vw+DMg==',key_name='tempest-TestNetworkBasicOps-926842640',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-8c0qztcx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:00:16Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=6575cd5d-2f78-4718-b6b8-16026aec208c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9221552d-e239-4ec0-9a6e-28c040fcba23", "address": "fa:16:3e:ed:97:e8", "network": {"id": "7de325ef-ce0b-4c58-9e5a-e7fd1d046659", "bridge": "br-int", "label": "tempest-network-smoke--2119894440", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9221552d-e2", "ovs_interfaceid": "9221552d-e239-4ec0-9a6e-28c040fcba23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.781 2 DEBUG nova.network.os_vif_util [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "9221552d-e239-4ec0-9a6e-28c040fcba23", "address": "fa:16:3e:ed:97:e8", "network": {"id": "7de325ef-ce0b-4c58-9e5a-e7fd1d046659", "bridge": "br-int", "label": "tempest-network-smoke--2119894440", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9221552d-e2", "ovs_interfaceid": "9221552d-e239-4ec0-9a6e-28c040fcba23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.782 2 DEBUG nova.network.os_vif_util [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:97:e8,bridge_name='br-int',has_traffic_filtering=True,id=9221552d-e239-4ec0-9a6e-28c040fcba23,network=Network(7de325ef-ce0b-4c58-9e5a-e7fd1d046659),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9221552d-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.783 2 DEBUG nova.objects.instance [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'pci_devices' on Instance uuid 6575cd5d-2f78-4718-b6b8-16026aec208c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.800 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] End _get_guest_xml xml=<domain type="kvm">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  <uuid>6575cd5d-2f78-4718-b6b8-16026aec208c</uuid>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  <name>instance-000000c5</name>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestNetworkBasicOps-server-48146842</nova:name>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 13:00:19</nova:creationTime>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <nova:user uuid="fb366465e6154871b8a53c9f500105ce">tempest-TestNetworkBasicOps-1692262680-project-member</nova:user>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <nova:project uuid="ce2ca82c03554560b55ed747ae63f1fb">tempest-TestNetworkBasicOps-1692262680</nova:project>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <nova:port uuid="9221552d-e239-4ec0-9a6e-28c040fcba23">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <system>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <entry name="serial">6575cd5d-2f78-4718-b6b8-16026aec208c</entry>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <entry name="uuid">6575cd5d-2f78-4718-b6b8-16026aec208c</entry>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    </system>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  <os>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  </os>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  <features>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  </features>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  </clock>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  <devices>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/6575cd5d-2f78-4718-b6b8-16026aec208c_disk">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/6575cd5d-2f78-4718-b6b8-16026aec208c_disk.config">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:ed:97:e8"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <target dev="tap9221552d-e2"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    </interface>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/6575cd5d-2f78-4718-b6b8-16026aec208c/console.log" append="off"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    </serial>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <video>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    </video>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    </rng>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 09:00:20 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 09:00:20 np0005465988 nova_compute[236126]:  </devices>
Oct  2 09:00:20 np0005465988 nova_compute[236126]: </domain>
Oct  2 09:00:20 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.802 2 DEBUG nova.compute.manager [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Preparing to wait for external event network-vif-plugged-9221552d-e239-4ec0-9a6e-28c040fcba23 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.802 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.802 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.803 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.803 2 DEBUG nova.virt.libvirt.vif [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:00:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-48146842',display_name='tempest-TestNetworkBasicOps-server-48146842',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-48146842',id=197,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFTzsnbn+W8rzAT2a52k99bGMwHywSGrM7Q6kP25/NGhEia97iCwvjkAY/jK0D4sgOXB4OPIUZM/wSRzdRxmyFxJk6uXMlT+HX2/sfp2LZC/CCn6EmQRg3Oz7tf1Vw+DMg==',key_name='tempest-TestNetworkBasicOps-926842640',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-8c0qztcx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:00:16Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=6575cd5d-2f78-4718-b6b8-16026aec208c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9221552d-e239-4ec0-9a6e-28c040fcba23", "address": "fa:16:3e:ed:97:e8", "network": {"id": "7de325ef-ce0b-4c58-9e5a-e7fd1d046659", "bridge": "br-int", "label": "tempest-network-smoke--2119894440", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9221552d-e2", "ovs_interfaceid": "9221552d-e239-4ec0-9a6e-28c040fcba23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.804 2 DEBUG nova.network.os_vif_util [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "9221552d-e239-4ec0-9a6e-28c040fcba23", "address": "fa:16:3e:ed:97:e8", "network": {"id": "7de325ef-ce0b-4c58-9e5a-e7fd1d046659", "bridge": "br-int", "label": "tempest-network-smoke--2119894440", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9221552d-e2", "ovs_interfaceid": "9221552d-e239-4ec0-9a6e-28c040fcba23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.804 2 DEBUG nova.network.os_vif_util [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:97:e8,bridge_name='br-int',has_traffic_filtering=True,id=9221552d-e239-4ec0-9a6e-28c040fcba23,network=Network(7de325ef-ce0b-4c58-9e5a-e7fd1d046659),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9221552d-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.805 2 DEBUG os_vif [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:97:e8,bridge_name='br-int',has_traffic_filtering=True,id=9221552d-e239-4ec0-9a6e-28c040fcba23,network=Network(7de325ef-ce0b-4c58-9e5a-e7fd1d046659),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9221552d-e2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.806 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.806 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.810 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9221552d-e2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.811 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9221552d-e2, col_values=(('external_ids', {'iface-id': '9221552d-e239-4ec0-9a6e-28c040fcba23', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ed:97:e8', 'vm-uuid': '6575cd5d-2f78-4718-b6b8-16026aec208c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:20 np0005465988 NetworkManager[45041]: <info>  [1759410020.8137] manager: (tap9221552d-e2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/386)
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.819 2 INFO os_vif [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:97:e8,bridge_name='br-int',has_traffic_filtering=True,id=9221552d-e239-4ec0-9a6e-28c040fcba23,network=Network(7de325ef-ce0b-4c58-9e5a-e7fd1d046659),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9221552d-e2')#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.963 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.964 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.964 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No VIF found with MAC fa:16:3e:ed:97:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.964 2 INFO nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Using config drive#033[00m
Oct  2 09:00:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:20.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:20 np0005465988 nova_compute[236126]: 2025-10-02 13:00:20.994 2 DEBUG nova.storage.rbd_utils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image 6575cd5d-2f78-4718-b6b8-16026aec208c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:00:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:22.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:22 np0005465988 nova_compute[236126]: 2025-10-02 13:00:22.477 2 INFO nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Creating config drive at /var/lib/nova/instances/6575cd5d-2f78-4718-b6b8-16026aec208c/disk.config#033[00m
Oct  2 09:00:22 np0005465988 nova_compute[236126]: 2025-10-02 13:00:22.484 2 DEBUG oslo_concurrency.processutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6575cd5d-2f78-4718-b6b8-16026aec208c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1o4ncmmg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:00:22 np0005465988 nova_compute[236126]: 2025-10-02 13:00:22.642 2 DEBUG oslo_concurrency.processutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6575cd5d-2f78-4718-b6b8-16026aec208c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1o4ncmmg" returned: 0 in 0.158s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:00:22 np0005465988 nova_compute[236126]: 2025-10-02 13:00:22.678 2 DEBUG nova.storage.rbd_utils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image 6575cd5d-2f78-4718-b6b8-16026aec208c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:00:22 np0005465988 nova_compute[236126]: 2025-10-02 13:00:22.683 2 DEBUG oslo_concurrency.processutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6575cd5d-2f78-4718-b6b8-16026aec208c/disk.config 6575cd5d-2f78-4718-b6b8-16026aec208c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:00:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:22.756 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:00:22 np0005465988 nova_compute[236126]: 2025-10-02 13:00:22.900 2 DEBUG oslo_concurrency.processutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6575cd5d-2f78-4718-b6b8-16026aec208c/disk.config 6575cd5d-2f78-4718-b6b8-16026aec208c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.217s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:00:22 np0005465988 nova_compute[236126]: 2025-10-02 13:00:22.901 2 INFO nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Deleting local config drive /var/lib/nova/instances/6575cd5d-2f78-4718-b6b8-16026aec208c/disk.config because it was imported into RBD.#033[00m
Oct  2 09:00:22 np0005465988 kernel: tap9221552d-e2: entered promiscuous mode
Oct  2 09:00:22 np0005465988 NetworkManager[45041]: <info>  [1759410022.9682] manager: (tap9221552d-e2): new Tun device (/org/freedesktop/NetworkManager/Devices/387)
Oct  2 09:00:22 np0005465988 ovn_controller[132601]: 2025-10-02T13:00:22Z|00887|binding|INFO|Claiming lport 9221552d-e239-4ec0-9a6e-28c040fcba23 for this chassis.
Oct  2 09:00:22 np0005465988 ovn_controller[132601]: 2025-10-02T13:00:22Z|00888|binding|INFO|9221552d-e239-4ec0-9a6e-28c040fcba23: Claiming fa:16:3e:ed:97:e8 10.100.0.6
Oct  2 09:00:22 np0005465988 nova_compute[236126]: 2025-10-02 13:00:22.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:22 np0005465988 nova_compute[236126]: 2025-10-02 13:00:22.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:22.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:22.987 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:97:e8 10.100.0.6'], port_security=['fa:16:3e:ed:97:e8 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6575cd5d-2f78-4718-b6b8-16026aec208c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7de325ef-ce0b-4c58-9e5a-e7fd1d046659', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'afe8f391-0a44-4d97-b178-976c8f6e4f3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dc47b176-2cb3-4091-92f6-666c5ed4cb38, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=9221552d-e239-4ec0-9a6e-28c040fcba23) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:00:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:22.989 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 9221552d-e239-4ec0-9a6e-28c040fcba23 in datapath 7de325ef-ce0b-4c58-9e5a-e7fd1d046659 bound to our chassis#033[00m
Oct  2 09:00:22 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:22.990 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7de325ef-ce0b-4c58-9e5a-e7fd1d046659#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.006 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[abe157af-19bf-4c37-947e-69e62222e971]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.007 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7de325ef-c1 in ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 09:00:23 np0005465988 systemd-machined[192594]: New machine qemu-92-instance-000000c5.
Oct  2 09:00:23 np0005465988 systemd-udevd[327727]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.008 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7de325ef-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.009 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ae76cc67-85fe-487d-a19f-c4a89e7a0b63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.010 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f1eba435-38bc-464c-b577-eb1d88745e20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 NetworkManager[45041]: <info>  [1759410023.0271] device (tap9221552d-e2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.025 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[b12c0800-2830-498b-8452-338111272947]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 NetworkManager[45041]: <info>  [1759410023.0281] device (tap9221552d-e2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:00:23 np0005465988 systemd[1]: Started Virtual Machine qemu-92-instance-000000c5.
Oct  2 09:00:23 np0005465988 nova_compute[236126]: 2025-10-02 13:00:23.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:23 np0005465988 ovn_controller[132601]: 2025-10-02T13:00:23Z|00889|binding|INFO|Setting lport 9221552d-e239-4ec0-9a6e-28c040fcba23 ovn-installed in OVS
Oct  2 09:00:23 np0005465988 ovn_controller[132601]: 2025-10-02T13:00:23Z|00890|binding|INFO|Setting lport 9221552d-e239-4ec0-9a6e-28c040fcba23 up in Southbound
Oct  2 09:00:23 np0005465988 nova_compute[236126]: 2025-10-02 13:00:23.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.060 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9fcf0f8b-d8b0-4fe5-ba1b-994df6918d8a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.103 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[dfbfca3d-155d-4d9d-8609-d133e8fb4fb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 NetworkManager[45041]: <info>  [1759410023.1138] manager: (tap7de325ef-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/388)
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.112 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d3392f81-a907-4648-ba77-4e32479c3398]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 systemd-udevd[327730]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.155 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f911af14-b6b8-4b8c-bdc5-197b8b0579cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.159 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[75181414-1ad1-4d75-bcde-e951f2785776]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 NetworkManager[45041]: <info>  [1759410023.1851] device (tap7de325ef-c0): carrier: link connected
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.192 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8c5e2064-4caa-4aa4-aa40-fc48ab53d4d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.211 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e837bbd6-e5ca-4096-8c8b-405f8733fdb2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7de325ef-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:44:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 259], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 808351, 'reachable_time': 24429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 327760, 'error': None, 'target': 'ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.234 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7eab283e-43c3-43f3-9418-5f43c3e1301f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1c:449e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 808351, 'tstamp': 808351}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 327761, 'error': None, 'target': 'ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.259 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b397e320-217b-48e7-9557-eb427074d5a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7de325ef-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:44:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 259], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 808351, 'reachable_time': 24429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 327762, 'error': None, 'target': 'ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.294 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9eb8fc23-1e9c-413e-a60d-b5c956eef53f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.367 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[57aee10a-1229-4401-98dd-0f6930b8cfa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.368 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7de325ef-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.368 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.369 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7de325ef-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:00:23 np0005465988 nova_compute[236126]: 2025-10-02 13:00:23.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:23 np0005465988 kernel: tap7de325ef-c0: entered promiscuous mode
Oct  2 09:00:23 np0005465988 NetworkManager[45041]: <info>  [1759410023.3733] manager: (tap7de325ef-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/389)
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.375 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7de325ef-c0, col_values=(('external_ids', {'iface-id': 'aa247423-9f6a-4efe-a8cb-4e606af0e2b3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:00:23 np0005465988 ovn_controller[132601]: 2025-10-02T13:00:23Z|00891|binding|INFO|Releasing lport aa247423-9f6a-4efe-a8cb-4e606af0e2b3 from this chassis (sb_readonly=0)
Oct  2 09:00:23 np0005465988 nova_compute[236126]: 2025-10-02 13:00:23.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:23 np0005465988 nova_compute[236126]: 2025-10-02 13:00:23.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.391 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7de325ef-ce0b-4c58-9e5a-e7fd1d046659.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7de325ef-ce0b-4c58-9e5a-e7fd1d046659.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.392 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6d7d1532-c00c-425a-acbf-afb9150ef71e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.392 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-7de325ef-ce0b-4c58-9e5a-e7fd1d046659
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/7de325ef-ce0b-4c58-9e5a-e7fd1d046659.pid.haproxy
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 7de325ef-ce0b-4c58-9e5a-e7fd1d046659
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 09:00:23 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:23.393 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659', 'env', 'PROCESS_TAG=haproxy-7de325ef-ce0b-4c58-9e5a-e7fd1d046659', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7de325ef-ce0b-4c58-9e5a-e7fd1d046659.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 09:00:23 np0005465988 podman[327836]: 2025-10-02 13:00:23.831451656 +0000 UTC m=+0.088926132 container create 9461b39821cfc790a2efe46eaaac2106a64ba5e0dcaf3a660fa5f00df833ace7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 09:00:23 np0005465988 podman[327836]: 2025-10-02 13:00:23.768093411 +0000 UTC m=+0.025567897 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 09:00:23 np0005465988 systemd[1]: Started libpod-conmon-9461b39821cfc790a2efe46eaaac2106a64ba5e0dcaf3a660fa5f00df833ace7.scope.
Oct  2 09:00:23 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:00:23 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7061d632cc68e1c4762bb6bab293af8da41d564ffbf1b798405f82e82e95888c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 09:00:23 np0005465988 podman[327836]: 2025-10-02 13:00:23.921469389 +0000 UTC m=+0.178943885 container init 9461b39821cfc790a2efe46eaaac2106a64ba5e0dcaf3a660fa5f00df833ace7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 09:00:23 np0005465988 podman[327836]: 2025-10-02 13:00:23.926494714 +0000 UTC m=+0.183969190 container start 9461b39821cfc790a2efe46eaaac2106a64ba5e0dcaf3a660fa5f00df833ace7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 09:00:23 np0005465988 podman[327848]: 2025-10-02 13:00:23.935585976 +0000 UTC m=+0.065286212 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Oct  2 09:00:23 np0005465988 neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659[327852]: [NOTICE]   (327875) : New worker (327877) forked
Oct  2 09:00:23 np0005465988 neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659[327852]: [NOTICE]   (327875) : Loading success.
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.051 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410024.050051, 6575cd5d-2f78-4718-b6b8-16026aec208c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.051 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] VM Started (Lifecycle Event)#033[00m
Oct  2 09:00:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:24.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.069 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.075 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410024.0508103, 6575cd5d-2f78-4718-b6b8-16026aec208c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.075 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] VM Paused (Lifecycle Event)#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.098 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.103 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.122 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.573 2 DEBUG nova.network.neutron [req-c43e9521-5b1c-41db-b073-91a201fc183d req-a9f52e0b-70cf-4797-abc1-2350c0e5841a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Updated VIF entry in instance network info cache for port 9221552d-e239-4ec0-9a6e-28c040fcba23. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.574 2 DEBUG nova.network.neutron [req-c43e9521-5b1c-41db-b073-91a201fc183d req-a9f52e0b-70cf-4797-abc1-2350c0e5841a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Updating instance_info_cache with network_info: [{"id": "9221552d-e239-4ec0-9a6e-28c040fcba23", "address": "fa:16:3e:ed:97:e8", "network": {"id": "7de325ef-ce0b-4c58-9e5a-e7fd1d046659", "bridge": "br-int", "label": "tempest-network-smoke--2119894440", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9221552d-e2", "ovs_interfaceid": "9221552d-e239-4ec0-9a6e-28c040fcba23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.590 2 DEBUG oslo_concurrency.lockutils [req-c43e9521-5b1c-41db-b073-91a201fc183d req-a9f52e0b-70cf-4797-abc1-2350c0e5841a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.608 2 DEBUG nova.compute.manager [req-0936c905-6436-44b5-9177-01e2d5421b36 req-8dcb067c-e9ce-4615-8e9c-d8567af105d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Received event network-vif-plugged-9221552d-e239-4ec0-9a6e-28c040fcba23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.608 2 DEBUG oslo_concurrency.lockutils [req-0936c905-6436-44b5-9177-01e2d5421b36 req-8dcb067c-e9ce-4615-8e9c-d8567af105d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.608 2 DEBUG oslo_concurrency.lockutils [req-0936c905-6436-44b5-9177-01e2d5421b36 req-8dcb067c-e9ce-4615-8e9c-d8567af105d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.609 2 DEBUG oslo_concurrency.lockutils [req-0936c905-6436-44b5-9177-01e2d5421b36 req-8dcb067c-e9ce-4615-8e9c-d8567af105d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.609 2 DEBUG nova.compute.manager [req-0936c905-6436-44b5-9177-01e2d5421b36 req-8dcb067c-e9ce-4615-8e9c-d8567af105d5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Processing event network-vif-plugged-9221552d-e239-4ec0-9a6e-28c040fcba23 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.610 2 DEBUG nova.compute.manager [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.613 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410024.6136467, 6575cd5d-2f78-4718-b6b8-16026aec208c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.614 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] VM Resumed (Lifecycle Event)#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.616 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.620 2 INFO nova.virt.libvirt.driver [-] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Instance spawned successfully.#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.621 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.631 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.637 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.645 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.646 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.646 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.647 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.647 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.648 2 DEBUG nova.virt.libvirt.driver [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.653 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.695 2 INFO nova.compute.manager [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Took 7.99 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.696 2 DEBUG nova.compute.manager [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.760 2 INFO nova.compute.manager [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Took 9.13 seconds to build instance.#033[00m
Oct  2 09:00:24 np0005465988 nova_compute[236126]: 2025-10-02 13:00:24.775 2 DEBUG oslo_concurrency.lockutils [None req-6161c2cc-efa8-489c-949b-f00230926ee5 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:24.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:00:25 np0005465988 nova_compute[236126]: 2025-10-02 13:00:25.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:25 np0005465988 nova_compute[236126]: 2025-10-02 13:00:25.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:26.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:26 np0005465988 nova_compute[236126]: 2025-10-02 13:00:26.691 2 DEBUG nova.compute.manager [req-73290602-761c-4508-a25e-74a85c677c7d req-012256c5-9aeb-49fe-9ba1-5acb1147352e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Received event network-vif-plugged-9221552d-e239-4ec0-9a6e-28c040fcba23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:00:26 np0005465988 nova_compute[236126]: 2025-10-02 13:00:26.691 2 DEBUG oslo_concurrency.lockutils [req-73290602-761c-4508-a25e-74a85c677c7d req-012256c5-9aeb-49fe-9ba1-5acb1147352e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:26 np0005465988 nova_compute[236126]: 2025-10-02 13:00:26.692 2 DEBUG oslo_concurrency.lockutils [req-73290602-761c-4508-a25e-74a85c677c7d req-012256c5-9aeb-49fe-9ba1-5acb1147352e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:26 np0005465988 nova_compute[236126]: 2025-10-02 13:00:26.692 2 DEBUG oslo_concurrency.lockutils [req-73290602-761c-4508-a25e-74a85c677c7d req-012256c5-9aeb-49fe-9ba1-5acb1147352e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:26 np0005465988 nova_compute[236126]: 2025-10-02 13:00:26.692 2 DEBUG nova.compute.manager [req-73290602-761c-4508-a25e-74a85c677c7d req-012256c5-9aeb-49fe-9ba1-5acb1147352e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] No waiting events found dispatching network-vif-plugged-9221552d-e239-4ec0-9a6e-28c040fcba23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:00:26 np0005465988 nova_compute[236126]: 2025-10-02 13:00:26.692 2 WARNING nova.compute.manager [req-73290602-761c-4508-a25e-74a85c677c7d req-012256c5-9aeb-49fe-9ba1-5acb1147352e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Received unexpected event network-vif-plugged-9221552d-e239-4ec0-9a6e-28c040fcba23 for instance with vm_state active and task_state None.#033[00m
Oct  2 09:00:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:26.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:27 np0005465988 nova_compute[236126]: 2025-10-02 13:00:27.284 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:27 np0005465988 nova_compute[236126]: 2025-10-02 13:00:27.284 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:27 np0005465988 nova_compute[236126]: 2025-10-02 13:00:27.284 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:00:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:27.407 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:27.408 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:27.409 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:28.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:28 np0005465988 nova_compute[236126]: 2025-10-02 13:00:28.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:28.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:29 np0005465988 NetworkManager[45041]: <info>  [1759410029.2628] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/390)
Oct  2 09:00:29 np0005465988 NetworkManager[45041]: <info>  [1759410029.2646] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/391)
Oct  2 09:00:29 np0005465988 nova_compute[236126]: 2025-10-02 13:00:29.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:29 np0005465988 nova_compute[236126]: 2025-10-02 13:00:29.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:29 np0005465988 ovn_controller[132601]: 2025-10-02T13:00:29Z|00892|binding|INFO|Releasing lport aa247423-9f6a-4efe-a8cb-4e606af0e2b3 from this chassis (sb_readonly=0)
Oct  2 09:00:29 np0005465988 nova_compute[236126]: 2025-10-02 13:00:29.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:29 np0005465988 nova_compute[236126]: 2025-10-02 13:00:29.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:29 np0005465988 nova_compute[236126]: 2025-10-02 13:00:29.575 2 DEBUG nova.compute.manager [req-cddc0fd2-de12-4cb2-b352-3de9b5b18204 req-21652710-8a10-493f-83b4-f2766754b933 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Received event network-changed-9221552d-e239-4ec0-9a6e-28c040fcba23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:00:29 np0005465988 nova_compute[236126]: 2025-10-02 13:00:29.575 2 DEBUG nova.compute.manager [req-cddc0fd2-de12-4cb2-b352-3de9b5b18204 req-21652710-8a10-493f-83b4-f2766754b933 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Refreshing instance network info cache due to event network-changed-9221552d-e239-4ec0-9a6e-28c040fcba23. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:00:29 np0005465988 nova_compute[236126]: 2025-10-02 13:00:29.576 2 DEBUG oslo_concurrency.lockutils [req-cddc0fd2-de12-4cb2-b352-3de9b5b18204 req-21652710-8a10-493f-83b4-f2766754b933 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:00:29 np0005465988 nova_compute[236126]: 2025-10-02 13:00:29.576 2 DEBUG oslo_concurrency.lockutils [req-cddc0fd2-de12-4cb2-b352-3de9b5b18204 req-21652710-8a10-493f-83b4-f2766754b933 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:00:29 np0005465988 nova_compute[236126]: 2025-10-02 13:00:29.576 2 DEBUG nova.network.neutron [req-cddc0fd2-de12-4cb2-b352-3de9b5b18204 req-21652710-8a10-493f-83b4-f2766754b933 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Refreshing network info cache for port 9221552d-e239-4ec0-9a6e-28c040fcba23 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:00:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:30.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:30 np0005465988 nova_compute[236126]: 2025-10-02 13:00:30.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:00:30 np0005465988 nova_compute[236126]: 2025-10-02 13:00:30.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:30 np0005465988 nova_compute[236126]: 2025-10-02 13:00:30.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:30.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:31 np0005465988 nova_compute[236126]: 2025-10-02 13:00:31.073 2 DEBUG nova.network.neutron [req-cddc0fd2-de12-4cb2-b352-3de9b5b18204 req-21652710-8a10-493f-83b4-f2766754b933 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Updated VIF entry in instance network info cache for port 9221552d-e239-4ec0-9a6e-28c040fcba23. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:00:31 np0005465988 nova_compute[236126]: 2025-10-02 13:00:31.074 2 DEBUG nova.network.neutron [req-cddc0fd2-de12-4cb2-b352-3de9b5b18204 req-21652710-8a10-493f-83b4-f2766754b933 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Updating instance_info_cache with network_info: [{"id": "9221552d-e239-4ec0-9a6e-28c040fcba23", "address": "fa:16:3e:ed:97:e8", "network": {"id": "7de325ef-ce0b-4c58-9e5a-e7fd1d046659", "bridge": "br-int", "label": "tempest-network-smoke--2119894440", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9221552d-e2", "ovs_interfaceid": "9221552d-e239-4ec0-9a6e-28c040fcba23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:00:31 np0005465988 nova_compute[236126]: 2025-10-02 13:00:31.162 2 DEBUG oslo_concurrency.lockutils [req-cddc0fd2-de12-4cb2-b352-3de9b5b18204 req-21652710-8a10-493f-83b4-f2766754b933 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:00:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:32.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:32 np0005465988 nova_compute[236126]: 2025-10-02 13:00:32.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:32.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:33 np0005465988 nova_compute[236126]: 2025-10-02 13:00:33.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:33 np0005465988 nova_compute[236126]: 2025-10-02 13:00:33.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:00:33 np0005465988 nova_compute[236126]: 2025-10-02 13:00:33.476 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:00:33 np0005465988 nova_compute[236126]: 2025-10-02 13:00:33.672 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:00:33 np0005465988 nova_compute[236126]: 2025-10-02 13:00:33.672 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:00:33 np0005465988 nova_compute[236126]: 2025-10-02 13:00:33.673 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 09:00:33 np0005465988 nova_compute[236126]: 2025-10-02 13:00:33.673 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6575cd5d-2f78-4718-b6b8-16026aec208c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:00:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:34.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:35.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:35 np0005465988 nova_compute[236126]: 2025-10-02 13:00:35.183 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Updating instance_info_cache with network_info: [{"id": "9221552d-e239-4ec0-9a6e-28c040fcba23", "address": "fa:16:3e:ed:97:e8", "network": {"id": "7de325ef-ce0b-4c58-9e5a-e7fd1d046659", "bridge": "br-int", "label": "tempest-network-smoke--2119894440", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9221552d-e2", "ovs_interfaceid": "9221552d-e239-4ec0-9a6e-28c040fcba23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:00:35 np0005465988 nova_compute[236126]: 2025-10-02 13:00:35.200 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:00:35 np0005465988 nova_compute[236126]: 2025-10-02 13:00:35.201 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 09:00:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:00:35 np0005465988 nova_compute[236126]: 2025-10-02 13:00:35.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:35 np0005465988 podman[327895]: 2025-10-02 13:00:35.5494581 +0000 UTC m=+0.064144179 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 09:00:35 np0005465988 podman[327896]: 2025-10-02 13:00:35.585475067 +0000 UTC m=+0.094034069 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:00:35 np0005465988 podman[327894]: 2025-10-02 13:00:35.606449671 +0000 UTC m=+0.124445905 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 09:00:35 np0005465988 nova_compute[236126]: 2025-10-02 13:00:35.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:36.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:37.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:37 np0005465988 ovn_controller[132601]: 2025-10-02T13:00:37Z|00112|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ed:97:e8 10.100.0.6
Oct  2 09:00:37 np0005465988 ovn_controller[132601]: 2025-10-02T13:00:37Z|00113|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ed:97:e8 10.100.0.6
Oct  2 09:00:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:38.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:00:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:00:38 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:00:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:39.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:40.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:00:40 np0005465988 nova_compute[236126]: 2025-10-02 13:00:40.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:40 np0005465988 nova_compute[236126]: 2025-10-02 13:00:40.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:41.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:42.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:43.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:44.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:44 np0005465988 nova_compute[236126]: 2025-10-02 13:00:44.373 2 INFO nova.compute.manager [None req-c51202e8-9685-4d0d-a4c1-a1142c381905 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Get console output#033[00m
Oct  2 09:00:44 np0005465988 nova_compute[236126]: 2025-10-02 13:00:44.380 15591 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 09:00:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:00:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:00:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:45.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:00:45 np0005465988 nova_compute[236126]: 2025-10-02 13:00:45.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:45 np0005465988 nova_compute[236126]: 2025-10-02 13:00:45.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:46.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:46 np0005465988 ovn_controller[132601]: 2025-10-02T13:00:46Z|00114|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ed:97:e8 10.100.0.6
Oct  2 09:00:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:47.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:47 np0005465988 nova_compute[236126]: 2025-10-02 13:00:47.196 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:48.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:00:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:49.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:00:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:50.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.149 2 DEBUG nova.compute.manager [req-aa555697-a005-47a7-9991-9174567a70ae req-99648c8c-06a7-423a-b467-a159a7e4112a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Received event network-changed-9221552d-e239-4ec0-9a6e-28c040fcba23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.149 2 DEBUG nova.compute.manager [req-aa555697-a005-47a7-9991-9174567a70ae req-99648c8c-06a7-423a-b467-a159a7e4112a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Refreshing instance network info cache due to event network-changed-9221552d-e239-4ec0-9a6e-28c040fcba23. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.150 2 DEBUG oslo_concurrency.lockutils [req-aa555697-a005-47a7-9991-9174567a70ae req-99648c8c-06a7-423a-b467-a159a7e4112a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.150 2 DEBUG oslo_concurrency.lockutils [req-aa555697-a005-47a7-9991-9174567a70ae req-99648c8c-06a7-423a-b467-a159a7e4112a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.150 2 DEBUG nova.network.neutron [req-aa555697-a005-47a7-9991-9174567a70ae req-99648c8c-06a7-423a-b467-a159a7e4112a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Refreshing network info cache for port 9221552d-e239-4ec0-9a6e-28c040fcba23 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.244 2 DEBUG oslo_concurrency.lockutils [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "6575cd5d-2f78-4718-b6b8-16026aec208c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.244 2 DEBUG oslo_concurrency.lockutils [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.244 2 DEBUG oslo_concurrency.lockutils [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.245 2 DEBUG oslo_concurrency.lockutils [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.245 2 DEBUG oslo_concurrency.lockutils [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.246 2 INFO nova.compute.manager [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Terminating instance#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.247 2 DEBUG nova.compute.manager [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 09:00:50 np0005465988 kernel: tap9221552d-e2 (unregistering): left promiscuous mode
Oct  2 09:00:50 np0005465988 NetworkManager[45041]: <info>  [1759410050.3344] device (tap9221552d-e2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:50 np0005465988 ovn_controller[132601]: 2025-10-02T13:00:50Z|00893|binding|INFO|Releasing lport 9221552d-e239-4ec0-9a6e-28c040fcba23 from this chassis (sb_readonly=0)
Oct  2 09:00:50 np0005465988 ovn_controller[132601]: 2025-10-02T13:00:50Z|00894|binding|INFO|Setting lport 9221552d-e239-4ec0-9a6e-28c040fcba23 down in Southbound
Oct  2 09:00:50 np0005465988 ovn_controller[132601]: 2025-10-02T13:00:50Z|00895|binding|INFO|Removing iface tap9221552d-e2 ovn-installed in OVS
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:50.373 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:97:e8 10.100.0.6'], port_security=['fa:16:3e:ed:97:e8 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6575cd5d-2f78-4718-b6b8-16026aec208c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7de325ef-ce0b-4c58-9e5a-e7fd1d046659', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'afe8f391-0a44-4d97-b178-976c8f6e4f3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dc47b176-2cb3-4091-92f6-666c5ed4cb38, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=9221552d-e239-4ec0-9a6e-28c040fcba23) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:50.375 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 9221552d-e239-4ec0-9a6e-28c040fcba23 in datapath 7de325ef-ce0b-4c58-9e5a-e7fd1d046659 unbound from our chassis#033[00m
Oct  2 09:00:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:50.376 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7de325ef-ce0b-4c58-9e5a-e7fd1d046659, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 09:00:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:50.378 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e09622a1-3479-4f02-9599-d8dd4b96faca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:50.378 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659 namespace which is not needed anymore#033[00m
Oct  2 09:00:50 np0005465988 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000c5.scope: Deactivated successfully.
Oct  2 09:00:50 np0005465988 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000c5.scope: Consumed 15.057s CPU time.
Oct  2 09:00:50 np0005465988 systemd-machined[192594]: Machine qemu-92-instance-000000c5 terminated.
Oct  2 09:00:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.491 2 INFO nova.virt.libvirt.driver [-] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Instance destroyed successfully.#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.492 2 DEBUG nova.objects.instance [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'resources' on Instance uuid 6575cd5d-2f78-4718-b6b8-16026aec208c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.520 2 DEBUG nova.virt.libvirt.vif [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:00:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-48146842',display_name='tempest-TestNetworkBasicOps-server-48146842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-48146842',id=197,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFTzsnbn+W8rzAT2a52k99bGMwHywSGrM7Q6kP25/NGhEia97iCwvjkAY/jK0D4sgOXB4OPIUZM/wSRzdRxmyFxJk6uXMlT+HX2/sfp2LZC/CCn6EmQRg3Oz7tf1Vw+DMg==',key_name='tempest-TestNetworkBasicOps-926842640',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:00:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-8c0qztcx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:00:24Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=6575cd5d-2f78-4718-b6b8-16026aec208c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9221552d-e239-4ec0-9a6e-28c040fcba23", "address": "fa:16:3e:ed:97:e8", "network": {"id": "7de325ef-ce0b-4c58-9e5a-e7fd1d046659", "bridge": "br-int", "label": "tempest-network-smoke--2119894440", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9221552d-e2", "ovs_interfaceid": "9221552d-e239-4ec0-9a6e-28c040fcba23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.521 2 DEBUG nova.network.os_vif_util [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "9221552d-e239-4ec0-9a6e-28c040fcba23", "address": "fa:16:3e:ed:97:e8", "network": {"id": "7de325ef-ce0b-4c58-9e5a-e7fd1d046659", "bridge": "br-int", "label": "tempest-network-smoke--2119894440", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9221552d-e2", "ovs_interfaceid": "9221552d-e239-4ec0-9a6e-28c040fcba23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.521 2 DEBUG nova.network.os_vif_util [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ed:97:e8,bridge_name='br-int',has_traffic_filtering=True,id=9221552d-e239-4ec0-9a6e-28c040fcba23,network=Network(7de325ef-ce0b-4c58-9e5a-e7fd1d046659),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9221552d-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.522 2 DEBUG os_vif [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ed:97:e8,bridge_name='br-int',has_traffic_filtering=True,id=9221552d-e239-4ec0-9a6e-28c040fcba23,network=Network(7de325ef-ce0b-4c58-9e5a-e7fd1d046659),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9221552d-e2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.524 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9221552d-e2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.528 2 INFO os_vif [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ed:97:e8,bridge_name='br-int',has_traffic_filtering=True,id=9221552d-e239-4ec0-9a6e-28c040fcba23,network=Network(7de325ef-ce0b-4c58-9e5a-e7fd1d046659),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9221552d-e2')#033[00m
Oct  2 09:00:50 np0005465988 neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659[327852]: [NOTICE]   (327875) : haproxy version is 2.8.14-c23fe91
Oct  2 09:00:50 np0005465988 neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659[327852]: [NOTICE]   (327875) : path to executable is /usr/sbin/haproxy
Oct  2 09:00:50 np0005465988 neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659[327852]: [WARNING]  (327875) : Exiting Master process...
Oct  2 09:00:50 np0005465988 neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659[327852]: [ALERT]    (327875) : Current worker (327877) exited with code 143 (Terminated)
Oct  2 09:00:50 np0005465988 neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659[327852]: [WARNING]  (327875) : All workers exited. Exiting... (0)
Oct  2 09:00:50 np0005465988 systemd[1]: libpod-9461b39821cfc790a2efe46eaaac2106a64ba5e0dcaf3a660fa5f00df833ace7.scope: Deactivated successfully.
Oct  2 09:00:50 np0005465988 podman[328227]: 2025-10-02 13:00:50.552373852 +0000 UTC m=+0.052374469 container died 9461b39821cfc790a2efe46eaaac2106a64ba5e0dcaf3a660fa5f00df833ace7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:00:50 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9461b39821cfc790a2efe46eaaac2106a64ba5e0dcaf3a660fa5f00df833ace7-userdata-shm.mount: Deactivated successfully.
Oct  2 09:00:50 np0005465988 systemd[1]: var-lib-containers-storage-overlay-7061d632cc68e1c4762bb6bab293af8da41d564ffbf1b798405f82e82e95888c-merged.mount: Deactivated successfully.
Oct  2 09:00:50 np0005465988 podman[328227]: 2025-10-02 13:00:50.632777158 +0000 UTC m=+0.132777785 container cleanup 9461b39821cfc790a2efe46eaaac2106a64ba5e0dcaf3a660fa5f00df833ace7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 09:00:50 np0005465988 systemd[1]: libpod-conmon-9461b39821cfc790a2efe46eaaac2106a64ba5e0dcaf3a660fa5f00df833ace7.scope: Deactivated successfully.
Oct  2 09:00:50 np0005465988 podman[328277]: 2025-10-02 13:00:50.705778441 +0000 UTC m=+0.048905500 container remove 9461b39821cfc790a2efe46eaaac2106a64ba5e0dcaf3a660fa5f00df833ace7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 09:00:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:50.711 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[01360188-e81c-495c-8f9a-cda70df192aa]: (4, ('Thu Oct  2 01:00:50 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659 (9461b39821cfc790a2efe46eaaac2106a64ba5e0dcaf3a660fa5f00df833ace7)\n9461b39821cfc790a2efe46eaaac2106a64ba5e0dcaf3a660fa5f00df833ace7\nThu Oct  2 01:00:50 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659 (9461b39821cfc790a2efe46eaaac2106a64ba5e0dcaf3a660fa5f00df833ace7)\n9461b39821cfc790a2efe46eaaac2106a64ba5e0dcaf3a660fa5f00df833ace7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:50.713 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[74e28f5b-829d-42c8-b7db-0a1229e791db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:50.714 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7de325ef-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:50 np0005465988 kernel: tap7de325ef-c0: left promiscuous mode
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:50.721 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9716587e-8f68-4291-8fb9-3460bdff880d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:50.756 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cda20fa3-eaf3-447e-9f57-c7ed08ba345a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:50.759 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[58f4cf42-17db-425f-acc0-9d754ce80140]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:50.776 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[06e3380e-8a87-49af-85dc-1db101c724e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 808342, 'reachable_time': 26851, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 328292, 'error': None, 'target': 'ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:50 np0005465988 systemd[1]: run-netns-ovnmeta\x2d7de325ef\x2dce0b\x2d4c58\x2d9e5a\x2de7fd1d046659.mount: Deactivated successfully.
Oct  2 09:00:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:50.781 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7de325ef-ce0b-4c58-9e5a-e7fd1d046659 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 09:00:50 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:00:50.782 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[1b780097-aff8-45e6-b85a-142a34f65e1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.962 2 DEBUG nova.compute.manager [req-925a4e3a-1230-42c7-9038-cbf48b92362b req-9ba65afb-1559-4954-8793-40f8e31c7f82 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Received event network-vif-unplugged-9221552d-e239-4ec0-9a6e-28c040fcba23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.963 2 DEBUG oslo_concurrency.lockutils [req-925a4e3a-1230-42c7-9038-cbf48b92362b req-9ba65afb-1559-4954-8793-40f8e31c7f82 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.963 2 DEBUG oslo_concurrency.lockutils [req-925a4e3a-1230-42c7-9038-cbf48b92362b req-9ba65afb-1559-4954-8793-40f8e31c7f82 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.963 2 DEBUG oslo_concurrency.lockutils [req-925a4e3a-1230-42c7-9038-cbf48b92362b req-9ba65afb-1559-4954-8793-40f8e31c7f82 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.964 2 DEBUG nova.compute.manager [req-925a4e3a-1230-42c7-9038-cbf48b92362b req-9ba65afb-1559-4954-8793-40f8e31c7f82 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] No waiting events found dispatching network-vif-unplugged-9221552d-e239-4ec0-9a6e-28c040fcba23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:00:50 np0005465988 nova_compute[236126]: 2025-10-02 13:00:50.964 2 DEBUG nova.compute.manager [req-925a4e3a-1230-42c7-9038-cbf48b92362b req-9ba65afb-1559-4954-8793-40f8e31c7f82 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Received event network-vif-unplugged-9221552d-e239-4ec0-9a6e-28c040fcba23 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 09:00:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:51.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:51 np0005465988 nova_compute[236126]: 2025-10-02 13:00:51.123 2 INFO nova.virt.libvirt.driver [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Deleting instance files /var/lib/nova/instances/6575cd5d-2f78-4718-b6b8-16026aec208c_del#033[00m
Oct  2 09:00:51 np0005465988 nova_compute[236126]: 2025-10-02 13:00:51.125 2 INFO nova.virt.libvirt.driver [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Deletion of /var/lib/nova/instances/6575cd5d-2f78-4718-b6b8-16026aec208c_del complete#033[00m
Oct  2 09:00:51 np0005465988 nova_compute[236126]: 2025-10-02 13:00:51.590 2 INFO nova.compute.manager [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Took 1.34 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 09:00:51 np0005465988 nova_compute[236126]: 2025-10-02 13:00:51.590 2 DEBUG oslo.service.loopingcall [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 09:00:51 np0005465988 nova_compute[236126]: 2025-10-02 13:00:51.590 2 DEBUG nova.compute.manager [-] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 09:00:51 np0005465988 nova_compute[236126]: 2025-10-02 13:00:51.591 2 DEBUG nova.network.neutron [-] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 09:00:51 np0005465988 nova_compute[236126]: 2025-10-02 13:00:51.597 2 DEBUG nova.network.neutron [req-aa555697-a005-47a7-9991-9174567a70ae req-99648c8c-06a7-423a-b467-a159a7e4112a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Updated VIF entry in instance network info cache for port 9221552d-e239-4ec0-9a6e-28c040fcba23. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:00:51 np0005465988 nova_compute[236126]: 2025-10-02 13:00:51.597 2 DEBUG nova.network.neutron [req-aa555697-a005-47a7-9991-9174567a70ae req-99648c8c-06a7-423a-b467-a159a7e4112a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Updating instance_info_cache with network_info: [{"id": "9221552d-e239-4ec0-9a6e-28c040fcba23", "address": "fa:16:3e:ed:97:e8", "network": {"id": "7de325ef-ce0b-4c58-9e5a-e7fd1d046659", "bridge": "br-int", "label": "tempest-network-smoke--2119894440", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9221552d-e2", "ovs_interfaceid": "9221552d-e239-4ec0-9a6e-28c040fcba23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:00:51 np0005465988 nova_compute[236126]: 2025-10-02 13:00:51.619 2 DEBUG oslo_concurrency.lockutils [req-aa555697-a005-47a7-9991-9174567a70ae req-99648c8c-06a7-423a-b467-a159a7e4112a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-6575cd5d-2f78-4718-b6b8-16026aec208c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:00:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:00:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:52.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:00:52 np0005465988 nova_compute[236126]: 2025-10-02 13:00:52.269 2 DEBUG nova.network.neutron [-] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:00:52 np0005465988 nova_compute[236126]: 2025-10-02 13:00:52.284 2 INFO nova.compute.manager [-] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Took 0.69 seconds to deallocate network for instance.#033[00m
Oct  2 09:00:52 np0005465988 nova_compute[236126]: 2025-10-02 13:00:52.335 2 DEBUG nova.compute.manager [req-55dac668-64c6-4c60-b7e9-c26e6ae12e31 req-41f5c5fb-2849-49bb-a555-4ef69c1a157d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Received event network-vif-deleted-9221552d-e239-4ec0-9a6e-28c040fcba23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:00:52 np0005465988 nova_compute[236126]: 2025-10-02 13:00:52.347 2 DEBUG oslo_concurrency.lockutils [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:52 np0005465988 nova_compute[236126]: 2025-10-02 13:00:52.348 2 DEBUG oslo_concurrency.lockutils [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:52 np0005465988 nova_compute[236126]: 2025-10-02 13:00:52.398 2 DEBUG oslo_concurrency.processutils [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:00:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 09:00:52 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1841137174' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 09:00:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 09:00:52 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1841137174' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 09:00:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:00:52 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1858059662' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:00:52 np0005465988 nova_compute[236126]: 2025-10-02 13:00:52.847 2 DEBUG oslo_concurrency.processutils [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:00:52 np0005465988 nova_compute[236126]: 2025-10-02 13:00:52.854 2 DEBUG nova.compute.provider_tree [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:00:52 np0005465988 nova_compute[236126]: 2025-10-02 13:00:52.877 2 DEBUG nova.scheduler.client.report [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:00:52 np0005465988 nova_compute[236126]: 2025-10-02 13:00:52.947 2 DEBUG oslo_concurrency.lockutils [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:52 np0005465988 nova_compute[236126]: 2025-10-02 13:00:52.994 2 INFO nova.scheduler.client.report [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Deleted allocations for instance 6575cd5d-2f78-4718-b6b8-16026aec208c#033[00m
Oct  2 09:00:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:53.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:53 np0005465988 nova_compute[236126]: 2025-10-02 13:00:53.052 2 DEBUG nova.compute.manager [req-55e22556-fd26-4008-b63d-a8e03a76b210 req-e5d78241-6455-4ce7-b6cb-d455ecf072d6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Received event network-vif-plugged-9221552d-e239-4ec0-9a6e-28c040fcba23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:00:53 np0005465988 nova_compute[236126]: 2025-10-02 13:00:53.053 2 DEBUG oslo_concurrency.lockutils [req-55e22556-fd26-4008-b63d-a8e03a76b210 req-e5d78241-6455-4ce7-b6cb-d455ecf072d6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:53 np0005465988 nova_compute[236126]: 2025-10-02 13:00:53.053 2 DEBUG oslo_concurrency.lockutils [req-55e22556-fd26-4008-b63d-a8e03a76b210 req-e5d78241-6455-4ce7-b6cb-d455ecf072d6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:53 np0005465988 nova_compute[236126]: 2025-10-02 13:00:53.053 2 DEBUG oslo_concurrency.lockutils [req-55e22556-fd26-4008-b63d-a8e03a76b210 req-e5d78241-6455-4ce7-b6cb-d455ecf072d6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:53 np0005465988 nova_compute[236126]: 2025-10-02 13:00:53.054 2 DEBUG nova.compute.manager [req-55e22556-fd26-4008-b63d-a8e03a76b210 req-e5d78241-6455-4ce7-b6cb-d455ecf072d6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] No waiting events found dispatching network-vif-plugged-9221552d-e239-4ec0-9a6e-28c040fcba23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:00:53 np0005465988 nova_compute[236126]: 2025-10-02 13:00:53.054 2 WARNING nova.compute.manager [req-55e22556-fd26-4008-b63d-a8e03a76b210 req-e5d78241-6455-4ce7-b6cb-d455ecf072d6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Received unexpected event network-vif-plugged-9221552d-e239-4ec0-9a6e-28c040fcba23 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 09:00:53 np0005465988 nova_compute[236126]: 2025-10-02 13:00:53.075 2 DEBUG oslo_concurrency.lockutils [None req-99f5df8a-27c9-4501-9abe-aaf0a0ae59b3 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "6575cd5d-2f78-4718-b6b8-16026aec208c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.831s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:54.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:54 np0005465988 podman[328318]: 2025-10-02 13:00:54.536345526 +0000 UTC m=+0.068343340 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:00:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:55.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:00:55 np0005465988 nova_compute[236126]: 2025-10-02 13:00:55.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:55 np0005465988 nova_compute[236126]: 2025-10-02 13:00:55.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:56.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:57.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:00:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:00:58.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:00:58 np0005465988 nova_compute[236126]: 2025-10-02 13:00:58.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:00:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:00:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:00:59.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:00:59 np0005465988 nova_compute[236126]: 2025-10-02 13:00:59.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:00.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:01:00 np0005465988 nova_compute[236126]: 2025-10-02 13:01:00.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:00 np0005465988 nova_compute[236126]: 2025-10-02 13:01:00.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:01.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:02.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:03.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:01:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:04.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:01:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:01:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:05.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:01:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:01:05 np0005465988 nova_compute[236126]: 2025-10-02 13:01:05.487 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759410050.4848666, 6575cd5d-2f78-4718-b6b8-16026aec208c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:01:05 np0005465988 nova_compute[236126]: 2025-10-02 13:01:05.487 2 INFO nova.compute.manager [-] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] VM Stopped (Lifecycle Event)#033[00m
Oct  2 09:01:05 np0005465988 nova_compute[236126]: 2025-10-02 13:01:05.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:05 np0005465988 nova_compute[236126]: 2025-10-02 13:01:05.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:06.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:06 np0005465988 podman[328407]: 2025-10-02 13:01:06.532538422 +0000 UTC m=+0.059864715 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 09:01:06 np0005465988 podman[328408]: 2025-10-02 13:01:06.538572986 +0000 UTC m=+0.063149040 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:01:06 np0005465988 podman[328406]: 2025-10-02 13:01:06.5702977 +0000 UTC m=+0.098815037 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 09:01:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:07.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:08.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:01:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:09.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:01:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:10.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:01:10 np0005465988 nova_compute[236126]: 2025-10-02 13:01:10.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:10 np0005465988 nova_compute[236126]: 2025-10-02 13:01:10.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:11.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:12.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:01:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:13.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:01:13 np0005465988 nova_compute[236126]: 2025-10-02 13:01:13.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:13 np0005465988 nova_compute[236126]: 2025-10-02 13:01:13.475 2 DEBUG nova.compute.manager [None req-bd8c678a-27ae-4a53-a7b2-e28aa6968132 - - - - - -] [instance: 6575cd5d-2f78-4718-b6b8-16026aec208c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:01:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:14.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:15.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:01:15 np0005465988 nova_compute[236126]: 2025-10-02 13:01:15.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:15 np0005465988 nova_compute[236126]: 2025-10-02 13:01:15.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:01:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:16.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:01:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:17.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:17 np0005465988 nova_compute[236126]: 2025-10-02 13:01:17.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:17 np0005465988 nova_compute[236126]: 2025-10-02 13:01:17.668 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:01:17 np0005465988 nova_compute[236126]: 2025-10-02 13:01:17.669 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:01:17 np0005465988 nova_compute[236126]: 2025-10-02 13:01:17.669 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:01:17 np0005465988 nova_compute[236126]: 2025-10-02 13:01:17.669 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:01:17 np0005465988 nova_compute[236126]: 2025-10-02 13:01:17.669 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:01:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:01:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3759177344' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:01:18 np0005465988 nova_compute[236126]: 2025-10-02 13:01:18.130 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:01:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:18.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:18 np0005465988 nova_compute[236126]: 2025-10-02 13:01:18.333 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:01:18 np0005465988 nova_compute[236126]: 2025-10-02 13:01:18.334 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4058MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:01:18 np0005465988 nova_compute[236126]: 2025-10-02 13:01:18.335 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:01:18 np0005465988 nova_compute[236126]: 2025-10-02 13:01:18.335 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:01:18 np0005465988 nova_compute[236126]: 2025-10-02 13:01:18.617 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:01:18 np0005465988 nova_compute[236126]: 2025-10-02 13:01:18.617 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:01:18 np0005465988 nova_compute[236126]: 2025-10-02 13:01:18.632 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 09:01:18 np0005465988 nova_compute[236126]: 2025-10-02 13:01:18.648 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 09:01:18 np0005465988 nova_compute[236126]: 2025-10-02 13:01:18.649 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 09:01:18 np0005465988 nova_compute[236126]: 2025-10-02 13:01:18.662 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 09:01:18 np0005465988 nova_compute[236126]: 2025-10-02 13:01:18.689 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 09:01:18 np0005465988 nova_compute[236126]: 2025-10-02 13:01:18.706 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:01:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:19.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:01:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/446484916' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:01:19 np0005465988 nova_compute[236126]: 2025-10-02 13:01:19.176 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:01:19 np0005465988 nova_compute[236126]: 2025-10-02 13:01:19.183 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:01:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:01:19.213 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:01:19 np0005465988 nova_compute[236126]: 2025-10-02 13:01:19.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:01:19.215 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:01:19 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:01:19.216 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:01:19 np0005465988 nova_compute[236126]: 2025-10-02 13:01:19.244 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:01:19 np0005465988 nova_compute[236126]: 2025-10-02 13:01:19.485 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:01:19 np0005465988 nova_compute[236126]: 2025-10-02 13:01:19.485 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:01:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:20.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:01:20 np0005465988 nova_compute[236126]: 2025-10-02 13:01:20.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:20 np0005465988 nova_compute[236126]: 2025-10-02 13:01:20.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:21.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:22.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:23.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:24.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:25.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:01:25 np0005465988 nova_compute[236126]: 2025-10-02 13:01:25.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:25 np0005465988 podman[328578]: 2025-10-02 13:01:25.517190974 +0000 UTC m=+0.056425756 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:01:25 np0005465988 nova_compute[236126]: 2025-10-02 13:01:25.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:26.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:27.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:01:27.407 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:01:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:01:27.408 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:01:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:01:27.408 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:01:27 np0005465988 nova_compute[236126]: 2025-10-02 13:01:27.486 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:27 np0005465988 nova_compute[236126]: 2025-10-02 13:01:27.486 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:27 np0005465988 nova_compute[236126]: 2025-10-02 13:01:27.487 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:01:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:28.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:28 np0005465988 nova_compute[236126]: 2025-10-02 13:01:28.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:29.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:29 np0005465988 nova_compute[236126]: 2025-10-02 13:01:29.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:30.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:01:30 np0005465988 nova_compute[236126]: 2025-10-02 13:01:30.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:30 np0005465988 nova_compute[236126]: 2025-10-02 13:01:30.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:31.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:32.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:32 np0005465988 nova_compute[236126]: 2025-10-02 13:01:32.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:32 np0005465988 nova_compute[236126]: 2025-10-02 13:01:32.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:33.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:34.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:35.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #148. Immutable memtables: 0.
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.135765) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 93] Flushing memtable with next log file: 148
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410095135830, "job": 93, "event": "flush_started", "num_memtables": 1, "num_entries": 2439, "num_deletes": 254, "total_data_size": 5769538, "memory_usage": 5856992, "flush_reason": "Manual Compaction"}
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 93] Level-0 flush table #149: started
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410095165506, "cf_name": "default", "job": 93, "event": "table_file_creation", "file_number": 149, "file_size": 3781841, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 71839, "largest_seqno": 74273, "table_properties": {"data_size": 3771917, "index_size": 6289, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 20823, "raw_average_key_size": 20, "raw_value_size": 3751972, "raw_average_value_size": 3729, "num_data_blocks": 273, "num_entries": 1006, "num_filter_entries": 1006, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759409886, "oldest_key_time": 1759409886, "file_creation_time": 1759410095, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 149, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 93] Flush lasted 29821 microseconds, and 16200 cpu microseconds.
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.165579) [db/flush_job.cc:967] [default] [JOB 93] Level-0 flush table #149: 3781841 bytes OK
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.165617) [db/memtable_list.cc:519] [default] Level-0 commit table #149 started
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.167600) [db/memtable_list.cc:722] [default] Level-0 commit table #149: memtable #1 done
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.167622) EVENT_LOG_v1 {"time_micros": 1759410095167614, "job": 93, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.167648) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 93] Try to delete WAL files size 5758847, prev total WAL file size 5758847, number of live WAL files 2.
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000145.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.169950) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036323735' seq:72057594037927935, type:22 .. '7061786F730036353237' seq:0, type:0; will stop at (end)
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 94] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 93 Base level 0, inputs: [149(3693KB)], [147(10011KB)]
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410095170068, "job": 94, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [149], "files_L6": [147], "score": -1, "input_data_size": 14033228, "oldest_snapshot_seqno": -1}
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 94] Generated table #150: 9557 keys, 12069511 bytes, temperature: kUnknown
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410095282972, "cf_name": "default", "job": 94, "event": "table_file_creation", "file_number": 150, "file_size": 12069511, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12007939, "index_size": 36563, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23941, "raw_key_size": 251532, "raw_average_key_size": 26, "raw_value_size": 11840444, "raw_average_value_size": 1238, "num_data_blocks": 1393, "num_entries": 9557, "num_filter_entries": 9557, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759410095, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 150, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.283348) [db/compaction/compaction_job.cc:1663] [default] [JOB 94] Compacted 1@0 + 1@6 files to L6 => 12069511 bytes
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.285209) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 124.2 rd, 106.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 9.8 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(6.9) write-amplify(3.2) OK, records in: 10082, records dropped: 525 output_compression: NoCompression
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.285239) EVENT_LOG_v1 {"time_micros": 1759410095285225, "job": 94, "event": "compaction_finished", "compaction_time_micros": 112998, "compaction_time_cpu_micros": 50754, "output_level": 6, "num_output_files": 1, "total_output_size": 12069511, "num_input_records": 10082, "num_output_records": 9557, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000149.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410095286512, "job": 94, "event": "table_file_deletion", "file_number": 149}
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000147.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410095290184, "job": 94, "event": "table_file_deletion", "file_number": 147}
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.169717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.290341) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.290349) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.290350) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.290352) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:01:35.290353) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:01:35 np0005465988 nova_compute[236126]: 2025-10-02 13:01:35.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:35 np0005465988 nova_compute[236126]: 2025-10-02 13:01:35.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:01:35 np0005465988 nova_compute[236126]: 2025-10-02 13:01:35.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:01:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:01:35 np0005465988 nova_compute[236126]: 2025-10-02 13:01:35.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:35 np0005465988 nova_compute[236126]: 2025-10-02 13:01:35.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:35 np0005465988 nova_compute[236126]: 2025-10-02 13:01:35.548 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:01:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:36.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:37.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:37 np0005465988 podman[328604]: 2025-10-02 13:01:37.533396377 +0000 UTC m=+0.062958375 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 09:01:37 np0005465988 podman[328605]: 2025-10-02 13:01:37.535476336 +0000 UTC m=+0.063738806 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:01:37 np0005465988 podman[328603]: 2025-10-02 13:01:37.563949747 +0000 UTC m=+0.097782998 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 09:01:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:38.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:39.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:40.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:01:40 np0005465988 nova_compute[236126]: 2025-10-02 13:01:40.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:40 np0005465988 nova_compute[236126]: 2025-10-02 13:01:40.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:41.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:42.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:43.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:44.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:45.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:01:45 np0005465988 nova_compute[236126]: 2025-10-02 13:01:45.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:45 np0005465988 nova_compute[236126]: 2025-10-02 13:01:45.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:46.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:46 np0005465988 nova_compute[236126]: 2025-10-02 13:01:46.269 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:01:46 np0005465988 nova_compute[236126]: 2025-10-02 13:01:46.270 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:01:46 np0005465988 nova_compute[236126]: 2025-10-02 13:01:46.667 2 DEBUG nova.compute.manager [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 09:01:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:47.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:47 np0005465988 nova_compute[236126]: 2025-10-02 13:01:47.140 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:01:47 np0005465988 nova_compute[236126]: 2025-10-02 13:01:47.142 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:01:47 np0005465988 nova_compute[236126]: 2025-10-02 13:01:47.151 2 DEBUG nova.virt.hardware [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 09:01:47 np0005465988 nova_compute[236126]: 2025-10-02 13:01:47.152 2 INFO nova.compute.claims [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 09:01:47 np0005465988 nova_compute[236126]: 2025-10-02 13:01:47.539 2 DEBUG oslo_concurrency.processutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:01:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:01:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:01:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:01:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:01:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:01:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:01:48 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/710081758' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:01:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:48.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.175 2 DEBUG oslo_concurrency.processutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.635s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.182 2 DEBUG nova.compute.provider_tree [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.242 2 DEBUG nova.scheduler.client.report [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.301 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.303 2 DEBUG nova.compute.manager [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.419 2 DEBUG nova.compute.manager [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.420 2 DEBUG nova.network.neutron [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.442 2 INFO nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.469 2 DEBUG nova.compute.manager [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.601 2 DEBUG nova.compute.manager [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.602 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.602 2 INFO nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Creating image(s)#033[00m
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.628 2 DEBUG nova.storage.rbd_utils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image f7e4398e-72d2-4983-9680-d518c4ca2b0e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:01:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:01:48 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.846 2 DEBUG nova.storage.rbd_utils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image f7e4398e-72d2-4983-9680-d518c4ca2b0e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.880 2 DEBUG nova.storage.rbd_utils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image f7e4398e-72d2-4983-9680-d518c4ca2b0e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.886 2 DEBUG oslo_concurrency.processutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.966 2 DEBUG oslo_concurrency.processutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.967 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.968 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.968 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:01:48 np0005465988 nova_compute[236126]: 2025-10-02 13:01:48.997 2 DEBUG nova.storage.rbd_utils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image f7e4398e-72d2-4983-9680-d518c4ca2b0e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 09:01:49 np0005465988 nova_compute[236126]: 2025-10-02 13:01:49.002 2 DEBUG oslo_concurrency.processutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 f7e4398e-72d2-4983-9680-d518c4ca2b0e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:01:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:49.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:49 np0005465988 nova_compute[236126]: 2025-10-02 13:01:49.433 2 DEBUG oslo_concurrency.processutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 f7e4398e-72d2-4983-9680-d518c4ca2b0e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 09:01:49 np0005465988 nova_compute[236126]: 2025-10-02 13:01:49.524 2 DEBUG nova.storage.rbd_utils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] resizing rbd image f7e4398e-72d2-4983-9680-d518c4ca2b0e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct  2 09:01:49 np0005465988 nova_compute[236126]: 2025-10-02 13:01:49.572 2 DEBUG nova.policy [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fb366465e6154871b8a53c9f500105ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 09:01:49 np0005465988 nova_compute[236126]: 2025-10-02 13:01:49.678 2 DEBUG nova.objects.instance [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'migration_context' on Instance uuid f7e4398e-72d2-4983-9680-d518c4ca2b0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 09:01:49 np0005465988 nova_compute[236126]: 2025-10-02 13:01:49.696 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 09:01:49 np0005465988 nova_compute[236126]: 2025-10-02 13:01:49.696 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Ensure instance console log exists: /var/lib/nova/instances/f7e4398e-72d2-4983-9680-d518c4ca2b0e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 09:01:49 np0005465988 nova_compute[236126]: 2025-10-02 13:01:49.697 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:01:49 np0005465988 nova_compute[236126]: 2025-10-02 13:01:49.697 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:01:49 np0005465988 nova_compute[236126]: 2025-10-02 13:01:49.697 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:01:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:01:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:50.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:01:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:01:50 np0005465988 nova_compute[236126]: 2025-10-02 13:01:50.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:01:50 np0005465988 nova_compute[236126]: 2025-10-02 13:01:50.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:01:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:51.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:52.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:53.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:53 np0005465988 nova_compute[236126]: 2025-10-02 13:01:53.739 2 DEBUG nova.network.neutron [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Successfully created port: c3e55349-c91d-43c1-be7d-394f7b35ee2e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 09:01:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:54.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:54 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:01:54 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:01:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:55.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:01:55 np0005465988 nova_compute[236126]: 2025-10-02 13:01:55.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:01:55 np0005465988 nova_compute[236126]: 2025-10-02 13:01:55.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:01:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:56.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:56 np0005465988 podman[329098]: 2025-10-02 13:01:56.533588366 +0000 UTC m=+0.059903026 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:01:56 np0005465988 nova_compute[236126]: 2025-10-02 13:01:56.740 2 DEBUG nova.network.neutron [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Successfully updated port: c3e55349-c91d-43c1-be7d-394f7b35ee2e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 09:01:56 np0005465988 nova_compute[236126]: 2025-10-02 13:01:56.925 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 09:01:56 np0005465988 nova_compute[236126]: 2025-10-02 13:01:56.926 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquired lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 09:01:56 np0005465988 nova_compute[236126]: 2025-10-02 13:01:56.926 2 DEBUG nova.network.neutron [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 09:01:57 np0005465988 nova_compute[236126]: 2025-10-02 13:01:57.004 2 DEBUG nova.compute.manager [req-a297b56e-49d8-40cc-932e-a3f7e3314837 req-358cb6e5-aba1-4857-8b0c-236f4568557d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-changed-c3e55349-c91d-43c1-be7d-394f7b35ee2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 09:01:57 np0005465988 nova_compute[236126]: 2025-10-02 13:01:57.004 2 DEBUG nova.compute.manager [req-a297b56e-49d8-40cc-932e-a3f7e3314837 req-358cb6e5-aba1-4857-8b0c-236f4568557d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Refreshing instance network info cache due to event network-changed-c3e55349-c91d-43c1-be7d-394f7b35ee2e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 09:01:57 np0005465988 nova_compute[236126]: 2025-10-02 13:01:57.005 2 DEBUG oslo_concurrency.lockutils [req-a297b56e-49d8-40cc-932e-a3f7e3314837 req-358cb6e5-aba1-4857-8b0c-236f4568557d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 09:01:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:57.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:57 np0005465988 nova_compute[236126]: 2025-10-02 13:01:57.480 2 DEBUG nova.network.neutron [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 09:01:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:01:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:01:58.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.457 2 DEBUG nova.network.neutron [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Updating instance_info_cache with network_info: [{"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.588 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Releasing lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.589 2 DEBUG nova.compute.manager [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Instance network_info: |[{"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.590 2 DEBUG oslo_concurrency.lockutils [req-a297b56e-49d8-40cc-932e-a3f7e3314837 req-358cb6e5-aba1-4857-8b0c-236f4568557d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.590 2 DEBUG nova.network.neutron [req-a297b56e-49d8-40cc-932e-a3f7e3314837 req-358cb6e5-aba1-4857-8b0c-236f4568557d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Refreshing network info cache for port c3e55349-c91d-43c1-be7d-394f7b35ee2e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.594 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Start _get_guest_xml network_info=[{"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.598 2 WARNING nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.602 2 DEBUG nova.virt.libvirt.host [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.602 2 DEBUG nova.virt.libvirt.host [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.605 2 DEBUG nova.virt.libvirt.host [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.606 2 DEBUG nova.virt.libvirt.host [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.607 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.607 2 DEBUG nova.virt.hardware [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.608 2 DEBUG nova.virt.hardware [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.608 2 DEBUG nova.virt.hardware [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.608 2 DEBUG nova.virt.hardware [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.608 2 DEBUG nova.virt.hardware [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.609 2 DEBUG nova.virt.hardware [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.609 2 DEBUG nova.virt.hardware [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.609 2 DEBUG nova.virt.hardware [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.609 2 DEBUG nova.virt.hardware [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.609 2 DEBUG nova.virt.hardware [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.610 2 DEBUG nova.virt.hardware [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 09:01:58 np0005465988 nova_compute[236126]: 2025-10-02 13:01:58.612 2 DEBUG oslo_concurrency.processutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:01:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:01:59 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3460342541' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.111 2 DEBUG oslo_concurrency.processutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.139 2 DEBUG nova.storage.rbd_utils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image f7e4398e-72d2-4983-9680-d518c4ca2b0e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:01:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:01:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:01:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:01:59.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.144 2 DEBUG oslo_concurrency.processutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:01:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:01:59 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1874987275' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.649 2 DEBUG oslo_concurrency.processutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.651 2 DEBUG nova.virt.libvirt.vif [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:01:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-651296364',display_name='tempest-TestNetworkBasicOps-server-651296364',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-651296364',id=198,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBLXzVoXqoH0c8B/MwaBanzF36D2STG/cCl7jSEQqMF9llc/T0alPbEVXmYKDW7yWtOCde2/kXi2eYv4vBR+19jXFCuuAhkxru11z4SaMvv7zHypYZ5UAQipjiPwodEPTg==',key_name='tempest-TestNetworkBasicOps-1704277327',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-3xkuzi1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:01:48Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=f7e4398e-72d2-4983-9680-d518c4ca2b0e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.652 2 DEBUG nova.network.os_vif_util [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.652 2 DEBUG nova.network.os_vif_util [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:d3:90,bridge_name='br-int',has_traffic_filtering=True,id=c3e55349-c91d-43c1-be7d-394f7b35ee2e,network=Network(3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3e55349-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.654 2 DEBUG nova.objects.instance [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'pci_devices' on Instance uuid f7e4398e-72d2-4983-9680-d518c4ca2b0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.850 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] End _get_guest_xml xml=<domain type="kvm">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  <uuid>f7e4398e-72d2-4983-9680-d518c4ca2b0e</uuid>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  <name>instance-000000c6</name>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestNetworkBasicOps-server-651296364</nova:name>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 13:01:58</nova:creationTime>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <nova:user uuid="fb366465e6154871b8a53c9f500105ce">tempest-TestNetworkBasicOps-1692262680-project-member</nova:user>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <nova:project uuid="ce2ca82c03554560b55ed747ae63f1fb">tempest-TestNetworkBasicOps-1692262680</nova:project>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <nova:port uuid="c3e55349-c91d-43c1-be7d-394f7b35ee2e">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <system>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <entry name="serial">f7e4398e-72d2-4983-9680-d518c4ca2b0e</entry>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <entry name="uuid">f7e4398e-72d2-4983-9680-d518c4ca2b0e</entry>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    </system>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  <os>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  </os>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  <features>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  </features>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  </clock>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  <devices>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/f7e4398e-72d2-4983-9680-d518c4ca2b0e_disk">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/f7e4398e-72d2-4983-9680-d518c4ca2b0e_disk.config">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:c2:d3:90"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <target dev="tapc3e55349-c9"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    </interface>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/f7e4398e-72d2-4983-9680-d518c4ca2b0e/console.log" append="off"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    </serial>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <video>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    </video>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    </rng>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 09:01:59 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 09:01:59 np0005465988 nova_compute[236126]:  </devices>
Oct  2 09:01:59 np0005465988 nova_compute[236126]: </domain>
Oct  2 09:01:59 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.852 2 DEBUG nova.compute.manager [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Preparing to wait for external event network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.853 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.853 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.854 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.854 2 DEBUG nova.virt.libvirt.vif [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:01:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-651296364',display_name='tempest-TestNetworkBasicOps-server-651296364',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-651296364',id=198,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBLXzVoXqoH0c8B/MwaBanzF36D2STG/cCl7jSEQqMF9llc/T0alPbEVXmYKDW7yWtOCde2/kXi2eYv4vBR+19jXFCuuAhkxru11z4SaMvv7zHypYZ5UAQipjiPwodEPTg==',key_name='tempest-TestNetworkBasicOps-1704277327',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-3xkuzi1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:01:48Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=f7e4398e-72d2-4983-9680-d518c4ca2b0e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.855 2 DEBUG nova.network.os_vif_util [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.856 2 DEBUG nova.network.os_vif_util [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:d3:90,bridge_name='br-int',has_traffic_filtering=True,id=c3e55349-c91d-43c1-be7d-394f7b35ee2e,network=Network(3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3e55349-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.856 2 DEBUG os_vif [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:d3:90,bridge_name='br-int',has_traffic_filtering=True,id=c3e55349-c91d-43c1-be7d-394f7b35ee2e,network=Network(3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3e55349-c9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.858 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.858 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.863 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3e55349-c9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.864 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc3e55349-c9, col_values=(('external_ids', {'iface-id': 'c3e55349-c91d-43c1-be7d-394f7b35ee2e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c2:d3:90', 'vm-uuid': 'f7e4398e-72d2-4983-9680-d518c4ca2b0e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:59 np0005465988 NetworkManager[45041]: <info>  [1759410119.8667] manager: (tapc3e55349-c9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/392)
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:59 np0005465988 nova_compute[236126]: 2025-10-02 13:01:59.876 2 INFO os_vif [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:d3:90,bridge_name='br-int',has_traffic_filtering=True,id=c3e55349-c91d-43c1-be7d-394f7b35ee2e,network=Network(3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3e55349-c9')#033[00m
Oct  2 09:02:00 np0005465988 nova_compute[236126]: 2025-10-02 13:02:00.046 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:02:00 np0005465988 nova_compute[236126]: 2025-10-02 13:02:00.047 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:02:00 np0005465988 nova_compute[236126]: 2025-10-02 13:02:00.047 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] No VIF found with MAC fa:16:3e:c2:d3:90, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 09:02:00 np0005465988 nova_compute[236126]: 2025-10-02 13:02:00.048 2 INFO nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Using config drive#033[00m
Oct  2 09:02:00 np0005465988 nova_compute[236126]: 2025-10-02 13:02:00.084 2 DEBUG nova.storage.rbd_utils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image f7e4398e-72d2-4983-9680-d518c4ca2b0e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:02:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:02:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:00.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:02:00 np0005465988 ovn_controller[132601]: 2025-10-02T13:02:00Z|00896|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  2 09:02:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:02:00 np0005465988 nova_compute[236126]: 2025-10-02 13:02:00.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:00 np0005465988 nova_compute[236126]: 2025-10-02 13:02:00.570 2 DEBUG nova.network.neutron [req-a297b56e-49d8-40cc-932e-a3f7e3314837 req-358cb6e5-aba1-4857-8b0c-236f4568557d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Updated VIF entry in instance network info cache for port c3e55349-c91d-43c1-be7d-394f7b35ee2e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:02:00 np0005465988 nova_compute[236126]: 2025-10-02 13:02:00.571 2 DEBUG nova.network.neutron [req-a297b56e-49d8-40cc-932e-a3f7e3314837 req-358cb6e5-aba1-4857-8b0c-236f4568557d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Updating instance_info_cache with network_info: [{"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:02:00 np0005465988 nova_compute[236126]: 2025-10-02 13:02:00.615 2 DEBUG oslo_concurrency.lockutils [req-a297b56e-49d8-40cc-932e-a3f7e3314837 req-358cb6e5-aba1-4857-8b0c-236f4568557d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:02:00 np0005465988 nova_compute[236126]: 2025-10-02 13:02:00.667 2 INFO nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Creating config drive at /var/lib/nova/instances/f7e4398e-72d2-4983-9680-d518c4ca2b0e/disk.config#033[00m
Oct  2 09:02:00 np0005465988 nova_compute[236126]: 2025-10-02 13:02:00.675 2 DEBUG oslo_concurrency.processutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f7e4398e-72d2-4983-9680-d518c4ca2b0e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4zxy_rjj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:02:00 np0005465988 nova_compute[236126]: 2025-10-02 13:02:00.828 2 DEBUG oslo_concurrency.processutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f7e4398e-72d2-4983-9680-d518c4ca2b0e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4zxy_rjj" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:02:00 np0005465988 nova_compute[236126]: 2025-10-02 13:02:00.867 2 DEBUG nova.storage.rbd_utils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] rbd image f7e4398e-72d2-4983-9680-d518c4ca2b0e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:02:00 np0005465988 nova_compute[236126]: 2025-10-02 13:02:00.871 2 DEBUG oslo_concurrency.processutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f7e4398e-72d2-4983-9680-d518c4ca2b0e/disk.config f7e4398e-72d2-4983-9680-d518c4ca2b0e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:02:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:01.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:01 np0005465988 nova_compute[236126]: 2025-10-02 13:02:01.581 2 DEBUG oslo_concurrency.processutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f7e4398e-72d2-4983-9680-d518c4ca2b0e/disk.config f7e4398e-72d2-4983-9680-d518c4ca2b0e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.710s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:02:01 np0005465988 nova_compute[236126]: 2025-10-02 13:02:01.582 2 INFO nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Deleting local config drive /var/lib/nova/instances/f7e4398e-72d2-4983-9680-d518c4ca2b0e/disk.config because it was imported into RBD.#033[00m
Oct  2 09:02:01 np0005465988 kernel: tapc3e55349-c9: entered promiscuous mode
Oct  2 09:02:01 np0005465988 NetworkManager[45041]: <info>  [1759410121.6430] manager: (tapc3e55349-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/393)
Oct  2 09:02:01 np0005465988 ovn_controller[132601]: 2025-10-02T13:02:01Z|00897|binding|INFO|Claiming lport c3e55349-c91d-43c1-be7d-394f7b35ee2e for this chassis.
Oct  2 09:02:01 np0005465988 ovn_controller[132601]: 2025-10-02T13:02:01Z|00898|binding|INFO|c3e55349-c91d-43c1-be7d-394f7b35ee2e: Claiming fa:16:3e:c2:d3:90 10.100.0.4
Oct  2 09:02:01 np0005465988 nova_compute[236126]: 2025-10-02 13:02:01.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:01 np0005465988 systemd-udevd[329303]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:02:01 np0005465988 systemd-machined[192594]: New machine qemu-93-instance-000000c6.
Oct  2 09:02:01 np0005465988 NetworkManager[45041]: <info>  [1759410121.6926] device (tapc3e55349-c9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:02:01 np0005465988 NetworkManager[45041]: <info>  [1759410121.6936] device (tapc3e55349-c9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:02:01 np0005465988 nova_compute[236126]: 2025-10-02 13:02:01.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:01 np0005465988 systemd[1]: Started Virtual Machine qemu-93-instance-000000c6.
Oct  2 09:02:01 np0005465988 ovn_controller[132601]: 2025-10-02T13:02:01Z|00899|binding|INFO|Setting lport c3e55349-c91d-43c1-be7d-394f7b35ee2e ovn-installed in OVS
Oct  2 09:02:01 np0005465988 nova_compute[236126]: 2025-10-02 13:02:01.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:01 np0005465988 ovn_controller[132601]: 2025-10-02T13:02:01Z|00900|binding|INFO|Setting lport c3e55349-c91d-43c1-be7d-394f7b35ee2e up in Southbound
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.783 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:d3:90 10.100.0.4'], port_security=['fa:16:3e:c2:d3:90 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f7e4398e-72d2-4983-9680-d518c4ca2b0e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3aaf3d03-7d0a-4651-a8c6-da6d26cf1390', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=010b8986-6ed3-4d08-a5a9-d68a3bf546a0, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=c3e55349-c91d-43c1-be7d-394f7b35ee2e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.785 142124 INFO neutron.agent.ovn.metadata.agent [-] Port c3e55349-c91d-43c1-be7d-394f7b35ee2e in datapath 3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f bound to our chassis#033[00m
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.787 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f#033[00m
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.799 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cb6317ae-1153-4234-aef9-7482611e5c3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.800 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3e2a6e0a-a1 in ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.801 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3e2a6e0a-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.801 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed89a89-e54a-4d14-aa13-bc0fe89514f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.803 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[267ecfd6-d71b-4dee-b0aa-ddc6d21ca649]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.816 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[4b9ec7ce-e55b-4dae-814f-a3762fcb0088]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.840 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[042fb9fe-0e76-4cb6-819c-39bdd5b151e6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.879 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[76873132-2631-4052-b81b-07db90b0da55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:01 np0005465988 NetworkManager[45041]: <info>  [1759410121.8889] manager: (tap3e2a6e0a-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/394)
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.888 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f5b8fa90-1802-40a5-b674-9ad383ec96b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.934 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[1cf1a54c-2a7e-4080-9e38-00889716c776]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.939 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[02d19b55-7ae9-4d4f-9059-414286d01664]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:01 np0005465988 NetworkManager[45041]: <info>  [1759410121.9731] device (tap3e2a6e0a-a0): carrier: link connected
Oct  2 09:02:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:01.981 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d6cdbe10-808e-4830-9a0a-24a8ad66fdbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:02.004 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0267ea94-5dcd-4a5d-8ca7-af9d25099022]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e2a6e0a-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:86:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 262], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 818230, 'reachable_time': 17441, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 329337, 'error': None, 'target': 'ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:02.023 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bd59b09f-4530-4a47-b1dc-65e301ddf0de]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feee:86fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 818230, 'tstamp': 818230}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 329338, 'error': None, 'target': 'ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:02.043 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[005de3c8-694a-4640-a751-eb05c4ca1125]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e2a6e0a-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:86:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 196, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 196, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 262], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 818230, 'reachable_time': 17441, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 329339, 'error': None, 'target': 'ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:02.073 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4693e05b-7e29-488f-96ab-8bb447475f9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:02.132 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[32f2836f-7178-4f8c-a366-0f7012e24195]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:02.136 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e2a6e0a-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:02.136 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:02.137 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e2a6e0a-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:02:02 np0005465988 kernel: tap3e2a6e0a-a0: entered promiscuous mode
Oct  2 09:02:02 np0005465988 nova_compute[236126]: 2025-10-02 13:02:02.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:02 np0005465988 NetworkManager[45041]: <info>  [1759410122.1411] manager: (tap3e2a6e0a-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/395)
Oct  2 09:02:02 np0005465988 nova_compute[236126]: 2025-10-02 13:02:02.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:02.143 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3e2a6e0a-a0, col_values=(('external_ids', {'iface-id': '7036aed5-0937-4e25-a7aa-7fca92d826ad'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:02:02 np0005465988 nova_compute[236126]: 2025-10-02 13:02:02.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:02 np0005465988 ovn_controller[132601]: 2025-10-02T13:02:02Z|00901|binding|INFO|Releasing lport 7036aed5-0937-4e25-a7aa-7fca92d826ad from this chassis (sb_readonly=0)
Oct  2 09:02:02 np0005465988 nova_compute[236126]: 2025-10-02 13:02:02.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:02.160 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:02.161 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8c159f90-42d4-4783-9a16-e901b1eb6e66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:02.162 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f.pid.haproxy
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 09:02:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:02.163 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f', 'env', 'PROCESS_TAG=haproxy-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 09:02:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:02.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:02 np0005465988 podman[329371]: 2025-10-02 13:02:02.525842241 +0000 UTC m=+0.025206347 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 09:02:02 np0005465988 podman[329371]: 2025-10-02 13:02:02.626791898 +0000 UTC m=+0.126155974 container create 5843b956d17ee3afa7e1fc124a7b45b21f893489cb6f4ea0d2b13667c94b2b55 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 09:02:02 np0005465988 systemd[1]: Started libpod-conmon-5843b956d17ee3afa7e1fc124a7b45b21f893489cb6f4ea0d2b13667c94b2b55.scope.
Oct  2 09:02:02 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:02:02 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08e77e52409b99907e6b5b75a4a41d2a1a93b290f28ff47870a363263bb7a379/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 09:02:02 np0005465988 podman[329371]: 2025-10-02 13:02:02.742378437 +0000 UTC m=+0.241742523 container init 5843b956d17ee3afa7e1fc124a7b45b21f893489cb6f4ea0d2b13667c94b2b55 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:02:02 np0005465988 podman[329371]: 2025-10-02 13:02:02.751049577 +0000 UTC m=+0.250413673 container start 5843b956d17ee3afa7e1fc124a7b45b21f893489cb6f4ea0d2b13667c94b2b55 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:02:02 np0005465988 neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f[329404]: [NOTICE]   (329426) : New worker (329432) forked
Oct  2 09:02:02 np0005465988 neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f[329404]: [NOTICE]   (329426) : Loading success.
Oct  2 09:02:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:03.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:03 np0005465988 nova_compute[236126]: 2025-10-02 13:02:03.243 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410123.2430832, f7e4398e-72d2-4983-9680-d518c4ca2b0e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:02:03 np0005465988 nova_compute[236126]: 2025-10-02 13:02:03.244 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] VM Started (Lifecycle Event)#033[00m
Oct  2 09:02:03 np0005465988 nova_compute[236126]: 2025-10-02 13:02:03.399 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:02:03 np0005465988 nova_compute[236126]: 2025-10-02 13:02:03.405 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410123.244118, f7e4398e-72d2-4983-9680-d518c4ca2b0e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:02:03 np0005465988 nova_compute[236126]: 2025-10-02 13:02:03.405 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] VM Paused (Lifecycle Event)#033[00m
Oct  2 09:02:03 np0005465988 nova_compute[236126]: 2025-10-02 13:02:03.678 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:02:03 np0005465988 nova_compute[236126]: 2025-10-02 13:02:03.682 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:02:04 np0005465988 nova_compute[236126]: 2025-10-02 13:02:04.155 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:02:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:04.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:04 np0005465988 nova_compute[236126]: 2025-10-02 13:02:04.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 09:02:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:05.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 09:02:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:02:05 np0005465988 nova_compute[236126]: 2025-10-02 13:02:05.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:02:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:06.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.960 2 DEBUG nova.compute.manager [req-dd32985b-3994-4149-ad9f-8f2cd5e1e06f req-9aa55258-70c1-48c4-a29e-cb0a511ab63d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.960 2 DEBUG oslo_concurrency.lockutils [req-dd32985b-3994-4149-ad9f-8f2cd5e1e06f req-9aa55258-70c1-48c4-a29e-cb0a511ab63d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.961 2 DEBUG oslo_concurrency.lockutils [req-dd32985b-3994-4149-ad9f-8f2cd5e1e06f req-9aa55258-70c1-48c4-a29e-cb0a511ab63d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.961 2 DEBUG oslo_concurrency.lockutils [req-dd32985b-3994-4149-ad9f-8f2cd5e1e06f req-9aa55258-70c1-48c4-a29e-cb0a511ab63d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.961 2 DEBUG nova.compute.manager [req-dd32985b-3994-4149-ad9f-8f2cd5e1e06f req-9aa55258-70c1-48c4-a29e-cb0a511ab63d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Processing event network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.961 2 DEBUG nova.compute.manager [req-dd32985b-3994-4149-ad9f-8f2cd5e1e06f req-9aa55258-70c1-48c4-a29e-cb0a511ab63d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.961 2 DEBUG oslo_concurrency.lockutils [req-dd32985b-3994-4149-ad9f-8f2cd5e1e06f req-9aa55258-70c1-48c4-a29e-cb0a511ab63d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.962 2 DEBUG oslo_concurrency.lockutils [req-dd32985b-3994-4149-ad9f-8f2cd5e1e06f req-9aa55258-70c1-48c4-a29e-cb0a511ab63d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.962 2 DEBUG oslo_concurrency.lockutils [req-dd32985b-3994-4149-ad9f-8f2cd5e1e06f req-9aa55258-70c1-48c4-a29e-cb0a511ab63d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.962 2 DEBUG nova.compute.manager [req-dd32985b-3994-4149-ad9f-8f2cd5e1e06f req-9aa55258-70c1-48c4-a29e-cb0a511ab63d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] No waiting events found dispatching network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.962 2 WARNING nova.compute.manager [req-dd32985b-3994-4149-ad9f-8f2cd5e1e06f req-9aa55258-70c1-48c4-a29e-cb0a511ab63d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received unexpected event network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e for instance with vm_state building and task_state spawning.#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.963 2 DEBUG nova.compute.manager [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.967 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410126.9669533, f7e4398e-72d2-4983-9680-d518c4ca2b0e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.967 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] VM Resumed (Lifecycle Event)#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.969 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.973 2 INFO nova.virt.libvirt.driver [-] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Instance spawned successfully.#033[00m
Oct  2 09:02:06 np0005465988 nova_compute[236126]: 2025-10-02 13:02:06.973 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 09:02:07 np0005465988 nova_compute[236126]: 2025-10-02 13:02:07.062 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:02:07 np0005465988 nova_compute[236126]: 2025-10-02 13:02:07.068 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:02:07 np0005465988 nova_compute[236126]: 2025-10-02 13:02:07.068 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:02:07 np0005465988 nova_compute[236126]: 2025-10-02 13:02:07.069 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:02:07 np0005465988 nova_compute[236126]: 2025-10-02 13:02:07.069 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:02:07 np0005465988 nova_compute[236126]: 2025-10-02 13:02:07.070 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:02:07 np0005465988 nova_compute[236126]: 2025-10-02 13:02:07.070 2 DEBUG nova.virt.libvirt.driver [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:02:07 np0005465988 nova_compute[236126]: 2025-10-02 13:02:07.074 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:02:07 np0005465988 nova_compute[236126]: 2025-10-02 13:02:07.149 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:02:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:07.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:07 np0005465988 nova_compute[236126]: 2025-10-02 13:02:07.274 2 INFO nova.compute.manager [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Took 18.67 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 09:02:07 np0005465988 nova_compute[236126]: 2025-10-02 13:02:07.274 2 DEBUG nova.compute.manager [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:02:07 np0005465988 nova_compute[236126]: 2025-10-02 13:02:07.422 2 INFO nova.compute.manager [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Took 20.31 seconds to build instance.#033[00m
Oct  2 09:02:07 np0005465988 nova_compute[236126]: 2025-10-02 13:02:07.462 2 DEBUG oslo_concurrency.lockutils [None req-ae7fc593-ffc9-4821-b5b3-ba5a506bf5a2 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:08.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:08 np0005465988 podman[329447]: 2025-10-02 13:02:08.53671358 +0000 UTC m=+0.067452114 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 09:02:08 np0005465988 podman[329448]: 2025-10-02 13:02:08.559503476 +0000 UTC m=+0.083412133 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 09:02:08 np0005465988 podman[329446]: 2025-10-02 13:02:08.571930924 +0000 UTC m=+0.102936366 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:02:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:09.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:09 np0005465988 nova_compute[236126]: 2025-10-02 13:02:09.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:10.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:02:10 np0005465988 nova_compute[236126]: 2025-10-02 13:02:10.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:11.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:12.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:13.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:14 np0005465988 nova_compute[236126]: 2025-10-02 13:02:14.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:14 np0005465988 NetworkManager[45041]: <info>  [1759410134.0788] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/396)
Oct  2 09:02:14 np0005465988 NetworkManager[45041]: <info>  [1759410134.0804] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/397)
Oct  2 09:02:14 np0005465988 nova_compute[236126]: 2025-10-02 13:02:14.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:14 np0005465988 ovn_controller[132601]: 2025-10-02T13:02:14Z|00902|binding|INFO|Releasing lport 7036aed5-0937-4e25-a7aa-7fca92d826ad from this chassis (sb_readonly=0)
Oct  2 09:02:14 np0005465988 nova_compute[236126]: 2025-10-02 13:02:14.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:14.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:14 np0005465988 nova_compute[236126]: 2025-10-02 13:02:14.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:14 np0005465988 nova_compute[236126]: 2025-10-02 13:02:14.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:02:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:15.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:02:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:02:15 np0005465988 nova_compute[236126]: 2025-10-02 13:02:15.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:15 np0005465988 nova_compute[236126]: 2025-10-02 13:02:15.660 2 DEBUG nova.compute.manager [req-e42f693d-dee2-442f-9f9f-de4b19f2abd6 req-667194c8-d8bc-4dc3-96f1-de76408482de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-changed-c3e55349-c91d-43c1-be7d-394f7b35ee2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:02:15 np0005465988 nova_compute[236126]: 2025-10-02 13:02:15.661 2 DEBUG nova.compute.manager [req-e42f693d-dee2-442f-9f9f-de4b19f2abd6 req-667194c8-d8bc-4dc3-96f1-de76408482de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Refreshing instance network info cache due to event network-changed-c3e55349-c91d-43c1-be7d-394f7b35ee2e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:02:15 np0005465988 nova_compute[236126]: 2025-10-02 13:02:15.661 2 DEBUG oslo_concurrency.lockutils [req-e42f693d-dee2-442f-9f9f-de4b19f2abd6 req-667194c8-d8bc-4dc3-96f1-de76408482de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:02:15 np0005465988 nova_compute[236126]: 2025-10-02 13:02:15.662 2 DEBUG oslo_concurrency.lockutils [req-e42f693d-dee2-442f-9f9f-de4b19f2abd6 req-667194c8-d8bc-4dc3-96f1-de76408482de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:02:15 np0005465988 nova_compute[236126]: 2025-10-02 13:02:15.662 2 DEBUG nova.network.neutron [req-e42f693d-dee2-442f-9f9f-de4b19f2abd6 req-667194c8-d8bc-4dc3-96f1-de76408482de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Refreshing network info cache for port c3e55349-c91d-43c1-be7d-394f7b35ee2e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:02:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:02:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:16.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:02:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:17.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:17 np0005465988 nova_compute[236126]: 2025-10-02 13:02:17.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:17 np0005465988 nova_compute[236126]: 2025-10-02 13:02:17.560 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:17 np0005465988 nova_compute[236126]: 2025-10-02 13:02:17.562 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:17 np0005465988 nova_compute[236126]: 2025-10-02 13:02:17.562 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:17 np0005465988 nova_compute[236126]: 2025-10-02 13:02:17.563 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:02:17 np0005465988 nova_compute[236126]: 2025-10-02 13:02:17.564 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:02:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:02:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1747323140' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:02:18 np0005465988 nova_compute[236126]: 2025-10-02 13:02:18.003 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:02:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:18.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:18 np0005465988 nova_compute[236126]: 2025-10-02 13:02:18.248 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:02:18 np0005465988 nova_compute[236126]: 2025-10-02 13:02:18.248 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:02:18 np0005465988 nova_compute[236126]: 2025-10-02 13:02:18.450 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:02:18 np0005465988 nova_compute[236126]: 2025-10-02 13:02:18.451 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3877MB free_disk=20.924232482910156GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:02:18 np0005465988 nova_compute[236126]: 2025-10-02 13:02:18.452 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:18 np0005465988 nova_compute[236126]: 2025-10-02 13:02:18.452 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:18 np0005465988 nova_compute[236126]: 2025-10-02 13:02:18.700 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance f7e4398e-72d2-4983-9680-d518c4ca2b0e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 09:02:18 np0005465988 nova_compute[236126]: 2025-10-02 13:02:18.701 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:02:18 np0005465988 nova_compute[236126]: 2025-10-02 13:02:18.701 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:02:18 np0005465988 nova_compute[236126]: 2025-10-02 13:02:18.823 2 DEBUG nova.network.neutron [req-e42f693d-dee2-442f-9f9f-de4b19f2abd6 req-667194c8-d8bc-4dc3-96f1-de76408482de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Updated VIF entry in instance network info cache for port c3e55349-c91d-43c1-be7d-394f7b35ee2e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:02:18 np0005465988 nova_compute[236126]: 2025-10-02 13:02:18.824 2 DEBUG nova.network.neutron [req-e42f693d-dee2-442f-9f9f-de4b19f2abd6 req-667194c8-d8bc-4dc3-96f1-de76408482de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Updating instance_info_cache with network_info: [{"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:02:18 np0005465988 nova_compute[236126]: 2025-10-02 13:02:18.884 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:02:18 np0005465988 nova_compute[236126]: 2025-10-02 13:02:18.922 2 DEBUG oslo_concurrency.lockutils [req-e42f693d-dee2-442f-9f9f-de4b19f2abd6 req-667194c8-d8bc-4dc3-96f1-de76408482de d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:02:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:19.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:02:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3523359346' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:02:19 np0005465988 nova_compute[236126]: 2025-10-02 13:02:19.340 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:02:19 np0005465988 nova_compute[236126]: 2025-10-02 13:02:19.345 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:02:19 np0005465988 nova_compute[236126]: 2025-10-02 13:02:19.411 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:02:19 np0005465988 nova_compute[236126]: 2025-10-02 13:02:19.514 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:02:19 np0005465988 nova_compute[236126]: 2025-10-02 13:02:19.515 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:19 np0005465988 nova_compute[236126]: 2025-10-02 13:02:19.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:20.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:20 np0005465988 nova_compute[236126]: 2025-10-02 13:02:20.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:20.385 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:02:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:20.388 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:02:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:02:20 np0005465988 nova_compute[236126]: 2025-10-02 13:02:20.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:20 np0005465988 ovn_controller[132601]: 2025-10-02T13:02:20Z|00115|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c2:d3:90 10.100.0.4
Oct  2 09:02:20 np0005465988 ovn_controller[132601]: 2025-10-02T13:02:20Z|00116|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c2:d3:90 10.100.0.4
Oct  2 09:02:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:21.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:22.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:23.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:24.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:24 np0005465988 nova_compute[236126]: 2025-10-02 13:02:24.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:25.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:02:25 np0005465988 nova_compute[236126]: 2025-10-02 13:02:25.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:02:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:26.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:02:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:27.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:27.408 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:27.409 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:27.410 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:27 np0005465988 podman[329613]: 2025-10-02 13:02:27.55432805 +0000 UTC m=+0.083974090 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 09:02:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:02:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:28.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:02:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:02:28.390 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:02:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:29.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:29 np0005465988 nova_compute[236126]: 2025-10-02 13:02:29.516 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:29 np0005465988 nova_compute[236126]: 2025-10-02 13:02:29.517 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:29 np0005465988 nova_compute[236126]: 2025-10-02 13:02:29.517 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:02:29 np0005465988 nova_compute[236126]: 2025-10-02 13:02:29.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:30.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:30 np0005465988 nova_compute[236126]: 2025-10-02 13:02:30.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:30 np0005465988 nova_compute[236126]: 2025-10-02 13:02:30.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:02:30 np0005465988 nova_compute[236126]: 2025-10-02 13:02:30.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:31.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:32.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:32 np0005465988 nova_compute[236126]: 2025-10-02 13:02:32.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:33.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:33 np0005465988 nova_compute[236126]: 2025-10-02 13:02:33.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:34.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:34 np0005465988 nova_compute[236126]: 2025-10-02 13:02:34.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:35.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:02:35 np0005465988 nova_compute[236126]: 2025-10-02 13:02:35.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:02:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:36.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:02:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:02:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:37.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:02:37 np0005465988 nova_compute[236126]: 2025-10-02 13:02:37.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:37 np0005465988 nova_compute[236126]: 2025-10-02 13:02:37.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:02:37 np0005465988 nova_compute[236126]: 2025-10-02 13:02:37.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:02:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:38.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:38 np0005465988 nova_compute[236126]: 2025-10-02 13:02:38.263 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:02:38 np0005465988 nova_compute[236126]: 2025-10-02 13:02:38.263 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:02:38 np0005465988 nova_compute[236126]: 2025-10-02 13:02:38.263 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 09:02:38 np0005465988 nova_compute[236126]: 2025-10-02 13:02:38.264 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f7e4398e-72d2-4983-9680-d518c4ca2b0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:02:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:39.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:39 np0005465988 podman[329640]: 2025-10-02 13:02:39.553830374 +0000 UTC m=+0.086577414 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:02:39 np0005465988 podman[329639]: 2025-10-02 13:02:39.554519584 +0000 UTC m=+0.090797216 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3)
Oct  2 09:02:39 np0005465988 podman[329638]: 2025-10-02 13:02:39.57832153 +0000 UTC m=+0.111560424 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 09:02:39 np0005465988 nova_compute[236126]: 2025-10-02 13:02:39.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:02:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:40.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:02:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:02:40 np0005465988 nova_compute[236126]: 2025-10-02 13:02:40.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:41.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:41 np0005465988 nova_compute[236126]: 2025-10-02 13:02:41.594 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Updating instance_info_cache with network_info: [{"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:02:41 np0005465988 nova_compute[236126]: 2025-10-02 13:02:41.628 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:02:41 np0005465988 nova_compute[236126]: 2025-10-02 13:02:41.629 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 09:02:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:42.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:02:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:43.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:02:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:44.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:44 np0005465988 nova_compute[236126]: 2025-10-02 13:02:44.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:45.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:02:45 np0005465988 nova_compute[236126]: 2025-10-02 13:02:45.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:46.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:47.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:02:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:48.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:02:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:49.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:49 np0005465988 nova_compute[236126]: 2025-10-02 13:02:49.624 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:49 np0005465988 nova_compute[236126]: 2025-10-02 13:02:49.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:02:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:50.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:02:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:02:50 np0005465988 nova_compute[236126]: 2025-10-02 13:02:50.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:51.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:51 np0005465988 nova_compute[236126]: 2025-10-02 13:02:51.341 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Acquiring lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:51 np0005465988 nova_compute[236126]: 2025-10-02 13:02:51.341 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:51 np0005465988 nova_compute[236126]: 2025-10-02 13:02:51.433 2 DEBUG nova.compute.manager [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 09:02:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:02:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:52.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:02:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:53.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:53 np0005465988 nova_compute[236126]: 2025-10-02 13:02:53.454 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:53 np0005465988 nova_compute[236126]: 2025-10-02 13:02:53.455 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:53 np0005465988 nova_compute[236126]: 2025-10-02 13:02:53.461 2 DEBUG nova.virt.hardware [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 09:02:53 np0005465988 nova_compute[236126]: 2025-10-02 13:02:53.461 2 INFO nova.compute.claims [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 09:02:53 np0005465988 nova_compute[236126]: 2025-10-02 13:02:53.714 2 DEBUG oslo_concurrency.processutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:02:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:02:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2175556008' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.189 2 DEBUG oslo_concurrency.processutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.200 2 DEBUG nova.compute.provider_tree [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.223 2 DEBUG nova.scheduler.client.report [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:02:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:02:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:54.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.265 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.810s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.266 2 DEBUG nova.compute.manager [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.336 2 DEBUG nova.compute.manager [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.337 2 DEBUG nova.network.neutron [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.365 2 INFO nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.412 2 DEBUG nova.compute.manager [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.564 2 DEBUG nova.compute.manager [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.566 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.566 2 INFO nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Creating image(s)#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.598 2 DEBUG nova.storage.rbd_utils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] rbd image 640fbec9-1ab9-4115-892a-3e91f24ed2ae_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.633 2 DEBUG nova.storage.rbd_utils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] rbd image 640fbec9-1ab9-4115-892a-3e91f24ed2ae_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.669 2 DEBUG nova.storage.rbd_utils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] rbd image 640fbec9-1ab9-4115-892a-3e91f24ed2ae_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.676 2 DEBUG oslo_concurrency.processutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.720 2 DEBUG nova.policy [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93facc00c95f4cbfa6cecaf3641182bc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5eceae619a6f4fdeaa8ba6fafda4912a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.760 2 DEBUG oslo_concurrency.processutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.762 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.764 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.764 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.803 2 DEBUG nova.storage.rbd_utils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] rbd image 640fbec9-1ab9-4115-892a-3e91f24ed2ae_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.808 2 DEBUG oslo_concurrency.processutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 640fbec9-1ab9-4115-892a-3e91f24ed2ae_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:02:54 np0005465988 nova_compute[236126]: 2025-10-02 13:02:54.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:55 np0005465988 nova_compute[236126]: 2025-10-02 13:02:55.131 2 DEBUG oslo_concurrency.processutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 640fbec9-1ab9-4115-892a-3e91f24ed2ae_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.323s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:02:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 09:02:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2008475557' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 09:02:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 09:02:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2008475557' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 09:02:55 np0005465988 nova_compute[236126]: 2025-10-02 13:02:55.239 2 DEBUG nova.storage.rbd_utils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] resizing rbd image 640fbec9-1ab9-4115-892a-3e91f24ed2ae_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 09:02:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:55.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:02:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:02:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:02:55 np0005465988 nova_compute[236126]: 2025-10-02 13:02:55.348 2 DEBUG nova.objects.instance [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lazy-loading 'migration_context' on Instance uuid 640fbec9-1ab9-4115-892a-3e91f24ed2ae obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:02:55 np0005465988 nova_compute[236126]: 2025-10-02 13:02:55.380 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 09:02:55 np0005465988 nova_compute[236126]: 2025-10-02 13:02:55.380 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Ensure instance console log exists: /var/lib/nova/instances/640fbec9-1ab9-4115-892a-3e91f24ed2ae/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 09:02:55 np0005465988 nova_compute[236126]: 2025-10-02 13:02:55.381 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:55 np0005465988 nova_compute[236126]: 2025-10-02 13:02:55.381 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:55 np0005465988 nova_compute[236126]: 2025-10-02 13:02:55.381 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:02:55 np0005465988 nova_compute[236126]: 2025-10-02 13:02:55.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:56.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:56 np0005465988 nova_compute[236126]: 2025-10-02 13:02:56.775 2 DEBUG nova.network.neutron [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Successfully created port: e808672d-dd35-463d-8c4c-c82ed7646741 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 09:02:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:57.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:58 np0005465988 nova_compute[236126]: 2025-10-02 13:02:58.132 2 DEBUG nova.network.neutron [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Successfully updated port: e808672d-dd35-463d-8c4c-c82ed7646741 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 09:02:58 np0005465988 nova_compute[236126]: 2025-10-02 13:02:58.173 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Acquiring lock "refresh_cache-640fbec9-1ab9-4115-892a-3e91f24ed2ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:02:58 np0005465988 nova_compute[236126]: 2025-10-02 13:02:58.175 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Acquired lock "refresh_cache-640fbec9-1ab9-4115-892a-3e91f24ed2ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:02:58 np0005465988 nova_compute[236126]: 2025-10-02 13:02:58.175 2 DEBUG nova.network.neutron [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 09:02:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:02:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:02:58.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:02:58 np0005465988 nova_compute[236126]: 2025-10-02 13:02:58.306 2 DEBUG nova.compute.manager [req-3060d37d-2736-4b34-987b-62c43e4dd06f req-1e4bfedc-1349-44f8-af09-9c453f258a5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Received event network-changed-e808672d-dd35-463d-8c4c-c82ed7646741 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:02:58 np0005465988 nova_compute[236126]: 2025-10-02 13:02:58.307 2 DEBUG nova.compute.manager [req-3060d37d-2736-4b34-987b-62c43e4dd06f req-1e4bfedc-1349-44f8-af09-9c453f258a5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Refreshing instance network info cache due to event network-changed-e808672d-dd35-463d-8c4c-c82ed7646741. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:02:58 np0005465988 nova_compute[236126]: 2025-10-02 13:02:58.307 2 DEBUG oslo_concurrency.lockutils [req-3060d37d-2736-4b34-987b-62c43e4dd06f req-1e4bfedc-1349-44f8-af09-9c453f258a5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-640fbec9-1ab9-4115-892a-3e91f24ed2ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:02:58 np0005465988 nova_compute[236126]: 2025-10-02 13:02:58.480 2 DEBUG nova.network.neutron [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 09:02:58 np0005465988 podman[330080]: 2025-10-02 13:02:58.539829615 +0000 UTC m=+0.068730870 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 09:02:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:02:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:02:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:02:59.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:02:59 np0005465988 nova_compute[236126]: 2025-10-02 13:02:59.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:00.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:03:00 np0005465988 nova_compute[236126]: 2025-10-02 13:03:00.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:00 np0005465988 nova_compute[236126]: 2025-10-02 13:03:00.882 2 DEBUG nova.network.neutron [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Updating instance_info_cache with network_info: [{"id": "e808672d-dd35-463d-8c4c-c82ed7646741", "address": "fa:16:3e:43:14:c5", "network": {"id": "2471b6f7-ee51-4239-8b52-7016ab4d9fd1", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1867797555-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5eceae619a6f4fdeaa8ba6fafda4912a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape808672d-dd", "ovs_interfaceid": "e808672d-dd35-463d-8c4c-c82ed7646741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:03:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:01.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.296 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Releasing lock "refresh_cache-640fbec9-1ab9-4115-892a-3e91f24ed2ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.296 2 DEBUG nova.compute.manager [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Instance network_info: |[{"id": "e808672d-dd35-463d-8c4c-c82ed7646741", "address": "fa:16:3e:43:14:c5", "network": {"id": "2471b6f7-ee51-4239-8b52-7016ab4d9fd1", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1867797555-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5eceae619a6f4fdeaa8ba6fafda4912a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape808672d-dd", "ovs_interfaceid": "e808672d-dd35-463d-8c4c-c82ed7646741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.297 2 DEBUG oslo_concurrency.lockutils [req-3060d37d-2736-4b34-987b-62c43e4dd06f req-1e4bfedc-1349-44f8-af09-9c453f258a5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-640fbec9-1ab9-4115-892a-3e91f24ed2ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.297 2 DEBUG nova.network.neutron [req-3060d37d-2736-4b34-987b-62c43e4dd06f req-1e4bfedc-1349-44f8-af09-9c453f258a5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Refreshing network info cache for port e808672d-dd35-463d-8c4c-c82ed7646741 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.299 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Start _get_guest_xml network_info=[{"id": "e808672d-dd35-463d-8c4c-c82ed7646741", "address": "fa:16:3e:43:14:c5", "network": {"id": "2471b6f7-ee51-4239-8b52-7016ab4d9fd1", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1867797555-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5eceae619a6f4fdeaa8ba6fafda4912a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape808672d-dd", "ovs_interfaceid": "e808672d-dd35-463d-8c4c-c82ed7646741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.304 2 WARNING nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.309 2 DEBUG nova.virt.libvirt.host [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.310 2 DEBUG nova.virt.libvirt.host [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.313 2 DEBUG nova.virt.libvirt.host [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.314 2 DEBUG nova.virt.libvirt.host [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.315 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.315 2 DEBUG nova.virt.hardware [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.316 2 DEBUG nova.virt.hardware [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.316 2 DEBUG nova.virt.hardware [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.316 2 DEBUG nova.virt.hardware [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.316 2 DEBUG nova.virt.hardware [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.317 2 DEBUG nova.virt.hardware [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.317 2 DEBUG nova.virt.hardware [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.317 2 DEBUG nova.virt.hardware [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.317 2 DEBUG nova.virt.hardware [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.318 2 DEBUG nova.virt.hardware [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.318 2 DEBUG nova.virt.hardware [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.320 2 DEBUG oslo_concurrency.processutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:03:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:03:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:03:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:03:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4130636126' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.843 2 DEBUG oslo_concurrency.processutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.877 2 DEBUG nova.storage.rbd_utils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] rbd image 640fbec9-1ab9-4115-892a-3e91f24ed2ae_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:03:01 np0005465988 nova_compute[236126]: 2025-10-02 13:03:01.882 2 DEBUG oslo_concurrency.processutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:03:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:02.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:02 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #58. Immutable memtables: 14.
Oct  2 09:03:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:03:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1682172624' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.382 2 DEBUG oslo_concurrency.processutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.384 2 DEBUG nova.virt.libvirt.vif [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:02:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1921024035',display_name='tempest-AttachVolumeNegativeTest-server-1921024035',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1921024035',id=201,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDDrMvZ77F/UIZfU+v9K7atXR5NjjRA7wn/L4bHndWJAEEnJTo/JMZnZyeU+hLDfIrLuljZuLJ61gnvWEBfMNMfiDcyAb4KC3UGvs/4WwzQe2L+IRgQtFWqJOITPqlAajA==',key_name='tempest-keypair-89702726',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5eceae619a6f4fdeaa8ba6fafda4912a',ramdisk_id='',reservation_id='r-ixaq07al',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1407980822',owner_user_name='tempest-AttachVolumeNegativeTest-1407980822-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:02:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93facc00c95f4cbfa6cecaf3641182bc',uuid=640fbec9-1ab9-4115-892a-3e91f24ed2ae,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e808672d-dd35-463d-8c4c-c82ed7646741", "address": "fa:16:3e:43:14:c5", "network": {"id": "2471b6f7-ee51-4239-8b52-7016ab4d9fd1", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1867797555-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5eceae619a6f4fdeaa8ba6fafda4912a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape808672d-dd", "ovs_interfaceid": "e808672d-dd35-463d-8c4c-c82ed7646741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.385 2 DEBUG nova.network.os_vif_util [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Converting VIF {"id": "e808672d-dd35-463d-8c4c-c82ed7646741", "address": "fa:16:3e:43:14:c5", "network": {"id": "2471b6f7-ee51-4239-8b52-7016ab4d9fd1", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1867797555-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5eceae619a6f4fdeaa8ba6fafda4912a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape808672d-dd", "ovs_interfaceid": "e808672d-dd35-463d-8c4c-c82ed7646741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.386 2 DEBUG nova.network.os_vif_util [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:14:c5,bridge_name='br-int',has_traffic_filtering=True,id=e808672d-dd35-463d-8c4c-c82ed7646741,network=Network(2471b6f7-ee51-4239-8b52-7016ab4d9fd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape808672d-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.388 2 DEBUG nova.objects.instance [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lazy-loading 'pci_devices' on Instance uuid 640fbec9-1ab9-4115-892a-3e91f24ed2ae obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.447 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] End _get_guest_xml xml=<domain type="kvm">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  <uuid>640fbec9-1ab9-4115-892a-3e91f24ed2ae</uuid>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  <name>instance-000000c9</name>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <nova:name>tempest-AttachVolumeNegativeTest-server-1921024035</nova:name>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 13:03:01</nova:creationTime>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <nova:user uuid="93facc00c95f4cbfa6cecaf3641182bc">tempest-AttachVolumeNegativeTest-1407980822-project-member</nova:user>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <nova:project uuid="5eceae619a6f4fdeaa8ba6fafda4912a">tempest-AttachVolumeNegativeTest-1407980822</nova:project>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <nova:port uuid="e808672d-dd35-463d-8c4c-c82ed7646741">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <system>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <entry name="serial">640fbec9-1ab9-4115-892a-3e91f24ed2ae</entry>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <entry name="uuid">640fbec9-1ab9-4115-892a-3e91f24ed2ae</entry>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    </system>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  <os>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  </os>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  <features>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  </features>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  </clock>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  <devices>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/640fbec9-1ab9-4115-892a-3e91f24ed2ae_disk">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/640fbec9-1ab9-4115-892a-3e91f24ed2ae_disk.config">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:43:14:c5"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <target dev="tape808672d-dd"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    </interface>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/640fbec9-1ab9-4115-892a-3e91f24ed2ae/console.log" append="off"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    </serial>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <video>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    </video>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    </rng>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 09:03:02 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 09:03:02 np0005465988 nova_compute[236126]:  </devices>
Oct  2 09:03:02 np0005465988 nova_compute[236126]: </domain>
Oct  2 09:03:02 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.449 2 DEBUG nova.compute.manager [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Preparing to wait for external event network-vif-plugged-e808672d-dd35-463d-8c4c-c82ed7646741 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.449 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Acquiring lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.449 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.450 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.450 2 DEBUG nova.virt.libvirt.vif [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:02:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1921024035',display_name='tempest-AttachVolumeNegativeTest-server-1921024035',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1921024035',id=201,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDDrMvZ77F/UIZfU+v9K7atXR5NjjRA7wn/L4bHndWJAEEnJTo/JMZnZyeU+hLDfIrLuljZuLJ61gnvWEBfMNMfiDcyAb4KC3UGvs/4WwzQe2L+IRgQtFWqJOITPqlAajA==',key_name='tempest-keypair-89702726',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5eceae619a6f4fdeaa8ba6fafda4912a',ramdisk_id='',reservation_id='r-ixaq07al',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1407980822',owner_user_name='tempest-AttachVolumeNegativeTest-1407980822-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:02:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93facc00c95f4cbfa6cecaf3641182bc',uuid=640fbec9-1ab9-4115-892a-3e91f24ed2ae,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e808672d-dd35-463d-8c4c-c82ed7646741", "address": "fa:16:3e:43:14:c5", "network": {"id": "2471b6f7-ee51-4239-8b52-7016ab4d9fd1", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1867797555-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5eceae619a6f4fdeaa8ba6fafda4912a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape808672d-dd", "ovs_interfaceid": "e808672d-dd35-463d-8c4c-c82ed7646741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.451 2 DEBUG nova.network.os_vif_util [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Converting VIF {"id": "e808672d-dd35-463d-8c4c-c82ed7646741", "address": "fa:16:3e:43:14:c5", "network": {"id": "2471b6f7-ee51-4239-8b52-7016ab4d9fd1", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1867797555-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5eceae619a6f4fdeaa8ba6fafda4912a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape808672d-dd", "ovs_interfaceid": "e808672d-dd35-463d-8c4c-c82ed7646741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.451 2 DEBUG nova.network.os_vif_util [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:14:c5,bridge_name='br-int',has_traffic_filtering=True,id=e808672d-dd35-463d-8c4c-c82ed7646741,network=Network(2471b6f7-ee51-4239-8b52-7016ab4d9fd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape808672d-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.452 2 DEBUG os_vif [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:14:c5,bridge_name='br-int',has_traffic_filtering=True,id=e808672d-dd35-463d-8c4c-c82ed7646741,network=Network(2471b6f7-ee51-4239-8b52-7016ab4d9fd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape808672d-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.453 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.453 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.457 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape808672d-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.458 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape808672d-dd, col_values=(('external_ids', {'iface-id': 'e808672d-dd35-463d-8c4c-c82ed7646741', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:43:14:c5', 'vm-uuid': '640fbec9-1ab9-4115-892a-3e91f24ed2ae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:02 np0005465988 NetworkManager[45041]: <info>  [1759410182.4609] manager: (tape808672d-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/398)
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.469 2 INFO os_vif [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:14:c5,bridge_name='br-int',has_traffic_filtering=True,id=e808672d-dd35-463d-8c4c-c82ed7646741,network=Network(2471b6f7-ee51-4239-8b52-7016ab4d9fd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape808672d-dd')#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.599 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.600 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.600 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] No VIF found with MAC fa:16:3e:43:14:c5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.601 2 INFO nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Using config drive#033[00m
Oct  2 09:03:02 np0005465988 nova_compute[236126]: 2025-10-02 13:03:02.634 2 DEBUG nova.storage.rbd_utils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] rbd image 640fbec9-1ab9-4115-892a-3e91f24ed2ae_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:03:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:03.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:04.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:05.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:05 np0005465988 nova_compute[236126]: 2025-10-02 13:03:05.402 2 INFO nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Creating config drive at /var/lib/nova/instances/640fbec9-1ab9-4115-892a-3e91f24ed2ae/disk.config#033[00m
Oct  2 09:03:05 np0005465988 nova_compute[236126]: 2025-10-02 13:03:05.409 2 DEBUG oslo_concurrency.processutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/640fbec9-1ab9-4115-892a-3e91f24ed2ae/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6ybq62v6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:03:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:03:05 np0005465988 nova_compute[236126]: 2025-10-02 13:03:05.566 2 DEBUG oslo_concurrency.processutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/640fbec9-1ab9-4115-892a-3e91f24ed2ae/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6ybq62v6" returned: 0 in 0.157s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:03:05 np0005465988 nova_compute[236126]: 2025-10-02 13:03:05.607 2 DEBUG nova.storage.rbd_utils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] rbd image 640fbec9-1ab9-4115-892a-3e91f24ed2ae_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:03:05 np0005465988 nova_compute[236126]: 2025-10-02 13:03:05.620 2 DEBUG oslo_concurrency.processutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/640fbec9-1ab9-4115-892a-3e91f24ed2ae/disk.config 640fbec9-1ab9-4115-892a-3e91f24ed2ae_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:03:05 np0005465988 nova_compute[236126]: 2025-10-02 13:03:05.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:05 np0005465988 nova_compute[236126]: 2025-10-02 13:03:05.826 2 DEBUG oslo_concurrency.processutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/640fbec9-1ab9-4115-892a-3e91f24ed2ae/disk.config 640fbec9-1ab9-4115-892a-3e91f24ed2ae_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:03:05 np0005465988 nova_compute[236126]: 2025-10-02 13:03:05.827 2 INFO nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Deleting local config drive /var/lib/nova/instances/640fbec9-1ab9-4115-892a-3e91f24ed2ae/disk.config because it was imported into RBD.#033[00m
Oct  2 09:03:05 np0005465988 kernel: tape808672d-dd: entered promiscuous mode
Oct  2 09:03:05 np0005465988 NetworkManager[45041]: <info>  [1759410185.8872] manager: (tape808672d-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/399)
Oct  2 09:03:05 np0005465988 ovn_controller[132601]: 2025-10-02T13:03:05Z|00903|binding|INFO|Claiming lport e808672d-dd35-463d-8c4c-c82ed7646741 for this chassis.
Oct  2 09:03:05 np0005465988 ovn_controller[132601]: 2025-10-02T13:03:05Z|00904|binding|INFO|e808672d-dd35-463d-8c4c-c82ed7646741: Claiming fa:16:3e:43:14:c5 10.100.0.5
Oct  2 09:03:05 np0005465988 nova_compute[236126]: 2025-10-02 13:03:05.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:05 np0005465988 ovn_controller[132601]: 2025-10-02T13:03:05Z|00905|binding|INFO|Setting lport e808672d-dd35-463d-8c4c-c82ed7646741 ovn-installed in OVS
Oct  2 09:03:05 np0005465988 nova_compute[236126]: 2025-10-02 13:03:05.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:05 np0005465988 nova_compute[236126]: 2025-10-02 13:03:05.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:05 np0005465988 systemd-udevd[330338]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:03:05 np0005465988 ovn_controller[132601]: 2025-10-02T13:03:05Z|00906|binding|INFO|Setting lport e808672d-dd35-463d-8c4c-c82ed7646741 up in Southbound
Oct  2 09:03:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:05.922 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:14:c5 10.100.0.5'], port_security=['fa:16:3e:43:14:c5 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '640fbec9-1ab9-4115-892a-3e91f24ed2ae', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2471b6f7-ee51-4239-8b52-7016ab4d9fd1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5eceae619a6f4fdeaa8ba6fafda4912a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fcb2f420-7d55-44da-a4fe-84e82e3282b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa923984-fb22-4ee5-9bd7-5034c98e7f0a, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=e808672d-dd35-463d-8c4c-c82ed7646741) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:03:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:05.924 142124 INFO neutron.agent.ovn.metadata.agent [-] Port e808672d-dd35-463d-8c4c-c82ed7646741 in datapath 2471b6f7-ee51-4239-8b52-7016ab4d9fd1 bound to our chassis#033[00m
Oct  2 09:03:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:05.925 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2471b6f7-ee51-4239-8b52-7016ab4d9fd1#033[00m
Oct  2 09:03:05 np0005465988 systemd-machined[192594]: New machine qemu-94-instance-000000c9.
Oct  2 09:03:05 np0005465988 NetworkManager[45041]: <info>  [1759410185.9403] device (tape808672d-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:03:05 np0005465988 NetworkManager[45041]: <info>  [1759410185.9416] device (tape808672d-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:03:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:05.942 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bdfe4bf4-c89a-4878-bd25-f2e6bb4ebf0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:05.943 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2471b6f7-e1 in ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 09:03:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:05.945 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2471b6f7-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 09:03:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:05.945 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e3ae050b-010b-4556-83e4-3897840708ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:05.946 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7740dea2-ec5a-43ea-a6d1-7d72e37795ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:05 np0005465988 systemd[1]: Started Virtual Machine qemu-94-instance-000000c9.
Oct  2 09:03:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:05.959 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[531ca8ae-9c6c-4ee5-b73f-8d8013b900ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:05 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:05.974 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[317e7e83-7ec5-4414-be4d-ef162abb53a7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.009 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[2cad944c-dbf9-47ce-bf3a-aba5426bd9ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.017 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[193b96a0-e73f-4de4-87f4-2a976fdae083]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:06 np0005465988 NetworkManager[45041]: <info>  [1759410186.0186] manager: (tap2471b6f7-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/400)
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.062 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[00aac6cf-96af-4f5b-8e4f-78dd38255353]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.067 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[de027e5b-810f-4be1-ab4d-af25e6855445]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:06 np0005465988 NetworkManager[45041]: <info>  [1759410186.0945] device (tap2471b6f7-e0): carrier: link connected
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.102 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5174a06e-0d0a-4fed-b397-1542c0c4d49b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.124 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6313d34a-5fa9-47a7-a11b-2bfdfe783241]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2471b6f7-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:da:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 264], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 824642, 'reachable_time': 30645, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 330372, 'error': None, 'target': 'ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.145 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d1f48f11-93fc-4bc7-8067-7848d1b00419]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:da65'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 824642, 'tstamp': 824642}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 330373, 'error': None, 'target': 'ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.168 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f8772354-cc1b-400b-a0f4-90e50f6f2482]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2471b6f7-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:da:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 264], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 824642, 'reachable_time': 30645, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 330374, 'error': None, 'target': 'ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.209 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5d35059d-148d-440a-b053-d0c1ec56b1f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:06.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.282 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ad1f2593-259c-41d0-b581-9bfa06f349e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.283 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2471b6f7-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.284 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.284 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2471b6f7-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:03:06 np0005465988 NetworkManager[45041]: <info>  [1759410186.2867] manager: (tap2471b6f7-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/401)
Oct  2 09:03:06 np0005465988 kernel: tap2471b6f7-e0: entered promiscuous mode
Oct  2 09:03:06 np0005465988 nova_compute[236126]: 2025-10-02 13:03:06.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.289 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2471b6f7-e0, col_values=(('external_ids', {'iface-id': 'c5388d11-12a4-491d-825a-d4dc574d0a0e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:03:06 np0005465988 nova_compute[236126]: 2025-10-02 13:03:06.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:06 np0005465988 ovn_controller[132601]: 2025-10-02T13:03:06Z|00907|binding|INFO|Releasing lport c5388d11-12a4-491d-825a-d4dc574d0a0e from this chassis (sb_readonly=0)
Oct  2 09:03:06 np0005465988 nova_compute[236126]: 2025-10-02 13:03:06.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.308 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2471b6f7-ee51-4239-8b52-7016ab4d9fd1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2471b6f7-ee51-4239-8b52-7016ab4d9fd1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.309 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[335f4bd0-5b03-4ca3-adf2-68a16f08e35b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.309 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-2471b6f7-ee51-4239-8b52-7016ab4d9fd1
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/2471b6f7-ee51-4239-8b52-7016ab4d9fd1.pid.haproxy
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 2471b6f7-ee51-4239-8b52-7016ab4d9fd1
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 09:03:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:06.310 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1', 'env', 'PROCESS_TAG=haproxy-2471b6f7-ee51-4239-8b52-7016ab4d9fd1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2471b6f7-ee51-4239-8b52-7016ab4d9fd1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 09:03:06 np0005465988 podman[330448]: 2025-10-02 13:03:06.710353677 +0000 UTC m=+0.059721942 container create b1102309e681f4e295ddb52839129ecb29b033e49b8fea7564afd6d525ed2e09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:03:06 np0005465988 systemd[1]: Started libpod-conmon-b1102309e681f4e295ddb52839129ecb29b033e49b8fea7564afd6d525ed2e09.scope.
Oct  2 09:03:06 np0005465988 podman[330448]: 2025-10-02 13:03:06.681457834 +0000 UTC m=+0.030826119 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 09:03:06 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:03:06 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1cd39028dbc4678469d905e5d3721ba77fd1f93bf883528751476cf060773949/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 09:03:06 np0005465988 podman[330448]: 2025-10-02 13:03:06.82019293 +0000 UTC m=+0.169561215 container init b1102309e681f4e295ddb52839129ecb29b033e49b8fea7564afd6d525ed2e09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:03:06 np0005465988 podman[330448]: 2025-10-02 13:03:06.826409039 +0000 UTC m=+0.175777304 container start b1102309e681f4e295ddb52839129ecb29b033e49b8fea7564afd6d525ed2e09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:03:06 np0005465988 neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1[330464]: [NOTICE]   (330468) : New worker (330470) forked
Oct  2 09:03:06 np0005465988 neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1[330464]: [NOTICE]   (330468) : Loading success.
Oct  2 09:03:06 np0005465988 nova_compute[236126]: 2025-10-02 13:03:06.894 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410186.8918781, 640fbec9-1ab9-4115-892a-3e91f24ed2ae => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:03:06 np0005465988 nova_compute[236126]: 2025-10-02 13:03:06.896 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] VM Started (Lifecycle Event)#033[00m
Oct  2 09:03:06 np0005465988 nova_compute[236126]: 2025-10-02 13:03:06.958 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:03:06 np0005465988 nova_compute[236126]: 2025-10-02 13:03:06.964 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410186.892356, 640fbec9-1ab9-4115-892a-3e91f24ed2ae => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:03:06 np0005465988 nova_compute[236126]: 2025-10-02 13:03:06.965 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] VM Paused (Lifecycle Event)#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.016 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.022 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.070 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:03:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:07.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.634 2 DEBUG nova.network.neutron [req-3060d37d-2736-4b34-987b-62c43e4dd06f req-1e4bfedc-1349-44f8-af09-9c453f258a5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Updated VIF entry in instance network info cache for port e808672d-dd35-463d-8c4c-c82ed7646741. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.635 2 DEBUG nova.network.neutron [req-3060d37d-2736-4b34-987b-62c43e4dd06f req-1e4bfedc-1349-44f8-af09-9c453f258a5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Updating instance_info_cache with network_info: [{"id": "e808672d-dd35-463d-8c4c-c82ed7646741", "address": "fa:16:3e:43:14:c5", "network": {"id": "2471b6f7-ee51-4239-8b52-7016ab4d9fd1", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1867797555-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5eceae619a6f4fdeaa8ba6fafda4912a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape808672d-dd", "ovs_interfaceid": "e808672d-dd35-463d-8c4c-c82ed7646741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.788 2 DEBUG nova.compute.manager [req-8da52803-9f21-49e4-92af-f6b3b234ef5d req-106773cc-5a22-4f66-a6d5-65d88b016c5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Received event network-vif-plugged-e808672d-dd35-463d-8c4c-c82ed7646741 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.789 2 DEBUG oslo_concurrency.lockutils [req-8da52803-9f21-49e4-92af-f6b3b234ef5d req-106773cc-5a22-4f66-a6d5-65d88b016c5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.789 2 DEBUG oslo_concurrency.lockutils [req-8da52803-9f21-49e4-92af-f6b3b234ef5d req-106773cc-5a22-4f66-a6d5-65d88b016c5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.790 2 DEBUG oslo_concurrency.lockutils [req-8da52803-9f21-49e4-92af-f6b3b234ef5d req-106773cc-5a22-4f66-a6d5-65d88b016c5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.790 2 DEBUG nova.compute.manager [req-8da52803-9f21-49e4-92af-f6b3b234ef5d req-106773cc-5a22-4f66-a6d5-65d88b016c5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Processing event network-vif-plugged-e808672d-dd35-463d-8c4c-c82ed7646741 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.792 2 DEBUG nova.compute.manager [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.792 2 DEBUG oslo_concurrency.lockutils [req-3060d37d-2736-4b34-987b-62c43e4dd06f req-1e4bfedc-1349-44f8-af09-9c453f258a5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-640fbec9-1ab9-4115-892a-3e91f24ed2ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.797 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410187.7969458, 640fbec9-1ab9-4115-892a-3e91f24ed2ae => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.798 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] VM Resumed (Lifecycle Event)#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.799 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.805 2 INFO nova.virt.libvirt.driver [-] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Instance spawned successfully.#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.806 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.839 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.851 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.852 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.853 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.854 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.855 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.856 2 DEBUG nova.virt.libvirt.driver [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.864 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.933 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.969 2 INFO nova.compute.manager [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Took 13.40 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 09:03:07 np0005465988 nova_compute[236126]: 2025-10-02 13:03:07.970 2 DEBUG nova.compute.manager [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:03:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:08.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:08 np0005465988 nova_compute[236126]: 2025-10-02 13:03:08.378 2 INFO nova.compute.manager [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Took 14.96 seconds to build instance.#033[00m
Oct  2 09:03:08 np0005465988 nova_compute[236126]: 2025-10-02 13:03:08.426 2 DEBUG oslo_concurrency.lockutils [None req-ebad6097-3195-467b-8e17-9edecb1d3db5 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:08.720 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:03:08 np0005465988 nova_compute[236126]: 2025-10-02 13:03:08.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:08.722 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:03:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:09.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:10 np0005465988 nova_compute[236126]: 2025-10-02 13:03:10.047 2 DEBUG nova.compute.manager [req-f0321104-0d63-44a8-b2f7-b75c8c5bb2bd req-66f7ddcd-fe76-48f8-895c-21c79e03c73a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Received event network-vif-plugged-e808672d-dd35-463d-8c4c-c82ed7646741 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:10 np0005465988 nova_compute[236126]: 2025-10-02 13:03:10.051 2 DEBUG oslo_concurrency.lockutils [req-f0321104-0d63-44a8-b2f7-b75c8c5bb2bd req-66f7ddcd-fe76-48f8-895c-21c79e03c73a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:10 np0005465988 nova_compute[236126]: 2025-10-02 13:03:10.053 2 DEBUG oslo_concurrency.lockutils [req-f0321104-0d63-44a8-b2f7-b75c8c5bb2bd req-66f7ddcd-fe76-48f8-895c-21c79e03c73a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:10 np0005465988 nova_compute[236126]: 2025-10-02 13:03:10.053 2 DEBUG oslo_concurrency.lockutils [req-f0321104-0d63-44a8-b2f7-b75c8c5bb2bd req-66f7ddcd-fe76-48f8-895c-21c79e03c73a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:10 np0005465988 nova_compute[236126]: 2025-10-02 13:03:10.054 2 DEBUG nova.compute.manager [req-f0321104-0d63-44a8-b2f7-b75c8c5bb2bd req-66f7ddcd-fe76-48f8-895c-21c79e03c73a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] No waiting events found dispatching network-vif-plugged-e808672d-dd35-463d-8c4c-c82ed7646741 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:03:10 np0005465988 nova_compute[236126]: 2025-10-02 13:03:10.055 2 WARNING nova.compute.manager [req-f0321104-0d63-44a8-b2f7-b75c8c5bb2bd req-66f7ddcd-fe76-48f8-895c-21c79e03c73a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Received unexpected event network-vif-plugged-e808672d-dd35-463d-8c4c-c82ed7646741 for instance with vm_state active and task_state None.#033[00m
Oct  2 09:03:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:10.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:03:10 np0005465988 podman[330482]: 2025-10-02 13:03:10.567570359 +0000 UTC m=+0.088118729 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:03:10 np0005465988 nova_compute[236126]: 2025-10-02 13:03:10.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:10 np0005465988 podman[330481]: 2025-10-02 13:03:10.656909372 +0000 UTC m=+0.187389968 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 09:03:10 np0005465988 podman[330483]: 2025-10-02 13:03:10.67731185 +0000 UTC m=+0.201833754 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct  2 09:03:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:11.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:11.724 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:03:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:12.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:12 np0005465988 nova_compute[236126]: 2025-10-02 13:03:12.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:03:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:13.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:03:13 np0005465988 nova_compute[236126]: 2025-10-02 13:03:13.762 2 DEBUG nova.compute.manager [req-efe31a6b-f0fe-4eb1-99c3-0018a37f99f1 req-c96a255e-5eb7-4765-8caa-58f1561e7bc9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Received event network-changed-e808672d-dd35-463d-8c4c-c82ed7646741 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:13 np0005465988 nova_compute[236126]: 2025-10-02 13:03:13.764 2 DEBUG nova.compute.manager [req-efe31a6b-f0fe-4eb1-99c3-0018a37f99f1 req-c96a255e-5eb7-4765-8caa-58f1561e7bc9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Refreshing instance network info cache due to event network-changed-e808672d-dd35-463d-8c4c-c82ed7646741. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:03:13 np0005465988 nova_compute[236126]: 2025-10-02 13:03:13.765 2 DEBUG oslo_concurrency.lockutils [req-efe31a6b-f0fe-4eb1-99c3-0018a37f99f1 req-c96a255e-5eb7-4765-8caa-58f1561e7bc9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-640fbec9-1ab9-4115-892a-3e91f24ed2ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:03:13 np0005465988 nova_compute[236126]: 2025-10-02 13:03:13.765 2 DEBUG oslo_concurrency.lockutils [req-efe31a6b-f0fe-4eb1-99c3-0018a37f99f1 req-c96a255e-5eb7-4765-8caa-58f1561e7bc9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-640fbec9-1ab9-4115-892a-3e91f24ed2ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:03:13 np0005465988 nova_compute[236126]: 2025-10-02 13:03:13.765 2 DEBUG nova.network.neutron [req-efe31a6b-f0fe-4eb1-99c3-0018a37f99f1 req-c96a255e-5eb7-4765-8caa-58f1561e7bc9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Refreshing network info cache for port e808672d-dd35-463d-8c4c-c82ed7646741 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:03:14 np0005465988 nova_compute[236126]: 2025-10-02 13:03:14.009 2 INFO nova.compute.manager [None req-57253aeb-a25a-4f64-a96a-3ae8b2ace16a fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Get console output#033[00m
Oct  2 09:03:14 np0005465988 nova_compute[236126]: 2025-10-02 13:03:14.023 15591 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 09:03:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:14.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:15.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:03:15 np0005465988 nova_compute[236126]: 2025-10-02 13:03:15.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:16 np0005465988 nova_compute[236126]: 2025-10-02 13:03:16.235 2 DEBUG nova.compute.manager [req-9967b019-c493-4ef1-b390-ba2a95b79c6f req-f8120753-502f-469b-b72e-62b5c39173fb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-changed-c3e55349-c91d-43c1-be7d-394f7b35ee2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:16 np0005465988 nova_compute[236126]: 2025-10-02 13:03:16.236 2 DEBUG nova.compute.manager [req-9967b019-c493-4ef1-b390-ba2a95b79c6f req-f8120753-502f-469b-b72e-62b5c39173fb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Refreshing instance network info cache due to event network-changed-c3e55349-c91d-43c1-be7d-394f7b35ee2e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:03:16 np0005465988 nova_compute[236126]: 2025-10-02 13:03:16.236 2 DEBUG oslo_concurrency.lockutils [req-9967b019-c493-4ef1-b390-ba2a95b79c6f req-f8120753-502f-469b-b72e-62b5c39173fb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:03:16 np0005465988 nova_compute[236126]: 2025-10-02 13:03:16.237 2 DEBUG oslo_concurrency.lockutils [req-9967b019-c493-4ef1-b390-ba2a95b79c6f req-f8120753-502f-469b-b72e-62b5c39173fb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:03:16 np0005465988 nova_compute[236126]: 2025-10-02 13:03:16.237 2 DEBUG nova.network.neutron [req-9967b019-c493-4ef1-b390-ba2a95b79c6f req-f8120753-502f-469b-b72e-62b5c39173fb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Refreshing network info cache for port c3e55349-c91d-43c1-be7d-394f7b35ee2e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:03:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:16.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:16 np0005465988 nova_compute[236126]: 2025-10-02 13:03:16.346 2 DEBUG nova.network.neutron [req-efe31a6b-f0fe-4eb1-99c3-0018a37f99f1 req-c96a255e-5eb7-4765-8caa-58f1561e7bc9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Updated VIF entry in instance network info cache for port e808672d-dd35-463d-8c4c-c82ed7646741. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:03:16 np0005465988 nova_compute[236126]: 2025-10-02 13:03:16.347 2 DEBUG nova.network.neutron [req-efe31a6b-f0fe-4eb1-99c3-0018a37f99f1 req-c96a255e-5eb7-4765-8caa-58f1561e7bc9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Updating instance_info_cache with network_info: [{"id": "e808672d-dd35-463d-8c4c-c82ed7646741", "address": "fa:16:3e:43:14:c5", "network": {"id": "2471b6f7-ee51-4239-8b52-7016ab4d9fd1", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1867797555-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5eceae619a6f4fdeaa8ba6fafda4912a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape808672d-dd", "ovs_interfaceid": "e808672d-dd35-463d-8c4c-c82ed7646741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:03:16 np0005465988 nova_compute[236126]: 2025-10-02 13:03:16.410 2 DEBUG oslo_concurrency.lockutils [req-efe31a6b-f0fe-4eb1-99c3-0018a37f99f1 req-c96a255e-5eb7-4765-8caa-58f1561e7bc9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-640fbec9-1ab9-4115-892a-3e91f24ed2ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:03:16 np0005465988 nova_compute[236126]: 2025-10-02 13:03:16.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:17.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:17 np0005465988 nova_compute[236126]: 2025-10-02 13:03:17.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:17 np0005465988 nova_compute[236126]: 2025-10-02 13:03:17.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:17 np0005465988 nova_compute[236126]: 2025-10-02 13:03:17.736 2 INFO nova.compute.manager [None req-6c53d06f-cdb1-45ff-aaef-a7db1d6fe742 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Get console output#033[00m
Oct  2 09:03:17 np0005465988 nova_compute[236126]: 2025-10-02 13:03:17.742 15591 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 09:03:17 np0005465988 nova_compute[236126]: 2025-10-02 13:03:17.764 2 DEBUG nova.network.neutron [req-9967b019-c493-4ef1-b390-ba2a95b79c6f req-f8120753-502f-469b-b72e-62b5c39173fb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Updated VIF entry in instance network info cache for port c3e55349-c91d-43c1-be7d-394f7b35ee2e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:03:17 np0005465988 nova_compute[236126]: 2025-10-02 13:03:17.765 2 DEBUG nova.network.neutron [req-9967b019-c493-4ef1-b390-ba2a95b79c6f req-f8120753-502f-469b-b72e-62b5c39173fb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Updating instance_info_cache with network_info: [{"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:03:17 np0005465988 nova_compute[236126]: 2025-10-02 13:03:17.877 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:17 np0005465988 nova_compute[236126]: 2025-10-02 13:03:17.878 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:17 np0005465988 nova_compute[236126]: 2025-10-02 13:03:17.879 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:17 np0005465988 nova_compute[236126]: 2025-10-02 13:03:17.879 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:03:17 np0005465988 nova_compute[236126]: 2025-10-02 13:03:17.880 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:03:17 np0005465988 nova_compute[236126]: 2025-10-02 13:03:17.921 2 DEBUG oslo_concurrency.lockutils [req-9967b019-c493-4ef1-b390-ba2a95b79c6f req-f8120753-502f-469b-b72e-62b5c39173fb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:03:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:18.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:03:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1156892007' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.351 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.408 2 DEBUG nova.compute.manager [req-52f676a6-ab21-4f66-9856-75365fbee161 req-1490aa58-9420-4d08-a763-f5e1b841b4be d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-vif-unplugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.409 2 DEBUG oslo_concurrency.lockutils [req-52f676a6-ab21-4f66-9856-75365fbee161 req-1490aa58-9420-4d08-a763-f5e1b841b4be d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.410 2 DEBUG oslo_concurrency.lockutils [req-52f676a6-ab21-4f66-9856-75365fbee161 req-1490aa58-9420-4d08-a763-f5e1b841b4be d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.410 2 DEBUG oslo_concurrency.lockutils [req-52f676a6-ab21-4f66-9856-75365fbee161 req-1490aa58-9420-4d08-a763-f5e1b841b4be d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.411 2 DEBUG nova.compute.manager [req-52f676a6-ab21-4f66-9856-75365fbee161 req-1490aa58-9420-4d08-a763-f5e1b841b4be d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] No waiting events found dispatching network-vif-unplugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.411 2 WARNING nova.compute.manager [req-52f676a6-ab21-4f66-9856-75365fbee161 req-1490aa58-9420-4d08-a763-f5e1b841b4be d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received unexpected event network-vif-unplugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e for instance with vm_state active and task_state None.#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.412 2 DEBUG nova.compute.manager [req-52f676a6-ab21-4f66-9856-75365fbee161 req-1490aa58-9420-4d08-a763-f5e1b841b4be d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.412 2 DEBUG oslo_concurrency.lockutils [req-52f676a6-ab21-4f66-9856-75365fbee161 req-1490aa58-9420-4d08-a763-f5e1b841b4be d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.412 2 DEBUG oslo_concurrency.lockutils [req-52f676a6-ab21-4f66-9856-75365fbee161 req-1490aa58-9420-4d08-a763-f5e1b841b4be d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.413 2 DEBUG oslo_concurrency.lockutils [req-52f676a6-ab21-4f66-9856-75365fbee161 req-1490aa58-9420-4d08-a763-f5e1b841b4be d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.413 2 DEBUG nova.compute.manager [req-52f676a6-ab21-4f66-9856-75365fbee161 req-1490aa58-9420-4d08-a763-f5e1b841b4be d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] No waiting events found dispatching network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.414 2 WARNING nova.compute.manager [req-52f676a6-ab21-4f66-9856-75365fbee161 req-1490aa58-9420-4d08-a763-f5e1b841b4be d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received unexpected event network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e for instance with vm_state active and task_state None.#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.489 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000c9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.489 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000c9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.492 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.493 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.708 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.710 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3685MB free_disk=20.8306884765625GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.710 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.710 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.955 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance f7e4398e-72d2-4983-9680-d518c4ca2b0e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.956 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 640fbec9-1ab9-4115-892a-3e91f24ed2ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.957 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:03:18 np0005465988 nova_compute[236126]: 2025-10-02 13:03:18.957 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:03:19 np0005465988 nova_compute[236126]: 2025-10-02 13:03:19.072 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:03:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:19.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:03:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2633219503' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:03:19 np0005465988 nova_compute[236126]: 2025-10-02 13:03:19.601 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.529s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:03:19 np0005465988 nova_compute[236126]: 2025-10-02 13:03:19.608 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:03:19 np0005465988 nova_compute[236126]: 2025-10-02 13:03:19.628 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:03:19 np0005465988 nova_compute[236126]: 2025-10-02 13:03:19.661 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:03:19 np0005465988 nova_compute[236126]: 2025-10-02 13:03:19.661 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.951s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:03:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:20.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:03:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:03:20 np0005465988 nova_compute[236126]: 2025-10-02 13:03:20.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:20 np0005465988 nova_compute[236126]: 2025-10-02 13:03:20.650 2 DEBUG nova.compute.manager [req-ff349335-60a8-4af0-9d04-48227e7d0711 req-8d0edea8-5d20-4773-8dcf-acc2068ef022 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-changed-c3e55349-c91d-43c1-be7d-394f7b35ee2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:20 np0005465988 nova_compute[236126]: 2025-10-02 13:03:20.651 2 DEBUG nova.compute.manager [req-ff349335-60a8-4af0-9d04-48227e7d0711 req-8d0edea8-5d20-4773-8dcf-acc2068ef022 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Refreshing instance network info cache due to event network-changed-c3e55349-c91d-43c1-be7d-394f7b35ee2e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:03:20 np0005465988 nova_compute[236126]: 2025-10-02 13:03:20.652 2 DEBUG oslo_concurrency.lockutils [req-ff349335-60a8-4af0-9d04-48227e7d0711 req-8d0edea8-5d20-4773-8dcf-acc2068ef022 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:03:20 np0005465988 nova_compute[236126]: 2025-10-02 13:03:20.652 2 DEBUG oslo_concurrency.lockutils [req-ff349335-60a8-4af0-9d04-48227e7d0711 req-8d0edea8-5d20-4773-8dcf-acc2068ef022 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:03:20 np0005465988 nova_compute[236126]: 2025-10-02 13:03:20.653 2 DEBUG nova.network.neutron [req-ff349335-60a8-4af0-9d04-48227e7d0711 req-8d0edea8-5d20-4773-8dcf-acc2068ef022 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Refreshing network info cache for port c3e55349-c91d-43c1-be7d-394f7b35ee2e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:03:20 np0005465988 nova_compute[236126]: 2025-10-02 13:03:20.919 2 INFO nova.compute.manager [None req-914939d4-b3cc-4681-b671-edc8e06d8a97 fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Get console output#033[00m
Oct  2 09:03:20 np0005465988 nova_compute[236126]: 2025-10-02 13:03:20.927 15591 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 09:03:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:21.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:22.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:22 np0005465988 nova_compute[236126]: 2025-10-02 13:03:22.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:23.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:23 np0005465988 nova_compute[236126]: 2025-10-02 13:03:23.305 2 DEBUG nova.compute.manager [req-a81e2324-4b8f-42bc-b47f-8dda865644ff req-81728e50-2326-4cf1-8e10-eb01904db6ac d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:23 np0005465988 nova_compute[236126]: 2025-10-02 13:03:23.305 2 DEBUG oslo_concurrency.lockutils [req-a81e2324-4b8f-42bc-b47f-8dda865644ff req-81728e50-2326-4cf1-8e10-eb01904db6ac d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:23 np0005465988 nova_compute[236126]: 2025-10-02 13:03:23.306 2 DEBUG oslo_concurrency.lockutils [req-a81e2324-4b8f-42bc-b47f-8dda865644ff req-81728e50-2326-4cf1-8e10-eb01904db6ac d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:23 np0005465988 nova_compute[236126]: 2025-10-02 13:03:23.306 2 DEBUG oslo_concurrency.lockutils [req-a81e2324-4b8f-42bc-b47f-8dda865644ff req-81728e50-2326-4cf1-8e10-eb01904db6ac d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:23 np0005465988 nova_compute[236126]: 2025-10-02 13:03:23.306 2 DEBUG nova.compute.manager [req-a81e2324-4b8f-42bc-b47f-8dda865644ff req-81728e50-2326-4cf1-8e10-eb01904db6ac d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] No waiting events found dispatching network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:03:23 np0005465988 nova_compute[236126]: 2025-10-02 13:03:23.306 2 WARNING nova.compute.manager [req-a81e2324-4b8f-42bc-b47f-8dda865644ff req-81728e50-2326-4cf1-8e10-eb01904db6ac d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received unexpected event network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e for instance with vm_state active and task_state None.#033[00m
Oct  2 09:03:23 np0005465988 nova_compute[236126]: 2025-10-02 13:03:23.307 2 DEBUG nova.compute.manager [req-a81e2324-4b8f-42bc-b47f-8dda865644ff req-81728e50-2326-4cf1-8e10-eb01904db6ac d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:23 np0005465988 nova_compute[236126]: 2025-10-02 13:03:23.307 2 DEBUG oslo_concurrency.lockutils [req-a81e2324-4b8f-42bc-b47f-8dda865644ff req-81728e50-2326-4cf1-8e10-eb01904db6ac d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:23 np0005465988 nova_compute[236126]: 2025-10-02 13:03:23.307 2 DEBUG oslo_concurrency.lockutils [req-a81e2324-4b8f-42bc-b47f-8dda865644ff req-81728e50-2326-4cf1-8e10-eb01904db6ac d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:23 np0005465988 nova_compute[236126]: 2025-10-02 13:03:23.307 2 DEBUG oslo_concurrency.lockutils [req-a81e2324-4b8f-42bc-b47f-8dda865644ff req-81728e50-2326-4cf1-8e10-eb01904db6ac d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:23 np0005465988 nova_compute[236126]: 2025-10-02 13:03:23.307 2 DEBUG nova.compute.manager [req-a81e2324-4b8f-42bc-b47f-8dda865644ff req-81728e50-2326-4cf1-8e10-eb01904db6ac d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] No waiting events found dispatching network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:03:23 np0005465988 nova_compute[236126]: 2025-10-02 13:03:23.308 2 WARNING nova.compute.manager [req-a81e2324-4b8f-42bc-b47f-8dda865644ff req-81728e50-2326-4cf1-8e10-eb01904db6ac d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received unexpected event network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e for instance with vm_state active and task_state None.#033[00m
Oct  2 09:03:24 np0005465988 ovn_controller[132601]: 2025-10-02T13:03:24Z|00117|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:43:14:c5 10.100.0.5
Oct  2 09:03:24 np0005465988 ovn_controller[132601]: 2025-10-02T13:03:24Z|00118|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:43:14:c5 10.100.0.5
Oct  2 09:03:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:24.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:25.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:03:25 np0005465988 nova_compute[236126]: 2025-10-02 13:03:25.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:26.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:26 np0005465988 nova_compute[236126]: 2025-10-02 13:03:26.427 2 DEBUG nova.network.neutron [req-ff349335-60a8-4af0-9d04-48227e7d0711 req-8d0edea8-5d20-4773-8dcf-acc2068ef022 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Updated VIF entry in instance network info cache for port c3e55349-c91d-43c1-be7d-394f7b35ee2e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:03:26 np0005465988 nova_compute[236126]: 2025-10-02 13:03:26.427 2 DEBUG nova.network.neutron [req-ff349335-60a8-4af0-9d04-48227e7d0711 req-8d0edea8-5d20-4773-8dcf-acc2068ef022 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Updating instance_info_cache with network_info: [{"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:03:26 np0005465988 nova_compute[236126]: 2025-10-02 13:03:26.450 2 DEBUG oslo_concurrency.lockutils [req-ff349335-60a8-4af0-9d04-48227e7d0711 req-8d0edea8-5d20-4773-8dcf-acc2068ef022 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:03:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:27.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:27.409 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:27.410 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:27.411 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:27 np0005465988 nova_compute[236126]: 2025-10-02 13:03:27.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:28.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.0 total, 600.0 interval#012Cumulative writes: 15K writes, 75K keys, 15K commit groups, 1.0 writes per commit group, ingest: 0.15 GB, 0.03 MB/s#012Cumulative WAL: 15K writes, 15K syncs, 1.00 writes per sync, written: 0.15 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1581 writes, 7479 keys, 1581 commit groups, 1.0 writes per commit group, ingest: 15.71 MB, 0.03 MB/s#012Interval WAL: 1581 writes, 1581 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     56.0      1.63              0.32        47    0.035       0      0       0.0       0.0#012  L6      1/0   11.51 MB   0.0      0.5     0.1      0.4       0.5      0.0       0.0   5.1    110.0     93.8      4.92              1.65        46    0.107    328K    24K       0.0       0.0#012 Sum      1/0   11.51 MB   0.0      0.5     0.1      0.4       0.5      0.1       0.0   6.1     82.6     84.4      6.55              1.97        93    0.070    328K    24K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.4     83.6     85.4      0.80              0.22        10    0.080     48K   2609       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) 
Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.5     0.1      0.4       0.5      0.0       0.0   0.0    110.0     93.8      4.92              1.65        46    0.107    328K    24K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     56.0      1.63              0.32        46    0.035       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 5400.0 total, 600.0 interval#012Flush(GB): cumulative 0.089, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.54 GB write, 0.10 MB/s write, 0.53 GB read, 0.10 MB/s read, 6.6 seconds#012Interval compaction: 0.07 GB write, 0.11 MB/s write, 0.06 GB read, 0.11 MB/s read, 0.8 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b3e97ad1f0#2 capacity: 304.00 MB usage: 59.69 MB table_size: 0 occupancy: 18446744073709551615 collections: 10 last_copies: 0 last_secs: 0.000798 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(3428,57.27 MB,18.8396%) FilterBlock(93,917.23 KB,0.29465%) IndexBlock(93,1.52 MB,0.500142%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct  2 09:03:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:29.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:29 np0005465988 podman[330651]: 2025-10-02 13:03:29.52960104 +0000 UTC m=+0.060946746 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 09:03:29 np0005465988 nova_compute[236126]: 2025-10-02 13:03:29.662 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #151. Immutable memtables: 0.
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.673899) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 95] Flushing memtable with next log file: 151
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410209673944, "job": 95, "event": "flush_started", "num_memtables": 1, "num_entries": 1404, "num_deletes": 256, "total_data_size": 3121154, "memory_usage": 3165344, "flush_reason": "Manual Compaction"}
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 95] Level-0 flush table #152: started
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410209686559, "cf_name": "default", "job": 95, "event": "table_file_creation", "file_number": 152, "file_size": 2036658, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 74278, "largest_seqno": 75677, "table_properties": {"data_size": 2030704, "index_size": 3220, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12767, "raw_average_key_size": 19, "raw_value_size": 2018706, "raw_average_value_size": 3115, "num_data_blocks": 142, "num_entries": 648, "num_filter_entries": 648, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759410096, "oldest_key_time": 1759410096, "file_creation_time": 1759410209, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 152, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 95] Flush lasted 12733 microseconds, and 5211 cpu microseconds.
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.686626) [db/flush_job.cc:967] [default] [JOB 95] Level-0 flush table #152: 2036658 bytes OK
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.686654) [db/memtable_list.cc:519] [default] Level-0 commit table #152 started
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.688318) [db/memtable_list.cc:722] [default] Level-0 commit table #152: memtable #1 done
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.688335) EVENT_LOG_v1 {"time_micros": 1759410209688329, "job": 95, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.688355) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 95] Try to delete WAL files size 3114610, prev total WAL file size 3114610, number of live WAL files 2.
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000148.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.689461) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032373539' seq:72057594037927935, type:22 .. '6C6F676D0033303131' seq:0, type:0; will stop at (end)
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 96] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 95 Base level 0, inputs: [152(1988KB)], [150(11MB)]
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410209689507, "job": 96, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [152], "files_L6": [150], "score": -1, "input_data_size": 14106169, "oldest_snapshot_seqno": -1}
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 96] Generated table #153: 9678 keys, 13960538 bytes, temperature: kUnknown
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410209790588, "cf_name": "default", "job": 96, "event": "table_file_creation", "file_number": 153, "file_size": 13960538, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13896000, "index_size": 39209, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24261, "raw_key_size": 255012, "raw_average_key_size": 26, "raw_value_size": 13724331, "raw_average_value_size": 1418, "num_data_blocks": 1504, "num_entries": 9678, "num_filter_entries": 9678, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759410209, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 153, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.790876) [db/compaction/compaction_job.cc:1663] [default] [JOB 96] Compacted 1@0 + 1@6 files to L6 => 13960538 bytes
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.792826) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 139.5 rd, 138.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 11.5 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(13.8) write-amplify(6.9) OK, records in: 10205, records dropped: 527 output_compression: NoCompression
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.792867) EVENT_LOG_v1 {"time_micros": 1759410209792853, "job": 96, "event": "compaction_finished", "compaction_time_micros": 101150, "compaction_time_cpu_micros": 43328, "output_level": 6, "num_output_files": 1, "total_output_size": 13960538, "num_input_records": 10205, "num_output_records": 9678, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000152.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410209793411, "job": 96, "event": "table_file_deletion", "file_number": 152}
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000150.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410209795631, "job": 96, "event": "table_file_deletion", "file_number": 150}
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.689352) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.795750) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.795757) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.795760) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.795762) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:03:29 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:03:29.795763) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:03:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:30.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:30 np0005465988 nova_compute[236126]: 2025-10-02 13:03:30.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:30 np0005465988 nova_compute[236126]: 2025-10-02 13:03:30.472 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:30 np0005465988 nova_compute[236126]: 2025-10-02 13:03:30.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:30 np0005465988 nova_compute[236126]: 2025-10-02 13:03:30.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:03:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:03:30 np0005465988 nova_compute[236126]: 2025-10-02 13:03:30.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:31.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:32.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:32 np0005465988 nova_compute[236126]: 2025-10-02 13:03:32.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:33.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:33 np0005465988 nova_compute[236126]: 2025-10-02 13:03:33.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:34.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:34 np0005465988 nova_compute[236126]: 2025-10-02 13:03:34.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:34 np0005465988 nova_compute[236126]: 2025-10-02 13:03:34.816 2 DEBUG nova.compute.manager [req-1c15deb3-a184-42c9-94f9-aaab4727254a req-b2479a60-7d77-4d65-ae47-c50e66b0940a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-changed-c3e55349-c91d-43c1-be7d-394f7b35ee2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:34 np0005465988 nova_compute[236126]: 2025-10-02 13:03:34.816 2 DEBUG nova.compute.manager [req-1c15deb3-a184-42c9-94f9-aaab4727254a req-b2479a60-7d77-4d65-ae47-c50e66b0940a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Refreshing instance network info cache due to event network-changed-c3e55349-c91d-43c1-be7d-394f7b35ee2e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:03:34 np0005465988 nova_compute[236126]: 2025-10-02 13:03:34.816 2 DEBUG oslo_concurrency.lockutils [req-1c15deb3-a184-42c9-94f9-aaab4727254a req-b2479a60-7d77-4d65-ae47-c50e66b0940a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:03:34 np0005465988 nova_compute[236126]: 2025-10-02 13:03:34.816 2 DEBUG oslo_concurrency.lockutils [req-1c15deb3-a184-42c9-94f9-aaab4727254a req-b2479a60-7d77-4d65-ae47-c50e66b0940a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:03:34 np0005465988 nova_compute[236126]: 2025-10-02 13:03:34.817 2 DEBUG nova.network.neutron [req-1c15deb3-a184-42c9-94f9-aaab4727254a req-b2479a60-7d77-4d65-ae47-c50e66b0940a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Refreshing network info cache for port c3e55349-c91d-43c1-be7d-394f7b35ee2e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.007 2 DEBUG oslo_concurrency.lockutils [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.008 2 DEBUG oslo_concurrency.lockutils [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.008 2 DEBUG oslo_concurrency.lockutils [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.008 2 DEBUG oslo_concurrency.lockutils [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.009 2 DEBUG oslo_concurrency.lockutils [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.010 2 INFO nova.compute.manager [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Terminating instance#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.011 2 DEBUG nova.compute.manager [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 09:03:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:35.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:35 np0005465988 kernel: tapc3e55349-c9 (unregistering): left promiscuous mode
Oct  2 09:03:35 np0005465988 NetworkManager[45041]: <info>  [1759410215.4172] device (tapc3e55349-c9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:03:35 np0005465988 ovn_controller[132601]: 2025-10-02T13:03:35Z|00908|binding|INFO|Releasing lport c3e55349-c91d-43c1-be7d-394f7b35ee2e from this chassis (sb_readonly=0)
Oct  2 09:03:35 np0005465988 ovn_controller[132601]: 2025-10-02T13:03:35Z|00909|binding|INFO|Setting lport c3e55349-c91d-43c1-be7d-394f7b35ee2e down in Southbound
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:35 np0005465988 ovn_controller[132601]: 2025-10-02T13:03:35Z|00910|binding|INFO|Removing iface tapc3e55349-c9 ovn-installed in OVS
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:35.439 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:d3:90 10.100.0.4'], port_security=['fa:16:3e:c2:d3:90 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f7e4398e-72d2-4983-9680-d518c4ca2b0e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce2ca82c03554560b55ed747ae63f1fb', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3aaf3d03-7d0a-4651-a8c6-da6d26cf1390', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=010b8986-6ed3-4d08-a5a9-d68a3bf546a0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=c3e55349-c91d-43c1-be7d-394f7b35ee2e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:03:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:35.441 142124 INFO neutron.agent.ovn.metadata.agent [-] Port c3e55349-c91d-43c1-be7d-394f7b35ee2e in datapath 3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f unbound from our chassis#033[00m
Oct  2 09:03:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:35.442 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:35.443 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9c9ea004-3499-40fd-a2ff-7ffbbfb71def]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:35.444 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f namespace which is not needed anymore#033[00m
Oct  2 09:03:35 np0005465988 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000c6.scope: Deactivated successfully.
Oct  2 09:03:35 np0005465988 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000c6.scope: Consumed 17.916s CPU time.
Oct  2 09:03:35 np0005465988 systemd-machined[192594]: Machine qemu-93-instance-000000c6 terminated.
Oct  2 09:03:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.656 2 INFO nova.virt.libvirt.driver [-] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Instance destroyed successfully.#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.657 2 DEBUG nova.objects.instance [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lazy-loading 'resources' on Instance uuid f7e4398e-72d2-4983-9680-d518c4ca2b0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:03:35 np0005465988 neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f[329404]: [NOTICE]   (329426) : haproxy version is 2.8.14-c23fe91
Oct  2 09:03:35 np0005465988 neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f[329404]: [NOTICE]   (329426) : path to executable is /usr/sbin/haproxy
Oct  2 09:03:35 np0005465988 neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f[329404]: [WARNING]  (329426) : Exiting Master process...
Oct  2 09:03:35 np0005465988 neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f[329404]: [WARNING]  (329426) : Exiting Master process...
Oct  2 09:03:35 np0005465988 neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f[329404]: [ALERT]    (329426) : Current worker (329432) exited with code 143 (Terminated)
Oct  2 09:03:35 np0005465988 neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f[329404]: [WARNING]  (329426) : All workers exited. Exiting... (0)
Oct  2 09:03:35 np0005465988 systemd[1]: libpod-5843b956d17ee3afa7e1fc124a7b45b21f893489cb6f4ea0d2b13667c94b2b55.scope: Deactivated successfully.
Oct  2 09:03:35 np0005465988 podman[330694]: 2025-10-02 13:03:35.672904434 +0000 UTC m=+0.131121518 container died 5843b956d17ee3afa7e1fc124a7b45b21f893489cb6f4ea0d2b13667c94b2b55 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.684 2 DEBUG nova.virt.libvirt.vif [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:01:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-651296364',display_name='tempest-TestNetworkBasicOps-server-651296364',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-651296364',id=198,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBLXzVoXqoH0c8B/MwaBanzF36D2STG/cCl7jSEQqMF9llc/T0alPbEVXmYKDW7yWtOCde2/kXi2eYv4vBR+19jXFCuuAhkxru11z4SaMvv7zHypYZ5UAQipjiPwodEPTg==',key_name='tempest-TestNetworkBasicOps-1704277327',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:02:07Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ce2ca82c03554560b55ed747ae63f1fb',ramdisk_id='',reservation_id='r-3xkuzi1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1692262680',owner_user_name='tempest-TestNetworkBasicOps-1692262680-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:02:07Z,user_data=None,user_id='fb366465e6154871b8a53c9f500105ce',uuid=f7e4398e-72d2-4983-9680-d518c4ca2b0e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.684 2 DEBUG nova.network.os_vif_util [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converting VIF {"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.685 2 DEBUG nova.network.os_vif_util [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c2:d3:90,bridge_name='br-int',has_traffic_filtering=True,id=c3e55349-c91d-43c1-be7d-394f7b35ee2e,network=Network(3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3e55349-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.685 2 DEBUG os_vif [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:d3:90,bridge_name='br-int',has_traffic_filtering=True,id=c3e55349-c91d-43c1-be7d-394f7b35ee2e,network=Network(3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3e55349-c9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.687 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3e55349-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.695 2 INFO os_vif [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:d3:90,bridge_name='br-int',has_traffic_filtering=True,id=c3e55349-c91d-43c1-be7d-394f7b35ee2e,network=Network(3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3e55349-c9')#033[00m
Oct  2 09:03:35 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5843b956d17ee3afa7e1fc124a7b45b21f893489cb6f4ea0d2b13667c94b2b55-userdata-shm.mount: Deactivated successfully.
Oct  2 09:03:35 np0005465988 systemd[1]: var-lib-containers-storage-overlay-08e77e52409b99907e6b5b75a4a41d2a1a93b290f28ff47870a363263bb7a379-merged.mount: Deactivated successfully.
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.994 2 DEBUG nova.compute.manager [req-27b06ead-4633-441c-a3f0-cee4964d25c5 req-b5609115-4815-4c39-9844-2af136e08e93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-vif-unplugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.994 2 DEBUG oslo_concurrency.lockutils [req-27b06ead-4633-441c-a3f0-cee4964d25c5 req-b5609115-4815-4c39-9844-2af136e08e93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.995 2 DEBUG oslo_concurrency.lockutils [req-27b06ead-4633-441c-a3f0-cee4964d25c5 req-b5609115-4815-4c39-9844-2af136e08e93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.995 2 DEBUG oslo_concurrency.lockutils [req-27b06ead-4633-441c-a3f0-cee4964d25c5 req-b5609115-4815-4c39-9844-2af136e08e93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.995 2 DEBUG nova.compute.manager [req-27b06ead-4633-441c-a3f0-cee4964d25c5 req-b5609115-4815-4c39-9844-2af136e08e93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] No waiting events found dispatching network-vif-unplugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:03:35 np0005465988 nova_compute[236126]: 2025-10-02 13:03:35.995 2 DEBUG nova.compute.manager [req-27b06ead-4633-441c-a3f0-cee4964d25c5 req-b5609115-4815-4c39-9844-2af136e08e93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-vif-unplugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 09:03:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:36.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:36 np0005465988 podman[330694]: 2025-10-02 13:03:36.352134465 +0000 UTC m=+0.810351609 container cleanup 5843b956d17ee3afa7e1fc124a7b45b21f893489cb6f4ea0d2b13667c94b2b55 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 09:03:36 np0005465988 systemd[1]: libpod-conmon-5843b956d17ee3afa7e1fc124a7b45b21f893489cb6f4ea0d2b13667c94b2b55.scope: Deactivated successfully.
Oct  2 09:03:36 np0005465988 podman[330753]: 2025-10-02 13:03:36.629080951 +0000 UTC m=+0.242545746 container remove 5843b956d17ee3afa7e1fc124a7b45b21f893489cb6f4ea0d2b13667c94b2b55 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:03:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:36.636 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ac0868dc-3f76-407f-b06e-590b3d75d8da]: (4, ('Thu Oct  2 01:03:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f (5843b956d17ee3afa7e1fc124a7b45b21f893489cb6f4ea0d2b13667c94b2b55)\n5843b956d17ee3afa7e1fc124a7b45b21f893489cb6f4ea0d2b13667c94b2b55\nThu Oct  2 01:03:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f (5843b956d17ee3afa7e1fc124a7b45b21f893489cb6f4ea0d2b13667c94b2b55)\n5843b956d17ee3afa7e1fc124a7b45b21f893489cb6f4ea0d2b13667c94b2b55\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:36.638 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[378b8dd3-3cf5-490a-a140-a6b2aed764c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:36.639 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e2a6e0a-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:03:36 np0005465988 nova_compute[236126]: 2025-10-02 13:03:36.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:36 np0005465988 kernel: tap3e2a6e0a-a0: left promiscuous mode
Oct  2 09:03:36 np0005465988 nova_compute[236126]: 2025-10-02 13:03:36.645 2 DEBUG nova.network.neutron [req-1c15deb3-a184-42c9-94f9-aaab4727254a req-b2479a60-7d77-4d65-ae47-c50e66b0940a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Updated VIF entry in instance network info cache for port c3e55349-c91d-43c1-be7d-394f7b35ee2e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:03:36 np0005465988 nova_compute[236126]: 2025-10-02 13:03:36.645 2 DEBUG nova.network.neutron [req-1c15deb3-a184-42c9-94f9-aaab4727254a req-b2479a60-7d77-4d65-ae47-c50e66b0940a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Updating instance_info_cache with network_info: [{"id": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "address": "fa:16:3e:c2:d3:90", "network": {"id": "3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f", "bridge": "br-int", "label": "tempest-network-smoke--1973286350", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce2ca82c03554560b55ed747ae63f1fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3e55349-c9", "ovs_interfaceid": "c3e55349-c91d-43c1-be7d-394f7b35ee2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:03:36 np0005465988 nova_compute[236126]: 2025-10-02 13:03:36.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:36.659 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7a8ff911-364b-497f-abef-b38971c37b27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:36.681 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3a829cd1-1e91-42c0-8238-5f4a4fe3293e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:36.683 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b1e36a78-be25-4087-bb35-b630797bb5f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:36.702 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[41ea3547-be34-461d-900a-a427c1660310]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 818220, 'reachable_time': 27908, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 330769, 'error': None, 'target': 'ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:36 np0005465988 systemd[1]: run-netns-ovnmeta\x2d3e2a6e0a\x2da0e0\x2d4ffa\x2da478\x2df7ace212a48f.mount: Deactivated successfully.
Oct  2 09:03:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:36.706 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3e2a6e0a-a0e0-4ffa-a478-f7ace212a48f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 09:03:36 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:36.706 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[71ee063c-e168-4256-b83d-7069cd57834b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:36 np0005465988 nova_compute[236126]: 2025-10-02 13:03:36.784 2 DEBUG oslo_concurrency.lockutils [req-1c15deb3-a184-42c9-94f9-aaab4727254a req-b2479a60-7d77-4d65-ae47-c50e66b0940a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-f7e4398e-72d2-4983-9680-d518c4ca2b0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:03:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:37.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.147 2 DEBUG nova.compute.manager [req-283cd9fb-3244-4223-b46a-2ec90a486822 req-d307c204-ab0d-43ed-bf8c-7aade5026579 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.148 2 DEBUG oslo_concurrency.lockutils [req-283cd9fb-3244-4223-b46a-2ec90a486822 req-d307c204-ab0d-43ed-bf8c-7aade5026579 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.148 2 DEBUG oslo_concurrency.lockutils [req-283cd9fb-3244-4223-b46a-2ec90a486822 req-d307c204-ab0d-43ed-bf8c-7aade5026579 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.148 2 DEBUG oslo_concurrency.lockutils [req-283cd9fb-3244-4223-b46a-2ec90a486822 req-d307c204-ab0d-43ed-bf8c-7aade5026579 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.149 2 DEBUG nova.compute.manager [req-283cd9fb-3244-4223-b46a-2ec90a486822 req-d307c204-ab0d-43ed-bf8c-7aade5026579 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] No waiting events found dispatching network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.149 2 WARNING nova.compute.manager [req-283cd9fb-3244-4223-b46a-2ec90a486822 req-d307c204-ab0d-43ed-bf8c-7aade5026579 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received unexpected event network-vif-plugged-c3e55349-c91d-43c1-be7d-394f7b35ee2e for instance with vm_state active and task_state deleting.#033[00m
Oct  2 09:03:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:38.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.507 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.892 2 INFO nova.virt.libvirt.driver [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Deleting instance files /var/lib/nova/instances/f7e4398e-72d2-4983-9680-d518c4ca2b0e_del#033[00m
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.893 2 INFO nova.virt.libvirt.driver [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Deletion of /var/lib/nova/instances/f7e4398e-72d2-4983-9680-d518c4ca2b0e_del complete#033[00m
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.983 2 INFO nova.compute.manager [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Took 3.97 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.984 2 DEBUG oslo.service.loopingcall [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.985 2 DEBUG nova.compute.manager [-] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 09:03:38 np0005465988 nova_compute[236126]: 2025-10-02 13:03:38.985 2 DEBUG nova.network.neutron [-] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 09:03:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:39.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:39 np0005465988 nova_compute[236126]: 2025-10-02 13:03:39.612 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-640fbec9-1ab9-4115-892a-3e91f24ed2ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:03:39 np0005465988 nova_compute[236126]: 2025-10-02 13:03:39.613 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-640fbec9-1ab9-4115-892a-3e91f24ed2ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:03:39 np0005465988 nova_compute[236126]: 2025-10-02 13:03:39.613 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 09:03:39 np0005465988 nova_compute[236126]: 2025-10-02 13:03:39.613 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 640fbec9-1ab9-4115-892a-3e91f24ed2ae obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:03:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:40.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:03:40 np0005465988 nova_compute[236126]: 2025-10-02 13:03:40.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:40 np0005465988 nova_compute[236126]: 2025-10-02 13:03:40.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:41 np0005465988 nova_compute[236126]: 2025-10-02 13:03:41.035 2 DEBUG nova.network.neutron [-] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:03:41 np0005465988 nova_compute[236126]: 2025-10-02 13:03:41.072 2 INFO nova.compute.manager [-] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Took 2.09 seconds to deallocate network for instance.#033[00m
Oct  2 09:03:41 np0005465988 nova_compute[236126]: 2025-10-02 13:03:41.150 2 DEBUG oslo_concurrency.lockutils [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:41 np0005465988 nova_compute[236126]: 2025-10-02 13:03:41.151 2 DEBUG oslo_concurrency.lockutils [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:41 np0005465988 nova_compute[236126]: 2025-10-02 13:03:41.206 2 DEBUG nova.compute.manager [req-da1d7bf0-7775-4899-99d7-b7f66a411169 req-d8706961-12e7-48b6-a74d-2e147ebbff6b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Received event network-vif-deleted-c3e55349-c91d-43c1-be7d-394f7b35ee2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:41 np0005465988 nova_compute[236126]: 2025-10-02 13:03:41.238 2 DEBUG oslo_concurrency.processutils [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:03:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:41.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:41 np0005465988 podman[330795]: 2025-10-02 13:03:41.530984513 +0000 UTC m=+0.062364317 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 09:03:41 np0005465988 podman[330794]: 2025-10-02 13:03:41.554077668 +0000 UTC m=+0.088571212 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:03:41 np0005465988 podman[330796]: 2025-10-02 13:03:41.56141804 +0000 UTC m=+0.079487241 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 09:03:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:03:41 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2773844659' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:03:41 np0005465988 nova_compute[236126]: 2025-10-02 13:03:41.720 2 DEBUG oslo_concurrency.processutils [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:03:41 np0005465988 nova_compute[236126]: 2025-10-02 13:03:41.727 2 DEBUG nova.compute.provider_tree [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:03:41 np0005465988 nova_compute[236126]: 2025-10-02 13:03:41.770 2 DEBUG nova.scheduler.client.report [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:03:41 np0005465988 nova_compute[236126]: 2025-10-02 13:03:41.818 2 DEBUG oslo_concurrency.lockutils [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:41 np0005465988 nova_compute[236126]: 2025-10-02 13:03:41.895 2 INFO nova.scheduler.client.report [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Deleted allocations for instance f7e4398e-72d2-4983-9680-d518c4ca2b0e#033[00m
Oct  2 09:03:42 np0005465988 nova_compute[236126]: 2025-10-02 13:03:42.018 2 DEBUG oslo_concurrency.lockutils [None req-549dc55f-e4d5-4756-88c1-de628f62c13f fb366465e6154871b8a53c9f500105ce ce2ca82c03554560b55ed747ae63f1fb - - default default] Lock "f7e4398e-72d2-4983-9680-d518c4ca2b0e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:42.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:42 np0005465988 nova_compute[236126]: 2025-10-02 13:03:42.372 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Updating instance_info_cache with network_info: [{"id": "e808672d-dd35-463d-8c4c-c82ed7646741", "address": "fa:16:3e:43:14:c5", "network": {"id": "2471b6f7-ee51-4239-8b52-7016ab4d9fd1", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1867797555-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5eceae619a6f4fdeaa8ba6fafda4912a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape808672d-dd", "ovs_interfaceid": "e808672d-dd35-463d-8c4c-c82ed7646741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:03:42 np0005465988 nova_compute[236126]: 2025-10-02 13:03:42.419 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-640fbec9-1ab9-4115-892a-3e91f24ed2ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:03:42 np0005465988 nova_compute[236126]: 2025-10-02 13:03:42.419 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 09:03:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:03:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:43.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:03:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:44.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:45.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:03:45 np0005465988 nova_compute[236126]: 2025-10-02 13:03:45.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:45 np0005465988 nova_compute[236126]: 2025-10-02 13:03:45.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.220 2 DEBUG oslo_concurrency.lockutils [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Acquiring lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.221 2 DEBUG oslo_concurrency.lockutils [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.221 2 DEBUG oslo_concurrency.lockutils [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Acquiring lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.222 2 DEBUG oslo_concurrency.lockutils [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.222 2 DEBUG oslo_concurrency.lockutils [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.223 2 INFO nova.compute.manager [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Terminating instance#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.224 2 DEBUG nova.compute.manager [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.269 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.270 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:46 np0005465988 kernel: tape808672d-dd (unregistering): left promiscuous mode
Oct  2 09:03:46 np0005465988 NetworkManager[45041]: <info>  [1759410226.2947] device (tape808672d-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:03:46 np0005465988 ovn_controller[132601]: 2025-10-02T13:03:46Z|00911|binding|INFO|Releasing lport e808672d-dd35-463d-8c4c-c82ed7646741 from this chassis (sb_readonly=0)
Oct  2 09:03:46 np0005465988 ovn_controller[132601]: 2025-10-02T13:03:46Z|00912|binding|INFO|Setting lport e808672d-dd35-463d-8c4c-c82ed7646741 down in Southbound
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:46 np0005465988 ovn_controller[132601]: 2025-10-02T13:03:46Z|00913|binding|INFO|Removing iface tape808672d-dd ovn-installed in OVS
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:46.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.323 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:14:c5 10.100.0.5'], port_security=['fa:16:3e:43:14:c5 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '640fbec9-1ab9-4115-892a-3e91f24ed2ae', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2471b6f7-ee51-4239-8b52-7016ab4d9fd1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5eceae619a6f4fdeaa8ba6fafda4912a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fcb2f420-7d55-44da-a4fe-84e82e3282b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.221'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa923984-fb22-4ee5-9bd7-5034c98e7f0a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=e808672d-dd35-463d-8c4c-c82ed7646741) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.325 142124 INFO neutron.agent.ovn.metadata.agent [-] Port e808672d-dd35-463d-8c4c-c82ed7646741 in datapath 2471b6f7-ee51-4239-8b52-7016ab4d9fd1 unbound from our chassis#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.326 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2471b6f7-ee51-4239-8b52-7016ab4d9fd1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.327 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b744b60b-4e0e-4d2f-91a6-d2f73fae193d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.328 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1 namespace which is not needed anymore#033[00m
Oct  2 09:03:46 np0005465988 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000c9.scope: Deactivated successfully.
Oct  2 09:03:46 np0005465988 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000c9.scope: Consumed 14.899s CPU time.
Oct  2 09:03:46 np0005465988 systemd-machined[192594]: Machine qemu-94-instance-000000c9 terminated.
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.457 2 INFO nova.virt.libvirt.driver [-] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Instance destroyed successfully.#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.457 2 DEBUG nova.objects.instance [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lazy-loading 'resources' on Instance uuid 640fbec9-1ab9-4115-892a-3e91f24ed2ae obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:03:46 np0005465988 neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1[330464]: [NOTICE]   (330468) : haproxy version is 2.8.14-c23fe91
Oct  2 09:03:46 np0005465988 neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1[330464]: [NOTICE]   (330468) : path to executable is /usr/sbin/haproxy
Oct  2 09:03:46 np0005465988 neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1[330464]: [WARNING]  (330468) : Exiting Master process...
Oct  2 09:03:46 np0005465988 neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1[330464]: [ALERT]    (330468) : Current worker (330470) exited with code 143 (Terminated)
Oct  2 09:03:46 np0005465988 neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1[330464]: [WARNING]  (330468) : All workers exited. Exiting... (0)
Oct  2 09:03:46 np0005465988 systemd[1]: libpod-b1102309e681f4e295ddb52839129ecb29b033e49b8fea7564afd6d525ed2e09.scope: Deactivated successfully.
Oct  2 09:03:46 np0005465988 podman[330939]: 2025-10-02 13:03:46.469940823 +0000 UTC m=+0.052469632 container died b1102309e681f4e295ddb52839129ecb29b033e49b8fea7564afd6d525ed2e09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.487 2 DEBUG nova.virt.libvirt.vif [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:02:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1921024035',display_name='tempest-AttachVolumeNegativeTest-server-1921024035',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1921024035',id=201,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDDrMvZ77F/UIZfU+v9K7atXR5NjjRA7wn/L4bHndWJAEEnJTo/JMZnZyeU+hLDfIrLuljZuLJ61gnvWEBfMNMfiDcyAb4KC3UGvs/4WwzQe2L+IRgQtFWqJOITPqlAajA==',key_name='tempest-keypair-89702726',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:03:07Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5eceae619a6f4fdeaa8ba6fafda4912a',ramdisk_id='',reservation_id='r-ixaq07al',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeNegativeTest-1407980822',owner_user_name='tempest-AttachVolumeNegativeTest-1407980822-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:03:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93facc00c95f4cbfa6cecaf3641182bc',uuid=640fbec9-1ab9-4115-892a-3e91f24ed2ae,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e808672d-dd35-463d-8c4c-c82ed7646741", "address": "fa:16:3e:43:14:c5", "network": {"id": "2471b6f7-ee51-4239-8b52-7016ab4d9fd1", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1867797555-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5eceae619a6f4fdeaa8ba6fafda4912a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape808672d-dd", "ovs_interfaceid": "e808672d-dd35-463d-8c4c-c82ed7646741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.488 2 DEBUG nova.network.os_vif_util [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Converting VIF {"id": "e808672d-dd35-463d-8c4c-c82ed7646741", "address": "fa:16:3e:43:14:c5", "network": {"id": "2471b6f7-ee51-4239-8b52-7016ab4d9fd1", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1867797555-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5eceae619a6f4fdeaa8ba6fafda4912a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape808672d-dd", "ovs_interfaceid": "e808672d-dd35-463d-8c4c-c82ed7646741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.489 2 DEBUG nova.network.os_vif_util [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:43:14:c5,bridge_name='br-int',has_traffic_filtering=True,id=e808672d-dd35-463d-8c4c-c82ed7646741,network=Network(2471b6f7-ee51-4239-8b52-7016ab4d9fd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape808672d-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.489 2 DEBUG os_vif [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:14:c5,bridge_name='br-int',has_traffic_filtering=True,id=e808672d-dd35-463d-8c4c-c82ed7646741,network=Network(2471b6f7-ee51-4239-8b52-7016ab4d9fd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape808672d-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.491 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape808672d-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.498 2 INFO os_vif [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:14:c5,bridge_name='br-int',has_traffic_filtering=True,id=e808672d-dd35-463d-8c4c-c82ed7646741,network=Network(2471b6f7-ee51-4239-8b52-7016ab4d9fd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape808672d-dd')#033[00m
Oct  2 09:03:46 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b1102309e681f4e295ddb52839129ecb29b033e49b8fea7564afd6d525ed2e09-userdata-shm.mount: Deactivated successfully.
Oct  2 09:03:46 np0005465988 systemd[1]: var-lib-containers-storage-overlay-1cd39028dbc4678469d905e5d3721ba77fd1f93bf883528751476cf060773949-merged.mount: Deactivated successfully.
Oct  2 09:03:46 np0005465988 podman[330939]: 2025-10-02 13:03:46.545593212 +0000 UTC m=+0.128122021 container cleanup b1102309e681f4e295ddb52839129ecb29b033e49b8fea7564afd6d525ed2e09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 09:03:46 np0005465988 systemd[1]: libpod-conmon-b1102309e681f4e295ddb52839129ecb29b033e49b8fea7564afd6d525ed2e09.scope: Deactivated successfully.
Oct  2 09:03:46 np0005465988 podman[330996]: 2025-10-02 13:03:46.647404205 +0000 UTC m=+0.073812037 container remove b1102309e681f4e295ddb52839129ecb29b033e49b8fea7564afd6d525ed2e09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.653 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a160a86c-d8e8-4f4d-8d3e-afebce6299eb]: (4, ('Thu Oct  2 01:03:46 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1 (b1102309e681f4e295ddb52839129ecb29b033e49b8fea7564afd6d525ed2e09)\nb1102309e681f4e295ddb52839129ecb29b033e49b8fea7564afd6d525ed2e09\nThu Oct  2 01:03:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1 (b1102309e681f4e295ddb52839129ecb29b033e49b8fea7564afd6d525ed2e09)\nb1102309e681f4e295ddb52839129ecb29b033e49b8fea7564afd6d525ed2e09\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.657 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2ffa5ae6-39fe-4da4-9575-0685b4d6823c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.659 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2471b6f7-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:46 np0005465988 kernel: tap2471b6f7-e0: left promiscuous mode
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.679 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ca548062-aecf-48ae-a74f-6ed2c953247e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.701 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[085319fa-b92f-4f08-ac10-e2fb2526169b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.705 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb789d1-f8be-4b3a-a7b4-924ffa0094c2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.724 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3ed9f694-5d35-4e9e-9be0-c29e147504de]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 824633, 'reachable_time': 20627, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 331011, 'error': None, 'target': 'ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:46 np0005465988 systemd[1]: run-netns-ovnmeta\x2d2471b6f7\x2dee51\x2d4239\x2d8b52\x2d7016ab4d9fd1.mount: Deactivated successfully.
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.729 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2471b6f7-ee51-4239-8b52-7016ab4d9fd1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 09:03:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:46.730 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[89084124-5ac5-4dc3-866d-59b41466c460]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.737 2 DEBUG nova.compute.manager [req-0d01500f-00dd-4feb-adfc-40ebfd627155 req-17f55838-1e70-49e9-98e5-4aa890b70169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Received event network-vif-unplugged-e808672d-dd35-463d-8c4c-c82ed7646741 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.737 2 DEBUG oslo_concurrency.lockutils [req-0d01500f-00dd-4feb-adfc-40ebfd627155 req-17f55838-1e70-49e9-98e5-4aa890b70169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.738 2 DEBUG oslo_concurrency.lockutils [req-0d01500f-00dd-4feb-adfc-40ebfd627155 req-17f55838-1e70-49e9-98e5-4aa890b70169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.738 2 DEBUG oslo_concurrency.lockutils [req-0d01500f-00dd-4feb-adfc-40ebfd627155 req-17f55838-1e70-49e9-98e5-4aa890b70169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.738 2 DEBUG nova.compute.manager [req-0d01500f-00dd-4feb-adfc-40ebfd627155 req-17f55838-1e70-49e9-98e5-4aa890b70169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] No waiting events found dispatching network-vif-unplugged-e808672d-dd35-463d-8c4c-c82ed7646741 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:03:46 np0005465988 nova_compute[236126]: 2025-10-02 13:03:46.738 2 DEBUG nova.compute.manager [req-0d01500f-00dd-4feb-adfc-40ebfd627155 req-17f55838-1e70-49e9-98e5-4aa890b70169 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Received event network-vif-unplugged-e808672d-dd35-463d-8c4c-c82ed7646741 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 09:03:47 np0005465988 nova_compute[236126]: 2025-10-02 13:03:47.033 2 INFO nova.virt.libvirt.driver [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Deleting instance files /var/lib/nova/instances/640fbec9-1ab9-4115-892a-3e91f24ed2ae_del#033[00m
Oct  2 09:03:47 np0005465988 nova_compute[236126]: 2025-10-02 13:03:47.034 2 INFO nova.virt.libvirt.driver [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Deletion of /var/lib/nova/instances/640fbec9-1ab9-4115-892a-3e91f24ed2ae_del complete#033[00m
Oct  2 09:03:47 np0005465988 nova_compute[236126]: 2025-10-02 13:03:47.123 2 INFO nova.compute.manager [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Took 0.90 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 09:03:47 np0005465988 nova_compute[236126]: 2025-10-02 13:03:47.124 2 DEBUG oslo.service.loopingcall [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 09:03:47 np0005465988 nova_compute[236126]: 2025-10-02 13:03:47.124 2 DEBUG nova.compute.manager [-] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 09:03:47 np0005465988 nova_compute[236126]: 2025-10-02 13:03:47.124 2 DEBUG nova.network.neutron [-] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 09:03:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:47.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:48.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:48 np0005465988 nova_compute[236126]: 2025-10-02 13:03:48.955 2 DEBUG nova.compute.manager [req-ea8c2e98-7b87-42d8-8bce-bf1fe53865a1 req-e10cb3ba-d3c6-42e3-9663-f37ec6ca6e80 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Received event network-vif-plugged-e808672d-dd35-463d-8c4c-c82ed7646741 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:48 np0005465988 nova_compute[236126]: 2025-10-02 13:03:48.955 2 DEBUG oslo_concurrency.lockutils [req-ea8c2e98-7b87-42d8-8bce-bf1fe53865a1 req-e10cb3ba-d3c6-42e3-9663-f37ec6ca6e80 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:48 np0005465988 nova_compute[236126]: 2025-10-02 13:03:48.956 2 DEBUG oslo_concurrency.lockutils [req-ea8c2e98-7b87-42d8-8bce-bf1fe53865a1 req-e10cb3ba-d3c6-42e3-9663-f37ec6ca6e80 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:48 np0005465988 nova_compute[236126]: 2025-10-02 13:03:48.956 2 DEBUG oslo_concurrency.lockutils [req-ea8c2e98-7b87-42d8-8bce-bf1fe53865a1 req-e10cb3ba-d3c6-42e3-9663-f37ec6ca6e80 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:48 np0005465988 nova_compute[236126]: 2025-10-02 13:03:48.956 2 DEBUG nova.compute.manager [req-ea8c2e98-7b87-42d8-8bce-bf1fe53865a1 req-e10cb3ba-d3c6-42e3-9663-f37ec6ca6e80 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] No waiting events found dispatching network-vif-plugged-e808672d-dd35-463d-8c4c-c82ed7646741 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:03:48 np0005465988 nova_compute[236126]: 2025-10-02 13:03:48.956 2 WARNING nova.compute.manager [req-ea8c2e98-7b87-42d8-8bce-bf1fe53865a1 req-e10cb3ba-d3c6-42e3-9663-f37ec6ca6e80 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Received unexpected event network-vif-plugged-e808672d-dd35-463d-8c4c-c82ed7646741 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 09:03:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:49.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:49 np0005465988 nova_compute[236126]: 2025-10-02 13:03:49.882 2 DEBUG nova.network.neutron [-] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:03:49 np0005465988 nova_compute[236126]: 2025-10-02 13:03:49.929 2 INFO nova.compute.manager [-] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Took 2.80 seconds to deallocate network for instance.#033[00m
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.010 2 DEBUG oslo_concurrency.lockutils [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.011 2 DEBUG oslo_concurrency.lockutils [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.085 2 DEBUG oslo_concurrency.processutils [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.136 2 DEBUG nova.compute.manager [req-1efd733f-2cac-4168-a01c-309de37adfae req-085ad124-5fe1-46cc-85a8-7d9f11fe9bf3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Received event network-vif-deleted-e808672d-dd35-463d-8c4c-c82ed7646741 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:50.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:03:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:03:50 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/56588911' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.589 2 DEBUG oslo_concurrency.processutils [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.595 2 DEBUG nova.compute.provider_tree [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.630 2 DEBUG nova.scheduler.client.report [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.652 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759410215.6510525, f7e4398e-72d2-4983-9680-d518c4ca2b0e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.652 2 INFO nova.compute.manager [-] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] VM Stopped (Lifecycle Event)#033[00m
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.671 2 DEBUG oslo_concurrency.lockutils [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.687 2 DEBUG nova.compute.manager [None req-4206e5bd-b723-4f15-8d19-e1649d43f05e - - - - - -] [instance: f7e4398e-72d2-4983-9680-d518c4ca2b0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.712 2 INFO nova.scheduler.client.report [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Deleted allocations for instance 640fbec9-1ab9-4115-892a-3e91f24ed2ae#033[00m
Oct  2 09:03:50 np0005465988 nova_compute[236126]: 2025-10-02 13:03:50.869 2 DEBUG oslo_concurrency.lockutils [None req-11783178-0b5a-40f4-86b3-92ef219bc8bf 93facc00c95f4cbfa6cecaf3641182bc 5eceae619a6f4fdeaa8ba6fafda4912a - - default default] Lock "640fbec9-1ab9-4115-892a-3e91f24ed2ae" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:51.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:51 np0005465988 nova_compute[236126]: 2025-10-02 13:03:51.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:52.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:53.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:54.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:55.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:03:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:03:55 np0005465988 nova_compute[236126]: 2025-10-02 13:03:55.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:03:56.273 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:03:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:56.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:56 np0005465988 nova_compute[236126]: 2025-10-02 13:03:56.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:57.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:03:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:03:58.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:03:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:03:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:03:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:03:59.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:04:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:00.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:04:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:04:00 np0005465988 podman[331044]: 2025-10-02 13:04:00.526549152 +0000 UTC m=+0.061244385 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 09:04:00 np0005465988 nova_compute[236126]: 2025-10-02 13:04:00.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:01.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:01 np0005465988 nova_compute[236126]: 2025-10-02 13:04:01.455 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759410226.45418, 640fbec9-1ab9-4115-892a-3e91f24ed2ae => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:04:01 np0005465988 nova_compute[236126]: 2025-10-02 13:04:01.456 2 INFO nova.compute.manager [-] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] VM Stopped (Lifecycle Event)#033[00m
Oct  2 09:04:01 np0005465988 nova_compute[236126]: 2025-10-02 13:04:01.480 2 DEBUG nova.compute.manager [None req-7313a6bc-47d2-473f-b1c9-558013fae7cf - - - - - -] [instance: 640fbec9-1ab9-4115-892a-3e91f24ed2ae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:04:01 np0005465988 nova_compute[236126]: 2025-10-02 13:04:01.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:02.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:02 np0005465988 ceph-mgr[76715]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3158772141
Oct  2 09:04:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:03.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:04:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:04:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:04:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:04:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:04:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:04:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:04:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:04:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:04.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:04:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:05.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:04:05 np0005465988 nova_compute[236126]: 2025-10-02 13:04:05.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:06.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:06 np0005465988 nova_compute[236126]: 2025-10-02 13:04:06.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:04:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5401.0 total, 600.0 interval#012Cumulative writes: 72K writes, 293K keys, 72K commit groups, 1.0 writes per commit group, ingest: 0.29 GB, 0.06 MB/s#012Cumulative WAL: 72K writes, 26K syncs, 2.71 writes per sync, written: 0.29 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5949 writes, 24K keys, 5949 commit groups, 1.0 writes per commit group, ingest: 24.13 MB, 0.04 MB/s#012Interval WAL: 5949 writes, 2301 syncs, 2.59 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct  2 09:04:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:07.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:08.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:09.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:04:09 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:04:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:10.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:04:10 np0005465988 nova_compute[236126]: 2025-10-02 13:04:10.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:11.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:11 np0005465988 nova_compute[236126]: 2025-10-02 13:04:11.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:12.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:12 np0005465988 podman[331304]: 2025-10-02 13:04:12.52799963 +0000 UTC m=+0.056364855 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 09:04:12 np0005465988 podman[331303]: 2025-10-02 13:04:12.554176593 +0000 UTC m=+0.077245775 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid)
Oct  2 09:04:12 np0005465988 podman[331302]: 2025-10-02 13:04:12.561213876 +0000 UTC m=+0.094179053 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:04:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:13.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:14.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:15.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:04:15 np0005465988 nova_compute[236126]: 2025-10-02 13:04:15.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:16.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:16 np0005465988 nova_compute[236126]: 2025-10-02 13:04:16.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:16 np0005465988 nova_compute[236126]: 2025-10-02 13:04:16.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:17.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:17 np0005465988 nova_compute[236126]: 2025-10-02 13:04:17.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:17 np0005465988 nova_compute[236126]: 2025-10-02 13:04:17.543 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:04:17 np0005465988 nova_compute[236126]: 2025-10-02 13:04:17.544 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:04:17 np0005465988 nova_compute[236126]: 2025-10-02 13:04:17.544 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:04:17 np0005465988 nova_compute[236126]: 2025-10-02 13:04:17.545 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:04:17 np0005465988 nova_compute[236126]: 2025-10-02 13:04:17.545 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:04:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:04:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4188905716' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:04:17 np0005465988 nova_compute[236126]: 2025-10-02 13:04:17.988 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:04:18 np0005465988 nova_compute[236126]: 2025-10-02 13:04:18.192 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:04:18 np0005465988 nova_compute[236126]: 2025-10-02 13:04:18.193 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4039MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:04:18 np0005465988 nova_compute[236126]: 2025-10-02 13:04:18.194 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:04:18 np0005465988 nova_compute[236126]: 2025-10-02 13:04:18.194 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:04:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:18.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:18 np0005465988 nova_compute[236126]: 2025-10-02 13:04:18.701 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:04:18 np0005465988 nova_compute[236126]: 2025-10-02 13:04:18.702 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:04:18 np0005465988 nova_compute[236126]: 2025-10-02 13:04:18.742 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:04:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:04:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1470117821' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:04:19 np0005465988 nova_compute[236126]: 2025-10-02 13:04:19.208 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:04:19 np0005465988 nova_compute[236126]: 2025-10-02 13:04:19.214 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:04:19 np0005465988 nova_compute[236126]: 2025-10-02 13:04:19.265 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:04:19 np0005465988 nova_compute[236126]: 2025-10-02 13:04:19.327 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:04:19 np0005465988 nova_compute[236126]: 2025-10-02 13:04:19.327 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:04:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:19.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:04:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:20.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:04:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:04:20 np0005465988 nova_compute[236126]: 2025-10-02 13:04:20.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:21.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:21 np0005465988 nova_compute[236126]: 2025-10-02 13:04:21.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:22.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:23.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:24.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:25.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:04:25 np0005465988 nova_compute[236126]: 2025-10-02 13:04:25.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:26.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:26 np0005465988 nova_compute[236126]: 2025-10-02 13:04:26.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:27.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:04:27.410 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:04:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:04:27.410 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:04:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:04:27.410 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:04:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:28.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:29.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:04:29.434 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=76, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=75) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:04:29 np0005465988 nova_compute[236126]: 2025-10-02 13:04:29.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:04:29.436 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:04:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:30.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:30 np0005465988 nova_compute[236126]: 2025-10-02 13:04:30.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:30 np0005465988 nova_compute[236126]: 2025-10-02 13:04:30.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:30 np0005465988 nova_compute[236126]: 2025-10-02 13:04:30.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 09:04:30 np0005465988 nova_compute[236126]: 2025-10-02 13:04:30.489 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 09:04:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:04:30 np0005465988 nova_compute[236126]: 2025-10-02 13:04:30.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:31.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:31 np0005465988 nova_compute[236126]: 2025-10-02 13:04:31.489 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:31 np0005465988 nova_compute[236126]: 2025-10-02 13:04:31.490 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:31 np0005465988 nova_compute[236126]: 2025-10-02 13:04:31.490 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:04:31 np0005465988 nova_compute[236126]: 2025-10-02 13:04:31.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:31 np0005465988 podman[331471]: 2025-10-02 13:04:31.559573834 +0000 UTC m=+0.094791671 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 09:04:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:32.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:32 np0005465988 nova_compute[236126]: 2025-10-02 13:04:32.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:33.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:34.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:34 np0005465988 nova_compute[236126]: 2025-10-02 13:04:34.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:34 np0005465988 ovn_controller[132601]: 2025-10-02T13:04:34Z|00914|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  2 09:04:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:35.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:04:35.438 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '76'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:04:35 np0005465988 nova_compute[236126]: 2025-10-02 13:04:35.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:04:35 np0005465988 nova_compute[236126]: 2025-10-02 13:04:35.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:36.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:36 np0005465988 nova_compute[236126]: 2025-10-02 13:04:36.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:37.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:38.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:39.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:40.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:40 np0005465988 nova_compute[236126]: 2025-10-02 13:04:40.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:40 np0005465988 nova_compute[236126]: 2025-10-02 13:04:40.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:04:40 np0005465988 nova_compute[236126]: 2025-10-02 13:04:40.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:04:40 np0005465988 nova_compute[236126]: 2025-10-02 13:04:40.496 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:04:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:04:40 np0005465988 nova_compute[236126]: 2025-10-02 13:04:40.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:41.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:41 np0005465988 nova_compute[236126]: 2025-10-02 13:04:41.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:42.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:42 np0005465988 podman[331547]: 2025-10-02 13:04:42.683463468 +0000 UTC m=+0.059607478 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 09:04:42 np0005465988 podman[331546]: 2025-10-02 13:04:42.703434903 +0000 UTC m=+0.085143613 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 09:04:42 np0005465988 podman[331548]: 2025-10-02 13:04:42.704158874 +0000 UTC m=+0.074707063 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 09:04:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 09:04:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:43.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 09:04:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:44.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:45.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:04:45 np0005465988 nova_compute[236126]: 2025-10-02 13:04:45.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:46.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:46 np0005465988 nova_compute[236126]: 2025-10-02 13:04:46.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:47.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:48.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:49.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:49 np0005465988 nova_compute[236126]: 2025-10-02 13:04:49.491 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:50.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:04:50 np0005465988 nova_compute[236126]: 2025-10-02 13:04:50.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:51.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:51 np0005465988 nova_compute[236126]: 2025-10-02 13:04:51.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:52.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:04:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1802367210' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:04:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:53.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:54.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 09:04:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3280960449' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 09:04:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 09:04:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3280960449' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 09:04:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:55.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:04:55 np0005465988 nova_compute[236126]: 2025-10-02 13:04:55.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:56.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:56 np0005465988 nova_compute[236126]: 2025-10-02 13:04:56.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:57.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:04:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:04:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:04:58.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:04:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:04:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:04:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:04:59.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:05:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:00.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:05:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:05:00 np0005465988 nova_compute[236126]: 2025-10-02 13:05:00.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:01.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:01 np0005465988 nova_compute[236126]: 2025-10-02 13:05:01.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:02.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:02 np0005465988 podman[331621]: 2025-10-02 13:05:02.560142534 +0000 UTC m=+0.084976028 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  2 09:05:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:03.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:05:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:04.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:05:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:05.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:05 np0005465988 nova_compute[236126]: 2025-10-02 13:05:05.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:05:05 np0005465988 nova_compute[236126]: 2025-10-02 13:05:05.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:06.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:06 np0005465988 nova_compute[236126]: 2025-10-02 13:05:06.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #154. Immutable memtables: 0.
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.704972) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 97] Flushing memtable with next log file: 154
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410306705025, "job": 97, "event": "flush_started", "num_memtables": 1, "num_entries": 1219, "num_deletes": 251, "total_data_size": 2685143, "memory_usage": 2729280, "flush_reason": "Manual Compaction"}
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 97] Level-0 flush table #155: started
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410306719009, "cf_name": "default", "job": 97, "event": "table_file_creation", "file_number": 155, "file_size": 1760507, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 75682, "largest_seqno": 76896, "table_properties": {"data_size": 1755222, "index_size": 2744, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11653, "raw_average_key_size": 19, "raw_value_size": 1744585, "raw_average_value_size": 2992, "num_data_blocks": 121, "num_entries": 583, "num_filter_entries": 583, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759410210, "oldest_key_time": 1759410210, "file_creation_time": 1759410306, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 155, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 97] Flush lasted 14111 microseconds, and 8314 cpu microseconds.
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.719079) [db/flush_job.cc:967] [default] [JOB 97] Level-0 flush table #155: 1760507 bytes OK
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.719109) [db/memtable_list.cc:519] [default] Level-0 commit table #155 started
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.721823) [db/memtable_list.cc:722] [default] Level-0 commit table #155: memtable #1 done
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.721845) EVENT_LOG_v1 {"time_micros": 1759410306721838, "job": 97, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.721867) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 97] Try to delete WAL files size 2679347, prev total WAL file size 2679347, number of live WAL files 2.
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000151.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.722938) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036353236' seq:72057594037927935, type:22 .. '7061786F730036373738' seq:0, type:0; will stop at (end)
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 98] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 97 Base level 0, inputs: [155(1719KB)], [153(13MB)]
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410306723021, "job": 98, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [155], "files_L6": [153], "score": -1, "input_data_size": 15721045, "oldest_snapshot_seqno": -1}
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 98] Generated table #156: 9744 keys, 13789752 bytes, temperature: kUnknown
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410306825409, "cf_name": "default", "job": 98, "event": "table_file_creation", "file_number": 156, "file_size": 13789752, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13725054, "index_size": 39241, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24389, "raw_key_size": 257095, "raw_average_key_size": 26, "raw_value_size": 13552403, "raw_average_value_size": 1390, "num_data_blocks": 1500, "num_entries": 9744, "num_filter_entries": 9744, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759410306, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 156, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.825804) [db/compaction/compaction_job.cc:1663] [default] [JOB 98] Compacted 1@0 + 1@6 files to L6 => 13789752 bytes
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.827792) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.4 rd, 134.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.3 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(16.8) write-amplify(7.8) OK, records in: 10261, records dropped: 517 output_compression: NoCompression
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.827836) EVENT_LOG_v1 {"time_micros": 1759410306827817, "job": 98, "event": "compaction_finished", "compaction_time_micros": 102482, "compaction_time_cpu_micros": 51094, "output_level": 6, "num_output_files": 1, "total_output_size": 13789752, "num_input_records": 10261, "num_output_records": 9744, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000155.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410306828823, "job": 98, "event": "table_file_deletion", "file_number": 155}
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000153.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410306833968, "job": 98, "event": "table_file_deletion", "file_number": 153}
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.722790) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.834116) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.834123) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.834125) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.834127) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:05:06 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:06.834128) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:05:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:07.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:08.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:09.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:09 np0005465988 podman[331866]: 2025-10-02 13:05:09.751896607 +0000 UTC m=+0.069425471 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 09:05:09 np0005465988 podman[331866]: 2025-10-02 13:05:09.845781461 +0000 UTC m=+0.163310325 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct  2 09:05:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:10.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:10 np0005465988 podman[332004]: 2025-10-02 13:05:10.443847716 +0000 UTC m=+0.056346074 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 09:05:10 np0005465988 podman[332004]: 2025-10-02 13:05:10.490668195 +0000 UTC m=+0.103166533 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 09:05:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:05:10 np0005465988 podman[332069]: 2025-10-02 13:05:10.770250427 +0000 UTC m=+0.067560187 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, version=2.2.4, com.redhat.component=keepalived-container, distribution-scope=public, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, vcs-type=git)
Oct  2 09:05:10 np0005465988 podman[332069]: 2025-10-02 13:05:10.799058827 +0000 UTC m=+0.096368607 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, architecture=x86_64, release=1793, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, name=keepalived)
Oct  2 09:05:10 np0005465988 nova_compute[236126]: 2025-10-02 13:05:10.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:11.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:11 np0005465988 nova_compute[236126]: 2025-10-02 13:05:11.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:05:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:05:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:05:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:12.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:05:12 np0005465988 nova_compute[236126]: 2025-10-02 13:05:12.515 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:12 np0005465988 nova_compute[236126]: 2025-10-02 13:05:12.516 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 09:05:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:05:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:05:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:05:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:05:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:05:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:05:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:05:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:05:12 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:05:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:13.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:13 np0005465988 podman[332371]: 2025-10-02 13:05:13.545343733 +0000 UTC m=+0.068737621 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 09:05:13 np0005465988 podman[332370]: 2025-10-02 13:05:13.549608866 +0000 UTC m=+0.077457412 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:05:13 np0005465988 podman[332369]: 2025-10-02 13:05:13.593664435 +0000 UTC m=+0.121125530 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 09:05:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:14.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:15.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:05:15 np0005465988 nova_compute[236126]: 2025-10-02 13:05:15.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:16.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:16 np0005465988 nova_compute[236126]: 2025-10-02 13:05:16.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:17.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:17 np0005465988 nova_compute[236126]: 2025-10-02 13:05:17.504 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:18.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:18 np0005465988 nova_compute[236126]: 2025-10-02 13:05:18.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:18 np0005465988 nova_compute[236126]: 2025-10-02 13:05:18.504 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:18 np0005465988 nova_compute[236126]: 2025-10-02 13:05:18.504 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:18 np0005465988 nova_compute[236126]: 2025-10-02 13:05:18.505 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:18 np0005465988 nova_compute[236126]: 2025-10-02 13:05:18.505 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:05:18 np0005465988 nova_compute[236126]: 2025-10-02 13:05:18.505 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:05:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:05:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3537897323' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:05:18 np0005465988 nova_compute[236126]: 2025-10-02 13:05:18.987 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:05:19 np0005465988 nova_compute[236126]: 2025-10-02 13:05:19.167 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:05:19 np0005465988 nova_compute[236126]: 2025-10-02 13:05:19.168 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4024MB free_disk=20.985797882080078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:05:19 np0005465988 nova_compute[236126]: 2025-10-02 13:05:19.169 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:19 np0005465988 nova_compute[236126]: 2025-10-02 13:05:19.169 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:19 np0005465988 nova_compute[236126]: 2025-10-02 13:05:19.259 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:05:19 np0005465988 nova_compute[236126]: 2025-10-02 13:05:19.260 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:05:19 np0005465988 nova_compute[236126]: 2025-10-02 13:05:19.279 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:05:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:19.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:05:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1490843618' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:05:19 np0005465988 nova_compute[236126]: 2025-10-02 13:05:19.784 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:05:19 np0005465988 nova_compute[236126]: 2025-10-02 13:05:19.791 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:05:19 np0005465988 nova_compute[236126]: 2025-10-02 13:05:19.809 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:05:19 np0005465988 nova_compute[236126]: 2025-10-02 13:05:19.811 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:05:19 np0005465988 nova_compute[236126]: 2025-10-02 13:05:19.811 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.642s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000057s ======
Oct  2 09:05:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:20.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Oct  2 09:05:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:05:20 np0005465988 nova_compute[236126]: 2025-10-02 13:05:20.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:21 np0005465988 nova_compute[236126]: 2025-10-02 13:05:21.372 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:21 np0005465988 nova_compute[236126]: 2025-10-02 13:05:21.373 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:21 np0005465988 nova_compute[236126]: 2025-10-02 13:05:21.397 2 DEBUG nova.compute.manager [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 09:05:21 np0005465988 nova_compute[236126]: 2025-10-02 13:05:21.467 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:21 np0005465988 nova_compute[236126]: 2025-10-02 13:05:21.468 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:21.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:21 np0005465988 nova_compute[236126]: 2025-10-02 13:05:21.475 2 DEBUG nova.virt.hardware [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 09:05:21 np0005465988 nova_compute[236126]: 2025-10-02 13:05:21.475 2 INFO nova.compute.claims [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 09:05:21 np0005465988 nova_compute[236126]: 2025-10-02 13:05:21.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:21 np0005465988 nova_compute[236126]: 2025-10-02 13:05:21.595 2 DEBUG oslo_concurrency.processutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:05:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:05:22 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1765555069' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.059 2 DEBUG oslo_concurrency.processutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.066 2 DEBUG nova.compute.provider_tree [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.086 2 DEBUG nova.scheduler.client.report [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.126 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.128 2 DEBUG nova.compute.manager [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.187 2 DEBUG nova.compute.manager [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.188 2 DEBUG nova.network.neutron [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.215 2 INFO nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.252 2 DEBUG nova.compute.manager [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.380 2 DEBUG nova.compute.manager [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.381 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.382 2 INFO nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Creating image(s)#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.416 2 DEBUG nova.storage.rbd_utils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] rbd image 016bb555-dc0d-42a6-9e52-552a2e62a3ed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:05:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:22.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.448 2 DEBUG nova.storage.rbd_utils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] rbd image 016bb555-dc0d-42a6-9e52-552a2e62a3ed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.475 2 DEBUG nova.storage.rbd_utils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] rbd image 016bb555-dc0d-42a6-9e52-552a2e62a3ed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.480 2 DEBUG oslo_concurrency.processutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.570 2 DEBUG oslo_concurrency.processutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.575 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.577 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.578 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.612 2 DEBUG nova.storage.rbd_utils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] rbd image 016bb555-dc0d-42a6-9e52-552a2e62a3ed_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.618 2 DEBUG oslo_concurrency.processutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 016bb555-dc0d-42a6-9e52-552a2e62a3ed_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:05:22 np0005465988 nova_compute[236126]: 2025-10-02 13:05:22.703 2 DEBUG nova.policy [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '10c60eb2034e4ded8a792115857927ff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fe4f31859f5d412a94d15bbb07e1e35f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 09:05:23 np0005465988 ceph-mds[84851]: mds.beacon.cephfs.compute-2.gpiyct missed beacon ack from the monitors
Oct  2 09:05:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:23.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:23 np0005465988 nova_compute[236126]: 2025-10-02 13:05:23.752 2 DEBUG nova.network.neutron [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Successfully created port: 66ed1f18-0610-4138-8cea-79f080445c81 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 09:05:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:24.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:24 np0005465988 nova_compute[236126]: 2025-10-02 13:05:24.747 2 DEBUG nova.network.neutron [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Successfully updated port: 66ed1f18-0610-4138-8cea-79f080445c81 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 09:05:24 np0005465988 nova_compute[236126]: 2025-10-02 13:05:24.768 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Acquiring lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:05:24 np0005465988 nova_compute[236126]: 2025-10-02 13:05:24.768 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Acquired lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:05:24 np0005465988 nova_compute[236126]: 2025-10-02 13:05:24.768 2 DEBUG nova.network.neutron [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 09:05:24 np0005465988 nova_compute[236126]: 2025-10-02 13:05:24.910 2 DEBUG nova.compute.manager [req-5dad5610-5018-41ab-87b4-845e0d1fa9bc req-99cedaa7-652f-456c-919d-fcdb3b500817 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-changed-66ed1f18-0610-4138-8cea-79f080445c81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:05:24 np0005465988 nova_compute[236126]: 2025-10-02 13:05:24.911 2 DEBUG nova.compute.manager [req-5dad5610-5018-41ab-87b4-845e0d1fa9bc req-99cedaa7-652f-456c-919d-fcdb3b500817 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Refreshing instance network info cache due to event network-changed-66ed1f18-0610-4138-8cea-79f080445c81. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:05:24 np0005465988 nova_compute[236126]: 2025-10-02 13:05:24.911 2 DEBUG oslo_concurrency.lockutils [req-5dad5610-5018-41ab-87b4-845e0d1fa9bc req-99cedaa7-652f-456c-919d-fcdb3b500817 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:05:24 np0005465988 nova_compute[236126]: 2025-10-02 13:05:24.984 2 DEBUG nova.network.neutron [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 09:05:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:25.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:05:25 np0005465988 nova_compute[236126]: 2025-10-02 13:05:25.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:25 np0005465988 nova_compute[236126]: 2025-10-02 13:05:25.871 2 DEBUG nova.network.neutron [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Updating instance_info_cache with network_info: [{"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:05:25 np0005465988 nova_compute[236126]: 2025-10-02 13:05:25.893 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Releasing lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:05:25 np0005465988 nova_compute[236126]: 2025-10-02 13:05:25.893 2 DEBUG nova.compute.manager [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Instance network_info: |[{"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 09:05:25 np0005465988 nova_compute[236126]: 2025-10-02 13:05:25.894 2 DEBUG oslo_concurrency.lockutils [req-5dad5610-5018-41ab-87b4-845e0d1fa9bc req-99cedaa7-652f-456c-919d-fcdb3b500817 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:05:25 np0005465988 nova_compute[236126]: 2025-10-02 13:05:25.894 2 DEBUG nova.network.neutron [req-5dad5610-5018-41ab-87b4-845e0d1fa9bc req-99cedaa7-652f-456c-919d-fcdb3b500817 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Refreshing network info cache for port 66ed1f18-0610-4138-8cea-79f080445c81 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:05:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:26.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:26 np0005465988 nova_compute[236126]: 2025-10-02 13:05:26.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:26 np0005465988 nova_compute[236126]: 2025-10-02 13:05:26.673 2 DEBUG oslo_concurrency.processutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 016bb555-dc0d-42a6-9e52-552a2e62a3ed_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:05:26 np0005465988 nova_compute[236126]: 2025-10-02 13:05:26.755 2 DEBUG nova.storage.rbd_utils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] resizing rbd image 016bb555-dc0d-42a6-9e52-552a2e62a3ed_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.257 2 DEBUG nova.objects.instance [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lazy-loading 'migration_context' on Instance uuid 016bb555-dc0d-42a6-9e52-552a2e62a3ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.271 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.272 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Ensure instance console log exists: /var/lib/nova/instances/016bb555-dc0d-42a6-9e52-552a2e62a3ed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.273 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.273 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.274 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.277 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Start _get_guest_xml network_info=[{"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.283 2 WARNING nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.290 2 DEBUG nova.virt.libvirt.host [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.291 2 DEBUG nova.virt.libvirt.host [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.297 2 DEBUG nova.virt.libvirt.host [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.298 2 DEBUG nova.virt.libvirt.host [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.300 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.300 2 DEBUG nova.virt.hardware [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.301 2 DEBUG nova.virt.hardware [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.301 2 DEBUG nova.virt.hardware [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.301 2 DEBUG nova.virt.hardware [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.301 2 DEBUG nova.virt.hardware [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.302 2 DEBUG nova.virt.hardware [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.302 2 DEBUG nova.virt.hardware [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.302 2 DEBUG nova.virt.hardware [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.303 2 DEBUG nova.virt.hardware [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.303 2 DEBUG nova.virt.hardware [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.303 2 DEBUG nova.virt.hardware [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.306 2 DEBUG oslo_concurrency.processutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:05:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:27.411 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:27.412 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:27.412 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.433 2 DEBUG nova.network.neutron [req-5dad5610-5018-41ab-87b4-845e0d1fa9bc req-99cedaa7-652f-456c-919d-fcdb3b500817 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Updated VIF entry in instance network info cache for port 66ed1f18-0610-4138-8cea-79f080445c81. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.434 2 DEBUG nova.network.neutron [req-5dad5610-5018-41ab-87b4-845e0d1fa9bc req-99cedaa7-652f-456c-919d-fcdb3b500817 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Updating instance_info_cache with network_info: [{"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.460 2 DEBUG oslo_concurrency.lockutils [req-5dad5610-5018-41ab-87b4-845e0d1fa9bc req-99cedaa7-652f-456c-919d-fcdb3b500817 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:05:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:27.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:05:27 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2288016458' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.776 2 DEBUG oslo_concurrency.processutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.804 2 DEBUG nova.storage.rbd_utils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] rbd image 016bb555-dc0d-42a6-9e52-552a2e62a3ed_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:05:27 np0005465988 nova_compute[236126]: 2025-10-02 13:05:27.810 2 DEBUG oslo_concurrency.processutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:05:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:05:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/868795829' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.287 2 DEBUG oslo_concurrency.processutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.291 2 DEBUG nova.virt.libvirt.vif [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:05:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1641251003',display_name='tempest-TestServerAdvancedOps-server-1641251003',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1641251003',id=205,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe4f31859f5d412a94d15bbb07e1e35f',ramdisk_id='',reservation_id='r-g41yb073',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-137168702',owner_user_name='tempest-TestServerAdvancedOps-137168702-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:05:22Z,user_data=None,user_id='10c60eb2034e4ded8a792115857927ff',uuid=016bb555-dc0d-42a6-9e52-552a2e62a3ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.292 2 DEBUG nova.network.os_vif_util [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Converting VIF {"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.293 2 DEBUG nova.network.os_vif_util [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:af:6b,bridge_name='br-int',has_traffic_filtering=True,id=66ed1f18-0610-4138-8cea-79f080445c81,network=Network(caee6415-691f-4a45-b08e-98a3dc829d14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ed1f18-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.294 2 DEBUG nova.objects.instance [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lazy-loading 'pci_devices' on Instance uuid 016bb555-dc0d-42a6-9e52-552a2e62a3ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.312 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] End _get_guest_xml xml=<domain type="kvm">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  <uuid>016bb555-dc0d-42a6-9e52-552a2e62a3ed</uuid>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  <name>instance-000000cd</name>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestServerAdvancedOps-server-1641251003</nova:name>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 13:05:27</nova:creationTime>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <nova:user uuid="10c60eb2034e4ded8a792115857927ff">tempest-TestServerAdvancedOps-137168702-project-member</nova:user>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <nova:project uuid="fe4f31859f5d412a94d15bbb07e1e35f">tempest-TestServerAdvancedOps-137168702</nova:project>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <nova:port uuid="66ed1f18-0610-4138-8cea-79f080445c81">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <system>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <entry name="serial">016bb555-dc0d-42a6-9e52-552a2e62a3ed</entry>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <entry name="uuid">016bb555-dc0d-42a6-9e52-552a2e62a3ed</entry>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    </system>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  <os>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  </os>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  <features>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  </features>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  </clock>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  <devices>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/016bb555-dc0d-42a6-9e52-552a2e62a3ed_disk">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/016bb555-dc0d-42a6-9e52-552a2e62a3ed_disk.config">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:06:af:6b"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <target dev="tap66ed1f18-06"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    </interface>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/016bb555-dc0d-42a6-9e52-552a2e62a3ed/console.log" append="off"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    </serial>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <video>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    </video>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    </rng>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 09:05:28 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 09:05:28 np0005465988 nova_compute[236126]:  </devices>
Oct  2 09:05:28 np0005465988 nova_compute[236126]: </domain>
Oct  2 09:05:28 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.314 2 DEBUG nova.compute.manager [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Preparing to wait for external event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.314 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.314 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.315 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.316 2 DEBUG nova.virt.libvirt.vif [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:05:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1641251003',display_name='tempest-TestServerAdvancedOps-server-1641251003',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1641251003',id=205,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe4f31859f5d412a94d15bbb07e1e35f',ramdisk_id='',reservation_id='r-g41yb073',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-137168702',owner_user_name='tempest-TestServerAdvancedOps-137168702-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:05:22Z,user_data=None,user_id='10c60eb2034e4ded8a792115857927ff',uuid=016bb555-dc0d-42a6-9e52-552a2e62a3ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.316 2 DEBUG nova.network.os_vif_util [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Converting VIF {"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.317 2 DEBUG nova.network.os_vif_util [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:af:6b,bridge_name='br-int',has_traffic_filtering=True,id=66ed1f18-0610-4138-8cea-79f080445c81,network=Network(caee6415-691f-4a45-b08e-98a3dc829d14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ed1f18-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.317 2 DEBUG os_vif [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:af:6b,bridge_name='br-int',has_traffic_filtering=True,id=66ed1f18-0610-4138-8cea-79f080445c81,network=Network(caee6415-691f-4a45-b08e-98a3dc829d14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ed1f18-06') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.319 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.319 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.324 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66ed1f18-06, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.324 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap66ed1f18-06, col_values=(('external_ids', {'iface-id': '66ed1f18-0610-4138-8cea-79f080445c81', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:af:6b', 'vm-uuid': '016bb555-dc0d-42a6-9e52-552a2e62a3ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:28 np0005465988 NetworkManager[45041]: <info>  [1759410328.3280] manager: (tap66ed1f18-06): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/402)
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.338 2 INFO os_vif [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:af:6b,bridge_name='br-int',has_traffic_filtering=True,id=66ed1f18-0610-4138-8cea-79f080445c81,network=Network(caee6415-691f-4a45-b08e-98a3dc829d14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ed1f18-06')#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.391 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.392 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.392 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] No VIF found with MAC fa:16:3e:06:af:6b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.392 2 INFO nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Using config drive#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.416 2 DEBUG nova.storage.rbd_utils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] rbd image 016bb555-dc0d-42a6-9e52-552a2e62a3ed_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:05:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:28.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.883 2 INFO nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Creating config drive at /var/lib/nova/instances/016bb555-dc0d-42a6-9e52-552a2e62a3ed/disk.config#033[00m
Oct  2 09:05:28 np0005465988 nova_compute[236126]: 2025-10-02 13:05:28.888 2 DEBUG oslo_concurrency.processutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/016bb555-dc0d-42a6-9e52-552a2e62a3ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpot1068sg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:05:29 np0005465988 nova_compute[236126]: 2025-10-02 13:05:29.048 2 DEBUG oslo_concurrency.processutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/016bb555-dc0d-42a6-9e52-552a2e62a3ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpot1068sg" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 09:05:29 np0005465988 nova_compute[236126]: 2025-10-02 13:05:29.081 2 DEBUG nova.storage.rbd_utils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] rbd image 016bb555-dc0d-42a6-9e52-552a2e62a3ed_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 09:05:29 np0005465988 nova_compute[236126]: 2025-10-02 13:05:29.085 2 DEBUG oslo_concurrency.processutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/016bb555-dc0d-42a6-9e52-552a2e62a3ed/disk.config 016bb555-dc0d-42a6-9e52-552a2e62a3ed_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:05:29 np0005465988 nova_compute[236126]: 2025-10-02 13:05:29.285 2 DEBUG oslo_concurrency.processutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/016bb555-dc0d-42a6-9e52-552a2e62a3ed/disk.config 016bb555-dc0d-42a6-9e52-552a2e62a3ed_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.200s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 09:05:29 np0005465988 nova_compute[236126]: 2025-10-02 13:05:29.287 2 INFO nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Deleting local config drive /var/lib/nova/instances/016bb555-dc0d-42a6-9e52-552a2e62a3ed/disk.config because it was imported into RBD.
Oct  2 09:05:29 np0005465988 kernel: tap66ed1f18-06: entered promiscuous mode
Oct  2 09:05:29 np0005465988 NetworkManager[45041]: <info>  [1759410329.4316] manager: (tap66ed1f18-06): new Tun device (/org/freedesktop/NetworkManager/Devices/403)
Oct  2 09:05:29 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:29Z|00915|binding|INFO|Claiming lport 66ed1f18-0610-4138-8cea-79f080445c81 for this chassis.
Oct  2 09:05:29 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:29Z|00916|binding|INFO|66ed1f18-0610-4138-8cea-79f080445c81: Claiming fa:16:3e:06:af:6b 10.100.0.8
Oct  2 09:05:29 np0005465988 nova_compute[236126]: 2025-10-02 13:05:29.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:05:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:29.445 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:af:6b 10.100.0.8'], port_security=['fa:16:3e:06:af:6b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '016bb555-dc0d-42a6-9e52-552a2e62a3ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-caee6415-691f-4a45-b08e-98a3dc829d14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe4f31859f5d412a94d15bbb07e1e35f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3b3f0d2e-39dd-4344-b55a-04abff5943c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44d7bb4d-6845-4f95-bb66-6c2a2f581889, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=66ed1f18-0610-4138-8cea-79f080445c81) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 09:05:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:29.446 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 66ed1f18-0610-4138-8cea-79f080445c81 in datapath caee6415-691f-4a45-b08e-98a3dc829d14 bound to our chassis
Oct  2 09:05:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:29.447 142124 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network caee6415-691f-4a45-b08e-98a3dc829d14 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct  2 09:05:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:29.450 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[86bfc7ec-4a77-4e2c-84e2-c101d1714b43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 09:05:29 np0005465988 systemd-udevd[332856]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:05:29 np0005465988 systemd-machined[192594]: New machine qemu-95-instance-000000cd.
Oct  2 09:05:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:29.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:29 np0005465988 systemd[1]: Started Virtual Machine qemu-95-instance-000000cd.
Oct  2 09:05:29 np0005465988 nova_compute[236126]: 2025-10-02 13:05:29.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:05:29 np0005465988 NetworkManager[45041]: <info>  [1759410329.4884] device (tap66ed1f18-06): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:05:29 np0005465988 NetworkManager[45041]: <info>  [1759410329.4895] device (tap66ed1f18-06): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:05:29 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:29Z|00917|binding|INFO|Setting lport 66ed1f18-0610-4138-8cea-79f080445c81 ovn-installed in OVS
Oct  2 09:05:29 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:29Z|00918|binding|INFO|Setting lport 66ed1f18-0610-4138-8cea-79f080445c81 up in Southbound
Oct  2 09:05:29 np0005465988 nova_compute[236126]: 2025-10-02 13:05:29.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.033 2 DEBUG nova.compute.manager [req-7e4d6f67-34be-470d-aecd-232f7ba21c3c req-5f31444e-a626-496d-8d0c-b0cde1426c85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.035 2 DEBUG oslo_concurrency.lockutils [req-7e4d6f67-34be-470d-aecd-232f7ba21c3c req-5f31444e-a626-496d-8d0c-b0cde1426c85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.035 2 DEBUG oslo_concurrency.lockutils [req-7e4d6f67-34be-470d-aecd-232f7ba21c3c req-5f31444e-a626-496d-8d0c-b0cde1426c85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.036 2 DEBUG oslo_concurrency.lockutils [req-7e4d6f67-34be-470d-aecd-232f7ba21c3c req-5f31444e-a626-496d-8d0c-b0cde1426c85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.036 2 DEBUG nova.compute.manager [req-7e4d6f67-34be-470d-aecd-232f7ba21c3c req-5f31444e-a626-496d-8d0c-b0cde1426c85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Processing event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.036 2 DEBUG nova.compute.manager [req-7e4d6f67-34be-470d-aecd-232f7ba21c3c req-5f31444e-a626-496d-8d0c-b0cde1426c85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.037 2 DEBUG oslo_concurrency.lockutils [req-7e4d6f67-34be-470d-aecd-232f7ba21c3c req-5f31444e-a626-496d-8d0c-b0cde1426c85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.037 2 DEBUG oslo_concurrency.lockutils [req-7e4d6f67-34be-470d-aecd-232f7ba21c3c req-5f31444e-a626-496d-8d0c-b0cde1426c85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.037 2 DEBUG oslo_concurrency.lockutils [req-7e4d6f67-34be-470d-aecd-232f7ba21c3c req-5f31444e-a626-496d-8d0c-b0cde1426c85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.038 2 DEBUG nova.compute.manager [req-7e4d6f67-34be-470d-aecd-232f7ba21c3c req-5f31444e-a626-496d-8d0c-b0cde1426c85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] No waiting events found dispatching network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.038 2 WARNING nova.compute.manager [req-7e4d6f67-34be-470d-aecd-232f7ba21c3c req-5f31444e-a626-496d-8d0c-b0cde1426c85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received unexpected event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 for instance with vm_state building and task_state spawning.
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.420 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410330.418976, 016bb555-dc0d-42a6-9e52-552a2e62a3ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.420 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] VM Started (Lifecycle Event)
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.423 2 DEBUG nova.compute.manager [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.428 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.432 2 INFO nova.virt.libvirt.driver [-] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Instance spawned successfully.
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.432 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 09:05:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:30.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:05:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.450 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.457 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.463 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.463 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.464 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.464 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.464 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.465 2 DEBUG nova.virt.libvirt.driver [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.476 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.477 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410330.4231393, 016bb555-dc0d-42a6-9e52-552a2e62a3ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.477 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] VM Paused (Lifecycle Event)
Oct  2 09:05:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.545 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.550 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410330.426501, 016bb555-dc0d-42a6-9e52-552a2e62a3ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.550 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] VM Resumed (Lifecycle Event)
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.595 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.600 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.613 2 INFO nova.compute.manager [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Took 8.23 seconds to spawn the instance on the hypervisor.
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.613 2 DEBUG nova.compute.manager [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.623 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.695 2 INFO nova.compute.manager [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Took 9.25 seconds to build instance.
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.722 2 DEBUG oslo_concurrency.lockutils [None req-2e6341cf-5237-4286-891c-8d74fea1d784 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:05:30 np0005465988 nova_compute[236126]: 2025-10-02 13:05:30.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:05:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:05:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:31.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:05:31 np0005465988 nova_compute[236126]: 2025-10-02 13:05:31.812 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:05:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:32.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:32 np0005465988 nova_compute[236126]: 2025-10-02 13:05:32.476 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:05:33 np0005465988 nova_compute[236126]: 2025-10-02 13:05:33.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:05:33 np0005465988 nova_compute[236126]: 2025-10-02 13:05:33.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:05:33 np0005465988 nova_compute[236126]: 2025-10-02 13:05:33.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 09:05:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:33.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:33 np0005465988 podman[332959]: 2025-10-02 13:05:33.583810658 +0000 UTC m=+0.097083057 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 09:05:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:34.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:34 np0005465988 nova_compute[236126]: 2025-10-02 13:05:34.470 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:05:34 np0005465988 nova_compute[236126]: 2025-10-02 13:05:34.948 2 DEBUG nova.objects.instance [None req-8bf8b6b2-1148-48dd-8a1f-87775b693488 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lazy-loading 'pci_devices' on Instance uuid 016bb555-dc0d-42a6-9e52-552a2e62a3ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 09:05:34 np0005465988 nova_compute[236126]: 2025-10-02 13:05:34.979 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410334.979385, 016bb555-dc0d-42a6-9e52-552a2e62a3ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 09:05:34 np0005465988 nova_compute[236126]: 2025-10-02 13:05:34.980 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] VM Paused (Lifecycle Event)
Oct  2 09:05:35 np0005465988 nova_compute[236126]: 2025-10-02 13:05:34.999 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 09:05:35 np0005465988 nova_compute[236126]: 2025-10-02 13:05:35.006 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 09:05:35 np0005465988 nova_compute[236126]: 2025-10-02 13:05:35.034 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] During sync_power_state the instance has a pending task (suspending). Skip.
Oct  2 09:05:35 np0005465988 nova_compute[236126]: 2025-10-02 13:05:35.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:05:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:35.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:35 np0005465988 kernel: tap66ed1f18-06 (unregistering): left promiscuous mode
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:05:35 np0005465988 NetworkManager[45041]: <info>  [1759410335.5281] device (tap66ed1f18-06): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:05:35 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:35Z|00919|binding|INFO|Releasing lport 66ed1f18-0610-4138-8cea-79f080445c81 from this chassis (sb_readonly=0)
Oct  2 09:05:35 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:35Z|00920|binding|INFO|Setting lport 66ed1f18-0610-4138-8cea-79f080445c81 down in Southbound
Oct  2 09:05:35 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:35Z|00921|binding|INFO|Removing iface tap66ed1f18-06 ovn-installed in OVS
Oct  2 09:05:35 np0005465988 nova_compute[236126]: 2025-10-02 13:05:35.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:35.548 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:af:6b 10.100.0.8'], port_security=['fa:16:3e:06:af:6b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '016bb555-dc0d-42a6-9e52-552a2e62a3ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-caee6415-691f-4a45-b08e-98a3dc829d14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe4f31859f5d412a94d15bbb07e1e35f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3b3f0d2e-39dd-4344-b55a-04abff5943c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44d7bb4d-6845-4f95-bb66-6c2a2f581889, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=66ed1f18-0610-4138-8cea-79f080445c81) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:05:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:35.549 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 66ed1f18-0610-4138-8cea-79f080445c81 in datapath caee6415-691f-4a45-b08e-98a3dc829d14 unbound from our chassis#033[00m
Oct  2 09:05:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:35.550 142124 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network caee6415-691f-4a45-b08e-98a3dc829d14 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 09:05:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:35.552 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[df3f0d29-bd7c-4984-871c-6cb8db51665f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:05:35 np0005465988 nova_compute[236126]: 2025-10-02 13:05:35.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:35 np0005465988 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000cd.scope: Deactivated successfully.
Oct  2 09:05:35 np0005465988 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000cd.scope: Consumed 5.651s CPU time.
Oct  2 09:05:35 np0005465988 systemd-machined[192594]: Machine qemu-95-instance-000000cd terminated.
Oct  2 09:05:35 np0005465988 nova_compute[236126]: 2025-10-02 13:05:35.689 2 DEBUG nova.compute.manager [None req-8bf8b6b2-1148-48dd-8a1f-87775b693488 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #157. Immutable memtables: 0.
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.788590) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 99] Flushing memtable with next log file: 157
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410335788680, "job": 99, "event": "flush_started", "num_memtables": 1, "num_entries": 571, "num_deletes": 252, "total_data_size": 890718, "memory_usage": 901504, "flush_reason": "Manual Compaction"}
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 99] Level-0 flush table #158: started
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410335793255, "cf_name": "default", "job": 99, "event": "table_file_creation", "file_number": 158, "file_size": 503389, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 76901, "largest_seqno": 77467, "table_properties": {"data_size": 500449, "index_size": 911, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7949, "raw_average_key_size": 21, "raw_value_size": 494372, "raw_average_value_size": 1314, "num_data_blocks": 38, "num_entries": 376, "num_filter_entries": 376, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759410307, "oldest_key_time": 1759410307, "file_creation_time": 1759410335, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 158, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 99] Flush lasted 4702 microseconds, and 2686 cpu microseconds.
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.793307) [db/flush_job.cc:967] [default] [JOB 99] Level-0 flush table #158: 503389 bytes OK
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.793329) [db/memtable_list.cc:519] [default] Level-0 commit table #158 started
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.795518) [db/memtable_list.cc:722] [default] Level-0 commit table #158: memtable #1 done
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.795533) EVENT_LOG_v1 {"time_micros": 1759410335795527, "job": 99, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.795553) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 99] Try to delete WAL files size 887405, prev total WAL file size 887405, number of live WAL files 2.
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000154.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.796221) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032353130' seq:72057594037927935, type:22 .. '6D6772737461740032373633' seq:0, type:0; will stop at (end)
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 100] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 99 Base level 0, inputs: [158(491KB)], [156(13MB)]
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410335796260, "job": 100, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [158], "files_L6": [156], "score": -1, "input_data_size": 14293141, "oldest_snapshot_seqno": -1}
Oct  2 09:05:35 np0005465988 nova_compute[236126]: 2025-10-02 13:05:35.822 2 DEBUG nova.compute.manager [req-9bbca015-08e7-450c-8e25-779a6879ec63 req-42044363-1cf6-4ccb-90c7-e25d44a58548 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-vif-unplugged-66ed1f18-0610-4138-8cea-79f080445c81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:05:35 np0005465988 nova_compute[236126]: 2025-10-02 13:05:35.822 2 DEBUG oslo_concurrency.lockutils [req-9bbca015-08e7-450c-8e25-779a6879ec63 req-42044363-1cf6-4ccb-90c7-e25d44a58548 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:35 np0005465988 nova_compute[236126]: 2025-10-02 13:05:35.822 2 DEBUG oslo_concurrency.lockutils [req-9bbca015-08e7-450c-8e25-779a6879ec63 req-42044363-1cf6-4ccb-90c7-e25d44a58548 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:35 np0005465988 nova_compute[236126]: 2025-10-02 13:05:35.822 2 DEBUG oslo_concurrency.lockutils [req-9bbca015-08e7-450c-8e25-779a6879ec63 req-42044363-1cf6-4ccb-90c7-e25d44a58548 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:35 np0005465988 nova_compute[236126]: 2025-10-02 13:05:35.822 2 DEBUG nova.compute.manager [req-9bbca015-08e7-450c-8e25-779a6879ec63 req-42044363-1cf6-4ccb-90c7-e25d44a58548 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] No waiting events found dispatching network-vif-unplugged-66ed1f18-0610-4138-8cea-79f080445c81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:05:35 np0005465988 nova_compute[236126]: 2025-10-02 13:05:35.823 2 WARNING nova.compute.manager [req-9bbca015-08e7-450c-8e25-779a6879ec63 req-42044363-1cf6-4ccb-90c7-e25d44a58548 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received unexpected event network-vif-unplugged-66ed1f18-0610-4138-8cea-79f080445c81 for instance with vm_state suspended and task_state None.#033[00m
Oct  2 09:05:35 np0005465988 nova_compute[236126]: 2025-10-02 13:05:35.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 100] Generated table #159: 9603 keys, 10455411 bytes, temperature: kUnknown
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410335868637, "cf_name": "default", "job": 100, "event": "table_file_creation", "file_number": 159, "file_size": 10455411, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10396168, "index_size": 34106, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24069, "raw_key_size": 254400, "raw_average_key_size": 26, "raw_value_size": 10230603, "raw_average_value_size": 1065, "num_data_blocks": 1286, "num_entries": 9603, "num_filter_entries": 9603, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759410335, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 159, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.869019) [db/compaction/compaction_job.cc:1663] [default] [JOB 100] Compacted 1@0 + 1@6 files to L6 => 10455411 bytes
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.880289) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 197.2 rd, 144.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 13.2 +0.0 blob) out(10.0 +0.0 blob), read-write-amplify(49.2) write-amplify(20.8) OK, records in: 10120, records dropped: 517 output_compression: NoCompression
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.880349) EVENT_LOG_v1 {"time_micros": 1759410335880327, "job": 100, "event": "compaction_finished", "compaction_time_micros": 72492, "compaction_time_cpu_micros": 31551, "output_level": 6, "num_output_files": 1, "total_output_size": 10455411, "num_input_records": 10120, "num_output_records": 9603, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000158.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410335881148, "job": 100, "event": "table_file_deletion", "file_number": 158}
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000156.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410335886467, "job": 100, "event": "table_file_deletion", "file_number": 156}
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.796073) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.886535) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.886558) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.886563) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.886567) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:05:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:05:35.886571) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:05:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:36.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:36 np0005465988 nova_compute[236126]: 2025-10-02 13:05:36.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:37 np0005465988 nova_compute[236126]: 2025-10-02 13:05:37.256 2 INFO nova.compute.manager [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Resuming#033[00m
Oct  2 09:05:37 np0005465988 nova_compute[236126]: 2025-10-02 13:05:37.257 2 DEBUG nova.objects.instance [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lazy-loading 'flavor' on Instance uuid 016bb555-dc0d-42a6-9e52-552a2e62a3ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:05:37 np0005465988 nova_compute[236126]: 2025-10-02 13:05:37.305 2 DEBUG oslo_concurrency.lockutils [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Acquiring lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:05:37 np0005465988 nova_compute[236126]: 2025-10-02 13:05:37.306 2 DEBUG oslo_concurrency.lockutils [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Acquired lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:05:37 np0005465988 nova_compute[236126]: 2025-10-02 13:05:37.306 2 DEBUG nova.network.neutron [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 09:05:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:37.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:37 np0005465988 nova_compute[236126]: 2025-10-02 13:05:37.972 2 DEBUG nova.compute.manager [req-336670d7-ba6a-4d23-bf03-d9a82d561314 req-465109cf-78d1-4864-9d51-e254f2706427 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:05:37 np0005465988 nova_compute[236126]: 2025-10-02 13:05:37.973 2 DEBUG oslo_concurrency.lockutils [req-336670d7-ba6a-4d23-bf03-d9a82d561314 req-465109cf-78d1-4864-9d51-e254f2706427 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:37 np0005465988 nova_compute[236126]: 2025-10-02 13:05:37.973 2 DEBUG oslo_concurrency.lockutils [req-336670d7-ba6a-4d23-bf03-d9a82d561314 req-465109cf-78d1-4864-9d51-e254f2706427 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:37 np0005465988 nova_compute[236126]: 2025-10-02 13:05:37.974 2 DEBUG oslo_concurrency.lockutils [req-336670d7-ba6a-4d23-bf03-d9a82d561314 req-465109cf-78d1-4864-9d51-e254f2706427 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:37 np0005465988 nova_compute[236126]: 2025-10-02 13:05:37.974 2 DEBUG nova.compute.manager [req-336670d7-ba6a-4d23-bf03-d9a82d561314 req-465109cf-78d1-4864-9d51-e254f2706427 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] No waiting events found dispatching network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:05:37 np0005465988 nova_compute[236126]: 2025-10-02 13:05:37.974 2 WARNING nova.compute.manager [req-336670d7-ba6a-4d23-bf03-d9a82d561314 req-465109cf-78d1-4864-9d51-e254f2706427 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received unexpected event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 09:05:38 np0005465988 nova_compute[236126]: 2025-10-02 13:05:38.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:38.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.032 2 DEBUG nova.network.neutron [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Updating instance_info_cache with network_info: [{"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.061 2 DEBUG oslo_concurrency.lockutils [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Releasing lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.070 2 DEBUG nova.virt.libvirt.vif [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:05:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1641251003',display_name='tempest-TestServerAdvancedOps-server-1641251003',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1641251003',id=205,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:05:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='fe4f31859f5d412a94d15bbb07e1e35f',ramdisk_id='',reservation_id='r-g41yb073',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-137168702',owner_user_name='tempest-TestServerAdvancedOps-137168702-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:05:35Z,user_data=None,user_id='10c60eb2034e4ded8a792115857927ff',uuid=016bb555-dc0d-42a6-9e52-552a2e62a3ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.071 2 DEBUG nova.network.os_vif_util [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Converting VIF {"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.073 2 DEBUG nova.network.os_vif_util [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:af:6b,bridge_name='br-int',has_traffic_filtering=True,id=66ed1f18-0610-4138-8cea-79f080445c81,network=Network(caee6415-691f-4a45-b08e-98a3dc829d14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ed1f18-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.073 2 DEBUG os_vif [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:af:6b,bridge_name='br-int',has_traffic_filtering=True,id=66ed1f18-0610-4138-8cea-79f080445c81,network=Network(caee6415-691f-4a45-b08e-98a3dc829d14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ed1f18-06') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.075 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.076 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.082 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66ed1f18-06, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.082 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap66ed1f18-06, col_values=(('external_ids', {'iface-id': '66ed1f18-0610-4138-8cea-79f080445c81', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:af:6b', 'vm-uuid': '016bb555-dc0d-42a6-9e52-552a2e62a3ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.083 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.084 2 INFO os_vif [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:af:6b,bridge_name='br-int',has_traffic_filtering=True,id=66ed1f18-0610-4138-8cea-79f080445c81,network=Network(caee6415-691f-4a45-b08e-98a3dc829d14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ed1f18-06')#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.119 2 DEBUG nova.objects.instance [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lazy-loading 'numa_topology' on Instance uuid 016bb555-dc0d-42a6-9e52-552a2e62a3ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:05:39 np0005465988 kernel: tap66ed1f18-06: entered promiscuous mode
Oct  2 09:05:39 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:39Z|00922|binding|INFO|Claiming lport 66ed1f18-0610-4138-8cea-79f080445c81 for this chassis.
Oct  2 09:05:39 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:39Z|00923|binding|INFO|66ed1f18-0610-4138-8cea-79f080445c81: Claiming fa:16:3e:06:af:6b 10.100.0.8
Oct  2 09:05:39 np0005465988 NetworkManager[45041]: <info>  [1759410339.2254] manager: (tap66ed1f18-06): new Tun device (/org/freedesktop/NetworkManager/Devices/404)
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:39.235 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:af:6b 10.100.0.8'], port_security=['fa:16:3e:06:af:6b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '016bb555-dc0d-42a6-9e52-552a2e62a3ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-caee6415-691f-4a45-b08e-98a3dc829d14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe4f31859f5d412a94d15bbb07e1e35f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '3b3f0d2e-39dd-4344-b55a-04abff5943c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44d7bb4d-6845-4f95-bb66-6c2a2f581889, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=66ed1f18-0610-4138-8cea-79f080445c81) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:05:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:39.239 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 66ed1f18-0610-4138-8cea-79f080445c81 in datapath caee6415-691f-4a45-b08e-98a3dc829d14 bound to our chassis#033[00m
Oct  2 09:05:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:39.240 142124 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network caee6415-691f-4a45-b08e-98a3dc829d14 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 09:05:39 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:39.241 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e12e7603-97ec-4386-ae5e-03a2945b740a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:05:39 np0005465988 systemd-udevd[333016]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:05:39 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:39Z|00924|binding|INFO|Setting lport 66ed1f18-0610-4138-8cea-79f080445c81 ovn-installed in OVS
Oct  2 09:05:39 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:39Z|00925|binding|INFO|Setting lport 66ed1f18-0610-4138-8cea-79f080445c81 up in Southbound
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:39 np0005465988 nova_compute[236126]: 2025-10-02 13:05:39.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:39 np0005465988 systemd-machined[192594]: New machine qemu-96-instance-000000cd.
Oct  2 09:05:39 np0005465988 NetworkManager[45041]: <info>  [1759410339.2840] device (tap66ed1f18-06): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:05:39 np0005465988 NetworkManager[45041]: <info>  [1759410339.2855] device (tap66ed1f18-06): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:05:39 np0005465988 systemd[1]: Started Virtual Machine qemu-96-instance-000000cd.
Oct  2 09:05:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:39.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.082 2 DEBUG nova.compute.manager [req-c03031bb-6c5c-432d-b11c-4aab354a6398 req-8278b0b6-8ac5-4d2b-a8cf-3751a8ab1c99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.084 2 DEBUG oslo_concurrency.lockutils [req-c03031bb-6c5c-432d-b11c-4aab354a6398 req-8278b0b6-8ac5-4d2b-a8cf-3751a8ab1c99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.084 2 DEBUG oslo_concurrency.lockutils [req-c03031bb-6c5c-432d-b11c-4aab354a6398 req-8278b0b6-8ac5-4d2b-a8cf-3751a8ab1c99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.084 2 DEBUG oslo_concurrency.lockutils [req-c03031bb-6c5c-432d-b11c-4aab354a6398 req-8278b0b6-8ac5-4d2b-a8cf-3751a8ab1c99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.084 2 DEBUG nova.compute.manager [req-c03031bb-6c5c-432d-b11c-4aab354a6398 req-8278b0b6-8ac5-4d2b-a8cf-3751a8ab1c99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] No waiting events found dispatching network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.084 2 WARNING nova.compute.manager [req-c03031bb-6c5c-432d-b11c-4aab354a6398 req-8278b0b6-8ac5-4d2b-a8cf-3751a8ab1c99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received unexpected event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.085 2 DEBUG nova.compute.manager [req-c03031bb-6c5c-432d-b11c-4aab354a6398 req-8278b0b6-8ac5-4d2b-a8cf-3751a8ab1c99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.085 2 DEBUG oslo_concurrency.lockutils [req-c03031bb-6c5c-432d-b11c-4aab354a6398 req-8278b0b6-8ac5-4d2b-a8cf-3751a8ab1c99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.085 2 DEBUG oslo_concurrency.lockutils [req-c03031bb-6c5c-432d-b11c-4aab354a6398 req-8278b0b6-8ac5-4d2b-a8cf-3751a8ab1c99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.085 2 DEBUG oslo_concurrency.lockutils [req-c03031bb-6c5c-432d-b11c-4aab354a6398 req-8278b0b6-8ac5-4d2b-a8cf-3751a8ab1c99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.085 2 DEBUG nova.compute.manager [req-c03031bb-6c5c-432d-b11c-4aab354a6398 req-8278b0b6-8ac5-4d2b-a8cf-3751a8ab1c99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] No waiting events found dispatching network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.085 2 WARNING nova.compute.manager [req-c03031bb-6c5c-432d-b11c-4aab354a6398 req-8278b0b6-8ac5-4d2b-a8cf-3751a8ab1c99 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received unexpected event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 09:05:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:40.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.491 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.492 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.492 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.492 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 016bb555-dc0d-42a6-9e52-552a2e62a3ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.520 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 016bb555-dc0d-42a6-9e52-552a2e62a3ed due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.521 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410340.5199435, 016bb555-dc0d-42a6-9e52-552a2e62a3ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.523 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] VM Started (Lifecycle Event)#033[00m
Oct  2 09:05:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.547 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.548 2 DEBUG nova.compute.manager [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.549 2 DEBUG nova.objects.instance [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lazy-loading 'pci_devices' on Instance uuid 016bb555-dc0d-42a6-9e52-552a2e62a3ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.554 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.573 2 INFO nova.virt.libvirt.driver [-] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Instance running successfully.#033[00m
Oct  2 09:05:40 np0005465988 virtqemud[235689]: argument unsupported: QEMU guest agent is not configured
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.577 2 DEBUG nova.virt.libvirt.guest [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.577 2 DEBUG nova.compute.manager [None req-34899ffe-fdeb-44de-82bb-a1f49b5f2e9f 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.579 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.579 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410340.5256765, 016bb555-dc0d-42a6-9e52-552a2e62a3ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.579 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] VM Resumed (Lifecycle Event)#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.620 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.625 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:05:40 np0005465988 nova_compute[236126]: 2025-10-02 13:05:40.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:41 np0005465988 nova_compute[236126]: 2025-10-02 13:05:41.348 2 DEBUG nova.objects.instance [None req-6fe080cc-2cfb-4753-a856-bfdb1059ee85 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lazy-loading 'pci_devices' on Instance uuid 016bb555-dc0d-42a6-9e52-552a2e62a3ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:05:41 np0005465988 nova_compute[236126]: 2025-10-02 13:05:41.377 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410341.374505, 016bb555-dc0d-42a6-9e52-552a2e62a3ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:05:41 np0005465988 nova_compute[236126]: 2025-10-02 13:05:41.378 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] VM Paused (Lifecycle Event)#033[00m
Oct  2 09:05:41 np0005465988 nova_compute[236126]: 2025-10-02 13:05:41.399 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:05:41 np0005465988 nova_compute[236126]: 2025-10-02 13:05:41.403 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:05:41 np0005465988 nova_compute[236126]: 2025-10-02 13:05:41.423 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Oct  2 09:05:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:41.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:41 np0005465988 nova_compute[236126]: 2025-10-02 13:05:41.724 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Updating instance_info_cache with network_info: [{"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:05:41 np0005465988 nova_compute[236126]: 2025-10-02 13:05:41.761 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:05:41 np0005465988 nova_compute[236126]: 2025-10-02 13:05:41.762 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 09:05:41 np0005465988 kernel: tap66ed1f18-06 (unregistering): left promiscuous mode
Oct  2 09:05:41 np0005465988 NetworkManager[45041]: <info>  [1759410341.9162] device (tap66ed1f18-06): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:05:41 np0005465988 nova_compute[236126]: 2025-10-02 13:05:41.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:41 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:41Z|00926|binding|INFO|Releasing lport 66ed1f18-0610-4138-8cea-79f080445c81 from this chassis (sb_readonly=0)
Oct  2 09:05:41 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:41Z|00927|binding|INFO|Setting lport 66ed1f18-0610-4138-8cea-79f080445c81 down in Southbound
Oct  2 09:05:41 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:41Z|00928|binding|INFO|Removing iface tap66ed1f18-06 ovn-installed in OVS
Oct  2 09:05:41 np0005465988 nova_compute[236126]: 2025-10-02 13:05:41.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:41.935 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:af:6b 10.100.0.8'], port_security=['fa:16:3e:06:af:6b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '016bb555-dc0d-42a6-9e52-552a2e62a3ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-caee6415-691f-4a45-b08e-98a3dc829d14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe4f31859f5d412a94d15bbb07e1e35f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '3b3f0d2e-39dd-4344-b55a-04abff5943c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44d7bb4d-6845-4f95-bb66-6c2a2f581889, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=66ed1f18-0610-4138-8cea-79f080445c81) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:05:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:41.936 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 66ed1f18-0610-4138-8cea-79f080445c81 in datapath caee6415-691f-4a45-b08e-98a3dc829d14 unbound from our chassis#033[00m
Oct  2 09:05:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:41.937 142124 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network caee6415-691f-4a45-b08e-98a3dc829d14 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 09:05:41 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:41.938 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[114a9cdd-a49f-40a9-8a9a-249ecb5daca9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:05:41 np0005465988 nova_compute[236126]: 2025-10-02 13:05:41.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:41 np0005465988 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000cd.scope: Deactivated successfully.
Oct  2 09:05:41 np0005465988 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000cd.scope: Consumed 2.020s CPU time.
Oct  2 09:05:41 np0005465988 systemd-machined[192594]: Machine qemu-96-instance-000000cd terminated.
Oct  2 09:05:42 np0005465988 NetworkManager[45041]: <info>  [1759410342.0469] manager: (tap66ed1f18-06): new Tun device (/org/freedesktop/NetworkManager/Devices/405)
Oct  2 09:05:42 np0005465988 nova_compute[236126]: 2025-10-02 13:05:42.060 2 DEBUG nova.compute.manager [None req-6fe080cc-2cfb-4753-a856-bfdb1059ee85 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:05:42 np0005465988 nova_compute[236126]: 2025-10-02 13:05:42.177 2 DEBUG nova.compute.manager [req-fb9d3e40-28cd-41f8-b41a-b357dd329bcb req-b1aa5bef-16af-4fe2-9a17-5dcfe9ee3d9a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-vif-unplugged-66ed1f18-0610-4138-8cea-79f080445c81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:05:42 np0005465988 nova_compute[236126]: 2025-10-02 13:05:42.178 2 DEBUG oslo_concurrency.lockutils [req-fb9d3e40-28cd-41f8-b41a-b357dd329bcb req-b1aa5bef-16af-4fe2-9a17-5dcfe9ee3d9a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:42 np0005465988 nova_compute[236126]: 2025-10-02 13:05:42.179 2 DEBUG oslo_concurrency.lockutils [req-fb9d3e40-28cd-41f8-b41a-b357dd329bcb req-b1aa5bef-16af-4fe2-9a17-5dcfe9ee3d9a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:42 np0005465988 nova_compute[236126]: 2025-10-02 13:05:42.179 2 DEBUG oslo_concurrency.lockutils [req-fb9d3e40-28cd-41f8-b41a-b357dd329bcb req-b1aa5bef-16af-4fe2-9a17-5dcfe9ee3d9a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:42 np0005465988 nova_compute[236126]: 2025-10-02 13:05:42.179 2 DEBUG nova.compute.manager [req-fb9d3e40-28cd-41f8-b41a-b357dd329bcb req-b1aa5bef-16af-4fe2-9a17-5dcfe9ee3d9a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] No waiting events found dispatching network-vif-unplugged-66ed1f18-0610-4138-8cea-79f080445c81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:05:42 np0005465988 nova_compute[236126]: 2025-10-02 13:05:42.180 2 WARNING nova.compute.manager [req-fb9d3e40-28cd-41f8-b41a-b357dd329bcb req-b1aa5bef-16af-4fe2-9a17-5dcfe9ee3d9a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received unexpected event network-vif-unplugged-66ed1f18-0610-4138-8cea-79f080445c81 for instance with vm_state suspended and task_state None.#033[00m
Oct  2 09:05:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:42.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:43 np0005465988 nova_compute[236126]: 2025-10-02 13:05:43.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:43.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:44 np0005465988 nova_compute[236126]: 2025-10-02 13:05:44.031 2 INFO nova.compute.manager [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Resuming#033[00m
Oct  2 09:05:44 np0005465988 nova_compute[236126]: 2025-10-02 13:05:44.032 2 DEBUG nova.objects.instance [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lazy-loading 'flavor' on Instance uuid 016bb555-dc0d-42a6-9e52-552a2e62a3ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:05:44 np0005465988 nova_compute[236126]: 2025-10-02 13:05:44.081 2 DEBUG oslo_concurrency.lockutils [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Acquiring lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:05:44 np0005465988 nova_compute[236126]: 2025-10-02 13:05:44.082 2 DEBUG oslo_concurrency.lockutils [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Acquired lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:05:44 np0005465988 nova_compute[236126]: 2025-10-02 13:05:44.083 2 DEBUG nova.network.neutron [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 09:05:44 np0005465988 nova_compute[236126]: 2025-10-02 13:05:44.256 2 DEBUG nova.compute.manager [req-5cc6577a-6aa8-4709-bf67-4a4f14b5d5b3 req-bb880ad2-e58c-42d7-84bc-397752a93e53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:05:44 np0005465988 nova_compute[236126]: 2025-10-02 13:05:44.257 2 DEBUG oslo_concurrency.lockutils [req-5cc6577a-6aa8-4709-bf67-4a4f14b5d5b3 req-bb880ad2-e58c-42d7-84bc-397752a93e53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:44 np0005465988 nova_compute[236126]: 2025-10-02 13:05:44.257 2 DEBUG oslo_concurrency.lockutils [req-5cc6577a-6aa8-4709-bf67-4a4f14b5d5b3 req-bb880ad2-e58c-42d7-84bc-397752a93e53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:44 np0005465988 nova_compute[236126]: 2025-10-02 13:05:44.258 2 DEBUG oslo_concurrency.lockutils [req-5cc6577a-6aa8-4709-bf67-4a4f14b5d5b3 req-bb880ad2-e58c-42d7-84bc-397752a93e53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:44 np0005465988 nova_compute[236126]: 2025-10-02 13:05:44.258 2 DEBUG nova.compute.manager [req-5cc6577a-6aa8-4709-bf67-4a4f14b5d5b3 req-bb880ad2-e58c-42d7-84bc-397752a93e53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] No waiting events found dispatching network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:05:44 np0005465988 nova_compute[236126]: 2025-10-02 13:05:44.258 2 WARNING nova.compute.manager [req-5cc6577a-6aa8-4709-bf67-4a4f14b5d5b3 req-bb880ad2-e58c-42d7-84bc-397752a93e53 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received unexpected event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 09:05:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:44.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:44 np0005465988 podman[333140]: 2025-10-02 13:05:44.553067996 +0000 UTC m=+0.075352831 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 09:05:44 np0005465988 podman[333141]: 2025-10-02 13:05:44.560191261 +0000 UTC m=+0.082228409 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 09:05:44 np0005465988 podman[333139]: 2025-10-02 13:05:44.594090608 +0000 UTC m=+0.116386804 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:05:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:45.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.693 2 DEBUG nova.network.neutron [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Updating instance_info_cache with network_info: [{"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.731 2 DEBUG oslo_concurrency.lockutils [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Releasing lock "refresh_cache-016bb555-dc0d-42a6-9e52-552a2e62a3ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.740 2 DEBUG nova.virt.libvirt.vif [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:05:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1641251003',display_name='tempest-TestServerAdvancedOps-server-1641251003',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1641251003',id=205,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:05:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='fe4f31859f5d412a94d15bbb07e1e35f',ramdisk_id='',reservation_id='r-g41yb073',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-137168702',owner_user_name='tempest-TestServerAdvancedOps-137168702-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:05:42Z,user_data=None,user_id='10c60eb2034e4ded8a792115857927ff',uuid=016bb555-dc0d-42a6-9e52-552a2e62a3ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.741 2 DEBUG nova.network.os_vif_util [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Converting VIF {"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.743 2 DEBUG nova.network.os_vif_util [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:af:6b,bridge_name='br-int',has_traffic_filtering=True,id=66ed1f18-0610-4138-8cea-79f080445c81,network=Network(caee6415-691f-4a45-b08e-98a3dc829d14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ed1f18-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.743 2 DEBUG os_vif [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:af:6b,bridge_name='br-int',has_traffic_filtering=True,id=66ed1f18-0610-4138-8cea-79f080445c81,network=Network(caee6415-691f-4a45-b08e-98a3dc829d14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ed1f18-06') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.744 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.744 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.748 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66ed1f18-06, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.749 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap66ed1f18-06, col_values=(('external_ids', {'iface-id': '66ed1f18-0610-4138-8cea-79f080445c81', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:af:6b', 'vm-uuid': '016bb555-dc0d-42a6-9e52-552a2e62a3ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.749 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.749 2 INFO os_vif [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:af:6b,bridge_name='br-int',has_traffic_filtering=True,id=66ed1f18-0610-4138-8cea-79f080445c81,network=Network(caee6415-691f-4a45-b08e-98a3dc829d14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ed1f18-06')#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.773 2 DEBUG nova.objects.instance [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lazy-loading 'numa_topology' on Instance uuid 016bb555-dc0d-42a6-9e52-552a2e62a3ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:45 np0005465988 kernel: tap66ed1f18-06: entered promiscuous mode
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:45 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:45Z|00929|binding|INFO|Claiming lport 66ed1f18-0610-4138-8cea-79f080445c81 for this chassis.
Oct  2 09:05:45 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:45Z|00930|binding|INFO|66ed1f18-0610-4138-8cea-79f080445c81: Claiming fa:16:3e:06:af:6b 10.100.0.8
Oct  2 09:05:45 np0005465988 NetworkManager[45041]: <info>  [1759410345.8725] manager: (tap66ed1f18-06): new Tun device (/org/freedesktop/NetworkManager/Devices/406)
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:45 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:45Z|00931|binding|INFO|Setting lport 66ed1f18-0610-4138-8cea-79f080445c81 ovn-installed in OVS
Oct  2 09:05:45 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:45Z|00932|binding|INFO|Setting lport 66ed1f18-0610-4138-8cea-79f080445c81 up in Southbound
Oct  2 09:05:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:45.884 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:af:6b 10.100.0.8'], port_security=['fa:16:3e:06:af:6b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '016bb555-dc0d-42a6-9e52-552a2e62a3ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-caee6415-691f-4a45-b08e-98a3dc829d14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe4f31859f5d412a94d15bbb07e1e35f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '3b3f0d2e-39dd-4344-b55a-04abff5943c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44d7bb4d-6845-4f95-bb66-6c2a2f581889, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=66ed1f18-0610-4138-8cea-79f080445c81) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:05:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:45.886 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 66ed1f18-0610-4138-8cea-79f080445c81 in datapath caee6415-691f-4a45-b08e-98a3dc829d14 bound to our chassis#033[00m
Oct  2 09:05:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:45.887 142124 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network caee6415-691f-4a45-b08e-98a3dc829d14 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 09:05:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:45.888 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9f6ca5af-f59d-4b3f-be2d-b3248978f7ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:05:45 np0005465988 nova_compute[236126]: 2025-10-02 13:05:45.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:45 np0005465988 systemd-udevd[333216]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:05:45 np0005465988 systemd-machined[192594]: New machine qemu-97-instance-000000cd.
Oct  2 09:05:45 np0005465988 NetworkManager[45041]: <info>  [1759410345.9119] device (tap66ed1f18-06): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:05:45 np0005465988 NetworkManager[45041]: <info>  [1759410345.9128] device (tap66ed1f18-06): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:05:45 np0005465988 systemd[1]: Started Virtual Machine qemu-97-instance-000000cd.
Oct  2 09:05:46 np0005465988 nova_compute[236126]: 2025-10-02 13:05:46.374 2 DEBUG nova.compute.manager [req-b019fac4-b700-4d6f-8267-7b28bb4dfb22 req-b4b1edca-de1f-4dbf-8758-fb69033fd806 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:05:46 np0005465988 nova_compute[236126]: 2025-10-02 13:05:46.375 2 DEBUG oslo_concurrency.lockutils [req-b019fac4-b700-4d6f-8267-7b28bb4dfb22 req-b4b1edca-de1f-4dbf-8758-fb69033fd806 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:46 np0005465988 nova_compute[236126]: 2025-10-02 13:05:46.375 2 DEBUG oslo_concurrency.lockutils [req-b019fac4-b700-4d6f-8267-7b28bb4dfb22 req-b4b1edca-de1f-4dbf-8758-fb69033fd806 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:46 np0005465988 nova_compute[236126]: 2025-10-02 13:05:46.375 2 DEBUG oslo_concurrency.lockutils [req-b019fac4-b700-4d6f-8267-7b28bb4dfb22 req-b4b1edca-de1f-4dbf-8758-fb69033fd806 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:46 np0005465988 nova_compute[236126]: 2025-10-02 13:05:46.376 2 DEBUG nova.compute.manager [req-b019fac4-b700-4d6f-8267-7b28bb4dfb22 req-b4b1edca-de1f-4dbf-8758-fb69033fd806 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] No waiting events found dispatching network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:05:46 np0005465988 nova_compute[236126]: 2025-10-02 13:05:46.376 2 WARNING nova.compute.manager [req-b019fac4-b700-4d6f-8267-7b28bb4dfb22 req-b4b1edca-de1f-4dbf-8758-fb69033fd806 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received unexpected event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 09:05:46 np0005465988 nova_compute[236126]: 2025-10-02 13:05:46.376 2 DEBUG nova.compute.manager [req-b019fac4-b700-4d6f-8267-7b28bb4dfb22 req-b4b1edca-de1f-4dbf-8758-fb69033fd806 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:05:46 np0005465988 nova_compute[236126]: 2025-10-02 13:05:46.376 2 DEBUG oslo_concurrency.lockutils [req-b019fac4-b700-4d6f-8267-7b28bb4dfb22 req-b4b1edca-de1f-4dbf-8758-fb69033fd806 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:46 np0005465988 nova_compute[236126]: 2025-10-02 13:05:46.376 2 DEBUG oslo_concurrency.lockutils [req-b019fac4-b700-4d6f-8267-7b28bb4dfb22 req-b4b1edca-de1f-4dbf-8758-fb69033fd806 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:46 np0005465988 nova_compute[236126]: 2025-10-02 13:05:46.377 2 DEBUG oslo_concurrency.lockutils [req-b019fac4-b700-4d6f-8267-7b28bb4dfb22 req-b4b1edca-de1f-4dbf-8758-fb69033fd806 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:46 np0005465988 nova_compute[236126]: 2025-10-02 13:05:46.377 2 DEBUG nova.compute.manager [req-b019fac4-b700-4d6f-8267-7b28bb4dfb22 req-b4b1edca-de1f-4dbf-8758-fb69033fd806 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] No waiting events found dispatching network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:05:46 np0005465988 nova_compute[236126]: 2025-10-02 13:05:46.377 2 WARNING nova.compute.manager [req-b019fac4-b700-4d6f-8267-7b28bb4dfb22 req-b4b1edca-de1f-4dbf-8758-fb69033fd806 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received unexpected event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 09:05:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:46.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.013 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 016bb555-dc0d-42a6-9e52-552a2e62a3ed due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.016 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410347.0128706, 016bb555-dc0d-42a6-9e52-552a2e62a3ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.017 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] VM Started (Lifecycle Event)#033[00m
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.030 2 DEBUG nova.compute.manager [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.031 2 DEBUG nova.objects.instance [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lazy-loading 'pci_devices' on Instance uuid 016bb555-dc0d-42a6-9e52-552a2e62a3ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.046 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.051 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.053 2 INFO nova.virt.libvirt.driver [-] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Instance running successfully.#033[00m
Oct  2 09:05:47 np0005465988 virtqemud[235689]: argument unsupported: QEMU guest agent is not configured
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.058 2 DEBUG nova.virt.libvirt.guest [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.059 2 DEBUG nova.compute.manager [None req-0364a02a-23bb-4547-8db3-71fcd3ce5c74 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.072 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.072 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410347.0202498, 016bb555-dc0d-42a6-9e52-552a2e62a3ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.072 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] VM Resumed (Lifecycle Event)#033[00m
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.097 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:05:47 np0005465988 nova_compute[236126]: 2025-10-02 13:05:47.101 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:05:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:47.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:48.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.468 2 DEBUG oslo_concurrency.lockutils [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.468 2 DEBUG oslo_concurrency.lockutils [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.469 2 DEBUG oslo_concurrency.lockutils [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.469 2 DEBUG oslo_concurrency.lockutils [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.469 2 DEBUG oslo_concurrency.lockutils [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.470 2 INFO nova.compute.manager [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Terminating instance#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.471 2 DEBUG nova.compute.manager [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 09:05:48 np0005465988 kernel: tap66ed1f18-06 (unregistering): left promiscuous mode
Oct  2 09:05:48 np0005465988 NetworkManager[45041]: <info>  [1759410348.5159] device (tap66ed1f18-06): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:48 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:48Z|00933|binding|INFO|Releasing lport 66ed1f18-0610-4138-8cea-79f080445c81 from this chassis (sb_readonly=0)
Oct  2 09:05:48 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:48Z|00934|binding|INFO|Setting lport 66ed1f18-0610-4138-8cea-79f080445c81 down in Southbound
Oct  2 09:05:48 np0005465988 ovn_controller[132601]: 2025-10-02T13:05:48Z|00935|binding|INFO|Removing iface tap66ed1f18-06 ovn-installed in OVS
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:48.534 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:af:6b 10.100.0.8'], port_security=['fa:16:3e:06:af:6b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '016bb555-dc0d-42a6-9e52-552a2e62a3ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-caee6415-691f-4a45-b08e-98a3dc829d14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe4f31859f5d412a94d15bbb07e1e35f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3b3f0d2e-39dd-4344-b55a-04abff5943c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44d7bb4d-6845-4f95-bb66-6c2a2f581889, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=66ed1f18-0610-4138-8cea-79f080445c81) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:05:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:48.535 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 66ed1f18-0610-4138-8cea-79f080445c81 in datapath caee6415-691f-4a45-b08e-98a3dc829d14 unbound from our chassis#033[00m
Oct  2 09:05:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:48.536 142124 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network caee6415-691f-4a45-b08e-98a3dc829d14 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 09:05:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:48.537 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[af8a53ac-962f-406c-8341-957bc41ac5e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:48 np0005465988 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000cd.scope: Deactivated successfully.
Oct  2 09:05:48 np0005465988 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000cd.scope: Consumed 2.425s CPU time.
Oct  2 09:05:48 np0005465988 systemd-machined[192594]: Machine qemu-97-instance-000000cd terminated.
Oct  2 09:05:48 np0005465988 NetworkManager[45041]: <info>  [1759410348.6971] manager: (tap66ed1f18-06): new Tun device (/org/freedesktop/NetworkManager/Devices/407)
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.721 2 INFO nova.virt.libvirt.driver [-] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Instance destroyed successfully.#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.722 2 DEBUG nova.objects.instance [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lazy-loading 'resources' on Instance uuid 016bb555-dc0d-42a6-9e52-552a2e62a3ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.738 2 DEBUG nova.virt.libvirt.vif [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:05:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1641251003',display_name='tempest-TestServerAdvancedOps-server-1641251003',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1641251003',id=205,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:05:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fe4f31859f5d412a94d15bbb07e1e35f',ramdisk_id='',reservation_id='r-g41yb073',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerAdvancedOps-137168702',owner_user_name='tempest-TestServerAdvancedOps-137168702-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:05:47Z,user_data=None,user_id='10c60eb2034e4ded8a792115857927ff',uuid=016bb555-dc0d-42a6-9e52-552a2e62a3ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.738 2 DEBUG nova.network.os_vif_util [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Converting VIF {"id": "66ed1f18-0610-4138-8cea-79f080445c81", "address": "fa:16:3e:06:af:6b", "network": {"id": "caee6415-691f-4a45-b08e-98a3dc829d14", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-356276294-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "fe4f31859f5d412a94d15bbb07e1e35f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ed1f18-06", "ovs_interfaceid": "66ed1f18-0610-4138-8cea-79f080445c81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.739 2 DEBUG nova.network.os_vif_util [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:af:6b,bridge_name='br-int',has_traffic_filtering=True,id=66ed1f18-0610-4138-8cea-79f080445c81,network=Network(caee6415-691f-4a45-b08e-98a3dc829d14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ed1f18-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.740 2 DEBUG os_vif [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:af:6b,bridge_name='br-int',has_traffic_filtering=True,id=66ed1f18-0610-4138-8cea-79f080445c81,network=Network(caee6415-691f-4a45-b08e-98a3dc829d14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ed1f18-06') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.742 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66ed1f18-06, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.748 2 INFO os_vif [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:af:6b,bridge_name='br-int',has_traffic_filtering=True,id=66ed1f18-0610-4138-8cea-79f080445c81,network=Network(caee6415-691f-4a45-b08e-98a3dc829d14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ed1f18-06')#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.792 2 DEBUG nova.compute.manager [req-8775321d-a0c7-4b22-abb6-4ede1f2ee136 req-24c557ec-347f-491e-bd07-1c15b02f4f55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-vif-unplugged-66ed1f18-0610-4138-8cea-79f080445c81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.793 2 DEBUG oslo_concurrency.lockutils [req-8775321d-a0c7-4b22-abb6-4ede1f2ee136 req-24c557ec-347f-491e-bd07-1c15b02f4f55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.793 2 DEBUG oslo_concurrency.lockutils [req-8775321d-a0c7-4b22-abb6-4ede1f2ee136 req-24c557ec-347f-491e-bd07-1c15b02f4f55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.793 2 DEBUG oslo_concurrency.lockutils [req-8775321d-a0c7-4b22-abb6-4ede1f2ee136 req-24c557ec-347f-491e-bd07-1c15b02f4f55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.793 2 DEBUG nova.compute.manager [req-8775321d-a0c7-4b22-abb6-4ede1f2ee136 req-24c557ec-347f-491e-bd07-1c15b02f4f55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] No waiting events found dispatching network-vif-unplugged-66ed1f18-0610-4138-8cea-79f080445c81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:05:48 np0005465988 nova_compute[236126]: 2025-10-02 13:05:48.793 2 DEBUG nova.compute.manager [req-8775321d-a0c7-4b22-abb6-4ede1f2ee136 req-24c557ec-347f-491e-bd07-1c15b02f4f55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-vif-unplugged-66ed1f18-0610-4138-8cea-79f080445c81 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 09:05:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:49.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:50 np0005465988 nova_compute[236126]: 2025-10-02 13:05:50.097 2 INFO nova.virt.libvirt.driver [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Deleting instance files /var/lib/nova/instances/016bb555-dc0d-42a6-9e52-552a2e62a3ed_del
Oct  2 09:05:50 np0005465988 nova_compute[236126]: 2025-10-02 13:05:50.098 2 INFO nova.virt.libvirt.driver [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Deletion of /var/lib/nova/instances/016bb555-dc0d-42a6-9e52-552a2e62a3ed_del complete
Oct  2 09:05:50 np0005465988 nova_compute[236126]: 2025-10-02 13:05:50.145 2 INFO nova.compute.manager [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Took 1.67 seconds to destroy the instance on the hypervisor.
Oct  2 09:05:50 np0005465988 nova_compute[236126]: 2025-10-02 13:05:50.146 2 DEBUG oslo.service.loopingcall [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct  2 09:05:50 np0005465988 nova_compute[236126]: 2025-10-02 13:05:50.146 2 DEBUG nova.compute.manager [-] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct  2 09:05:50 np0005465988 nova_compute[236126]: 2025-10-02 13:05:50.147 2 DEBUG nova.network.neutron [-] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct  2 09:05:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:50.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:05:50 np0005465988 nova_compute[236126]: 2025-10-02 13:05:50.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:05:50 np0005465988 nova_compute[236126]: 2025-10-02 13:05:50.890 2 DEBUG nova.compute.manager [req-322e61ff-0547-412a-92d4-6af4a3809ad5 req-6366327c-9d0d-4d8c-9b44-a53c32758463 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 09:05:50 np0005465988 nova_compute[236126]: 2025-10-02 13:05:50.890 2 DEBUG oslo_concurrency.lockutils [req-322e61ff-0547-412a-92d4-6af4a3809ad5 req-6366327c-9d0d-4d8c-9b44-a53c32758463 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:05:50 np0005465988 nova_compute[236126]: 2025-10-02 13:05:50.891 2 DEBUG oslo_concurrency.lockutils [req-322e61ff-0547-412a-92d4-6af4a3809ad5 req-6366327c-9d0d-4d8c-9b44-a53c32758463 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:05:50 np0005465988 nova_compute[236126]: 2025-10-02 13:05:50.891 2 DEBUG oslo_concurrency.lockutils [req-322e61ff-0547-412a-92d4-6af4a3809ad5 req-6366327c-9d0d-4d8c-9b44-a53c32758463 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:05:50 np0005465988 nova_compute[236126]: 2025-10-02 13:05:50.892 2 DEBUG nova.compute.manager [req-322e61ff-0547-412a-92d4-6af4a3809ad5 req-6366327c-9d0d-4d8c-9b44-a53c32758463 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] No waiting events found dispatching network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 09:05:50 np0005465988 nova_compute[236126]: 2025-10-02 13:05:50.892 2 WARNING nova.compute.manager [req-322e61ff-0547-412a-92d4-6af4a3809ad5 req-6366327c-9d0d-4d8c-9b44-a53c32758463 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received unexpected event network-vif-plugged-66ed1f18-0610-4138-8cea-79f080445c81 for instance with vm_state active and task_state deleting.
Oct  2 09:05:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:51.043 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=77, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=76) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 09:05:51 np0005465988 nova_compute[236126]: 2025-10-02 13:05:51.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:05:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:51.046 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 09:05:51 np0005465988 nova_compute[236126]: 2025-10-02 13:05:51.083 2 DEBUG nova.network.neutron [-] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 09:05:51 np0005465988 nova_compute[236126]: 2025-10-02 13:05:51.108 2 INFO nova.compute.manager [-] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Took 0.96 seconds to deallocate network for instance.
Oct  2 09:05:51 np0005465988 nova_compute[236126]: 2025-10-02 13:05:51.150 2 DEBUG oslo_concurrency.lockutils [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:05:51 np0005465988 nova_compute[236126]: 2025-10-02 13:05:51.151 2 DEBUG oslo_concurrency.lockutils [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:05:51 np0005465988 nova_compute[236126]: 2025-10-02 13:05:51.165 2 DEBUG nova.compute.manager [req-457c414e-2d00-4573-8fde-197d0b283c7e req-9bfd78e0-6d4a-4bd7-b1e9-6cc38003f5c9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Received event network-vif-deleted-66ed1f18-0610-4138-8cea-79f080445c81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 09:05:51 np0005465988 nova_compute[236126]: 2025-10-02 13:05:51.201 2 DEBUG oslo_concurrency.processutils [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:05:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:51.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:05:51 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3817092173' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:05:51 np0005465988 nova_compute[236126]: 2025-10-02 13:05:51.664 2 DEBUG oslo_concurrency.processutils [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 09:05:51 np0005465988 nova_compute[236126]: 2025-10-02 13:05:51.672 2 DEBUG nova.compute.provider_tree [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 09:05:51 np0005465988 nova_compute[236126]: 2025-10-02 13:05:51.692 2 DEBUG nova.scheduler.client.report [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 09:05:51 np0005465988 nova_compute[236126]: 2025-10-02 13:05:51.715 2 DEBUG oslo_concurrency.lockutils [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:05:51 np0005465988 nova_compute[236126]: 2025-10-02 13:05:51.739 2 INFO nova.scheduler.client.report [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Deleted allocations for instance 016bb555-dc0d-42a6-9e52-552a2e62a3ed
Oct  2 09:05:51 np0005465988 nova_compute[236126]: 2025-10-02 13:05:51.796 2 DEBUG oslo_concurrency.lockutils [None req-dd1155a8-b6f2-4dce-b5d0-039148ea3158 10c60eb2034e4ded8a792115857927ff fe4f31859f5d412a94d15bbb07e1e35f - - default default] Lock "016bb555-dc0d-42a6-9e52-552a2e62a3ed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.328s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:05:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:52.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:53 np0005465988 nova_compute[236126]: 2025-10-02 13:05:53.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:05:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:05:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:53.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:05:53 np0005465988 nova_compute[236126]: 2025-10-02 13:05:53.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:05:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:05:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:54.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:05:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:55.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:05:55 np0005465988 nova_compute[236126]: 2025-10-02 13:05:55.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:05:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:05:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:56.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:05:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:57.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:05:58.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:05:58 np0005465988 nova_compute[236126]: 2025-10-02 13:05:58.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:05:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:05:59.049 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '77'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 09:05:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:05:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:05:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:05:59.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:00.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:06:00 np0005465988 nova_compute[236126]: 2025-10-02 13:06:00.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:06:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:01.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:02.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:03.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:03 np0005465988 nova_compute[236126]: 2025-10-02 13:06:03.720 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759410348.7182984, 016bb555-dc0d-42a6-9e52-552a2e62a3ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 09:06:03 np0005465988 nova_compute[236126]: 2025-10-02 13:06:03.721 2 INFO nova.compute.manager [-] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] VM Stopped (Lifecycle Event)
Oct  2 09:06:03 np0005465988 nova_compute[236126]: 2025-10-02 13:06:03.742 2 DEBUG nova.compute.manager [None req-c2b66c25-6fdc-4992-bdc1-feb0d406c435 - - - - - -] [instance: 016bb555-dc0d-42a6-9e52-552a2e62a3ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 09:06:03 np0005465988 nova_compute[236126]: 2025-10-02 13:06:03.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:06:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:04.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:04 np0005465988 podman[333384]: 2025-10-02 13:06:04.542465871 +0000 UTC m=+0.072691824 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 09:06:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:06:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:05.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:05 np0005465988 nova_compute[236126]: 2025-10-02 13:06:05.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:06:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:06.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:07.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:08.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:08 np0005465988 nova_compute[236126]: 2025-10-02 13:06:08.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:06:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:09.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:10.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:06:10 np0005465988 nova_compute[236126]: 2025-10-02 13:06:10.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:06:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:11.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:12.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:13.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:13 np0005465988 nova_compute[236126]: 2025-10-02 13:06:13.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:06:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:14.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:06:15 np0005465988 podman[333412]: 2025-10-02 13:06:15.536383262 +0000 UTC m=+0.064444838 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:06:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:15.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:15 np0005465988 podman[333411]: 2025-10-02 13:06:15.560351252 +0000 UTC m=+0.094943296 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid)
Oct  2 09:06:15 np0005465988 podman[333410]: 2025-10-02 13:06:15.596299817 +0000 UTC m=+0.131300262 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 09:06:15 np0005465988 nova_compute[236126]: 2025-10-02 13:06:15.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:16.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:17 np0005465988 nova_compute[236126]: 2025-10-02 13:06:17.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:17.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:18.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:18 np0005465988 nova_compute[236126]: 2025-10-02 13:06:18.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:19 np0005465988 nova_compute[236126]: 2025-10-02 13:06:19.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:19.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:19 np0005465988 nova_compute[236126]: 2025-10-02 13:06:19.559 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:19 np0005465988 nova_compute[236126]: 2025-10-02 13:06:19.560 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:19 np0005465988 nova_compute[236126]: 2025-10-02 13:06:19.561 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:19 np0005465988 nova_compute[236126]: 2025-10-02 13:06:19.561 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:06:19 np0005465988 nova_compute[236126]: 2025-10-02 13:06:19.561 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:06:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:06:20 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2478314371' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:06:20 np0005465988 nova_compute[236126]: 2025-10-02 13:06:20.077 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:06:20 np0005465988 nova_compute[236126]: 2025-10-02 13:06:20.268 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:06:20 np0005465988 nova_compute[236126]: 2025-10-02 13:06:20.270 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4002MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:06:20 np0005465988 nova_compute[236126]: 2025-10-02 13:06:20.270 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:20 np0005465988 nova_compute[236126]: 2025-10-02 13:06:20.270 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:20.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:20 np0005465988 nova_compute[236126]: 2025-10-02 13:06:20.512 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:06:20 np0005465988 nova_compute[236126]: 2025-10-02 13:06:20.512 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:06:20 np0005465988 nova_compute[236126]: 2025-10-02 13:06:20.529 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 09:06:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:06:20 np0005465988 nova_compute[236126]: 2025-10-02 13:06:20.557 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 09:06:20 np0005465988 nova_compute[236126]: 2025-10-02 13:06:20.557 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 09:06:20 np0005465988 nova_compute[236126]: 2025-10-02 13:06:20.577 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 09:06:20 np0005465988 nova_compute[236126]: 2025-10-02 13:06:20.599 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 09:06:20 np0005465988 nova_compute[236126]: 2025-10-02 13:06:20.612 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:06:20 np0005465988 nova_compute[236126]: 2025-10-02 13:06:20.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:06:21 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2674143807' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:06:21 np0005465988 nova_compute[236126]: 2025-10-02 13:06:21.082 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:06:21 np0005465988 nova_compute[236126]: 2025-10-02 13:06:21.087 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:06:21 np0005465988 nova_compute[236126]: 2025-10-02 13:06:21.105 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:06:21 np0005465988 nova_compute[236126]: 2025-10-02 13:06:21.125 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:06:21 np0005465988 nova_compute[236126]: 2025-10-02 13:06:21.125 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:21.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:21 np0005465988 nova_compute[236126]: 2025-10-02 13:06:21.762 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Acquiring lock "50929a91-e103-467c-a547-5597dfc86ac8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:21 np0005465988 nova_compute[236126]: 2025-10-02 13:06:21.762 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:21 np0005465988 nova_compute[236126]: 2025-10-02 13:06:21.791 2 DEBUG nova.compute.manager [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 09:06:21 np0005465988 nova_compute[236126]: 2025-10-02 13:06:21.900 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:21 np0005465988 nova_compute[236126]: 2025-10-02 13:06:21.901 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:21 np0005465988 nova_compute[236126]: 2025-10-02 13:06:21.909 2 DEBUG nova.virt.hardware [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 09:06:21 np0005465988 nova_compute[236126]: 2025-10-02 13:06:21.909 2 INFO nova.compute.claims [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 09:06:22 np0005465988 nova_compute[236126]: 2025-10-02 13:06:22.043 2 DEBUG oslo_concurrency.processutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:06:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:06:22 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2851169436' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:06:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:22.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:22 np0005465988 nova_compute[236126]: 2025-10-02 13:06:22.511 2 DEBUG oslo_concurrency.processutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:06:22 np0005465988 nova_compute[236126]: 2025-10-02 13:06:22.520 2 DEBUG nova.compute.provider_tree [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:06:22 np0005465988 nova_compute[236126]: 2025-10-02 13:06:22.545 2 DEBUG nova.scheduler.client.report [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:06:22 np0005465988 nova_compute[236126]: 2025-10-02 13:06:22.595 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:22 np0005465988 nova_compute[236126]: 2025-10-02 13:06:22.597 2 DEBUG nova.compute.manager [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 09:06:22 np0005465988 nova_compute[236126]: 2025-10-02 13:06:22.838 2 DEBUG nova.compute.manager [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 09:06:22 np0005465988 nova_compute[236126]: 2025-10-02 13:06:22.839 2 DEBUG nova.network.neutron [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 09:06:22 np0005465988 nova_compute[236126]: 2025-10-02 13:06:22.963 2 INFO nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.037 2 DEBUG nova.compute.manager [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.312 2 DEBUG nova.compute.manager [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.314 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.315 2 INFO nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Creating image(s)#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.341 2 DEBUG nova.storage.rbd_utils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] rbd image 50929a91-e103-467c-a547-5597dfc86ac8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.374 2 DEBUG nova.storage.rbd_utils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] rbd image 50929a91-e103-467c-a547-5597dfc86ac8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.403 2 DEBUG nova.storage.rbd_utils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] rbd image 50929a91-e103-467c-a547-5597dfc86ac8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.411 2 DEBUG oslo_concurrency.processutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.511 2 DEBUG oslo_concurrency.processutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.512 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.513 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.513 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.542 2 DEBUG nova.storage.rbd_utils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] rbd image 50929a91-e103-467c-a547-5597dfc86ac8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.548 2 DEBUG oslo_concurrency.processutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 50929a91-e103-467c-a547-5597dfc86ac8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:06:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:06:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:23.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.795 2 DEBUG nova.policy [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c3981911a48146829ed4203fd206e201', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a8be669d844b4165b675a7d02f2d8588', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 09:06:23 np0005465988 nova_compute[236126]: 2025-10-02 13:06:23.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:24.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:24 np0005465988 nova_compute[236126]: 2025-10-02 13:06:24.686 2 DEBUG oslo_concurrency.processutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 50929a91-e103-467c-a547-5597dfc86ac8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:06:24 np0005465988 nova_compute[236126]: 2025-10-02 13:06:24.772 2 DEBUG nova.storage.rbd_utils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] resizing rbd image 50929a91-e103-467c-a547-5597dfc86ac8_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct  2 09:06:24 np0005465988 nova_compute[236126]: 2025-10-02 13:06:24.992 2 DEBUG nova.objects.instance [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lazy-loading 'migration_context' on Instance uuid 50929a91-e103-467c-a547-5597dfc86ac8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:06:25 np0005465988 nova_compute[236126]: 2025-10-02 13:06:25.007 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 09:06:25 np0005465988 nova_compute[236126]: 2025-10-02 13:06:25.007 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Ensure instance console log exists: /var/lib/nova/instances/50929a91-e103-467c-a547-5597dfc86ac8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 09:06:25 np0005465988 nova_compute[236126]: 2025-10-02 13:06:25.008 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:25 np0005465988 nova_compute[236126]: 2025-10-02 13:06:25.008 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:25 np0005465988 nova_compute[236126]: 2025-10-02 13:06:25.008 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:25 np0005465988 nova_compute[236126]: 2025-10-02 13:06:25.514 2 DEBUG nova.network.neutron [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Successfully created port: 363647f9-a022-4664-8db3-d0fb75c27b3e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 09:06:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:06:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:25.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:25 np0005465988 nova_compute[236126]: 2025-10-02 13:06:25.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:26.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:26 np0005465988 nova_compute[236126]: 2025-10-02 13:06:26.623 2 DEBUG nova.network.neutron [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Successfully updated port: 363647f9-a022-4664-8db3-d0fb75c27b3e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 09:06:26 np0005465988 nova_compute[236126]: 2025-10-02 13:06:26.641 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Acquiring lock "refresh_cache-50929a91-e103-467c-a547-5597dfc86ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:06:26 np0005465988 nova_compute[236126]: 2025-10-02 13:06:26.641 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Acquired lock "refresh_cache-50929a91-e103-467c-a547-5597dfc86ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:06:26 np0005465988 nova_compute[236126]: 2025-10-02 13:06:26.642 2 DEBUG nova.network.neutron [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 09:06:26 np0005465988 nova_compute[236126]: 2025-10-02 13:06:26.719 2 DEBUG nova.compute.manager [req-95dd5c41-3fb5-44e0-bdc3-cd767113a279 req-3b3d0125-a61a-4289-8798-73a9c2244b8c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Received event network-changed-363647f9-a022-4664-8db3-d0fb75c27b3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:06:26 np0005465988 nova_compute[236126]: 2025-10-02 13:06:26.721 2 DEBUG nova.compute.manager [req-95dd5c41-3fb5-44e0-bdc3-cd767113a279 req-3b3d0125-a61a-4289-8798-73a9c2244b8c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Refreshing instance network info cache due to event network-changed-363647f9-a022-4664-8db3-d0fb75c27b3e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:06:26 np0005465988 nova_compute[236126]: 2025-10-02 13:06:26.721 2 DEBUG oslo_concurrency.lockutils [req-95dd5c41-3fb5-44e0-bdc3-cd767113a279 req-3b3d0125-a61a-4289-8798-73a9c2244b8c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-50929a91-e103-467c-a547-5597dfc86ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:06:26 np0005465988 nova_compute[236126]: 2025-10-02 13:06:26.763 2 DEBUG nova.network.neutron [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 09:06:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:27.413 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:27.413 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:27.413 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:27.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.286 2 DEBUG nova.network.neutron [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Updating instance_info_cache with network_info: [{"id": "363647f9-a022-4664-8db3-d0fb75c27b3e", "address": "fa:16:3e:2c:ba:7a", "network": {"id": "575de21c-6b12-4449-9e8c-4deaa781bf0b", "bridge": "br-int", "label": "tempest-TestServerBasicOps-224744067-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a8be669d844b4165b675a7d02f2d8588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap363647f9-a0", "ovs_interfaceid": "363647f9-a022-4664-8db3-d0fb75c27b3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.314 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Releasing lock "refresh_cache-50929a91-e103-467c-a547-5597dfc86ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.315 2 DEBUG nova.compute.manager [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Instance network_info: |[{"id": "363647f9-a022-4664-8db3-d0fb75c27b3e", "address": "fa:16:3e:2c:ba:7a", "network": {"id": "575de21c-6b12-4449-9e8c-4deaa781bf0b", "bridge": "br-int", "label": "tempest-TestServerBasicOps-224744067-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a8be669d844b4165b675a7d02f2d8588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap363647f9-a0", "ovs_interfaceid": "363647f9-a022-4664-8db3-d0fb75c27b3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.316 2 DEBUG oslo_concurrency.lockutils [req-95dd5c41-3fb5-44e0-bdc3-cd767113a279 req-3b3d0125-a61a-4289-8798-73a9c2244b8c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-50929a91-e103-467c-a547-5597dfc86ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.316 2 DEBUG nova.network.neutron [req-95dd5c41-3fb5-44e0-bdc3-cd767113a279 req-3b3d0125-a61a-4289-8798-73a9c2244b8c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Refreshing network info cache for port 363647f9-a022-4664-8db3-d0fb75c27b3e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.318 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Start _get_guest_xml network_info=[{"id": "363647f9-a022-4664-8db3-d0fb75c27b3e", "address": "fa:16:3e:2c:ba:7a", "network": {"id": "575de21c-6b12-4449-9e8c-4deaa781bf0b", "bridge": "br-int", "label": "tempest-TestServerBasicOps-224744067-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a8be669d844b4165b675a7d02f2d8588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap363647f9-a0", "ovs_interfaceid": "363647f9-a022-4664-8db3-d0fb75c27b3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.323 2 WARNING nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.332 2 DEBUG nova.virt.libvirt.host [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.333 2 DEBUG nova.virt.libvirt.host [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.344 2 DEBUG nova.virt.libvirt.host [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.345 2 DEBUG nova.virt.libvirt.host [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.346 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.347 2 DEBUG nova.virt.hardware [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.347 2 DEBUG nova.virt.hardware [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.347 2 DEBUG nova.virt.hardware [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.348 2 DEBUG nova.virt.hardware [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.348 2 DEBUG nova.virt.hardware [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.348 2 DEBUG nova.virt.hardware [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.348 2 DEBUG nova.virt.hardware [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.349 2 DEBUG nova.virt.hardware [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.349 2 DEBUG nova.virt.hardware [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.349 2 DEBUG nova.virt.hardware [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.349 2 DEBUG nova.virt.hardware [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.352 2 DEBUG oslo_concurrency.processutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:06:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:28.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:06:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/273144948' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.833 2 DEBUG oslo_concurrency.processutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.863 2 DEBUG nova.storage.rbd_utils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] rbd image 50929a91-e103-467c-a547-5597dfc86ac8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:06:28 np0005465988 nova_compute[236126]: 2025-10-02 13:06:28.869 2 DEBUG oslo_concurrency.processutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:06:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:06:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/425471803' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.467 2 DEBUG oslo_concurrency.processutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.597s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.470 2 DEBUG nova.virt.libvirt.vif [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:06:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1745601652',display_name='tempest-TestServerBasicOps-server-1745601652',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1745601652',id=206,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJH6ER6qsk5vIhITuZGAQJN77EvOuMzRdBHmezINmsQ7sYkUFyL7iu5F4znbCp0OZ9y3ZkDbpscct+0VcKpSpFDIlxhwrSMYQkrsrdbQZb7UnWEF5iZfu6WSbVMUUb0a5w==',key_name='tempest-TestServerBasicOps-457309094',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a8be669d844b4165b675a7d02f2d8588',ramdisk_id='',reservation_id='r-qbqltmho',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-857878669',owner_user_name='tempest-TestServerBasicOps-857878669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:06:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c3981911a48146829ed4203fd206e201',uuid=50929a91-e103-467c-a547-5597dfc86ac8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "363647f9-a022-4664-8db3-d0fb75c27b3e", "address": "fa:16:3e:2c:ba:7a", "network": {"id": "575de21c-6b12-4449-9e8c-4deaa781bf0b", "bridge": "br-int", "label": "tempest-TestServerBasicOps-224744067-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a8be669d844b4165b675a7d02f2d8588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap363647f9-a0", "ovs_interfaceid": "363647f9-a022-4664-8db3-d0fb75c27b3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.471 2 DEBUG nova.network.os_vif_util [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Converting VIF {"id": "363647f9-a022-4664-8db3-d0fb75c27b3e", "address": "fa:16:3e:2c:ba:7a", "network": {"id": "575de21c-6b12-4449-9e8c-4deaa781bf0b", "bridge": "br-int", "label": "tempest-TestServerBasicOps-224744067-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a8be669d844b4165b675a7d02f2d8588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap363647f9-a0", "ovs_interfaceid": "363647f9-a022-4664-8db3-d0fb75c27b3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.473 2 DEBUG nova.network.os_vif_util [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:ba:7a,bridge_name='br-int',has_traffic_filtering=True,id=363647f9-a022-4664-8db3-d0fb75c27b3e,network=Network(575de21c-6b12-4449-9e8c-4deaa781bf0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap363647f9-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.475 2 DEBUG nova.objects.instance [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lazy-loading 'pci_devices' on Instance uuid 50929a91-e103-467c-a547-5597dfc86ac8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.509 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] End _get_guest_xml xml=<domain type="kvm">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  <uuid>50929a91-e103-467c-a547-5597dfc86ac8</uuid>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  <name>instance-000000ce</name>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestServerBasicOps-server-1745601652</nova:name>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 13:06:28</nova:creationTime>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <nova:user uuid="c3981911a48146829ed4203fd206e201">tempest-TestServerBasicOps-857878669-project-member</nova:user>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <nova:project uuid="a8be669d844b4165b675a7d02f2d8588">tempest-TestServerBasicOps-857878669</nova:project>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <nova:port uuid="363647f9-a022-4664-8db3-d0fb75c27b3e">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <system>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <entry name="serial">50929a91-e103-467c-a547-5597dfc86ac8</entry>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <entry name="uuid">50929a91-e103-467c-a547-5597dfc86ac8</entry>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    </system>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  <os>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  </os>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  <features>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  </features>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  </clock>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  <devices>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/50929a91-e103-467c-a547-5597dfc86ac8_disk">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/50929a91-e103-467c-a547-5597dfc86ac8_disk.config">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:2c:ba:7a"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <target dev="tap363647f9-a0"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    </interface>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/50929a91-e103-467c-a547-5597dfc86ac8/console.log" append="off"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    </serial>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <video>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    </video>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    </rng>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 09:06:29 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 09:06:29 np0005465988 nova_compute[236126]:  </devices>
Oct  2 09:06:29 np0005465988 nova_compute[236126]: </domain>
Oct  2 09:06:29 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.512 2 DEBUG nova.compute.manager [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Preparing to wait for external event network-vif-plugged-363647f9-a022-4664-8db3-d0fb75c27b3e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.512 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Acquiring lock "50929a91-e103-467c-a547-5597dfc86ac8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.513 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.513 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.514 2 DEBUG nova.virt.libvirt.vif [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:06:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1745601652',display_name='tempest-TestServerBasicOps-server-1745601652',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1745601652',id=206,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJH6ER6qsk5vIhITuZGAQJN77EvOuMzRdBHmezINmsQ7sYkUFyL7iu5F4znbCp0OZ9y3ZkDbpscct+0VcKpSpFDIlxhwrSMYQkrsrdbQZb7UnWEF5iZfu6WSbVMUUb0a5w==',key_name='tempest-TestServerBasicOps-457309094',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a8be669d844b4165b675a7d02f2d8588',ramdisk_id='',reservation_id='r-qbqltmho',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-857878669',owner_user_name='tempest-TestServerBasicOps-857878669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:06:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c3981911a48146829ed4203fd206e201',uuid=50929a91-e103-467c-a547-5597dfc86ac8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "363647f9-a022-4664-8db3-d0fb75c27b3e", "address": "fa:16:3e:2c:ba:7a", "network": {"id": "575de21c-6b12-4449-9e8c-4deaa781bf0b", "bridge": "br-int", "label": "tempest-TestServerBasicOps-224744067-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a8be669d844b4165b675a7d02f2d8588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap363647f9-a0", "ovs_interfaceid": "363647f9-a022-4664-8db3-d0fb75c27b3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.515 2 DEBUG nova.network.os_vif_util [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Converting VIF {"id": "363647f9-a022-4664-8db3-d0fb75c27b3e", "address": "fa:16:3e:2c:ba:7a", "network": {"id": "575de21c-6b12-4449-9e8c-4deaa781bf0b", "bridge": "br-int", "label": "tempest-TestServerBasicOps-224744067-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a8be669d844b4165b675a7d02f2d8588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap363647f9-a0", "ovs_interfaceid": "363647f9-a022-4664-8db3-d0fb75c27b3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.516 2 DEBUG nova.network.os_vif_util [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:ba:7a,bridge_name='br-int',has_traffic_filtering=True,id=363647f9-a022-4664-8db3-d0fb75c27b3e,network=Network(575de21c-6b12-4449-9e8c-4deaa781bf0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap363647f9-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.517 2 DEBUG os_vif [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:ba:7a,bridge_name='br-int',has_traffic_filtering=True,id=363647f9-a022-4664-8db3-d0fb75c27b3e,network=Network(575de21c-6b12-4449-9e8c-4deaa781bf0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap363647f9-a0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.519 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.520 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.525 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap363647f9-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.526 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap363647f9-a0, col_values=(('external_ids', {'iface-id': '363647f9-a022-4664-8db3-d0fb75c27b3e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2c:ba:7a', 'vm-uuid': '50929a91-e103-467c-a547-5597dfc86ac8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:29 np0005465988 NetworkManager[45041]: <info>  [1759410389.5305] manager: (tap363647f9-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/408)
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.539 2 INFO os_vif [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:ba:7a,bridge_name='br-int',has_traffic_filtering=True,id=363647f9-a022-4664-8db3-d0fb75c27b3e,network=Network(575de21c-6b12-4449-9e8c-4deaa781bf0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap363647f9-a0')#033[00m
Oct  2 09:06:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:29.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.708 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.709 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.709 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] No VIF found with MAC fa:16:3e:2c:ba:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.710 2 INFO nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Using config drive#033[00m
Oct  2 09:06:29 np0005465988 nova_compute[236126]: 2025-10-02 13:06:29.741 2 DEBUG nova.storage.rbd_utils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] rbd image 50929a91-e103-467c-a547-5597dfc86ac8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:06:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:30.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:06:30 np0005465988 nova_compute[236126]: 2025-10-02 13:06:30.681 2 INFO nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Creating config drive at /var/lib/nova/instances/50929a91-e103-467c-a547-5597dfc86ac8/disk.config#033[00m
Oct  2 09:06:30 np0005465988 nova_compute[236126]: 2025-10-02 13:06:30.688 2 DEBUG oslo_concurrency.processutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/50929a91-e103-467c-a547-5597dfc86ac8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmr1lyc4j execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:06:30 np0005465988 nova_compute[236126]: 2025-10-02 13:06:30.841 2 DEBUG oslo_concurrency.processutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/50929a91-e103-467c-a547-5597dfc86ac8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmr1lyc4j" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:06:30 np0005465988 nova_compute[236126]: 2025-10-02 13:06:30.941 2 DEBUG nova.storage.rbd_utils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] rbd image 50929a91-e103-467c-a547-5597dfc86ac8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:06:30 np0005465988 nova_compute[236126]: 2025-10-02 13:06:30.946 2 DEBUG oslo_concurrency.processutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/50929a91-e103-467c-a547-5597dfc86ac8/disk.config 50929a91-e103-467c-a547-5597dfc86ac8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:06:31 np0005465988 nova_compute[236126]: 2025-10-02 13:06:31.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:31 np0005465988 nova_compute[236126]: 2025-10-02 13:06:31.005 2 DEBUG nova.network.neutron [req-95dd5c41-3fb5-44e0-bdc3-cd767113a279 req-3b3d0125-a61a-4289-8798-73a9c2244b8c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Updated VIF entry in instance network info cache for port 363647f9-a022-4664-8db3-d0fb75c27b3e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:06:31 np0005465988 nova_compute[236126]: 2025-10-02 13:06:31.005 2 DEBUG nova.network.neutron [req-95dd5c41-3fb5-44e0-bdc3-cd767113a279 req-3b3d0125-a61a-4289-8798-73a9c2244b8c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Updating instance_info_cache with network_info: [{"id": "363647f9-a022-4664-8db3-d0fb75c27b3e", "address": "fa:16:3e:2c:ba:7a", "network": {"id": "575de21c-6b12-4449-9e8c-4deaa781bf0b", "bridge": "br-int", "label": "tempest-TestServerBasicOps-224744067-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a8be669d844b4165b675a7d02f2d8588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap363647f9-a0", "ovs_interfaceid": "363647f9-a022-4664-8db3-d0fb75c27b3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:06:31 np0005465988 nova_compute[236126]: 2025-10-02 13:06:31.024 2 DEBUG oslo_concurrency.lockutils [req-95dd5c41-3fb5-44e0-bdc3-cd767113a279 req-3b3d0125-a61a-4289-8798-73a9c2244b8c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-50929a91-e103-467c-a547-5597dfc86ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:06:31 np0005465988 podman[334155]: 2025-10-02 13:06:31.36656659 +0000 UTC m=+0.025630169 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 09:06:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:31.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:31 np0005465988 podman[334155]: 2025-10-02 13:06:31.619510504 +0000 UTC m=+0.278574063 container create 1e7d6df45877f104e30d31c8632be89b8d025b250c26d232e25843aa256d6cc5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_elbakyan, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 09:06:31 np0005465988 systemd[1]: Started libpod-conmon-1e7d6df45877f104e30d31c8632be89b8d025b250c26d232e25843aa256d6cc5.scope.
Oct  2 09:06:31 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:06:32 np0005465988 podman[334155]: 2025-10-02 13:06:32.178969867 +0000 UTC m=+0.838033526 container init 1e7d6df45877f104e30d31c8632be89b8d025b250c26d232e25843aa256d6cc5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_elbakyan, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 09:06:32 np0005465988 podman[334155]: 2025-10-02 13:06:32.19504283 +0000 UTC m=+0.854106389 container start 1e7d6df45877f104e30d31c8632be89b8d025b250c26d232e25843aa256d6cc5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_elbakyan, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Oct  2 09:06:32 np0005465988 cool_elbakyan[334172]: 167 167
Oct  2 09:06:32 np0005465988 systemd[1]: libpod-1e7d6df45877f104e30d31c8632be89b8d025b250c26d232e25843aa256d6cc5.scope: Deactivated successfully.
Oct  2 09:06:32 np0005465988 podman[334155]: 2025-10-02 13:06:32.437529434 +0000 UTC m=+1.096593003 container attach 1e7d6df45877f104e30d31c8632be89b8d025b250c26d232e25843aa256d6cc5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_elbakyan, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Oct  2 09:06:32 np0005465988 podman[334155]: 2025-10-02 13:06:32.438993407 +0000 UTC m=+1.098056966 container died 1e7d6df45877f104e30d31c8632be89b8d025b250c26d232e25843aa256d6cc5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_elbakyan, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct  2 09:06:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:32.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:32 np0005465988 systemd[1]: var-lib-containers-storage-overlay-71040c3606d39659fe8456a8a8899800e2f7c2368eb4676bf6e1aab8ea659efb-merged.mount: Deactivated successfully.
Oct  2 09:06:32 np0005465988 nova_compute[236126]: 2025-10-02 13:06:32.702 2 DEBUG oslo_concurrency.processutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/50929a91-e103-467c-a547-5597dfc86ac8/disk.config 50929a91-e103-467c-a547-5597dfc86ac8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.756s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:06:32 np0005465988 nova_compute[236126]: 2025-10-02 13:06:32.705 2 INFO nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Deleting local config drive /var/lib/nova/instances/50929a91-e103-467c-a547-5597dfc86ac8/disk.config because it was imported into RBD.#033[00m
Oct  2 09:06:32 np0005465988 podman[334155]: 2025-10-02 13:06:32.707100379 +0000 UTC m=+1.366163968 container remove 1e7d6df45877f104e30d31c8632be89b8d025b250c26d232e25843aa256d6cc5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_elbakyan, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Oct  2 09:06:32 np0005465988 systemd[1]: libpod-conmon-1e7d6df45877f104e30d31c8632be89b8d025b250c26d232e25843aa256d6cc5.scope: Deactivated successfully.
Oct  2 09:06:32 np0005465988 kernel: tap363647f9-a0: entered promiscuous mode
Oct  2 09:06:32 np0005465988 NetworkManager[45041]: <info>  [1759410392.7875] manager: (tap363647f9-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/409)
Oct  2 09:06:32 np0005465988 ovn_controller[132601]: 2025-10-02T13:06:32Z|00936|binding|INFO|Claiming lport 363647f9-a022-4664-8db3-d0fb75c27b3e for this chassis.
Oct  2 09:06:32 np0005465988 ovn_controller[132601]: 2025-10-02T13:06:32Z|00937|binding|INFO|363647f9-a022-4664-8db3-d0fb75c27b3e: Claiming fa:16:3e:2c:ba:7a 10.100.0.12
Oct  2 09:06:32 np0005465988 nova_compute[236126]: 2025-10-02 13:06:32.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:32.808 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:ba:7a 10.100.0.12'], port_security=['fa:16:3e:2c:ba:7a 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '50929a91-e103-467c-a547-5597dfc86ac8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-575de21c-6b12-4449-9e8c-4deaa781bf0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8be669d844b4165b675a7d02f2d8588', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2d10734b-7a94-47ef-8c22-542476f63a59 a39fb845-c78e-41b0-80c5-2ebd426f9843', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd42f018-1a0a-4965-b6c3-a22fd79d46e6, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=363647f9-a022-4664-8db3-d0fb75c27b3e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:32.809 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 363647f9-a022-4664-8db3-d0fb75c27b3e in datapath 575de21c-6b12-4449-9e8c-4deaa781bf0b bound to our chassis#033[00m
Oct  2 09:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:32.811 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 575de21c-6b12-4449-9e8c-4deaa781bf0b#033[00m
Oct  2 09:06:32 np0005465988 systemd-udevd[334210]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:32.826 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1d0e354e-2339-456d-abdb-44f291811e84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:32.829 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap575de21c-61 in ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 09:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:32.831 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap575de21c-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 09:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:32.831 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[edb2e266-7418-41af-97be-7cfcda847e49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:32.832 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f48450e5-0879-434e-bbf9-70edac76cba3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:32 np0005465988 NetworkManager[45041]: <info>  [1759410392.8416] device (tap363647f9-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:06:32 np0005465988 NetworkManager[45041]: <info>  [1759410392.8422] device (tap363647f9-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:06:32 np0005465988 systemd-machined[192594]: New machine qemu-98-instance-000000ce.
Oct  2 09:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:32.852 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[894e16a5-cca7-4081-bf2d-5e97f0772291]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:32 np0005465988 systemd[1]: Started Virtual Machine qemu-98-instance-000000ce.
Oct  2 09:06:32 np0005465988 nova_compute[236126]: 2025-10-02 13:06:32.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:32 np0005465988 ovn_controller[132601]: 2025-10-02T13:06:32Z|00938|binding|INFO|Setting lport 363647f9-a022-4664-8db3-d0fb75c27b3e ovn-installed in OVS
Oct  2 09:06:32 np0005465988 ovn_controller[132601]: 2025-10-02T13:06:32Z|00939|binding|INFO|Setting lport 363647f9-a022-4664-8db3-d0fb75c27b3e up in Southbound
Oct  2 09:06:32 np0005465988 nova_compute[236126]: 2025-10-02 13:06:32.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:32.878 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c5bd2c67-23b5-47de-914c-2d79b2c645d3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:32 np0005465988 podman[334218]: 2025-10-02 13:06:32.913467002 +0000 UTC m=+0.054398918 container create 1f3972d9d582088b51494e5daf4d57c724cd7f7b3c6cfcb481ee723da322b129 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_bouman, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:32.913 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[73b09ae4-3f3a-4935-b554-2eea7048c9e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:32 np0005465988 NetworkManager[45041]: <info>  [1759410392.9237] manager: (tap575de21c-60): new Veth device (/org/freedesktop/NetworkManager/Devices/410)
Oct  2 09:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:32.921 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[887809ec-66cd-44c3-90bf-98eebae810e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:32.962 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[42a0a79b-751d-4fe6-aae9-5ebef1c7d7d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:32 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:32.970 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[3af6c3d3-b149-46cf-a324-5075eaee91db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:32 np0005465988 systemd[1]: Started libpod-conmon-1f3972d9d582088b51494e5daf4d57c724cd7f7b3c6cfcb481ee723da322b129.scope.
Oct  2 09:06:32 np0005465988 podman[334218]: 2025-10-02 13:06:32.894694962 +0000 UTC m=+0.035626898 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Oct  2 09:06:32 np0005465988 NetworkManager[45041]: <info>  [1759410392.9977] device (tap575de21c-60): carrier: link connected
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:33.005 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[769927b4-5c9c-4988-819a-5209c6f34f6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:33 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:06:33 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5894578b3bac06666a15a8fdd7df8cd1470b0cede38fe025a682071dce6d1c63/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct  2 09:06:33 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5894578b3bac06666a15a8fdd7df8cd1470b0cede38fe025a682071dce6d1c63/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct  2 09:06:33 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5894578b3bac06666a15a8fdd7df8cd1470b0cede38fe025a682071dce6d1c63/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct  2 09:06:33 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5894578b3bac06666a15a8fdd7df8cd1470b0cede38fe025a682071dce6d1c63/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct  2 09:06:33 np0005465988 podman[334218]: 2025-10-02 13:06:33.029828014 +0000 UTC m=+0.170759950 container init 1f3972d9d582088b51494e5daf4d57c724cd7f7b3c6cfcb481ee723da322b129 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_bouman, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:33.028 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2b01ca7e-7309-46aa-8588-ae2592649443]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap575de21c-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:fd:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 274], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 845332, 'reachable_time': 17476, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 334266, 'error': None, 'target': 'ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:33 np0005465988 podman[334218]: 2025-10-02 13:06:33.037527755 +0000 UTC m=+0.178459671 container start 1f3972d9d582088b51494e5daf4d57c724cd7f7b3c6cfcb481ee723da322b129 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_bouman, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct  2 09:06:33 np0005465988 podman[334218]: 2025-10-02 13:06:33.040787589 +0000 UTC m=+0.181719505 container attach 1f3972d9d582088b51494e5daf4d57c724cd7f7b3c6cfcb481ee723da322b129 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_bouman, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507)
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:33.050 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d2aba160-e77a-4116-9d42-c1b9ff3f805d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:fdb7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 845332, 'tstamp': 845332}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 334268, 'error': None, 'target': 'ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:33.079 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[80713962-bc1a-4e56-96d1-37f47395cab0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap575de21c-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:fd:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 274], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 845332, 'reachable_time': 17476, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 334270, 'error': None, 'target': 'ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:33.119 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[053d87b2-5739-4a1f-8700-03a656902c34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.126 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:33.210 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5b673fbb-3902-41fd-898c-b74f9cb39292]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:33.212 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap575de21c-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:33.212 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:33.213 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap575de21c-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:33 np0005465988 kernel: tap575de21c-60: entered promiscuous mode
Oct  2 09:06:33 np0005465988 NetworkManager[45041]: <info>  [1759410393.2421] manager: (tap575de21c-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/411)
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:33.248 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap575de21c-60, col_values=(('external_ids', {'iface-id': 'b48f4d6c-177b-4ffa-8094-1a4095677428'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:33 np0005465988 ovn_controller[132601]: 2025-10-02T13:06:33Z|00940|binding|INFO|Releasing lport b48f4d6c-177b-4ffa-8094-1a4095677428 from this chassis (sb_readonly=0)
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:33.251 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/575de21c-6b12-4449-9e8c-4deaa781bf0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/575de21c-6b12-4449-9e8c-4deaa781bf0b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:33.255 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8fbfc803-353a-42b2-a06e-aa3c64c4f429]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:33.256 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-575de21c-6b12-4449-9e8c-4deaa781bf0b
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/575de21c-6b12-4449-9e8c-4deaa781bf0b.pid.haproxy
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 575de21c-6b12-4449-9e8c-4deaa781bf0b
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 09:06:33 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:06:33.258 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b', 'env', 'PROCESS_TAG=haproxy-575de21c-6b12-4449-9e8c-4deaa781bf0b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/575de21c-6b12-4449-9e8c-4deaa781bf0b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.415 2 DEBUG nova.compute.manager [req-f01477e5-25f3-4fea-bf97-9fdbedac8657 req-04c1f120-6fa5-4c63-bc32-b616f24a56c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Received event network-vif-plugged-363647f9-a022-4664-8db3-d0fb75c27b3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.417 2 DEBUG oslo_concurrency.lockutils [req-f01477e5-25f3-4fea-bf97-9fdbedac8657 req-04c1f120-6fa5-4c63-bc32-b616f24a56c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "50929a91-e103-467c-a547-5597dfc86ac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.417 2 DEBUG oslo_concurrency.lockutils [req-f01477e5-25f3-4fea-bf97-9fdbedac8657 req-04c1f120-6fa5-4c63-bc32-b616f24a56c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.418 2 DEBUG oslo_concurrency.lockutils [req-f01477e5-25f3-4fea-bf97-9fdbedac8657 req-04c1f120-6fa5-4c63-bc32-b616f24a56c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.418 2 DEBUG nova.compute.manager [req-f01477e5-25f3-4fea-bf97-9fdbedac8657 req-04c1f120-6fa5-4c63-bc32-b616f24a56c1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Processing event network-vif-plugged-363647f9-a022-4664-8db3-d0fb75c27b3e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 09:06:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:33.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:33 np0005465988 podman[334345]: 2025-10-02 13:06:33.631585935 +0000 UTC m=+0.055822929 container create 74c4f583d820f53c7138e3206c6c7e04ff06b7463c91867e8adceac13f99c0d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 09:06:33 np0005465988 systemd[1]: Started libpod-conmon-74c4f583d820f53c7138e3206c6c7e04ff06b7463c91867e8adceac13f99c0d4.scope.
Oct  2 09:06:33 np0005465988 podman[334345]: 2025-10-02 13:06:33.601352564 +0000 UTC m=+0.025589578 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 09:06:33 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:06:33 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26b093a493d70fd911c4b265461584b962ff14fe6e05526d65cee5851425a572/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 09:06:33 np0005465988 podman[334345]: 2025-10-02 13:06:33.73626029 +0000 UTC m=+0.160497304 container init 74c4f583d820f53c7138e3206c6c7e04ff06b7463c91867e8adceac13f99c0d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:06:33 np0005465988 podman[334345]: 2025-10-02 13:06:33.741880592 +0000 UTC m=+0.166117586 container start 74c4f583d820f53c7138e3206c6c7e04ff06b7463c91867e8adceac13f99c0d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 09:06:33 np0005465988 neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b[334360]: [NOTICE]   (334364) : New worker (334367) forked
Oct  2 09:06:33 np0005465988 neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b[334360]: [NOTICE]   (334364) : Loading success.
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.982 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410393.981289, 50929a91-e103-467c-a547-5597dfc86ac8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.982 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] VM Started (Lifecycle Event)#033[00m
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.985 2 DEBUG nova.compute.manager [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.988 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.992 2 INFO nova.virt.libvirt.driver [-] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Instance spawned successfully.#033[00m
Oct  2 09:06:33 np0005465988 nova_compute[236126]: 2025-10-02 13:06:33.993 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.000 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.004 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.030 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.031 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410393.9816425, 50929a91-e103-467c-a547-5597dfc86ac8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.031 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] VM Paused (Lifecycle Event)#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.041 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.042 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.042 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.043 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.043 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.044 2 DEBUG nova.virt.libvirt.driver [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.051 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.056 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410393.987581, 50929a91-e103-467c-a547-5597dfc86ac8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.056 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] VM Resumed (Lifecycle Event)#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.096 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.100 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.139 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.150 2 INFO nova.compute.manager [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Took 10.84 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.150 2 DEBUG nova.compute.manager [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.235 2 INFO nova.compute.manager [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Took 12.35 seconds to build instance.#033[00m
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.262 2 DEBUG oslo_concurrency.lockutils [None req-4602f6bd-a573-4af6-b9c0-388708baaf59 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.500s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:34 np0005465988 eager_bouman[334262]: [
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:    {
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:        "available": false,
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:        "ceph_device": false,
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:        "device_id": "QEMU_DVD-ROM_QM00001",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:        "lsm_data": {},
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:        "lvs": [],
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:        "path": "/dev/sr0",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:        "rejected_reasons": [
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "Has a FileSystem",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "Insufficient space (<5GB)"
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:        ],
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:        "sys_api": {
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "actuators": null,
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "device_nodes": "sr0",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "devname": "sr0",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "human_readable_size": "482.00 KB",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "id_bus": "ata",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "model": "QEMU DVD-ROM",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "nr_requests": "2",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "parent": "/dev/sr0",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "partitions": {},
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "path": "/dev/sr0",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "removable": "1",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "rev": "2.5+",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "ro": "0",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "rotational": "0",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "sas_address": "",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "sas_device_handle": "",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "scheduler_mode": "mq-deadline",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "sectors": 0,
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "sectorsize": "2048",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "size": 493568.0,
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "support_discard": "2048",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "type": "disk",
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:            "vendor": "QEMU"
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:        }
Oct  2 09:06:34 np0005465988 eager_bouman[334262]:    }
Oct  2 09:06:34 np0005465988 eager_bouman[334262]: ]
Oct  2 09:06:34 np0005465988 systemd[1]: libpod-1f3972d9d582088b51494e5daf4d57c724cd7f7b3c6cfcb481ee723da322b129.scope: Deactivated successfully.
Oct  2 09:06:34 np0005465988 systemd[1]: libpod-1f3972d9d582088b51494e5daf4d57c724cd7f7b3c6cfcb481ee723da322b129.scope: Consumed 1.305s CPU time.
Oct  2 09:06:34 np0005465988 podman[334218]: 2025-10-02 13:06:34.386310232 +0000 UTC m=+1.527242188 container died 1f3972d9d582088b51494e5daf4d57c724cd7f7b3c6cfcb481ee723da322b129 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_bouman, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Oct  2 09:06:34 np0005465988 systemd[1]: var-lib-containers-storage-overlay-5894578b3bac06666a15a8fdd7df8cd1470b0cede38fe025a682071dce6d1c63-merged.mount: Deactivated successfully.
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.472 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:34 np0005465988 podman[334218]: 2025-10-02 13:06:34.484165991 +0000 UTC m=+1.625097907 container remove 1f3972d9d582088b51494e5daf4d57c724cd7f7b3c6cfcb481ee723da322b129 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_bouman, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Oct  2 09:06:34 np0005465988 systemd[1]: libpod-conmon-1f3972d9d582088b51494e5daf4d57c724cd7f7b3c6cfcb481ee723da322b129.scope: Deactivated successfully.
Oct  2 09:06:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:34.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:34 np0005465988 nova_compute[236126]: 2025-10-02 13:06:34.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:35 np0005465988 nova_compute[236126]: 2025-10-02 13:06:35.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:35 np0005465988 nova_compute[236126]: 2025-10-02 13:06:35.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:35 np0005465988 nova_compute[236126]: 2025-10-02 13:06:35.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:06:35 np0005465988 nova_compute[236126]: 2025-10-02 13:06:35.505 2 DEBUG nova.compute.manager [req-16bfe42a-6a9a-4ca8-a8ac-3afb819c288a req-b85bdcbf-ff7d-4d73-8ba1-37386bf7a47d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Received event network-vif-plugged-363647f9-a022-4664-8db3-d0fb75c27b3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:06:35 np0005465988 nova_compute[236126]: 2025-10-02 13:06:35.505 2 DEBUG oslo_concurrency.lockutils [req-16bfe42a-6a9a-4ca8-a8ac-3afb819c288a req-b85bdcbf-ff7d-4d73-8ba1-37386bf7a47d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "50929a91-e103-467c-a547-5597dfc86ac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:35 np0005465988 nova_compute[236126]: 2025-10-02 13:06:35.506 2 DEBUG oslo_concurrency.lockutils [req-16bfe42a-6a9a-4ca8-a8ac-3afb819c288a req-b85bdcbf-ff7d-4d73-8ba1-37386bf7a47d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:35 np0005465988 nova_compute[236126]: 2025-10-02 13:06:35.506 2 DEBUG oslo_concurrency.lockutils [req-16bfe42a-6a9a-4ca8-a8ac-3afb819c288a req-b85bdcbf-ff7d-4d73-8ba1-37386bf7a47d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:35 np0005465988 nova_compute[236126]: 2025-10-02 13:06:35.506 2 DEBUG nova.compute.manager [req-16bfe42a-6a9a-4ca8-a8ac-3afb819c288a req-b85bdcbf-ff7d-4d73-8ba1-37386bf7a47d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] No waiting events found dispatching network-vif-plugged-363647f9-a022-4664-8db3-d0fb75c27b3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:06:35 np0005465988 nova_compute[236126]: 2025-10-02 13:06:35.506 2 WARNING nova.compute.manager [req-16bfe42a-6a9a-4ca8-a8ac-3afb819c288a req-b85bdcbf-ff7d-4d73-8ba1-37386bf7a47d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Received unexpected event network-vif-plugged-363647f9-a022-4664-8db3-d0fb75c27b3e for instance with vm_state active and task_state None.#033[00m
Oct  2 09:06:35 np0005465988 podman[335534]: 2025-10-02 13:06:35.535304644 +0000 UTC m=+0.066711342 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:06:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:06:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:35.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:06:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:06:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:06:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:06:35 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:06:35 np0005465988 nova_compute[236126]: 2025-10-02 13:06:35.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:36.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:37 np0005465988 nova_compute[236126]: 2025-10-02 13:06:37.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:37.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:37 np0005465988 nova_compute[236126]: 2025-10-02 13:06:37.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:37 np0005465988 NetworkManager[45041]: <info>  [1759410397.8902] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/412)
Oct  2 09:06:37 np0005465988 NetworkManager[45041]: <info>  [1759410397.8915] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/413)
Oct  2 09:06:38 np0005465988 nova_compute[236126]: 2025-10-02 13:06:38.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:38 np0005465988 ovn_controller[132601]: 2025-10-02T13:06:38Z|00941|binding|INFO|Releasing lport b48f4d6c-177b-4ffa-8094-1a4095677428 from this chassis (sb_readonly=0)
Oct  2 09:06:38 np0005465988 nova_compute[236126]: 2025-10-02 13:06:38.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:38 np0005465988 nova_compute[236126]: 2025-10-02 13:06:38.233 2 DEBUG nova.compute.manager [req-1068cb49-9bc2-40e5-9d1e-9443e0e88b6d req-89e04e65-8276-4b88-b0e9-783a095019c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Received event network-changed-363647f9-a022-4664-8db3-d0fb75c27b3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:06:38 np0005465988 nova_compute[236126]: 2025-10-02 13:06:38.234 2 DEBUG nova.compute.manager [req-1068cb49-9bc2-40e5-9d1e-9443e0e88b6d req-89e04e65-8276-4b88-b0e9-783a095019c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Refreshing instance network info cache due to event network-changed-363647f9-a022-4664-8db3-d0fb75c27b3e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:06:38 np0005465988 nova_compute[236126]: 2025-10-02 13:06:38.234 2 DEBUG oslo_concurrency.lockutils [req-1068cb49-9bc2-40e5-9d1e-9443e0e88b6d req-89e04e65-8276-4b88-b0e9-783a095019c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-50929a91-e103-467c-a547-5597dfc86ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:06:38 np0005465988 nova_compute[236126]: 2025-10-02 13:06:38.235 2 DEBUG oslo_concurrency.lockutils [req-1068cb49-9bc2-40e5-9d1e-9443e0e88b6d req-89e04e65-8276-4b88-b0e9-783a095019c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-50929a91-e103-467c-a547-5597dfc86ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:06:38 np0005465988 nova_compute[236126]: 2025-10-02 13:06:38.235 2 DEBUG nova.network.neutron [req-1068cb49-9bc2-40e5-9d1e-9443e0e88b6d req-89e04e65-8276-4b88-b0e9-783a095019c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Refreshing network info cache for port 363647f9-a022-4664-8db3-d0fb75c27b3e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:06:38 np0005465988 nova_compute[236126]: 2025-10-02 13:06:38.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:38.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:39.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:39 np0005465988 nova_compute[236126]: 2025-10-02 13:06:39.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:40 np0005465988 nova_compute[236126]: 2025-10-02 13:06:40.147 2 DEBUG nova.network.neutron [req-1068cb49-9bc2-40e5-9d1e-9443e0e88b6d req-89e04e65-8276-4b88-b0e9-783a095019c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Updated VIF entry in instance network info cache for port 363647f9-a022-4664-8db3-d0fb75c27b3e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:06:40 np0005465988 nova_compute[236126]: 2025-10-02 13:06:40.148 2 DEBUG nova.network.neutron [req-1068cb49-9bc2-40e5-9d1e-9443e0e88b6d req-89e04e65-8276-4b88-b0e9-783a095019c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Updating instance_info_cache with network_info: [{"id": "363647f9-a022-4664-8db3-d0fb75c27b3e", "address": "fa:16:3e:2c:ba:7a", "network": {"id": "575de21c-6b12-4449-9e8c-4deaa781bf0b", "bridge": "br-int", "label": "tempest-TestServerBasicOps-224744067-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a8be669d844b4165b675a7d02f2d8588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap363647f9-a0", "ovs_interfaceid": "363647f9-a022-4664-8db3-d0fb75c27b3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:06:40 np0005465988 nova_compute[236126]: 2025-10-02 13:06:40.168 2 DEBUG oslo_concurrency.lockutils [req-1068cb49-9bc2-40e5-9d1e-9443e0e88b6d req-89e04e65-8276-4b88-b0e9-783a095019c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-50929a91-e103-467c-a547-5597dfc86ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:06:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:40.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:40 np0005465988 nova_compute[236126]: 2025-10-02 13:06:40.856 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:40 np0005465988 nova_compute[236126]: 2025-10-02 13:06:40.886 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Triggering sync for uuid 50929a91-e103-467c-a547-5597dfc86ac8 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  2 09:06:40 np0005465988 nova_compute[236126]: 2025-10-02 13:06:40.887 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "50929a91-e103-467c-a547-5597dfc86ac8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:40 np0005465988 nova_compute[236126]: 2025-10-02 13:06:40.888 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "50929a91-e103-467c-a547-5597dfc86ac8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:40 np0005465988 nova_compute[236126]: 2025-10-02 13:06:40.918 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "50929a91-e103-467c-a547-5597dfc86ac8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:40 np0005465988 nova_compute[236126]: 2025-10-02 13:06:40.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:06:41 np0005465988 nova_compute[236126]: 2025-10-02 13:06:41.505 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:41 np0005465988 nova_compute[236126]: 2025-10-02 13:06:41.506 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:06:41 np0005465988 nova_compute[236126]: 2025-10-02 13:06:41.506 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:06:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:41.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:41 np0005465988 nova_compute[236126]: 2025-10-02 13:06:41.706 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-50929a91-e103-467c-a547-5597dfc86ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:06:41 np0005465988 nova_compute[236126]: 2025-10-02 13:06:41.706 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-50929a91-e103-467c-a547-5597dfc86ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:06:41 np0005465988 nova_compute[236126]: 2025-10-02 13:06:41.706 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 09:06:41 np0005465988 nova_compute[236126]: 2025-10-02 13:06:41.707 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 50929a91-e103-467c-a547-5597dfc86ac8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:06:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:42.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:42 np0005465988 nova_compute[236126]: 2025-10-02 13:06:42.976 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Updating instance_info_cache with network_info: [{"id": "363647f9-a022-4664-8db3-d0fb75c27b3e", "address": "fa:16:3e:2c:ba:7a", "network": {"id": "575de21c-6b12-4449-9e8c-4deaa781bf0b", "bridge": "br-int", "label": "tempest-TestServerBasicOps-224744067-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a8be669d844b4165b675a7d02f2d8588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap363647f9-a0", "ovs_interfaceid": "363647f9-a022-4664-8db3-d0fb75c27b3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:06:43 np0005465988 nova_compute[236126]: 2025-10-02 13:06:43.037 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-50929a91-e103-467c-a547-5597dfc86ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:06:43 np0005465988 nova_compute[236126]: 2025-10-02 13:06:43.038 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 09:06:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:43.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:44.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:44 np0005465988 nova_compute[236126]: 2025-10-02 13:06:44.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:06:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:06:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:45.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:45 np0005465988 nova_compute[236126]: 2025-10-02 13:06:45.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:06:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:46.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:46 np0005465988 podman[335660]: 2025-10-02 13:06:46.542009192 +0000 UTC m=+0.066814696 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 09:06:46 np0005465988 podman[335661]: 2025-10-02 13:06:46.596041478 +0000 UTC m=+0.121685676 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 09:06:46 np0005465988 podman[335659]: 2025-10-02 13:06:46.615488168 +0000 UTC m=+0.146604883 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 09:06:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:47.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:48.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:49 np0005465988 nova_compute[236126]: 2025-10-02 13:06:49.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:49.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:50.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:50 np0005465988 nova_compute[236126]: 2025-10-02 13:06:50.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:06:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:51.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:51 np0005465988 ovn_controller[132601]: 2025-10-02T13:06:51Z|00119|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2c:ba:7a 10.100.0.12
Oct  2 09:06:51 np0005465988 ovn_controller[132601]: 2025-10-02T13:06:51Z|00120|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2c:ba:7a 10.100.0.12
Oct  2 09:06:52 np0005465988 nova_compute[236126]: 2025-10-02 13:06:52.002 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:52.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:53.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:54.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:54 np0005465988 nova_compute[236126]: 2025-10-02 13:06:54.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:55.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:55 np0005465988 nova_compute[236126]: 2025-10-02 13:06:55.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:06:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:06:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:56.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:06:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:57.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:06:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:06:58.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:06:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:06:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:06:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:06:59.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:06:59 np0005465988 nova_compute[236126]: 2025-10-02 13:06:59.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:00.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:00 np0005465988 nova_compute[236126]: 2025-10-02 13:07:00.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:07:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:01.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:01.999 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=78, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=77) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:07:02 np0005465988 nova_compute[236126]: 2025-10-02 13:07:02.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:02.002 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:07:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:02.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:03.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:04.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:04 np0005465988 nova_compute[236126]: 2025-10-02 13:07:04.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:07:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:05.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:07:05 np0005465988 nova_compute[236126]: 2025-10-02 13:07:05.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:07:06 np0005465988 podman[335786]: 2025-10-02 13:07:06.519442279 +0000 UTC m=+0.057639511 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  2 09:07:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:07:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:06.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:07:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:07.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:08.006 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '78'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:07:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:08.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:09 np0005465988 nova_compute[236126]: 2025-10-02 13:07:09.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:07:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:09.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:07:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:10.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:10 np0005465988 nova_compute[236126]: 2025-10-02 13:07:10.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:07:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:11.268 142241 DEBUG eventlet.wsgi.server [-] (142241) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Oct  2 09:07:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:11.269 142241 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /latest/meta-data/public-ipv4 HTTP/1.0#015
Oct  2 09:07:11 np0005465988 ovn_metadata_agent[142119]: Accept: */*#015
Oct  2 09:07:11 np0005465988 ovn_metadata_agent[142119]: Connection: close#015
Oct  2 09:07:11 np0005465988 ovn_metadata_agent[142119]: Content-Type: text/plain#015
Oct  2 09:07:11 np0005465988 ovn_metadata_agent[142119]: Host: 169.254.169.254#015
Oct  2 09:07:11 np0005465988 ovn_metadata_agent[142119]: User-Agent: curl/7.84.0#015
Oct  2 09:07:11 np0005465988 ovn_metadata_agent[142119]: X-Forwarded-For: 10.100.0.12#015
Oct  2 09:07:11 np0005465988 ovn_metadata_agent[142119]: X-Ovn-Network-Id: 575de21c-6b12-4449-9e8c-4deaa781bf0b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Oct  2 09:07:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:11.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:11.906 142241 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Oct  2 09:07:11 np0005465988 haproxy-metadata-proxy-575de21c-6b12-4449-9e8c-4deaa781bf0b[334367]: 10.100.0.12:49172 [02/Oct/2025:13:07:11.267] listener listener/metadata 0/0/0/639/639 200 135 - - ---- 1/1/0/0/0 0/0 "GET /latest/meta-data/public-ipv4 HTTP/1.1"
Oct  2 09:07:11 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:11.907 142241 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /latest/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.6373291#033[00m
Oct  2 09:07:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:12.024 142241 DEBUG eventlet.wsgi.server [-] (142241) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Oct  2 09:07:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:12.025 142241 DEBUG neutron.agent.ovn.metadata.server [-] Request: POST /openstack/2013-10-17/password HTTP/1.0#015
Oct  2 09:07:12 np0005465988 ovn_metadata_agent[142119]: Accept: */*#015
Oct  2 09:07:12 np0005465988 ovn_metadata_agent[142119]: Connection: close#015
Oct  2 09:07:12 np0005465988 ovn_metadata_agent[142119]: Content-Length: 100#015
Oct  2 09:07:12 np0005465988 ovn_metadata_agent[142119]: Content-Type: application/x-www-form-urlencoded#015
Oct  2 09:07:12 np0005465988 ovn_metadata_agent[142119]: Host: 169.254.169.254#015
Oct  2 09:07:12 np0005465988 ovn_metadata_agent[142119]: User-Agent: curl/7.84.0#015
Oct  2 09:07:12 np0005465988 ovn_metadata_agent[142119]: X-Forwarded-For: 10.100.0.12#015
Oct  2 09:07:12 np0005465988 ovn_metadata_agent[142119]: X-Ovn-Network-Id: 575de21c-6b12-4449-9e8c-4deaa781bf0b#015
Oct  2 09:07:12 np0005465988 ovn_metadata_agent[142119]: #015
Oct  2 09:07:12 np0005465988 ovn_metadata_agent[142119]: testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Oct  2 09:07:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:12.157 142241 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Oct  2 09:07:12 np0005465988 haproxy-metadata-proxy-575de21c-6b12-4449-9e8c-4deaa781bf0b[334367]: 10.100.0.12:49184 [02/Oct/2025:13:07:12.023] listener listener/metadata 0/0/0/135/135 200 118 - - ---- 1/1/0/0/0 0/0 "POST /openstack/2013-10-17/password HTTP/1.1"
Oct  2 09:07:12 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:12.158 142241 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "POST /openstack/2013-10-17/password HTTP/1.1" status: 200  len: 134 time: 0.1327531#033[00m
Oct  2 09:07:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:07:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:12.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:07:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:07:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:13.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.188 2 DEBUG oslo_concurrency.lockutils [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Acquiring lock "50929a91-e103-467c-a547-5597dfc86ac8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.189 2 DEBUG oslo_concurrency.lockutils [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.190 2 DEBUG oslo_concurrency.lockutils [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Acquiring lock "50929a91-e103-467c-a547-5597dfc86ac8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.190 2 DEBUG oslo_concurrency.lockutils [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.191 2 DEBUG oslo_concurrency.lockutils [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.193 2 INFO nova.compute.manager [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Terminating instance#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.194 2 DEBUG nova.compute.manager [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 09:07:14 np0005465988 kernel: tap363647f9-a0 (unregistering): left promiscuous mode
Oct  2 09:07:14 np0005465988 NetworkManager[45041]: <info>  [1759410434.2571] device (tap363647f9-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:14 np0005465988 ovn_controller[132601]: 2025-10-02T13:07:14Z|00942|binding|INFO|Releasing lport 363647f9-a022-4664-8db3-d0fb75c27b3e from this chassis (sb_readonly=0)
Oct  2 09:07:14 np0005465988 ovn_controller[132601]: 2025-10-02T13:07:14Z|00943|binding|INFO|Setting lport 363647f9-a022-4664-8db3-d0fb75c27b3e down in Southbound
Oct  2 09:07:14 np0005465988 ovn_controller[132601]: 2025-10-02T13:07:14Z|00944|binding|INFO|Removing iface tap363647f9-a0 ovn-installed in OVS
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:14.277 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:ba:7a 10.100.0.12'], port_security=['fa:16:3e:2c:ba:7a 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '50929a91-e103-467c-a547-5597dfc86ac8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-575de21c-6b12-4449-9e8c-4deaa781bf0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8be669d844b4165b675a7d02f2d8588', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2d10734b-7a94-47ef-8c22-542476f63a59 a39fb845-c78e-41b0-80c5-2ebd426f9843', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.188'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd42f018-1a0a-4965-b6c3-a22fd79d46e6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=363647f9-a022-4664-8db3-d0fb75c27b3e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:07:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:14.279 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 363647f9-a022-4664-8db3-d0fb75c27b3e in datapath 575de21c-6b12-4449-9e8c-4deaa781bf0b unbound from our chassis#033[00m
Oct  2 09:07:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:14.281 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 575de21c-6b12-4449-9e8c-4deaa781bf0b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 09:07:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:14.284 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[aff700e7-0105-45f2-a52c-02d20cf3b488]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:07:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:14.285 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b namespace which is not needed anymore#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:14 np0005465988 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000ce.scope: Deactivated successfully.
Oct  2 09:07:14 np0005465988 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000ce.scope: Consumed 15.387s CPU time.
Oct  2 09:07:14 np0005465988 systemd-machined[192594]: Machine qemu-98-instance-000000ce terminated.
Oct  2 09:07:14 np0005465988 neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b[334360]: [NOTICE]   (334364) : haproxy version is 2.8.14-c23fe91
Oct  2 09:07:14 np0005465988 neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b[334360]: [NOTICE]   (334364) : path to executable is /usr/sbin/haproxy
Oct  2 09:07:14 np0005465988 neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b[334360]: [WARNING]  (334364) : Exiting Master process...
Oct  2 09:07:14 np0005465988 neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b[334360]: [ALERT]    (334364) : Current worker (334367) exited with code 143 (Terminated)
Oct  2 09:07:14 np0005465988 neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b[334360]: [WARNING]  (334364) : All workers exited. Exiting... (0)
Oct  2 09:07:14 np0005465988 systemd[1]: libpod-74c4f583d820f53c7138e3206c6c7e04ff06b7463c91867e8adceac13f99c0d4.scope: Deactivated successfully.
Oct  2 09:07:14 np0005465988 podman[335833]: 2025-10-02 13:07:14.437409566 +0000 UTC m=+0.054612514 container died 74c4f583d820f53c7138e3206c6c7e04ff06b7463c91867e8adceac13f99c0d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.439 2 INFO nova.virt.libvirt.driver [-] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Instance destroyed successfully.#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.440 2 DEBUG nova.objects.instance [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lazy-loading 'resources' on Instance uuid 50929a91-e103-467c-a547-5597dfc86ac8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.453 2 DEBUG nova.virt.libvirt.vif [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:06:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1745601652',display_name='tempest-TestServerBasicOps-server-1745601652',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1745601652',id=206,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJH6ER6qsk5vIhITuZGAQJN77EvOuMzRdBHmezINmsQ7sYkUFyL7iu5F4znbCp0OZ9y3ZkDbpscct+0VcKpSpFDIlxhwrSMYQkrsrdbQZb7UnWEF5iZfu6WSbVMUUb0a5w==',key_name='tempest-TestServerBasicOps-457309094',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:06:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a8be669d844b4165b675a7d02f2d8588',ramdisk_id='',reservation_id='r-qbqltmho',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerBasicOps-857878669',owner_user_name='tempest-TestServerBasicOps-857878669-project-member',password_0='testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:07:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c3981911a48146829ed4203fd206e201',uuid=50929a91-e103-467c-a547-5597dfc86ac8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "363647f9-a022-4664-8db3-d0fb75c27b3e", "address": 
"fa:16:3e:2c:ba:7a", "network": {"id": "575de21c-6b12-4449-9e8c-4deaa781bf0b", "bridge": "br-int", "label": "tempest-TestServerBasicOps-224744067-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a8be669d844b4165b675a7d02f2d8588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap363647f9-a0", "ovs_interfaceid": "363647f9-a022-4664-8db3-d0fb75c27b3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.454 2 DEBUG nova.network.os_vif_util [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Converting VIF {"id": "363647f9-a022-4664-8db3-d0fb75c27b3e", "address": "fa:16:3e:2c:ba:7a", "network": {"id": "575de21c-6b12-4449-9e8c-4deaa781bf0b", "bridge": "br-int", "label": "tempest-TestServerBasicOps-224744067-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a8be669d844b4165b675a7d02f2d8588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap363647f9-a0", "ovs_interfaceid": "363647f9-a022-4664-8db3-d0fb75c27b3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.455 2 DEBUG nova.network.os_vif_util [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2c:ba:7a,bridge_name='br-int',has_traffic_filtering=True,id=363647f9-a022-4664-8db3-d0fb75c27b3e,network=Network(575de21c-6b12-4449-9e8c-4deaa781bf0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap363647f9-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.455 2 DEBUG os_vif [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2c:ba:7a,bridge_name='br-int',has_traffic_filtering=True,id=363647f9-a022-4664-8db3-d0fb75c27b3e,network=Network(575de21c-6b12-4449-9e8c-4deaa781bf0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap363647f9-a0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.458 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap363647f9-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.464 2 INFO os_vif [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2c:ba:7a,bridge_name='br-int',has_traffic_filtering=True,id=363647f9-a022-4664-8db3-d0fb75c27b3e,network=Network(575de21c-6b12-4449-9e8c-4deaa781bf0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap363647f9-a0')#033[00m
Oct  2 09:07:14 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-74c4f583d820f53c7138e3206c6c7e04ff06b7463c91867e8adceac13f99c0d4-userdata-shm.mount: Deactivated successfully.
Oct  2 09:07:14 np0005465988 systemd[1]: var-lib-containers-storage-overlay-26b093a493d70fd911c4b265461584b962ff14fe6e05526d65cee5851425a572-merged.mount: Deactivated successfully.
Oct  2 09:07:14 np0005465988 podman[335833]: 2025-10-02 13:07:14.483912475 +0000 UTC m=+0.101115413 container cleanup 74c4f583d820f53c7138e3206c6c7e04ff06b7463c91867e8adceac13f99c0d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 09:07:14 np0005465988 systemd[1]: libpod-conmon-74c4f583d820f53c7138e3206c6c7e04ff06b7463c91867e8adceac13f99c0d4.scope: Deactivated successfully.
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.550 2 DEBUG nova.compute.manager [req-eae4a65e-20f4-4e0d-9c09-c1b6dd46ff69 req-8e474f2f-c300-4394-9991-61297607fbc8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Received event network-vif-unplugged-363647f9-a022-4664-8db3-d0fb75c27b3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.551 2 DEBUG oslo_concurrency.lockutils [req-eae4a65e-20f4-4e0d-9c09-c1b6dd46ff69 req-8e474f2f-c300-4394-9991-61297607fbc8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "50929a91-e103-467c-a547-5597dfc86ac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.551 2 DEBUG oslo_concurrency.lockutils [req-eae4a65e-20f4-4e0d-9c09-c1b6dd46ff69 req-8e474f2f-c300-4394-9991-61297607fbc8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.551 2 DEBUG oslo_concurrency.lockutils [req-eae4a65e-20f4-4e0d-9c09-c1b6dd46ff69 req-8e474f2f-c300-4394-9991-61297607fbc8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.552 2 DEBUG nova.compute.manager [req-eae4a65e-20f4-4e0d-9c09-c1b6dd46ff69 req-8e474f2f-c300-4394-9991-61297607fbc8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] No waiting events found dispatching network-vif-unplugged-363647f9-a022-4664-8db3-d0fb75c27b3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.552 2 DEBUG nova.compute.manager [req-eae4a65e-20f4-4e0d-9c09-c1b6dd46ff69 req-8e474f2f-c300-4394-9991-61297607fbc8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Received event network-vif-unplugged-363647f9-a022-4664-8db3-d0fb75c27b3e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 09:07:14 np0005465988 podman[335887]: 2025-10-02 13:07:14.558246186 +0000 UTC m=+0.046946144 container remove 74c4f583d820f53c7138e3206c6c7e04ff06b7463c91867e8adceac13f99c0d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 09:07:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:14.566 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0e947660-de43-4cf5-abf3-b5ae3aa91e6a]: (4, ('Thu Oct  2 01:07:14 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b (74c4f583d820f53c7138e3206c6c7e04ff06b7463c91867e8adceac13f99c0d4)\n74c4f583d820f53c7138e3206c6c7e04ff06b7463c91867e8adceac13f99c0d4\nThu Oct  2 01:07:14 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b (74c4f583d820f53c7138e3206c6c7e04ff06b7463c91867e8adceac13f99c0d4)\n74c4f583d820f53c7138e3206c6c7e04ff06b7463c91867e8adceac13f99c0d4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:07:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:14.568 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[caacb992-5d96-4f8a-8367-b62a91336368]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:07:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:14.570 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap575de21c-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:14.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:14 np0005465988 kernel: tap575de21c-60: left promiscuous mode
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:14.591 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c5f8ae59-9ce6-4102-886b-fd08672eb540]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:07:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:14.634 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ffb761-eddb-4916-80de-fc7e34840db7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:07:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:14.635 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0e17cec1-18ac-43cd-ac55-3d3de26aac32]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:07:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:14.654 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ec43d4-aa80-4ac6-8a34-0e48189589fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 845323, 'reachable_time': 33596, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 335905, 'error': None, 'target': 'ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:07:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:14.657 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-575de21c-6b12-4449-9e8c-4deaa781bf0b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 09:07:14 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:14.658 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[f53c8f48-e8cb-4e49-83be-6af414d7637a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:07:14 np0005465988 systemd[1]: run-netns-ovnmeta\x2d575de21c\x2d6b12\x2d4449\x2d9e8c\x2d4deaa781bf0b.mount: Deactivated successfully.
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.902 2 INFO nova.virt.libvirt.driver [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Deleting instance files /var/lib/nova/instances/50929a91-e103-467c-a547-5597dfc86ac8_del#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.904 2 INFO nova.virt.libvirt.driver [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Deletion of /var/lib/nova/instances/50929a91-e103-467c-a547-5597dfc86ac8_del complete#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.972 2 INFO nova.compute.manager [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Took 0.78 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.972 2 DEBUG oslo.service.loopingcall [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.972 2 DEBUG nova.compute.manager [-] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 09:07:14 np0005465988 nova_compute[236126]: 2025-10-02 13:07:14.973 2 DEBUG nova.network.neutron [-] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 09:07:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:15.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:15 np0005465988 nova_compute[236126]: 2025-10-02 13:07:15.918 2 DEBUG nova.network.neutron [-] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:07:15 np0005465988 nova_compute[236126]: 2025-10-02 13:07:15.940 2 INFO nova.compute.manager [-] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Took 0.97 seconds to deallocate network for instance.#033[00m
Oct  2 09:07:15 np0005465988 nova_compute[236126]: 2025-10-02 13:07:15.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:15 np0005465988 nova_compute[236126]: 2025-10-02 13:07:15.990 2 DEBUG oslo_concurrency.lockutils [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:07:15 np0005465988 nova_compute[236126]: 2025-10-02 13:07:15.991 2 DEBUG oslo_concurrency.lockutils [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:07:16 np0005465988 nova_compute[236126]: 2025-10-02 13:07:16.043 2 DEBUG oslo_concurrency.processutils [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:07:16 np0005465988 nova_compute[236126]: 2025-10-02 13:07:16.094 2 DEBUG nova.compute.manager [req-475c81d9-e001-4491-809f-83e303810757 req-222d8dc9-b656-481d-9d10-8ad968ae4c34 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Received event network-vif-deleted-363647f9-a022-4664-8db3-d0fb75c27b3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:07:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:07:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:07:16 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1708553005' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:07:16 np0005465988 nova_compute[236126]: 2025-10-02 13:07:16.553 2 DEBUG oslo_concurrency.processutils [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:07:16 np0005465988 nova_compute[236126]: 2025-10-02 13:07:16.562 2 DEBUG nova.compute.provider_tree [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:07:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:16.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:16 np0005465988 nova_compute[236126]: 2025-10-02 13:07:16.581 2 DEBUG nova.scheduler.client.report [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:07:16 np0005465988 nova_compute[236126]: 2025-10-02 13:07:16.610 2 DEBUG oslo_concurrency.lockutils [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:07:16 np0005465988 nova_compute[236126]: 2025-10-02 13:07:16.673 2 INFO nova.scheduler.client.report [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Deleted allocations for instance 50929a91-e103-467c-a547-5597dfc86ac8#033[00m
Oct  2 09:07:16 np0005465988 nova_compute[236126]: 2025-10-02 13:07:16.719 2 DEBUG nova.compute.manager [req-3a52a75c-d868-4e75-8f4d-89b8a75f4362 req-0b74ea37-cb50-4e11-a5b0-d6440dc68367 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Received event network-vif-plugged-363647f9-a022-4664-8db3-d0fb75c27b3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:07:16 np0005465988 nova_compute[236126]: 2025-10-02 13:07:16.719 2 DEBUG oslo_concurrency.lockutils [req-3a52a75c-d868-4e75-8f4d-89b8a75f4362 req-0b74ea37-cb50-4e11-a5b0-d6440dc68367 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "50929a91-e103-467c-a547-5597dfc86ac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:07:16 np0005465988 nova_compute[236126]: 2025-10-02 13:07:16.720 2 DEBUG oslo_concurrency.lockutils [req-3a52a75c-d868-4e75-8f4d-89b8a75f4362 req-0b74ea37-cb50-4e11-a5b0-d6440dc68367 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:07:16 np0005465988 nova_compute[236126]: 2025-10-02 13:07:16.720 2 DEBUG oslo_concurrency.lockutils [req-3a52a75c-d868-4e75-8f4d-89b8a75f4362 req-0b74ea37-cb50-4e11-a5b0-d6440dc68367 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:07:16 np0005465988 nova_compute[236126]: 2025-10-02 13:07:16.720 2 DEBUG nova.compute.manager [req-3a52a75c-d868-4e75-8f4d-89b8a75f4362 req-0b74ea37-cb50-4e11-a5b0-d6440dc68367 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] No waiting events found dispatching network-vif-plugged-363647f9-a022-4664-8db3-d0fb75c27b3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 09:07:16 np0005465988 nova_compute[236126]: 2025-10-02 13:07:16.720 2 WARNING nova.compute.manager [req-3a52a75c-d868-4e75-8f4d-89b8a75f4362 req-0b74ea37-cb50-4e11-a5b0-d6440dc68367 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Received unexpected event network-vif-plugged-363647f9-a022-4664-8db3-d0fb75c27b3e for instance with vm_state deleted and task_state None.
Oct  2 09:07:16 np0005465988 nova_compute[236126]: 2025-10-02 13:07:16.762 2 DEBUG oslo_concurrency.lockutils [None req-91f05d41-d6ff-416b-9b78-d6c9181f5160 c3981911a48146829ed4203fd206e201 a8be669d844b4165b675a7d02f2d8588 - - default default] Lock "50929a91-e103-467c-a547-5597dfc86ac8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:07:17 np0005465988 podman[335932]: 2025-10-02 13:07:17.527839364 +0000 UTC m=+0.059216136 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:07:17 np0005465988 podman[335933]: 2025-10-02 13:07:17.535278258 +0000 UTC m=+0.066428724 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 09:07:17 np0005465988 podman[335931]: 2025-10-02 13:07:17.566451656 +0000 UTC m=+0.100587768 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Oct  2 09:07:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:17.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e400 e400: 3 total, 3 up, 3 in
Oct  2 09:07:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:18.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:19 np0005465988 nova_compute[236126]: 2025-10-02 13:07:19.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:07:19 np0005465988 nova_compute[236126]: 2025-10-02 13:07:19.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:07:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:07:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:19.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:07:20 np0005465988 nova_compute[236126]: 2025-10-02 13:07:20.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:07:20 np0005465988 nova_compute[236126]: 2025-10-02 13:07:20.496 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:07:20 np0005465988 nova_compute[236126]: 2025-10-02 13:07:20.497 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:07:20 np0005465988 nova_compute[236126]: 2025-10-02 13:07:20.497 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:07:20 np0005465988 nova_compute[236126]: 2025-10-02 13:07:20.497 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  2 09:07:20 np0005465988 nova_compute[236126]: 2025-10-02 13:07:20.498 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:07:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:07:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:20.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:07:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:07:20 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1844808281' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:07:20 np0005465988 nova_compute[236126]: 2025-10-02 13:07:20.954 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 09:07:20 np0005465988 nova_compute[236126]: 2025-10-02 13:07:20.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:07:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:07:21 np0005465988 nova_compute[236126]: 2025-10-02 13:07:21.153 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 09:07:21 np0005465988 nova_compute[236126]: 2025-10-02 13:07:21.155 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4023MB free_disk=20.93163299560547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  2 09:07:21 np0005465988 nova_compute[236126]: 2025-10-02 13:07:21.155 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:07:21 np0005465988 nova_compute[236126]: 2025-10-02 13:07:21.155 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:07:21 np0005465988 nova_compute[236126]: 2025-10-02 13:07:21.293 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 09:07:21 np0005465988 nova_compute[236126]: 2025-10-02 13:07:21.293 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 09:07:21 np0005465988 nova_compute[236126]: 2025-10-02 13:07:21.391 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:07:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:07:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:21.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:07:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:07:21 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2798188944' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:07:21 np0005465988 nova_compute[236126]: 2025-10-02 13:07:21.873 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 09:07:21 np0005465988 nova_compute[236126]: 2025-10-02 13:07:21.880 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 09:07:21 np0005465988 nova_compute[236126]: 2025-10-02 13:07:21.900 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 09:07:21 np0005465988 nova_compute[236126]: 2025-10-02 13:07:21.926 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 09:07:21 np0005465988 nova_compute[236126]: 2025-10-02 13:07:21.927 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.772s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:07:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:22.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:23.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:24 np0005465988 nova_compute[236126]: 2025-10-02 13:07:24.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:07:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:24.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:25.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:25 np0005465988 nova_compute[236126]: 2025-10-02 13:07:25.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:07:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:07:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:26.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:27.414 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:07:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:27.415 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:07:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:07:27.415 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:07:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:27.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:27 np0005465988 nova_compute[236126]: 2025-10-02 13:07:27.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:07:28 np0005465988 nova_compute[236126]: 2025-10-02 13:07:28.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:07:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:28.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e401 e401: 3 total, 3 up, 3 in
Oct  2 09:07:29 np0005465988 nova_compute[236126]: 2025-10-02 13:07:29.434 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759410434.4309795, 50929a91-e103-467c-a547-5597dfc86ac8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 09:07:29 np0005465988 nova_compute[236126]: 2025-10-02 13:07:29.435 2 INFO nova.compute.manager [-] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] VM Stopped (Lifecycle Event)
Oct  2 09:07:29 np0005465988 nova_compute[236126]: 2025-10-02 13:07:29.456 2 DEBUG nova.compute.manager [None req-0bd21108-d8df-42c9-afd3-4ac8dc8ab162 - - - - - -] [instance: 50929a91-e103-467c-a547-5597dfc86ac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 09:07:29 np0005465988 nova_compute[236126]: 2025-10-02 13:07:29.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:07:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:29.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:30.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:30 np0005465988 nova_compute[236126]: 2025-10-02 13:07:30.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:07:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:07:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:31.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:07:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:32.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:07:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:33.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:34 np0005465988 nova_compute[236126]: 2025-10-02 13:07:34.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:07:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:34.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:34 np0005465988 nova_compute[236126]: 2025-10-02 13:07:34.929 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:07:35 np0005465988 nova_compute[236126]: 2025-10-02 13:07:35.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:07:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:35.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:35 np0005465988 nova_compute[236126]: 2025-10-02 13:07:35.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:07:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e402 e402: 3 total, 3 up, 3 in
Oct  2 09:07:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:07:36 np0005465988 nova_compute[236126]: 2025-10-02 13:07:36.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:07:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:07:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:36.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:07:37 np0005465988 nova_compute[236126]: 2025-10-02 13:07:37.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:07:37 np0005465988 nova_compute[236126]: 2025-10-02 13:07:37.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:07:37 np0005465988 nova_compute[236126]: 2025-10-02 13:07:37.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 09:07:37 np0005465988 podman[336097]: 2025-10-02 13:07:37.513135796 +0000 UTC m=+0.050582418 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 09:07:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:37.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:38 np0005465988 nova_compute[236126]: 2025-10-02 13:07:38.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:07:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:38.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:07:39 np0005465988 nova_compute[236126]: 2025-10-02 13:07:39.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:39.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:40.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:40 np0005465988 nova_compute[236126]: 2025-10-02 13:07:40.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:07:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:41.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:42 np0005465988 nova_compute[236126]: 2025-10-02 13:07:42.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:42 np0005465988 nova_compute[236126]: 2025-10-02 13:07:42.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:07:42 np0005465988 nova_compute[236126]: 2025-10-02 13:07:42.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:07:42 np0005465988 nova_compute[236126]: 2025-10-02 13:07:42.515 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:07:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:42.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:43.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:44 np0005465988 nova_compute[236126]: 2025-10-02 13:07:44.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:44.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:45.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:46 np0005465988 nova_compute[236126]: 2025-10-02 13:07:46.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:07:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:07:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:46.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:07:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:07:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:07:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:07:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:07:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:07:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:07:47 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:07:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:07:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:47.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:07:48 np0005465988 podman[336306]: 2025-10-02 13:07:48.531088745 +0000 UTC m=+0.060085891 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 09:07:48 np0005465988 podman[336307]: 2025-10-02 13:07:48.539635562 +0000 UTC m=+0.066560168 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 09:07:48 np0005465988 podman[336305]: 2025-10-02 13:07:48.560985406 +0000 UTC m=+0.095359127 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 09:07:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:48.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:49 np0005465988 nova_compute[236126]: 2025-10-02 13:07:49.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:49.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:07:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:50.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:07:51 np0005465988 nova_compute[236126]: 2025-10-02 13:07:51.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:07:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:51.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:52.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:53 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:07:53 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:07:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:53.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:54 np0005465988 nova_compute[236126]: 2025-10-02 13:07:54.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:07:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:54.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:07:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 09:07:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3753809649' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 09:07:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 09:07:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3753809649' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 09:07:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:55.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:56 np0005465988 nova_compute[236126]: 2025-10-02 13:07:56.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:07:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:56.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:57.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:07:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:07:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:07:58.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:07:59 np0005465988 nova_compute[236126]: 2025-10-02 13:07:59.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:07:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:07:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:07:59.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:00.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:01 np0005465988 nova_compute[236126]: 2025-10-02 13:08:01.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:08:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:01.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:02.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:03.635 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=79, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=78) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:08:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:03.636 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:08:03 np0005465988 nova_compute[236126]: 2025-10-02 13:08:03.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:08:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:03.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:08:04 np0005465988 nova_compute[236126]: 2025-10-02 13:08:04.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:04.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:05.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:06 np0005465988 nova_compute[236126]: 2025-10-02 13:08:06.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:08:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:06.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:07.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:08 np0005465988 podman[336476]: 2025-10-02 13:08:08.533725257 +0000 UTC m=+0.067918757 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  2 09:08:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:08.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:09 np0005465988 nova_compute[236126]: 2025-10-02 13:08:09.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:09.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:10.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:11 np0005465988 nova_compute[236126]: 2025-10-02 13:08:11.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:08:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:11.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:12.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:13.638 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '79'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:08:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:13.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:14 np0005465988 nova_compute[236126]: 2025-10-02 13:08:14.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:14.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:15.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:16 np0005465988 nova_compute[236126]: 2025-10-02 13:08:16.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:08:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:16.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:17.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:18.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:19 np0005465988 nova_compute[236126]: 2025-10-02 13:08:19.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:19 np0005465988 podman[336504]: 2025-10-02 13:08:19.534152003 +0000 UTC m=+0.067026561 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 09:08:19 np0005465988 podman[336502]: 2025-10-02 13:08:19.553205762 +0000 UTC m=+0.092606408 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 09:08:19 np0005465988 podman[336503]: 2025-10-02 13:08:19.553568713 +0000 UTC m=+0.089672904 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  2 09:08:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:19.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:20 np0005465988 nova_compute[236126]: 2025-10-02 13:08:20.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:08:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:20.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:20 np0005465988 ovn_controller[132601]: 2025-10-02T13:08:20Z|00945|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  2 09:08:21 np0005465988 nova_compute[236126]: 2025-10-02 13:08:21.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:08:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:21.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:22 np0005465988 nova_compute[236126]: 2025-10-02 13:08:22.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:08:22 np0005465988 nova_compute[236126]: 2025-10-02 13:08:22.540 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:22 np0005465988 nova_compute[236126]: 2025-10-02 13:08:22.541 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:22 np0005465988 nova_compute[236126]: 2025-10-02 13:08:22.541 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:08:22 np0005465988 nova_compute[236126]: 2025-10-02 13:08:22.542 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:08:22 np0005465988 nova_compute[236126]: 2025-10-02 13:08:22.542 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:08:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:22.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:08:23 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/329128017' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:08:23 np0005465988 nova_compute[236126]: 2025-10-02 13:08:23.037 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:08:23 np0005465988 nova_compute[236126]: 2025-10-02 13:08:23.251 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:08:23 np0005465988 nova_compute[236126]: 2025-10-02 13:08:23.253 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4027MB free_disk=20.946483612060547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:08:23 np0005465988 nova_compute[236126]: 2025-10-02 13:08:23.254 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:23 np0005465988 nova_compute[236126]: 2025-10-02 13:08:23.254 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:23 np0005465988 nova_compute[236126]: 2025-10-02 13:08:23.320 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:08:23 np0005465988 nova_compute[236126]: 2025-10-02 13:08:23.321 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:08:23 np0005465988 nova_compute[236126]: 2025-10-02 13:08:23.384 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:08:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:23.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:08:23 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1850533019' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:08:23 np0005465988 nova_compute[236126]: 2025-10-02 13:08:23.871 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:08:23 np0005465988 nova_compute[236126]: 2025-10-02 13:08:23.879 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:08:23 np0005465988 nova_compute[236126]: 2025-10-02 13:08:23.897 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:08:23 np0005465988 nova_compute[236126]: 2025-10-02 13:08:23.899 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:08:23 np0005465988 nova_compute[236126]: 2025-10-02 13:08:23.900 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:08:24 np0005465988 nova_compute[236126]: 2025-10-02 13:08:24.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:24.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:25.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:26 np0005465988 nova_compute[236126]: 2025-10-02 13:08:26.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:08:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:26.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:27.415 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:27.416 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:27.416 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:08:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:27.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:28.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:29 np0005465988 nova_compute[236126]: 2025-10-02 13:08:29.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:29.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:08:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:30.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:08:31 np0005465988 nova_compute[236126]: 2025-10-02 13:08:31.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:08:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:31.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:32.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:33.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:34 np0005465988 nova_compute[236126]: 2025-10-02 13:08:34.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:34.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e403 e403: 3 total, 3 up, 3 in
Oct  2 09:08:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:35.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #160. Immutable memtables: 0.
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.795720) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 101] Flushing memtable with next log file: 160
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410515795789, "job": 101, "event": "flush_started", "num_memtables": 1, "num_entries": 2086, "num_deletes": 252, "total_data_size": 4940203, "memory_usage": 5002704, "flush_reason": "Manual Compaction"}
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 101] Level-0 flush table #161: started
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410515812219, "cf_name": "default", "job": 101, "event": "table_file_creation", "file_number": 161, "file_size": 3236545, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 77472, "largest_seqno": 79553, "table_properties": {"data_size": 3227977, "index_size": 5253, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2245, "raw_key_size": 17984, "raw_average_key_size": 20, "raw_value_size": 3210826, "raw_average_value_size": 3656, "num_data_blocks": 228, "num_entries": 878, "num_filter_entries": 878, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759410336, "oldest_key_time": 1759410336, "file_creation_time": 1759410515, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 161, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 101] Flush lasted 16551 microseconds, and 6894 cpu microseconds.
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.812277) [db/flush_job.cc:967] [default] [JOB 101] Level-0 flush table #161: 3236545 bytes OK
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.812305) [db/memtable_list.cc:519] [default] Level-0 commit table #161 started
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.814258) [db/memtable_list.cc:722] [default] Level-0 commit table #161: memtable #1 done
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.814275) EVENT_LOG_v1 {"time_micros": 1759410515814268, "job": 101, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.814296) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 101] Try to delete WAL files size 4930977, prev total WAL file size 4930977, number of live WAL files 2.
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000157.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.815623) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036373737' seq:72057594037927935, type:22 .. '7061786F730037303239' seq:0, type:0; will stop at (end)
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 102] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 101 Base level 0, inputs: [161(3160KB)], [159(10210KB)]
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410515815686, "job": 102, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [161], "files_L6": [159], "score": -1, "input_data_size": 13691956, "oldest_snapshot_seqno": -1}
Oct  2 09:08:35 np0005465988 nova_compute[236126]: 2025-10-02 13:08:35.901 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 102] Generated table #162: 9956 keys, 11731725 bytes, temperature: kUnknown
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410515912399, "cf_name": "default", "job": 102, "event": "table_file_creation", "file_number": 162, "file_size": 11731725, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11669175, "index_size": 36581, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24901, "raw_key_size": 262504, "raw_average_key_size": 26, "raw_value_size": 11496527, "raw_average_value_size": 1154, "num_data_blocks": 1386, "num_entries": 9956, "num_filter_entries": 9956, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759410515, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 162, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.912691) [db/compaction/compaction_job.cc:1663] [default] [JOB 102] Compacted 1@0 + 1@6 files to L6 => 11731725 bytes
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.917972) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 141.5 rd, 121.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 10.0 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(7.9) write-amplify(3.6) OK, records in: 10481, records dropped: 525 output_compression: NoCompression
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.918020) EVENT_LOG_v1 {"time_micros": 1759410515918001, "job": 102, "event": "compaction_finished", "compaction_time_micros": 96795, "compaction_time_cpu_micros": 28555, "output_level": 6, "num_output_files": 1, "total_output_size": 11731725, "num_input_records": 10481, "num_output_records": 9956, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000161.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410515918689, "job": 102, "event": "table_file_deletion", "file_number": 161}
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000159.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410515920623, "job": 102, "event": "table_file_deletion", "file_number": 159}
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.815522) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.920788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.920798) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.920801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.920803) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:08:35 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:08:35.920805) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:08:36 np0005465988 nova_compute[236126]: 2025-10-02 13:08:36.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e403 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:08:36 np0005465988 nova_compute[236126]: 2025-10-02 13:08:36.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:08:36 np0005465988 nova_compute[236126]: 2025-10-02 13:08:36.607 2 DEBUG nova.compute.manager [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Oct  2 09:08:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:36.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:36 np0005465988 nova_compute[236126]: 2025-10-02 13:08:36.723 2 DEBUG oslo_concurrency.lockutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:36 np0005465988 nova_compute[236126]: 2025-10-02 13:08:36.724 2 DEBUG oslo_concurrency.lockutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e404 e404: 3 total, 3 up, 3 in
Oct  2 09:08:36 np0005465988 nova_compute[236126]: 2025-10-02 13:08:36.856 2 DEBUG nova.objects.instance [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lazy-loading 'pci_requests' on Instance uuid 6df9bd3b-6218-4859-aba9-bfbedf2b8f18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:08:36 np0005465988 nova_compute[236126]: 2025-10-02 13:08:36.893 2 DEBUG nova.virt.hardware [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 09:08:36 np0005465988 nova_compute[236126]: 2025-10-02 13:08:36.894 2 INFO nova.compute.claims [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 09:08:36 np0005465988 nova_compute[236126]: 2025-10-02 13:08:36.894 2 DEBUG nova.objects.instance [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lazy-loading 'resources' on Instance uuid 6df9bd3b-6218-4859-aba9-bfbedf2b8f18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:08:36 np0005465988 nova_compute[236126]: 2025-10-02 13:08:36.917 2 DEBUG nova.objects.instance [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lazy-loading 'numa_topology' on Instance uuid 6df9bd3b-6218-4859-aba9-bfbedf2b8f18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:08:36 np0005465988 nova_compute[236126]: 2025-10-02 13:08:36.964 2 DEBUG nova.objects.instance [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lazy-loading 'pci_devices' on Instance uuid 6df9bd3b-6218-4859-aba9-bfbedf2b8f18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:08:37 np0005465988 nova_compute[236126]: 2025-10-02 13:08:37.112 2 INFO nova.compute.resource_tracker [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Updating resource usage from migration aecd633a-2b42-456e-a50c-d6475dc25816#033[00m
Oct  2 09:08:37 np0005465988 nova_compute[236126]: 2025-10-02 13:08:37.112 2 DEBUG nova.compute.resource_tracker [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Starting to track incoming migration aecd633a-2b42-456e-a50c-d6475dc25816 with flavor cef129e5-cce4-4465-9674-03d3559e8a14 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Oct  2 09:08:37 np0005465988 nova_compute[236126]: 2025-10-02 13:08:37.165 2 DEBUG oslo_concurrency.processutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:08:37 np0005465988 nova_compute[236126]: 2025-10-02 13:08:37.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:08:37 np0005465988 nova_compute[236126]: 2025-10-02 13:08:37.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:08:37 np0005465988 nova_compute[236126]: 2025-10-02 13:08:37.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:08:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:08:37 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3513952075' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:08:37 np0005465988 nova_compute[236126]: 2025-10-02 13:08:37.661 2 DEBUG oslo_concurrency.processutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:08:37 np0005465988 nova_compute[236126]: 2025-10-02 13:08:37.673 2 DEBUG nova.compute.provider_tree [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:08:37 np0005465988 nova_compute[236126]: 2025-10-02 13:08:37.692 2 DEBUG nova.scheduler.client.report [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:08:37 np0005465988 nova_compute[236126]: 2025-10-02 13:08:37.713 2 DEBUG oslo_concurrency.lockutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.988s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:08:37 np0005465988 nova_compute[236126]: 2025-10-02 13:08:37.713 2 INFO nova.compute.manager [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Migrating#033[00m
Oct  2 09:08:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:37.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e405 e405: 3 total, 3 up, 3 in
Oct  2 09:08:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:38.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:39 np0005465988 nova_compute[236126]: 2025-10-02 13:08:39.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:08:39 np0005465988 nova_compute[236126]: 2025-10-02 13:08:39.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:39 np0005465988 podman[336695]: 2025-10-02 13:08:39.553632751 +0000 UTC m=+0.090872128 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 09:08:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:08:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:39.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:08:40 np0005465988 systemd[1]: Created slice User Slice of UID 42436.
Oct  2 09:08:40 np0005465988 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  2 09:08:40 np0005465988 systemd-logind[827]: New session 56 of user nova.
Oct  2 09:08:40 np0005465988 nova_compute[236126]: 2025-10-02 13:08:40.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:08:40 np0005465988 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  2 09:08:40 np0005465988 systemd[1]: Starting User Manager for UID 42436...
Oct  2 09:08:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:40.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:40 np0005465988 systemd[336719]: Queued start job for default target Main User Target.
Oct  2 09:08:40 np0005465988 systemd[336719]: Created slice User Application Slice.
Oct  2 09:08:40 np0005465988 systemd[336719]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 09:08:40 np0005465988 systemd[336719]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 09:08:40 np0005465988 systemd[336719]: Reached target Paths.
Oct  2 09:08:40 np0005465988 systemd[336719]: Reached target Timers.
Oct  2 09:08:40 np0005465988 systemd[336719]: Starting D-Bus User Message Bus Socket...
Oct  2 09:08:40 np0005465988 systemd[336719]: Starting Create User's Volatile Files and Directories...
Oct  2 09:08:40 np0005465988 systemd[336719]: Finished Create User's Volatile Files and Directories.
Oct  2 09:08:40 np0005465988 systemd[336719]: Listening on D-Bus User Message Bus Socket.
Oct  2 09:08:40 np0005465988 systemd[336719]: Reached target Sockets.
Oct  2 09:08:40 np0005465988 systemd[336719]: Reached target Basic System.
Oct  2 09:08:40 np0005465988 systemd[336719]: Reached target Main User Target.
Oct  2 09:08:40 np0005465988 systemd[336719]: Startup finished in 208ms.
Oct  2 09:08:40 np0005465988 systemd[1]: Started User Manager for UID 42436.
Oct  2 09:08:40 np0005465988 systemd[1]: Started Session 56 of User nova.
Oct  2 09:08:40 np0005465988 systemd[1]: session-56.scope: Deactivated successfully.
Oct  2 09:08:40 np0005465988 systemd-logind[827]: Session 56 logged out. Waiting for processes to exit.
Oct  2 09:08:40 np0005465988 systemd-logind[827]: Removed session 56.
Oct  2 09:08:40 np0005465988 systemd-logind[827]: New session 58 of user nova.
Oct  2 09:08:40 np0005465988 systemd[1]: Started Session 58 of User nova.
Oct  2 09:08:41 np0005465988 systemd[1]: session-58.scope: Deactivated successfully.
Oct  2 09:08:41 np0005465988 systemd-logind[827]: Session 58 logged out. Waiting for processes to exit.
Oct  2 09:08:41 np0005465988 systemd-logind[827]: Removed session 58.
Oct  2 09:08:41 np0005465988 nova_compute[236126]: 2025-10-02 13:08:41.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e405 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:08:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:41.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:42.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:43.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.192 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Acquiring lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.193 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.211 2 DEBUG nova.compute.manager [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.287 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.288 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.295 2 DEBUG nova.virt.hardware [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.295 2 INFO nova.compute.claims [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.370 2 DEBUG nova.compute.manager [req-2356677a-cf58-4765-a0bf-6779f2b6532d req-c5eabc61-e3c0-498e-8daa-9a4974388c5b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received event network-vif-unplugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.371 2 DEBUG oslo_concurrency.lockutils [req-2356677a-cf58-4765-a0bf-6779f2b6532d req-c5eabc61-e3c0-498e-8daa-9a4974388c5b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.371 2 DEBUG oslo_concurrency.lockutils [req-2356677a-cf58-4765-a0bf-6779f2b6532d req-c5eabc61-e3c0-498e-8daa-9a4974388c5b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.371 2 DEBUG oslo_concurrency.lockutils [req-2356677a-cf58-4765-a0bf-6779f2b6532d req-c5eabc61-e3c0-498e-8daa-9a4974388c5b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.372 2 DEBUG nova.compute.manager [req-2356677a-cf58-4765-a0bf-6779f2b6532d req-c5eabc61-e3c0-498e-8daa-9a4974388c5b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] No waiting events found dispatching network-vif-unplugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.372 2 WARNING nova.compute.manager [req-2356677a-cf58-4765-a0bf-6779f2b6532d req-c5eabc61-e3c0-498e-8daa-9a4974388c5b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received unexpected event network-vif-unplugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.476 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.477 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.477 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.533 2 DEBUG oslo_concurrency.processutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.581 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.582 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.582 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.583 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 09:08:44 np0005465988 nova_compute[236126]: 2025-10-02 13:08:44.583 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6df9bd3b-6218-4859-aba9-bfbedf2b8f18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:08:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:44.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:08:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3354737208' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.054 2 DEBUG oslo_concurrency.processutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.062 2 DEBUG nova.compute.provider_tree [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.119 2 DEBUG nova.scheduler.client.report [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.158 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.870s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.159 2 DEBUG nova.compute.manager [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.255 2 DEBUG nova.compute.manager [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.256 2 DEBUG nova.network.neutron [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.291 2 INFO nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.315 2 DEBUG nova.compute.manager [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.466 2 DEBUG nova.compute.manager [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.468 2 DEBUG nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.468 2 INFO nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Creating image(s)#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.502 2 DEBUG nova.storage.rbd_utils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] rbd image 0f002be2-0f9d-4b3b-a8b2-552c569f0d28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.539 2 DEBUG nova.storage.rbd_utils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] rbd image 0f002be2-0f9d-4b3b-a8b2-552c569f0d28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.597 2 DEBUG nova.storage.rbd_utils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] rbd image 0f002be2-0f9d-4b3b-a8b2-552c569f0d28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.602 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Acquiring lock "139b6ebefb6dec1e7a575771f42f773f500d2a8d" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.603 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "139b6ebefb6dec1e7a575771f42f773f500d2a8d" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.607 2 INFO nova.network.neutron [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Updating port 06105eee-1ccc-4976-9ef2-84b4765d9a79 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Oct  2 09:08:45 np0005465988 nova_compute[236126]: 2025-10-02 13:08:45.777 2 DEBUG nova.policy [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '29c8a28c5bdd4feb9412127428bf0c3b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '60bfd415ee154615b20dd99528061614', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 09:08:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:45.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e406 e406: 3 total, 3 up, 3 in
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.115 2 DEBUG nova.virt.libvirt.imagebackend [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Image locations are: [{'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/a1241e22-2f30-42e3-8072-a21ad0ab0f69/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/a1241e22-2f30-42e3-8072-a21ad0ab0f69/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Oct  2 09:08:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.188 2 DEBUG nova.virt.libvirt.imagebackend [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Selected location: {'url': 'rbd://fd4c5763-22d1-50ea-ad0b-96a3dc3040b2/images/a1241e22-2f30-42e3-8072-a21ad0ab0f69/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.189 2 DEBUG nova.storage.rbd_utils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] cloning images/a1241e22-2f30-42e3-8072-a21ad0ab0f69@snap to None/0f002be2-0f9d-4b3b-a8b2-552c569f0d28_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.324 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "139b6ebefb6dec1e7a575771f42f773f500d2a8d" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.477 2 DEBUG nova.objects.instance [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lazy-loading 'migration_context' on Instance uuid 0f002be2-0f9d-4b3b-a8b2-552c569f0d28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.593 2 DEBUG nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.594 2 DEBUG nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Ensure instance console log exists: /var/lib/nova/instances/0f002be2-0f9d-4b3b-a8b2-552c569f0d28/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.594 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.595 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.595 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.630 2 DEBUG nova.compute.manager [req-85a86fbd-a37e-4386-9fef-614c0c1f970d req-4010acd0-2b12-46de-9310-67052e749d97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received event network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.631 2 DEBUG oslo_concurrency.lockutils [req-85a86fbd-a37e-4386-9fef-614c0c1f970d req-4010acd0-2b12-46de-9310-67052e749d97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.631 2 DEBUG oslo_concurrency.lockutils [req-85a86fbd-a37e-4386-9fef-614c0c1f970d req-4010acd0-2b12-46de-9310-67052e749d97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.631 2 DEBUG oslo_concurrency.lockutils [req-85a86fbd-a37e-4386-9fef-614c0c1f970d req-4010acd0-2b12-46de-9310-67052e749d97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.632 2 DEBUG nova.compute.manager [req-85a86fbd-a37e-4386-9fef-614c0c1f970d req-4010acd0-2b12-46de-9310-67052e749d97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] No waiting events found dispatching network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:08:46 np0005465988 nova_compute[236126]: 2025-10-02 13:08:46.632 2 WARNING nova.compute.manager [req-85a86fbd-a37e-4386-9fef-614c0c1f970d req-4010acd0-2b12-46de-9310-67052e749d97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received unexpected event network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 09:08:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:46.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:08:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:47.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:08:48 np0005465988 nova_compute[236126]: 2025-10-02 13:08:48.615 2 DEBUG oslo_concurrency.lockutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Acquiring lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:08:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:48.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:48 np0005465988 nova_compute[236126]: 2025-10-02 13:08:48.754 2 DEBUG nova.compute.manager [req-d994fe06-4ab6-4653-a41f-15929b58a8a3 req-f62c90e3-135b-49b5-a448-88e26aeb259d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received event network-changed-06105eee-1ccc-4976-9ef2-84b4765d9a79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:08:48 np0005465988 nova_compute[236126]: 2025-10-02 13:08:48.754 2 DEBUG nova.compute.manager [req-d994fe06-4ab6-4653-a41f-15929b58a8a3 req-f62c90e3-135b-49b5-a448-88e26aeb259d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Refreshing instance network info cache due to event network-changed-06105eee-1ccc-4976-9ef2-84b4765d9a79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:08:48 np0005465988 nova_compute[236126]: 2025-10-02 13:08:48.755 2 DEBUG oslo_concurrency.lockutils [req-d994fe06-4ab6-4653-a41f-15929b58a8a3 req-f62c90e3-135b-49b5-a448-88e26aeb259d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:08:48 np0005465988 nova_compute[236126]: 2025-10-02 13:08:48.955 2 DEBUG nova.network.neutron [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Successfully created port: 7e799a10-8b7f-44c3-b57d-de4a1255aad4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 09:08:48 np0005465988 nova_compute[236126]: 2025-10-02 13:08:48.979 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Updating instance_info_cache with network_info: [{"id": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "address": "fa:16:3e:3d:d7:ed", "network": {"id": "41354ccc-5b80-451f-9510-2c3d0788ecf7", "bridge": "br-int", "label": "tempest-network-smoke--166998389", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06105eee-1c", "ovs_interfaceid": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:08:49 np0005465988 nova_compute[236126]: 2025-10-02 13:08:49.039 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:08:49 np0005465988 nova_compute[236126]: 2025-10-02 13:08:49.040 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 09:08:49 np0005465988 nova_compute[236126]: 2025-10-02 13:08:49.040 2 DEBUG oslo_concurrency.lockutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Acquired lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:08:49 np0005465988 nova_compute[236126]: 2025-10-02 13:08:49.040 2 DEBUG nova.network.neutron [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 09:08:49 np0005465988 nova_compute[236126]: 2025-10-02 13:08:49.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:49.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:50 np0005465988 podman[336997]: 2025-10-02 13:08:50.55040856 +0000 UTC m=+0.073222120 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Oct  2 09:08:50 np0005465988 podman[336996]: 2025-10-02 13:08:50.572524357 +0000 UTC m=+0.097794008 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:08:50 np0005465988 podman[336995]: 2025-10-02 13:08:50.58929531 +0000 UTC m=+0.109804244 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:08:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:50.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:51 np0005465988 nova_compute[236126]: 2025-10-02 13:08:51.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:08:51 np0005465988 systemd[1]: Stopping User Manager for UID 42436...
Oct  2 09:08:51 np0005465988 systemd[336719]: Activating special unit Exit the Session...
Oct  2 09:08:51 np0005465988 systemd[336719]: Stopped target Main User Target.
Oct  2 09:08:51 np0005465988 systemd[336719]: Stopped target Basic System.
Oct  2 09:08:51 np0005465988 systemd[336719]: Stopped target Paths.
Oct  2 09:08:51 np0005465988 systemd[336719]: Stopped target Sockets.
Oct  2 09:08:51 np0005465988 systemd[336719]: Stopped target Timers.
Oct  2 09:08:51 np0005465988 systemd[336719]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  2 09:08:51 np0005465988 systemd[336719]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 09:08:51 np0005465988 systemd[336719]: Closed D-Bus User Message Bus Socket.
Oct  2 09:08:51 np0005465988 systemd[336719]: Stopped Create User's Volatile Files and Directories.
Oct  2 09:08:51 np0005465988 systemd[336719]: Removed slice User Application Slice.
Oct  2 09:08:51 np0005465988 systemd[336719]: Reached target Shutdown.
Oct  2 09:08:51 np0005465988 systemd[336719]: Finished Exit the Session.
Oct  2 09:08:51 np0005465988 systemd[336719]: Reached target Exit the Session.
Oct  2 09:08:51 np0005465988 systemd[1]: user@42436.service: Deactivated successfully.
Oct  2 09:08:51 np0005465988 systemd[1]: Stopped User Manager for UID 42436.
Oct  2 09:08:51 np0005465988 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  2 09:08:51 np0005465988 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  2 09:08:51 np0005465988 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  2 09:08:51 np0005465988 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  2 09:08:51 np0005465988 systemd[1]: Removed slice User Slice of UID 42436.
Oct  2 09:08:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:51.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.511 2 DEBUG nova.network.neutron [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Successfully updated port: 7e799a10-8b7f-44c3-b57d-de4a1255aad4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.532 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Acquiring lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.532 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Acquired lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.533 2 DEBUG nova.network.neutron [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.631 2 DEBUG nova.compute.manager [req-fc1e7953-662e-43ad-b071-906e71a17be0 req-5e33c85d-143a-42c4-ba3b-3d8d1e1dfb90 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Received event network-changed-7e799a10-8b7f-44c3-b57d-de4a1255aad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.632 2 DEBUG nova.compute.manager [req-fc1e7953-662e-43ad-b071-906e71a17be0 req-5e33c85d-143a-42c4-ba3b-3d8d1e1dfb90 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Refreshing instance network info cache due to event network-changed-7e799a10-8b7f-44c3-b57d-de4a1255aad4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.632 2 DEBUG oslo_concurrency.lockutils [req-fc1e7953-662e-43ad-b071-906e71a17be0 req-5e33c85d-143a-42c4-ba3b-3d8d1e1dfb90 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.690 2 DEBUG nova.network.neutron [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 09:08:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:52.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.711 2 DEBUG nova.network.neutron [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Updating instance_info_cache with network_info: [{"id": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "address": "fa:16:3e:3d:d7:ed", "network": {"id": "41354ccc-5b80-451f-9510-2c3d0788ecf7", "bridge": "br-int", "label": "tempest-network-smoke--166998389", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06105eee-1c", "ovs_interfaceid": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.728 2 DEBUG oslo_concurrency.lockutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Releasing lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.733 2 DEBUG oslo_concurrency.lockutils [req-d994fe06-4ab6-4653-a41f-15929b58a8a3 req-f62c90e3-135b-49b5-a448-88e26aeb259d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.734 2 DEBUG nova.network.neutron [req-d994fe06-4ab6-4653-a41f-15929b58a8a3 req-f62c90e3-135b-49b5-a448-88e26aeb259d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Refreshing network info cache for port 06105eee-1ccc-4976-9ef2-84b4765d9a79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.824 2 DEBUG nova.virt.libvirt.driver [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.826 2 DEBUG nova.virt.libvirt.driver [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.827 2 INFO nova.virt.libvirt.driver [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Creating image(s)#033[00m
Oct  2 09:08:52 np0005465988 nova_compute[236126]: 2025-10-02 13:08:52.878 2 DEBUG nova.storage.rbd_utils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] creating snapshot(nova-resize) on rbd image(6df9bd3b-6218-4859-aba9-bfbedf2b8f18_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 09:08:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:53.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e407 e407: 3 total, 3 up, 3 in
Oct  2 09:08:53 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct  2 09:08:53 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:08:53 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:08:53 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct  2 09:08:53 np0005465988 nova_compute[236126]: 2025-10-02 13:08:53.910 2 DEBUG nova.objects.instance [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lazy-loading 'trusted_certs' on Instance uuid 6df9bd3b-6218-4859-aba9-bfbedf2b8f18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.057 2 DEBUG nova.virt.libvirt.driver [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.057 2 DEBUG nova.virt.libvirt.driver [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Ensure instance console log exists: /var/lib/nova/instances/6df9bd3b-6218-4859-aba9-bfbedf2b8f18/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.058 2 DEBUG oslo_concurrency.lockutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.059 2 DEBUG oslo_concurrency.lockutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.059 2 DEBUG oslo_concurrency.lockutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.063 2 DEBUG nova.virt.libvirt.driver [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Start _get_guest_xml network_info=[{"id": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "address": "fa:16:3e:3d:d7:ed", "network": {"id": "41354ccc-5b80-451f-9510-2c3d0788ecf7", "bridge": "br-int", "label": "tempest-network-smoke--166998389", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--166998389", "vif_mac": "fa:16:3e:3d:d7:ed"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06105eee-1c", "ovs_interfaceid": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.068 2 WARNING nova.virt.libvirt.driver [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.079 2 DEBUG nova.virt.libvirt.host [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.080 2 DEBUG nova.virt.libvirt.host [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.084 2 DEBUG nova.virt.libvirt.host [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.085 2 DEBUG nova.virt.libvirt.host [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.087 2 DEBUG nova.virt.libvirt.driver [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.087 2 DEBUG nova.virt.hardware [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.088 2 DEBUG nova.virt.hardware [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.088 2 DEBUG nova.virt.hardware [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.089 2 DEBUG nova.virt.hardware [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.089 2 DEBUG nova.virt.hardware [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.089 2 DEBUG nova.virt.hardware [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.090 2 DEBUG nova.virt.hardware [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.090 2 DEBUG nova.virt.hardware [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.090 2 DEBUG nova.virt.hardware [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.091 2 DEBUG nova.virt.hardware [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.091 2 DEBUG nova.virt.hardware [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.092 2 DEBUG nova.objects.instance [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lazy-loading 'vcpu_model' on Instance uuid 6df9bd3b-6218-4859-aba9-bfbedf2b8f18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.117 2 DEBUG oslo_concurrency.processutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:08:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:08:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2387503929' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:54.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.699 2 DEBUG oslo_concurrency.processutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.582s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:08:54 np0005465988 nova_compute[236126]: 2025-10-02 13:08:54.744 2 DEBUG oslo_concurrency.processutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:08:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct  2 09:08:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:08:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:08:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:08:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:08:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1810644779' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.263 2 DEBUG oslo_concurrency.processutils [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.265 2 DEBUG nova.virt.libvirt.vif [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:08:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-510358772',display_name='tempest-TestNetworkAdvancedServerOps-server-510358772',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-510358772',id=209,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNQLYFaK6VzMZ4VXSjIB28DDIVujtRqXaihQsQXdMB+5rY8DD1rQi9P2Y1PwrrLaViv1jTWp23s6ULfYTCXiXfqd1pOSru0GKVbLKUc8HJqBymXrreI8FngJNgN4inx/nA==',key_name='tempest-TestNetworkAdvancedServerOps-178932410',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:08:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='08e102ae48244af2ab448a2e1ff757df',ramdisk_id='',reservation_id='r-r5xxc3cq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1527846432',owner_user_name='tempest-TestNetworkAdvancedServerOps-1527846432-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:08:44Z,user_data=None,user_id='ffe4d737e4414fb3a3e358f8ca3f3e1e',uuid=6df9bd3b-6218-4859-aba9-bfbedf2b8f18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "address": "fa:16:3e:3d:d7:ed", "network": {"id": "41354ccc-5b80-451f-9510-2c3d0788ecf7", "bridge": "br-int", "label": "tempest-network-smoke--166998389", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--166998389", "vif_mac": "fa:16:3e:3d:d7:ed"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06105eee-1c", "ovs_interfaceid": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.266 2 DEBUG nova.network.os_vif_util [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Converting VIF {"id": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "address": "fa:16:3e:3d:d7:ed", "network": {"id": "41354ccc-5b80-451f-9510-2c3d0788ecf7", "bridge": "br-int", "label": "tempest-network-smoke--166998389", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--166998389", "vif_mac": "fa:16:3e:3d:d7:ed"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06105eee-1c", "ovs_interfaceid": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.266 2 DEBUG nova.network.os_vif_util [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=06105eee-1ccc-4976-9ef2-84b4765d9a79,network=Network(41354ccc-5b80-451f-9510-2c3d0788ecf7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06105eee-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.269 2 DEBUG nova.virt.libvirt.driver [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] End _get_guest_xml xml=<domain type="kvm">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  <uuid>6df9bd3b-6218-4859-aba9-bfbedf2b8f18</uuid>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  <name>instance-000000d1</name>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-510358772</nova:name>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 13:08:54</nova:creationTime>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <nova:user uuid="ffe4d737e4414fb3a3e358f8ca3f3e1e">tempest-TestNetworkAdvancedServerOps-1527846432-project-member</nova:user>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <nova:project uuid="08e102ae48244af2ab448a2e1ff757df">tempest-TestNetworkAdvancedServerOps-1527846432</nova:project>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <nova:port uuid="06105eee-1ccc-4976-9ef2-84b4765d9a79">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <system>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <entry name="serial">6df9bd3b-6218-4859-aba9-bfbedf2b8f18</entry>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <entry name="uuid">6df9bd3b-6218-4859-aba9-bfbedf2b8f18</entry>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    </system>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  <os>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  </os>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  <features>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  </features>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  </clock>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  <devices>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/6df9bd3b-6218-4859-aba9-bfbedf2b8f18_disk">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/6df9bd3b-6218-4859-aba9-bfbedf2b8f18_disk.config">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:3d:d7:ed"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <target dev="tap06105eee-1c"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    </interface>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/6df9bd3b-6218-4859-aba9-bfbedf2b8f18/console.log" append="off"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    </serial>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <video>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    </video>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    </rng>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 09:08:55 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 09:08:55 np0005465988 nova_compute[236126]:  </devices>
Oct  2 09:08:55 np0005465988 nova_compute[236126]: </domain>
Oct  2 09:08:55 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.271 2 DEBUG nova.virt.libvirt.vif [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:08:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-510358772',display_name='tempest-TestNetworkAdvancedServerOps-server-510358772',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-510358772',id=209,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNQLYFaK6VzMZ4VXSjIB28DDIVujtRqXaihQsQXdMB+5rY8DD1rQi9P2Y1PwrrLaViv1jTWp23s6ULfYTCXiXfqd1pOSru0GKVbLKUc8HJqBymXrreI8FngJNgN4inx/nA==',key_name='tempest-TestNetworkAdvancedServerOps-178932410',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:08:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='08e102ae48244af2ab448a2e1ff757df',ramdisk_id='',reservation_id='r-r5xxc3cq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1527846432',owner_user_name='tempest-TestNetworkAdvancedServerOps-1527846432-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:08:44Z,user_data=None,user_id='ffe4d737e4414fb3a3e358f8ca3f3e1e',uuid=6df9bd3b-6218-4859-aba9-bfbedf2b8f18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "address": "fa:16:3e:3d:d7:ed", "network": {"id": "41354ccc-5b80-451f-9510-2c3d0788ecf7", "bridge": "br-int", "label": "tempest-network-smoke--166998389", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--166998389", "vif_mac": "fa:16:3e:3d:d7:ed"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06105eee-1c", "ovs_interfaceid": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.271 2 DEBUG nova.network.os_vif_util [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Converting VIF {"id": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "address": "fa:16:3e:3d:d7:ed", "network": {"id": "41354ccc-5b80-451f-9510-2c3d0788ecf7", "bridge": "br-int", "label": "tempest-network-smoke--166998389", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--166998389", "vif_mac": "fa:16:3e:3d:d7:ed"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06105eee-1c", "ovs_interfaceid": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.272 2 DEBUG nova.network.os_vif_util [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=06105eee-1ccc-4976-9ef2-84b4765d9a79,network=Network(41354ccc-5b80-451f-9510-2c3d0788ecf7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06105eee-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.272 2 DEBUG os_vif [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=06105eee-1ccc-4976-9ef2-84b4765d9a79,network=Network(41354ccc-5b80-451f-9510-2c3d0788ecf7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06105eee-1c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.273 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.274 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.277 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06105eee-1c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.277 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06105eee-1c, col_values=(('external_ids', {'iface-id': '06105eee-1ccc-4976-9ef2-84b4765d9a79', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3d:d7:ed', 'vm-uuid': '6df9bd3b-6218-4859-aba9-bfbedf2b8f18'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:55 np0005465988 NetworkManager[45041]: <info>  [1759410535.2808] manager: (tap06105eee-1c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/414)
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.289 2 INFO os_vif [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=06105eee-1ccc-4976-9ef2-84b4765d9a79,network=Network(41354ccc-5b80-451f-9510-2c3d0788ecf7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06105eee-1c')#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.342 2 DEBUG nova.virt.libvirt.driver [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.343 2 DEBUG nova.virt.libvirt.driver [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.343 2 DEBUG nova.virt.libvirt.driver [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] No VIF found with MAC fa:16:3e:3d:d7:ed, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.343 2 INFO nova.virt.libvirt.driver [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Using config drive#033[00m
Oct  2 09:08:55 np0005465988 NetworkManager[45041]: <info>  [1759410535.4604] manager: (tap06105eee-1c): new Tun device (/org/freedesktop/NetworkManager/Devices/415)
Oct  2 09:08:55 np0005465988 kernel: tap06105eee-1c: entered promiscuous mode
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:55 np0005465988 ovn_controller[132601]: 2025-10-02T13:08:55Z|00946|binding|INFO|Claiming lport 06105eee-1ccc-4976-9ef2-84b4765d9a79 for this chassis.
Oct  2 09:08:55 np0005465988 ovn_controller[132601]: 2025-10-02T13:08:55Z|00947|binding|INFO|06105eee-1ccc-4976-9ef2-84b4765d9a79: Claiming fa:16:3e:3d:d7:ed 10.100.0.11
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:55 np0005465988 NetworkManager[45041]: <info>  [1759410535.4819] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/416)
Oct  2 09:08:55 np0005465988 NetworkManager[45041]: <info>  [1759410535.4828] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/417)
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.487 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:d7:ed 10.100.0.11'], port_security=['fa:16:3e:3d:d7:ed 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '6df9bd3b-6218-4859-aba9-bfbedf2b8f18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41354ccc-5b80-451f-9510-2c3d0788ecf7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08e102ae48244af2ab448a2e1ff757df', 'neutron:revision_number': '6', 'neutron:security_group_ids': '337a5b6a-7697-4b02-8d14-65af2374695f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f431a9f-5f6a-4914-ae7c-c00e97c25630, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=06105eee-1ccc-4976-9ef2-84b4765d9a79) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.488 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 06105eee-1ccc-4976-9ef2-84b4765d9a79 in datapath 41354ccc-5b80-451f-9510-2c3d0788ecf7 bound to our chassis#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.491 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41354ccc-5b80-451f-9510-2c3d0788ecf7#033[00m
Oct  2 09:08:55 np0005465988 systemd-udevd[337479]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.506 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c184ef74-cc13-4b65-8456-a51bb2074150]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.508 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41354ccc-51 in ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 09:08:55 np0005465988 systemd-machined[192594]: New machine qemu-99-instance-000000d1.
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.513 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41354ccc-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.513 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3915245a-da49-4ab6-8416-67056bdcbd5d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 NetworkManager[45041]: <info>  [1759410535.5143] device (tap06105eee-1c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:08:55 np0005465988 NetworkManager[45041]: <info>  [1759410535.5150] device (tap06105eee-1c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.515 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[30c016cf-6d53-4838-bd80-0c93590db8f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.529 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[7f9e40bc-32cb-4b82-b05e-9492e9097bbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 systemd[1]: Started Virtual Machine qemu-99-instance-000000d1.
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.556 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bd52efaf-7150-4398-b2a4-2263283e1835]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.591 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[dbf98a0d-50ad-4671-af5f-8ebb1dbabe0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.593 2 DEBUG nova.network.neutron [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Updating instance_info_cache with network_info: [{"id": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "address": "fa:16:3e:8e:4f:59", "network": {"id": "14e748fc-2d6a-4d69-b120-68c995d49660", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1401412803-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "60bfd415ee154615b20dd99528061614", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e799a10-8b", "ovs_interfaceid": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:08:55 np0005465988 systemd-udevd[337483]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:08:55 np0005465988 NetworkManager[45041]: <info>  [1759410535.6027] manager: (tap41354ccc-50): new Veth device (/org/freedesktop/NetworkManager/Devices/418)
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.601 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[125a6846-4dda-45d7-88c8-e08ee4a5a533]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.624 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Releasing lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.626 2 DEBUG nova.compute.manager [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Instance network_info: |[{"id": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "address": "fa:16:3e:8e:4f:59", "network": {"id": "14e748fc-2d6a-4d69-b120-68c995d49660", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1401412803-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "60bfd415ee154615b20dd99528061614", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e799a10-8b", "ovs_interfaceid": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.627 2 DEBUG oslo_concurrency.lockutils [req-fc1e7953-662e-43ad-b071-906e71a17be0 req-5e33c85d-143a-42c4-ba3b-3d8d1e1dfb90 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.627 2 DEBUG nova.network.neutron [req-fc1e7953-662e-43ad-b071-906e71a17be0 req-5e33c85d-143a-42c4-ba3b-3d8d1e1dfb90 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Refreshing network info cache for port 7e799a10-8b7f-44c3-b57d-de4a1255aad4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.631 2 DEBUG nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Start _get_guest_xml network_info=[{"id": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "address": "fa:16:3e:8e:4f:59", "network": {"id": "14e748fc-2d6a-4d69-b120-68c995d49660", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1401412803-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "60bfd415ee154615b20dd99528061614", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e799a10-8b", "ovs_interfaceid": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='',container_format='bare',created_at=2025-10-02T13:08:33Z,direct_url=<?>,disk_format='raw',id=a1241e22-2f30-42e3-8072-a21ad0ab0f69,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-685976860',owner='60bfd415ee154615b20dd99528061614',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-10-02T13:08:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'a1241e22-2f30-42e3-8072-a21ad0ab0f69'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.639 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9bcd734b-cb42-4186-ab79-e7d8ef64ebe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.640 2 WARNING nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.642 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8d70d650-ccb5-47a0-8442-b1fa27544505]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.647 2 DEBUG nova.virt.libvirt.host [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.648 2 DEBUG nova.virt.libvirt.host [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.653 2 DEBUG nova.virt.libvirt.host [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.653 2 DEBUG nova.virt.libvirt.host [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.654 2 DEBUG nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.655 2 DEBUG nova.virt.hardware [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2025-10-02T13:08:33Z,direct_url=<?>,disk_format='raw',id=a1241e22-2f30-42e3-8072-a21ad0ab0f69,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-685976860',owner='60bfd415ee154615b20dd99528061614',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-10-02T13:08:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.655 2 DEBUG nova.virt.hardware [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.655 2 DEBUG nova.virt.hardware [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.656 2 DEBUG nova.virt.hardware [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.656 2 DEBUG nova.virt.hardware [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.656 2 DEBUG nova.virt.hardware [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.657 2 DEBUG nova.virt.hardware [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.657 2 DEBUG nova.virt.hardware [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.658 2 DEBUG nova.virt.hardware [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.658 2 DEBUG nova.virt.hardware [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.658 2 DEBUG nova.virt.hardware [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.662 2 DEBUG oslo_concurrency.processutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:08:55 np0005465988 NetworkManager[45041]: <info>  [1759410535.6730] device (tap41354ccc-50): carrier: link connected
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.678 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8d362d26-7351-42af-ab6f-3d8802ae192b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 ovn_controller[132601]: 2025-10-02T13:08:55Z|00948|binding|INFO|Setting lport 06105eee-1ccc-4976-9ef2-84b4765d9a79 ovn-installed in OVS
Oct  2 09:08:55 np0005465988 ovn_controller[132601]: 2025-10-02T13:08:55Z|00949|binding|INFO|Setting lport 06105eee-1ccc-4976-9ef2-84b4765d9a79 up in Southbound
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.703 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[17f51594-9394-4aa3-9804-98ac7ac73991]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41354ccc-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:ea:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 277], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 859600, 'reachable_time': 43860, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 337514, 'error': None, 'target': 'ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.721 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b3cc49ce-fbcc-4e74-8605-10bbdcae62ee]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:eabc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 859600, 'tstamp': 859600}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 337515, 'error': None, 'target': 'ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.748 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba62537-f25b-4e21-b05c-311bdb648eb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41354ccc-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:ea:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 277], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 859600, 'reachable_time': 43860, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 337516, 'error': None, 'target': 'ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.788 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d3a141eb-6cda-4909-a976-33904c49ec23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:08:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:55.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.861 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[45decebc-331a-4820-8fe5-417caf6488ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.864 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41354ccc-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.865 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.866 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41354ccc-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:55 np0005465988 kernel: tap41354ccc-50: entered promiscuous mode
Oct  2 09:08:55 np0005465988 NetworkManager[45041]: <info>  [1759410535.8686] manager: (tap41354ccc-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/419)
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.870 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41354ccc-50, col_values=(('external_ids', {'iface-id': '7cea9858-fead-45b4-8830-8edfb5209d69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:08:55 np0005465988 ovn_controller[132601]: 2025-10-02T13:08:55Z|00950|binding|INFO|Releasing lport 7cea9858-fead-45b4-8830-8edfb5209d69 from this chassis (sb_readonly=0)
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:55 np0005465988 nova_compute[236126]: 2025-10-02 13:08:55.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.886 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41354ccc-5b80-451f-9510-2c3d0788ecf7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41354ccc-5b80-451f-9510-2c3d0788ecf7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.887 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[de786b84-4086-4e30-a188-065a46441f9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.889 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-41354ccc-5b80-451f-9510-2c3d0788ecf7
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/41354ccc-5b80-451f-9510-2c3d0788ecf7.pid.haproxy
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 41354ccc-5b80-451f-9510-2c3d0788ecf7
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 09:08:55 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:55.892 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7', 'env', 'PROCESS_TAG=haproxy-41354ccc-5b80-451f-9510-2c3d0788ecf7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41354ccc-5b80-451f-9510-2c3d0788ecf7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:08:56 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/564078248' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:08:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.136 2 DEBUG oslo_concurrency.processutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.186 2 DEBUG nova.storage.rbd_utils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] rbd image 0f002be2-0f9d-4b3b-a8b2-552c569f0d28_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.194 2 DEBUG oslo_concurrency.processutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:08:56 np0005465988 podman[337630]: 2025-10-02 13:08:56.283314173 +0000 UTC m=+0.053061219 container create 6018a6c75534f1d12a23f4d63ede36b8daee75d759b47a1ef8e0f278954ba7f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:08:56 np0005465988 podman[337630]: 2025-10-02 13:08:56.252825915 +0000 UTC m=+0.022572981 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 09:08:56 np0005465988 systemd[1]: Started libpod-conmon-6018a6c75534f1d12a23f4d63ede36b8daee75d759b47a1ef8e0f278954ba7f7.scope.
Oct  2 09:08:56 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:08:56 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce3998c11fca383a06e83d5f011269f33480511d288b73b17a71fce55d1e6198/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 09:08:56 np0005465988 podman[337630]: 2025-10-02 13:08:56.415707766 +0000 UTC m=+0.185454832 container init 6018a6c75534f1d12a23f4d63ede36b8daee75d759b47a1ef8e0f278954ba7f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:08:56 np0005465988 podman[337630]: 2025-10-02 13:08:56.421827883 +0000 UTC m=+0.191574939 container start 6018a6c75534f1d12a23f4d63ede36b8daee75d759b47a1ef8e0f278954ba7f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 09:08:56 np0005465988 neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7[337664]: [NOTICE]   (337668) : New worker (337670) forked
Oct  2 09:08:56 np0005465988 neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7[337664]: [NOTICE]   (337668) : Loading success.
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.527 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410536.5268478, 6df9bd3b-6218-4859-aba9-bfbedf2b8f18 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.528 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] VM Resumed (Lifecycle Event)#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.531 2 DEBUG nova.compute.manager [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.538 2 INFO nova.virt.libvirt.driver [-] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Instance running successfully.#033[00m
Oct  2 09:08:56 np0005465988 virtqemud[235689]: argument unsupported: QEMU guest agent is not configured
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.541 2 DEBUG nova.virt.libvirt.guest [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.542 2 DEBUG nova.virt.libvirt.driver [None req-082413b9-8d2a-4e7e-864e-9b289da5daf9 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.581 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.585 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.653 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.656 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410536.5271127, 6df9bd3b-6218-4859-aba9-bfbedf2b8f18 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.656 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] VM Started (Lifecycle Event)#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.697 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:08:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:08:56 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2664833074' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:08:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:56.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.704 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.719 2 DEBUG oslo_concurrency.processutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.720 2 DEBUG nova.virt.libvirt.vif [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:08:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1337888295',display_name='tempest-TestSnapshotPattern-server-1337888295',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1337888295',id=210,image_ref='a1241e22-2f30-42e3-8072-a21ad0ab0f69',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMla0NYshozjIhqPshVDB7V9QQUdVkTPcrZs6EBQpt1OKq+5dB8xVPlwumBL6FU6d1oBZUn9yPH7sBT9aKh0ThWjhX3hBNPhKbMCSDMfEKO24D2SpIWzAY4pSc/pr8RlQ==',key_name='tempest-TestSnapshotPattern-747682339',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='60bfd415ee154615b20dd99528061614',ramdisk_id='',reservation_id='r-nxgj21az',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='fc398644-66fc-44e3-9a6a-7389f5a542b8',image_min_disk='1',image_min_ram='0',image_owner_id='60bfd415ee154615b20dd99528061614',image_owner_project_name='tempest-TestSnapshotPattern-400150385',image_owner_user_name='tempest-TestSnapshotPattern-400150385-project-member',image_user_id='29c8a28c5bdd4feb9412127428bf0c3b',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-400150385',owner_user_name='tempest-TestSnapshotPattern-400150385-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:08:45Z,user_data=None,user_id='29c8a28c5bdd4feb9412127428bf0c3b',uuid=0f002be2-0f9
d-4b3b-a8b2-552c569f0d28,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "address": "fa:16:3e:8e:4f:59", "network": {"id": "14e748fc-2d6a-4d69-b120-68c995d49660", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1401412803-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "60bfd415ee154615b20dd99528061614", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e799a10-8b", "ovs_interfaceid": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.721 2 DEBUG nova.network.os_vif_util [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Converting VIF {"id": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "address": "fa:16:3e:8e:4f:59", "network": {"id": "14e748fc-2d6a-4d69-b120-68c995d49660", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1401412803-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "60bfd415ee154615b20dd99528061614", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e799a10-8b", "ovs_interfaceid": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.721 2 DEBUG nova.network.os_vif_util [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:4f:59,bridge_name='br-int',has_traffic_filtering=True,id=7e799a10-8b7f-44c3-b57d-de4a1255aad4,network=Network(14e748fc-2d6a-4d69-b120-68c995d49660),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e799a10-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.723 2 DEBUG nova.objects.instance [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0f002be2-0f9d-4b3b-a8b2-552c569f0d28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.794 2 DEBUG nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] End _get_guest_xml xml=<domain type="kvm">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  <uuid>0f002be2-0f9d-4b3b-a8b2-552c569f0d28</uuid>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  <name>instance-000000d2</name>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestSnapshotPattern-server-1337888295</nova:name>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 13:08:55</nova:creationTime>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <nova:user uuid="29c8a28c5bdd4feb9412127428bf0c3b">tempest-TestSnapshotPattern-400150385-project-member</nova:user>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <nova:project uuid="60bfd415ee154615b20dd99528061614">tempest-TestSnapshotPattern-400150385</nova:project>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="a1241e22-2f30-42e3-8072-a21ad0ab0f69"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <nova:port uuid="7e799a10-8b7f-44c3-b57d-de4a1255aad4">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <system>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <entry name="serial">0f002be2-0f9d-4b3b-a8b2-552c569f0d28</entry>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <entry name="uuid">0f002be2-0f9d-4b3b-a8b2-552c569f0d28</entry>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    </system>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  <os>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  </os>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  <features>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  </features>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  </clock>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  <devices>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/0f002be2-0f9d-4b3b-a8b2-552c569f0d28_disk">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/0f002be2-0f9d-4b3b-a8b2-552c569f0d28_disk.config">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:8e:4f:59"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <target dev="tap7e799a10-8b"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    </interface>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/0f002be2-0f9d-4b3b-a8b2-552c569f0d28/console.log" append="off"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    </serial>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <video>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    </video>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <input type="keyboard" bus="usb"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    </rng>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 09:08:56 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 09:08:56 np0005465988 nova_compute[236126]:  </devices>
Oct  2 09:08:56 np0005465988 nova_compute[236126]: </domain>
Oct  2 09:08:56 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.797 2 DEBUG nova.compute.manager [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Preparing to wait for external event network-vif-plugged-7e799a10-8b7f-44c3-b57d-de4a1255aad4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.797 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Acquiring lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.798 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.798 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.799 2 DEBUG nova.virt.libvirt.vif [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:08:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1337888295',display_name='tempest-TestSnapshotPattern-server-1337888295',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1337888295',id=210,image_ref='a1241e22-2f30-42e3-8072-a21ad0ab0f69',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMla0NYshozjIhqPshVDB7V9QQUdVkTPcrZs6EBQpt1OKq+5dB8xVPlwumBL6FU6d1oBZUn9yPH7sBT9aKh0ThWjhX3hBNPhKbMCSDMfEKO24D2SpIWzAY4pSc/pr8RlQ==',key_name='tempest-TestSnapshotPattern-747682339',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='60bfd415ee154615b20dd99528061614',ramdisk_id='',reservation_id='r-nxgj21az',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='fc398644-66fc-44e3-9a6a-7389f5a542b8',image_min_disk='1',image_min_ram='0',image_owner_id='60bfd415ee154615b20dd99528061614',image_owner_project_name='tempest-TestSnapshotPattern-400150385',image_owner_user_name='tempest-TestSnapshotPattern-400150385-project-member',image_user_id='29c8a28c5bdd4feb9412127428bf0c3b',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-400150385',owner_user_name='tempest-TestSnapshotPattern-400150385-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:08:45Z,user_data=None,user_id='29c8a28c5bdd4feb9412127428bf0c3b',uuid=0f
002be2-0f9d-4b3b-a8b2-552c569f0d28,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "address": "fa:16:3e:8e:4f:59", "network": {"id": "14e748fc-2d6a-4d69-b120-68c995d49660", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1401412803-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "60bfd415ee154615b20dd99528061614", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e799a10-8b", "ovs_interfaceid": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.799 2 DEBUG nova.network.os_vif_util [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Converting VIF {"id": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "address": "fa:16:3e:8e:4f:59", "network": {"id": "14e748fc-2d6a-4d69-b120-68c995d49660", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1401412803-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "60bfd415ee154615b20dd99528061614", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e799a10-8b", "ovs_interfaceid": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.800 2 DEBUG nova.network.os_vif_util [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:4f:59,bridge_name='br-int',has_traffic_filtering=True,id=7e799a10-8b7f-44c3-b57d-de4a1255aad4,network=Network(14e748fc-2d6a-4d69-b120-68c995d49660),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e799a10-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.801 2 DEBUG os_vif [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:4f:59,bridge_name='br-int',has_traffic_filtering=True,id=7e799a10-8b7f-44c3-b57d-de4a1255aad4,network=Network(14e748fc-2d6a-4d69-b120-68c995d49660),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e799a10-8b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.802 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.803 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.812 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e799a10-8b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.813 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7e799a10-8b, col_values=(('external_ids', {'iface-id': '7e799a10-8b7f-44c3-b57d-de4a1255aad4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8e:4f:59', 'vm-uuid': '0f002be2-0f9d-4b3b-a8b2-552c569f0d28'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:56 np0005465988 NetworkManager[45041]: <info>  [1759410536.8159] manager: (tap7e799a10-8b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/420)
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.823 2 INFO os_vif [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:4f:59,bridge_name='br-int',has_traffic_filtering=True,id=7e799a10-8b7f-44c3-b57d-de4a1255aad4,network=Network(14e748fc-2d6a-4d69-b120-68c995d49660),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e799a10-8b')#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.877 2 DEBUG nova.network.neutron [req-d994fe06-4ab6-4653-a41f-15929b58a8a3 req-f62c90e3-135b-49b5-a448-88e26aeb259d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Updated VIF entry in instance network info cache for port 06105eee-1ccc-4976-9ef2-84b4765d9a79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.877 2 DEBUG nova.network.neutron [req-d994fe06-4ab6-4653-a41f-15929b58a8a3 req-f62c90e3-135b-49b5-a448-88e26aeb259d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Updating instance_info_cache with network_info: [{"id": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "address": "fa:16:3e:3d:d7:ed", "network": {"id": "41354ccc-5b80-451f-9510-2c3d0788ecf7", "bridge": "br-int", "label": "tempest-network-smoke--166998389", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06105eee-1c", "ovs_interfaceid": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.917 2 DEBUG oslo_concurrency.lockutils [req-d994fe06-4ab6-4653-a41f-15929b58a8a3 req-f62c90e3-135b-49b5-a448-88e26aeb259d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.939 2 DEBUG nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.940 2 DEBUG nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.940 2 DEBUG nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] No VIF found with MAC fa:16:3e:8e:4f:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.941 2 INFO nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Using config drive#033[00m
Oct  2 09:08:56 np0005465988 nova_compute[236126]: 2025-10-02 13:08:56.973 2 DEBUG nova.storage.rbd_utils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] rbd image 0f002be2-0f9d-4b3b-a8b2-552c569f0d28_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:08:57 np0005465988 nova_compute[236126]: 2025-10-02 13:08:57.034 2 DEBUG nova.compute.manager [req-d1cd4141-8e56-481a-89d6-81035ebf91f4 req-89121e1f-b638-4d4f-b4c0-10c242ff434c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received event network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:08:57 np0005465988 nova_compute[236126]: 2025-10-02 13:08:57.035 2 DEBUG oslo_concurrency.lockutils [req-d1cd4141-8e56-481a-89d6-81035ebf91f4 req-89121e1f-b638-4d4f-b4c0-10c242ff434c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:57 np0005465988 nova_compute[236126]: 2025-10-02 13:08:57.035 2 DEBUG oslo_concurrency.lockutils [req-d1cd4141-8e56-481a-89d6-81035ebf91f4 req-89121e1f-b638-4d4f-b4c0-10c242ff434c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:57 np0005465988 nova_compute[236126]: 2025-10-02 13:08:57.036 2 DEBUG oslo_concurrency.lockutils [req-d1cd4141-8e56-481a-89d6-81035ebf91f4 req-89121e1f-b638-4d4f-b4c0-10c242ff434c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:08:57 np0005465988 nova_compute[236126]: 2025-10-02 13:08:57.036 2 DEBUG nova.compute.manager [req-d1cd4141-8e56-481a-89d6-81035ebf91f4 req-89121e1f-b638-4d4f-b4c0-10c242ff434c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] No waiting events found dispatching network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:08:57 np0005465988 nova_compute[236126]: 2025-10-02 13:08:57.036 2 WARNING nova.compute.manager [req-d1cd4141-8e56-481a-89d6-81035ebf91f4 req-89121e1f-b638-4d4f-b4c0-10c242ff434c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received unexpected event network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 for instance with vm_state resized and task_state None.#033[00m
Oct  2 09:08:57 np0005465988 nova_compute[236126]: 2025-10-02 13:08:57.037 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:08:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:57.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:58 np0005465988 nova_compute[236126]: 2025-10-02 13:08:58.106 2 INFO nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Creating config drive at /var/lib/nova/instances/0f002be2-0f9d-4b3b-a8b2-552c569f0d28/disk.config#033[00m
Oct  2 09:08:58 np0005465988 nova_compute[236126]: 2025-10-02 13:08:58.112 2 DEBUG oslo_concurrency.processutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0f002be2-0f9d-4b3b-a8b2-552c569f0d28/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptmk4b_n8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:08:58 np0005465988 nova_compute[236126]: 2025-10-02 13:08:58.272 2 DEBUG oslo_concurrency.processutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0f002be2-0f9d-4b3b-a8b2-552c569f0d28/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptmk4b_n8" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:08:58 np0005465988 nova_compute[236126]: 2025-10-02 13:08:58.308 2 DEBUG nova.storage.rbd_utils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] rbd image 0f002be2-0f9d-4b3b-a8b2-552c569f0d28_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:08:58 np0005465988 nova_compute[236126]: 2025-10-02 13:08:58.312 2 DEBUG oslo_concurrency.processutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0f002be2-0f9d-4b3b-a8b2-552c569f0d28/disk.config 0f002be2-0f9d-4b3b-a8b2-552c569f0d28_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:08:58 np0005465988 nova_compute[236126]: 2025-10-02 13:08:58.513 2 DEBUG oslo_concurrency.processutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0f002be2-0f9d-4b3b-a8b2-552c569f0d28/disk.config 0f002be2-0f9d-4b3b-a8b2-552c569f0d28_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.200s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:08:58 np0005465988 nova_compute[236126]: 2025-10-02 13:08:58.514 2 INFO nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Deleting local config drive /var/lib/nova/instances/0f002be2-0f9d-4b3b-a8b2-552c569f0d28/disk.config because it was imported into RBD.#033[00m
Oct  2 09:08:58 np0005465988 kernel: tap7e799a10-8b: entered promiscuous mode
Oct  2 09:08:58 np0005465988 NetworkManager[45041]: <info>  [1759410538.5958] manager: (tap7e799a10-8b): new Tun device (/org/freedesktop/NetworkManager/Devices/421)
Oct  2 09:08:58 np0005465988 systemd-udevd[337752]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:08:58 np0005465988 ovn_controller[132601]: 2025-10-02T13:08:58Z|00951|binding|INFO|Claiming lport 7e799a10-8b7f-44c3-b57d-de4a1255aad4 for this chassis.
Oct  2 09:08:58 np0005465988 ovn_controller[132601]: 2025-10-02T13:08:58Z|00952|binding|INFO|7e799a10-8b7f-44c3-b57d-de4a1255aad4: Claiming fa:16:3e:8e:4f:59 10.100.0.12
Oct  2 09:08:58 np0005465988 nova_compute[236126]: 2025-10-02 13:08:58.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:58 np0005465988 NetworkManager[45041]: <info>  [1759410538.6637] device (tap7e799a10-8b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:08:58 np0005465988 NetworkManager[45041]: <info>  [1759410538.6645] device (tap7e799a10-8b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:08:58 np0005465988 nova_compute[236126]: 2025-10-02 13:08:58.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:58 np0005465988 ovn_controller[132601]: 2025-10-02T13:08:58Z|00953|binding|INFO|Setting lport 7e799a10-8b7f-44c3-b57d-de4a1255aad4 ovn-installed in OVS
Oct  2 09:08:58 np0005465988 nova_compute[236126]: 2025-10-02 13:08:58.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:58 np0005465988 ovn_controller[132601]: 2025-10-02T13:08:58Z|00954|binding|INFO|Setting lport 7e799a10-8b7f-44c3-b57d-de4a1255aad4 up in Southbound
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.676 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:4f:59 10.100.0.12'], port_security=['fa:16:3e:8e:4f:59 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0f002be2-0f9d-4b3b-a8b2-552c569f0d28', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14e748fc-2d6a-4d69-b120-68c995d49660', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '60bfd415ee154615b20dd99528061614', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6863a9d1-67a6-432a-b497-906487ecb0c1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93cd313a-87a8-4538-814c-550ca73d3eca, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7e799a10-8b7f-44c3-b57d-de4a1255aad4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.678 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7e799a10-8b7f-44c3-b57d-de4a1255aad4 in datapath 14e748fc-2d6a-4d69-b120-68c995d49660 bound to our chassis#033[00m
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.680 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 14e748fc-2d6a-4d69-b120-68c995d49660#033[00m
Oct  2 09:08:58 np0005465988 systemd-machined[192594]: New machine qemu-100-instance-000000d2.
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.696 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9609bc51-ae15-420c-b22d-a3aa875f6838]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.697 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap14e748fc-21 in ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.700 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap14e748fc-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 09:08:58 np0005465988 systemd[1]: Started Virtual Machine qemu-100-instance-000000d2.
Oct  2 09:08:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.700 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d0efa88b-52d5-4359-9fc6-a706eb14d7ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:08:58.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.702 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[46c6407d-561c-4d9a-aaf1-b69966d7ab4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.718 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[4badaa62-e877-42c1-9098-7b7e999caa3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.748 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5615c090-77f6-4593-bd86-2bfe5ab1f1ec]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.777 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[f65ad25e-35a8-45b3-80be-b252e4858dfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.783 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc3e7c9-57fb-4060-a9a9-1169bf42a580]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:58 np0005465988 NetworkManager[45041]: <info>  [1759410538.7843] manager: (tap14e748fc-20): new Veth device (/org/freedesktop/NetworkManager/Devices/422)
Oct  2 09:08:58 np0005465988 systemd-udevd[337757]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.820 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[5629df01-2ae6-49b7-a678-c2fe5cd96bee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.824 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[0271f7ac-e59f-4bff-974a-57e068f2992f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:58 np0005465988 NetworkManager[45041]: <info>  [1759410538.8556] device (tap14e748fc-20): carrier: link connected
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.882 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c949e5fb-28f1-40b7-b4ec-9e1e159c6a6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.901 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a37e5711-f589-43e3-91c1-91dea713387c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap14e748fc-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:1c:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 279], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 859918, 'reachable_time': 40986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 337788, 'error': None, 'target': 'ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.919 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b89b6117-47e9-4255-85a0-6ea9417330d7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe81:1c79'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 859918, 'tstamp': 859918}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 337789, 'error': None, 'target': 'ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.938 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4d7e1d37-8a1f-4358-9e5a-9e814c54179f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap14e748fc-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:1c:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 279], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 859918, 'reachable_time': 40986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 337790, 'error': None, 'target': 'ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:58.981 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f05f7931-8ea6-4f0d-a55e-fc29b00328c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:59.059 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0026fdc2-9860-4164-85ea-5f548e915023]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:59.061 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14e748fc-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:59.062 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:59.062 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap14e748fc-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:59 np0005465988 NetworkManager[45041]: <info>  [1759410539.0660] manager: (tap14e748fc-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/423)
Oct  2 09:08:59 np0005465988 kernel: tap14e748fc-20: entered promiscuous mode
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:59.069 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap14e748fc-20, col_values=(('external_ids', {'iface-id': '9fb48fb0-3960-4f8b-94a5-d7518fe00fe8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:59 np0005465988 ovn_controller[132601]: 2025-10-02T13:08:59Z|00955|binding|INFO|Releasing lport 9fb48fb0-3960-4f8b-94a5-d7518fe00fe8 from this chassis (sb_readonly=0)
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:59.086 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/14e748fc-2d6a-4d69-b120-68c995d49660.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/14e748fc-2d6a-4d69-b120-68c995d49660.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:59.087 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d636954b-456b-48a8-af61-94e14eca8aa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:59.089 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-14e748fc-2d6a-4d69-b120-68c995d49660
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/14e748fc-2d6a-4d69-b120-68c995d49660.pid.haproxy
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 14e748fc-2d6a-4d69-b120-68c995d49660
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 09:08:59 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:08:59.089 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660', 'env', 'PROCESS_TAG=haproxy-14e748fc-2d6a-4d69-b120-68c995d49660', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/14e748fc-2d6a-4d69-b120-68c995d49660.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.126 2 DEBUG nova.compute.manager [req-0b450f96-0509-45b5-a86d-bbc44a4d153e req-87b6b1cb-f54c-4b9e-be35-0f62d23a146a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Received event network-vif-plugged-7e799a10-8b7f-44c3-b57d-de4a1255aad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.127 2 DEBUG oslo_concurrency.lockutils [req-0b450f96-0509-45b5-a86d-bbc44a4d153e req-87b6b1cb-f54c-4b9e-be35-0f62d23a146a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.127 2 DEBUG oslo_concurrency.lockutils [req-0b450f96-0509-45b5-a86d-bbc44a4d153e req-87b6b1cb-f54c-4b9e-be35-0f62d23a146a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.128 2 DEBUG oslo_concurrency.lockutils [req-0b450f96-0509-45b5-a86d-bbc44a4d153e req-87b6b1cb-f54c-4b9e-be35-0f62d23a146a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.128 2 DEBUG nova.compute.manager [req-0b450f96-0509-45b5-a86d-bbc44a4d153e req-87b6b1cb-f54c-4b9e-be35-0f62d23a146a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Processing event network-vif-plugged-7e799a10-8b7f-44c3-b57d-de4a1255aad4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.203 2 DEBUG nova.compute.manager [req-c81f3f62-3df1-47a0-afb2-923137573bce req-898bf2ab-4b77-4352-ac91-89f5f9d82bc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received event network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.204 2 DEBUG oslo_concurrency.lockutils [req-c81f3f62-3df1-47a0-afb2-923137573bce req-898bf2ab-4b77-4352-ac91-89f5f9d82bc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.205 2 DEBUG oslo_concurrency.lockutils [req-c81f3f62-3df1-47a0-afb2-923137573bce req-898bf2ab-4b77-4352-ac91-89f5f9d82bc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.205 2 DEBUG oslo_concurrency.lockutils [req-c81f3f62-3df1-47a0-afb2-923137573bce req-898bf2ab-4b77-4352-ac91-89f5f9d82bc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.206 2 DEBUG nova.compute.manager [req-c81f3f62-3df1-47a0-afb2-923137573bce req-898bf2ab-4b77-4352-ac91-89f5f9d82bc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] No waiting events found dispatching network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.207 2 WARNING nova.compute.manager [req-c81f3f62-3df1-47a0-afb2-923137573bce req-898bf2ab-4b77-4352-ac91-89f5f9d82bc1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received unexpected event network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 for instance with vm_state resized and task_state resize_reverting.#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.209 2 DEBUG nova.network.neutron [req-fc1e7953-662e-43ad-b071-906e71a17be0 req-5e33c85d-143a-42c4-ba3b-3d8d1e1dfb90 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Updated VIF entry in instance network info cache for port 7e799a10-8b7f-44c3-b57d-de4a1255aad4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.210 2 DEBUG nova.network.neutron [req-fc1e7953-662e-43ad-b071-906e71a17be0 req-5e33c85d-143a-42c4-ba3b-3d8d1e1dfb90 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Updating instance_info_cache with network_info: [{"id": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "address": "fa:16:3e:8e:4f:59", "network": {"id": "14e748fc-2d6a-4d69-b120-68c995d49660", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1401412803-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "60bfd415ee154615b20dd99528061614", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e799a10-8b", "ovs_interfaceid": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:08:59 np0005465988 nova_compute[236126]: 2025-10-02 13:08:59.235 2 DEBUG oslo_concurrency.lockutils [req-fc1e7953-662e-43ad-b071-906e71a17be0 req-5e33c85d-143a-42c4-ba3b-3d8d1e1dfb90 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:08:59 np0005465988 podman[337841]: 2025-10-02 13:08:59.486960292 +0000 UTC m=+0.065617001 container create a8ca21dd87b77de70490bf3f8ea1bcb26046077d6bb388144fa5c47d33de864f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 09:08:59 np0005465988 systemd[1]: Started libpod-conmon-a8ca21dd87b77de70490bf3f8ea1bcb26046077d6bb388144fa5c47d33de864f.scope.
Oct  2 09:08:59 np0005465988 podman[337841]: 2025-10-02 13:08:59.455804675 +0000 UTC m=+0.034461414 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 09:08:59 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:08:59 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9538658bcb5de11794a01a1cc3297861503ba596716c611260930076d3889bc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 09:08:59 np0005465988 podman[337841]: 2025-10-02 13:08:59.605141986 +0000 UTC m=+0.183798715 container init a8ca21dd87b77de70490bf3f8ea1bcb26046077d6bb388144fa5c47d33de864f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 09:08:59 np0005465988 podman[337841]: 2025-10-02 13:08:59.61672283 +0000 UTC m=+0.195379549 container start a8ca21dd87b77de70490bf3f8ea1bcb26046077d6bb388144fa5c47d33de864f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 09:08:59 np0005465988 neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660[337878]: [NOTICE]   (337882) : New worker (337884) forked
Oct  2 09:08:59 np0005465988 neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660[337878]: [NOTICE]   (337882) : Loading success.
Oct  2 09:08:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:08:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:08:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:08:59.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.044 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410540.0436783, 0f002be2-0f9d-4b3b-a8b2-552c569f0d28 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.044 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] VM Started (Lifecycle Event)#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.048 2 DEBUG nova.compute.manager [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.055 2 DEBUG nova.virt.libvirt.driver [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.060 2 INFO nova.virt.libvirt.driver [-] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Instance spawned successfully.#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.061 2 INFO nova.compute.manager [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Took 14.59 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.061 2 DEBUG nova.compute.manager [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.128 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.132 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.157 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.158 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410540.0438716, 0f002be2-0f9d-4b3b-a8b2-552c569f0d28 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.158 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] VM Paused (Lifecycle Event)#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.160 2 DEBUG nova.network.neutron [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Port 06105eee-1ccc-4976-9ef2-84b4765d9a79 binding to destination host compute-2.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.161 2 DEBUG oslo_concurrency.lockutils [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.161 2 DEBUG oslo_concurrency.lockutils [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquired lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.161 2 DEBUG nova.network.neutron [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.171 2 INFO nova.compute.manager [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Took 15.91 seconds to build instance.#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.183 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.190 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410540.0523026, 0f002be2-0f9d-4b3b-a8b2-552c569f0d28 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.190 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] VM Resumed (Lifecycle Event)#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.192 2 DEBUG oslo_concurrency.lockutils [None req-d6b9025c-4532-4962-9224-d2dd58cef10d 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.999s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.207 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:09:00 np0005465988 nova_compute[236126]: 2025-10-02 13:09:00.211 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:09:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:00.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:01 np0005465988 nova_compute[236126]: 2025-10-02 13:09:01.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:09:01 np0005465988 nova_compute[236126]: 2025-10-02 13:09:01.238 2 DEBUG nova.compute.manager [req-d2c2520d-a078-4556-b0ad-7514a8afa764 req-53980f4b-b64d-43f4-b2ce-6307e56c2238 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Received event network-vif-plugged-7e799a10-8b7f-44c3-b57d-de4a1255aad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:09:01 np0005465988 nova_compute[236126]: 2025-10-02 13:09:01.240 2 DEBUG oslo_concurrency.lockutils [req-d2c2520d-a078-4556-b0ad-7514a8afa764 req-53980f4b-b64d-43f4-b2ce-6307e56c2238 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:01 np0005465988 nova_compute[236126]: 2025-10-02 13:09:01.240 2 DEBUG oslo_concurrency.lockutils [req-d2c2520d-a078-4556-b0ad-7514a8afa764 req-53980f4b-b64d-43f4-b2ce-6307e56c2238 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:01 np0005465988 nova_compute[236126]: 2025-10-02 13:09:01.241 2 DEBUG oslo_concurrency.lockutils [req-d2c2520d-a078-4556-b0ad-7514a8afa764 req-53980f4b-b64d-43f4-b2ce-6307e56c2238 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:01 np0005465988 nova_compute[236126]: 2025-10-02 13:09:01.241 2 DEBUG nova.compute.manager [req-d2c2520d-a078-4556-b0ad-7514a8afa764 req-53980f4b-b64d-43f4-b2ce-6307e56c2238 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] No waiting events found dispatching network-vif-plugged-7e799a10-8b7f-44c3-b57d-de4a1255aad4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:09:01 np0005465988 nova_compute[236126]: 2025-10-02 13:09:01.242 2 WARNING nova.compute.manager [req-d2c2520d-a078-4556-b0ad-7514a8afa764 req-53980f4b-b64d-43f4-b2ce-6307e56c2238 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Received unexpected event network-vif-plugged-7e799a10-8b7f-44c3-b57d-de4a1255aad4 for instance with vm_state active and task_state None.#033[00m
Oct  2 09:09:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:09:01 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:09:01 np0005465988 nova_compute[236126]: 2025-10-02 13:09:01.784 2 DEBUG nova.network.neutron [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Updating instance_info_cache with network_info: [{"id": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "address": "fa:16:3e:3d:d7:ed", "network": {"id": "41354ccc-5b80-451f-9510-2c3d0788ecf7", "bridge": "br-int", "label": "tempest-network-smoke--166998389", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06105eee-1c", "ovs_interfaceid": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:09:01 np0005465988 nova_compute[236126]: 2025-10-02 13:09:01.807 2 DEBUG oslo_concurrency.lockutils [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Releasing lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:09:01 np0005465988 nova_compute[236126]: 2025-10-02 13:09:01.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:09:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:01.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:09:02 np0005465988 kernel: tap06105eee-1c (unregistering): left promiscuous mode
Oct  2 09:09:02 np0005465988 NetworkManager[45041]: <info>  [1759410542.0434] device (tap06105eee-1c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:09:02 np0005465988 ovn_controller[132601]: 2025-10-02T13:09:02Z|00956|binding|INFO|Releasing lport 06105eee-1ccc-4976-9ef2-84b4765d9a79 from this chassis (sb_readonly=0)
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:02 np0005465988 ovn_controller[132601]: 2025-10-02T13:09:02Z|00957|binding|INFO|Setting lport 06105eee-1ccc-4976-9ef2-84b4765d9a79 down in Southbound
Oct  2 09:09:02 np0005465988 ovn_controller[132601]: 2025-10-02T13:09:02Z|00958|binding|INFO|Removing iface tap06105eee-1c ovn-installed in OVS
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:02.074 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:d7:ed 10.100.0.11'], port_security=['fa:16:3e:3d:d7:ed 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '6df9bd3b-6218-4859-aba9-bfbedf2b8f18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41354ccc-5b80-451f-9510-2c3d0788ecf7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08e102ae48244af2ab448a2e1ff757df', 'neutron:revision_number': '8', 'neutron:security_group_ids': '337a5b6a-7697-4b02-8d14-65af2374695f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f431a9f-5f6a-4914-ae7c-c00e97c25630, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=06105eee-1ccc-4976-9ef2-84b4765d9a79) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:02.075 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 06105eee-1ccc-4976-9ef2-84b4765d9a79 in datapath 41354ccc-5b80-451f-9510-2c3d0788ecf7 unbound from our chassis#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:02.079 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41354ccc-5b80-451f-9510-2c3d0788ecf7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 09:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:02.080 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[880f6045-2f35-42ee-bffe-26a425ef8f34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:02.081 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7 namespace which is not needed anymore#033[00m
Oct  2 09:09:02 np0005465988 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000d1.scope: Deactivated successfully.
Oct  2 09:09:02 np0005465988 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000d1.scope: Consumed 6.392s CPU time.
Oct  2 09:09:02 np0005465988 systemd-machined[192594]: Machine qemu-99-instance-000000d1 terminated.
Oct  2 09:09:02 np0005465988 neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7[337664]: [NOTICE]   (337668) : haproxy version is 2.8.14-c23fe91
Oct  2 09:09:02 np0005465988 neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7[337664]: [NOTICE]   (337668) : path to executable is /usr/sbin/haproxy
Oct  2 09:09:02 np0005465988 neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7[337664]: [WARNING]  (337668) : Exiting Master process...
Oct  2 09:09:02 np0005465988 neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7[337664]: [WARNING]  (337668) : Exiting Master process...
Oct  2 09:09:02 np0005465988 neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7[337664]: [ALERT]    (337668) : Current worker (337670) exited with code 143 (Terminated)
Oct  2 09:09:02 np0005465988 neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7[337664]: [WARNING]  (337668) : All workers exited. Exiting... (0)
Oct  2 09:09:02 np0005465988 systemd[1]: libpod-6018a6c75534f1d12a23f4d63ede36b8daee75d759b47a1ef8e0f278954ba7f7.scope: Deactivated successfully.
Oct  2 09:09:02 np0005465988 podman[337966]: 2025-10-02 13:09:02.243587156 +0000 UTC m=+0.059361660 container died 6018a6c75534f1d12a23f4d63ede36b8daee75d759b47a1ef8e0f278954ba7f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.261 2 INFO nova.virt.libvirt.driver [-] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Instance destroyed successfully.#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.264 2 DEBUG nova.objects.instance [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'resources' on Instance uuid 6df9bd3b-6218-4859-aba9-bfbedf2b8f18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.280 2 DEBUG nova.virt.libvirt.vif [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:08:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-510358772',display_name='tempest-TestNetworkAdvancedServerOps-server-510358772',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-510358772',id=209,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNQLYFaK6VzMZ4VXSjIB28DDIVujtRqXaihQsQXdMB+5rY8DD1rQi9P2Y1PwrrLaViv1jTWp23s6ULfYTCXiXfqd1pOSru0GKVbLKUc8HJqBymXrreI8FngJNgN4inx/nA==',key_name='tempest-TestNetworkAdvancedServerOps-178932410',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:08:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='08e102ae48244af2ab448a2e1ff757df',ramdisk_id='',reservation_id='r-r5xxc3cq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1527846432',owner_user_name='tempest-TestNetworkAdvancedServerOps-1527846432-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:08:56Z,user_data=None,user_id='ffe4d737e4414fb3a3e358f8ca3f3e1e',uuid=6df9bd3b-6218-4859-aba9-bfbedf2b8f18,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "address": "fa:16:3e:3d:d7:ed", "network": {"id": "41354ccc-5b80-451f-9510-2c3d0788ecf7", "bridge": "br-int", "label": "tempest-network-smoke--166998389", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06105eee-1c", "ovs_interfaceid": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.281 2 DEBUG nova.network.os_vif_util [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converting VIF {"id": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "address": "fa:16:3e:3d:d7:ed", "network": {"id": "41354ccc-5b80-451f-9510-2c3d0788ecf7", "bridge": "br-int", "label": "tempest-network-smoke--166998389", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06105eee-1c", "ovs_interfaceid": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.281 2 DEBUG nova.network.os_vif_util [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3d:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=06105eee-1ccc-4976-9ef2-84b4765d9a79,network=Network(41354ccc-5b80-451f-9510-2c3d0788ecf7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06105eee-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.282 2 DEBUG os_vif [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3d:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=06105eee-1ccc-4976-9ef2-84b4765d9a79,network=Network(41354ccc-5b80-451f-9510-2c3d0788ecf7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06105eee-1c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.284 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06105eee-1c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:09:02 np0005465988 systemd[1]: var-lib-containers-storage-overlay-ce3998c11fca383a06e83d5f011269f33480511d288b73b17a71fce55d1e6198-merged.mount: Deactivated successfully.
Oct  2 09:09:02 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6018a6c75534f1d12a23f4d63ede36b8daee75d759b47a1ef8e0f278954ba7f7-userdata-shm.mount: Deactivated successfully.
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.291 2 INFO os_vif [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3d:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=06105eee-1ccc-4976-9ef2-84b4765d9a79,network=Network(41354ccc-5b80-451f-9510-2c3d0788ecf7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06105eee-1c')#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.295 2 DEBUG oslo_concurrency.lockutils [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.295 2 DEBUG oslo_concurrency.lockutils [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:02 np0005465988 podman[337966]: 2025-10-02 13:09:02.304003766 +0000 UTC m=+0.119778190 container cleanup 6018a6c75534f1d12a23f4d63ede36b8daee75d759b47a1ef8e0f278954ba7f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 09:09:02 np0005465988 systemd[1]: libpod-conmon-6018a6c75534f1d12a23f4d63ede36b8daee75d759b47a1ef8e0f278954ba7f7.scope: Deactivated successfully.
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.315 2 DEBUG nova.objects.instance [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'migration_context' on Instance uuid 6df9bd3b-6218-4859-aba9-bfbedf2b8f18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.349 2 DEBUG nova.compute.manager [req-62354a59-0f21-4c13-93c7-c859853d0d98 req-4e5e5c00-c8c2-440c-972a-a16d77d45517 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received event network-vif-unplugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.350 2 DEBUG oslo_concurrency.lockutils [req-62354a59-0f21-4c13-93c7-c859853d0d98 req-4e5e5c00-c8c2-440c-972a-a16d77d45517 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.350 2 DEBUG oslo_concurrency.lockutils [req-62354a59-0f21-4c13-93c7-c859853d0d98 req-4e5e5c00-c8c2-440c-972a-a16d77d45517 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.350 2 DEBUG oslo_concurrency.lockutils [req-62354a59-0f21-4c13-93c7-c859853d0d98 req-4e5e5c00-c8c2-440c-972a-a16d77d45517 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.351 2 DEBUG nova.compute.manager [req-62354a59-0f21-4c13-93c7-c859853d0d98 req-4e5e5c00-c8c2-440c-972a-a16d77d45517 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] No waiting events found dispatching network-vif-unplugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.351 2 WARNING nova.compute.manager [req-62354a59-0f21-4c13-93c7-c859853d0d98 req-4e5e5c00-c8c2-440c-972a-a16d77d45517 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received unexpected event network-vif-unplugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 for instance with vm_state resized and task_state resize_reverting.#033[00m
Oct  2 09:09:02 np0005465988 podman[338007]: 2025-10-02 13:09:02.388722357 +0000 UTC m=+0.053309587 container remove 6018a6c75534f1d12a23f4d63ede36b8daee75d759b47a1ef8e0f278954ba7f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:02.396 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[330ac94c-c33e-4d38-9dec-0b37a07b181b]: (4, ('Thu Oct  2 01:09:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7 (6018a6c75534f1d12a23f4d63ede36b8daee75d759b47a1ef8e0f278954ba7f7)\n6018a6c75534f1d12a23f4d63ede36b8daee75d759b47a1ef8e0f278954ba7f7\nThu Oct  2 01:09:02 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7 (6018a6c75534f1d12a23f4d63ede36b8daee75d759b47a1ef8e0f278954ba7f7)\n6018a6c75534f1d12a23f4d63ede36b8daee75d759b47a1ef8e0f278954ba7f7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:02.398 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fa4e0d05-74cb-4907-9e3d-13db895a173d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:02.399 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41354ccc-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:09:02 np0005465988 kernel: tap41354ccc-50: left promiscuous mode
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.420 2 DEBUG oslo_concurrency.processutils [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:02.420 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e2b46497-36c8-4764-8212-4882247f496d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:02.456 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[577c7dd8-012b-4946-bf87-4cdb2b88564a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:02.457 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c37e9ee5-b09f-4b2e-b929-a52a73989cd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:02.478 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d688048e-be76-415a-9295-21f084196959]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 859591, 'reachable_time': 37379, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 338024, 'error': None, 'target': 'ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:02 np0005465988 systemd[1]: run-netns-ovnmeta\x2d41354ccc\x2d5b80\x2d451f\x2d9510\x2d2c3d0788ecf7.mount: Deactivated successfully.
Oct  2 09:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:02.485 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41354ccc-5b80-451f-9510-2c3d0788ecf7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 09:09:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:02.485 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[704196f9-5e46-4195-bfdd-5e0fb7dce509]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:02.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:09:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1154508469' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.889 2 DEBUG oslo_concurrency.processutils [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.897 2 DEBUG nova.compute.provider_tree [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:09:02 np0005465988 nova_compute[236126]: 2025-10-02 13:09:02.955 2 DEBUG nova.scheduler.client.report [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:09:03 np0005465988 nova_compute[236126]: 2025-10-02 13:09:03.019 2 DEBUG oslo_concurrency.lockutils [None req-769dc677-3637-4901-b833-47a3b06deff3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.724s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:03.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:04.078 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=80, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=79) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:09:04 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:04.078 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:09:04 np0005465988 nova_compute[236126]: 2025-10-02 13:09:04.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:04 np0005465988 nova_compute[236126]: 2025-10-02 13:09:04.420 2 DEBUG nova.compute.manager [req-9fed01f1-2637-4802-8d64-028f4337e409 req-4e7ea46d-f01e-43e9-b7be-1cf982696e17 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received event network-changed-06105eee-1ccc-4976-9ef2-84b4765d9a79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:09:04 np0005465988 nova_compute[236126]: 2025-10-02 13:09:04.421 2 DEBUG nova.compute.manager [req-9fed01f1-2637-4802-8d64-028f4337e409 req-4e7ea46d-f01e-43e9-b7be-1cf982696e17 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Refreshing instance network info cache due to event network-changed-06105eee-1ccc-4976-9ef2-84b4765d9a79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:09:04 np0005465988 nova_compute[236126]: 2025-10-02 13:09:04.421 2 DEBUG oslo_concurrency.lockutils [req-9fed01f1-2637-4802-8d64-028f4337e409 req-4e7ea46d-f01e-43e9-b7be-1cf982696e17 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:09:04 np0005465988 nova_compute[236126]: 2025-10-02 13:09:04.422 2 DEBUG oslo_concurrency.lockutils [req-9fed01f1-2637-4802-8d64-028f4337e409 req-4e7ea46d-f01e-43e9-b7be-1cf982696e17 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:09:04 np0005465988 nova_compute[236126]: 2025-10-02 13:09:04.422 2 DEBUG nova.network.neutron [req-9fed01f1-2637-4802-8d64-028f4337e409 req-4e7ea46d-f01e-43e9-b7be-1cf982696e17 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Refreshing network info cache for port 06105eee-1ccc-4976-9ef2-84b4765d9a79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:09:04 np0005465988 nova_compute[236126]: 2025-10-02 13:09:04.543 2 DEBUG nova.compute.manager [req-0997286d-ee4c-4d78-8c2b-bd1a512f0a6c req-2b41151a-334e-4267-87dc-54e65e477856 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received event network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:09:04 np0005465988 nova_compute[236126]: 2025-10-02 13:09:04.544 2 DEBUG oslo_concurrency.lockutils [req-0997286d-ee4c-4d78-8c2b-bd1a512f0a6c req-2b41151a-334e-4267-87dc-54e65e477856 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:04 np0005465988 nova_compute[236126]: 2025-10-02 13:09:04.544 2 DEBUG oslo_concurrency.lockutils [req-0997286d-ee4c-4d78-8c2b-bd1a512f0a6c req-2b41151a-334e-4267-87dc-54e65e477856 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:04 np0005465988 nova_compute[236126]: 2025-10-02 13:09:04.545 2 DEBUG oslo_concurrency.lockutils [req-0997286d-ee4c-4d78-8c2b-bd1a512f0a6c req-2b41151a-334e-4267-87dc-54e65e477856 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:04 np0005465988 nova_compute[236126]: 2025-10-02 13:09:04.545 2 DEBUG nova.compute.manager [req-0997286d-ee4c-4d78-8c2b-bd1a512f0a6c req-2b41151a-334e-4267-87dc-54e65e477856 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] No waiting events found dispatching network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:09:04 np0005465988 nova_compute[236126]: 2025-10-02 13:09:04.545 2 WARNING nova.compute.manager [req-0997286d-ee4c-4d78-8c2b-bd1a512f0a6c req-2b41151a-334e-4267-87dc-54e65e477856 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received unexpected event network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 for instance with vm_state resized and task_state resize_reverting.#033[00m
Oct  2 09:09:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:04.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:05 np0005465988 nova_compute[236126]: 2025-10-02 13:09:05.780 2 DEBUG nova.network.neutron [req-9fed01f1-2637-4802-8d64-028f4337e409 req-4e7ea46d-f01e-43e9-b7be-1cf982696e17 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Updated VIF entry in instance network info cache for port 06105eee-1ccc-4976-9ef2-84b4765d9a79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:09:05 np0005465988 nova_compute[236126]: 2025-10-02 13:09:05.781 2 DEBUG nova.network.neutron [req-9fed01f1-2637-4802-8d64-028f4337e409 req-4e7ea46d-f01e-43e9-b7be-1cf982696e17 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Updating instance_info_cache with network_info: [{"id": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "address": "fa:16:3e:3d:d7:ed", "network": {"id": "41354ccc-5b80-451f-9510-2c3d0788ecf7", "bridge": "br-int", "label": "tempest-network-smoke--166998389", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06105eee-1c", "ovs_interfaceid": "06105eee-1ccc-4976-9ef2-84b4765d9a79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:09:05 np0005465988 nova_compute[236126]: 2025-10-02 13:09:05.801 2 DEBUG oslo_concurrency.lockutils [req-9fed01f1-2637-4802-8d64-028f4337e409 req-4e7ea46d-f01e-43e9-b7be-1cf982696e17 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-6df9bd3b-6218-4859-aba9-bfbedf2b8f18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:09:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:05.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:06 np0005465988 nova_compute[236126]: 2025-10-02 13:09:06.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:09:06 np0005465988 nova_compute[236126]: 2025-10-02 13:09:06.635 2 DEBUG nova.compute.manager [req-b1f925bd-c341-429f-8e36-599082a04d6f req-e181f696-e2ef-43ee-8c13-c2a764504d55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Received event network-changed-7e799a10-8b7f-44c3-b57d-de4a1255aad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:09:06 np0005465988 nova_compute[236126]: 2025-10-02 13:09:06.635 2 DEBUG nova.compute.manager [req-b1f925bd-c341-429f-8e36-599082a04d6f req-e181f696-e2ef-43ee-8c13-c2a764504d55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Refreshing instance network info cache due to event network-changed-7e799a10-8b7f-44c3-b57d-de4a1255aad4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:09:06 np0005465988 nova_compute[236126]: 2025-10-02 13:09:06.636 2 DEBUG oslo_concurrency.lockutils [req-b1f925bd-c341-429f-8e36-599082a04d6f req-e181f696-e2ef-43ee-8c13-c2a764504d55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:09:06 np0005465988 nova_compute[236126]: 2025-10-02 13:09:06.636 2 DEBUG oslo_concurrency.lockutils [req-b1f925bd-c341-429f-8e36-599082a04d6f req-e181f696-e2ef-43ee-8c13-c2a764504d55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:09:06 np0005465988 nova_compute[236126]: 2025-10-02 13:09:06.636 2 DEBUG nova.network.neutron [req-b1f925bd-c341-429f-8e36-599082a04d6f req-e181f696-e2ef-43ee-8c13-c2a764504d55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Refreshing network info cache for port 7e799a10-8b7f-44c3-b57d-de4a1255aad4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:09:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:06.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e408 e408: 3 total, 3 up, 3 in
Oct  2 09:09:07 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:07.081 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '80'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:09:07 np0005465988 nova_compute[236126]: 2025-10-02 13:09:07.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:07 np0005465988 nova_compute[236126]: 2025-10-02 13:09:07.753 2 DEBUG nova.network.neutron [req-b1f925bd-c341-429f-8e36-599082a04d6f req-e181f696-e2ef-43ee-8c13-c2a764504d55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Updated VIF entry in instance network info cache for port 7e799a10-8b7f-44c3-b57d-de4a1255aad4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:09:07 np0005465988 nova_compute[236126]: 2025-10-02 13:09:07.754 2 DEBUG nova.network.neutron [req-b1f925bd-c341-429f-8e36-599082a04d6f req-e181f696-e2ef-43ee-8c13-c2a764504d55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Updating instance_info_cache with network_info: [{"id": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "address": "fa:16:3e:8e:4f:59", "network": {"id": "14e748fc-2d6a-4d69-b120-68c995d49660", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1401412803-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "60bfd415ee154615b20dd99528061614", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e799a10-8b", "ovs_interfaceid": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:09:07 np0005465988 nova_compute[236126]: 2025-10-02 13:09:07.781 2 DEBUG oslo_concurrency.lockutils [req-b1f925bd-c341-429f-8e36-599082a04d6f req-e181f696-e2ef-43ee-8c13-c2a764504d55 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:09:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:09:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:07.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:09:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:08.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:08 np0005465988 nova_compute[236126]: 2025-10-02 13:09:08.785 2 DEBUG nova.compute.manager [req-54d59cdb-8d36-4349-ade5-5b5da97a6cc9 req-30a9e0c5-1043-4161-a64d-231256150091 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received event network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:09:08 np0005465988 nova_compute[236126]: 2025-10-02 13:09:08.786 2 DEBUG oslo_concurrency.lockutils [req-54d59cdb-8d36-4349-ade5-5b5da97a6cc9 req-30a9e0c5-1043-4161-a64d-231256150091 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:08 np0005465988 nova_compute[236126]: 2025-10-02 13:09:08.786 2 DEBUG oslo_concurrency.lockutils [req-54d59cdb-8d36-4349-ade5-5b5da97a6cc9 req-30a9e0c5-1043-4161-a64d-231256150091 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:08 np0005465988 nova_compute[236126]: 2025-10-02 13:09:08.787 2 DEBUG oslo_concurrency.lockutils [req-54d59cdb-8d36-4349-ade5-5b5da97a6cc9 req-30a9e0c5-1043-4161-a64d-231256150091 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:08 np0005465988 nova_compute[236126]: 2025-10-02 13:09:08.787 2 DEBUG nova.compute.manager [req-54d59cdb-8d36-4349-ade5-5b5da97a6cc9 req-30a9e0c5-1043-4161-a64d-231256150091 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] No waiting events found dispatching network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:09:08 np0005465988 nova_compute[236126]: 2025-10-02 13:09:08.788 2 WARNING nova.compute.manager [req-54d59cdb-8d36-4349-ade5-5b5da97a6cc9 req-30a9e0c5-1043-4161-a64d-231256150091 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received unexpected event network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 for instance with vm_state resized and task_state resize_reverting.#033[00m
Oct  2 09:09:08 np0005465988 nova_compute[236126]: 2025-10-02 13:09:08.788 2 DEBUG nova.compute.manager [req-54d59cdb-8d36-4349-ade5-5b5da97a6cc9 req-30a9e0c5-1043-4161-a64d-231256150091 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received event network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:09:08 np0005465988 nova_compute[236126]: 2025-10-02 13:09:08.788 2 DEBUG oslo_concurrency.lockutils [req-54d59cdb-8d36-4349-ade5-5b5da97a6cc9 req-30a9e0c5-1043-4161-a64d-231256150091 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:08 np0005465988 nova_compute[236126]: 2025-10-02 13:09:08.789 2 DEBUG oslo_concurrency.lockutils [req-54d59cdb-8d36-4349-ade5-5b5da97a6cc9 req-30a9e0c5-1043-4161-a64d-231256150091 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:08 np0005465988 nova_compute[236126]: 2025-10-02 13:09:08.789 2 DEBUG oslo_concurrency.lockutils [req-54d59cdb-8d36-4349-ade5-5b5da97a6cc9 req-30a9e0c5-1043-4161-a64d-231256150091 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6df9bd3b-6218-4859-aba9-bfbedf2b8f18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:08 np0005465988 nova_compute[236126]: 2025-10-02 13:09:08.790 2 DEBUG nova.compute.manager [req-54d59cdb-8d36-4349-ade5-5b5da97a6cc9 req-30a9e0c5-1043-4161-a64d-231256150091 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] No waiting events found dispatching network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:09:08 np0005465988 nova_compute[236126]: 2025-10-02 13:09:08.790 2 WARNING nova.compute.manager [req-54d59cdb-8d36-4349-ade5-5b5da97a6cc9 req-30a9e0c5-1043-4161-a64d-231256150091 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Received unexpected event network-vif-plugged-06105eee-1ccc-4976-9ef2-84b4765d9a79 for instance with vm_state resized and task_state resize_reverting.#033[00m
Oct  2 09:09:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:09:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:09.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:09:10 np0005465988 podman[338102]: 2025-10-02 13:09:10.524863628 +0000 UTC m=+0.057441985 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 09:09:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:10.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e409 e409: 3 total, 3 up, 3 in
Oct  2 09:09:11 np0005465988 nova_compute[236126]: 2025-10-02 13:09:11.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:09:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.004000116s ======
Oct  2 09:09:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:11.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000116s
Oct  2 09:09:12 np0005465988 nova_compute[236126]: 2025-10-02 13:09:12.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:12.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:09:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:13.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:09:14 np0005465988 ovn_controller[132601]: 2025-10-02T13:09:14Z|00121|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.6 does not match offer 10.100.0.12
Oct  2 09:09:14 np0005465988 ovn_controller[132601]: 2025-10-02T13:09:14Z|00122|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:8e:4f:59 10.100.0.12
Oct  2 09:09:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:14.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:15.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:16 np0005465988 nova_compute[236126]: 2025-10-02 13:09:16.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:09:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:09:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:16.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:09:17 np0005465988 nova_compute[236126]: 2025-10-02 13:09:17.260 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759410542.2586715, 6df9bd3b-6218-4859-aba9-bfbedf2b8f18 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:09:17 np0005465988 nova_compute[236126]: 2025-10-02 13:09:17.261 2 INFO nova.compute.manager [-] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] VM Stopped (Lifecycle Event)#033[00m
Oct  2 09:09:17 np0005465988 nova_compute[236126]: 2025-10-02 13:09:17.290 2 DEBUG nova.compute.manager [None req-225106a5-3a1f-4eb0-9e44-626f82b56313 - - - - - -] [instance: 6df9bd3b-6218-4859-aba9-bfbedf2b8f18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:09:17 np0005465988 nova_compute[236126]: 2025-10-02 13:09:17.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:17.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:17 np0005465988 ovn_controller[132601]: 2025-10-02T13:09:17Z|00123|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.6 does not match offer 10.100.0.12
Oct  2 09:09:17 np0005465988 ovn_controller[132601]: 2025-10-02T13:09:17Z|00124|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:8e:4f:59 10.100.0.12
Oct  2 09:09:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:18.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:19 np0005465988 ovn_controller[132601]: 2025-10-02T13:09:19Z|00125|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8e:4f:59 10.100.0.12
Oct  2 09:09:19 np0005465988 ovn_controller[132601]: 2025-10-02T13:09:19Z|00126|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8e:4f:59 10.100.0.12
Oct  2 09:09:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:19.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:20.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:21 np0005465988 nova_compute[236126]: 2025-10-02 13:09:21.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:09:21 np0005465988 nova_compute[236126]: 2025-10-02 13:09:21.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:09:21 np0005465988 podman[338128]: 2025-10-02 13:09:21.538809914 +0000 UTC m=+0.058863147 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 09:09:21 np0005465988 podman[338129]: 2025-10-02 13:09:21.539104722 +0000 UTC m=+0.067877306 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd)
Oct  2 09:09:21 np0005465988 podman[338127]: 2025-10-02 13:09:21.559582552 +0000 UTC m=+0.097974773 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 09:09:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:21.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:22 np0005465988 nova_compute[236126]: 2025-10-02 13:09:22.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:22.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:09:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:23.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:09:24 np0005465988 nova_compute[236126]: 2025-10-02 13:09:24.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:09:24 np0005465988 nova_compute[236126]: 2025-10-02 13:09:24.507 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:24 np0005465988 nova_compute[236126]: 2025-10-02 13:09:24.508 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:24 np0005465988 nova_compute[236126]: 2025-10-02 13:09:24.508 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:24 np0005465988 nova_compute[236126]: 2025-10-02 13:09:24.508 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:09:24 np0005465988 nova_compute[236126]: 2025-10-02 13:09:24.509 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:09:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:24.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:09:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1313784074' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:09:24 np0005465988 nova_compute[236126]: 2025-10-02 13:09:24.995 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.072 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000d2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.073 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000d2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.228 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.230 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3773MB free_disk=20.890811920166016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.230 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.231 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.324 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 0f002be2-0f9d-4b3b-a8b2-552c569f0d28 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.324 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.325 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.374 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:09:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:09:25 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2710832825' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.855 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:09:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:25.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.863 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.886 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.906 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:09:25 np0005465988 nova_compute[236126]: 2025-10-02 13:09:25.907 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:26 np0005465988 nova_compute[236126]: 2025-10-02 13:09:26.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:09:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:26.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:27 np0005465988 nova_compute[236126]: 2025-10-02 13:09:27.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:27.416 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:27.416 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:27.417 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:27.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:28.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:09:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:29.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:09:30 np0005465988 nova_compute[236126]: 2025-10-02 13:09:30.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:09:30 np0005465988 nova_compute[236126]: 2025-10-02 13:09:30.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 09:09:30 np0005465988 nova_compute[236126]: 2025-10-02 13:09:30.499 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 09:09:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:30.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:31 np0005465988 nova_compute[236126]: 2025-10-02 13:09:31.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:09:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:31.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:32 np0005465988 nova_compute[236126]: 2025-10-02 13:09:32.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:09:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:32.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:09:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:33.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:09:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:34.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:09:35 np0005465988 nova_compute[236126]: 2025-10-02 13:09:35.500 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:09:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:09:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:35.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:09:36 np0005465988 nova_compute[236126]: 2025-10-02 13:09:36.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:09:36 np0005465988 ovn_controller[132601]: 2025-10-02T13:09:36Z|00959|binding|INFO|Releasing lport 9fb48fb0-3960-4f8b-94a5-d7518fe00fe8 from this chassis (sb_readonly=0)
Oct  2 09:09:36 np0005465988 nova_compute[236126]: 2025-10-02 13:09:36.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:36 np0005465988 nova_compute[236126]: 2025-10-02 13:09:36.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:09:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:09:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:36.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:09:37 np0005465988 nova_compute[236126]: 2025-10-02 13:09:37.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:09:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:37.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:09:37 np0005465988 nova_compute[236126]: 2025-10-02 13:09:37.992 2 DEBUG nova.compute.manager [None req-f9fb5908-c9f2-427b-9d1b-7603b33c292f 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:09:38 np0005465988 nova_compute[236126]: 2025-10-02 13:09:38.073 2 INFO nova.compute.manager [None req-f9fb5908-c9f2-427b-9d1b-7603b33c292f 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] instance snapshotting#033[00m
Oct  2 09:09:38 np0005465988 nova_compute[236126]: 2025-10-02 13:09:38.387 2 INFO nova.virt.libvirt.driver [None req-f9fb5908-c9f2-427b-9d1b-7603b33c292f 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Beginning live snapshot process#033[00m
Oct  2 09:09:38 np0005465988 nova_compute[236126]: 2025-10-02 13:09:38.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:09:38 np0005465988 nova_compute[236126]: 2025-10-02 13:09:38.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:09:38 np0005465988 nova_compute[236126]: 2025-10-02 13:09:38.627 2 DEBUG nova.storage.rbd_utils [None req-f9fb5908-c9f2-427b-9d1b-7603b33c292f 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] creating snapshot(96ee5f4f1d1a4cbc9569efbd8bdf7bb0) on rbd image(0f002be2-0f9d-4b3b-a8b2-552c569f0d28_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 09:09:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:38.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e410 e410: 3 total, 3 up, 3 in
Oct  2 09:09:39 np0005465988 nova_compute[236126]: 2025-10-02 13:09:39.062 2 DEBUG nova.storage.rbd_utils [None req-f9fb5908-c9f2-427b-9d1b-7603b33c292f 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] cloning vms/0f002be2-0f9d-4b3b-a8b2-552c569f0d28_disk@96ee5f4f1d1a4cbc9569efbd8bdf7bb0 to images/ef37c5e7-f078-42ef-9403-bff6103a0870 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Oct  2 09:09:39 np0005465988 nova_compute[236126]: 2025-10-02 13:09:39.233 2 DEBUG nova.storage.rbd_utils [None req-f9fb5908-c9f2-427b-9d1b-7603b33c292f 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] flattening images/ef37c5e7-f078-42ef-9403-bff6103a0870 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Oct  2 09:09:39 np0005465988 nova_compute[236126]: 2025-10-02 13:09:39.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:09:39 np0005465988 nova_compute[236126]: 2025-10-02 13:09:39.747 2 DEBUG nova.storage.rbd_utils [None req-f9fb5908-c9f2-427b-9d1b-7603b33c292f 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] removing snapshot(96ee5f4f1d1a4cbc9569efbd8bdf7bb0) on rbd image(0f002be2-0f9d-4b3b-a8b2-552c569f0d28_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Oct  2 09:09:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:09:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:39.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:09:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e411 e411: 3 total, 3 up, 3 in
Oct  2 09:09:40 np0005465988 nova_compute[236126]: 2025-10-02 13:09:40.070 2 DEBUG nova.storage.rbd_utils [None req-f9fb5908-c9f2-427b-9d1b-7603b33c292f 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] creating snapshot(snap) on rbd image(ef37c5e7-f078-42ef-9403-bff6103a0870) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Oct  2 09:09:40 np0005465988 nova_compute[236126]: 2025-10-02 13:09:40.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:09:40 np0005465988 nova_compute[236126]: 2025-10-02 13:09:40.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:09:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:40.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:41 np0005465988 nova_compute[236126]: 2025-10-02 13:09:41.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:09:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e412 e412: 3 total, 3 up, 3 in
Oct  2 09:09:41 np0005465988 podman[338439]: 2025-10-02 13:09:41.521522492 +0000 UTC m=+0.062375968 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:09:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:09:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:41.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:09:42 np0005465988 nova_compute[236126]: 2025-10-02 13:09:42.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:42.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:43 np0005465988 nova_compute[236126]: 2025-10-02 13:09:43.029 2 INFO nova.virt.libvirt.driver [None req-f9fb5908-c9f2-427b-9d1b-7603b33c292f 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Snapshot image upload complete#033[00m
Oct  2 09:09:43 np0005465988 nova_compute[236126]: 2025-10-02 13:09:43.030 2 INFO nova.compute.manager [None req-f9fb5908-c9f2-427b-9d1b-7603b33c292f 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Took 4.96 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 09:09:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:43.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:44.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e413 e413: 3 total, 3 up, 3 in
Oct  2 09:09:45 np0005465988 nova_compute[236126]: 2025-10-02 13:09:45.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:45 np0005465988 nova_compute[236126]: 2025-10-02 13:09:45.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:09:45 np0005465988 nova_compute[236126]: 2025-10-02 13:09:45.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:09:45 np0005465988 nova_compute[236126]: 2025-10-02 13:09:45.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:09:45 np0005465988 nova_compute[236126]: 2025-10-02 13:09:45.745 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:09:45 np0005465988 nova_compute[236126]: 2025-10-02 13:09:45.746 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:09:45 np0005465988 nova_compute[236126]: 2025-10-02 13:09:45.746 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 09:09:45 np0005465988 nova_compute[236126]: 2025-10-02 13:09:45.746 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0f002be2-0f9d-4b3b-a8b2-552c569f0d28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:09:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:45.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:45.917 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=81, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=80) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:09:45 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:45.918 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:09:45 np0005465988 nova_compute[236126]: 2025-10-02 13:09:45.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:46 np0005465988 nova_compute[236126]: 2025-10-02 13:09:46.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:09:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:46.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:46 np0005465988 nova_compute[236126]: 2025-10-02 13:09:46.761 2 DEBUG nova.compute.manager [req-b7a9ec79-66fa-4a6f-b4ed-9bce02ab4042 req-e6254f25-c5e0-4116-a632-1c7efe3e634f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Received event network-changed-7e799a10-8b7f-44c3-b57d-de4a1255aad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:09:46 np0005465988 nova_compute[236126]: 2025-10-02 13:09:46.762 2 DEBUG nova.compute.manager [req-b7a9ec79-66fa-4a6f-b4ed-9bce02ab4042 req-e6254f25-c5e0-4116-a632-1c7efe3e634f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Refreshing instance network info cache due to event network-changed-7e799a10-8b7f-44c3-b57d-de4a1255aad4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:09:46 np0005465988 nova_compute[236126]: 2025-10-02 13:09:46.762 2 DEBUG oslo_concurrency.lockutils [req-b7a9ec79-66fa-4a6f-b4ed-9bce02ab4042 req-e6254f25-c5e0-4116-a632-1c7efe3e634f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:09:46 np0005465988 nova_compute[236126]: 2025-10-02 13:09:46.811 2 DEBUG oslo_concurrency.lockutils [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Acquiring lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:46 np0005465988 nova_compute[236126]: 2025-10-02 13:09:46.812 2 DEBUG oslo_concurrency.lockutils [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:46 np0005465988 nova_compute[236126]: 2025-10-02 13:09:46.813 2 DEBUG oslo_concurrency.lockutils [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Acquiring lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:46 np0005465988 nova_compute[236126]: 2025-10-02 13:09:46.813 2 DEBUG oslo_concurrency.lockutils [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:46 np0005465988 nova_compute[236126]: 2025-10-02 13:09:46.814 2 DEBUG oslo_concurrency.lockutils [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:46 np0005465988 nova_compute[236126]: 2025-10-02 13:09:46.815 2 INFO nova.compute.manager [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Terminating instance#033[00m
Oct  2 09:09:46 np0005465988 nova_compute[236126]: 2025-10-02 13:09:46.817 2 DEBUG nova.compute.manager [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 09:09:46 np0005465988 kernel: tap7e799a10-8b (unregistering): left promiscuous mode
Oct  2 09:09:46 np0005465988 NetworkManager[45041]: <info>  [1759410586.8784] device (tap7e799a10-8b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:09:46 np0005465988 nova_compute[236126]: 2025-10-02 13:09:46.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:46 np0005465988 ovn_controller[132601]: 2025-10-02T13:09:46Z|00960|binding|INFO|Releasing lport 7e799a10-8b7f-44c3-b57d-de4a1255aad4 from this chassis (sb_readonly=0)
Oct  2 09:09:46 np0005465988 ovn_controller[132601]: 2025-10-02T13:09:46Z|00961|binding|INFO|Setting lport 7e799a10-8b7f-44c3-b57d-de4a1255aad4 down in Southbound
Oct  2 09:09:46 np0005465988 ovn_controller[132601]: 2025-10-02T13:09:46Z|00962|binding|INFO|Removing iface tap7e799a10-8b ovn-installed in OVS
Oct  2 09:09:46 np0005465988 nova_compute[236126]: 2025-10-02 13:09:46.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:46.895 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:4f:59 10.100.0.12'], port_security=['fa:16:3e:8e:4f:59 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0f002be2-0f9d-4b3b-a8b2-552c569f0d28', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14e748fc-2d6a-4d69-b120-68c995d49660', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '60bfd415ee154615b20dd99528061614', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6863a9d1-67a6-432a-b497-906487ecb0c1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93cd313a-87a8-4538-814c-550ca73d3eca, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=7e799a10-8b7f-44c3-b57d-de4a1255aad4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:09:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:46.896 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 7e799a10-8b7f-44c3-b57d-de4a1255aad4 in datapath 14e748fc-2d6a-4d69-b120-68c995d49660 unbound from our chassis#033[00m
Oct  2 09:09:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:46.897 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 14e748fc-2d6a-4d69-b120-68c995d49660, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 09:09:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:46.899 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8ff94af5-fd83-4e36-a9a5-82ca7f095f26]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:46 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:46.899 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660 namespace which is not needed anymore#033[00m
Oct  2 09:09:46 np0005465988 nova_compute[236126]: 2025-10-02 13:09:46.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:46 np0005465988 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000d2.scope: Deactivated successfully.
Oct  2 09:09:46 np0005465988 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000d2.scope: Consumed 16.994s CPU time.
Oct  2 09:09:46 np0005465988 systemd-machined[192594]: Machine qemu-100-instance-000000d2 terminated.
Oct  2 09:09:47 np0005465988 neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660[337878]: [NOTICE]   (337882) : haproxy version is 2.8.14-c23fe91
Oct  2 09:09:47 np0005465988 neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660[337878]: [NOTICE]   (337882) : path to executable is /usr/sbin/haproxy
Oct  2 09:09:47 np0005465988 neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660[337878]: [WARNING]  (337882) : Exiting Master process...
Oct  2 09:09:47 np0005465988 neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660[337878]: [ALERT]    (337882) : Current worker (337884) exited with code 143 (Terminated)
Oct  2 09:09:47 np0005465988 neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660[337878]: [WARNING]  (337882) : All workers exited. Exiting... (0)
Oct  2 09:09:47 np0005465988 systemd[1]: libpod-a8ca21dd87b77de70490bf3f8ea1bcb26046077d6bb388144fa5c47d33de864f.scope: Deactivated successfully.
Oct  2 09:09:47 np0005465988 podman[338534]: 2025-10-02 13:09:47.060030978 +0000 UTC m=+0.055897491 container died a8ca21dd87b77de70490bf3f8ea1bcb26046077d6bb388144fa5c47d33de864f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.063 2 INFO nova.virt.libvirt.driver [-] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Instance destroyed successfully.#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.064 2 DEBUG nova.objects.instance [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lazy-loading 'resources' on Instance uuid 0f002be2-0f9d-4b3b-a8b2-552c569f0d28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.082 2 DEBUG nova.virt.libvirt.vif [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:08:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1337888295',display_name='tempest-TestSnapshotPattern-server-1337888295',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1337888295',id=210,image_ref='a1241e22-2f30-42e3-8072-a21ad0ab0f69',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMla0NYshozjIhqPshVDB7V9QQUdVkTPcrZs6EBQpt1OKq+5dB8xVPlwumBL6FU6d1oBZUn9yPH7sBT9aKh0ThWjhX3hBNPhKbMCSDMfEKO24D2SpIWzAY4pSc/pr8RlQ==',key_name='tempest-TestSnapshotPattern-747682339',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:09:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='60bfd415ee154615b20dd99528061614',ramdisk_id='',reservation_id='r-nxgj21az',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='fc398644-66fc-44e3-9a6a-7389f5a542b8',image_min_disk='1',image_min_ram='0',image_owner_id='60bfd415ee154615b20dd99528061614',image_owner_project_name='tempest-TestSnapshotPattern-400150385',image_owner_user_name='tempest-TestSnapshotPattern-400150385-project-member',image_user_id='29c8a28c5bdd4feb9412127428bf0c3b',image_version='8.0',owner_project_name='tempest-TestSnapshotPattern-400150385',owner_user_name='tempest-TestSnapshotPattern-400150385-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:09:43Z,user_data=None,user_id='29c8a28c5bdd4feb9412127428bf0c3b',uuid=0f002be2-0f9d-4b3b-a8b2-552c569f0d28,vcpu_model=<?>
,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "address": "fa:16:3e:8e:4f:59", "network": {"id": "14e748fc-2d6a-4d69-b120-68c995d49660", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1401412803-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "60bfd415ee154615b20dd99528061614", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e799a10-8b", "ovs_interfaceid": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.083 2 DEBUG nova.network.os_vif_util [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Converting VIF {"id": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "address": "fa:16:3e:8e:4f:59", "network": {"id": "14e748fc-2d6a-4d69-b120-68c995d49660", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1401412803-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "60bfd415ee154615b20dd99528061614", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e799a10-8b", "ovs_interfaceid": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.084 2 DEBUG nova.network.os_vif_util [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8e:4f:59,bridge_name='br-int',has_traffic_filtering=True,id=7e799a10-8b7f-44c3-b57d-de4a1255aad4,network=Network(14e748fc-2d6a-4d69-b120-68c995d49660),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e799a10-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.084 2 DEBUG os_vif [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:4f:59,bridge_name='br-int',has_traffic_filtering=True,id=7e799a10-8b7f-44c3-b57d-de4a1255aad4,network=Network(14e748fc-2d6a-4d69-b120-68c995d49660),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e799a10-8b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.087 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e799a10-8b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:09:47 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a8ca21dd87b77de70490bf3f8ea1bcb26046077d6bb388144fa5c47d33de864f-userdata-shm.mount: Deactivated successfully.
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:47 np0005465988 systemd[1]: var-lib-containers-storage-overlay-f9538658bcb5de11794a01a1cc3297861503ba596716c611260930076d3889bc-merged.mount: Deactivated successfully.
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.148 2 INFO os_vif [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:4f:59,bridge_name='br-int',has_traffic_filtering=True,id=7e799a10-8b7f-44c3-b57d-de4a1255aad4,network=Network(14e748fc-2d6a-4d69-b120-68c995d49660),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e799a10-8b')#033[00m
Oct  2 09:09:47 np0005465988 podman[338534]: 2025-10-02 13:09:47.157265918 +0000 UTC m=+0.153132431 container cleanup a8ca21dd87b77de70490bf3f8ea1bcb26046077d6bb388144fa5c47d33de864f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:09:47 np0005465988 systemd[1]: libpod-conmon-a8ca21dd87b77de70490bf3f8ea1bcb26046077d6bb388144fa5c47d33de864f.scope: Deactivated successfully.
Oct  2 09:09:47 np0005465988 podman[338588]: 2025-10-02 13:09:47.240889247 +0000 UTC m=+0.054383008 container remove a8ca21dd87b77de70490bf3f8ea1bcb26046077d6bb388144fa5c47d33de864f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:09:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:47.250 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ebe5a40b-d595-43a8-b82c-5f7335039789]: (4, ('Thu Oct  2 01:09:46 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660 (a8ca21dd87b77de70490bf3f8ea1bcb26046077d6bb388144fa5c47d33de864f)\na8ca21dd87b77de70490bf3f8ea1bcb26046077d6bb388144fa5c47d33de864f\nThu Oct  2 01:09:47 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660 (a8ca21dd87b77de70490bf3f8ea1bcb26046077d6bb388144fa5c47d33de864f)\na8ca21dd87b77de70490bf3f8ea1bcb26046077d6bb388144fa5c47d33de864f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:47.252 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cc0d8643-fdbe-4607-859a-93cd2a84bc9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:47.253 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14e748fc-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:47 np0005465988 kernel: tap14e748fc-20: left promiscuous mode
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:47.262 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[214f9cba-a8cc-4c6e-9d6f-1e43b258bb2f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:47.303 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[513df444-6a65-4275-9054-fc3101dd5a54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:47.304 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a6cdcdae-7466-4788-9b31-b913e4775856]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:47.324 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[48f879af-0086-4613-a8a1-d635f7fd9118]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 859910, 'reachable_time': 32238, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 338607, 'error': None, 'target': 'ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:47 np0005465988 systemd[1]: run-netns-ovnmeta\x2d14e748fc\x2d2d6a\x2d4d69\x2db120\x2d68c995d49660.mount: Deactivated successfully.
Oct  2 09:09:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:47.328 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-14e748fc-2d6a-4d69-b120-68c995d49660 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 09:09:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:47.328 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[984f90fa-f64d-4e7e-a76c-05e2bab5297a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.344 2 DEBUG nova.compute.manager [req-22a8239f-7298-4ea9-932b-829924d85efd req-17d88959-73fa-4fe1-bcdc-21698db4c1a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Received event network-vif-unplugged-7e799a10-8b7f-44c3-b57d-de4a1255aad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.344 2 DEBUG oslo_concurrency.lockutils [req-22a8239f-7298-4ea9-932b-829924d85efd req-17d88959-73fa-4fe1-bcdc-21698db4c1a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.345 2 DEBUG oslo_concurrency.lockutils [req-22a8239f-7298-4ea9-932b-829924d85efd req-17d88959-73fa-4fe1-bcdc-21698db4c1a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.345 2 DEBUG oslo_concurrency.lockutils [req-22a8239f-7298-4ea9-932b-829924d85efd req-17d88959-73fa-4fe1-bcdc-21698db4c1a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.345 2 DEBUG nova.compute.manager [req-22a8239f-7298-4ea9-932b-829924d85efd req-17d88959-73fa-4fe1-bcdc-21698db4c1a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] No waiting events found dispatching network-vif-unplugged-7e799a10-8b7f-44c3-b57d-de4a1255aad4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.345 2 DEBUG nova.compute.manager [req-22a8239f-7298-4ea9-932b-829924d85efd req-17d88959-73fa-4fe1-bcdc-21698db4c1a0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Received event network-vif-unplugged-7e799a10-8b7f-44c3-b57d-de4a1255aad4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.631 2 INFO nova.virt.libvirt.driver [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Deleting instance files /var/lib/nova/instances/0f002be2-0f9d-4b3b-a8b2-552c569f0d28_del#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.632 2 INFO nova.virt.libvirt.driver [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Deletion of /var/lib/nova/instances/0f002be2-0f9d-4b3b-a8b2-552c569f0d28_del complete#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.889 2 INFO nova.compute.manager [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Took 1.07 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.890 2 DEBUG oslo.service.loopingcall [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.890 2 DEBUG nova.compute.manager [-] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.890 2 DEBUG nova.network.neutron [-] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.903 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Updating instance_info_cache with network_info: [{"id": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "address": "fa:16:3e:8e:4f:59", "network": {"id": "14e748fc-2d6a-4d69-b120-68c995d49660", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1401412803-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "60bfd415ee154615b20dd99528061614", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e799a10-8b", "ovs_interfaceid": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:09:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:47.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.946 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.947 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.949 2 DEBUG oslo_concurrency.lockutils [req-b7a9ec79-66fa-4a6f-b4ed-9bce02ab4042 req-e6254f25-c5e0-4116-a632-1c7efe3e634f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:09:47 np0005465988 nova_compute[236126]: 2025-10-02 13:09:47.950 2 DEBUG nova.network.neutron [req-b7a9ec79-66fa-4a6f-b4ed-9bce02ab4042 req-e6254f25-c5e0-4116-a632-1c7efe3e634f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Refreshing network info cache for port 7e799a10-8b7f-44c3-b57d-de4a1255aad4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:09:48 np0005465988 nova_compute[236126]: 2025-10-02 13:09:48.682 2 DEBUG nova.network.neutron [-] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:09:48 np0005465988 nova_compute[236126]: 2025-10-02 13:09:48.702 2 INFO nova.compute.manager [-] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Took 0.81 seconds to deallocate network for instance.#033[00m
Oct  2 09:09:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:09:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:48.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:09:48 np0005465988 nova_compute[236126]: 2025-10-02 13:09:48.759 2 DEBUG oslo_concurrency.lockutils [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:48 np0005465988 nova_compute[236126]: 2025-10-02 13:09:48.760 2 DEBUG oslo_concurrency.lockutils [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:48 np0005465988 nova_compute[236126]: 2025-10-02 13:09:48.778 2 DEBUG nova.compute.manager [req-2115cd6a-de0a-4566-857e-47769b3a58be req-37c37103-7d51-471e-bb6a-5ca1ef47fa3b d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Received event network-vif-deleted-7e799a10-8b7f-44c3-b57d-de4a1255aad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:09:48 np0005465988 nova_compute[236126]: 2025-10-02 13:09:48.823 2 DEBUG oslo_concurrency.processutils [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.245 2 DEBUG nova.network.neutron [req-b7a9ec79-66fa-4a6f-b4ed-9bce02ab4042 req-e6254f25-c5e0-4116-a632-1c7efe3e634f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Updated VIF entry in instance network info cache for port 7e799a10-8b7f-44c3-b57d-de4a1255aad4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.247 2 DEBUG nova.network.neutron [req-b7a9ec79-66fa-4a6f-b4ed-9bce02ab4042 req-e6254f25-c5e0-4116-a632-1c7efe3e634f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Updating instance_info_cache with network_info: [{"id": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "address": "fa:16:3e:8e:4f:59", "network": {"id": "14e748fc-2d6a-4d69-b120-68c995d49660", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1401412803-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "60bfd415ee154615b20dd99528061614", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e799a10-8b", "ovs_interfaceid": "7e799a10-8b7f-44c3-b57d-de4a1255aad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.268 2 DEBUG oslo_concurrency.lockutils [req-b7a9ec79-66fa-4a6f-b4ed-9bce02ab4042 req-e6254f25-c5e0-4116-a632-1c7efe3e634f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-0f002be2-0f9d-4b3b-a8b2-552c569f0d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:09:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:09:49 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2119487108' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.305 2 DEBUG oslo_concurrency.processutils [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.312 2 DEBUG nova.compute.provider_tree [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.333 2 DEBUG nova.scheduler.client.report [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.357 2 DEBUG oslo_concurrency.lockutils [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.598s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.391 2 INFO nova.scheduler.client.report [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Deleted allocations for instance 0f002be2-0f9d-4b3b-a8b2-552c569f0d28#033[00m
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.444 2 DEBUG nova.compute.manager [req-3523b5fa-c0a3-4638-91d1-e0a24bf99c2e req-86b5904f-131a-4550-a2a8-10a4990794c9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Received event network-vif-plugged-7e799a10-8b7f-44c3-b57d-de4a1255aad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.444 2 DEBUG oslo_concurrency.lockutils [req-3523b5fa-c0a3-4638-91d1-e0a24bf99c2e req-86b5904f-131a-4550-a2a8-10a4990794c9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.445 2 DEBUG oslo_concurrency.lockutils [req-3523b5fa-c0a3-4638-91d1-e0a24bf99c2e req-86b5904f-131a-4550-a2a8-10a4990794c9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.445 2 DEBUG oslo_concurrency.lockutils [req-3523b5fa-c0a3-4638-91d1-e0a24bf99c2e req-86b5904f-131a-4550-a2a8-10a4990794c9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.446 2 DEBUG nova.compute.manager [req-3523b5fa-c0a3-4638-91d1-e0a24bf99c2e req-86b5904f-131a-4550-a2a8-10a4990794c9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] No waiting events found dispatching network-vif-plugged-7e799a10-8b7f-44c3-b57d-de4a1255aad4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.446 2 WARNING nova.compute.manager [req-3523b5fa-c0a3-4638-91d1-e0a24bf99c2e req-86b5904f-131a-4550-a2a8-10a4990794c9 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Received unexpected event network-vif-plugged-7e799a10-8b7f-44c3-b57d-de4a1255aad4 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 09:09:49 np0005465988 nova_compute[236126]: 2025-10-02 13:09:49.459 2 DEBUG oslo_concurrency.lockutils [None req-2f8daca8-ad76-4eb4-96cf-b7b040002f2c 29c8a28c5bdd4feb9412127428bf0c3b 60bfd415ee154615b20dd99528061614 - - default default] Lock "0f002be2-0f9d-4b3b-a8b2-552c569f0d28" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:09:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:49.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:49 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:09:49.920 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '81'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:09:50 np0005465988 nova_compute[236126]: 2025-10-02 13:09:50.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e414 e414: 3 total, 3 up, 3 in
Oct  2 09:09:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:50.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e415 e415: 3 total, 3 up, 3 in
Oct  2 09:09:51 np0005465988 nova_compute[236126]: 2025-10-02 13:09:51.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:09:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:51.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:52 np0005465988 nova_compute[236126]: 2025-10-02 13:09:52.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:09:52 np0005465988 podman[338634]: 2025-10-02 13:09:52.551769466 +0000 UTC m=+0.076589207 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0)
Oct  2 09:09:52 np0005465988 podman[338635]: 2025-10-02 13:09:52.57482208 +0000 UTC m=+0.088894302 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 09:09:52 np0005465988 podman[338633]: 2025-10-02 13:09:52.60327544 +0000 UTC m=+0.132864958 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller)
Oct  2 09:09:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:52.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:53 np0005465988 nova_compute[236126]: 2025-10-02 13:09:53.500 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "6087853f-327c-46b2-baa6-c24854d98b97" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:53 np0005465988 nova_compute[236126]: 2025-10-02 13:09:53.501 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:53 np0005465988 nova_compute[236126]: 2025-10-02 13:09:53.527 2 DEBUG nova.compute.manager [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 09:09:53 np0005465988 nova_compute[236126]: 2025-10-02 13:09:53.616 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:09:53 np0005465988 nova_compute[236126]: 2025-10-02 13:09:53.616 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:09:53 np0005465988 nova_compute[236126]: 2025-10-02 13:09:53.623 2 DEBUG nova.virt.hardware [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 09:09:53 np0005465988 nova_compute[236126]: 2025-10-02 13:09:53.623 2 INFO nova.compute.claims [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 09:09:53 np0005465988 nova_compute[236126]: 2025-10-02 13:09:53.745 2 DEBUG oslo_concurrency.processutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:09:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:53.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:09:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/801396701' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.251 2 DEBUG oslo_concurrency.processutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.259 2 DEBUG nova.compute.provider_tree [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.281 2 DEBUG nova.scheduler.client.report [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.308 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.309 2 DEBUG nova.compute.manager [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.364 2 DEBUG nova.compute.manager [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.364 2 DEBUG nova.network.neutron [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.385 2 INFO nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.403 2 DEBUG nova.compute.manager [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.514 2 DEBUG nova.compute.manager [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.516 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.517 2 INFO nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Creating image(s)
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.552 2 DEBUG nova.storage.rbd_utils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] rbd image 6087853f-327c-46b2-baa6-c24854d98b97_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.584 2 DEBUG nova.storage.rbd_utils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] rbd image 6087853f-327c-46b2-baa6-c24854d98b97_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.615 2 DEBUG nova.storage.rbd_utils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] rbd image 6087853f-327c-46b2-baa6-c24854d98b97_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.620 2 DEBUG oslo_concurrency.processutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.678 2 DEBUG nova.policy [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ffe4d737e4414fb3a3e358f8ca3f3e1e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '08e102ae48244af2ab448a2e1ff757df', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.731 2 DEBUG oslo_concurrency.processutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.111s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.733 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.734 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.734 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:09:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:54.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.773 2 DEBUG nova.storage.rbd_utils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] rbd image 6087853f-327c-46b2-baa6-c24854d98b97_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 09:09:54 np0005465988 nova_compute[236126]: 2025-10-02 13:09:54.778 2 DEBUG oslo_concurrency.processutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 6087853f-327c-46b2-baa6-c24854d98b97_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:09:55 np0005465988 nova_compute[236126]: 2025-10-02 13:09:55.591 2 DEBUG oslo_concurrency.processutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 6087853f-327c-46b2-baa6-c24854d98b97_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.812s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 09:09:55 np0005465988 nova_compute[236126]: 2025-10-02 13:09:55.678 2 DEBUG nova.network.neutron [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Successfully created port: 994b5877-4440-4c86-be4e-fd60053ca206 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 09:09:55 np0005465988 nova_compute[236126]: 2025-10-02 13:09:55.688 2 DEBUG nova.storage.rbd_utils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] resizing rbd image 6087853f-327c-46b2-baa6-c24854d98b97_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct  2 09:09:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:09:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:55.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:09:56 np0005465988 nova_compute[236126]: 2025-10-02 13:09:56.011 2 DEBUG nova.objects.instance [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'migration_context' on Instance uuid 6087853f-327c-46b2-baa6-c24854d98b97 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 09:09:56 np0005465988 nova_compute[236126]: 2025-10-02 13:09:56.033 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 09:09:56 np0005465988 nova_compute[236126]: 2025-10-02 13:09:56.034 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Ensure instance console log exists: /var/lib/nova/instances/6087853f-327c-46b2-baa6-c24854d98b97/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 09:09:56 np0005465988 nova_compute[236126]: 2025-10-02 13:09:56.034 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:09:56 np0005465988 nova_compute[236126]: 2025-10-02 13:09:56.034 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:09:56 np0005465988 nova_compute[236126]: 2025-10-02 13:09:56.035 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:09:56 np0005465988 nova_compute[236126]: 2025-10-02 13:09:56.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:09:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:09:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:56.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:56 np0005465988 nova_compute[236126]: 2025-10-02 13:09:56.978 2 DEBUG nova.network.neutron [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Successfully updated port: 994b5877-4440-4c86-be4e-fd60053ca206 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 09:09:57 np0005465988 nova_compute[236126]: 2025-10-02 13:09:57.015 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "refresh_cache-6087853f-327c-46b2-baa6-c24854d98b97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 09:09:57 np0005465988 nova_compute[236126]: 2025-10-02 13:09:57.016 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquired lock "refresh_cache-6087853f-327c-46b2-baa6-c24854d98b97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 09:09:57 np0005465988 nova_compute[236126]: 2025-10-02 13:09:57.016 2 DEBUG nova.network.neutron [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 09:09:57 np0005465988 nova_compute[236126]: 2025-10-02 13:09:57.087 2 DEBUG nova.compute.manager [req-b958c3bf-fbed-4a3a-a383-3b5947a7902a req-c06c8051-42c9-4bb0-8595-2f6347b13477 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-changed-994b5877-4440-4c86-be4e-fd60053ca206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 09:09:57 np0005465988 nova_compute[236126]: 2025-10-02 13:09:57.088 2 DEBUG nova.compute.manager [req-b958c3bf-fbed-4a3a-a383-3b5947a7902a req-c06c8051-42c9-4bb0-8595-2f6347b13477 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Refreshing instance network info cache due to event network-changed-994b5877-4440-4c86-be4e-fd60053ca206. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 09:09:57 np0005465988 nova_compute[236126]: 2025-10-02 13:09:57.088 2 DEBUG oslo_concurrency.lockutils [req-b958c3bf-fbed-4a3a-a383-3b5947a7902a req-c06c8051-42c9-4bb0-8595-2f6347b13477 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-6087853f-327c-46b2-baa6-c24854d98b97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 09:09:57 np0005465988 nova_compute[236126]: 2025-10-02 13:09:57.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:09:57 np0005465988 nova_compute[236126]: 2025-10-02 13:09:57.755 2 DEBUG nova.network.neutron [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 09:09:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:57.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:09:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:09:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:09:58.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.083 2 DEBUG nova.network.neutron [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Updating instance_info_cache with network_info: [{"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.112 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Releasing lock "refresh_cache-6087853f-327c-46b2-baa6-c24854d98b97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.112 2 DEBUG nova.compute.manager [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Instance network_info: |[{"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.113 2 DEBUG oslo_concurrency.lockutils [req-b958c3bf-fbed-4a3a-a383-3b5947a7902a req-c06c8051-42c9-4bb0-8595-2f6347b13477 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-6087853f-327c-46b2-baa6-c24854d98b97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.113 2 DEBUG nova.network.neutron [req-b958c3bf-fbed-4a3a-a383-3b5947a7902a req-c06c8051-42c9-4bb0-8595-2f6347b13477 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Refreshing network info cache for port 994b5877-4440-4c86-be4e-fd60053ca206 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.115 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Start _get_guest_xml network_info=[{"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.120 2 WARNING nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.127 2 DEBUG nova.virt.libvirt.host [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.130 2 DEBUG nova.virt.libvirt.host [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.143 2 DEBUG nova.virt.libvirt.host [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.144 2 DEBUG nova.virt.libvirt.host [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.145 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.146 2 DEBUG nova.virt.hardware [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.146 2 DEBUG nova.virt.hardware [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.146 2 DEBUG nova.virt.hardware [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.146 2 DEBUG nova.virt.hardware [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.147 2 DEBUG nova.virt.hardware [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.147 2 DEBUG nova.virt.hardware [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.147 2 DEBUG nova.virt.hardware [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.147 2 DEBUG nova.virt.hardware [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.147 2 DEBUG nova.virt.hardware [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.148 2 DEBUG nova.virt.hardware [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.148 2 DEBUG nova.virt.hardware [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.151 2 DEBUG oslo_concurrency.processutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:09:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:09:59 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3009739097' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.631 2 DEBUG oslo_concurrency.processutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.657 2 DEBUG nova.storage.rbd_utils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] rbd image 6087853f-327c-46b2-baa6-c24854d98b97_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:09:59 np0005465988 nova_compute[236126]: 2025-10-02 13:09:59.661 2 DEBUG oslo_concurrency.processutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:09:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:09:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:09:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:09:59.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:10:00 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3611521309' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.150 2 DEBUG oslo_concurrency.processutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.153 2 DEBUG nova.virt.libvirt.vif [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:09:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1574812382',display_name='tempest-TestNetworkAdvancedServerOps-server-1574812382',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1574812382',id=211,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCSYCMEZ6q/lvfSzWU/Eg4JQitDnhXCQauYJ5tidoiUmnnwGO4dpjXw3f4qFvlN8HpQGCKccR0Wgvw5uEOFmQ9+8YrZuneTD88dhlCtVTonVS+twysO7TRzsLhP5B6Zj9g==',key_name='tempest-TestNetworkAdvancedServerOps-842167042',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='08e102ae48244af2ab448a2e1ff757df',ramdisk_id='',reservation_id='r-bs5w7gov',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1527846432',owner_user_name='tempest-TestNetworkAdvancedServerOps-1527846432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:09:54Z,user_data=None,user_id='ffe4d737e4414fb3a3e358f8ca3f3e1e',uuid=6087853f-327c-46b2-baa6-c24854d98b97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.154 2 DEBUG nova.network.os_vif_util [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converting VIF {"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.156 2 DEBUG nova.network.os_vif_util [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:bc:0d,bridge_name='br-int',has_traffic_filtering=True,id=994b5877-4440-4c86-be4e-fd60053ca206,network=Network(75dc16ae-c86f-409d-a774-fc174172fb9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap994b5877-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.159 2 DEBUG nova.objects.instance [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'pci_devices' on Instance uuid 6087853f-327c-46b2-baa6-c24854d98b97 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.183 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] End _get_guest_xml xml=<domain type="kvm">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  <uuid>6087853f-327c-46b2-baa6-c24854d98b97</uuid>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  <name>instance-000000d3</name>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1574812382</nova:name>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 13:09:59</nova:creationTime>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <nova:user uuid="ffe4d737e4414fb3a3e358f8ca3f3e1e">tempest-TestNetworkAdvancedServerOps-1527846432-project-member</nova:user>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <nova:project uuid="08e102ae48244af2ab448a2e1ff757df">tempest-TestNetworkAdvancedServerOps-1527846432</nova:project>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <nova:port uuid="994b5877-4440-4c86-be4e-fd60053ca206">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <system>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <entry name="serial">6087853f-327c-46b2-baa6-c24854d98b97</entry>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <entry name="uuid">6087853f-327c-46b2-baa6-c24854d98b97</entry>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    </system>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  <os>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  </os>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  <features>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  </features>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  </clock>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  <devices>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/6087853f-327c-46b2-baa6-c24854d98b97_disk">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/6087853f-327c-46b2-baa6-c24854d98b97_disk.config">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:80:bc:0d"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <target dev="tap994b5877-44"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    </interface>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/6087853f-327c-46b2-baa6-c24854d98b97/console.log" append="off"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    </serial>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <video>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    </video>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    </rng>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 09:10:00 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 09:10:00 np0005465988 nova_compute[236126]:  </devices>
Oct  2 09:10:00 np0005465988 nova_compute[236126]: </domain>
Oct  2 09:10:00 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.185 2 DEBUG nova.compute.manager [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Preparing to wait for external event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.185 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "6087853f-327c-46b2-baa6-c24854d98b97-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.186 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.186 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.187 2 DEBUG nova.virt.libvirt.vif [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:09:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1574812382',display_name='tempest-TestNetworkAdvancedServerOps-server-1574812382',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1574812382',id=211,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCSYCMEZ6q/lvfSzWU/Eg4JQitDnhXCQauYJ5tidoiUmnnwGO4dpjXw3f4qFvlN8HpQGCKccR0Wgvw5uEOFmQ9+8YrZuneTD88dhlCtVTonVS+twysO7TRzsLhP5B6Zj9g==',key_name='tempest-TestNetworkAdvancedServerOps-842167042',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='08e102ae48244af2ab448a2e1ff757df',ramdisk_id='',reservation_id='r-bs5w7gov',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1527846432',owner_user_name='tempest-TestNetworkAdvancedServerOps-1527846432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:09:54Z,user_data=None,user_id='ffe4d737e4414fb3a3e358f8ca3f3e1e',uuid=6087853f-327c-46b2-baa6-c24854d98b97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.187 2 DEBUG nova.network.os_vif_util [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converting VIF {"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.188 2 DEBUG nova.network.os_vif_util [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:bc:0d,bridge_name='br-int',has_traffic_filtering=True,id=994b5877-4440-4c86-be4e-fd60053ca206,network=Network(75dc16ae-c86f-409d-a774-fc174172fb9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap994b5877-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.189 2 DEBUG os_vif [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:bc:0d,bridge_name='br-int',has_traffic_filtering=True,id=994b5877-4440-4c86-be4e-fd60053ca206,network=Network(75dc16ae-c86f-409d-a774-fc174172fb9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap994b5877-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.190 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.191 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.197 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap994b5877-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.197 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap994b5877-44, col_values=(('external_ids', {'iface-id': '994b5877-4440-4c86-be4e-fd60053ca206', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:bc:0d', 'vm-uuid': '6087853f-327c-46b2-baa6-c24854d98b97'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:10:00 np0005465988 NetworkManager[45041]: <info>  [1759410600.2008] manager: (tap994b5877-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/424)
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.297 2 INFO os_vif [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:bc:0d,bridge_name='br-int',has_traffic_filtering=True,id=994b5877-4440-4c86-be4e-fd60053ca206,network=Network(75dc16ae-c86f-409d-a774-fc174172fb9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap994b5877-44')#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.369 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.370 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.370 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] No VIF found with MAC fa:16:3e:80:bc:0d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.372 2 INFO nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Using config drive#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.398 2 DEBUG nova.storage.rbd_utils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] rbd image 6087853f-327c-46b2-baa6-c24854d98b97_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.718 2 INFO nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Creating config drive at /var/lib/nova/instances/6087853f-327c-46b2-baa6-c24854d98b97/disk.config#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.723 2 DEBUG oslo_concurrency.processutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6087853f-327c-46b2-baa6-c24854d98b97/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdcakpxx5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:00.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.771 2 DEBUG nova.network.neutron [req-b958c3bf-fbed-4a3a-a383-3b5947a7902a req-c06c8051-42c9-4bb0-8595-2f6347b13477 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Updated VIF entry in instance network info cache for port 994b5877-4440-4c86-be4e-fd60053ca206. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.772 2 DEBUG nova.network.neutron [req-b958c3bf-fbed-4a3a-a383-3b5947a7902a req-c06c8051-42c9-4bb0-8595-2f6347b13477 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Updating instance_info_cache with network_info: [{"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.789 2 DEBUG oslo_concurrency.lockutils [req-b958c3bf-fbed-4a3a-a383-3b5947a7902a req-c06c8051-42c9-4bb0-8595-2f6347b13477 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-6087853f-327c-46b2-baa6-c24854d98b97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.865 2 DEBUG oslo_concurrency.processutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6087853f-327c-46b2-baa6-c24854d98b97/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdcakpxx5" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.902 2 DEBUG nova.storage.rbd_utils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] rbd image 6087853f-327c-46b2-baa6-c24854d98b97_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:10:00 np0005465988 nova_compute[236126]: 2025-10-02 13:10:00.909 2 DEBUG oslo_concurrency.processutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6087853f-327c-46b2-baa6-c24854d98b97/disk.config 6087853f-327c-46b2-baa6-c24854d98b97_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:01 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 09:10:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 e416: 3 total, 3 up, 3 in
Oct  2 09:10:01 np0005465988 nova_compute[236126]: 2025-10-02 13:10:01.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:10:01 np0005465988 nova_compute[236126]: 2025-10-02 13:10:01.183 2 DEBUG oslo_concurrency.processutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6087853f-327c-46b2-baa6-c24854d98b97/disk.config 6087853f-327c-46b2-baa6-c24854d98b97_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.273s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:01 np0005465988 nova_compute[236126]: 2025-10-02 13:10:01.184 2 INFO nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Deleting local config drive /var/lib/nova/instances/6087853f-327c-46b2-baa6-c24854d98b97/disk.config because it was imported into RBD.#033[00m
Oct  2 09:10:01 np0005465988 kernel: tap994b5877-44: entered promiscuous mode
Oct  2 09:10:01 np0005465988 NetworkManager[45041]: <info>  [1759410601.2643] manager: (tap994b5877-44): new Tun device (/org/freedesktop/NetworkManager/Devices/425)
Oct  2 09:10:01 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:01Z|00963|binding|INFO|Claiming lport 994b5877-4440-4c86-be4e-fd60053ca206 for this chassis.
Oct  2 09:10:01 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:01Z|00964|binding|INFO|994b5877-4440-4c86-be4e-fd60053ca206: Claiming fa:16:3e:80:bc:0d 10.100.0.7
Oct  2 09:10:01 np0005465988 nova_compute[236126]: 2025-10-02 13:10:01.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.277 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:bc:0d 10.100.0.7'], port_security=['fa:16:3e:80:bc:0d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '6087853f-327c-46b2-baa6-c24854d98b97', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75dc16ae-c86f-409d-a774-fc174172fb9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08e102ae48244af2ab448a2e1ff757df', 'neutron:revision_number': '2', 'neutron:security_group_ids': '46f4bf8b-f955-4599-aca9-099d0db9d91d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=22de97a1-eaec-475f-a946-6832b4d8dec3, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=994b5877-4440-4c86-be4e-fd60053ca206) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.278 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 994b5877-4440-4c86-be4e-fd60053ca206 in datapath 75dc16ae-c86f-409d-a774-fc174172fb9a bound to our chassis#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.280 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 75dc16ae-c86f-409d-a774-fc174172fb9a#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.291 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fad0b806-975d-4146-abc0-db341cb7840b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.292 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap75dc16ae-c1 in ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.294 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap75dc16ae-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.294 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[3c26c176-eac0-4e31-9c3e-b3fd8b48502e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.295 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[198774c6-bd17-46be-8011-9a92b923fe74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:01 np0005465988 systemd-udevd[339125]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.305 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[a767eef0-96f7-43c0-95b1-eb3a68e392b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:01 np0005465988 systemd-machined[192594]: New machine qemu-101-instance-000000d3.
Oct  2 09:10:01 np0005465988 NetworkManager[45041]: <info>  [1759410601.3197] device (tap994b5877-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:10:01 np0005465988 NetworkManager[45041]: <info>  [1759410601.3204] device (tap994b5877-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.333 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f77cc1fb-205c-44d8-a320-9bcf22dc1b79]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:01 np0005465988 systemd[1]: Started Virtual Machine qemu-101-instance-000000d3.
Oct  2 09:10:01 np0005465988 nova_compute[236126]: 2025-10-02 13:10:01.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:01 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:01Z|00965|binding|INFO|Setting lport 994b5877-4440-4c86-be4e-fd60053ca206 ovn-installed in OVS
Oct  2 09:10:01 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:01Z|00966|binding|INFO|Setting lport 994b5877-4440-4c86-be4e-fd60053ca206 up in Southbound
Oct  2 09:10:01 np0005465988 nova_compute[236126]: 2025-10-02 13:10:01.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.372 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d979bbf2-b741-46f2-971a-521d93cbfdb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:01 np0005465988 NetworkManager[45041]: <info>  [1759410601.3805] manager: (tap75dc16ae-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/426)
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.379 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc4a35f-1c1a-4d87-bc08-0292514f693c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.416 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[83c420bc-f132-45df-a0c7-cbefab61bd48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.423 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[b0f21a91-57ec-4a0b-9cdb-8befdc2d09bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:01 np0005465988 NetworkManager[45041]: <info>  [1759410601.4542] device (tap75dc16ae-c0): carrier: link connected
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.457 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8aa63023-2821-4f92-9248-b884305debda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.474 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[be5a18a1-7d53-4615-ab8b-5ad5b85892d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap75dc16ae-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:bb:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 283], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 866178, 'reachable_time': 22629, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 339171, 'error': None, 'target': 'ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.489 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[32c917db-b654-4613-b966-371cd9b647f2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:bb70'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 866178, 'tstamp': 866178}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 339172, 'error': None, 'target': 'ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.503 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a34ed8ad-4a6f-4cba-82fc-c12370af0b33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap75dc16ae-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:bb:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 283], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 866178, 'reachable_time': 22629, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 339173, 'error': None, 'target': 'ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.554 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ab3837a5-2bb3-46c7-99d1-4ade073b7b0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:01 np0005465988 nova_compute[236126]: 2025-10-02 13:10:01.603 2 DEBUG nova.compute.manager [req-911c5c7c-f654-4a80-8f32-b95065aed092 req-cd759f56-33e5-42e0-af35-d7ebd1c24fe1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:01 np0005465988 nova_compute[236126]: 2025-10-02 13:10:01.604 2 DEBUG oslo_concurrency.lockutils [req-911c5c7c-f654-4a80-8f32-b95065aed092 req-cd759f56-33e5-42e0-af35-d7ebd1c24fe1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6087853f-327c-46b2-baa6-c24854d98b97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:10:01 np0005465988 nova_compute[236126]: 2025-10-02 13:10:01.604 2 DEBUG oslo_concurrency.lockutils [req-911c5c7c-f654-4a80-8f32-b95065aed092 req-cd759f56-33e5-42e0-af35-d7ebd1c24fe1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:10:01 np0005465988 nova_compute[236126]: 2025-10-02 13:10:01.604 2 DEBUG oslo_concurrency.lockutils [req-911c5c7c-f654-4a80-8f32-b95065aed092 req-cd759f56-33e5-42e0-af35-d7ebd1c24fe1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:10:01 np0005465988 nova_compute[236126]: 2025-10-02 13:10:01.605 2 DEBUG nova.compute.manager [req-911c5c7c-f654-4a80-8f32-b95065aed092 req-cd759f56-33e5-42e0-af35-d7ebd1c24fe1 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Processing event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.617 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a3f05b-0a5e-4469-b775-1abb7aae1d98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.622 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75dc16ae-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.623 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.624 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75dc16ae-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 09:10:01 np0005465988 NetworkManager[45041]: <info>  [1759410601.6276] manager: (tap75dc16ae-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/427)
Oct  2 09:10:01 np0005465988 kernel: tap75dc16ae-c0: entered promiscuous mode
Oct  2 09:10:01 np0005465988 nova_compute[236126]: 2025-10-02 13:10:01.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.630 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap75dc16ae-c0, col_values=(('external_ids', {'iface-id': 'e0f330cf-7f46-48ea-8216-823cfdc753a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 09:10:01 np0005465988 nova_compute[236126]: 2025-10-02 13:10:01.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:10:01 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:01Z|00967|binding|INFO|Releasing lport e0f330cf-7f46-48ea-8216-823cfdc753a5 from this chassis (sb_readonly=0)
Oct  2 09:10:01 np0005465988 nova_compute[236126]: 2025-10-02 13:10:01.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.650 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/75dc16ae-c86f-409d-a774-fc174172fb9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/75dc16ae-c86f-409d-a774-fc174172fb9a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.652 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c7799112-e0e6-4c27-8839-b7e48d09bffd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.653 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-75dc16ae-c86f-409d-a774-fc174172fb9a
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/75dc16ae-c86f-409d-a774-fc174172fb9a.pid.haproxy
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 75dc16ae-c86f-409d-a774-fc174172fb9a
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  2 09:10:01 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:01.655 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a', 'env', 'PROCESS_TAG=haproxy-75dc16ae-c86f-409d-a774-fc174172fb9a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/75dc16ae-c86f-409d-a774-fc174172fb9a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  2 09:10:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:01.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.059 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759410587.057959, 0f002be2-0f9d-4b3b-a8b2-552c569f0d28 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.060 2 INFO nova.compute.manager [-] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] VM Stopped (Lifecycle Event)
Oct  2 09:10:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:10:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:10:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.094 2 DEBUG nova.compute.manager [None req-ebcd4215-5854-45a8-8774-760591b18771 - - - - - -] [instance: 0f002be2-0f9d-4b3b-a8b2-552c569f0d28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 09:10:02 np0005465988 podman[339261]: 2025-10-02 13:10:02.116828462 +0000 UTC m=+0.073020464 container create d3c5e278467ae81e737696dfde2d66c25e205abbb5224501c8ba65c5807ee203 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 09:10:02 np0005465988 podman[339261]: 2025-10-02 13:10:02.080278649 +0000 UTC m=+0.036470671 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 09:10:02 np0005465988 systemd[1]: Started libpod-conmon-d3c5e278467ae81e737696dfde2d66c25e205abbb5224501c8ba65c5807ee203.scope.
Oct  2 09:10:02 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:10:02 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0947ffa0ee6bf40e39f621835106ea9521ffb9ba24e4e8b58d6b7fcfe515b5f0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 09:10:02 np0005465988 podman[339261]: 2025-10-02 13:10:02.245733034 +0000 UTC m=+0.201925046 container init d3c5e278467ae81e737696dfde2d66c25e205abbb5224501c8ba65c5807ee203 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:10:02 np0005465988 podman[339261]: 2025-10-02 13:10:02.253773526 +0000 UTC m=+0.209965528 container start d3c5e278467ae81e737696dfde2d66c25e205abbb5224501c8ba65c5807ee203 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 09:10:02 np0005465988 neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a[339277]: [NOTICE]   (339281) : New worker (339283) forked
Oct  2 09:10:02 np0005465988 neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a[339277]: [NOTICE]   (339281) : Loading success.
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.324 2 DEBUG nova.compute.manager [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.326 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410602.325848, 6087853f-327c-46b2-baa6-c24854d98b97 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.326 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] VM Started (Lifecycle Event)
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.334 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.343 2 INFO nova.virt.libvirt.driver [-] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Instance spawned successfully.
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.343 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.353 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.359 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.373 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.374 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.375 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.375 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.376 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.376 2 DEBUG nova.virt.libvirt.driver [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.381 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.381 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410602.3266377, 6087853f-327c-46b2-baa6-c24854d98b97 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.382 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] VM Paused (Lifecycle Event)
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.406 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.411 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410602.3288965, 6087853f-327c-46b2-baa6-c24854d98b97 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.412 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] VM Resumed (Lifecycle Event)
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.443 2 INFO nova.compute.manager [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Took 7.93 seconds to spawn the instance on the hypervisor.
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.444 2 DEBUG nova.compute.manager [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.456 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.462 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.493 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.540 2 INFO nova.compute.manager [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Took 8.94 seconds to build instance.
Oct  2 09:10:02 np0005465988 nova_compute[236126]: 2025-10-02 13:10:02.557 2 DEBUG oslo_concurrency.lockutils [None req-b0db0363-5593-4941-819b-efd8e49b73ec ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:10:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:02.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:03 np0005465988 nova_compute[236126]: 2025-10-02 13:10:03.669 2 DEBUG nova.compute.manager [req-3308189b-646a-417e-a66d-4589b4e27c7d req-0e7de92e-05e6-48e5-9d1d-88f9020c2c97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 09:10:03 np0005465988 nova_compute[236126]: 2025-10-02 13:10:03.669 2 DEBUG oslo_concurrency.lockutils [req-3308189b-646a-417e-a66d-4589b4e27c7d req-0e7de92e-05e6-48e5-9d1d-88f9020c2c97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6087853f-327c-46b2-baa6-c24854d98b97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:10:03 np0005465988 nova_compute[236126]: 2025-10-02 13:10:03.670 2 DEBUG oslo_concurrency.lockutils [req-3308189b-646a-417e-a66d-4589b4e27c7d req-0e7de92e-05e6-48e5-9d1d-88f9020c2c97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:10:03 np0005465988 nova_compute[236126]: 2025-10-02 13:10:03.670 2 DEBUG oslo_concurrency.lockutils [req-3308189b-646a-417e-a66d-4589b4e27c7d req-0e7de92e-05e6-48e5-9d1d-88f9020c2c97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:10:03 np0005465988 nova_compute[236126]: 2025-10-02 13:10:03.670 2 DEBUG nova.compute.manager [req-3308189b-646a-417e-a66d-4589b4e27c7d req-0e7de92e-05e6-48e5-9d1d-88f9020c2c97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] No waiting events found dispatching network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 09:10:03 np0005465988 nova_compute[236126]: 2025-10-02 13:10:03.670 2 WARNING nova.compute.manager [req-3308189b-646a-417e-a66d-4589b4e27c7d req-0e7de92e-05e6-48e5-9d1d-88f9020c2c97 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received unexpected event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 for instance with vm_state active and task_state None.
Oct  2 09:10:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:03.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:04.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:04 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:04Z|00968|binding|INFO|Releasing lport e0f330cf-7f46-48ea-8216-823cfdc753a5 from this chassis (sb_readonly=0)
Oct  2 09:10:04 np0005465988 nova_compute[236126]: 2025-10-02 13:10:04.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:10:04 np0005465988 NetworkManager[45041]: <info>  [1759410604.9911] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/428)
Oct  2 09:10:04 np0005465988 NetworkManager[45041]: <info>  [1759410604.9918] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/429)
Oct  2 09:10:05 np0005465988 nova_compute[236126]: 2025-10-02 13:10:05.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:10:05 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:05Z|00969|binding|INFO|Releasing lport e0f330cf-7f46-48ea-8216-823cfdc753a5 from this chassis (sb_readonly=0)
Oct  2 09:10:05 np0005465988 nova_compute[236126]: 2025-10-02 13:10:05.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:10:05 np0005465988 nova_compute[236126]: 2025-10-02 13:10:05.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:10:05 np0005465988 nova_compute[236126]: 2025-10-02 13:10:05.416 2 DEBUG nova.compute.manager [req-97caf78b-1511-4de3-9c80-e04bcb24f5a8 req-62477c42-f0aa-4d6d-837c-29d8027555ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-changed-994b5877-4440-4c86-be4e-fd60053ca206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 09:10:05 np0005465988 nova_compute[236126]: 2025-10-02 13:10:05.416 2 DEBUG nova.compute.manager [req-97caf78b-1511-4de3-9c80-e04bcb24f5a8 req-62477c42-f0aa-4d6d-837c-29d8027555ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Refreshing instance network info cache due to event network-changed-994b5877-4440-4c86-be4e-fd60053ca206. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 09:10:05 np0005465988 nova_compute[236126]: 2025-10-02 13:10:05.417 2 DEBUG oslo_concurrency.lockutils [req-97caf78b-1511-4de3-9c80-e04bcb24f5a8 req-62477c42-f0aa-4d6d-837c-29d8027555ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-6087853f-327c-46b2-baa6-c24854d98b97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 09:10:05 np0005465988 nova_compute[236126]: 2025-10-02 13:10:05.417 2 DEBUG oslo_concurrency.lockutils [req-97caf78b-1511-4de3-9c80-e04bcb24f5a8 req-62477c42-f0aa-4d6d-837c-29d8027555ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-6087853f-327c-46b2-baa6-c24854d98b97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 09:10:05 np0005465988 nova_compute[236126]: 2025-10-02 13:10:05.417 2 DEBUG nova.network.neutron [req-97caf78b-1511-4de3-9c80-e04bcb24f5a8 req-62477c42-f0aa-4d6d-837c-29d8027555ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Refreshing network info cache for port 994b5877-4440-4c86-be4e-fd60053ca206 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 09:10:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:05.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:06 np0005465988 nova_compute[236126]: 2025-10-02 13:10:06.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:10:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:10:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:06.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:07 np0005465988 nova_compute[236126]: 2025-10-02 13:10:07.059 2 DEBUG nova.network.neutron [req-97caf78b-1511-4de3-9c80-e04bcb24f5a8 req-62477c42-f0aa-4d6d-837c-29d8027555ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Updated VIF entry in instance network info cache for port 994b5877-4440-4c86-be4e-fd60053ca206. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 09:10:07 np0005465988 nova_compute[236126]: 2025-10-02 13:10:07.061 2 DEBUG nova.network.neutron [req-97caf78b-1511-4de3-9c80-e04bcb24f5a8 req-62477c42-f0aa-4d6d-837c-29d8027555ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Updating instance_info_cache with network_info: [{"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 09:10:07 np0005465988 nova_compute[236126]: 2025-10-02 13:10:07.082 2 DEBUG oslo_concurrency.lockutils [req-97caf78b-1511-4de3-9c80-e04bcb24f5a8 req-62477c42-f0aa-4d6d-837c-29d8027555ea d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-6087853f-327c-46b2-baa6-c24854d98b97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 09:10:07 np0005465988 nova_compute[236126]: 2025-10-02 13:10:07.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:10:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:07.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:10:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:10:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:08.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:10:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:09.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:10:10 np0005465988 nova_compute[236126]: 2025-10-02 13:10:10.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:10.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:11 np0005465988 nova_compute[236126]: 2025-10-02 13:10:11.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:10:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:11.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:12 np0005465988 nova_compute[236126]: 2025-10-02 13:10:12.490 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:10:12 np0005465988 nova_compute[236126]: 2025-10-02 13:10:12.491 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 09:10:12 np0005465988 podman[339398]: 2025-10-02 13:10:12.535327818 +0000 UTC m=+0.066331112 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 09:10:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:12.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:13 np0005465988 nova_compute[236126]: 2025-10-02 13:10:13.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:13.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:14.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:15 np0005465988 nova_compute[236126]: 2025-10-02 13:10:15.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:15 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:15Z|00127|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:80:bc:0d 10.100.0.7
Oct  2 09:10:15 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:15Z|00128|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:80:bc:0d 10.100.0.7
Oct  2 09:10:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:15.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #163. Immutable memtables: 0.
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.062460) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 103] Flushing memtable with next log file: 163
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410616062508, "job": 103, "event": "flush_started", "num_memtables": 1, "num_entries": 1473, "num_deletes": 259, "total_data_size": 3051928, "memory_usage": 3099968, "flush_reason": "Manual Compaction"}
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 103] Level-0 flush table #164: started
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410616073177, "cf_name": "default", "job": 103, "event": "table_file_creation", "file_number": 164, "file_size": 1999361, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 79559, "largest_seqno": 81026, "table_properties": {"data_size": 1993082, "index_size": 3481, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13921, "raw_average_key_size": 20, "raw_value_size": 1980139, "raw_average_value_size": 2886, "num_data_blocks": 152, "num_entries": 686, "num_filter_entries": 686, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759410516, "oldest_key_time": 1759410516, "file_creation_time": 1759410616, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 164, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 103] Flush lasted 10757 microseconds, and 4940 cpu microseconds.
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.073215) [db/flush_job.cc:967] [default] [JOB 103] Level-0 flush table #164: 1999361 bytes OK
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.073236) [db/memtable_list.cc:519] [default] Level-0 commit table #164 started
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.074961) [db/memtable_list.cc:722] [default] Level-0 commit table #164: memtable #1 done
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.074975) EVENT_LOG_v1 {"time_micros": 1759410616074971, "job": 103, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.074993) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 103] Try to delete WAL files size 3044994, prev total WAL file size 3044994, number of live WAL files 2.
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000160.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.075840) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033303130' seq:72057594037927935, type:22 .. '6C6F676D0033323631' seq:0, type:0; will stop at (end)
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 104] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 103 Base level 0, inputs: [164(1952KB)], [162(11MB)]
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410616075877, "job": 104, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [164], "files_L6": [162], "score": -1, "input_data_size": 13731086, "oldest_snapshot_seqno": -1}
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 104] Generated table #165: 10108 keys, 13596123 bytes, temperature: kUnknown
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410616144149, "cf_name": "default", "job": 104, "event": "table_file_creation", "file_number": 165, "file_size": 13596123, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13530223, "index_size": 39547, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25285, "raw_key_size": 266779, "raw_average_key_size": 26, "raw_value_size": 13352584, "raw_average_value_size": 1320, "num_data_blocks": 1509, "num_entries": 10108, "num_filter_entries": 10108, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759410616, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 165, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.144652) [db/compaction/compaction_job.cc:1663] [default] [JOB 104] Compacted 1@0 + 1@6 files to L6 => 13596123 bytes
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.145738) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 200.3 rd, 198.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 11.2 +0.0 blob) out(13.0 +0.0 blob), read-write-amplify(13.7) write-amplify(6.8) OK, records in: 10642, records dropped: 534 output_compression: NoCompression
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.145758) EVENT_LOG_v1 {"time_micros": 1759410616145747, "job": 104, "event": "compaction_finished", "compaction_time_micros": 68562, "compaction_time_cpu_micros": 31475, "output_level": 6, "num_output_files": 1, "total_output_size": 13596123, "num_input_records": 10642, "num_output_records": 10108, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000164.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410616146187, "job": 104, "event": "table_file_deletion", "file_number": 164}
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000162.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410616148879, "job": 104, "event": "table_file_deletion", "file_number": 162}
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.075749) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.148955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.148962) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.148963) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:10:16 np0005465988 nova_compute[236126]: 2025-10-02 13:10:16.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.148965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:10:16.148966) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:10:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:10:16 np0005465988 nova_compute[236126]: 2025-10-02 13:10:16.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:16.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:10:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:17.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:10:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:18.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:19.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:20 np0005465988 nova_compute[236126]: 2025-10-02 13:10:20.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:20.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:21 np0005465988 nova_compute[236126]: 2025-10-02 13:10:21.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:21 np0005465988 nova_compute[236126]: 2025-10-02 13:10:21.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:10:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:21.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:22.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:23 np0005465988 nova_compute[236126]: 2025-10-02 13:10:23.065 2 INFO nova.compute.manager [None req-8ad3affa-9060-4dbc-9bc2-6cada30b41b2 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Get console output#033[00m
Oct  2 09:10:23 np0005465988 nova_compute[236126]: 2025-10-02 13:10:23.074 15591 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 09:10:23 np0005465988 nova_compute[236126]: 2025-10-02 13:10:23.494 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:10:23 np0005465988 podman[339424]: 2025-10-02 13:10:23.541303043 +0000 UTC m=+0.068578106 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, config_id=iscsid)
Oct  2 09:10:23 np0005465988 podman[339425]: 2025-10-02 13:10:23.559337123 +0000 UTC m=+0.082718944 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 09:10:23 np0005465988 podman[339423]: 2025-10-02 13:10:23.58145722 +0000 UTC m=+0.112984295 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Oct  2 09:10:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:23.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:24 np0005465988 nova_compute[236126]: 2025-10-02 13:10:24.251 2 INFO nova.compute.manager [None req-1e84f4a6-10ab-41b5-a094-92e729a90450 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Get console output#033[00m
Oct  2 09:10:24 np0005465988 nova_compute[236126]: 2025-10-02 13:10:24.257 15591 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 09:10:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:24.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:25 np0005465988 nova_compute[236126]: 2025-10-02 13:10:25.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:25 np0005465988 nova_compute[236126]: 2025-10-02 13:10:25.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:10:25 np0005465988 nova_compute[236126]: 2025-10-02 13:10:25.518 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:25 np0005465988 nova_compute[236126]: 2025-10-02 13:10:25.519 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:25 np0005465988 nova_compute[236126]: 2025-10-02 13:10:25.519 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:25 np0005465988 nova_compute[236126]: 2025-10-02 13:10:25.520 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:10:25 np0005465988 nova_compute[236126]: 2025-10-02 13:10:25.520 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:10:25 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1983806848' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:10:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:25.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:25 np0005465988 nova_compute[236126]: 2025-10-02 13:10:25.969 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.066 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000d3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.067 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000d3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.230 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.231 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3830MB free_disk=20.942764282226562GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.231 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.231 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.349 2 INFO nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance f09d8e34-4c5a-4003-aebf-40ce1a8562af has allocations against this compute host but is not found in the database.#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.350 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.350 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.389 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:26.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:10:26 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3228220737' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.830 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.837 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.854 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.881 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:10:26 np0005465988 nova_compute[236126]: 2025-10-02 13:10:26.882 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.651s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:27.418 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:27.418 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:27.419 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:27 np0005465988 nova_compute[236126]: 2025-10-02 13:10:27.788 2 DEBUG nova.virt.libvirt.driver [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Check if temp file /var/lib/nova/instances/tmp4_a5vvlk exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Oct  2 09:10:27 np0005465988 nova_compute[236126]: 2025-10-02 13:10:27.789 2 DEBUG nova.compute.manager [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=17408,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4_a5vvlk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='6087853f-327c-46b2-baa6-c24854d98b97',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Oct  2 09:10:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:27.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:10:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:28.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:10:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:29.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:30 np0005465988 nova_compute[236126]: 2025-10-02 13:10:30.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 09:10:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:30.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 09:10:31 np0005465988 nova_compute[236126]: 2025-10-02 13:10:31.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:10:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:31.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:32.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:33.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:34.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.201 2 DEBUG nova.compute.manager [req-88a2c2c6-0adc-46f1-8beb-7aa44d4d7eba req-ed85dd8c-d26f-48d7-829d-64cc77eb6cfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-vif-unplugged-994b5877-4440-4c86-be4e-fd60053ca206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.202 2 DEBUG oslo_concurrency.lockutils [req-88a2c2c6-0adc-46f1-8beb-7aa44d4d7eba req-ed85dd8c-d26f-48d7-829d-64cc77eb6cfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6087853f-327c-46b2-baa6-c24854d98b97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.202 2 DEBUG oslo_concurrency.lockutils [req-88a2c2c6-0adc-46f1-8beb-7aa44d4d7eba req-ed85dd8c-d26f-48d7-829d-64cc77eb6cfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.202 2 DEBUG oslo_concurrency.lockutils [req-88a2c2c6-0adc-46f1-8beb-7aa44d4d7eba req-ed85dd8c-d26f-48d7-829d-64cc77eb6cfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.202 2 DEBUG nova.compute.manager [req-88a2c2c6-0adc-46f1-8beb-7aa44d4d7eba req-ed85dd8c-d26f-48d7-829d-64cc77eb6cfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] No waiting events found dispatching network-vif-unplugged-994b5877-4440-4c86-be4e-fd60053ca206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.202 2 DEBUG nova.compute.manager [req-88a2c2c6-0adc-46f1-8beb-7aa44d4d7eba req-ed85dd8c-d26f-48d7-829d-64cc77eb6cfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-vif-unplugged-994b5877-4440-4c86-be4e-fd60053ca206 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:35.497 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=82, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=81) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:35.498 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:10:35 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:35.499 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '82'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.970 2 INFO nova.compute.manager [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Took 7.51 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.#033[00m
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.971 2 DEBUG nova.compute.manager [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:10:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:10:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:35.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.991 2 DEBUG nova.compute.manager [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=17408,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4_a5vvlk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='6087853f-327c-46b2-baa6-c24854d98b97',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(f09d8e34-4c5a-4003-aebf-40ce1a8562af),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.996 2 DEBUG nova.objects.instance [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lazy-loading 'migration_context' on Instance uuid 6087853f-327c-46b2-baa6-c24854d98b97 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.997 2 DEBUG nova.virt.libvirt.driver [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.999 2 DEBUG nova.virt.libvirt.driver [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Oct  2 09:10:35 np0005465988 nova_compute[236126]: 2025-10-02 13:10:35.999 2 DEBUG nova.virt.libvirt.driver [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Oct  2 09:10:36 np0005465988 nova_compute[236126]: 2025-10-02 13:10:36.020 2 DEBUG nova.virt.libvirt.vif [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:09:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1574812382',display_name='tempest-TestNetworkAdvancedServerOps-server-1574812382',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1574812382',id=211,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCSYCMEZ6q/lvfSzWU/Eg4JQitDnhXCQauYJ5tidoiUmnnwGO4dpjXw3f4qFvlN8HpQGCKccR0Wgvw5uEOFmQ9+8YrZuneTD88dhlCtVTonVS+twysO7TRzsLhP5B6Zj9g==',key_name='tempest-TestNetworkAdvancedServerOps-842167042',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:10:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='08e102ae48244af2ab448a2e1ff757df',ramdisk_id='',reservation_id='r-bs5w7gov',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1527846432',owner_user_name='tempest-TestNetworkAdvancedServerOps-1527846432-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:10:02Z,user_data=None,user_id='ffe4d737e4414fb3a3e358f8ca3f3e1e',uuid=6087853f-327c-46b2-baa6-c24854d98b97,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:10:36 np0005465988 nova_compute[236126]: 2025-10-02 13:10:36.021 2 DEBUG nova.network.os_vif_util [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Converting VIF {"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:10:36 np0005465988 nova_compute[236126]: 2025-10-02 13:10:36.021 2 DEBUG nova.network.os_vif_util [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:80:bc:0d,bridge_name='br-int',has_traffic_filtering=True,id=994b5877-4440-4c86-be4e-fd60053ca206,network=Network(75dc16ae-c86f-409d-a774-fc174172fb9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap994b5877-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:10:36 np0005465988 nova_compute[236126]: 2025-10-02 13:10:36.022 2 DEBUG nova.virt.libvirt.migration [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Updating guest XML with vif config: <interface type="ethernet">
Oct  2 09:10:36 np0005465988 nova_compute[236126]:  <mac address="fa:16:3e:80:bc:0d"/>
Oct  2 09:10:36 np0005465988 nova_compute[236126]:  <model type="virtio"/>
Oct  2 09:10:36 np0005465988 nova_compute[236126]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:10:36 np0005465988 nova_compute[236126]:  <mtu size="1442"/>
Oct  2 09:10:36 np0005465988 nova_compute[236126]:  <target dev="tap994b5877-44"/>
Oct  2 09:10:36 np0005465988 nova_compute[236126]: </interface>
Oct  2 09:10:36 np0005465988 nova_compute[236126]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Oct  2 09:10:36 np0005465988 nova_compute[236126]: 2025-10-02 13:10:36.022 2 DEBUG nova.virt.libvirt.driver [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Oct  2 09:10:36 np0005465988 nova_compute[236126]: 2025-10-02 13:10:36.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:10:36 np0005465988 nova_compute[236126]: 2025-10-02 13:10:36.503 2 DEBUG nova.virt.libvirt.migration [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 09:10:36 np0005465988 nova_compute[236126]: 2025-10-02 13:10:36.504 2 INFO nova.virt.libvirt.migration [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Oct  2 09:10:36 np0005465988 nova_compute[236126]: 2025-10-02 13:10:36.564 2 INFO nova.virt.libvirt.driver [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Oct  2 09:10:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:10:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:36.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.067 2 DEBUG nova.virt.libvirt.migration [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.068 2 DEBUG nova.virt.libvirt.migration [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.295 2 DEBUG nova.compute.manager [req-9fd1166a-c4ab-4acf-b2d1-e9edc468f5bd req-f51e4f08-9c64-43fd-a301-2d4c5aa5b96f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.295 2 DEBUG oslo_concurrency.lockutils [req-9fd1166a-c4ab-4acf-b2d1-e9edc468f5bd req-f51e4f08-9c64-43fd-a301-2d4c5aa5b96f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6087853f-327c-46b2-baa6-c24854d98b97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.296 2 DEBUG oslo_concurrency.lockutils [req-9fd1166a-c4ab-4acf-b2d1-e9edc468f5bd req-f51e4f08-9c64-43fd-a301-2d4c5aa5b96f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.296 2 DEBUG oslo_concurrency.lockutils [req-9fd1166a-c4ab-4acf-b2d1-e9edc468f5bd req-f51e4f08-9c64-43fd-a301-2d4c5aa5b96f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.296 2 DEBUG nova.compute.manager [req-9fd1166a-c4ab-4acf-b2d1-e9edc468f5bd req-f51e4f08-9c64-43fd-a301-2d4c5aa5b96f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] No waiting events found dispatching network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.296 2 WARNING nova.compute.manager [req-9fd1166a-c4ab-4acf-b2d1-e9edc468f5bd req-f51e4f08-9c64-43fd-a301-2d4c5aa5b96f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received unexpected event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.297 2 DEBUG nova.compute.manager [req-9fd1166a-c4ab-4acf-b2d1-e9edc468f5bd req-f51e4f08-9c64-43fd-a301-2d4c5aa5b96f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-changed-994b5877-4440-4c86-be4e-fd60053ca206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.297 2 DEBUG nova.compute.manager [req-9fd1166a-c4ab-4acf-b2d1-e9edc468f5bd req-f51e4f08-9c64-43fd-a301-2d4c5aa5b96f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Refreshing instance network info cache due to event network-changed-994b5877-4440-4c86-be4e-fd60053ca206. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.297 2 DEBUG oslo_concurrency.lockutils [req-9fd1166a-c4ab-4acf-b2d1-e9edc468f5bd req-f51e4f08-9c64-43fd-a301-2d4c5aa5b96f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-6087853f-327c-46b2-baa6-c24854d98b97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.297 2 DEBUG oslo_concurrency.lockutils [req-9fd1166a-c4ab-4acf-b2d1-e9edc468f5bd req-f51e4f08-9c64-43fd-a301-2d4c5aa5b96f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-6087853f-327c-46b2-baa6-c24854d98b97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.298 2 DEBUG nova.network.neutron [req-9fd1166a-c4ab-4acf-b2d1-e9edc468f5bd req-f51e4f08-9c64-43fd-a301-2d4c5aa5b96f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Refreshing network info cache for port 994b5877-4440-4c86-be4e-fd60053ca206 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.572 2 DEBUG nova.virt.libvirt.migration [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.573 2 DEBUG nova.virt.libvirt.migration [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.777 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410637.7764857, 6087853f-327c-46b2-baa6-c24854d98b97 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.777 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] VM Paused (Lifecycle Event)#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.802 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.809 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.881 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.882 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.921 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Oct  2 09:10:37 np0005465988 kernel: tap994b5877-44 (unregistering): left promiscuous mode
Oct  2 09:10:37 np0005465988 NetworkManager[45041]: <info>  [1759410637.9534] device (tap994b5877-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:37 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:37Z|00970|binding|INFO|Releasing lport 994b5877-4440-4c86-be4e-fd60053ca206 from this chassis (sb_readonly=0)
Oct  2 09:10:37 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:37Z|00971|binding|INFO|Setting lport 994b5877-4440-4c86-be4e-fd60053ca206 down in Southbound
Oct  2 09:10:37 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:37Z|00972|binding|INFO|Removing iface tap994b5877-44 ovn-installed in OVS
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:37.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:37.983 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:bc:0d 10.100.0.7'], port_security=['fa:16:3e:80:bc:0d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e4c8874a-a81b-4869-98e4-aca2e3f3bf40'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '6087853f-327c-46b2-baa6-c24854d98b97', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75dc16ae-c86f-409d-a774-fc174172fb9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08e102ae48244af2ab448a2e1ff757df', 'neutron:revision_number': '8', 'neutron:security_group_ids': '46f4bf8b-f955-4599-aca9-099d0db9d91d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.187'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=22de97a1-eaec-475f-a946-6832b4d8dec3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=994b5877-4440-4c86-be4e-fd60053ca206) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:10:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:37.985 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 994b5877-4440-4c86-be4e-fd60053ca206 in datapath 75dc16ae-c86f-409d-a774-fc174172fb9a unbound from our chassis#033[00m
Oct  2 09:10:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:37.987 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 75dc16ae-c86f-409d-a774-fc174172fb9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 09:10:37 np0005465988 nova_compute[236126]: 2025-10-02 13:10:37.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:37.990 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0df9ba14-b506-4507-8639-7f9df7c56624]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:37 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:37.991 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a namespace which is not needed anymore#033[00m
Oct  2 09:10:38 np0005465988 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d000000d3.scope: Deactivated successfully.
Oct  2 09:10:38 np0005465988 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d000000d3.scope: Consumed 14.092s CPU time.
Oct  2 09:10:38 np0005465988 systemd-machined[192594]: Machine qemu-101-instance-000000d3 terminated.
Oct  2 09:10:38 np0005465988 virtqemud[235689]: Unable to get XATTR trusted.libvirt.security.ref_selinux on 6087853f-327c-46b2-baa6-c24854d98b97_disk: No such file or directory
Oct  2 09:10:38 np0005465988 virtqemud[235689]: Unable to get XATTR trusted.libvirt.security.ref_dac on 6087853f-327c-46b2-baa6-c24854d98b97_disk: No such file or directory
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.106 2 DEBUG nova.virt.libvirt.guest [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.108 2 INFO nova.virt.libvirt.driver [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Migration operation has completed#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.108 2 INFO nova.compute.manager [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] _post_live_migration() is started..#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.115 2 DEBUG nova.virt.libvirt.driver [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.116 2 DEBUG nova.virt.libvirt.driver [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.116 2 DEBUG nova.virt.libvirt.driver [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Oct  2 09:10:38 np0005465988 neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a[339277]: [NOTICE]   (339281) : haproxy version is 2.8.14-c23fe91
Oct  2 09:10:38 np0005465988 neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a[339277]: [NOTICE]   (339281) : path to executable is /usr/sbin/haproxy
Oct  2 09:10:38 np0005465988 neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a[339277]: [WARNING]  (339281) : Exiting Master process...
Oct  2 09:10:38 np0005465988 neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a[339277]: [ALERT]    (339281) : Current worker (339283) exited with code 143 (Terminated)
Oct  2 09:10:38 np0005465988 neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a[339277]: [WARNING]  (339281) : All workers exited. Exiting... (0)
Oct  2 09:10:38 np0005465988 systemd[1]: libpod-d3c5e278467ae81e737696dfde2d66c25e205abbb5224501c8ba65c5807ee203.scope: Deactivated successfully.
Oct  2 09:10:38 np0005465988 podman[339623]: 2025-10-02 13:10:38.195033476 +0000 UTC m=+0.076182085 container died d3c5e278467ae81e737696dfde2d66c25e205abbb5224501c8ba65c5807ee203 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 09:10:38 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d3c5e278467ae81e737696dfde2d66c25e205abbb5224501c8ba65c5807ee203-userdata-shm.mount: Deactivated successfully.
Oct  2 09:10:38 np0005465988 systemd[1]: var-lib-containers-storage-overlay-0947ffa0ee6bf40e39f621835106ea9521ffb9ba24e4e8b58d6b7fcfe515b5f0-merged.mount: Deactivated successfully.
Oct  2 09:10:38 np0005465988 podman[339623]: 2025-10-02 13:10:38.250308938 +0000 UTC m=+0.131457587 container cleanup d3c5e278467ae81e737696dfde2d66c25e205abbb5224501c8ba65c5807ee203 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 09:10:38 np0005465988 systemd[1]: libpod-conmon-d3c5e278467ae81e737696dfde2d66c25e205abbb5224501c8ba65c5807ee203.scope: Deactivated successfully.
Oct  2 09:10:38 np0005465988 podman[339658]: 2025-10-02 13:10:38.350274707 +0000 UTC m=+0.073976891 container remove d3c5e278467ae81e737696dfde2d66c25e205abbb5224501c8ba65c5807ee203 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 09:10:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:38.360 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dbcb84e0-b922-4f51-af04-71a893d8e158]: (4, ('Thu Oct  2 01:10:38 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a (d3c5e278467ae81e737696dfde2d66c25e205abbb5224501c8ba65c5807ee203)\nd3c5e278467ae81e737696dfde2d66c25e205abbb5224501c8ba65c5807ee203\nThu Oct  2 01:10:38 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a (d3c5e278467ae81e737696dfde2d66c25e205abbb5224501c8ba65c5807ee203)\nd3c5e278467ae81e737696dfde2d66c25e205abbb5224501c8ba65c5807ee203\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:38.362 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8e520052-0667-4b39-9563-3fbb6e60ab92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:38.363 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75dc16ae-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:38 np0005465988 kernel: tap75dc16ae-c0: left promiscuous mode
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:38.401 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ce1c98-7eae-4a77-9a1e-6937a156b854]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:38.430 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9eae7285-8978-489e-bd58-cdfed462b3a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:38.431 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9c89dbdd-bc0f-47a2-b33d-d06d97adfa55]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:38.451 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b51d3075-2ed7-4b7e-b567-9bda8853c093]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 866169, 'reachable_time': 44020, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 339677, 'error': None, 'target': 'ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:38.456 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-75dc16ae-c86f-409d-a774-fc174172fb9a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 09:10:38 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:38.457 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[ca44f33b-10eb-4377-9095-7d2114eeb0d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:38 np0005465988 systemd[1]: run-netns-ovnmeta\x2d75dc16ae\x2dc86f\x2d409d\x2da774\x2dfc174172fb9a.mount: Deactivated successfully.
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.690 2 DEBUG nova.network.neutron [req-9fd1166a-c4ab-4acf-b2d1-e9edc468f5bd req-f51e4f08-9c64-43fd-a301-2d4c5aa5b96f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Updated VIF entry in instance network info cache for port 994b5877-4440-4c86-be4e-fd60053ca206. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.691 2 DEBUG nova.network.neutron [req-9fd1166a-c4ab-4acf-b2d1-e9edc468f5bd req-f51e4f08-9c64-43fd-a301-2d4c5aa5b96f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Updating instance_info_cache with network_info: [{"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.754 2 DEBUG nova.compute.manager [req-7fcef9f5-7e32-4c07-8f3b-93c0333c21fa req-7b745536-9315-46ce-8d7f-874c8b322bf4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-vif-unplugged-994b5877-4440-4c86-be4e-fd60053ca206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.754 2 DEBUG oslo_concurrency.lockutils [req-7fcef9f5-7e32-4c07-8f3b-93c0333c21fa req-7b745536-9315-46ce-8d7f-874c8b322bf4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6087853f-327c-46b2-baa6-c24854d98b97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.755 2 DEBUG oslo_concurrency.lockutils [req-7fcef9f5-7e32-4c07-8f3b-93c0333c21fa req-7b745536-9315-46ce-8d7f-874c8b322bf4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.756 2 DEBUG oslo_concurrency.lockutils [req-7fcef9f5-7e32-4c07-8f3b-93c0333c21fa req-7b745536-9315-46ce-8d7f-874c8b322bf4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.756 2 DEBUG nova.compute.manager [req-7fcef9f5-7e32-4c07-8f3b-93c0333c21fa req-7b745536-9315-46ce-8d7f-874c8b322bf4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] No waiting events found dispatching network-vif-unplugged-994b5877-4440-4c86-be4e-fd60053ca206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.757 2 DEBUG nova.compute.manager [req-7fcef9f5-7e32-4c07-8f3b-93c0333c21fa req-7b745536-9315-46ce-8d7f-874c8b322bf4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-vif-unplugged-994b5877-4440-4c86-be4e-fd60053ca206 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.759 2 DEBUG oslo_concurrency.lockutils [req-9fd1166a-c4ab-4acf-b2d1-e9edc468f5bd req-f51e4f08-9c64-43fd-a301-2d4c5aa5b96f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-6087853f-327c-46b2-baa6-c24854d98b97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:10:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:38.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.841 2 DEBUG nova.network.neutron [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Activated binding for port 994b5877-4440-4c86-be4e-fd60053ca206 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.842 2 DEBUG nova.compute.manager [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.843 2 DEBUG nova.virt.libvirt.vif [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:09:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1574812382',display_name='tempest-TestNetworkAdvancedServerOps-server-1574812382',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1574812382',id=211,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCSYCMEZ6q/lvfSzWU/Eg4JQitDnhXCQauYJ5tidoiUmnnwGO4dpjXw3f4qFvlN8HpQGCKccR0Wgvw5uEOFmQ9+8YrZuneTD88dhlCtVTonVS+twysO7TRzsLhP5B6Zj9g==',key_name='tempest-TestNetworkAdvancedServerOps-842167042',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:10:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='08e102ae48244af2ab448a2e1ff757df',ramdisk_id='',reservation_id='r-bs5w7gov',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1527846432',owner_user_name='tempest-TestNetworkAdvancedServerOps-1527846432-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:10:26Z,user_data=None,user_id='ffe4d737e4414fb3a3e358f8ca3f3e1e',uuid=6087853f-327c-46b2-baa6-c24854d98b97,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.843 2 DEBUG nova.network.os_vif_util [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Converting VIF {"id": "994b5877-4440-4c86-be4e-fd60053ca206", "address": "fa:16:3e:80:bc:0d", "network": {"id": "75dc16ae-c86f-409d-a774-fc174172fb9a", "bridge": "br-int", "label": "tempest-network-smoke--1179677419", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994b5877-44", "ovs_interfaceid": "994b5877-4440-4c86-be4e-fd60053ca206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.844 2 DEBUG nova.network.os_vif_util [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:80:bc:0d,bridge_name='br-int',has_traffic_filtering=True,id=994b5877-4440-4c86-be4e-fd60053ca206,network=Network(75dc16ae-c86f-409d-a774-fc174172fb9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap994b5877-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.844 2 DEBUG os_vif [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:80:bc:0d,bridge_name='br-int',has_traffic_filtering=True,id=994b5877-4440-4c86-be4e-fd60053ca206,network=Network(75dc16ae-c86f-409d-a774-fc174172fb9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap994b5877-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.846 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap994b5877-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.854 2 INFO os_vif [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:80:bc:0d,bridge_name='br-int',has_traffic_filtering=True,id=994b5877-4440-4c86-be4e-fd60053ca206,network=Network(75dc16ae-c86f-409d-a774-fc174172fb9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap994b5877-44')#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.854 2 DEBUG oslo_concurrency.lockutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.855 2 DEBUG oslo_concurrency.lockutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.855 2 DEBUG oslo_concurrency.lockutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.855 2 DEBUG nova.compute.manager [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.856 2 INFO nova.virt.libvirt.driver [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Deleting instance files /var/lib/nova/instances/6087853f-327c-46b2-baa6-c24854d98b97_del#033[00m
Oct  2 09:10:38 np0005465988 nova_compute[236126]: 2025-10-02 13:10:38.856 2 INFO nova.virt.libvirt.driver [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Deletion of /var/lib/nova/instances/6087853f-327c-46b2-baa6-c24854d98b97_del complete#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.378 2 DEBUG nova.compute.manager [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-vif-unplugged-994b5877-4440-4c86-be4e-fd60053ca206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.378 2 DEBUG oslo_concurrency.lockutils [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6087853f-327c-46b2-baa6-c24854d98b97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.378 2 DEBUG oslo_concurrency.lockutils [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.378 2 DEBUG oslo_concurrency.lockutils [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.379 2 DEBUG nova.compute.manager [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] No waiting events found dispatching network-vif-unplugged-994b5877-4440-4c86-be4e-fd60053ca206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.379 2 DEBUG nova.compute.manager [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-vif-unplugged-994b5877-4440-4c86-be4e-fd60053ca206 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.379 2 DEBUG nova.compute.manager [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.379 2 DEBUG oslo_concurrency.lockutils [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6087853f-327c-46b2-baa6-c24854d98b97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.379 2 DEBUG oslo_concurrency.lockutils [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.379 2 DEBUG oslo_concurrency.lockutils [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.379 2 DEBUG nova.compute.manager [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] No waiting events found dispatching network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.380 2 WARNING nova.compute.manager [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received unexpected event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.380 2 DEBUG nova.compute.manager [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.380 2 DEBUG oslo_concurrency.lockutils [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6087853f-327c-46b2-baa6-c24854d98b97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.380 2 DEBUG oslo_concurrency.lockutils [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.380 2 DEBUG oslo_concurrency.lockutils [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.380 2 DEBUG nova.compute.manager [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] No waiting events found dispatching network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.380 2 WARNING nova.compute.manager [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received unexpected event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.381 2 DEBUG nova.compute.manager [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.381 2 DEBUG oslo_concurrency.lockutils [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6087853f-327c-46b2-baa6-c24854d98b97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.381 2 DEBUG oslo_concurrency.lockutils [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.381 2 DEBUG oslo_concurrency.lockutils [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.381 2 DEBUG nova.compute.manager [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] No waiting events found dispatching network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.381 2 WARNING nova.compute.manager [req-038b6f69-a6ac-4c05-9f6a-0f645eac6e36 req-0db10b4c-2d6b-491c-874a-6979a69bb6f5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received unexpected event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:10:39 np0005465988 nova_compute[236126]: 2025-10-02 13:10:39.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:10:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:39.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:40 np0005465988 nova_compute[236126]: 2025-10-02 13:10:40.383 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:40 np0005465988 nova_compute[236126]: 2025-10-02 13:10:40.384 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:40 np0005465988 nova_compute[236126]: 2025-10-02 13:10:40.403 2 DEBUG nova.compute.manager [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 09:10:40 np0005465988 nova_compute[236126]: 2025-10-02 13:10:40.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:10:40 np0005465988 nova_compute[236126]: 2025-10-02 13:10:40.472 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:10:40 np0005465988 nova_compute[236126]: 2025-10-02 13:10:40.486 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:40 np0005465988 nova_compute[236126]: 2025-10-02 13:10:40.486 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:40 np0005465988 nova_compute[236126]: 2025-10-02 13:10:40.495 2 DEBUG nova.virt.hardware [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 09:10:40 np0005465988 nova_compute[236126]: 2025-10-02 13:10:40.496 2 INFO nova.compute.claims [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 09:10:40 np0005465988 nova_compute[236126]: 2025-10-02 13:10:40.630 2 DEBUG oslo_concurrency.processutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:40.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:10:41 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2966095719' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.127 2 DEBUG oslo_concurrency.processutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.134 2 DEBUG nova.compute.provider_tree [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.150 2 DEBUG nova.scheduler.client.report [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.171 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.172 2 DEBUG nova.compute.manager [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.219 2 DEBUG nova.compute.manager [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.220 2 DEBUG nova.network.neutron [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.256 2 INFO nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.277 2 DEBUG nova.compute.manager [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.327 2 INFO nova.virt.block_device [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Booting with volume 8173fb80-12b5-41df-9b9e-84a3204dcf2c at /dev/vda#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.462 2 DEBUG os_brick.utils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.465 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.470 2 DEBUG nova.compute.manager [req-530f6e61-2bb1-4ada-87d8-7d462e293a8a req-d842a670-bb24-409b-9f42-c0802f8e3939 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.471 2 DEBUG oslo_concurrency.lockutils [req-530f6e61-2bb1-4ada-87d8-7d462e293a8a req-d842a670-bb24-409b-9f42-c0802f8e3939 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6087853f-327c-46b2-baa6-c24854d98b97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.471 2 DEBUG oslo_concurrency.lockutils [req-530f6e61-2bb1-4ada-87d8-7d462e293a8a req-d842a670-bb24-409b-9f42-c0802f8e3939 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.471 2 DEBUG oslo_concurrency.lockutils [req-530f6e61-2bb1-4ada-87d8-7d462e293a8a req-d842a670-bb24-409b-9f42-c0802f8e3939 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.471 2 DEBUG nova.compute.manager [req-530f6e61-2bb1-4ada-87d8-7d462e293a8a req-d842a670-bb24-409b-9f42-c0802f8e3939 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] No waiting events found dispatching network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.472 2 WARNING nova.compute.manager [req-530f6e61-2bb1-4ada-87d8-7d462e293a8a req-d842a670-bb24-409b-9f42-c0802f8e3939 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Received unexpected event network-vif-plugged-994b5877-4440-4c86-be4e-fd60053ca206 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.476 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.477 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[2eed5300-06b6-4530-8e8f-84f6254e3e24]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.478 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.489 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.489 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[806b9442-005c-49bc-af05-b213e9ac2836]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.491 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.503 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.503 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[c732fb62-9583-43f8-ac89-29c18aaced90]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.505 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[cba5e9a6-021a-4e39-885e-fa03d86d1e4c]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.505 2 DEBUG oslo_concurrency.processutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.552 2 DEBUG oslo_concurrency.processutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "nvme version" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.554 2 DEBUG os_brick.initiator.connectors.lightos [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.555 2 DEBUG os_brick.initiator.connectors.lightos [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.555 2 DEBUG os_brick.initiator.connectors.lightos [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.555 2 DEBUG os_brick.utils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] <== get_connector_properties: return (92ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.555 2 DEBUG nova.virt.block_device [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Updating existing volume attachment record: 54785efe-e096-4fa5-a9b9-ef294d95eccb _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 09:10:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:41.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:41 np0005465988 nova_compute[236126]: 2025-10-02 13:10:41.996 2 DEBUG nova.policy [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c10de71fef00497981b8b7cec6a3fff3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fbbc6cb494464fd9b31f64c1ad75fa6b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 09:10:42 np0005465988 nova_compute[236126]: 2025-10-02 13:10:42.532 2 DEBUG nova.compute.manager [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 09:10:42 np0005465988 nova_compute[236126]: 2025-10-02 13:10:42.533 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 09:10:42 np0005465988 nova_compute[236126]: 2025-10-02 13:10:42.533 2 INFO nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Creating image(s)#033[00m
Oct  2 09:10:42 np0005465988 nova_compute[236126]: 2025-10-02 13:10:42.534 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  2 09:10:42 np0005465988 nova_compute[236126]: 2025-10-02 13:10:42.534 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Ensure instance console log exists: /var/lib/nova/instances/1f3cf63d-4aef-4445-b255-5c235b1a1f7d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 09:10:42 np0005465988 nova_compute[236126]: 2025-10-02 13:10:42.534 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:42 np0005465988 nova_compute[236126]: 2025-10-02 13:10:42.535 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:42 np0005465988 nova_compute[236126]: 2025-10-02 13:10:42.535 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:42 np0005465988 nova_compute[236126]: 2025-10-02 13:10:42.664 2 DEBUG nova.network.neutron [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Successfully created port: a71ff85d-35ad-4d85-a530-c111eeb17791 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 09:10:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:42.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:43 np0005465988 podman[339710]: 2025-10-02 13:10:43.552309691 +0000 UTC m=+0.080115108 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 09:10:43 np0005465988 nova_compute[236126]: 2025-10-02 13:10:43.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:43.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:44 np0005465988 nova_compute[236126]: 2025-10-02 13:10:43.999 2 DEBUG nova.network.neutron [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Successfully updated port: a71ff85d-35ad-4d85-a530-c111eeb17791 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 09:10:44 np0005465988 nova_compute[236126]: 2025-10-02 13:10:44.022 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "refresh_cache-1f3cf63d-4aef-4445-b255-5c235b1a1f7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:10:44 np0005465988 nova_compute[236126]: 2025-10-02 13:10:44.022 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquired lock "refresh_cache-1f3cf63d-4aef-4445-b255-5c235b1a1f7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:10:44 np0005465988 nova_compute[236126]: 2025-10-02 13:10:44.023 2 DEBUG nova.network.neutron [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 09:10:44 np0005465988 nova_compute[236126]: 2025-10-02 13:10:44.114 2 DEBUG nova.compute.manager [req-efdc90f7-c6ea-4827-ba38-9446e289ad97 req-44e770aa-b62b-4964-baa0-1fbbd8d1d612 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Received event network-changed-a71ff85d-35ad-4d85-a530-c111eeb17791 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:44 np0005465988 nova_compute[236126]: 2025-10-02 13:10:44.115 2 DEBUG nova.compute.manager [req-efdc90f7-c6ea-4827-ba38-9446e289ad97 req-44e770aa-b62b-4964-baa0-1fbbd8d1d612 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Refreshing instance network info cache due to event network-changed-a71ff85d-35ad-4d85-a530-c111eeb17791. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:10:44 np0005465988 nova_compute[236126]: 2025-10-02 13:10:44.115 2 DEBUG oslo_concurrency.lockutils [req-efdc90f7-c6ea-4827-ba38-9446e289ad97 req-44e770aa-b62b-4964-baa0-1fbbd8d1d612 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-1f3cf63d-4aef-4445-b255-5c235b1a1f7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:10:44 np0005465988 nova_compute[236126]: 2025-10-02 13:10:44.204 2 DEBUG nova.network.neutron [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 09:10:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:44.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.074 2 DEBUG oslo_concurrency.lockutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Acquiring lock "6087853f-327c-46b2-baa6-c24854d98b97-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.075 2 DEBUG oslo_concurrency.lockutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.076 2 DEBUG oslo_concurrency.lockutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lock "6087853f-327c-46b2-baa6-c24854d98b97-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.096 2 DEBUG oslo_concurrency.lockutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.096 2 DEBUG oslo_concurrency.lockutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.096 2 DEBUG oslo_concurrency.lockutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.097 2 DEBUG nova.compute.resource_tracker [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.097 2 DEBUG oslo_concurrency.processutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:10:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/315835295' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.548 2 DEBUG oslo_concurrency.processutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.557 2 DEBUG nova.network.neutron [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Updating instance_info_cache with network_info: [{"id": "a71ff85d-35ad-4d85-a530-c111eeb17791", "address": "fa:16:3e:f3:9c:e5", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa71ff85d-35", "ovs_interfaceid": "a71ff85d-35ad-4d85-a530-c111eeb17791", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.577 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Releasing lock "refresh_cache-1f3cf63d-4aef-4445-b255-5c235b1a1f7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.578 2 DEBUG nova.compute.manager [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Instance network_info: |[{"id": "a71ff85d-35ad-4d85-a530-c111eeb17791", "address": "fa:16:3e:f3:9c:e5", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa71ff85d-35", "ovs_interfaceid": "a71ff85d-35ad-4d85-a530-c111eeb17791", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.578 2 DEBUG oslo_concurrency.lockutils [req-efdc90f7-c6ea-4827-ba38-9446e289ad97 req-44e770aa-b62b-4964-baa0-1fbbd8d1d612 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-1f3cf63d-4aef-4445-b255-5c235b1a1f7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.578 2 DEBUG nova.network.neutron [req-efdc90f7-c6ea-4827-ba38-9446e289ad97 req-44e770aa-b62b-4964-baa0-1fbbd8d1d612 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Refreshing network info cache for port a71ff85d-35ad-4d85-a530-c111eeb17791 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.582 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Start _get_guest_xml network_info=[{"id": "a71ff85d-35ad-4d85-a530-c111eeb17791", "address": "fa:16:3e:f3:9c:e5", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa71ff85d-35", "ovs_interfaceid": "a71ff85d-35ad-4d85-a530-c111eeb17791", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': 
'/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '54785efe-e096-4fa5-a9b9-ef294d95eccb', 'disk_bus': 'virtio', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-8173fb80-12b5-41df-9b9e-84a3204dcf2c', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '8173fb80-12b5-41df-9b9e-84a3204dcf2c', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': True, 'cacheable': False}, 'status': 'reserved', 'instance': '1f3cf63d-4aef-4445-b255-5c235b1a1f7d', 'attached_at': '', 'detached_at': '', 'volume_id': '8173fb80-12b5-41df-9b9e-84a3204dcf2c', 'serial': '8173fb80-12b5-41df-9b9e-84a3204dcf2c'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.586 2 WARNING nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.590 2 DEBUG nova.virt.libvirt.host [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.591 2 DEBUG nova.virt.libvirt.host [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.597 2 DEBUG nova.virt.libvirt.host [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.598 2 DEBUG nova.virt.libvirt.host [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.599 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.599 2 DEBUG nova.virt.hardware [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.600 2 DEBUG nova.virt.hardware [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.600 2 DEBUG nova.virt.hardware [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.600 2 DEBUG nova.virt.hardware [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.600 2 DEBUG nova.virt.hardware [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.601 2 DEBUG nova.virt.hardware [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.601 2 DEBUG nova.virt.hardware [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.601 2 DEBUG nova.virt.hardware [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.601 2 DEBUG nova.virt.hardware [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.601 2 DEBUG nova.virt.hardware [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.602 2 DEBUG nova.virt.hardware [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.632 2 DEBUG nova.storage.rbd_utils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] rbd image 1f3cf63d-4aef-4445-b255-5c235b1a1f7d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.636 2 DEBUG oslo_concurrency.processutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.840 2 WARNING nova.virt.libvirt.driver [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.846 2 DEBUG nova.compute.resource_tracker [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3980MB free_disk=20.942699432373047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": 
"0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.847 2 DEBUG oslo_concurrency.lockutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.847 2 DEBUG oslo_concurrency.lockutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.901 2 DEBUG nova.compute.resource_tracker [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Migration for instance 6087853f-327c-46b2-baa6-c24854d98b97 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.944 2 DEBUG nova.compute.resource_tracker [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.986 2 DEBUG nova.compute.resource_tracker [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Migration f09d8e34-4c5a-4003-aebf-40ce1a8562af is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.986 2 DEBUG nova.compute.resource_tracker [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Instance 1f3cf63d-4aef-4445-b255-5c235b1a1f7d actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.987 2 DEBUG nova.compute.resource_tracker [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:10:45 np0005465988 nova_compute[236126]: 2025-10-02 13:10:45.987 2 DEBUG nova.compute.resource_tracker [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:10:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:45.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.057 2 DEBUG oslo_concurrency.processutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:10:46 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/339632512' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.120 2 DEBUG oslo_concurrency.processutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.216 2 DEBUG os_brick.encryptors [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Using volume encryption metadata '{'encryption_key_id': '459d2bd8-c04d-4362-8d85-9be2febad4ef', 'control_location': 'front-end', 'cipher': 'aes-xts-plain64', 'key_size': 256, 'provider': 'luks'}' for connection: {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-8173fb80-12b5-41df-9b9e-84a3204dcf2c', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '8173fb80-12b5-41df-9b9e-84a3204dcf2c', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': True, 'cacheable': False}, 'status': 'reserved', 'instance': '1f3cf63d-4aef-4445-b255-5c235b1a1f7d', 'attached_at': '', 'detached_at': '', 'volume_id': '8173fb80-12b5-41df-9b9e-84a3204dcf2c', 'serial': '} get_encryption_metadata /usr/lib/python3.9/site-packages/os_brick/encryptors/__init__.py:135#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.219 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Creating Client object Client /usr/lib/python3.9/site-packages/barbicanclient/client.py:163#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.242 2 DEBUG barbicanclient.v1.secrets [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Getting secret - Secret href: https://barbican-internal.openstack.svc:9311/secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef get /usr/lib/python3.9/site-packages/barbicanclient/v1/secrets.py:514#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.243 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.270 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.270 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.309 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.310 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.358 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.358 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.380 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.380 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.402 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.403 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.421 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.422 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.442 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.443 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.461 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.462 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.479 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.480 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.492 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.492 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.502 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.503 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:10:46 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2364750605' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.522 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.522 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.537 2 DEBUG oslo_concurrency.processutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.542 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.543 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.545 2 DEBUG nova.compute.provider_tree [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.561 2 DEBUG nova.scheduler.client.report [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.565 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.565 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.584 2 DEBUG nova.compute.resource_tracker [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.585 2 DEBUG oslo_concurrency.lockutils [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.737s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.589 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.589 2 INFO barbicanclient.base [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Calculated Secrets uuid ref: secrets/459d2bd8-c04d-4362-8d85-9be2febad4ef#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.594 2 INFO nova.compute.manager [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Migrating instance to compute-1.ctlplane.example.com finished successfully.#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.619 2 DEBUG barbicanclient.client [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.620 2 DEBUG nova.virt.libvirt.host [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Secret XML: <secret ephemeral="no" private="no">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  <usage type="volume">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <volume>8173fb80-12b5-41df-9b9e-84a3204dcf2c</volume>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  </usage>
Oct  2 09:10:46 np0005465988 nova_compute[236126]: </secret>
Oct  2 09:10:46 np0005465988 nova_compute[236126]: create_secret /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1131#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.672 2 DEBUG nova.virt.libvirt.vif [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:10:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1254195989',display_name='tempest-TestVolumeBootPattern-server-1254195989',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1254195989',id=212,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fbbc6cb494464fd9b31f64c1ad75fa6b',ramdisk_id='',reservation_id='r-l0d3ir08',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1200415020',owner_user_name='tempest-TestVolumeBootPattern-1200415020-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:10:41Z,user_data=None,user_id='c10de71fef00497981b8b7cec6a3fff3',uuid=1f3cf63d-4aef-4445-b255-5c235b1a1f7d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a71ff85d-35ad-4d85-a530-c111eeb17791", "address": "fa:16:3e:f3:9c:e5", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa71ff85d-35", "ovs_interfaceid": "a71ff85d-35ad-4d85-a530-c111eeb17791", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.672 2 DEBUG nova.network.os_vif_util [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converting VIF {"id": "a71ff85d-35ad-4d85-a530-c111eeb17791", "address": "fa:16:3e:f3:9c:e5", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa71ff85d-35", "ovs_interfaceid": "a71ff85d-35ad-4d85-a530-c111eeb17791", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.673 2 DEBUG nova.network.os_vif_util [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:9c:e5,bridge_name='br-int',has_traffic_filtering=True,id=a71ff85d-35ad-4d85-a530-c111eeb17791,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa71ff85d-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.675 2 DEBUG nova.objects.instance [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lazy-loading 'pci_devices' on Instance uuid 1f3cf63d-4aef-4445-b255-5c235b1a1f7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.684 2 INFO nova.scheduler.client.report [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] Deleted allocation for migration f09d8e34-4c5a-4003-aebf-40ce1a8562af#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.684 2 DEBUG nova.virt.libvirt.driver [None req-28a50335-b302-408e-9e69-1cdb07e8615e 67f7dad14d084636b43800f8e31de2b3 6c41a7573a694f9fb3ae10771b2fa09f - - default default] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.686 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] End _get_guest_xml xml=<domain type="kvm">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  <uuid>1f3cf63d-4aef-4445-b255-5c235b1a1f7d</uuid>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  <name>instance-000000d4</name>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestVolumeBootPattern-server-1254195989</nova:name>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 13:10:45</nova:creationTime>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <nova:user uuid="c10de71fef00497981b8b7cec6a3fff3">tempest-TestVolumeBootPattern-1200415020-project-member</nova:user>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <nova:project uuid="fbbc6cb494464fd9b31f64c1ad75fa6b">tempest-TestVolumeBootPattern-1200415020</nova:project>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <nova:port uuid="a71ff85d-35ad-4d85-a530-c111eeb17791">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <system>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <entry name="serial">1f3cf63d-4aef-4445-b255-5c235b1a1f7d</entry>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <entry name="uuid">1f3cf63d-4aef-4445-b255-5c235b1a1f7d</entry>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    </system>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  <os>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  </os>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  <features>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  </features>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  </clock>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  <devices>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/1f3cf63d-4aef-4445-b255-5c235b1a1f7d_disk.config">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-8173fb80-12b5-41df-9b9e-84a3204dcf2c">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <serial>8173fb80-12b5-41df-9b9e-84a3204dcf2c</serial>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <encryption format="luks">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:        <secret type="passphrase" uuid="3258405d-12d9-4252-a4d5-027b5ab2fb38"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      </encryption>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:f3:9c:e5"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <target dev="tapa71ff85d-35"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    </interface>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/1f3cf63d-4aef-4445-b255-5c235b1a1f7d/console.log" append="off"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    </serial>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <video>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    </video>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    </rng>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 09:10:46 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 09:10:46 np0005465988 nova_compute[236126]:  </devices>
Oct  2 09:10:46 np0005465988 nova_compute[236126]: </domain>
Oct  2 09:10:46 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.687 2 DEBUG nova.compute.manager [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Preparing to wait for external event network-vif-plugged-a71ff85d-35ad-4d85-a530-c111eeb17791 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.688 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.688 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.688 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.689 2 DEBUG nova.virt.libvirt.vif [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:10:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1254195989',display_name='tempest-TestVolumeBootPattern-server-1254195989',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1254195989',id=212,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fbbc6cb494464fd9b31f64c1ad75fa6b',ramdisk_id='',reservation_id='r-l0d3ir08',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1200415020',owner_user_name='tempest-TestVolumeBootPattern-1200415020-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:10:41Z,user_data=None,
user_id='c10de71fef00497981b8b7cec6a3fff3',uuid=1f3cf63d-4aef-4445-b255-5c235b1a1f7d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a71ff85d-35ad-4d85-a530-c111eeb17791", "address": "fa:16:3e:f3:9c:e5", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa71ff85d-35", "ovs_interfaceid": "a71ff85d-35ad-4d85-a530-c111eeb17791", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.689 2 DEBUG nova.network.os_vif_util [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converting VIF {"id": "a71ff85d-35ad-4d85-a530-c111eeb17791", "address": "fa:16:3e:f3:9c:e5", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa71ff85d-35", "ovs_interfaceid": "a71ff85d-35ad-4d85-a530-c111eeb17791", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.690 2 DEBUG nova.network.os_vif_util [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:9c:e5,bridge_name='br-int',has_traffic_filtering=True,id=a71ff85d-35ad-4d85-a530-c111eeb17791,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa71ff85d-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.690 2 DEBUG os_vif [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:9c:e5,bridge_name='br-int',has_traffic_filtering=True,id=a71ff85d-35ad-4d85-a530-c111eeb17791,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa71ff85d-35') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.691 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.692 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.694 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa71ff85d-35, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.694 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa71ff85d-35, col_values=(('external_ids', {'iface-id': 'a71ff85d-35ad-4d85-a530-c111eeb17791', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:9c:e5', 'vm-uuid': '1f3cf63d-4aef-4445-b255-5c235b1a1f7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:46 np0005465988 NetworkManager[45041]: <info>  [1759410646.6968] manager: (tapa71ff85d-35): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/430)
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.705 2 INFO os_vif [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:9c:e5,bridge_name='br-int',has_traffic_filtering=True,id=a71ff85d-35ad-4d85-a530-c111eeb17791,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa71ff85d-35')#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.753 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.754 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.754 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] No VIF found with MAC fa:16:3e:f3:9c:e5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.755 2 INFO nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Using config drive#033[00m
Oct  2 09:10:46 np0005465988 nova_compute[236126]: 2025-10-02 13:10:46.787 2 DEBUG nova.storage.rbd_utils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] rbd image 1f3cf63d-4aef-4445-b255-5c235b1a1f7d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:10:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:46.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.062 2 INFO nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Creating config drive at /var/lib/nova/instances/1f3cf63d-4aef-4445-b255-5c235b1a1f7d/disk.config#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.069 2 DEBUG oslo_concurrency.processutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1f3cf63d-4aef-4445-b255-5c235b1a1f7d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0dxqk6fp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.160 2 DEBUG nova.network.neutron [req-efdc90f7-c6ea-4827-ba38-9446e289ad97 req-44e770aa-b62b-4964-baa0-1fbbd8d1d612 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Updated VIF entry in instance network info cache for port a71ff85d-35ad-4d85-a530-c111eeb17791. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.161 2 DEBUG nova.network.neutron [req-efdc90f7-c6ea-4827-ba38-9446e289ad97 req-44e770aa-b62b-4964-baa0-1fbbd8d1d612 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Updating instance_info_cache with network_info: [{"id": "a71ff85d-35ad-4d85-a530-c111eeb17791", "address": "fa:16:3e:f3:9c:e5", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa71ff85d-35", "ovs_interfaceid": "a71ff85d-35ad-4d85-a530-c111eeb17791", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.184 2 DEBUG oslo_concurrency.lockutils [req-efdc90f7-c6ea-4827-ba38-9446e289ad97 req-44e770aa-b62b-4964-baa0-1fbbd8d1d612 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-1f3cf63d-4aef-4445-b255-5c235b1a1f7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.228 2 DEBUG oslo_concurrency.processutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1f3cf63d-4aef-4445-b255-5c235b1a1f7d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0dxqk6fp" returned: 0 in 0.159s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.264 2 DEBUG nova.storage.rbd_utils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] rbd image 1f3cf63d-4aef-4445-b255-5c235b1a1f7d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.269 2 DEBUG oslo_concurrency.processutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1f3cf63d-4aef-4445-b255-5c235b1a1f7d/disk.config 1f3cf63d-4aef-4445-b255-5c235b1a1f7d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.506 2 DEBUG oslo_concurrency.processutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1f3cf63d-4aef-4445-b255-5c235b1a1f7d/disk.config 1f3cf63d-4aef-4445-b255-5c235b1a1f7d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.237s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.507 2 INFO nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Deleting local config drive /var/lib/nova/instances/1f3cf63d-4aef-4445-b255-5c235b1a1f7d/disk.config because it was imported into RBD.#033[00m
Oct  2 09:10:47 np0005465988 kernel: tapa71ff85d-35: entered promiscuous mode
Oct  2 09:10:47 np0005465988 NetworkManager[45041]: <info>  [1759410647.5768] manager: (tapa71ff85d-35): new Tun device (/org/freedesktop/NetworkManager/Devices/431)
Oct  2 09:10:47 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:47Z|00973|binding|INFO|Claiming lport a71ff85d-35ad-4d85-a530-c111eeb17791 for this chassis.
Oct  2 09:10:47 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:47Z|00974|binding|INFO|a71ff85d-35ad-4d85-a530-c111eeb17791: Claiming fa:16:3e:f3:9c:e5 10.100.0.6
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.586 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:9c:e5 10.100.0.6'], port_security=['fa:16:3e:f3:9c:e5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1f3cf63d-4aef-4445-b255-5c235b1a1f7d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-150508fb-9217-4982-8468-977a3b53121a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbbc6cb494464fd9b31f64c1ad75fa6b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '53b1884f-337a-4a86-a723-0ac172ede381', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d5e391d-23a7-4f5a-8146-0f24141a74f2, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=a71ff85d-35ad-4d85-a530-c111eeb17791) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.587 142124 INFO neutron.agent.ovn.metadata.agent [-] Port a71ff85d-35ad-4d85-a530-c111eeb17791 in datapath 150508fb-9217-4982-8468-977a3b53121a bound to our chassis#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.588 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 150508fb-9217-4982-8468-977a3b53121a#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:47 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:47Z|00975|binding|INFO|Setting lport a71ff85d-35ad-4d85-a530-c111eeb17791 ovn-installed in OVS
Oct  2 09:10:47 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:47Z|00976|binding|INFO|Setting lport a71ff85d-35ad-4d85-a530-c111eeb17791 up in Southbound
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.609 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7d3b5389-7e13-470f-97b4-92382af42bc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.610 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap150508fb-91 in ovnmeta-150508fb-9217-4982-8468-977a3b53121a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 09:10:47 np0005465988 systemd-udevd[339939]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.614 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap150508fb-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.614 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[f8084d68-840a-4757-9e13-6accd6ada966]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.615 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[393405f4-4fd6-4820-8b2c-36431744c87a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 systemd-machined[192594]: New machine qemu-102-instance-000000d4.
Oct  2 09:10:47 np0005465988 NetworkManager[45041]: <info>  [1759410647.6254] device (tapa71ff85d-35): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:10:47 np0005465988 NetworkManager[45041]: <info>  [1759410647.6264] device (tapa71ff85d-35): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:10:47 np0005465988 systemd[1]: Started Virtual Machine qemu-102-instance-000000d4.
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.634 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[0d8b69ae-11f2-485b-8ed1-534deba7952a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.665 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d58f7595-ab27-492e-ab06-6af6842bf68a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.713 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[0166fde0-97b9-4277-9f5b-1880f3c57e52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 NetworkManager[45041]: <info>  [1759410647.7224] manager: (tap150508fb-90): new Veth device (/org/freedesktop/NetworkManager/Devices/432)
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.722 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e45d912e-3053-40ad-a4be-497746124293]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 systemd-udevd[339943]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.754 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[eb7d4078-974f-40da-bd08-e73523cd1b78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.759 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[fbb8e370-606e-42e5-b700-af50b349794f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 NetworkManager[45041]: <info>  [1759410647.7850] device (tap150508fb-90): carrier: link connected
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.792 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[51160cf5-bf19-4f2c-9db2-7f9ec5b1a98a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.808 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a843fe72-5e47-494e-8894-447e1ecb754c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap150508fb-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:69:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 870811, 'reachable_time': 29653, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 339972, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.824 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5e414fe8-1624-4a7a-8642-91706fc255e8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:6993'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 870811, 'tstamp': 870811}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 339973, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.842 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a599d1c7-0a6f-48bd-8605-c8dbc0eb78e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap150508fb-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:69:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 870811, 'reachable_time': 29653, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 339974, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.888 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[457738b6-a6dd-4818-b009-50f8e3fd7485]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.960 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7e00e048-54f2-40a3-bf41-687dda46f2fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.963 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap150508fb-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.963 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:10:47 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:47.964 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap150508fb-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.997 2 DEBUG nova.compute.manager [req-2f1740df-a92f-4784-bec0-8728bf3b7d70 req-b8d75a0f-4b45-4e1d-842e-68c4d75e0e94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Received event network-vif-plugged-a71ff85d-35ad-4d85-a530-c111eeb17791 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.998 2 DEBUG oslo_concurrency.lockutils [req-2f1740df-a92f-4784-bec0-8728bf3b7d70 req-b8d75a0f-4b45-4e1d-842e-68c4d75e0e94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.999 2 DEBUG oslo_concurrency.lockutils [req-2f1740df-a92f-4784-bec0-8728bf3b7d70 req-b8d75a0f-4b45-4e1d-842e-68c4d75e0e94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:47 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.999 2 DEBUG oslo_concurrency.lockutils [req-2f1740df-a92f-4784-bec0-8728bf3b7d70 req-b8d75a0f-4b45-4e1d-842e-68c4d75e0e94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:48 np0005465988 nova_compute[236126]: 2025-10-02 13:10:47.999 2 DEBUG nova.compute.manager [req-2f1740df-a92f-4784-bec0-8728bf3b7d70 req-b8d75a0f-4b45-4e1d-842e-68c4d75e0e94 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Processing event network-vif-plugged-a71ff85d-35ad-4d85-a530-c111eeb17791 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 09:10:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:47.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:48 np0005465988 nova_compute[236126]: 2025-10-02 13:10:48.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:48 np0005465988 NetworkManager[45041]: <info>  [1759410648.0021] manager: (tap150508fb-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/433)
Oct  2 09:10:48 np0005465988 kernel: tap150508fb-90: entered promiscuous mode
Oct  2 09:10:48 np0005465988 nova_compute[236126]: 2025-10-02 13:10:48.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:48.005 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap150508fb-90, col_values=(('external_ids', {'iface-id': '2a2f4068-0f5b-4d26-b914-4d32097d8b55'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:10:48 np0005465988 nova_compute[236126]: 2025-10-02 13:10:48.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:48 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:48Z|00977|binding|INFO|Releasing lport 2a2f4068-0f5b-4d26-b914-4d32097d8b55 from this chassis (sb_readonly=0)
Oct  2 09:10:48 np0005465988 nova_compute[236126]: 2025-10-02 13:10:48.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:48.037 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/150508fb-9217-4982-8468-977a3b53121a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/150508fb-9217-4982-8468-977a3b53121a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:48.038 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4f60cfac-e767-49bc-b59a-72009f8069a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:48.040 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-150508fb-9217-4982-8468-977a3b53121a
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/150508fb-9217-4982-8468-977a3b53121a.pid.haproxy
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 150508fb-9217-4982-8468-977a3b53121a
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 09:10:48 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:48.041 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'env', 'PROCESS_TAG=haproxy-150508fb-9217-4982-8468-977a3b53121a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/150508fb-9217-4982-8468-977a3b53121a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 09:10:48 np0005465988 podman[340042]: 2025-10-02 13:10:48.449508207 +0000 UTC m=+0.028716048 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 09:10:48 np0005465988 podman[340042]: 2025-10-02 13:10:48.65826661 +0000 UTC m=+0.237474431 container create 8f1affb057c5322a8f8a30a4d2f9ff9be449626b0aa5aa2e1f93a3d19b69059f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:10:48 np0005465988 systemd[1]: Started libpod-conmon-8f1affb057c5322a8f8a30a4d2f9ff9be449626b0aa5aa2e1f93a3d19b69059f.scope.
Oct  2 09:10:48 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:10:48 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a60dc24a2e531585ddde5caad3add9079a9210919047c9e45b7f506957bade8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 09:10:48 np0005465988 podman[340042]: 2025-10-02 13:10:48.809692071 +0000 UTC m=+0.388899892 container init 8f1affb057c5322a8f8a30a4d2f9ff9be449626b0aa5aa2e1f93a3d19b69059f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 09:10:48 np0005465988 podman[340042]: 2025-10-02 13:10:48.815864779 +0000 UTC m=+0.395072600 container start 8f1affb057c5322a8f8a30a4d2f9ff9be449626b0aa5aa2e1f93a3d19b69059f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 09:10:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:48.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:48 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340057]: [NOTICE]   (340061) : New worker (340063) forked
Oct  2 09:10:48 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340057]: [NOTICE]   (340061) : Loading success.
Oct  2 09:10:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:49.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.086 2 DEBUG nova.compute.manager [req-6e007577-62ce-4835-a214-2b85ac96f2b2 req-442f2f52-9e5d-4140-a047-3ee9466c83b3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Received event network-vif-plugged-a71ff85d-35ad-4d85-a530-c111eeb17791 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.087 2 DEBUG oslo_concurrency.lockutils [req-6e007577-62ce-4835-a214-2b85ac96f2b2 req-442f2f52-9e5d-4140-a047-3ee9466c83b3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.087 2 DEBUG oslo_concurrency.lockutils [req-6e007577-62ce-4835-a214-2b85ac96f2b2 req-442f2f52-9e5d-4140-a047-3ee9466c83b3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.088 2 DEBUG oslo_concurrency.lockutils [req-6e007577-62ce-4835-a214-2b85ac96f2b2 req-442f2f52-9e5d-4140-a047-3ee9466c83b3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.088 2 DEBUG nova.compute.manager [req-6e007577-62ce-4835-a214-2b85ac96f2b2 req-442f2f52-9e5d-4140-a047-3ee9466c83b3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] No waiting events found dispatching network-vif-plugged-a71ff85d-35ad-4d85-a530-c111eeb17791 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.089 2 WARNING nova.compute.manager [req-6e007577-62ce-4835-a214-2b85ac96f2b2 req-442f2f52-9e5d-4140-a047-3ee9466c83b3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Received unexpected event network-vif-plugged-a71ff85d-35ad-4d85-a530-c111eeb17791 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 09:10:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:50.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.862 2 DEBUG nova.compute.manager [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.863 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410650.8617802, 1f3cf63d-4aef-4445-b255-5c235b1a1f7d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.863 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] VM Started (Lifecycle Event)#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.867 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.872 2 INFO nova.virt.libvirt.driver [-] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Instance spawned successfully.#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.872 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.884 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.891 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.895 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.896 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.896 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.897 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.897 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.897 2 DEBUG nova.virt.libvirt.driver [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.923 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.924 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410650.8618777, 1f3cf63d-4aef-4445-b255-5c235b1a1f7d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.924 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] VM Paused (Lifecycle Event)#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.953 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.957 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410650.8661203, 1f3cf63d-4aef-4445-b255-5c235b1a1f7d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.957 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] VM Resumed (Lifecycle Event)#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.964 2 INFO nova.compute.manager [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Took 8.43 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.965 2 DEBUG nova.compute.manager [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.981 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:10:50 np0005465988 nova_compute[236126]: 2025-10-02 13:10:50.984 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:10:51 np0005465988 nova_compute[236126]: 2025-10-02 13:10:51.007 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:10:51 np0005465988 nova_compute[236126]: 2025-10-02 13:10:51.038 2 INFO nova.compute.manager [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Took 10.58 seconds to build instance.#033[00m
Oct  2 09:10:51 np0005465988 nova_compute[236126]: 2025-10-02 13:10:51.055 2 DEBUG oslo_concurrency.lockutils [None req-9f2a3a20-f035-47bb-9f5a-9b5872a7c2a9 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:51 np0005465988 nova_compute[236126]: 2025-10-02 13:10:51.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:10:51 np0005465988 nova_compute[236126]: 2025-10-02 13:10:51.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:51.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:52 np0005465988 nova_compute[236126]: 2025-10-02 13:10:52.488 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:10:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:52.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.107 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759410638.1057014, 6087853f-327c-46b2-baa6-c24854d98b97 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.107 2 INFO nova.compute.manager [-] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] VM Stopped (Lifecycle Event)#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.132 2 DEBUG nova.compute.manager [None req-32b04683-463c-4763-a335-3712345e82e4 - - - - - -] [instance: 6087853f-327c-46b2-baa6-c24854d98b97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.226 2 DEBUG oslo_concurrency.lockutils [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.227 2 DEBUG oslo_concurrency.lockutils [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.227 2 DEBUG oslo_concurrency.lockutils [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.227 2 DEBUG oslo_concurrency.lockutils [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.228 2 DEBUG oslo_concurrency.lockutils [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.229 2 INFO nova.compute.manager [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Terminating instance#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.230 2 DEBUG nova.compute.manager [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 09:10:53 np0005465988 kernel: tapa71ff85d-35 (unregistering): left promiscuous mode
Oct  2 09:10:53 np0005465988 NetworkManager[45041]: <info>  [1759410653.2736] device (tapa71ff85d-35): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:53 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:53Z|00978|binding|INFO|Releasing lport a71ff85d-35ad-4d85-a530-c111eeb17791 from this chassis (sb_readonly=0)
Oct  2 09:10:53 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:53Z|00979|binding|INFO|Setting lport a71ff85d-35ad-4d85-a530-c111eeb17791 down in Southbound
Oct  2 09:10:53 np0005465988 ovn_controller[132601]: 2025-10-02T13:10:53Z|00980|binding|INFO|Removing iface tapa71ff85d-35 ovn-installed in OVS
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:53.309 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:9c:e5 10.100.0.6'], port_security=['fa:16:3e:f3:9c:e5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1f3cf63d-4aef-4445-b255-5c235b1a1f7d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-150508fb-9217-4982-8468-977a3b53121a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbbc6cb494464fd9b31f64c1ad75fa6b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '53b1884f-337a-4a86-a723-0ac172ede381', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d5e391d-23a7-4f5a-8146-0f24141a74f2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=a71ff85d-35ad-4d85-a530-c111eeb17791) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:10:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:53.311 142124 INFO neutron.agent.ovn.metadata.agent [-] Port a71ff85d-35ad-4d85-a530-c111eeb17791 in datapath 150508fb-9217-4982-8468-977a3b53121a unbound from our chassis#033[00m
Oct  2 09:10:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:53.314 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 150508fb-9217-4982-8468-977a3b53121a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 09:10:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:53.318 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dbf452db-9c78-4573-8447-3afa0026576f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:53.319 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-150508fb-9217-4982-8468-977a3b53121a namespace which is not needed anymore#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:53 np0005465988 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d000000d4.scope: Deactivated successfully.
Oct  2 09:10:53 np0005465988 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d000000d4.scope: Consumed 3.528s CPU time.
Oct  2 09:10:53 np0005465988 systemd-machined[192594]: Machine qemu-102-instance-000000d4 terminated.
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:53 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340057]: [NOTICE]   (340061) : haproxy version is 2.8.14-c23fe91
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.474 2 DEBUG nova.compute.manager [req-806bc281-0cf3-4905-9e4b-40dcd79fe016 req-ac878381-0f6c-43e7-bef0-a0c5cf429430 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Received event network-vif-unplugged-a71ff85d-35ad-4d85-a530-c111eeb17791 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.475 2 DEBUG oslo_concurrency.lockutils [req-806bc281-0cf3-4905-9e4b-40dcd79fe016 req-ac878381-0f6c-43e7-bef0-a0c5cf429430 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.476 2 DEBUG oslo_concurrency.lockutils [req-806bc281-0cf3-4905-9e4b-40dcd79fe016 req-ac878381-0f6c-43e7-bef0-a0c5cf429430 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:53 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340057]: [NOTICE]   (340061) : path to executable is /usr/sbin/haproxy
Oct  2 09:10:53 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340057]: [WARNING]  (340061) : Exiting Master process...
Oct  2 09:10:53 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340057]: [WARNING]  (340061) : Exiting Master process...
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.477 2 DEBUG oslo_concurrency.lockutils [req-806bc281-0cf3-4905-9e4b-40dcd79fe016 req-ac878381-0f6c-43e7-bef0-a0c5cf429430 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.477 2 DEBUG nova.compute.manager [req-806bc281-0cf3-4905-9e4b-40dcd79fe016 req-ac878381-0f6c-43e7-bef0-a0c5cf429430 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] No waiting events found dispatching network-vif-unplugged-a71ff85d-35ad-4d85-a530-c111eeb17791 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.477 2 DEBUG nova.compute.manager [req-806bc281-0cf3-4905-9e4b-40dcd79fe016 req-ac878381-0f6c-43e7-bef0-a0c5cf429430 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Received event network-vif-unplugged-a71ff85d-35ad-4d85-a530-c111eeb17791 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 09:10:53 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340057]: [ALERT]    (340061) : Current worker (340063) exited with code 143 (Terminated)
Oct  2 09:10:53 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340057]: [WARNING]  (340061) : All workers exited. Exiting... (0)
Oct  2 09:10:53 np0005465988 systemd[1]: libpod-8f1affb057c5322a8f8a30a4d2f9ff9be449626b0aa5aa2e1f93a3d19b69059f.scope: Deactivated successfully.
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.481 2 INFO nova.virt.libvirt.driver [-] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Instance destroyed successfully.#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.483 2 DEBUG nova.objects.instance [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lazy-loading 'resources' on Instance uuid 1f3cf63d-4aef-4445-b255-5c235b1a1f7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:10:53 np0005465988 podman[340105]: 2025-10-02 13:10:53.492029397 +0000 UTC m=+0.065462426 container died 8f1affb057c5322a8f8a30a4d2f9ff9be449626b0aa5aa2e1f93a3d19b69059f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.508 2 DEBUG nova.virt.libvirt.vif [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:10:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1254195989',display_name='tempest-TestVolumeBootPattern-server-1254195989',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1254195989',id=212,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:10:50Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fbbc6cb494464fd9b31f64c1ad75fa6b',ramdisk_id='',reservation_id='r-l0d3ir08',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestVolumeBootPattern-1200415020',owner_user_name='tempest-TestVolumeBootPattern-1200415020-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:10:51Z,user_data=None,user_id='c10de71fef00497981b8b7cec6a3fff3',uuid=1f3cf63d-4aef-4445-b255-5c235b1a1f7d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a71ff85d-35ad-4d85-a530-c111eeb17791", "address": "fa:16:3e:f3:9c:e5", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa71ff85d-35", "ovs_interfaceid": "a71ff85d-35ad-4d85-a530-c111eeb17791", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.509 2 DEBUG nova.network.os_vif_util [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converting VIF {"id": "a71ff85d-35ad-4d85-a530-c111eeb17791", "address": "fa:16:3e:f3:9c:e5", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa71ff85d-35", "ovs_interfaceid": "a71ff85d-35ad-4d85-a530-c111eeb17791", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.509 2 DEBUG nova.network.os_vif_util [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:9c:e5,bridge_name='br-int',has_traffic_filtering=True,id=a71ff85d-35ad-4d85-a530-c111eeb17791,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa71ff85d-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.510 2 DEBUG os_vif [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:9c:e5,bridge_name='br-int',has_traffic_filtering=True,id=a71ff85d-35ad-4d85-a530-c111eeb17791,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa71ff85d-35') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.512 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa71ff85d-35, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.518 2 INFO os_vif [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:9c:e5,bridge_name='br-int',has_traffic_filtering=True,id=a71ff85d-35ad-4d85-a530-c111eeb17791,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa71ff85d-35')#033[00m
Oct  2 09:10:53 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f1affb057c5322a8f8a30a4d2f9ff9be449626b0aa5aa2e1f93a3d19b69059f-userdata-shm.mount: Deactivated successfully.
Oct  2 09:10:53 np0005465988 systemd[1]: var-lib-containers-storage-overlay-8a60dc24a2e531585ddde5caad3add9079a9210919047c9e45b7f506957bade8-merged.mount: Deactivated successfully.
Oct  2 09:10:53 np0005465988 podman[340105]: 2025-10-02 13:10:53.546941799 +0000 UTC m=+0.120374808 container cleanup 8f1affb057c5322a8f8a30a4d2f9ff9be449626b0aa5aa2e1f93a3d19b69059f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 09:10:53 np0005465988 systemd[1]: libpod-conmon-8f1affb057c5322a8f8a30a4d2f9ff9be449626b0aa5aa2e1f93a3d19b69059f.scope: Deactivated successfully.
Oct  2 09:10:53 np0005465988 podman[340160]: 2025-10-02 13:10:53.619406786 +0000 UTC m=+0.047681354 container remove 8f1affb057c5322a8f8a30a4d2f9ff9be449626b0aa5aa2e1f93a3d19b69059f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 09:10:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:53.624 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[69a28089-eb89-4381-b728-9fa53f4e349c]: (4, ('Thu Oct  2 01:10:53 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a (8f1affb057c5322a8f8a30a4d2f9ff9be449626b0aa5aa2e1f93a3d19b69059f)\n8f1affb057c5322a8f8a30a4d2f9ff9be449626b0aa5aa2e1f93a3d19b69059f\nThu Oct  2 01:10:53 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a (8f1affb057c5322a8f8a30a4d2f9ff9be449626b0aa5aa2e1f93a3d19b69059f)\n8f1affb057c5322a8f8a30a4d2f9ff9be449626b0aa5aa2e1f93a3d19b69059f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:53.626 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[69523026-534a-4fae-9f76-519413155bb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:53.628 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap150508fb-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:53 np0005465988 kernel: tap150508fb-90: left promiscuous mode
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:53.651 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[493e9481-621c-40b7-9742-caae2731ea48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:53 np0005465988 podman[340162]: 2025-10-02 13:10:53.668796809 +0000 UTC m=+0.082485467 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:10:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:53.676 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cd751f04-c53c-46b3-b8d0-48203d9c4397]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:53.678 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[45b92345-38f9-4ea9-86c4-646bb0797b30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:53.693 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0319c367-efef-4c83-9deb-ed0fcbf1232e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 870803, 'reachable_time': 27254, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 340221, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:53 np0005465988 systemd[1]: run-netns-ovnmeta\x2d150508fb\x2d9217\x2d4982\x2d8468\x2d977a3b53121a.mount: Deactivated successfully.
Oct  2 09:10:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:53.696 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-150508fb-9217-4982-8468-977a3b53121a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 09:10:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:10:53.697 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[346939bc-f94e-4898-9bde-809e56848949]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:10:53 np0005465988 podman[340170]: 2025-10-02 13:10:53.701450489 +0000 UTC m=+0.108335371 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 09:10:53 np0005465988 podman[340198]: 2025-10-02 13:10:53.752594682 +0000 UTC m=+0.085356399 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.757 2 INFO nova.virt.libvirt.driver [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Deleting instance files /var/lib/nova/instances/1f3cf63d-4aef-4445-b255-5c235b1a1f7d_del#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.758 2 INFO nova.virt.libvirt.driver [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Deletion of /var/lib/nova/instances/1f3cf63d-4aef-4445-b255-5c235b1a1f7d_del complete#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.801 2 INFO nova.compute.manager [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Took 0.57 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.801 2 DEBUG oslo.service.loopingcall [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.802 2 DEBUG nova.compute.manager [-] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 09:10:53 np0005465988 nova_compute[236126]: 2025-10-02 13:10:53.802 2 DEBUG nova.network.neutron [-] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 09:10:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:53.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:54 np0005465988 nova_compute[236126]: 2025-10-02 13:10:54.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:54 np0005465988 nova_compute[236126]: 2025-10-02 13:10:54.481 2 DEBUG nova.network.neutron [-] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:10:54 np0005465988 nova_compute[236126]: 2025-10-02 13:10:54.505 2 INFO nova.compute.manager [-] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Took 0.70 seconds to deallocate network for instance.#033[00m
Oct  2 09:10:54 np0005465988 nova_compute[236126]: 2025-10-02 13:10:54.557 2 DEBUG nova.compute.manager [req-06bc50da-8e4d-4164-afbe-e155f07c2f12 req-3ec3d547-3f4d-41ed-a428-899c59c75462 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Received event network-vif-deleted-a71ff85d-35ad-4d85-a530-c111eeb17791 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:54 np0005465988 nova_compute[236126]: 2025-10-02 13:10:54.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:54 np0005465988 nova_compute[236126]: 2025-10-02 13:10:54.708 2 INFO nova.compute.manager [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Took 0.20 seconds to detach 1 volumes for instance.#033[00m
Oct  2 09:10:54 np0005465988 nova_compute[236126]: 2025-10-02 13:10:54.761 2 DEBUG oslo_concurrency.lockutils [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:54 np0005465988 nova_compute[236126]: 2025-10-02 13:10:54.761 2 DEBUG oslo_concurrency.lockutils [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:54 np0005465988 nova_compute[236126]: 2025-10-02 13:10:54.824 2 DEBUG oslo_concurrency.processutils [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:10:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:54.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:10:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/953289654' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:10:55 np0005465988 nova_compute[236126]: 2025-10-02 13:10:55.290 2 DEBUG oslo_concurrency.processutils [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:10:55 np0005465988 nova_compute[236126]: 2025-10-02 13:10:55.297 2 DEBUG nova.compute.provider_tree [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:10:55 np0005465988 nova_compute[236126]: 2025-10-02 13:10:55.317 2 DEBUG nova.scheduler.client.report [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:10:55 np0005465988 nova_compute[236126]: 2025-10-02 13:10:55.339 2 DEBUG oslo_concurrency.lockutils [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:55 np0005465988 nova_compute[236126]: 2025-10-02 13:10:55.394 2 INFO nova.scheduler.client.report [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Deleted allocations for instance 1f3cf63d-4aef-4445-b255-5c235b1a1f7d#033[00m
Oct  2 09:10:55 np0005465988 nova_compute[236126]: 2025-10-02 13:10:55.474 2 DEBUG oslo_concurrency.lockutils [None req-755b6f86-d8f9-4ca3-9dfc-bc35d2084a0e c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:55 np0005465988 nova_compute[236126]: 2025-10-02 13:10:55.555 2 DEBUG nova.compute.manager [req-b7b43890-55cb-4d19-b947-c38c64622c1b req-d6429d99-0a73-4f2b-a8fd-6e1c4dbb71d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Received event network-vif-plugged-a71ff85d-35ad-4d85-a530-c111eeb17791 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:10:55 np0005465988 nova_compute[236126]: 2025-10-02 13:10:55.556 2 DEBUG oslo_concurrency.lockutils [req-b7b43890-55cb-4d19-b947-c38c64622c1b req-d6429d99-0a73-4f2b-a8fd-6e1c4dbb71d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:10:55 np0005465988 nova_compute[236126]: 2025-10-02 13:10:55.556 2 DEBUG oslo_concurrency.lockutils [req-b7b43890-55cb-4d19-b947-c38c64622c1b req-d6429d99-0a73-4f2b-a8fd-6e1c4dbb71d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:10:55 np0005465988 nova_compute[236126]: 2025-10-02 13:10:55.556 2 DEBUG oslo_concurrency.lockutils [req-b7b43890-55cb-4d19-b947-c38c64622c1b req-d6429d99-0a73-4f2b-a8fd-6e1c4dbb71d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "1f3cf63d-4aef-4445-b255-5c235b1a1f7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:10:55 np0005465988 nova_compute[236126]: 2025-10-02 13:10:55.556 2 DEBUG nova.compute.manager [req-b7b43890-55cb-4d19-b947-c38c64622c1b req-d6429d99-0a73-4f2b-a8fd-6e1c4dbb71d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] No waiting events found dispatching network-vif-plugged-a71ff85d-35ad-4d85-a530-c111eeb17791 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:10:55 np0005465988 nova_compute[236126]: 2025-10-02 13:10:55.557 2 WARNING nova.compute.manager [req-b7b43890-55cb-4d19-b947-c38c64622c1b req-d6429d99-0a73-4f2b-a8fd-6e1c4dbb71d8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Received unexpected event network-vif-plugged-a71ff85d-35ad-4d85-a530-c111eeb17791 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 09:10:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:56.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:56 np0005465988 nova_compute[236126]: 2025-10-02 13:10:56.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:10:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:56.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:10:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:10:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:10:58.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:10:58 np0005465988 nova_compute[236126]: 2025-10-02 13:10:58.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:10:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:10:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:10:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:10:58.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:00.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:00.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:11:01 np0005465988 nova_compute[236126]: 2025-10-02 13:11:01.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:02.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:02.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:03 np0005465988 nova_compute[236126]: 2025-10-02 13:11:03.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:04.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:04.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:06.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:11:06 np0005465988 nova_compute[236126]: 2025-10-02 13:11:06.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e417 e417: 3 total, 3 up, 3 in
Oct  2 09:11:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:06.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:08.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:08 np0005465988 nova_compute[236126]: 2025-10-02 13:11:08.470 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759410653.4685125, 1f3cf63d-4aef-4445-b255-5c235b1a1f7d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:11:08 np0005465988 nova_compute[236126]: 2025-10-02 13:11:08.471 2 INFO nova.compute.manager [-] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] VM Stopped (Lifecycle Event)#033[00m
Oct  2 09:11:08 np0005465988 nova_compute[236126]: 2025-10-02 13:11:08.533 2 DEBUG nova.compute.manager [None req-ebe67b3c-bf61-48d2-ab72-5de3e4659620 - - - - - -] [instance: 1f3cf63d-4aef-4445-b255-5c235b1a1f7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:11:08 np0005465988 nova_compute[236126]: 2025-10-02 13:11:08.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:11:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:11:08 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:11:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:08.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:10.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:10 np0005465988 nova_compute[236126]: 2025-10-02 13:11:10.257 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "7f90eabc-4880-4ade-ba65-ec56679e12ee" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:10 np0005465988 nova_compute[236126]: 2025-10-02 13:11:10.258 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:10 np0005465988 nova_compute[236126]: 2025-10-02 13:11:10.280 2 DEBUG nova.compute.manager [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 09:11:10 np0005465988 nova_compute[236126]: 2025-10-02 13:11:10.353 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:10 np0005465988 nova_compute[236126]: 2025-10-02 13:11:10.354 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:10 np0005465988 nova_compute[236126]: 2025-10-02 13:11:10.362 2 DEBUG nova.virt.hardware [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 09:11:10 np0005465988 nova_compute[236126]: 2025-10-02 13:11:10.363 2 INFO nova.compute.claims [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 09:11:10 np0005465988 nova_compute[236126]: 2025-10-02 13:11:10.466 2 DEBUG oslo_concurrency.processutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:10.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:11:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3052946091' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:11:10 np0005465988 nova_compute[236126]: 2025-10-02 13:11:10.963 2 DEBUG oslo_concurrency.processutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:10 np0005465988 nova_compute[236126]: 2025-10-02 13:11:10.971 2 DEBUG nova.compute.provider_tree [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:11:10 np0005465988 nova_compute[236126]: 2025-10-02 13:11:10.994 2 DEBUG nova.scheduler.client.report [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:11:11 np0005465988 nova_compute[236126]: 2025-10-02 13:11:11.032 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:11 np0005465988 nova_compute[236126]: 2025-10-02 13:11:11.034 2 DEBUG nova.compute.manager [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 09:11:11 np0005465988 nova_compute[236126]: 2025-10-02 13:11:11.089 2 DEBUG nova.compute.manager [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 09:11:11 np0005465988 nova_compute[236126]: 2025-10-02 13:11:11.090 2 DEBUG nova.network.neutron [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 09:11:11 np0005465988 nova_compute[236126]: 2025-10-02 13:11:11.124 2 INFO nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 09:11:11 np0005465988 nova_compute[236126]: 2025-10-02 13:11:11.147 2 DEBUG nova.compute.manager [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 09:11:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:11:11 np0005465988 nova_compute[236126]: 2025-10-02 13:11:11.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:11 np0005465988 nova_compute[236126]: 2025-10-02 13:11:11.195 2 INFO nova.virt.block_device [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Booting with volume snapshot fbd3196a-4338-47e7-a120-33dcd2b49454 at /dev/vda#033[00m
Oct  2 09:11:11 np0005465988 nova_compute[236126]: 2025-10-02 13:11:11.376 2 DEBUG nova.policy [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c10de71fef00497981b8b7cec6a3fff3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fbbc6cb494464fd9b31f64c1ad75fa6b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 09:11:12 np0005465988 nova_compute[236126]: 2025-10-02 13:11:12.019 2 DEBUG nova.network.neutron [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Successfully created port: 65790950-bfa2-4a00-a085-786edd82063b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 09:11:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:12.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:11:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:12.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:11:13 np0005465988 nova_compute[236126]: 2025-10-02 13:11:13.078 2 DEBUG nova.network.neutron [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Successfully updated port: 65790950-bfa2-4a00-a085-786edd82063b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 09:11:13 np0005465988 nova_compute[236126]: 2025-10-02 13:11:13.099 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "refresh_cache-7f90eabc-4880-4ade-ba65-ec56679e12ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:11:13 np0005465988 nova_compute[236126]: 2025-10-02 13:11:13.099 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquired lock "refresh_cache-7f90eabc-4880-4ade-ba65-ec56679e12ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:11:13 np0005465988 nova_compute[236126]: 2025-10-02 13:11:13.099 2 DEBUG nova.network.neutron [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 09:11:13 np0005465988 nova_compute[236126]: 2025-10-02 13:11:13.146 2 DEBUG nova.compute.manager [req-8b927c5a-d713-468b-adba-fdfd6ac7ca95 req-7de47dec-42f2-453e-a635-e129c504a8c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Received event network-changed-65790950-bfa2-4a00-a085-786edd82063b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:11:13 np0005465988 nova_compute[236126]: 2025-10-02 13:11:13.147 2 DEBUG nova.compute.manager [req-8b927c5a-d713-468b-adba-fdfd6ac7ca95 req-7de47dec-42f2-453e-a635-e129c504a8c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Refreshing instance network info cache due to event network-changed-65790950-bfa2-4a00-a085-786edd82063b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:11:13 np0005465988 nova_compute[236126]: 2025-10-02 13:11:13.147 2 DEBUG oslo_concurrency.lockutils [req-8b927c5a-d713-468b-adba-fdfd6ac7ca95 req-7de47dec-42f2-453e-a635-e129c504a8c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-7f90eabc-4880-4ade-ba65-ec56679e12ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:11:13 np0005465988 nova_compute[236126]: 2025-10-02 13:11:13.242 2 DEBUG nova.network.neutron [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 09:11:13 np0005465988 nova_compute[236126]: 2025-10-02 13:11:13.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:14.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:14 np0005465988 podman[340475]: 2025-10-02 13:11:14.580717023 +0000 UTC m=+0.110477823 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  2 09:11:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:14.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.310 2 DEBUG nova.network.neutron [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Updating instance_info_cache with network_info: [{"id": "65790950-bfa2-4a00-a085-786edd82063b", "address": "fa:16:3e:28:e8:ad", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65790950-bf", "ovs_interfaceid": "65790950-bfa2-4a00-a085-786edd82063b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.340 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Releasing lock "refresh_cache-7f90eabc-4880-4ade-ba65-ec56679e12ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.340 2 DEBUG nova.compute.manager [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Instance network_info: |[{"id": "65790950-bfa2-4a00-a085-786edd82063b", "address": "fa:16:3e:28:e8:ad", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65790950-bf", "ovs_interfaceid": "65790950-bfa2-4a00-a085-786edd82063b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.341 2 DEBUG oslo_concurrency.lockutils [req-8b927c5a-d713-468b-adba-fdfd6ac7ca95 req-7de47dec-42f2-453e-a635-e129c504a8c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-7f90eabc-4880-4ade-ba65-ec56679e12ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.341 2 DEBUG nova.network.neutron [req-8b927c5a-d713-468b-adba-fdfd6ac7ca95 req-7de47dec-42f2-453e-a635-e129c504a8c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Refreshing network info cache for port 65790950-bfa2-4a00-a085-786edd82063b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.496 2 DEBUG os_brick.utils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.498 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.514 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.514 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e30401-82ef-4df3-ad3d-3703680bde44]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.516 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.525 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.525 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[39eca812-7c89-46e4-b011-a1396b5605dc]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.527 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.535 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.535 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[059e6c85-4312-44cf-b4d1-37cc60073313]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.537 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[2917b25e-ba6a-4ce5-b600-b13574ebe2dd]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.538 2 DEBUG oslo_concurrency.processutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.591 2 DEBUG oslo_concurrency.processutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "nvme version" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.593 2 DEBUG os_brick.initiator.connectors.lightos [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.594 2 DEBUG os_brick.initiator.connectors.lightos [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.594 2 DEBUG os_brick.initiator.connectors.lightos [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.594 2 DEBUG os_brick.utils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] <== get_connector_properties: return (97ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 09:11:15 np0005465988 nova_compute[236126]: 2025-10-02 13:11:15.595 2 DEBUG nova.virt.block_device [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Updating existing volume attachment record: 4e19eeb2-9d5b-4380-9267-1cecb534311f _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 09:11:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:11:15 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:11:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:11:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:16.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:11:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.649 2 DEBUG nova.compute.manager [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.651 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.651 2 INFO nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Creating image(s)#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.652 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.652 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Ensure instance console log exists: /var/lib/nova/instances/7f90eabc-4880-4ade-ba65-ec56679e12ee/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.652 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.653 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.653 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.655 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Start _get_guest_xml network_info=[{"id": "65790950-bfa2-4a00-a085-786edd82063b", "address": "fa:16:3e:28:e8:ad", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65790950-bf", "ovs_interfaceid": "65790950-bfa2-4a00-a085-786edd82063b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': 
'/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '4e19eeb2-9d5b-4380-9267-1cecb534311f', 'disk_bus': 'virtio', 'delete_on_termination': True, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-434edc39-4c42-4a71-aa46-369312e08301', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '434edc39-4c42-4a71-aa46-369312e08301', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '7f90eabc-4880-4ade-ba65-ec56679e12ee', 'attached_at': '', 'detached_at': '', 'volume_id': '434edc39-4c42-4a71-aa46-369312e08301', 'serial': '434edc39-4c42-4a71-aa46-369312e08301'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.663 2 WARNING nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.668 2 DEBUG nova.virt.libvirt.host [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.668 2 DEBUG nova.virt.libvirt.host [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.672 2 DEBUG nova.virt.libvirt.host [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.673 2 DEBUG nova.virt.libvirt.host [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.674 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.674 2 DEBUG nova.virt.hardware [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.675 2 DEBUG nova.virt.hardware [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.675 2 DEBUG nova.virt.hardware [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.675 2 DEBUG nova.virt.hardware [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.675 2 DEBUG nova.virt.hardware [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.675 2 DEBUG nova.virt.hardware [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.676 2 DEBUG nova.virt.hardware [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.676 2 DEBUG nova.virt.hardware [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.676 2 DEBUG nova.virt.hardware [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.676 2 DEBUG nova.virt.hardware [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.676 2 DEBUG nova.virt.hardware [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.707 2 DEBUG nova.storage.rbd_utils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] rbd image 7f90eabc-4880-4ade-ba65-ec56679e12ee_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:11:16 np0005465988 nova_compute[236126]: 2025-10-02 13:11:16.713 2 DEBUG oslo_concurrency.processutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:11:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:16.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:11:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:11:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4198246259' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.250 2 DEBUG oslo_concurrency.processutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.538s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.277 2 DEBUG nova.virt.libvirt.vif [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:11:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-591954218',display_name='tempest-TestVolumeBootPattern-server-591954218',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-591954218',id=214,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fbbc6cb494464fd9b31f64c1ad75fa6b',ramdisk_id='',reservation_id='r-zkyf1odz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1200415020',owner_user_name='tempest-TestVolumeBootPattern-1200415020-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None
,updated_at=2025-10-02T13:11:11Z,user_data=None,user_id='c10de71fef00497981b8b7cec6a3fff3',uuid=7f90eabc-4880-4ade-ba65-ec56679e12ee,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "65790950-bfa2-4a00-a085-786edd82063b", "address": "fa:16:3e:28:e8:ad", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65790950-bf", "ovs_interfaceid": "65790950-bfa2-4a00-a085-786edd82063b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.277 2 DEBUG nova.network.os_vif_util [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converting VIF {"id": "65790950-bfa2-4a00-a085-786edd82063b", "address": "fa:16:3e:28:e8:ad", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65790950-bf", "ovs_interfaceid": "65790950-bfa2-4a00-a085-786edd82063b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.279 2 DEBUG nova.network.os_vif_util [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:e8:ad,bridge_name='br-int',has_traffic_filtering=True,id=65790950-bfa2-4a00-a085-786edd82063b,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65790950-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.280 2 DEBUG nova.objects.instance [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lazy-loading 'pci_devices' on Instance uuid 7f90eabc-4880-4ade-ba65-ec56679e12ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.296 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] End _get_guest_xml xml=<domain type="kvm">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  <uuid>7f90eabc-4880-4ade-ba65-ec56679e12ee</uuid>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  <name>instance-000000d6</name>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestVolumeBootPattern-server-591954218</nova:name>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 13:11:16</nova:creationTime>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <nova:user uuid="c10de71fef00497981b8b7cec6a3fff3">tempest-TestVolumeBootPattern-1200415020-project-member</nova:user>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <nova:project uuid="fbbc6cb494464fd9b31f64c1ad75fa6b">tempest-TestVolumeBootPattern-1200415020</nova:project>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <nova:port uuid="65790950-bfa2-4a00-a085-786edd82063b">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <system>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <entry name="serial">7f90eabc-4880-4ade-ba65-ec56679e12ee</entry>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <entry name="uuid">7f90eabc-4880-4ade-ba65-ec56679e12ee</entry>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    </system>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  <os>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  </os>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  <features>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  </features>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  </clock>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  <devices>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/7f90eabc-4880-4ade-ba65-ec56679e12ee_disk.config">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-434edc39-4c42-4a71-aa46-369312e08301">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <serial>434edc39-4c42-4a71-aa46-369312e08301</serial>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:28:e8:ad"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <target dev="tap65790950-bf"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    </interface>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/7f90eabc-4880-4ade-ba65-ec56679e12ee/console.log" append="off"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    </serial>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <video>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    </video>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    </rng>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 09:11:17 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 09:11:17 np0005465988 nova_compute[236126]:  </devices>
Oct  2 09:11:17 np0005465988 nova_compute[236126]: </domain>
Oct  2 09:11:17 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.297 2 DEBUG nova.compute.manager [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Preparing to wait for external event network-vif-plugged-65790950-bfa2-4a00-a085-786edd82063b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.299 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.299 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.300 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.301 2 DEBUG nova.virt.libvirt.vif [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:11:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-591954218',display_name='tempest-TestVolumeBootPattern-server-591954218',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-591954218',id=214,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fbbc6cb494464fd9b31f64c1ad75fa6b',ramdisk_id='',reservation_id='r-zkyf1odz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1200415020',owner_user_name='tempest-TestVolumeBootPattern-1200415020-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:11:11Z,user_data=None,user_id='c10de71fef00497981b8b7cec6a3fff3',uuid=7f90eabc-4880-4ade-ba65-ec56679e12ee,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "65790950-bfa2-4a00-a085-786edd82063b", "address": "fa:16:3e:28:e8:ad", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65790950-bf", "ovs_interfaceid": "65790950-bfa2-4a00-a085-786edd82063b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.301 2 DEBUG nova.network.os_vif_util [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converting VIF {"id": "65790950-bfa2-4a00-a085-786edd82063b", "address": "fa:16:3e:28:e8:ad", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65790950-bf", "ovs_interfaceid": "65790950-bfa2-4a00-a085-786edd82063b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.302 2 DEBUG nova.network.os_vif_util [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:e8:ad,bridge_name='br-int',has_traffic_filtering=True,id=65790950-bfa2-4a00-a085-786edd82063b,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65790950-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.302 2 DEBUG os_vif [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:e8:ad,bridge_name='br-int',has_traffic_filtering=True,id=65790950-bfa2-4a00-a085-786edd82063b,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65790950-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.304 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.304 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.307 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap65790950-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.308 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap65790950-bf, col_values=(('external_ids', {'iface-id': '65790950-bfa2-4a00-a085-786edd82063b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:28:e8:ad', 'vm-uuid': '7f90eabc-4880-4ade-ba65-ec56679e12ee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:17 np0005465988 NetworkManager[45041]: <info>  [1759410677.3122] manager: (tap65790950-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/434)
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.320 2 INFO os_vif [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:e8:ad,bridge_name='br-int',has_traffic_filtering=True,id=65790950-bfa2-4a00-a085-786edd82063b,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65790950-bf')#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.379 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.380 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.380 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] No VIF found with MAC fa:16:3e:28:e8:ad, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.380 2 INFO nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Using config drive#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.403 2 DEBUG nova.storage.rbd_utils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] rbd image 7f90eabc-4880-4ade-ba65-ec56679e12ee_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.773 2 DEBUG nova.network.neutron [req-8b927c5a-d713-468b-adba-fdfd6ac7ca95 req-7de47dec-42f2-453e-a635-e129c504a8c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Updated VIF entry in instance network info cache for port 65790950-bfa2-4a00-a085-786edd82063b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.774 2 DEBUG nova.network.neutron [req-8b927c5a-d713-468b-adba-fdfd6ac7ca95 req-7de47dec-42f2-453e-a635-e129c504a8c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Updating instance_info_cache with network_info: [{"id": "65790950-bfa2-4a00-a085-786edd82063b", "address": "fa:16:3e:28:e8:ad", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65790950-bf", "ovs_interfaceid": "65790950-bfa2-4a00-a085-786edd82063b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:11:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:17.799 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=83, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=82) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.798 2 DEBUG oslo_concurrency.lockutils [req-8b927c5a-d713-468b-adba-fdfd6ac7ca95 req-7de47dec-42f2-453e-a635-e129c504a8c8 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-7f90eabc-4880-4ade-ba65-ec56679e12ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:17 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:17.801 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.912 2 INFO nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Creating config drive at /var/lib/nova/instances/7f90eabc-4880-4ade-ba65-ec56679e12ee/disk.config#033[00m
Oct  2 09:11:17 np0005465988 nova_compute[236126]: 2025-10-02 13:11:17.919 2 DEBUG oslo_concurrency.processutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7f90eabc-4880-4ade-ba65-ec56679e12ee/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpra9zd3eg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:18.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.085 2 DEBUG oslo_concurrency.processutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7f90eabc-4880-4ade-ba65-ec56679e12ee/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpra9zd3eg" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.212 2 DEBUG nova.storage.rbd_utils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] rbd image 7f90eabc-4880-4ade-ba65-ec56679e12ee_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.220 2 DEBUG oslo_concurrency.processutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7f90eabc-4880-4ade-ba65-ec56679e12ee/disk.config 7f90eabc-4880-4ade-ba65-ec56679e12ee_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.437 2 DEBUG oslo_concurrency.processutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7f90eabc-4880-4ade-ba65-ec56679e12ee/disk.config 7f90eabc-4880-4ade-ba65-ec56679e12ee_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.218s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.439 2 INFO nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Deleting local config drive /var/lib/nova/instances/7f90eabc-4880-4ade-ba65-ec56679e12ee/disk.config because it was imported into RBD.#033[00m
Oct  2 09:11:18 np0005465988 kernel: tap65790950-bf: entered promiscuous mode
Oct  2 09:11:18 np0005465988 NetworkManager[45041]: <info>  [1759410678.5093] manager: (tap65790950-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/435)
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:18 np0005465988 ovn_controller[132601]: 2025-10-02T13:11:18Z|00981|binding|INFO|Claiming lport 65790950-bfa2-4a00-a085-786edd82063b for this chassis.
Oct  2 09:11:18 np0005465988 ovn_controller[132601]: 2025-10-02T13:11:18Z|00982|binding|INFO|65790950-bfa2-4a00-a085-786edd82063b: Claiming fa:16:3e:28:e8:ad 10.100.0.8
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.530 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:e8:ad 10.100.0.8'], port_security=['fa:16:3e:28:e8:ad 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7f90eabc-4880-4ade-ba65-ec56679e12ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-150508fb-9217-4982-8468-977a3b53121a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbbc6cb494464fd9b31f64c1ad75fa6b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '53b1884f-337a-4a86-a723-0ac172ede381', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d5e391d-23a7-4f5a-8146-0f24141a74f2, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=65790950-bfa2-4a00-a085-786edd82063b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.532 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 65790950-bfa2-4a00-a085-786edd82063b in datapath 150508fb-9217-4982-8468-977a3b53121a bound to our chassis#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.535 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 150508fb-9217-4982-8468-977a3b53121a#033[00m
Oct  2 09:11:18 np0005465988 systemd-udevd[340666]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.550 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bf219f6b-04c4-4a90-8b89-78315344ca92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.552 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap150508fb-91 in ovnmeta-150508fb-9217-4982-8468-977a3b53121a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 09:11:18 np0005465988 NetworkManager[45041]: <info>  [1759410678.5543] device (tap65790950-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:11:18 np0005465988 NetworkManager[45041]: <info>  [1759410678.5558] device (tap65790950-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:11:18 np0005465988 systemd-machined[192594]: New machine qemu-103-instance-000000d6.
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.556 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap150508fb-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.556 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2bad2c11-4e8b-4c20-9ec2-cadb65c2794d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.557 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ef8ed974-321e-4dde-a7ae-7f88509e842c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.569 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[8b26ad79-6527-4d4f-a0c4-a16e402bf12c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:18 np0005465988 systemd[1]: Started Virtual Machine qemu-103-instance-000000d6.
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.584 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7a93ea12-9475-4ca8-8098-c3fecc23d721]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 ovn_controller[132601]: 2025-10-02T13:11:18Z|00983|binding|INFO|Setting lport 65790950-bfa2-4a00-a085-786edd82063b ovn-installed in OVS
Oct  2 09:11:18 np0005465988 ovn_controller[132601]: 2025-10-02T13:11:18Z|00984|binding|INFO|Setting lport 65790950-bfa2-4a00-a085-786edd82063b up in Southbound
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.621 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[33de6412-2849-4e9b-9335-ded99d19be9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.628 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c2259d5d-27de-4b82-a8aa-b1489efbdd70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 NetworkManager[45041]: <info>  [1759410678.6314] manager: (tap150508fb-90): new Veth device (/org/freedesktop/NetworkManager/Devices/436)
Oct  2 09:11:18 np0005465988 systemd-udevd[340670]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.668 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[b7047c51-7638-478d-877f-518c06523789]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.672 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[a3080e65-a42b-44c8-a3b3-7dd1f23590df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 NetworkManager[45041]: <info>  [1759410678.6995] device (tap150508fb-90): carrier: link connected
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.704 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[18c4d051-9ec5-4410-a7ed-5bc6cd540c03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.726 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4adeaffa-5f22-47dc-9f45-b4c56dd822e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap150508fb-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:69:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 289], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 873903, 'reachable_time': 23707, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 340700, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.744 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[577586fe-e2dd-483a-b2f2-3957dcbaf4d7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:6993'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 873903, 'tstamp': 873903}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 340701, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.765 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7cf25a83-f511-45c6-8bcb-685461b3b789]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap150508fb-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:69:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 289], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 873903, 'reachable_time': 23707, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 340702, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.794 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[71407154-2af0-4fc8-a38e-157d4a177b83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.801 2 DEBUG nova.compute.manager [req-5348a66b-b4dc-4926-a831-8a6841cfb6e3 req-2fad54fd-5c12-44f8-a715-7cecceab6a2a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Received event network-vif-plugged-65790950-bfa2-4a00-a085-786edd82063b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.801 2 DEBUG oslo_concurrency.lockutils [req-5348a66b-b4dc-4926-a831-8a6841cfb6e3 req-2fad54fd-5c12-44f8-a715-7cecceab6a2a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.802 2 DEBUG oslo_concurrency.lockutils [req-5348a66b-b4dc-4926-a831-8a6841cfb6e3 req-2fad54fd-5c12-44f8-a715-7cecceab6a2a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.802 2 DEBUG oslo_concurrency.lockutils [req-5348a66b-b4dc-4926-a831-8a6841cfb6e3 req-2fad54fd-5c12-44f8-a715-7cecceab6a2a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.802 2 DEBUG nova.compute.manager [req-5348a66b-b4dc-4926-a831-8a6841cfb6e3 req-2fad54fd-5c12-44f8-a715-7cecceab6a2a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Processing event network-vif-plugged-65790950-bfa2-4a00-a085-786edd82063b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 09:11:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:18.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.868 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[18303ab2-9da0-42c1-a2a6-c49b2dad1aeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.871 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap150508fb-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.871 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.872 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap150508fb-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:18 np0005465988 kernel: tap150508fb-90: entered promiscuous mode
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:18 np0005465988 NetworkManager[45041]: <info>  [1759410678.8753] manager: (tap150508fb-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/437)
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.879 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap150508fb-90, col_values=(('external_ids', {'iface-id': '2a2f4068-0f5b-4d26-b914-4d32097d8b55'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:18 np0005465988 ovn_controller[132601]: 2025-10-02T13:11:18Z|00985|binding|INFO|Releasing lport 2a2f4068-0f5b-4d26-b914-4d32097d8b55 from this chassis (sb_readonly=0)
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.882 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/150508fb-9217-4982-8468-977a3b53121a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/150508fb-9217-4982-8468-977a3b53121a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.883 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[607cfa39-eba7-46f9-a629-fe4e77828451]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.884 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-150508fb-9217-4982-8468-977a3b53121a
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/150508fb-9217-4982-8468-977a3b53121a.pid.haproxy
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 150508fb-9217-4982-8468-977a3b53121a
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 09:11:18 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:18.885 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'env', 'PROCESS_TAG=haproxy-150508fb-9217-4982-8468-977a3b53121a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/150508fb-9217-4982-8468-977a3b53121a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 09:11:18 np0005465988 nova_compute[236126]: 2025-10-02 13:11:18.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:19 np0005465988 podman[340776]: 2025-10-02 13:11:19.257513571 +0000 UTC m=+0.023584530 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.657 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410679.656451, 7f90eabc-4880-4ade-ba65-ec56679e12ee => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.658 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] VM Started (Lifecycle Event)#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.662 2 DEBUG nova.compute.manager [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.667 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.671 2 INFO nova.virt.libvirt.driver [-] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Instance spawned successfully.#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.671 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.679 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.683 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.693 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.694 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.694 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.695 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.696 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.696 2 DEBUG nova.virt.libvirt.driver [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.728 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.729 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410679.6579266, 7f90eabc-4880-4ade-ba65-ec56679e12ee => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.729 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] VM Paused (Lifecycle Event)#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.752 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.756 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410679.6652517, 7f90eabc-4880-4ade-ba65-ec56679e12ee => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.757 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] VM Resumed (Lifecycle Event)#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.761 2 INFO nova.compute.manager [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Took 3.11 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.761 2 DEBUG nova.compute.manager [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.775 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.779 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.811 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.844 2 INFO nova.compute.manager [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Took 9.51 seconds to build instance.#033[00m
Oct  2 09:11:19 np0005465988 nova_compute[236126]: 2025-10-02 13:11:19.867 2 DEBUG oslo_concurrency.lockutils [None req-e12f3faa-c157-46fb-81f2-f003861f8d22 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:19 np0005465988 podman[340776]: 2025-10-02 13:11:19.910633152 +0000 UTC m=+0.676704101 container create 18370ba75fa7e2e93fa3d977ff447a661aaf856806b5c10474408bb6f6805ed6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 09:11:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:20.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:20 np0005465988 systemd[1]: Started libpod-conmon-18370ba75fa7e2e93fa3d977ff447a661aaf856806b5c10474408bb6f6805ed6.scope.
Oct  2 09:11:20 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:11:20 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fec543a6a606baa5975e2ede8b1fd0ea18305ce70de4c535a7b9108477bbe2ad/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 09:11:20 np0005465988 podman[340776]: 2025-10-02 13:11:20.285167349 +0000 UTC m=+1.051238308 container init 18370ba75fa7e2e93fa3d977ff447a661aaf856806b5c10474408bb6f6805ed6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:11:20 np0005465988 podman[340776]: 2025-10-02 13:11:20.292943073 +0000 UTC m=+1.059014012 container start 18370ba75fa7e2e93fa3d977ff447a661aaf856806b5c10474408bb6f6805ed6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:11:20 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340792]: [NOTICE]   (340796) : New worker (340798) forked
Oct  2 09:11:20 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340792]: [NOTICE]   (340796) : Loading success.
Oct  2 09:11:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:20.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:20 np0005465988 nova_compute[236126]: 2025-10-02 13:11:20.899 2 DEBUG nova.compute.manager [req-707e9eb3-07c9-4612-a61d-bc23808a0842 req-14de9eb1-2242-4b8f-986f-3d4f2c512e16 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Received event network-vif-plugged-65790950-bfa2-4a00-a085-786edd82063b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:11:20 np0005465988 nova_compute[236126]: 2025-10-02 13:11:20.900 2 DEBUG oslo_concurrency.lockutils [req-707e9eb3-07c9-4612-a61d-bc23808a0842 req-14de9eb1-2242-4b8f-986f-3d4f2c512e16 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:20 np0005465988 nova_compute[236126]: 2025-10-02 13:11:20.900 2 DEBUG oslo_concurrency.lockutils [req-707e9eb3-07c9-4612-a61d-bc23808a0842 req-14de9eb1-2242-4b8f-986f-3d4f2c512e16 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:20 np0005465988 nova_compute[236126]: 2025-10-02 13:11:20.900 2 DEBUG oslo_concurrency.lockutils [req-707e9eb3-07c9-4612-a61d-bc23808a0842 req-14de9eb1-2242-4b8f-986f-3d4f2c512e16 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:20 np0005465988 nova_compute[236126]: 2025-10-02 13:11:20.901 2 DEBUG nova.compute.manager [req-707e9eb3-07c9-4612-a61d-bc23808a0842 req-14de9eb1-2242-4b8f-986f-3d4f2c512e16 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] No waiting events found dispatching network-vif-plugged-65790950-bfa2-4a00-a085-786edd82063b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:11:20 np0005465988 nova_compute[236126]: 2025-10-02 13:11:20.901 2 WARNING nova.compute.manager [req-707e9eb3-07c9-4612-a61d-bc23808a0842 req-14de9eb1-2242-4b8f-986f-3d4f2c512e16 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Received unexpected event network-vif-plugged-65790950-bfa2-4a00-a085-786edd82063b for instance with vm_state active and task_state None.#033[00m
Oct  2 09:11:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:11:21 np0005465988 nova_compute[236126]: 2025-10-02 13:11:21.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:22.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:22 np0005465988 nova_compute[236126]: 2025-10-02 13:11:22.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:22.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:24.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.358 2 DEBUG oslo_concurrency.lockutils [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "7f90eabc-4880-4ade-ba65-ec56679e12ee" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.359 2 DEBUG oslo_concurrency.lockutils [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.359 2 DEBUG oslo_concurrency.lockutils [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.360 2 DEBUG oslo_concurrency.lockutils [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.360 2 DEBUG oslo_concurrency.lockutils [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.362 2 INFO nova.compute.manager [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Terminating instance#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.364 2 DEBUG nova.compute.manager [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 09:11:24 np0005465988 kernel: tap65790950-bf (unregistering): left promiscuous mode
Oct  2 09:11:24 np0005465988 NetworkManager[45041]: <info>  [1759410684.4167] device (tap65790950-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:24 np0005465988 ovn_controller[132601]: 2025-10-02T13:11:24Z|00986|binding|INFO|Releasing lport 65790950-bfa2-4a00-a085-786edd82063b from this chassis (sb_readonly=0)
Oct  2 09:11:24 np0005465988 ovn_controller[132601]: 2025-10-02T13:11:24Z|00987|binding|INFO|Setting lport 65790950-bfa2-4a00-a085-786edd82063b down in Southbound
Oct  2 09:11:24 np0005465988 ovn_controller[132601]: 2025-10-02T13:11:24Z|00988|binding|INFO|Removing iface tap65790950-bf ovn-installed in OVS
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:24 np0005465988 systemd[1]: machine-qemu\x2d103\x2dinstance\x2d000000d6.scope: Deactivated successfully.
Oct  2 09:11:24 np0005465988 systemd[1]: machine-qemu\x2d103\x2dinstance\x2d000000d6.scope: Consumed 5.946s CPU time.
Oct  2 09:11:24 np0005465988 systemd-machined[192594]: Machine qemu-103-instance-000000d6 terminated.
Oct  2 09:11:24 np0005465988 podman[340813]: 2025-10-02 13:11:24.53256279 +0000 UTC m=+0.084225977 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=iscsid)
Oct  2 09:11:24 np0005465988 podman[340814]: 2025-10-02 13:11:24.560939557 +0000 UTC m=+0.097419687 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 09:11:24 np0005465988 podman[340812]: 2025-10-02 13:11:24.571417269 +0000 UTC m=+0.125165696 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.601 2 INFO nova.virt.libvirt.driver [-] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Instance destroyed successfully.#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.602 2 DEBUG nova.objects.instance [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lazy-loading 'resources' on Instance uuid 7f90eabc-4880-4ade-ba65-ec56679e12ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:11:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:24.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:24.927 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:e8:ad 10.100.0.8'], port_security=['fa:16:3e:28:e8:ad 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7f90eabc-4880-4ade-ba65-ec56679e12ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-150508fb-9217-4982-8468-977a3b53121a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbbc6cb494464fd9b31f64c1ad75fa6b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '53b1884f-337a-4a86-a723-0ac172ede381', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d5e391d-23a7-4f5a-8146-0f24141a74f2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=65790950-bfa2-4a00-a085-786edd82063b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:24.931 142124 INFO neutron.agent.ovn.metadata.agent [-] Port 65790950-bfa2-4a00-a085-786edd82063b in datapath 150508fb-9217-4982-8468-977a3b53121a unbound from our chassis#033[00m
Oct  2 09:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:24.934 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 150508fb-9217-4982-8468-977a3b53121a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 09:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:24.936 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[02f3da92-86f9-437f-a818-63ea1e27a0bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:24 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:24.936 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-150508fb-9217-4982-8468-977a3b53121a namespace which is not needed anymore#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.979 2 DEBUG nova.virt.libvirt.vif [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:11:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-591954218',display_name='tempest-TestVolumeBootPattern-server-591954218',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-591954218',id=214,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:11:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fbbc6cb494464fd9b31f64c1ad75fa6b',ramdisk_id='',reservation_id='r-zkyf1odz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-1200415020',owner_user_name='tempest-TestVolumeBootPattern-1200415020-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:11:19Z,user_data=None,user_id='c10de71fef00497981b8b7cec6a3fff3',uuid=7f90eabc-4880-4ade-ba65-ec56679e12ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "65790950-bfa2-4a00-a085-786edd82063b", "address": "fa:16:3e:28:e8:ad", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65790950-bf", "ovs_interfaceid": "65790950-bfa2-4a00-a085-786edd82063b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.980 2 DEBUG nova.network.os_vif_util [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converting VIF {"id": "65790950-bfa2-4a00-a085-786edd82063b", "address": "fa:16:3e:28:e8:ad", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65790950-bf", "ovs_interfaceid": "65790950-bfa2-4a00-a085-786edd82063b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.981 2 DEBUG nova.network.os_vif_util [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:e8:ad,bridge_name='br-int',has_traffic_filtering=True,id=65790950-bfa2-4a00-a085-786edd82063b,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65790950-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.981 2 DEBUG os_vif [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:e8:ad,bridge_name='br-int',has_traffic_filtering=True,id=65790950-bfa2-4a00-a085-786edd82063b,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65790950-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.984 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65790950-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:11:24 np0005465988 nova_compute[236126]: 2025-10-02 13:11:24.990 2 INFO os_vif [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:e8:ad,bridge_name='br-int',has_traffic_filtering=True,id=65790950-bfa2-4a00-a085-786edd82063b,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65790950-bf')#033[00m
Oct  2 09:11:25 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340792]: [NOTICE]   (340796) : haproxy version is 2.8.14-c23fe91
Oct  2 09:11:25 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340792]: [NOTICE]   (340796) : path to executable is /usr/sbin/haproxy
Oct  2 09:11:25 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340792]: [WARNING]  (340796) : Exiting Master process...
Oct  2 09:11:25 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340792]: [WARNING]  (340796) : Exiting Master process...
Oct  2 09:11:25 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340792]: [ALERT]    (340796) : Current worker (340798) exited with code 143 (Terminated)
Oct  2 09:11:25 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[340792]: [WARNING]  (340796) : All workers exited. Exiting... (0)
Oct  2 09:11:25 np0005465988 systemd[1]: libpod-18370ba75fa7e2e93fa3d977ff447a661aaf856806b5c10474408bb6f6805ed6.scope: Deactivated successfully.
Oct  2 09:11:25 np0005465988 podman[340921]: 2025-10-02 13:11:25.091894798 +0000 UTC m=+0.046539391 container died 18370ba75fa7e2e93fa3d977ff447a661aaf856806b5c10474408bb6f6805ed6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 09:11:25 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-18370ba75fa7e2e93fa3d977ff447a661aaf856806b5c10474408bb6f6805ed6-userdata-shm.mount: Deactivated successfully.
Oct  2 09:11:25 np0005465988 systemd[1]: var-lib-containers-storage-overlay-fec543a6a606baa5975e2ede8b1fd0ea18305ce70de4c535a7b9108477bbe2ad-merged.mount: Deactivated successfully.
Oct  2 09:11:25 np0005465988 podman[340921]: 2025-10-02 13:11:25.138073078 +0000 UTC m=+0.092717671 container cleanup 18370ba75fa7e2e93fa3d977ff447a661aaf856806b5c10474408bb6f6805ed6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 09:11:25 np0005465988 systemd[1]: libpod-conmon-18370ba75fa7e2e93fa3d977ff447a661aaf856806b5c10474408bb6f6805ed6.scope: Deactivated successfully.
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.174 2 DEBUG nova.compute.manager [req-4f4fd75d-1f36-4fd8-9c27-c87a822f4934 req-14ead0cf-eb30-42c8-8054-04c149d69a92 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Received event network-vif-unplugged-65790950-bfa2-4a00-a085-786edd82063b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.174 2 DEBUG oslo_concurrency.lockutils [req-4f4fd75d-1f36-4fd8-9c27-c87a822f4934 req-14ead0cf-eb30-42c8-8054-04c149d69a92 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.175 2 DEBUG oslo_concurrency.lockutils [req-4f4fd75d-1f36-4fd8-9c27-c87a822f4934 req-14ead0cf-eb30-42c8-8054-04c149d69a92 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.175 2 DEBUG oslo_concurrency.lockutils [req-4f4fd75d-1f36-4fd8-9c27-c87a822f4934 req-14ead0cf-eb30-42c8-8054-04c149d69a92 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.176 2 DEBUG nova.compute.manager [req-4f4fd75d-1f36-4fd8-9c27-c87a822f4934 req-14ead0cf-eb30-42c8-8054-04c149d69a92 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] No waiting events found dispatching network-vif-unplugged-65790950-bfa2-4a00-a085-786edd82063b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.176 2 DEBUG nova.compute.manager [req-4f4fd75d-1f36-4fd8-9c27-c87a822f4934 req-14ead0cf-eb30-42c8-8054-04c149d69a92 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Received event network-vif-unplugged-65790950-bfa2-4a00-a085-786edd82063b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 09:11:25 np0005465988 podman[340951]: 2025-10-02 13:11:25.211418741 +0000 UTC m=+0.042283459 container remove 18370ba75fa7e2e93fa3d977ff447a661aaf856806b5c10474408bb6f6805ed6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 09:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:25.218 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[62c85f51-48ce-4769-af99-56772bd5bbd1]: (4, ('Thu Oct  2 01:11:25 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a (18370ba75fa7e2e93fa3d977ff447a661aaf856806b5c10474408bb6f6805ed6)\n18370ba75fa7e2e93fa3d977ff447a661aaf856806b5c10474408bb6f6805ed6\nThu Oct  2 01:11:25 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a (18370ba75fa7e2e93fa3d977ff447a661aaf856806b5c10474408bb6f6805ed6)\n18370ba75fa7e2e93fa3d977ff447a661aaf856806b5c10474408bb6f6805ed6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:25.220 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c485edeb-d0c2-4c6a-b97e-344d2f574caf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:25.221 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap150508fb-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:25 np0005465988 kernel: tap150508fb-90: left promiscuous mode
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:25.230 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[63c3f95e-030b-4856-80fb-f7fa220a4f8c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:25.258 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dd910e46-973b-4b6f-836a-fe4b8e03fd24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:25.260 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c71a6573-1bf3-49cb-86cf-21289770077e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:25.280 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7d4ecd4d-5de0-47eb-86eb-1674ba55c08d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 873894, 'reachable_time': 32522, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 340966, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:25.283 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-150508fb-9217-4982-8468-977a3b53121a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 09:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:25.283 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[d4fb9797-62dc-4c66-8305-01e0efc845c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:25 np0005465988 systemd[1]: run-netns-ovnmeta\x2d150508fb\x2d9217\x2d4982\x2d8468\x2d977a3b53121a.mount: Deactivated successfully.
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.364 2 INFO nova.virt.libvirt.driver [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Deleting instance files /var/lib/nova/instances/7f90eabc-4880-4ade-ba65-ec56679e12ee_del#033[00m
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.365 2 INFO nova.virt.libvirt.driver [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Deletion of /var/lib/nova/instances/7f90eabc-4880-4ade-ba65-ec56679e12ee_del complete#033[00m
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.465 2 INFO nova.compute.manager [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Took 1.10 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.466 2 DEBUG oslo.service.loopingcall [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.467 2 DEBUG nova.compute.manager [-] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.467 2 DEBUG nova.network.neutron [-] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 09:11:25 np0005465988 nova_compute[236126]: 2025-10-02 13:11:25.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:11:25 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:25.804 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '83'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:11:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:26.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:11:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:11:26 np0005465988 nova_compute[236126]: 2025-10-02 13:11:26.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:26 np0005465988 nova_compute[236126]: 2025-10-02 13:11:26.403 2 DEBUG nova.network.neutron [-] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:11:26 np0005465988 nova_compute[236126]: 2025-10-02 13:11:26.479 2 DEBUG nova.compute.manager [req-355a132e-0de6-4909-a56e-84cebfc4df1f req-99576f9d-5442-40fe-af6c-7bae8d5fda93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Received event network-vif-deleted-65790950-bfa2-4a00-a085-786edd82063b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:11:26 np0005465988 nova_compute[236126]: 2025-10-02 13:11:26.479 2 INFO nova.compute.manager [req-355a132e-0de6-4909-a56e-84cebfc4df1f req-99576f9d-5442-40fe-af6c-7bae8d5fda93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Neutron deleted interface 65790950-bfa2-4a00-a085-786edd82063b; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 09:11:26 np0005465988 nova_compute[236126]: 2025-10-02 13:11:26.479 2 DEBUG nova.network.neutron [req-355a132e-0de6-4909-a56e-84cebfc4df1f req-99576f9d-5442-40fe-af6c-7bae8d5fda93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:11:26 np0005465988 nova_compute[236126]: 2025-10-02 13:11:26.559 2 INFO nova.compute.manager [-] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Took 1.09 seconds to deallocate network for instance.#033[00m
Oct  2 09:11:26 np0005465988 nova_compute[236126]: 2025-10-02 13:11:26.577 2 DEBUG nova.compute.manager [req-355a132e-0de6-4909-a56e-84cebfc4df1f req-99576f9d-5442-40fe-af6c-7bae8d5fda93 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Detach interface failed, port_id=65790950-bfa2-4a00-a085-786edd82063b, reason: Instance 7f90eabc-4880-4ade-ba65-ec56679e12ee could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 09:11:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:26.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:26 np0005465988 nova_compute[236126]: 2025-10-02 13:11:26.937 2 INFO nova.compute.manager [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Took 0.38 seconds to detach 1 volumes for instance.#033[00m
Oct  2 09:11:26 np0005465988 nova_compute[236126]: 2025-10-02 13:11:26.939 2 DEBUG nova.compute.manager [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Deleting volume: 434edc39-4c42-4a71-aa46-369312e08301 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.371 2 DEBUG nova.compute.manager [req-3b64e2b6-c3f0-45a9-805d-bcef4b4ba7bf req-a0ab2956-c824-47db-97ed-4523731c392d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Received event network-vif-plugged-65790950-bfa2-4a00-a085-786edd82063b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.372 2 DEBUG oslo_concurrency.lockutils [req-3b64e2b6-c3f0-45a9-805d-bcef4b4ba7bf req-a0ab2956-c824-47db-97ed-4523731c392d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.372 2 DEBUG oslo_concurrency.lockutils [req-3b64e2b6-c3f0-45a9-805d-bcef4b4ba7bf req-a0ab2956-c824-47db-97ed-4523731c392d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.373 2 DEBUG oslo_concurrency.lockutils [req-3b64e2b6-c3f0-45a9-805d-bcef4b4ba7bf req-a0ab2956-c824-47db-97ed-4523731c392d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.373 2 DEBUG nova.compute.manager [req-3b64e2b6-c3f0-45a9-805d-bcef4b4ba7bf req-a0ab2956-c824-47db-97ed-4523731c392d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] No waiting events found dispatching network-vif-plugged-65790950-bfa2-4a00-a085-786edd82063b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.373 2 WARNING nova.compute.manager [req-3b64e2b6-c3f0-45a9-805d-bcef4b4ba7bf req-a0ab2956-c824-47db-97ed-4523731c392d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Received unexpected event network-vif-plugged-65790950-bfa2-4a00-a085-786edd82063b for instance with vm_state active and task_state deleting.#033[00m
Oct  2 09:11:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:27.419 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:27.420 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:27.420 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.429 2 DEBUG oslo_concurrency.lockutils [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.430 2 DEBUG oslo_concurrency.lockutils [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.464 2 DEBUG nova.scheduler.client.report [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.472 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.497 2 DEBUG nova.scheduler.client.report [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.497 2 DEBUG nova.compute.provider_tree [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.518 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.521 2 DEBUG nova.scheduler.client.report [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.553 2 DEBUG nova.scheduler.client.report [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 09:11:27 np0005465988 nova_compute[236126]: 2025-10-02 13:11:27.590 2 DEBUG oslo_concurrency.processutils [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:11:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:28.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:11:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:11:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3003176227' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.080 2 DEBUG oslo_concurrency.processutils [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.088 2 DEBUG nova.compute.provider_tree [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.112 2 DEBUG nova.scheduler.client.report [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.138 2 DEBUG oslo_concurrency.lockutils [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.143 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.626s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.144 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.144 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.145 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.207 2 INFO nova.scheduler.client.report [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Deleted allocations for instance 7f90eabc-4880-4ade-ba65-ec56679e12ee#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.279 2 DEBUG oslo_concurrency.lockutils [None req-2ff06c87-f452-4337-b2a1-cbc342a0fb5a c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "7f90eabc-4880-4ade-ba65-ec56679e12ee" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.920s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:11:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1898368673' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.629 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.826 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.827 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3995MB free_disk=20.95989227294922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.827 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.827 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:28.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.887 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.887 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:11:28 np0005465988 nova_compute[236126]: 2025-10-02 13:11:28.902 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:11:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1393390085' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:11:29 np0005465988 nova_compute[236126]: 2025-10-02 13:11:29.414 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:29 np0005465988 nova_compute[236126]: 2025-10-02 13:11:29.421 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:11:29 np0005465988 nova_compute[236126]: 2025-10-02 13:11:29.462 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:11:29 np0005465988 nova_compute[236126]: 2025-10-02 13:11:29.509 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:11:29 np0005465988 nova_compute[236126]: 2025-10-02 13:11:29.509 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e418 e418: 3 total, 3 up, 3 in
Oct  2 09:11:29 np0005465988 nova_compute[236126]: 2025-10-02 13:11:29.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:30.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:11:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:30.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:11:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:11:31 np0005465988 nova_compute[236126]: 2025-10-02 13:11:31.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:11:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:32.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:11:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:32.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:11:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:34.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:11:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:34.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:34 np0005465988 nova_compute[236126]: 2025-10-02 13:11:34.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:11:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:36.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:11:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:11:36 np0005465988 nova_compute[236126]: 2025-10-02 13:11:36.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:36.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:38.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:38 np0005465988 nova_compute[236126]: 2025-10-02 13:11:38.511 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:11:38 np0005465988 nova_compute[236126]: 2025-10-02 13:11:38.512 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:11:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:11:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:38.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:11:39 np0005465988 nova_compute[236126]: 2025-10-02 13:11:39.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:11:39 np0005465988 nova_compute[236126]: 2025-10-02 13:11:39.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:11:39 np0005465988 nova_compute[236126]: 2025-10-02 13:11:39.600 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759410684.5988607, 7f90eabc-4880-4ade-ba65-ec56679e12ee => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:11:39 np0005465988 nova_compute[236126]: 2025-10-02 13:11:39.600 2 INFO nova.compute.manager [-] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] VM Stopped (Lifecycle Event)#033[00m
Oct  2 09:11:39 np0005465988 nova_compute[236126]: 2025-10-02 13:11:39.627 2 DEBUG nova.compute.manager [None req-da8c76fa-2bbe-4715-9811-e1ed32ca981d - - - - - -] [instance: 7f90eabc-4880-4ade-ba65-ec56679e12ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:11:39 np0005465988 nova_compute[236126]: 2025-10-02 13:11:39.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:11:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:40.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:11:40 np0005465988 nova_compute[236126]: 2025-10-02 13:11:40.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:11:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:40.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:41 np0005465988 nova_compute[236126]: 2025-10-02 13:11:41.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e419 e419: 3 total, 3 up, 3 in
Oct  2 09:11:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e419 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:11:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:42.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:42 np0005465988 nova_compute[236126]: 2025-10-02 13:11:42.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:11:42 np0005465988 nova_compute[236126]: 2025-10-02 13:11:42.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:11:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:42.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:44.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:44.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:44 np0005465988 nova_compute[236126]: 2025-10-02 13:11:44.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:45 np0005465988 podman[341096]: 2025-10-02 13:11:45.554407754 +0000 UTC m=+0.095186983 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 09:11:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:11:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:46.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:11:46 np0005465988 nova_compute[236126]: 2025-10-02 13:11:46.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e419 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:11:46 np0005465988 nova_compute[236126]: 2025-10-02 13:11:46.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:11:46 np0005465988 nova_compute[236126]: 2025-10-02 13:11:46.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:11:46 np0005465988 nova_compute[236126]: 2025-10-02 13:11:46.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:11:46 np0005465988 nova_compute[236126]: 2025-10-02 13:11:46.508 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:11:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:46.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:47 np0005465988 nova_compute[236126]: 2025-10-02 13:11:47.386 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:47 np0005465988 nova_compute[236126]: 2025-10-02 13:11:47.387 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:47 np0005465988 nova_compute[236126]: 2025-10-02 13:11:47.404 2 DEBUG nova.compute.manager [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 09:11:47 np0005465988 nova_compute[236126]: 2025-10-02 13:11:47.480 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:47 np0005465988 nova_compute[236126]: 2025-10-02 13:11:47.480 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:47 np0005465988 nova_compute[236126]: 2025-10-02 13:11:47.489 2 DEBUG nova.virt.hardware [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 09:11:47 np0005465988 nova_compute[236126]: 2025-10-02 13:11:47.490 2 INFO nova.compute.claims [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 09:11:47 np0005465988 nova_compute[236126]: 2025-10-02 13:11:47.578 2 DEBUG oslo_concurrency.processutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:11:48 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1733348881' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:11:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:48.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.096 2 DEBUG oslo_concurrency.processutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.104 2 DEBUG nova.compute.provider_tree [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.122 2 DEBUG nova.scheduler.client.report [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.144 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.145 2 DEBUG nova.compute.manager [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.194 2 DEBUG nova.compute.manager [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.195 2 DEBUG nova.network.neutron [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.224 2 INFO nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.257 2 DEBUG nova.compute.manager [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.328 2 INFO nova.virt.block_device [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Booting with volume cce428bb-67ad-45c7-9e76-52ebd4f984b0 at /dev/vda#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.396 2 DEBUG nova.policy [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c10de71fef00497981b8b7cec6a3fff3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fbbc6cb494464fd9b31f64c1ad75fa6b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.543 2 DEBUG os_brick.utils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.545 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.558 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.559 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[0c92925e-93cf-4195-b26e-a7c7ceca7313]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.561 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.571 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.572 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[3d3d0668-72a8-4c89-bfe5-4b734be47106]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.575 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.584 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.584 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[f3ccadef-ace1-475f-a51f-b58ef04f3198]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.586 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[7e099edd-50ba-4f5c-8aba-1ad635e5f059]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.586 2 DEBUG oslo_concurrency.processutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.637 2 DEBUG oslo_concurrency.processutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "nvme version" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.640 2 DEBUG os_brick.initiator.connectors.lightos [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.641 2 DEBUG os_brick.initiator.connectors.lightos [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.641 2 DEBUG os_brick.initiator.connectors.lightos [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.642 2 DEBUG os_brick.utils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] <== get_connector_properties: return (98ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 09:11:48 np0005465988 nova_compute[236126]: 2025-10-02 13:11:48.642 2 DEBUG nova.virt.block_device [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Updating existing volume attachment record: 3948d9dc-5b2e-4a5e-bc17-9b30f836ba23 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 09:11:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:48.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:49 np0005465988 nova_compute[236126]: 2025-10-02 13:11:49.297 2 DEBUG nova.network.neutron [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Successfully created port: dd3cdaf8-55ea-439f-ae16-3363c19b1ccc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 09:11:49 np0005465988 nova_compute[236126]: 2025-10-02 13:11:49.856 2 DEBUG nova.compute.manager [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 09:11:49 np0005465988 nova_compute[236126]: 2025-10-02 13:11:49.859 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 09:11:49 np0005465988 nova_compute[236126]: 2025-10-02 13:11:49.859 2 INFO nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Creating image(s)#033[00m
Oct  2 09:11:49 np0005465988 nova_compute[236126]: 2025-10-02 13:11:49.860 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  2 09:11:49 np0005465988 nova_compute[236126]: 2025-10-02 13:11:49.860 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Ensure instance console log exists: /var/lib/nova/instances/6de68f58-d90f-4bb4-ad9d-4bfa90dcb765/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 09:11:49 np0005465988 nova_compute[236126]: 2025-10-02 13:11:49.861 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:49 np0005465988 nova_compute[236126]: 2025-10-02 13:11:49.861 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:49 np0005465988 nova_compute[236126]: 2025-10-02 13:11:49.861 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:49 np0005465988 nova_compute[236126]: 2025-10-02 13:11:49.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:11:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:50.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:11:50 np0005465988 nova_compute[236126]: 2025-10-02 13:11:50.875 2 DEBUG nova.network.neutron [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Successfully updated port: dd3cdaf8-55ea-439f-ae16-3363c19b1ccc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 09:11:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:50.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:50 np0005465988 nova_compute[236126]: 2025-10-02 13:11:50.935 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "refresh_cache-6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:11:50 np0005465988 nova_compute[236126]: 2025-10-02 13:11:50.935 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquired lock "refresh_cache-6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:11:50 np0005465988 nova_compute[236126]: 2025-10-02 13:11:50.936 2 DEBUG nova.network.neutron [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 09:11:50 np0005465988 nova_compute[236126]: 2025-10-02 13:11:50.998 2 DEBUG nova.compute.manager [req-58778fd4-db42-48d1-8953-744afcdf7fda req-c066e2b6-871e-4e5c-ae91-00d943530978 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Received event network-changed-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:11:50 np0005465988 nova_compute[236126]: 2025-10-02 13:11:50.999 2 DEBUG nova.compute.manager [req-58778fd4-db42-48d1-8953-744afcdf7fda req-c066e2b6-871e-4e5c-ae91-00d943530978 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Refreshing instance network info cache due to event network-changed-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:11:50 np0005465988 nova_compute[236126]: 2025-10-02 13:11:50.999 2 DEBUG oslo_concurrency.lockutils [req-58778fd4-db42-48d1-8953-744afcdf7fda req-c066e2b6-871e-4e5c-ae91-00d943530978 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:11:51 np0005465988 nova_compute[236126]: 2025-10-02 13:11:51.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e419 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:11:51 np0005465988 nova_compute[236126]: 2025-10-02 13:11:51.928 2 DEBUG nova.network.neutron [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 09:11:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:52.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:52.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #166. Immutable memtables: 0.
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.816337) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 105] Flushing memtable with next log file: 166
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410713816425, "job": 105, "event": "flush_started", "num_memtables": 1, "num_entries": 1248, "num_deletes": 252, "total_data_size": 2610300, "memory_usage": 2647056, "flush_reason": "Manual Compaction"}
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 105] Level-0 flush table #167: started
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410713826572, "cf_name": "default", "job": 105, "event": "table_file_creation", "file_number": 167, "file_size": 1720803, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 81032, "largest_seqno": 82274, "table_properties": {"data_size": 1715425, "index_size": 2773, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12118, "raw_average_key_size": 20, "raw_value_size": 1704411, "raw_average_value_size": 2835, "num_data_blocks": 123, "num_entries": 601, "num_filter_entries": 601, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759410616, "oldest_key_time": 1759410616, "file_creation_time": 1759410713, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 167, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 105] Flush lasted 10280 microseconds, and 6069 cpu microseconds.
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.826614) [db/flush_job.cc:967] [default] [JOB 105] Level-0 flush table #167: 1720803 bytes OK
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.826641) [db/memtable_list.cc:519] [default] Level-0 commit table #167 started
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.827837) [db/memtable_list.cc:722] [default] Level-0 commit table #167: memtable #1 done
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.827849) EVENT_LOG_v1 {"time_micros": 1759410713827845, "job": 105, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.827871) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 105] Try to delete WAL files size 2604372, prev total WAL file size 2604372, number of live WAL files 2.
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000163.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.828615) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037303238' seq:72057594037927935, type:22 .. '7061786F730037323830' seq:0, type:0; will stop at (end)
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 106] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 105 Base level 0, inputs: [167(1680KB)], [165(12MB)]
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410713828655, "job": 106, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [167], "files_L6": [165], "score": -1, "input_data_size": 15316926, "oldest_snapshot_seqno": -1}
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 106] Generated table #168: 10188 keys, 13364092 bytes, temperature: kUnknown
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410713922700, "cf_name": "default", "job": 106, "event": "table_file_creation", "file_number": 168, "file_size": 13364092, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13297815, "index_size": 39732, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25477, "raw_key_size": 269233, "raw_average_key_size": 26, "raw_value_size": 13118920, "raw_average_value_size": 1287, "num_data_blocks": 1512, "num_entries": 10188, "num_filter_entries": 10188, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759410713, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 168, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.923018) [db/compaction/compaction_job.cc:1663] [default] [JOB 106] Compacted 1@0 + 1@6 files to L6 => 13364092 bytes
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.924763) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 162.7 rd, 141.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 13.0 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(16.7) write-amplify(7.8) OK, records in: 10709, records dropped: 521 output_compression: NoCompression
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.924782) EVENT_LOG_v1 {"time_micros": 1759410713924772, "job": 106, "event": "compaction_finished", "compaction_time_micros": 94159, "compaction_time_cpu_micros": 41169, "output_level": 6, "num_output_files": 1, "total_output_size": 13364092, "num_input_records": 10709, "num_output_records": 10188, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000167.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410713925258, "job": 106, "event": "table_file_deletion", "file_number": 167}
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000165.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410713928504, "job": 106, "event": "table_file_deletion", "file_number": 165}
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.828529) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.928596) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.928603) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.928605) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.928606) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:11:53 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:11:53.928608) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:11:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 09:11:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:54.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 09:11:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:54.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.016 2 DEBUG nova.network.neutron [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Updating instance_info_cache with network_info: [{"id": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "address": "fa:16:3e:4f:12:72", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3cdaf8-55", "ovs_interfaceid": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.056 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Releasing lock "refresh_cache-6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.057 2 DEBUG nova.compute.manager [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Instance network_info: |[{"id": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "address": "fa:16:3e:4f:12:72", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3cdaf8-55", "ovs_interfaceid": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.057 2 DEBUG oslo_concurrency.lockutils [req-58778fd4-db42-48d1-8953-744afcdf7fda req-c066e2b6-871e-4e5c-ae91-00d943530978 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.058 2 DEBUG nova.network.neutron [req-58778fd4-db42-48d1-8953-744afcdf7fda req-c066e2b6-871e-4e5c-ae91-00d943530978 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Refreshing network info cache for port dd3cdaf8-55ea-439f-ae16-3363c19b1ccc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.062 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Start _get_guest_xml network_info=[{"id": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "address": "fa:16:3e:4f:12:72", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3cdaf8-55", "ovs_interfaceid": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '3948d9dc-5b2e-4a5e-bc17-9b30f836ba23', 'disk_bus': 'virtio', 'delete_on_termination': True, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-cce428bb-67ad-45c7-9e76-52ebd4f984b0', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'cce428bb-67ad-45c7-9e76-52ebd4f984b0', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '6de68f58-d90f-4bb4-ad9d-4bfa90dcb765', 'attached_at': '', 'detached_at': '', 'volume_id': 'cce428bb-67ad-45c7-9e76-52ebd4f984b0', 'serial': 'cce428bb-67ad-45c7-9e76-52ebd4f984b0'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.067 2 WARNING nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.073 2 DEBUG nova.virt.libvirt.host [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.074 2 DEBUG nova.virt.libvirt.host [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.081 2 DEBUG nova.virt.libvirt.host [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.082 2 DEBUG nova.virt.libvirt.host [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.083 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.083 2 DEBUG nova.virt.hardware [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.084 2 DEBUG nova.virt.hardware [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.084 2 DEBUG nova.virt.hardware [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.084 2 DEBUG nova.virt.hardware [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.084 2 DEBUG nova.virt.hardware [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.084 2 DEBUG nova.virt.hardware [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.085 2 DEBUG nova.virt.hardware [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.085 2 DEBUG nova.virt.hardware [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.085 2 DEBUG nova.virt.hardware [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.085 2 DEBUG nova.virt.hardware [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.086 2 DEBUG nova.virt.hardware [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.116 2 DEBUG nova.storage.rbd_utils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] rbd image 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.121 2 DEBUG oslo_concurrency.processutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:55 np0005465988 podman[341242]: 2025-10-02 13:11:55.551283847 +0000 UTC m=+0.072031486 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:11:55 np0005465988 podman[341241]: 2025-10-02 13:11:55.567272447 +0000 UTC m=+0.083368972 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 09:11:55 np0005465988 podman[341240]: 2025-10-02 13:11:55.58263059 +0000 UTC m=+0.109607838 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:11:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:11:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1474483427' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.640 2 DEBUG oslo_concurrency.processutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.681 2 DEBUG nova.virt.libvirt.vif [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:11:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-1895710534',display_name='tempest-TestVolumeBootPattern-volume-backed-server-1895710534',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-1895710534',id=215,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJyVmDplTyBUuGu+PxDHYy0cGpMO6l1bDwFxzUhHuP8Q5cQVtZgJ8nphmotjM+1tM1ayRodD11OEZs5etfn1F5kmg2y3i5ZSFmiZu7fk0vYLFVnhbokqYJHSZo3M77maaw==',key_name='tempest-keypair-281057978',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fbbc6cb494464fd9b31f64c1ad75fa6b',ramdisk_id='',reservation_id='r-ggrq1m9j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1200415020',owner_user_name='tempest-TestVolumeBootPattern-1200415020-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:11:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c10de71fef00497981b8b7cec6a3fff3',uuid=6de68f58-d90f-4bb4-ad9d-4bfa90dcb765,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "address": "fa:16:3e:4f:12:72", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3cdaf8-55", "ovs_interfaceid": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.682 2 DEBUG nova.network.os_vif_util [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converting VIF {"id": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "address": "fa:16:3e:4f:12:72", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3cdaf8-55", "ovs_interfaceid": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.683 2 DEBUG nova.network.os_vif_util [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:12:72,bridge_name='br-int',has_traffic_filtering=True,id=dd3cdaf8-55ea-439f-ae16-3363c19b1ccc,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd3cdaf8-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.684 2 DEBUG nova.objects.instance [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lazy-loading 'pci_devices' on Instance uuid 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.710 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] End _get_guest_xml xml=<domain type="kvm">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  <uuid>6de68f58-d90f-4bb4-ad9d-4bfa90dcb765</uuid>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  <name>instance-000000d7</name>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestVolumeBootPattern-volume-backed-server-1895710534</nova:name>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 13:11:55</nova:creationTime>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <nova:user uuid="c10de71fef00497981b8b7cec6a3fff3">tempest-TestVolumeBootPattern-1200415020-project-member</nova:user>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <nova:project uuid="fbbc6cb494464fd9b31f64c1ad75fa6b">tempest-TestVolumeBootPattern-1200415020</nova:project>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <nova:port uuid="dd3cdaf8-55ea-439f-ae16-3363c19b1ccc">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <system>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <entry name="serial">6de68f58-d90f-4bb4-ad9d-4bfa90dcb765</entry>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <entry name="uuid">6de68f58-d90f-4bb4-ad9d-4bfa90dcb765</entry>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    </system>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  <os>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  </os>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  <features>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  </features>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  </clock>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  <devices>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/6de68f58-d90f-4bb4-ad9d-4bfa90dcb765_disk.config">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-cce428bb-67ad-45c7-9e76-52ebd4f984b0">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <serial>cce428bb-67ad-45c7-9e76-52ebd4f984b0</serial>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:4f:12:72"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <target dev="tapdd3cdaf8-55"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    </interface>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/6de68f58-d90f-4bb4-ad9d-4bfa90dcb765/console.log" append="off"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    </serial>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <video>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    </video>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    </rng>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 09:11:55 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 09:11:55 np0005465988 nova_compute[236126]:  </devices>
Oct  2 09:11:55 np0005465988 nova_compute[236126]: </domain>
Oct  2 09:11:55 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.713 2 DEBUG nova.compute.manager [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Preparing to wait for external event network-vif-plugged-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.713 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.714 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.714 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.715 2 DEBUG nova.virt.libvirt.vif [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:11:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-1895710534',display_name='tempest-TestVolumeBootPattern-volume-backed-server-1895710534',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-1895710534',id=215,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJyVmDplTyBUuGu+PxDHYy0cGpMO6l1bDwFxzUhHuP8Q5cQVtZgJ8nphmotjM+1tM1ayRodD11OEZs5etfn1F5kmg2y3i5ZSFmiZu7fk0vYLFVnhbokqYJHSZo3M77maaw==',key_name='tempest-keypair-281057978',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fbbc6cb494464fd9b31f64c1ad75fa6b',ramdisk_id='',reservation_id='r-ggrq1m9j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1200415020',owner_user_name='tempest-TestVolumeBootPattern-1200415020-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:11:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c10de71fef00497981b8b7cec6a3fff3',uuid=6de68f58-d90f-4bb4-ad9d-4bfa90dcb765,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "address": "fa:16:3e:4f:12:72", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3cdaf8-55", "ovs_interfaceid": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.715 2 DEBUG nova.network.os_vif_util [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converting VIF {"id": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "address": "fa:16:3e:4f:12:72", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3cdaf8-55", "ovs_interfaceid": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.716 2 DEBUG nova.network.os_vif_util [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:12:72,bridge_name='br-int',has_traffic_filtering=True,id=dd3cdaf8-55ea-439f-ae16-3363c19b1ccc,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd3cdaf8-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.716 2 DEBUG os_vif [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:12:72,bridge_name='br-int',has_traffic_filtering=True,id=dd3cdaf8-55ea-439f-ae16-3363c19b1ccc,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd3cdaf8-55') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.717 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.718 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.726 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd3cdaf8-55, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.727 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdd3cdaf8-55, col_values=(('external_ids', {'iface-id': 'dd3cdaf8-55ea-439f-ae16-3363c19b1ccc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:12:72', 'vm-uuid': '6de68f58-d90f-4bb4-ad9d-4bfa90dcb765'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:55 np0005465988 NetworkManager[45041]: <info>  [1759410715.7297] manager: (tapdd3cdaf8-55): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/438)
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.738 2 INFO os_vif [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:12:72,bridge_name='br-int',has_traffic_filtering=True,id=dd3cdaf8-55ea-439f-ae16-3363c19b1ccc,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd3cdaf8-55')#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.825 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.825 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.825 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] No VIF found with MAC fa:16:3e:4f:12:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.826 2 INFO nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Using config drive#033[00m
Oct  2 09:11:55 np0005465988 nova_compute[236126]: 2025-10-02 13:11:55.858 2 DEBUG nova.storage.rbd_utils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] rbd image 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:11:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:11:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:56.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:11:56 np0005465988 nova_compute[236126]: 2025-10-02 13:11:56.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e419 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:11:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:11:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:56.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:11:56 np0005465988 nova_compute[236126]: 2025-10-02 13:11:56.978 2 INFO nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Creating config drive at /var/lib/nova/instances/6de68f58-d90f-4bb4-ad9d-4bfa90dcb765/disk.config#033[00m
Oct  2 09:11:56 np0005465988 nova_compute[236126]: 2025-10-02 13:11:56.984 2 DEBUG oslo_concurrency.processutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6de68f58-d90f-4bb4-ad9d-4bfa90dcb765/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9na5my89 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:57 np0005465988 nova_compute[236126]: 2025-10-02 13:11:57.127 2 DEBUG oslo_concurrency.processutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6de68f58-d90f-4bb4-ad9d-4bfa90dcb765/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9na5my89" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:57 np0005465988 nova_compute[236126]: 2025-10-02 13:11:57.167 2 DEBUG nova.storage.rbd_utils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] rbd image 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:11:57 np0005465988 nova_compute[236126]: 2025-10-02 13:11:57.172 2 DEBUG oslo_concurrency.processutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6de68f58-d90f-4bb4-ad9d-4bfa90dcb765/disk.config 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:11:57 np0005465988 nova_compute[236126]: 2025-10-02 13:11:57.432 2 DEBUG oslo_concurrency.processutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6de68f58-d90f-4bb4-ad9d-4bfa90dcb765/disk.config 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.260s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:11:57 np0005465988 nova_compute[236126]: 2025-10-02 13:11:57.433 2 INFO nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Deleting local config drive /var/lib/nova/instances/6de68f58-d90f-4bb4-ad9d-4bfa90dcb765/disk.config because it was imported into RBD.#033[00m
Oct  2 09:11:57 np0005465988 kernel: tapdd3cdaf8-55: entered promiscuous mode
Oct  2 09:11:57 np0005465988 NetworkManager[45041]: <info>  [1759410717.4991] manager: (tapdd3cdaf8-55): new Tun device (/org/freedesktop/NetworkManager/Devices/439)
Oct  2 09:11:57 np0005465988 ovn_controller[132601]: 2025-10-02T13:11:57Z|00989|binding|INFO|Claiming lport dd3cdaf8-55ea-439f-ae16-3363c19b1ccc for this chassis.
Oct  2 09:11:57 np0005465988 ovn_controller[132601]: 2025-10-02T13:11:57Z|00990|binding|INFO|dd3cdaf8-55ea-439f-ae16-3363c19b1ccc: Claiming fa:16:3e:4f:12:72 10.100.0.14
Oct  2 09:11:57 np0005465988 nova_compute[236126]: 2025-10-02 13:11:57.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:57 np0005465988 ovn_controller[132601]: 2025-10-02T13:11:57Z|00991|binding|INFO|Setting lport dd3cdaf8-55ea-439f-ae16-3363c19b1ccc ovn-installed in OVS
Oct  2 09:11:57 np0005465988 nova_compute[236126]: 2025-10-02 13:11:57.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:57 np0005465988 nova_compute[236126]: 2025-10-02 13:11:57.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:57 np0005465988 systemd-udevd[341376]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:11:57 np0005465988 ovn_controller[132601]: 2025-10-02T13:11:57Z|00992|binding|INFO|Setting lport dd3cdaf8-55ea-439f-ae16-3363c19b1ccc up in Southbound
Oct  2 09:11:57 np0005465988 systemd-machined[192594]: New machine qemu-104-instance-000000d7.
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.545 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:12:72 10.100.0.14'], port_security=['fa:16:3e:4f:12:72 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '6de68f58-d90f-4bb4-ad9d-4bfa90dcb765', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-150508fb-9217-4982-8468-977a3b53121a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbbc6cb494464fd9b31f64c1ad75fa6b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6903cca5-a4c4-4f8f-a0c5-b42a4e15d418', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d5e391d-23a7-4f5a-8146-0f24141a74f2, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=dd3cdaf8-55ea-439f-ae16-3363c19b1ccc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.547 142124 INFO neutron.agent.ovn.metadata.agent [-] Port dd3cdaf8-55ea-439f-ae16-3363c19b1ccc in datapath 150508fb-9217-4982-8468-977a3b53121a bound to our chassis#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.549 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 150508fb-9217-4982-8468-977a3b53121a#033[00m
Oct  2 09:11:57 np0005465988 NetworkManager[45041]: <info>  [1759410717.5527] device (tapdd3cdaf8-55): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:11:57 np0005465988 NetworkManager[45041]: <info>  [1759410717.5542] device (tapdd3cdaf8-55): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:11:57 np0005465988 systemd[1]: Started Virtual Machine qemu-104-instance-000000d7.
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.564 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e522d6cd-106d-480f-8eee-1840c87fb456]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.565 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap150508fb-91 in ovnmeta-150508fb-9217-4982-8468-977a3b53121a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.567 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap150508fb-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.568 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9b1cf5ed-b09e-4431-9c0b-a2c06dd49bd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.569 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d1119e2f-5134-4bbf-b1fb-18364055b0ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.581 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[6b50df9f-4aee-4a0e-a738-6d3f58c31063]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.593 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c605c09f-ddaa-4a2b-9514-94d727836f6b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.634 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d7e07bf6-2cd7-424d-a4e8-f53b15aa193d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 systemd-udevd[341380]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:11:57 np0005465988 NetworkManager[45041]: <info>  [1759410717.6420] manager: (tap150508fb-90): new Veth device (/org/freedesktop/NetworkManager/Devices/440)
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.640 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab65dd4-83ab-402e-ad94-97081ab16976]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.682 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[98489f47-5b0e-4793-9c6b-c4418f67e59b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.686 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[8436dd93-141e-476d-a5ee-3b883ed5bbee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 NetworkManager[45041]: <info>  [1759410717.7141] device (tap150508fb-90): carrier: link connected
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.721 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[832abc3e-fd72-457c-ae3c-d16c7e23f95d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.741 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[facf9b2c-5c27-48c5-8b74-6528909943b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap150508fb-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:69:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 292], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 877804, 'reachable_time': 44616, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 341410, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.757 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a868e567-3510-4de9-a37d-578a1b9bf19a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:6993'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 877804, 'tstamp': 877804}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 341411, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 nova_compute[236126]: 2025-10-02 13:11:57.771 2 DEBUG nova.network.neutron [req-58778fd4-db42-48d1-8953-744afcdf7fda req-c066e2b6-871e-4e5c-ae91-00d943530978 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Updated VIF entry in instance network info cache for port dd3cdaf8-55ea-439f-ae16-3363c19b1ccc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:11:57 np0005465988 nova_compute[236126]: 2025-10-02 13:11:57.772 2 DEBUG nova.network.neutron [req-58778fd4-db42-48d1-8953-744afcdf7fda req-c066e2b6-871e-4e5c-ae91-00d943530978 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Updating instance_info_cache with network_info: [{"id": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "address": "fa:16:3e:4f:12:72", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3cdaf8-55", "ovs_interfaceid": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.776 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[43e5bdb8-8bd7-4b56-90b0-e0790a78ebdb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap150508fb-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:69:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 292], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 877804, 'reachable_time': 44616, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 341412, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 nova_compute[236126]: 2025-10-02 13:11:57.818 2 DEBUG oslo_concurrency.lockutils [req-58778fd4-db42-48d1-8953-744afcdf7fda req-c066e2b6-871e-4e5c-ae91-00d943530978 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.814 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8fda29be-3708-443c-b53c-adc1f5748324]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.911 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0d540693-abda-48f9-bd6f-959d934d92e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.914 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap150508fb-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.915 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.915 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap150508fb-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:57 np0005465988 NetworkManager[45041]: <info>  [1759410717.9182] manager: (tap150508fb-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/441)
Oct  2 09:11:57 np0005465988 nova_compute[236126]: 2025-10-02 13:11:57.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:57 np0005465988 kernel: tap150508fb-90: entered promiscuous mode
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.926 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap150508fb-90, col_values=(('external_ids', {'iface-id': '2a2f4068-0f5b-4d26-b914-4d32097d8b55'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:11:57 np0005465988 ovn_controller[132601]: 2025-10-02T13:11:57Z|00993|binding|INFO|Releasing lport 2a2f4068-0f5b-4d26-b914-4d32097d8b55 from this chassis (sb_readonly=0)
Oct  2 09:11:57 np0005465988 nova_compute[236126]: 2025-10-02 13:11:57.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.931 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/150508fb-9217-4982-8468-977a3b53121a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/150508fb-9217-4982-8468-977a3b53121a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.932 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[15fefb08-36f1-4fe3-b353-da92007217d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.934 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-150508fb-9217-4982-8468-977a3b53121a
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/150508fb-9217-4982-8468-977a3b53121a.pid.haproxy
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 150508fb-9217-4982-8468-977a3b53121a
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 09:11:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:11:57.936 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'env', 'PROCESS_TAG=haproxy-150508fb-9217-4982-8468-977a3b53121a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/150508fb-9217-4982-8468-977a3b53121a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 09:11:57 np0005465988 nova_compute[236126]: 2025-10-02 13:11:57.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.069 2 DEBUG nova.compute.manager [req-dc7816d0-33d8-4bc3-9063-14ce9facef09 req-c47e2e59-b5c2-4222-a108-a793862b7dfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Received event network-vif-plugged-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.070 2 DEBUG oslo_concurrency.lockutils [req-dc7816d0-33d8-4bc3-9063-14ce9facef09 req-c47e2e59-b5c2-4222-a108-a793862b7dfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.070 2 DEBUG oslo_concurrency.lockutils [req-dc7816d0-33d8-4bc3-9063-14ce9facef09 req-c47e2e59-b5c2-4222-a108-a793862b7dfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.070 2 DEBUG oslo_concurrency.lockutils [req-dc7816d0-33d8-4bc3-9063-14ce9facef09 req-c47e2e59-b5c2-4222-a108-a793862b7dfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.070 2 DEBUG nova.compute.manager [req-dc7816d0-33d8-4bc3-9063-14ce9facef09 req-c47e2e59-b5c2-4222-a108-a793862b7dfa d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Processing event network-vif-plugged-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 09:11:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:11:58.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:58 np0005465988 podman[341486]: 2025-10-02 13:11:58.377469154 +0000 UTC m=+0.090652651 container create bf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 09:11:58 np0005465988 podman[341486]: 2025-10-02 13:11:58.313527483 +0000 UTC m=+0.026710990 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 09:11:58 np0005465988 systemd[1]: Started libpod-conmon-bf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f.scope.
Oct  2 09:11:58 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:11:58 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9402cd6c6a236c7f8b0804a5407bf9c7d99ec134d4fee474d1e9f91a15cef072/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 09:11:58 np0005465988 podman[341486]: 2025-10-02 13:11:58.502103794 +0000 UTC m=+0.215287311 container init bf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 09:11:58 np0005465988 podman[341486]: 2025-10-02 13:11:58.508806127 +0000 UTC m=+0.221989644 container start bf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:11:58 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[341501]: [NOTICE]   (341505) : New worker (341507) forked
Oct  2 09:11:58 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[341501]: [NOTICE]   (341505) : Loading success.
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.730 2 DEBUG nova.compute.manager [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.731 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410718.72913, 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.731 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] VM Started (Lifecycle Event)#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.738 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.744 2 INFO nova.virt.libvirt.driver [-] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Instance spawned successfully.#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.745 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.813 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.825 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.833 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.833 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.834 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.834 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.835 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.836 2 DEBUG nova.virt.libvirt.driver [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.855 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.856 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410718.7307947, 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.856 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] VM Paused (Lifecycle Event)#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.892 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.902 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410718.7399516, 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.903 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] VM Resumed (Lifecycle Event)#033[00m
Oct  2 09:11:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:11:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:11:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:11:58.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.933 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.937 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.979 2 INFO nova.compute.manager [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Took 9.12 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.979 2 DEBUG nova.compute.manager [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:11:58 np0005465988 nova_compute[236126]: 2025-10-02 13:11:58.980 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:11:59 np0005465988 nova_compute[236126]: 2025-10-02 13:11:59.087 2 INFO nova.compute.manager [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Took 11.64 seconds to build instance.#033[00m
Oct  2 09:11:59 np0005465988 nova_compute[236126]: 2025-10-02 13:11:59.141 2 DEBUG oslo_concurrency.lockutils [None req-07488fb6-0016-4576-ab78-8bb22fb86c05 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:12:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:00.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:00 np0005465988 nova_compute[236126]: 2025-10-02 13:12:00.207 2 DEBUG nova.compute.manager [req-87f69d08-dd62-4cf5-85c3-ca1121433f2e req-87be04af-f5d2-47ff-9164-c48451fc6437 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Received event network-vif-plugged-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:12:00 np0005465988 nova_compute[236126]: 2025-10-02 13:12:00.207 2 DEBUG oslo_concurrency.lockutils [req-87f69d08-dd62-4cf5-85c3-ca1121433f2e req-87be04af-f5d2-47ff-9164-c48451fc6437 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:12:00 np0005465988 nova_compute[236126]: 2025-10-02 13:12:00.208 2 DEBUG oslo_concurrency.lockutils [req-87f69d08-dd62-4cf5-85c3-ca1121433f2e req-87be04af-f5d2-47ff-9164-c48451fc6437 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:12:00 np0005465988 nova_compute[236126]: 2025-10-02 13:12:00.208 2 DEBUG oslo_concurrency.lockutils [req-87f69d08-dd62-4cf5-85c3-ca1121433f2e req-87be04af-f5d2-47ff-9164-c48451fc6437 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:12:00 np0005465988 nova_compute[236126]: 2025-10-02 13:12:00.208 2 DEBUG nova.compute.manager [req-87f69d08-dd62-4cf5-85c3-ca1121433f2e req-87be04af-f5d2-47ff-9164-c48451fc6437 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] No waiting events found dispatching network-vif-plugged-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:12:00 np0005465988 nova_compute[236126]: 2025-10-02 13:12:00.208 2 WARNING nova.compute.manager [req-87f69d08-dd62-4cf5-85c3-ca1121433f2e req-87be04af-f5d2-47ff-9164-c48451fc6437 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Received unexpected event network-vif-plugged-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc for instance with vm_state active and task_state None.#033[00m
Oct  2 09:12:00 np0005465988 nova_compute[236126]: 2025-10-02 13:12:00.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:00.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:01 np0005465988 nova_compute[236126]: 2025-10-02 13:12:01.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e419 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:12:02 np0005465988 nova_compute[236126]: 2025-10-02 13:12:02.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:02 np0005465988 NetworkManager[45041]: <info>  [1759410722.0237] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/442)
Oct  2 09:12:02 np0005465988 NetworkManager[45041]: <info>  [1759410722.0250] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/443)
Oct  2 09:12:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:02.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:02 np0005465988 nova_compute[236126]: 2025-10-02 13:12:02.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:02 np0005465988 ovn_controller[132601]: 2025-10-02T13:12:02Z|00994|binding|INFO|Releasing lport 2a2f4068-0f5b-4d26-b914-4d32097d8b55 from this chassis (sb_readonly=0)
Oct  2 09:12:02 np0005465988 nova_compute[236126]: 2025-10-02 13:12:02.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:02.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:03 np0005465988 nova_compute[236126]: 2025-10-02 13:12:03.089 2 DEBUG nova.compute.manager [req-6bc01954-4094-424e-a585-60586488fc14 req-beff0364-2b11-47c6-8536-6b8b5251ab43 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Received event network-changed-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:12:03 np0005465988 nova_compute[236126]: 2025-10-02 13:12:03.090 2 DEBUG nova.compute.manager [req-6bc01954-4094-424e-a585-60586488fc14 req-beff0364-2b11-47c6-8536-6b8b5251ab43 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Refreshing instance network info cache due to event network-changed-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:12:03 np0005465988 nova_compute[236126]: 2025-10-02 13:12:03.090 2 DEBUG oslo_concurrency.lockutils [req-6bc01954-4094-424e-a585-60586488fc14 req-beff0364-2b11-47c6-8536-6b8b5251ab43 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:12:03 np0005465988 nova_compute[236126]: 2025-10-02 13:12:03.090 2 DEBUG oslo_concurrency.lockutils [req-6bc01954-4094-424e-a585-60586488fc14 req-beff0364-2b11-47c6-8536-6b8b5251ab43 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:12:03 np0005465988 nova_compute[236126]: 2025-10-02 13:12:03.090 2 DEBUG nova.network.neutron [req-6bc01954-4094-424e-a585-60586488fc14 req-beff0364-2b11-47c6-8536-6b8b5251ab43 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Refreshing network info cache for port dd3cdaf8-55ea-439f-ae16-3363c19b1ccc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:12:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:04.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:04.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:05 np0005465988 nova_compute[236126]: 2025-10-02 13:12:05.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:05 np0005465988 nova_compute[236126]: 2025-10-02 13:12:05.756 2 DEBUG nova.network.neutron [req-6bc01954-4094-424e-a585-60586488fc14 req-beff0364-2b11-47c6-8536-6b8b5251ab43 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Updated VIF entry in instance network info cache for port dd3cdaf8-55ea-439f-ae16-3363c19b1ccc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:12:05 np0005465988 nova_compute[236126]: 2025-10-02 13:12:05.757 2 DEBUG nova.network.neutron [req-6bc01954-4094-424e-a585-60586488fc14 req-beff0364-2b11-47c6-8536-6b8b5251ab43 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Updating instance_info_cache with network_info: [{"id": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "address": "fa:16:3e:4f:12:72", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3cdaf8-55", "ovs_interfaceid": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:12:05 np0005465988 nova_compute[236126]: 2025-10-02 13:12:05.778 2 DEBUG oslo_concurrency.lockutils [req-6bc01954-4094-424e-a585-60586488fc14 req-beff0364-2b11-47c6-8536-6b8b5251ab43 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:12:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:06.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:06 np0005465988 nova_compute[236126]: 2025-10-02 13:12:06.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:06 np0005465988 nova_compute[236126]: 2025-10-02 13:12:06.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e419 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:12:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:06.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:08.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:08.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:10.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:10 np0005465988 nova_compute[236126]: 2025-10-02 13:12:10.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:10.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:10 np0005465988 ovn_controller[132601]: 2025-10-02T13:12:10Z|00129|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4f:12:72 10.100.0.14
Oct  2 09:12:10 np0005465988 ovn_controller[132601]: 2025-10-02T13:12:10Z|00130|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4f:12:72 10.100.0.14
Oct  2 09:12:11 np0005465988 nova_compute[236126]: 2025-10-02 13:12:11.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e419 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:12:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:12.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:12.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:14.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:14.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:15 np0005465988 podman[341676]: 2025-10-02 13:12:15.676428715 +0000 UTC m=+0.056467957 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 09:12:15 np0005465988 nova_compute[236126]: 2025-10-02 13:12:15.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:12:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:16.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:12:16 np0005465988 nova_compute[236126]: 2025-10-02 13:12:16.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e419 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:12:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:12:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:16.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:12:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:12:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:12:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:12:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:12:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:18.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:12:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:12:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:12:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:18.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:20.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:20 np0005465988 nova_compute[236126]: 2025-10-02 13:12:20.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e420 e420: 3 total, 3 up, 3 in
Oct  2 09:12:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:20.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:21 np0005465988 nova_compute[236126]: 2025-10-02 13:12:21.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e420 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:12:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e421 e421: 3 total, 3 up, 3 in
Oct  2 09:12:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:22.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:12:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:22.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:12:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:24.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:24.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:25 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:12:25 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:12:25 np0005465988 nova_compute[236126]: 2025-10-02 13:12:25.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:12:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:26.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:12:26 np0005465988 nova_compute[236126]: 2025-10-02 13:12:26.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:12:26 np0005465988 nova_compute[236126]: 2025-10-02 13:12:26.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:12:26 np0005465988 podman[341788]: 2025-10-02 13:12:26.562671569 +0000 UTC m=+0.086776680 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 09:12:26 np0005465988 podman[341787]: 2025-10-02 13:12:26.578949178 +0000 UTC m=+0.103961216 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:12:26 np0005465988 podman[341786]: 2025-10-02 13:12:26.623234153 +0000 UTC m=+0.151744231 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller)
Oct  2 09:12:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:26.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:12:27.420 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:12:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:12:27.421 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:12:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:12:27.422 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:12:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:28.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:28.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:29 np0005465988 nova_compute[236126]: 2025-10-02 13:12:29.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:12:29 np0005465988 nova_compute[236126]: 2025-10-02 13:12:29.547 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:12:29 np0005465988 nova_compute[236126]: 2025-10-02 13:12:29.547 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:12:29 np0005465988 nova_compute[236126]: 2025-10-02 13:12:29.548 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:12:29 np0005465988 nova_compute[236126]: 2025-10-02 13:12:29.548 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:12:29 np0005465988 nova_compute[236126]: 2025-10-02 13:12:29.548 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:12:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:12:30 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1454490702' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:12:30 np0005465988 nova_compute[236126]: 2025-10-02 13:12:30.104 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.556s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:12:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:30.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:30 np0005465988 nova_compute[236126]: 2025-10-02 13:12:30.185 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000d7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:12:30 np0005465988 nova_compute[236126]: 2025-10-02 13:12:30.186 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000d7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:12:30 np0005465988 nova_compute[236126]: 2025-10-02 13:12:30.374 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:12:30 np0005465988 nova_compute[236126]: 2025-10-02 13:12:30.375 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3807MB free_disk=20.967212677001953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:12:30 np0005465988 nova_compute[236126]: 2025-10-02 13:12:30.376 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:12:30 np0005465988 nova_compute[236126]: 2025-10-02 13:12:30.376 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:12:30 np0005465988 nova_compute[236126]: 2025-10-02 13:12:30.536 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 09:12:30 np0005465988 nova_compute[236126]: 2025-10-02 13:12:30.536 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:12:30 np0005465988 nova_compute[236126]: 2025-10-02 13:12:30.537 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:12:30 np0005465988 nova_compute[236126]: 2025-10-02 13:12:30.717 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:12:30 np0005465988 nova_compute[236126]: 2025-10-02 13:12:30.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:30.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:12:31 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3016931118' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:12:31 np0005465988 nova_compute[236126]: 2025-10-02 13:12:31.255 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.538s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:12:31 np0005465988 nova_compute[236126]: 2025-10-02 13:12:31.264 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:12:31 np0005465988 nova_compute[236126]: 2025-10-02 13:12:31.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:31 np0005465988 nova_compute[236126]: 2025-10-02 13:12:31.283 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:12:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:12:31 np0005465988 nova_compute[236126]: 2025-10-02 13:12:31.304 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:12:31 np0005465988 nova_compute[236126]: 2025-10-02 13:12:31.304 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:12:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:32.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:32.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:34.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:34 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #59. Immutable memtables: 15.
Oct  2 09:12:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:34.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:35 np0005465988 nova_compute[236126]: 2025-10-02 13:12:35.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:36.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:36 np0005465988 nova_compute[236126]: 2025-10-02 13:12:36.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:12:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:36.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:38.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:38.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:40.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:40 np0005465988 nova_compute[236126]: 2025-10-02 13:12:40.305 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:12:40 np0005465988 nova_compute[236126]: 2025-10-02 13:12:40.306 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:12:40 np0005465988 nova_compute[236126]: 2025-10-02 13:12:40.306 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:12:40 np0005465988 nova_compute[236126]: 2025-10-02 13:12:40.306 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:12:40 np0005465988 nova_compute[236126]: 2025-10-02 13:12:40.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:12:40 np0005465988 nova_compute[236126]: 2025-10-02 13:12:40.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:40.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:41 np0005465988 nova_compute[236126]: 2025-10-02 13:12:41.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:12:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:42.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:42 np0005465988 nova_compute[236126]: 2025-10-02 13:12:42.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:12:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:42.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:44.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:44 np0005465988 nova_compute[236126]: 2025-10-02 13:12:44.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:12:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:44.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:45 np0005465988 nova_compute[236126]: 2025-10-02 13:12:45.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:46.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:46 np0005465988 nova_compute[236126]: 2025-10-02 13:12:46.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:12:46 np0005465988 nova_compute[236126]: 2025-10-02 13:12:46.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:12:46 np0005465988 nova_compute[236126]: 2025-10-02 13:12:46.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:12:46 np0005465988 nova_compute[236126]: 2025-10-02 13:12:46.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:12:46 np0005465988 podman[341955]: 2025-10-02 13:12:46.542279822 +0000 UTC m=+0.071884491 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  2 09:12:46 np0005465988 nova_compute[236126]: 2025-10-02 13:12:46.630 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:12:46 np0005465988 nova_compute[236126]: 2025-10-02 13:12:46.630 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:12:46 np0005465988 nova_compute[236126]: 2025-10-02 13:12:46.631 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 09:12:46 np0005465988 nova_compute[236126]: 2025-10-02 13:12:46.631 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:12:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:46.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:48.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:48 np0005465988 nova_compute[236126]: 2025-10-02 13:12:48.178 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Updating instance_info_cache with network_info: [{"id": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "address": "fa:16:3e:4f:12:72", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3cdaf8-55", "ovs_interfaceid": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:12:48 np0005465988 nova_compute[236126]: 2025-10-02 13:12:48.194 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:12:48 np0005465988 nova_compute[236126]: 2025-10-02 13:12:48.195 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 09:12:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:48.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:50.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:50 np0005465988 nova_compute[236126]: 2025-10-02 13:12:50.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:50.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:51 np0005465988 nova_compute[236126]: 2025-10-02 13:12:51.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:12:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:52.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:52.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:54.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:54.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:55 np0005465988 nova_compute[236126]: 2025-10-02 13:12:55.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:56.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:56 np0005465988 nova_compute[236126]: 2025-10-02 13:12:56.189 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:12:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:12:56 np0005465988 nova_compute[236126]: 2025-10-02 13:12:56.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:12:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:56.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:12:57 np0005465988 podman[342032]: 2025-10-02 13:12:57.553323685 +0000 UTC m=+0.082448416 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:12:57 np0005465988 podman[342033]: 2025-10-02 13:12:57.557551737 +0000 UTC m=+0.076961548 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:12:57 np0005465988 podman[342031]: 2025-10-02 13:12:57.570142989 +0000 UTC m=+0.108452524 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 09:12:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:12:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:12:58.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:12:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:12:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:12:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:12:58.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:00.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:00 np0005465988 nova_compute[236126]: 2025-10-02 13:13:00.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:00.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:13:01 np0005465988 nova_compute[236126]: 2025-10-02 13:13:01.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:02.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:02.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:04.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:04.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:05 np0005465988 nova_compute[236126]: 2025-10-02 13:13:05.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:06.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:13:06 np0005465988 nova_compute[236126]: 2025-10-02 13:13:06.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:06.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:08.081 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=84, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=83) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:13:08 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:08.082 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:13:08 np0005465988 nova_compute[236126]: 2025-10-02 13:13:08.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:08.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:08.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:10 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:10.084 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '84'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:13:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:10.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:10 np0005465988 nova_compute[236126]: 2025-10-02 13:13:10.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:10.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:13:11 np0005465988 nova_compute[236126]: 2025-10-02 13:13:11.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:12.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:12.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:14.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:14 np0005465988 ovn_controller[132601]: 2025-10-02T13:13:14Z|00995|binding|INFO|Releasing lport 2a2f4068-0f5b-4d26-b914-4d32097d8b55 from this chassis (sb_readonly=0)
Oct  2 09:13:14 np0005465988 nova_compute[236126]: 2025-10-02 13:13:14.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:14.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:15 np0005465988 nova_compute[236126]: 2025-10-02 13:13:15.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:16.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:13:16 np0005465988 nova_compute[236126]: 2025-10-02 13:13:16.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:16.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:17 np0005465988 podman[342153]: 2025-10-02 13:13:17.536953309 +0000 UTC m=+0.068980567 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 09:13:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:13:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:18.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:13:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:19.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:20.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:20 np0005465988 nova_compute[236126]: 2025-10-02 13:13:20.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:21.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:13:21 np0005465988 nova_compute[236126]: 2025-10-02 13:13:21.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:22.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:23.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:23 np0005465988 nova_compute[236126]: 2025-10-02 13:13:23.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:24.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:25.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e422 e422: 3 total, 3 up, 3 in
Oct  2 09:13:25 np0005465988 nova_compute[236126]: 2025-10-02 13:13:25.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:13:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:26.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:13:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e422 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:13:26 np0005465988 nova_compute[236126]: 2025-10-02 13:13:26.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:26 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:13:26 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:13:26 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:13:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:27.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.077 2 DEBUG oslo_concurrency.lockutils [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.078 2 DEBUG oslo_concurrency.lockutils [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.078 2 DEBUG oslo_concurrency.lockutils [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.078 2 DEBUG oslo_concurrency.lockutils [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.079 2 DEBUG oslo_concurrency.lockutils [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.080 2 INFO nova.compute.manager [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Terminating instance#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.082 2 DEBUG nova.compute.manager [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 09:13:27 np0005465988 kernel: tapdd3cdaf8-55 (unregistering): left promiscuous mode
Oct  2 09:13:27 np0005465988 NetworkManager[45041]: <info>  [1759410807.1513] device (tapdd3cdaf8-55): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:13:27 np0005465988 ovn_controller[132601]: 2025-10-02T13:13:27Z|00996|binding|INFO|Releasing lport dd3cdaf8-55ea-439f-ae16-3363c19b1ccc from this chassis (sb_readonly=0)
Oct  2 09:13:27 np0005465988 ovn_controller[132601]: 2025-10-02T13:13:27Z|00997|binding|INFO|Setting lport dd3cdaf8-55ea-439f-ae16-3363c19b1ccc down in Southbound
Oct  2 09:13:27 np0005465988 ovn_controller[132601]: 2025-10-02T13:13:27Z|00998|binding|INFO|Removing iface tapdd3cdaf8-55 ovn-installed in OVS
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:27.177 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:12:72 10.100.0.14'], port_security=['fa:16:3e:4f:12:72 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '6de68f58-d90f-4bb4-ad9d-4bfa90dcb765', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-150508fb-9217-4982-8468-977a3b53121a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbbc6cb494464fd9b31f64c1ad75fa6b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6903cca5-a4c4-4f8f-a0c5-b42a4e15d418', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.235'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d5e391d-23a7-4f5a-8146-0f24141a74f2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=dd3cdaf8-55ea-439f-ae16-3363c19b1ccc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:13:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:27.179 142124 INFO neutron.agent.ovn.metadata.agent [-] Port dd3cdaf8-55ea-439f-ae16-3363c19b1ccc in datapath 150508fb-9217-4982-8468-977a3b53121a unbound from our chassis#033[00m
Oct  2 09:13:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:27.181 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 150508fb-9217-4982-8468-977a3b53121a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 09:13:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:27.182 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1320d9-2020-487b-a8fa-62b77f47ce78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:13:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:27.183 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-150508fb-9217-4982-8468-977a3b53121a namespace which is not needed anymore#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:27 np0005465988 systemd[1]: machine-qemu\x2d104\x2dinstance\x2d000000d7.scope: Deactivated successfully.
Oct  2 09:13:27 np0005465988 systemd[1]: machine-qemu\x2d104\x2dinstance\x2d000000d7.scope: Consumed 16.791s CPU time.
Oct  2 09:13:27 np0005465988 systemd-machined[192594]: Machine qemu-104-instance-000000d7 terminated.
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.324 2 INFO nova.virt.libvirt.driver [-] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Instance destroyed successfully.#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.324 2 DEBUG nova.objects.instance [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lazy-loading 'resources' on Instance uuid 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.347 2 DEBUG nova.virt.libvirt.vif [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:11:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-1895710534',display_name='tempest-TestVolumeBootPattern-volume-backed-server-1895710534',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-1895710534',id=215,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJyVmDplTyBUuGu+PxDHYy0cGpMO6l1bDwFxzUhHuP8Q5cQVtZgJ8nphmotjM+1tM1ayRodD11OEZs5etfn1F5kmg2y3i5ZSFmiZu7fk0vYLFVnhbokqYJHSZo3M77maaw==',key_name='tempest-keypair-281057978',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:11:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fbbc6cb494464fd9b31f64c1ad75fa6b',ramdisk_id='',reservation_id='r-ggrq1m9j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-1200415020',owner_user_name='tempest-TestVolumeBootPattern-1200415020-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:11:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c10de71fef00497981b8b7cec6a3fff3',uuid=6de68f58-d90f-4bb4-ad9d-4bfa90dcb765,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "address": "fa:16:3e:4f:12:72", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3cdaf8-55", "ovs_interfaceid": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.347 2 DEBUG nova.network.os_vif_util [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converting VIF {"id": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "address": "fa:16:3e:4f:12:72", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3cdaf8-55", "ovs_interfaceid": "dd3cdaf8-55ea-439f-ae16-3363c19b1ccc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.348 2 DEBUG nova.network.os_vif_util [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4f:12:72,bridge_name='br-int',has_traffic_filtering=True,id=dd3cdaf8-55ea-439f-ae16-3363c19b1ccc,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd3cdaf8-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.348 2 DEBUG os_vif [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:12:72,bridge_name='br-int',has_traffic_filtering=True,id=dd3cdaf8-55ea-439f-ae16-3363c19b1ccc,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd3cdaf8-55') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.351 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd3cdaf8-55, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.355 2 INFO os_vif [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:12:72,bridge_name='br-int',has_traffic_filtering=True,id=dd3cdaf8-55ea-439f-ae16-3363c19b1ccc,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd3cdaf8-55')#033[00m
Oct  2 09:13:27 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[341501]: [NOTICE]   (341505) : haproxy version is 2.8.14-c23fe91
Oct  2 09:13:27 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[341501]: [NOTICE]   (341505) : path to executable is /usr/sbin/haproxy
Oct  2 09:13:27 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[341501]: [ALERT]    (341505) : Current worker (341507) exited with code 143 (Terminated)
Oct  2 09:13:27 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[341501]: [WARNING]  (341505) : All workers exited. Exiting... (0)
Oct  2 09:13:27 np0005465988 systemd[1]: libpod-bf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f.scope: Deactivated successfully.
Oct  2 09:13:27 np0005465988 conmon[341501]: conmon bf0fc510684380ac4473 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-bf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f.scope/container/memory.events
Oct  2 09:13:27 np0005465988 podman[342333]: 2025-10-02 13:13:27.392993827 +0000 UTC m=+0.085181165 container died bf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 09:13:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:27.421 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:13:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:27.422 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:13:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:27.422 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:13:27 np0005465988 nova_compute[236126]: 2025-10-02 13:13:27.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:13:27 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f-userdata-shm.mount: Deactivated successfully.
Oct  2 09:13:27 np0005465988 systemd[1]: var-lib-containers-storage-overlay-9402cd6c6a236c7f8b0804a5407bf9c7d99ec134d4fee474d1e9f91a15cef072-merged.mount: Deactivated successfully.
Oct  2 09:13:27 np0005465988 podman[342389]: 2025-10-02 13:13:27.749307169 +0000 UTC m=+0.135601336 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  2 09:13:27 np0005465988 podman[342388]: 2025-10-02 13:13:27.753228702 +0000 UTC m=+0.143469543 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid)
Oct  2 09:13:27 np0005465988 podman[342390]: 2025-10-02 13:13:27.772393684 +0000 UTC m=+0.147122768 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  2 09:13:27 np0005465988 podman[342333]: 2025-10-02 13:13:27.996064346 +0000 UTC m=+0.688251714 container cleanup bf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 09:13:28 np0005465988 systemd[1]: libpod-conmon-bf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f.scope: Deactivated successfully.
Oct  2 09:13:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:28.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:28 np0005465988 nova_compute[236126]: 2025-10-02 13:13:28.324 2 DEBUG nova.compute.manager [req-81cb2c30-231f-4838-910e-4fd2948ec1f3 req-367bb7cc-8168-486b-bb0d-98c0285e1ad0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Received event network-vif-unplugged-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:13:28 np0005465988 nova_compute[236126]: 2025-10-02 13:13:28.325 2 DEBUG oslo_concurrency.lockutils [req-81cb2c30-231f-4838-910e-4fd2948ec1f3 req-367bb7cc-8168-486b-bb0d-98c0285e1ad0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:13:28 np0005465988 nova_compute[236126]: 2025-10-02 13:13:28.325 2 DEBUG oslo_concurrency.lockutils [req-81cb2c30-231f-4838-910e-4fd2948ec1f3 req-367bb7cc-8168-486b-bb0d-98c0285e1ad0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:13:28 np0005465988 nova_compute[236126]: 2025-10-02 13:13:28.326 2 DEBUG oslo_concurrency.lockutils [req-81cb2c30-231f-4838-910e-4fd2948ec1f3 req-367bb7cc-8168-486b-bb0d-98c0285e1ad0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:13:28 np0005465988 nova_compute[236126]: 2025-10-02 13:13:28.326 2 DEBUG nova.compute.manager [req-81cb2c30-231f-4838-910e-4fd2948ec1f3 req-367bb7cc-8168-486b-bb0d-98c0285e1ad0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] No waiting events found dispatching network-vif-unplugged-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:13:28 np0005465988 nova_compute[236126]: 2025-10-02 13:13:28.326 2 DEBUG nova.compute.manager [req-81cb2c30-231f-4838-910e-4fd2948ec1f3 req-367bb7cc-8168-486b-bb0d-98c0285e1ad0 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Received event network-vif-unplugged-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 09:13:28 np0005465988 podman[342459]: 2025-10-02 13:13:28.664665533 +0000 UTC m=+0.625595559 container remove bf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 09:13:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:28.674 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[48204d94-e8c2-4020-854c-14ec35d20b29]: (4, ('Thu Oct  2 01:13:27 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a (bf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f)\nbf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f\nThu Oct  2 01:13:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a (bf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f)\nbf0fc510684380ac4473c7e003236834259b8a3126e02c96106b09ba972e6c9f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:13:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:28.677 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[aeece9eb-d5b9-4617-8e8e-0d4ee2c093a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:13:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:28.678 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap150508fb-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:13:28 np0005465988 nova_compute[236126]: 2025-10-02 13:13:28.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:28 np0005465988 kernel: tap150508fb-90: left promiscuous mode
Oct  2 09:13:28 np0005465988 nova_compute[236126]: 2025-10-02 13:13:28.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:28.700 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cd8e2fce-a41a-4728-9a51-44416a8eb578]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:13:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:28.724 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2ac29653-43f6-40ef-a883-ef044c74ade7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:13:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:28.726 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf7a79a-76f6-4fb9-b5ac-4afeaf6cf351]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:13:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:28.754 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[baaa7aa7-8b86-4798-8f9a-090e7a60018a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 877795, 'reachable_time': 44852, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 342525, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:13:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:28.757 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-150508fb-9217-4982-8468-977a3b53121a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 09:13:28 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:13:28.758 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[a12d25da-5b53-4ca4-ad30-7eb387527500]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:13:28 np0005465988 systemd[1]: run-netns-ovnmeta\x2d150508fb\x2d9217\x2d4982\x2d8468\x2d977a3b53121a.mount: Deactivated successfully.
Oct  2 09:13:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:29.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:13:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.0 total, 600.0 interval#012Cumulative writes: 16K writes, 83K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.16 GB, 0.03 MB/s#012Cumulative WAL: 16K writes, 16K syncs, 1.00 writes per sync, written: 0.16 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1610 writes, 7856 keys, 1610 commit groups, 1.0 writes per commit group, ingest: 16.04 MB, 0.03 MB/s#012Interval WAL: 1610 writes, 1610 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     60.0      1.70              0.35        53    0.032       0      0       0.0       0.0#012  L6      1/0   12.74 MB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   5.2    114.4     98.0      5.46              1.88        52    0.105    391K    28K       0.0       0.0#012 Sum      1/0   12.74 MB   0.0      0.6     0.1      0.5       0.6      0.1       0.0   6.2     87.2     89.0      7.16              2.23       105    0.068    391K    28K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.8    137.0    139.0      0.60              0.26        12    0.050     62K   3141       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) 
Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   0.0    114.4     98.0      5.46              1.88        52    0.105    391K    28K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     60.1      1.70              0.35        52    0.033       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.0 total, 600.0 interval#012Flush(GB): cumulative 0.100, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.62 GB write, 0.11 MB/s write, 0.61 GB read, 0.10 MB/s read, 7.2 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.08 GB read, 0.14 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b3e97ad1f0#2 capacity: 304.00 MB usage: 67.91 MB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 0 last_secs: 0.000538 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(3884,65.08 MB,21.4073%) FilterBlock(105,1.05 MB,0.346169%) IndexBlock(105,1.78 MB,0.583885%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct  2 09:13:29 np0005465988 nova_compute[236126]: 2025-10-02 13:13:29.273 2 INFO nova.virt.libvirt.driver [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Deleting instance files /var/lib/nova/instances/6de68f58-d90f-4bb4-ad9d-4bfa90dcb765_del#033[00m
Oct  2 09:13:29 np0005465988 nova_compute[236126]: 2025-10-02 13:13:29.274 2 INFO nova.virt.libvirt.driver [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Deletion of /var/lib/nova/instances/6de68f58-d90f-4bb4-ad9d-4bfa90dcb765_del complete#033[00m
Oct  2 09:13:29 np0005465988 nova_compute[236126]: 2025-10-02 13:13:29.382 2 INFO nova.compute.manager [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Took 2.30 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 09:13:29 np0005465988 nova_compute[236126]: 2025-10-02 13:13:29.384 2 DEBUG oslo.service.loopingcall [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 09:13:29 np0005465988 nova_compute[236126]: 2025-10-02 13:13:29.385 2 DEBUG nova.compute.manager [-] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 09:13:29 np0005465988 nova_compute[236126]: 2025-10-02 13:13:29.385 2 DEBUG nova.network.neutron [-] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 09:13:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:30.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:30 np0005465988 nova_compute[236126]: 2025-10-02 13:13:30.482 2 DEBUG nova.compute.manager [req-a82e71d6-7b87-440b-afe3-762d38dab647 req-b68c982b-c210-4e3a-9640-38c5b3f9e787 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Received event network-vif-plugged-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:13:30 np0005465988 nova_compute[236126]: 2025-10-02 13:13:30.483 2 DEBUG oslo_concurrency.lockutils [req-a82e71d6-7b87-440b-afe3-762d38dab647 req-b68c982b-c210-4e3a-9640-38c5b3f9e787 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:13:30 np0005465988 nova_compute[236126]: 2025-10-02 13:13:30.483 2 DEBUG oslo_concurrency.lockutils [req-a82e71d6-7b87-440b-afe3-762d38dab647 req-b68c982b-c210-4e3a-9640-38c5b3f9e787 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:13:30 np0005465988 nova_compute[236126]: 2025-10-02 13:13:30.483 2 DEBUG oslo_concurrency.lockutils [req-a82e71d6-7b87-440b-afe3-762d38dab647 req-b68c982b-c210-4e3a-9640-38c5b3f9e787 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:13:30 np0005465988 nova_compute[236126]: 2025-10-02 13:13:30.484 2 DEBUG nova.compute.manager [req-a82e71d6-7b87-440b-afe3-762d38dab647 req-b68c982b-c210-4e3a-9640-38c5b3f9e787 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] No waiting events found dispatching network-vif-plugged-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:13:30 np0005465988 nova_compute[236126]: 2025-10-02 13:13:30.484 2 WARNING nova.compute.manager [req-a82e71d6-7b87-440b-afe3-762d38dab647 req-b68c982b-c210-4e3a-9640-38c5b3f9e787 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Received unexpected event network-vif-plugged-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc for instance with vm_state active and task_state deleting.#033[00m
Oct  2 09:13:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:31.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.171 2 DEBUG nova.network.neutron [-] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.198 2 INFO nova.compute.manager [-] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Took 1.81 seconds to deallocate network for instance.#033[00m
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.260 2 DEBUG nova.compute.manager [req-bb116215-f739-40cc-8b81-8a5a8ad0b94f req-9928bca2-7435-4103-b13c-3abb7a936bb3 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Received event network-vif-deleted-dd3cdaf8-55ea-439f-ae16-3363c19b1ccc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:13:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e423 e423: 3 total, 3 up, 3 in
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e423 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.388 2 INFO nova.compute.manager [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Took 0.19 seconds to detach 1 volumes for instance.#033[00m
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.390 2 DEBUG nova.compute.manager [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Deleting volume: cce428bb-67ad-45c7-9e76-52ebd4f984b0 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.518 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.519 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.519 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.520 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.520 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.669 2 DEBUG oslo_concurrency.lockutils [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.670 2 DEBUG oslo_concurrency.lockutils [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:13:31 np0005465988 nova_compute[236126]: 2025-10-02 13:13:31.820 2 DEBUG oslo_concurrency.processutils [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:13:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:13:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/408768300' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.118 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.598s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:13:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:32.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:32 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:13:32 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:13:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:13:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/120125028' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.309 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.310 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3945MB free_disk=20.988113403320312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.310 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.317 2 DEBUG oslo_concurrency.processutils [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.322 2 DEBUG nova.compute.provider_tree [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.344 2 DEBUG nova.scheduler.client.report [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.381 2 DEBUG oslo_concurrency.lockutils [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.711s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.384 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.417 2 INFO nova.scheduler.client.report [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Deleted allocations for instance 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765#033[00m
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.457 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.458 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.482 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.532 2 DEBUG oslo_concurrency.lockutils [None req-66107922-4e40-4984-b98f-06cc77ab3956 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "6de68f58-d90f-4bb4-ad9d-4bfa90dcb765" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:13:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:13:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3420663888' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.926 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:13:32 np0005465988 nova_compute[236126]: 2025-10-02 13:13:32.932 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:13:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:33.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:33 np0005465988 nova_compute[236126]: 2025-10-02 13:13:33.770 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:13:33 np0005465988 nova_compute[236126]: 2025-10-02 13:13:33.820 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:13:33 np0005465988 nova_compute[236126]: 2025-10-02 13:13:33.821 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:13:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:34.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:35.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:36.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:36 np0005465988 nova_compute[236126]: 2025-10-02 13:13:36.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e423 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:13:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e424 e424: 3 total, 3 up, 3 in
Oct  2 09:13:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:37.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:37 np0005465988 nova_compute[236126]: 2025-10-02 13:13:37.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:38.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:39.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:40.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:40 np0005465988 nova_compute[236126]: 2025-10-02 13:13:40.822 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:13:40 np0005465988 nova_compute[236126]: 2025-10-02 13:13:40.823 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:13:40 np0005465988 nova_compute[236126]: 2025-10-02 13:13:40.823 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:13:40 np0005465988 nova_compute[236126]: 2025-10-02 13:13:40.823 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:13:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:41.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 e425: 3 total, 3 up, 3 in
Oct  2 09:13:41 np0005465988 nova_compute[236126]: 2025-10-02 13:13:41.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:13:41 np0005465988 nova_compute[236126]: 2025-10-02 13:13:41.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:13:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:42.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:42 np0005465988 nova_compute[236126]: 2025-10-02 13:13:42.323 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759410807.321742, 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:13:42 np0005465988 nova_compute[236126]: 2025-10-02 13:13:42.324 2 INFO nova.compute.manager [-] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] VM Stopped (Lifecycle Event)#033[00m
Oct  2 09:13:42 np0005465988 nova_compute[236126]: 2025-10-02 13:13:42.356 2 DEBUG nova.compute.manager [None req-5824a571-5932-47c5-a4e7-bd2484667e25 - - - - - -] [instance: 6de68f58-d90f-4bb4-ad9d-4bfa90dcb765] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:13:42 np0005465988 nova_compute[236126]: 2025-10-02 13:13:42.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:43.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:13:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:44.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:13:44 np0005465988 nova_compute[236126]: 2025-10-02 13:13:44.476 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:13:44 np0005465988 nova_compute[236126]: 2025-10-02 13:13:44.477 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:13:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:45.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:46.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:46 np0005465988 nova_compute[236126]: 2025-10-02 13:13:46.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:13:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:47.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:47 np0005465988 nova_compute[236126]: 2025-10-02 13:13:47.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:47 np0005465988 nova_compute[236126]: 2025-10-02 13:13:47.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:13:47 np0005465988 nova_compute[236126]: 2025-10-02 13:13:47.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:13:47 np0005465988 nova_compute[236126]: 2025-10-02 13:13:47.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:13:47 np0005465988 nova_compute[236126]: 2025-10-02 13:13:47.498 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:13:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:48.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:48 np0005465988 podman[342678]: 2025-10-02 13:13:48.526593962 +0000 UTC m=+0.060517864 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:13:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:49.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:50.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:51.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:51 np0005465988 nova_compute[236126]: 2025-10-02 13:13:51.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:13:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:52.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:52 np0005465988 nova_compute[236126]: 2025-10-02 13:13:52.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:13:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:53.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:13:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:54.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:55.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #169. Immutable memtables: 0.
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.185436) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 107] Flushing memtable with next log file: 169
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410836185532, "job": 107, "event": "flush_started", "num_memtables": 1, "num_entries": 1576, "num_deletes": 252, "total_data_size": 3479351, "memory_usage": 3529640, "flush_reason": "Manual Compaction"}
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 107] Level-0 flush table #170: started
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410836292193, "cf_name": "default", "job": 107, "event": "table_file_creation", "file_number": 170, "file_size": 1431528, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 82279, "largest_seqno": 83850, "table_properties": {"data_size": 1426258, "index_size": 2537, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 13959, "raw_average_key_size": 21, "raw_value_size": 1414718, "raw_average_value_size": 2153, "num_data_blocks": 112, "num_entries": 657, "num_filter_entries": 657, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759410714, "oldest_key_time": 1759410714, "file_creation_time": 1759410836, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 170, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 107] Flush lasted 106808 microseconds, and 4521 cpu microseconds.
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:13:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:13:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:56.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.292253) [db/flush_job.cc:967] [default] [JOB 107] Level-0 flush table #170: 1431528 bytes OK
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.292280) [db/memtable_list.cc:519] [default] Level-0 commit table #170 started
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.300173) [db/memtable_list.cc:722] [default] Level-0 commit table #170: memtable #1 done
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.300196) EVENT_LOG_v1 {"time_micros": 1759410836300189, "job": 107, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.300218) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 107] Try to delete WAL files size 3472099, prev total WAL file size 3472099, number of live WAL files 2.
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000166.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.301206) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032373632' seq:72057594037927935, type:22 .. '6D6772737461740033303133' seq:0, type:0; will stop at (end)
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 108] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 107 Base level 0, inputs: [170(1397KB)], [168(12MB)]
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410836301275, "job": 108, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [170], "files_L6": [168], "score": -1, "input_data_size": 14795620, "oldest_snapshot_seqno": -1}
Oct  2 09:13:56 np0005465988 nova_compute[236126]: 2025-10-02 13:13:56.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 108] Generated table #171: 10377 keys, 11784059 bytes, temperature: kUnknown
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410836478204, "cf_name": "default", "job": 108, "event": "table_file_creation", "file_number": 171, "file_size": 11784059, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11719416, "index_size": 37586, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25989, "raw_key_size": 273460, "raw_average_key_size": 26, "raw_value_size": 11540145, "raw_average_value_size": 1112, "num_data_blocks": 1425, "num_entries": 10377, "num_filter_entries": 10377, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759410836, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 171, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.478585) [db/compaction/compaction_job.cc:1663] [default] [JOB 108] Compacted 1@0 + 1@6 files to L6 => 11784059 bytes
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.486780) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 83.6 rd, 66.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 12.7 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(18.6) write-amplify(8.2) OK, records in: 10845, records dropped: 468 output_compression: NoCompression
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.486810) EVENT_LOG_v1 {"time_micros": 1759410836486798, "job": 108, "event": "compaction_finished", "compaction_time_micros": 177006, "compaction_time_cpu_micros": 32053, "output_level": 6, "num_output_files": 1, "total_output_size": 11784059, "num_input_records": 10845, "num_output_records": 10377, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000170.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410836487295, "job": 108, "event": "table_file_deletion", "file_number": 170}
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000168.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410836489783, "job": 108, "event": "table_file_deletion", "file_number": 168}
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.301086) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.489889) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.489897) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.489899) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.489900) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:13:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:13:56.489903) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:13:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:57.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:57 np0005465988 nova_compute[236126]: 2025-10-02 13:13:57.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:13:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:13:58.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:13:58 np0005465988 podman[342731]: 2025-10-02 13:13:58.546268114 +0000 UTC m=+0.070197423 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 09:13:58 np0005465988 podman[342732]: 2025-10-02 13:13:58.553759259 +0000 UTC m=+0.071650644 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct  2 09:13:58 np0005465988 podman[342730]: 2025-10-02 13:13:58.580354225 +0000 UTC m=+0.107717603 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 09:13:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:13:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:13:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:13:59.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:00.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:01.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:01 np0005465988 nova_compute[236126]: 2025-10-02 13:14:01.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:14:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:02.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:02 np0005465988 nova_compute[236126]: 2025-10-02 13:14:02.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:03.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:14:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:04.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:14:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:05.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:06.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:14:06 np0005465988 nova_compute[236126]: 2025-10-02 13:14:06.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:14:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6001.0 total, 600.0 interval#012Cumulative writes: 77K writes, 314K keys, 77K commit groups, 1.0 writes per commit group, ingest: 0.31 GB, 0.05 MB/s#012Cumulative WAL: 77K writes, 28K syncs, 2.70 writes per sync, written: 0.31 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5288 writes, 20K keys, 5288 commit groups, 1.0 writes per commit group, ingest: 21.34 MB, 0.04 MB/s#012Interval WAL: 5288 writes, 2082 syncs, 2.54 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6001.0 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x56127d005770#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6001.0 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x56127d005770#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6001.0 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 me
Oct  2 09:14:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:07.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:07 np0005465988 nova_compute[236126]: 2025-10-02 13:14:07.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:08.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:09.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:14:09.087 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=85, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=84) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:14:09 np0005465988 nova_compute[236126]: 2025-10-02 13:14:09.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:14:09.090 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:14:09 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:14:09.091 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '85'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:14:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:10.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:14:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:11.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:14:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:14:11 np0005465988 nova_compute[236126]: 2025-10-02 13:14:11.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:12.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:12 np0005465988 nova_compute[236126]: 2025-10-02 13:14:12.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:13.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:14.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:15.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:16.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:14:16 np0005465988 nova_compute[236126]: 2025-10-02 13:14:16.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:17.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:17 np0005465988 nova_compute[236126]: 2025-10-02 13:14:17.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:18.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:19.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:19 np0005465988 podman[342855]: 2025-10-02 13:14:19.545422026 +0000 UTC m=+0.074679942 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 09:14:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:20.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:21.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:14:21 np0005465988 nova_compute[236126]: 2025-10-02 13:14:21.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:22.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:22 np0005465988 nova_compute[236126]: 2025-10-02 13:14:22.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:23.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:24.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:25.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:26.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:14:26 np0005465988 nova_compute[236126]: 2025-10-02 13:14:26.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:27.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:27 np0005465988 nova_compute[236126]: 2025-10-02 13:14:27.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:14:27.422 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:14:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:14:27.422 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:14:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:14:27.423 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:14:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:28.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:28 np0005465988 podman[342905]: 2025-10-02 13:14:28.958973317 +0000 UTC m=+0.102459332 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  2 09:14:28 np0005465988 podman[342903]: 2025-10-02 13:14:28.963307422 +0000 UTC m=+0.122101437 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  2 09:14:28 np0005465988 podman[342904]: 2025-10-02 13:14:28.96671572 +0000 UTC m=+0.116564608 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 09:14:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:29.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:29 np0005465988 nova_compute[236126]: 2025-10-02 13:14:29.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:14:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:30.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:31.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:14:31 np0005465988 nova_compute[236126]: 2025-10-02 13:14:31.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:32.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:32 np0005465988 nova_compute[236126]: 2025-10-02 13:14:32.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:33.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:33 np0005465988 nova_compute[236126]: 2025-10-02 13:14:33.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:14:33 np0005465988 nova_compute[236126]: 2025-10-02 13:14:33.576 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:14:33 np0005465988 nova_compute[236126]: 2025-10-02 13:14:33.576 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:14:33 np0005465988 nova_compute[236126]: 2025-10-02 13:14:33.576 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:14:33 np0005465988 nova_compute[236126]: 2025-10-02 13:14:33.577 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:14:33 np0005465988 nova_compute[236126]: 2025-10-02 13:14:33.577 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:14:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:14:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1268725376' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:14:34 np0005465988 nova_compute[236126]: 2025-10-02 13:14:34.030 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:14:34 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:14:34 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:14:34 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:14:34 np0005465988 nova_compute[236126]: 2025-10-02 13:14:34.264 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:14:34 np0005465988 nova_compute[236126]: 2025-10-02 13:14:34.266 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3992MB free_disk=20.949909210205078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:14:34 np0005465988 nova_compute[236126]: 2025-10-02 13:14:34.266 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:14:34 np0005465988 nova_compute[236126]: 2025-10-02 13:14:34.266 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:14:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:34.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:34 np0005465988 nova_compute[236126]: 2025-10-02 13:14:34.463 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:14:34 np0005465988 nova_compute[236126]: 2025-10-02 13:14:34.464 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:14:34 np0005465988 nova_compute[236126]: 2025-10-02 13:14:34.481 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:14:34 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:14:34 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1851246415' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:14:34 np0005465988 nova_compute[236126]: 2025-10-02 13:14:34.984 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:14:34 np0005465988 nova_compute[236126]: 2025-10-02 13:14:34.992 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:14:35 np0005465988 nova_compute[236126]: 2025-10-02 13:14:35.015 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:14:35 np0005465988 nova_compute[236126]: 2025-10-02 13:14:35.017 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:14:35 np0005465988 nova_compute[236126]: 2025-10-02 13:14:35.017 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:14:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:35.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:36.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:14:36 np0005465988 nova_compute[236126]: 2025-10-02 13:14:36.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:37.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:37 np0005465988 nova_compute[236126]: 2025-10-02 13:14:37.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:38.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:39.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:39 np0005465988 nova_compute[236126]: 2025-10-02 13:14:39.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:14:39 np0005465988 nova_compute[236126]: 2025-10-02 13:14:39.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:14:39 np0005465988 nova_compute[236126]: 2025-10-02 13:14:39.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 09:14:39 np0005465988 nova_compute[236126]: 2025-10-02 13:14:39.646 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 09:14:39 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:14:39 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:14:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:40.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:41.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:14:41 np0005465988 nova_compute[236126]: 2025-10-02 13:14:41.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:42.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:14:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1019955647' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:14:42 np0005465988 nova_compute[236126]: 2025-10-02 13:14:42.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:14:42 np0005465988 nova_compute[236126]: 2025-10-02 13:14:42.641 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:14:42 np0005465988 nova_compute[236126]: 2025-10-02 13:14:42.642 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:14:42 np0005465988 nova_compute[236126]: 2025-10-02 13:14:42.642 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:14:42 np0005465988 nova_compute[236126]: 2025-10-02 13:14:42.642 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:14:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:43.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:14:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:44.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.474 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.474 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.475 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.475 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.476 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.476 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.571 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.572 2 WARNING nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.572 2 WARNING nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.573 2 WARNING nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.573 2 INFO nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Removable base files: /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.574 2 INFO nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.574 2 INFO nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/0b6beffbf0661cbda2e4327409b903fc2160c26f
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.575 2 INFO nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/5133c8c7459ce4fa1cf043a638fc1b5c66ed8609
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.575 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.575 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Oct  2 09:14:44 np0005465988 nova_compute[236126]: 2025-10-02 13:14:44.576 2 DEBUG nova.virt.libvirt.imagecache [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Oct  2 09:14:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:45.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:14:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:46.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:46 np0005465988 nova_compute[236126]: 2025-10-02 13:14:46.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:14:46 np0005465988 nova_compute[236126]: 2025-10-02 13:14:46.576 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:14:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:47.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:47 np0005465988 nova_compute[236126]: 2025-10-02 13:14:47.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:14:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:48.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:49.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:49 np0005465988 nova_compute[236126]: 2025-10-02 13:14:49.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:14:49 np0005465988 nova_compute[236126]: 2025-10-02 13:14:49.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 09:14:49 np0005465988 nova_compute[236126]: 2025-10-02 13:14:49.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 09:14:50 np0005465988 nova_compute[236126]: 2025-10-02 13:14:50.061 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  2 09:14:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:50.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:50 np0005465988 podman[343275]: 2025-10-02 13:14:50.533302412 +0000 UTC m=+0.063381706 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 09:14:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:51.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:14:51 np0005465988 nova_compute[236126]: 2025-10-02 13:14:51.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:14:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:14:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:52.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:14:52 np0005465988 nova_compute[236126]: 2025-10-02 13:14:52.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:14:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:53.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:54.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:54 np0005465988 nova_compute[236126]: 2025-10-02 13:14:54.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:14:54 np0005465988 nova_compute[236126]: 2025-10-02 13:14:54.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:14:55 np0005465988 nova_compute[236126]: 2025-10-02 13:14:55.056 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:14:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:55.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 09:14:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2527150884' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 09:14:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 09:14:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2527150884' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 09:14:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:14:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:56.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:56 np0005465988 nova_compute[236126]: 2025-10-02 13:14:56.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:14:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:57.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:57 np0005465988 nova_compute[236126]: 2025-10-02 13:14:57.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:14:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:14:58.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:14:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:14:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:14:59.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:14:59 np0005465988 podman[343301]: 2025-10-02 13:14:59.531748959 +0000 UTC m=+0.062691976 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 09:14:59 np0005465988 podman[343302]: 2025-10-02 13:14:59.554507045 +0000 UTC m=+0.074634951 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 09:14:59 np0005465988 podman[343300]: 2025-10-02 13:14:59.563170104 +0000 UTC m=+0.099559488 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 09:15:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:00.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:01.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:15:01 np0005465988 nova_compute[236126]: 2025-10-02 13:15:01.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:15:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:15:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:02.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:15:02 np0005465988 nova_compute[236126]: 2025-10-02 13:15:02.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:15:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:03.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:04.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:05.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:15:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:06.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:06 np0005465988 nova_compute[236126]: 2025-10-02 13:15:06.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:15:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:15:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:07.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:15:07 np0005465988 nova_compute[236126]: 2025-10-02 13:15:07.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:15:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:08.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:09.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:10.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:15:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:11.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:15:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:15:11 np0005465988 nova_compute[236126]: 2025-10-02 13:15:11.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:15:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:15:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:12.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:15:12 np0005465988 nova_compute[236126]: 2025-10-02 13:15:12.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:15:12 np0005465988 nova_compute[236126]: 2025-10-02 13:15:12.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:15:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:13.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:13.253 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=86, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=85) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:15:13 np0005465988 nova_compute[236126]: 2025-10-02 13:15:13.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:13 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:13.254 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:15:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:15:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:14.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:15:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e426 e426: 3 total, 3 up, 3 in
Oct  2 09:15:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:15.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:15:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:15:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:16.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:15:16 np0005465988 nova_compute[236126]: 2025-10-02 13:15:16.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:15:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:17.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:15:17 np0005465988 nova_compute[236126]: 2025-10-02 13:15:17.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:18.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:15:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:19.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:15:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:20.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:15:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:21.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #172. Immutable memtables: 0.
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.223439) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 109] Flushing memtable with next log file: 172
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410921223479, "job": 109, "event": "flush_started", "num_memtables": 1, "num_entries": 1077, "num_deletes": 251, "total_data_size": 2241817, "memory_usage": 2270800, "flush_reason": "Manual Compaction"}
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 109] Level-0 flush table #173: started
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410921242451, "cf_name": "default", "job": 109, "event": "table_file_creation", "file_number": 173, "file_size": 1478566, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 83855, "largest_seqno": 84927, "table_properties": {"data_size": 1473836, "index_size": 2317, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10624, "raw_average_key_size": 19, "raw_value_size": 1464188, "raw_average_value_size": 2731, "num_data_blocks": 103, "num_entries": 536, "num_filter_entries": 536, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759410837, "oldest_key_time": 1759410837, "file_creation_time": 1759410921, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 173, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 109] Flush lasted 19072 microseconds, and 4934 cpu microseconds.
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.242501) [db/flush_job.cc:967] [default] [JOB 109] Level-0 flush table #173: 1478566 bytes OK
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.242534) [db/memtable_list.cc:519] [default] Level-0 commit table #173 started
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.246947) [db/memtable_list.cc:722] [default] Level-0 commit table #173: memtable #1 done
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.246963) EVENT_LOG_v1 {"time_micros": 1759410921246958, "job": 109, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.246987) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 109] Try to delete WAL files size 2236595, prev total WAL file size 2236595, number of live WAL files 2.
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000169.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.247731) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037323739' seq:72057594037927935, type:22 .. '7061786F730037353331' seq:0, type:0; will stop at (end)
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 110] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 109 Base level 0, inputs: [173(1443KB)], [171(11MB)]
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410921247814, "job": 110, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [173], "files_L6": [171], "score": -1, "input_data_size": 13262625, "oldest_snapshot_seqno": -1}
Oct  2 09:15:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:21.256 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '86'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 110] Generated table #174: 10394 keys, 11301399 bytes, temperature: kUnknown
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410921352108, "cf_name": "default", "job": 110, "event": "table_file_creation", "file_number": 174, "file_size": 11301399, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11237039, "index_size": 37239, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26053, "raw_key_size": 274514, "raw_average_key_size": 26, "raw_value_size": 11057910, "raw_average_value_size": 1063, "num_data_blocks": 1405, "num_entries": 10394, "num_filter_entries": 10394, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759410921, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 174, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.352441) [db/compaction/compaction_job.cc:1663] [default] [JOB 110] Compacted 1@0 + 1@6 files to L6 => 11301399 bytes
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.354585) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 127.1 rd, 108.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 11.2 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(16.6) write-amplify(7.6) OK, records in: 10913, records dropped: 519 output_compression: NoCompression
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.354603) EVENT_LOG_v1 {"time_micros": 1759410921354594, "job": 110, "event": "compaction_finished", "compaction_time_micros": 104378, "compaction_time_cpu_micros": 39015, "output_level": 6, "num_output_files": 1, "total_output_size": 11301399, "num_input_records": 10913, "num_output_records": 10394, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000173.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410921354966, "job": 110, "event": "table_file_deletion", "file_number": 173}
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000171.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759410921357336, "job": 110, "event": "table_file_deletion", "file_number": 171}
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.247638) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.357405) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.357410) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.357412) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.357414) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:15:21.357416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:15:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:15:21 np0005465988 nova_compute[236126]: 2025-10-02 13:15:21.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:21 np0005465988 podman[343427]: 2025-10-02 13:15:21.518927983 +0000 UTC m=+0.055070384 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:15:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:22.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:22 np0005465988 nova_compute[236126]: 2025-10-02 13:15:22.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:23.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:23 np0005465988 nova_compute[236126]: 2025-10-02 13:15:23.832 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "3a7474e0-ede0-4d42-adf1-28b16d03074b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:15:23 np0005465988 nova_compute[236126]: 2025-10-02 13:15:23.833 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:15:23 np0005465988 nova_compute[236126]: 2025-10-02 13:15:23.867 2 DEBUG nova.compute.manager [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 09:15:23 np0005465988 nova_compute[236126]: 2025-10-02 13:15:23.985 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:15:23 np0005465988 nova_compute[236126]: 2025-10-02 13:15:23.986 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:15:23 np0005465988 nova_compute[236126]: 2025-10-02 13:15:23.996 2 DEBUG nova.virt.hardware [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 09:15:23 np0005465988 nova_compute[236126]: 2025-10-02 13:15:23.996 2 INFO nova.compute.claims [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 09:15:24 np0005465988 nova_compute[236126]: 2025-10-02 13:15:24.117 2 DEBUG oslo_concurrency.processutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:15:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:24.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:15:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/257275' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:15:24 np0005465988 nova_compute[236126]: 2025-10-02 13:15:24.574 2 DEBUG oslo_concurrency.processutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:15:24 np0005465988 nova_compute[236126]: 2025-10-02 13:15:24.580 2 DEBUG nova.compute.provider_tree [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:15:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:25.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.158 2 DEBUG nova.scheduler.client.report [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.287 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.288 2 DEBUG nova.compute.manager [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 09:15:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.400 2 DEBUG nova.compute.manager [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.401 2 DEBUG nova.network.neutron [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 09:15:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:26.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.426 2 INFO nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.449 2 DEBUG nova.compute.manager [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.513 2 INFO nova.virt.block_device [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Booting with volume fb042846-a135-4c8d-8115-d7e363fd7891 at /dev/vda#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.656 2 DEBUG nova.policy [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c10de71fef00497981b8b7cec6a3fff3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fbbc6cb494464fd9b31f64c1ad75fa6b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.765 2 DEBUG os_brick.utils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.766 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.780 5711 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.780 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[cf98d01f-38a4-41a5-ad54-66a1adf2a2d2]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.781 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.792 5711 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.792 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[c2aee8f5-97d3-4a66-a99b-ee8e73e41a09]: (4, ('InitiatorName=iqn.1994-05.com.redhat:7daf2c659dfe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.793 5711 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.808 5711 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.808 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[82b500ab-9bff-416b-9d91-361e8202220b]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.810 5711 DEBUG oslo.privsep.daemon [-] privsep: reply[ab6927e7-a791-4ab9-9777-b6c9d84806dc]: (4, '93278213-1c3c-4fb4-9fd1-d481e0b53ce1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.810 2 DEBUG oslo_concurrency.processutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.860 2 DEBUG oslo_concurrency.processutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "nvme version" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.863 2 DEBUG os_brick.initiator.connectors.lightos [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.863 2 DEBUG os_brick.initiator.connectors.lightos [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.864 2 DEBUG os_brick.initiator.connectors.lightos [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.864 2 DEBUG os_brick.utils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] <== get_connector_properties: return (98ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:7daf2c659dfe', 'do_local_attach': False, 'nvme_hostid': '2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'system uuid': '93278213-1c3c-4fb4-9fd1-d481e0b53ce1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:2f7d2450-18ac-43a6-80ee-9caa4a7736e0', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Oct  2 09:15:26 np0005465988 nova_compute[236126]: 2025-10-02 13:15:26.864 2 DEBUG nova.virt.block_device [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Updating existing volume attachment record: bd85fdc1-9d9d-491a-942f-b625f9757354 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Oct  2 09:15:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:15:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:27.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:15:27 np0005465988 nova_compute[236126]: 2025-10-02 13:15:27.275 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:15:27 np0005465988 nova_compute[236126]: 2025-10-02 13:15:27.276 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 09:15:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:27.423 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:15:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:27.423 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:15:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:27.423 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:15:27 np0005465988 nova_compute[236126]: 2025-10-02 13:15:27.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:27 np0005465988 nova_compute[236126]: 2025-10-02 13:15:27.719 2 DEBUG nova.network.neutron [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Successfully created port: a3e548ad-dc1c-45a0-bb2c-54476fc1f716 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 09:15:28 np0005465988 nova_compute[236126]: 2025-10-02 13:15:28.332 2 DEBUG nova.compute.manager [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 09:15:28 np0005465988 nova_compute[236126]: 2025-10-02 13:15:28.333 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 09:15:28 np0005465988 nova_compute[236126]: 2025-10-02 13:15:28.334 2 INFO nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Creating image(s)#033[00m
Oct  2 09:15:28 np0005465988 nova_compute[236126]: 2025-10-02 13:15:28.334 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  2 09:15:28 np0005465988 nova_compute[236126]: 2025-10-02 13:15:28.335 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Ensure instance console log exists: /var/lib/nova/instances/3a7474e0-ede0-4d42-adf1-28b16d03074b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 09:15:28 np0005465988 nova_compute[236126]: 2025-10-02 13:15:28.335 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:15:28 np0005465988 nova_compute[236126]: 2025-10-02 13:15:28.335 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:15:28 np0005465988 nova_compute[236126]: 2025-10-02 13:15:28.336 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:15:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:15:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:28.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:15:28 np0005465988 nova_compute[236126]: 2025-10-02 13:15:28.937 2 DEBUG nova.network.neutron [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Successfully updated port: a3e548ad-dc1c-45a0-bb2c-54476fc1f716 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 09:15:28 np0005465988 nova_compute[236126]: 2025-10-02 13:15:28.971 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:15:28 np0005465988 nova_compute[236126]: 2025-10-02 13:15:28.972 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquired lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:15:28 np0005465988 nova_compute[236126]: 2025-10-02 13:15:28.972 2 DEBUG nova.network.neutron [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 09:15:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:29.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:29 np0005465988 nova_compute[236126]: 2025-10-02 13:15:29.149 2 DEBUG nova.compute.manager [req-05a4b827-ca57-470e-bc0d-739e1edfc10b req-c2df3991-6b11-472e-bc0c-f4345bce9b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Received event network-changed-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:15:29 np0005465988 nova_compute[236126]: 2025-10-02 13:15:29.149 2 DEBUG nova.compute.manager [req-05a4b827-ca57-470e-bc0d-739e1edfc10b req-c2df3991-6b11-472e-bc0c-f4345bce9b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Refreshing instance network info cache due to event network-changed-a3e548ad-dc1c-45a0-bb2c-54476fc1f716. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:15:29 np0005465988 nova_compute[236126]: 2025-10-02 13:15:29.149 2 DEBUG oslo_concurrency.lockutils [req-05a4b827-ca57-470e-bc0d-739e1edfc10b req-c2df3991-6b11-472e-bc0c-f4345bce9b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:15:29 np0005465988 nova_compute[236126]: 2025-10-02 13:15:29.233 2 DEBUG nova.network.neutron [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 09:15:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:15:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:30.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:15:30 np0005465988 podman[343530]: 2025-10-02 13:15:30.538010833 +0000 UTC m=+0.062168456 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.3)
Oct  2 09:15:30 np0005465988 podman[343531]: 2025-10-02 13:15:30.548652137 +0000 UTC m=+0.074789057 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 09:15:30 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 09:15:30 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 09:15:30 np0005465988 podman[343529]: 2025-10-02 13:15:30.576510923 +0000 UTC m=+0.098319109 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.053 2 DEBUG nova.network.neutron [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Updating instance_info_cache with network_info: [{"id": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "address": "fa:16:3e:59:38:d6", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3e548ad-dc", "ovs_interfaceid": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.090 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Releasing lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.091 2 DEBUG nova.compute.manager [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Instance network_info: |[{"id": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "address": "fa:16:3e:59:38:d6", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3e548ad-dc", "ovs_interfaceid": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.091 2 DEBUG oslo_concurrency.lockutils [req-05a4b827-ca57-470e-bc0d-739e1edfc10b req-c2df3991-6b11-472e-bc0c-f4345bce9b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.091 2 DEBUG nova.network.neutron [req-05a4b827-ca57-470e-bc0d-739e1edfc10b req-c2df3991-6b11-472e-bc0c-f4345bce9b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Refreshing network info cache for port a3e548ad-dc1c-45a0-bb2c-54476fc1f716 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.095 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Start _get_guest_xml network_info=[{"id": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "address": "fa:16:3e:59:38:d6", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3e548ad-dc", "ovs_interfaceid": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': 'bd85fdc1-9d9d-491a-942f-b625f9757354', 'disk_bus': 'virtio', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-fb042846-a135-4c8d-8115-d7e363fd7891', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'fb042846-a135-4c8d-8115-d7e363fd7891', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '3a7474e0-ede0-4d42-adf1-28b16d03074b', 'attached_at': '', 'detached_at': '', 'volume_id': 'fb042846-a135-4c8d-8115-d7e363fd7891', 'serial': 'fb042846-a135-4c8d-8115-d7e363fd7891'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.102 2 WARNING nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.106 2 DEBUG nova.virt.libvirt.host [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.107 2 DEBUG nova.virt.libvirt.host [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.114 2 DEBUG nova.virt.libvirt.host [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.115 2 DEBUG nova.virt.libvirt.host [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.116 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.116 2 DEBUG nova.virt.hardware [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.117 2 DEBUG nova.virt.hardware [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.117 2 DEBUG nova.virt.hardware [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.117 2 DEBUG nova.virt.hardware [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.117 2 DEBUG nova.virt.hardware [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.118 2 DEBUG nova.virt.hardware [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.118 2 DEBUG nova.virt.hardware [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.118 2 DEBUG nova.virt.hardware [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.118 2 DEBUG nova.virt.hardware [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.119 2 DEBUG nova.virt.hardware [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.119 2 DEBUG nova.virt.hardware [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 09:15:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:15:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:31.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.153 2 DEBUG nova.storage.rbd_utils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] rbd image 3a7474e0-ede0-4d42-adf1-28b16d03074b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.158 2 DEBUG oslo_concurrency.processutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:15:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.531 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:15:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:15:31 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1013938976' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.612 2 DEBUG oslo_concurrency.processutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.671 2 DEBUG nova.virt.libvirt.vif [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:15:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-357187967',display_name='tempest-TestVolumeBootPattern-server-357187967',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-357187967',id=222,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBTAKk7vcjhR2v3hQpxHbnD8D+5EFYQASqHngnH89TfDPr9LwfPo4GlBaSvBU1kSzEsKlDYOjmvBACnYkU3g9qIDzQdk5Sxb5IqNRVXiy650FCjpN5wXe8XSUYo7rJct4A==',key_name='tempest-TestVolumeBootPattern-892734420',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fbbc6cb494464fd9b31f64c1ad75fa6b',ramdisk_id='',reservation_id='r-l3gxm8w1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1200415020',owner_user_name='tempest-TestVolumeBootPattern-1200415020-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:15:26Z,user_data=None,user_id='c10de71fef00497981b8b7cec6a3fff3',uuid=3a7474e0-ede0-4d42-adf1-28b16d03074b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "address": "fa:16:3e:59:38:d6", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3e548ad-dc", "ovs_interfaceid": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.672 2 DEBUG nova.network.os_vif_util [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converting VIF {"id": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "address": "fa:16:3e:59:38:d6", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3e548ad-dc", "ovs_interfaceid": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.674 2 DEBUG nova.network.os_vif_util [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:38:d6,bridge_name='br-int',has_traffic_filtering=True,id=a3e548ad-dc1c-45a0-bb2c-54476fc1f716,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3e548ad-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.675 2 DEBUG nova.objects.instance [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lazy-loading 'pci_devices' on Instance uuid 3a7474e0-ede0-4d42-adf1-28b16d03074b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.713 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] End _get_guest_xml xml=<domain type="kvm">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  <uuid>3a7474e0-ede0-4d42-adf1-28b16d03074b</uuid>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  <name>instance-000000de</name>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestVolumeBootPattern-server-357187967</nova:name>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 13:15:31</nova:creationTime>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <nova:user uuid="c10de71fef00497981b8b7cec6a3fff3">tempest-TestVolumeBootPattern-1200415020-project-member</nova:user>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <nova:project uuid="fbbc6cb494464fd9b31f64c1ad75fa6b">tempest-TestVolumeBootPattern-1200415020</nova:project>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <nova:port uuid="a3e548ad-dc1c-45a0-bb2c-54476fc1f716">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <system>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <entry name="serial">3a7474e0-ede0-4d42-adf1-28b16d03074b</entry>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <entry name="uuid">3a7474e0-ede0-4d42-adf1-28b16d03074b</entry>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    </system>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  <os>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  </os>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  <features>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  </features>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  </clock>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  <devices>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/3a7474e0-ede0-4d42-adf1-28b16d03074b_disk.config">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="volumes/volume-fb042846-a135-4c8d-8115-d7e363fd7891">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <serial>fb042846-a135-4c8d-8115-d7e363fd7891</serial>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:59:38:d6"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <target dev="tapa3e548ad-dc"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    </interface>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/3a7474e0-ede0-4d42-adf1-28b16d03074b/console.log" append="off"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    </serial>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <video>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    </video>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    </rng>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 09:15:31 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 09:15:31 np0005465988 nova_compute[236126]:  </devices>
Oct  2 09:15:31 np0005465988 nova_compute[236126]: </domain>
Oct  2 09:15:31 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.715 2 DEBUG nova.compute.manager [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Preparing to wait for external event network-vif-plugged-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.716 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.717 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.717 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.718 2 DEBUG nova.virt.libvirt.vif [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:15:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-357187967',display_name='tempest-TestVolumeBootPattern-server-357187967',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-357187967',id=222,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBTAKk7vcjhR2v3hQpxHbnD8D+5EFYQASqHngnH89TfDPr9LwfPo4GlBaSvBU1kSzEsKlDYOjmvBACnYkU3g9qIDzQdk5Sxb5IqNRVXiy650FCjpN5wXe8XSUYo7rJct4A==',key_name='tempest-TestVolumeBootPattern-892734420',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fbbc6cb494464fd9b31f64c1ad75fa6b',ramdisk_id='',reservation_id='r-l3gxm8w1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1200415020',owner_user_name='tempest-TestVolumeBootPattern-1200415020-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:15:26Z,user_data=None,user_id='c10de71fef00497981b8b7cec6a3fff3',uuid=3a7474e0-ede0-4d42-adf1-28b16d03074b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "address": "fa:16:3e:59:38:d6", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3e548ad-dc", "ovs_interfaceid": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.718 2 DEBUG nova.network.os_vif_util [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converting VIF {"id": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "address": "fa:16:3e:59:38:d6", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3e548ad-dc", "ovs_interfaceid": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.719 2 DEBUG nova.network.os_vif_util [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:38:d6,bridge_name='br-int',has_traffic_filtering=True,id=a3e548ad-dc1c-45a0-bb2c-54476fc1f716,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3e548ad-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.719 2 DEBUG os_vif [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:38:d6,bridge_name='br-int',has_traffic_filtering=True,id=a3e548ad-dc1c-45a0-bb2c-54476fc1f716,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3e548ad-dc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.720 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.721 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.726 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3e548ad-dc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.727 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa3e548ad-dc, col_values=(('external_ids', {'iface-id': 'a3e548ad-dc1c-45a0-bb2c-54476fc1f716', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:59:38:d6', 'vm-uuid': '3a7474e0-ede0-4d42-adf1-28b16d03074b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:31 np0005465988 NetworkManager[45041]: <info>  [1759410931.7315] manager: (tapa3e548ad-dc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/444)
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.741 2 INFO os_vif [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:38:d6,bridge_name='br-int',has_traffic_filtering=True,id=a3e548ad-dc1c-45a0-bb2c-54476fc1f716,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3e548ad-dc')#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.926 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.927 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.927 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] No VIF found with MAC fa:16:3e:59:38:d6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.929 2 INFO nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Using config drive#033[00m
Oct  2 09:15:31 np0005465988 nova_compute[236126]: 2025-10-02 13:15:31.976 2 DEBUG nova.storage.rbd_utils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] rbd image 3a7474e0-ede0-4d42-adf1-28b16d03074b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:15:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:32.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:33.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:33 np0005465988 nova_compute[236126]: 2025-10-02 13:15:33.270 2 INFO nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Creating config drive at /var/lib/nova/instances/3a7474e0-ede0-4d42-adf1-28b16d03074b/disk.config#033[00m
Oct  2 09:15:33 np0005465988 nova_compute[236126]: 2025-10-02 13:15:33.275 2 DEBUG oslo_concurrency.processutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3a7474e0-ede0-4d42-adf1-28b16d03074b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcux3c8gc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:15:33 np0005465988 nova_compute[236126]: 2025-10-02 13:15:33.430 2 DEBUG oslo_concurrency.processutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3a7474e0-ede0-4d42-adf1-28b16d03074b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcux3c8gc" returned: 0 in 0.156s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:15:33 np0005465988 nova_compute[236126]: 2025-10-02 13:15:33.464 2 DEBUG nova.storage.rbd_utils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] rbd image 3a7474e0-ede0-4d42-adf1-28b16d03074b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:15:33 np0005465988 nova_compute[236126]: 2025-10-02 13:15:33.468 2 DEBUG oslo_concurrency.processutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3a7474e0-ede0-4d42-adf1-28b16d03074b/disk.config 3a7474e0-ede0-4d42-adf1-28b16d03074b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:15:33 np0005465988 nova_compute[236126]: 2025-10-02 13:15:33.656 2 DEBUG oslo_concurrency.processutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3a7474e0-ede0-4d42-adf1-28b16d03074b/disk.config 3a7474e0-ede0-4d42-adf1-28b16d03074b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:15:33 np0005465988 nova_compute[236126]: 2025-10-02 13:15:33.657 2 INFO nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Deleting local config drive /var/lib/nova/instances/3a7474e0-ede0-4d42-adf1-28b16d03074b/disk.config because it was imported into RBD.#033[00m
Oct  2 09:15:33 np0005465988 kernel: tapa3e548ad-dc: entered promiscuous mode
Oct  2 09:15:33 np0005465988 NetworkManager[45041]: <info>  [1759410933.7159] manager: (tapa3e548ad-dc): new Tun device (/org/freedesktop/NetworkManager/Devices/445)
Oct  2 09:15:33 np0005465988 ovn_controller[132601]: 2025-10-02T13:15:33Z|00999|binding|INFO|Claiming lport a3e548ad-dc1c-45a0-bb2c-54476fc1f716 for this chassis.
Oct  2 09:15:33 np0005465988 ovn_controller[132601]: 2025-10-02T13:15:33Z|01000|binding|INFO|a3e548ad-dc1c-45a0-bb2c-54476fc1f716: Claiming fa:16:3e:59:38:d6 10.100.0.9
Oct  2 09:15:33 np0005465988 nova_compute[236126]: 2025-10-02 13:15:33.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:33 np0005465988 nova_compute[236126]: 2025-10-02 13:15:33.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:33 np0005465988 NetworkManager[45041]: <info>  [1759410933.7348] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/446)
Oct  2 09:15:33 np0005465988 NetworkManager[45041]: <info>  [1759410933.7355] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/447)
Oct  2 09:15:33 np0005465988 systemd-machined[192594]: New machine qemu-105-instance-000000de.
Oct  2 09:15:33 np0005465988 systemd[1]: Started Virtual Machine qemu-105-instance-000000de.
Oct  2 09:15:33 np0005465988 systemd-udevd[343711]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:15:33 np0005465988 NetworkManager[45041]: <info>  [1759410933.8050] device (tapa3e548ad-dc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:15:33 np0005465988 NetworkManager[45041]: <info>  [1759410933.8058] device (tapa3e548ad-dc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:15:33 np0005465988 nova_compute[236126]: 2025-10-02 13:15:33.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:33 np0005465988 nova_compute[236126]: 2025-10-02 13:15:33.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.364 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:38:d6 10.100.0.9'], port_security=['fa:16:3e:59:38:d6 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '3a7474e0-ede0-4d42-adf1-28b16d03074b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-150508fb-9217-4982-8468-977a3b53121a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbbc6cb494464fd9b31f64c1ad75fa6b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9a538a4f-f761-421e-aa00-1341aedd2ba6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d5e391d-23a7-4f5a-8146-0f24141a74f2, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=a3e548ad-dc1c-45a0-bb2c-54476fc1f716) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.365 142124 INFO neutron.agent.ovn.metadata.agent [-] Port a3e548ad-dc1c-45a0-bb2c-54476fc1f716 in datapath 150508fb-9217-4982-8468-977a3b53121a bound to our chassis#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.367 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 150508fb-9217-4982-8468-977a3b53121a#033[00m
Oct  2 09:15:34 np0005465988 ovn_controller[132601]: 2025-10-02T13:15:34Z|01001|binding|INFO|Setting lport a3e548ad-dc1c-45a0-bb2c-54476fc1f716 ovn-installed in OVS
Oct  2 09:15:34 np0005465988 ovn_controller[132601]: 2025-10-02T13:15:34Z|01002|binding|INFO|Setting lport a3e548ad-dc1c-45a0-bb2c-54476fc1f716 up in Southbound
Oct  2 09:15:34 np0005465988 nova_compute[236126]: 2025-10-02 13:15:34.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.382 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2c8e8015-7dce-4e29-b3f3-0ca3b02ba212]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.384 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap150508fb-91 in ovnmeta-150508fb-9217-4982-8468-977a3b53121a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.386 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap150508fb-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.386 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[06a6bc7d-2044-4ecc-938c-99bbee0f62eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.387 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d228ef06-ec14-49c9-9810-0116689bedfc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 nova_compute[236126]: 2025-10-02 13:15:34.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.407 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[ae2cc122-da08-4e87-8049-aff5ea6df191]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.429 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[108630e2-ef82-44a8-8aa2-0a2b659ba22f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:15:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:34.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.470 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[c55c7c45-1f05-43ee-b601-d5734946d02f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 NetworkManager[45041]: <info>  [1759410934.4806] manager: (tap150508fb-90): new Veth device (/org/freedesktop/NetworkManager/Devices/448)
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.482 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0f8170ea-764c-446f-ba38-722e041d25b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.521 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[ead229a0-7bd4-4e42-acd5-af567dc99e5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.525 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4c03ad3c-cc7c-4eaa-bdda-090238945f07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 NetworkManager[45041]: <info>  [1759410934.5541] device (tap150508fb-90): carrier: link connected
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.560 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[d840034b-e777-436b-9331-54116ffce1ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.580 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[17599f5b-aec5-4673-9786-8ab6ac6f1acc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap150508fb-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:69:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 295], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 899488, 'reachable_time': 19965, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 343786, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.597 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dbae3c93-6415-4f48-94a5-8695782a9ddd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:6993'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 899488, 'tstamp': 899488}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 343787, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.616 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[23608179-c4d6-4fb6-98b4-4c32b44f9c5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap150508fb-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:69:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 295], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 899488, 'reachable_time': 19965, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 343788, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.655 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[97a39951-2094-46c7-80af-72c0fff92924]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.731 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[04f75a65-16d3-4c8b-a051-e1e42132fae8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.733 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap150508fb-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.733 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.733 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap150508fb-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:15:34 np0005465988 NetworkManager[45041]: <info>  [1759410934.7366] manager: (tap150508fb-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/449)
Oct  2 09:15:34 np0005465988 kernel: tap150508fb-90: entered promiscuous mode
Oct  2 09:15:34 np0005465988 nova_compute[236126]: 2025-10-02 13:15:34.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.743 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap150508fb-90, col_values=(('external_ids', {'iface-id': '2a2f4068-0f5b-4d26-b914-4d32097d8b55'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:15:34 np0005465988 ovn_controller[132601]: 2025-10-02T13:15:34Z|01003|binding|INFO|Releasing lport 2a2f4068-0f5b-4d26-b914-4d32097d8b55 from this chassis (sb_readonly=0)
Oct  2 09:15:34 np0005465988 nova_compute[236126]: 2025-10-02 13:15:34.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.746 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/150508fb-9217-4982-8468-977a3b53121a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/150508fb-9217-4982-8468-977a3b53121a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.750 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[50a0a3fb-97fa-4a67-96b4-a80bd7455e08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.751 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-150508fb-9217-4982-8468-977a3b53121a
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/150508fb-9217-4982-8468-977a3b53121a.pid.haproxy
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID 150508fb-9217-4982-8468-977a3b53121a
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 09:15:34 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:34.752 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'env', 'PROCESS_TAG=haproxy-150508fb-9217-4982-8468-977a3b53121a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/150508fb-9217-4982-8468-977a3b53121a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 09:15:34 np0005465988 nova_compute[236126]: 2025-10-02 13:15:34.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:34 np0005465988 nova_compute[236126]: 2025-10-02 13:15:34.823 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410934.8220458, 3a7474e0-ede0-4d42-adf1-28b16d03074b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:15:34 np0005465988 nova_compute[236126]: 2025-10-02 13:15:34.823 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] VM Started (Lifecycle Event)#033[00m
Oct  2 09:15:34 np0005465988 nova_compute[236126]: 2025-10-02 13:15:34.846 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:15:34 np0005465988 nova_compute[236126]: 2025-10-02 13:15:34.852 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410934.822786, 3a7474e0-ede0-4d42-adf1-28b16d03074b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:15:34 np0005465988 nova_compute[236126]: 2025-10-02 13:15:34.852 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] VM Paused (Lifecycle Event)#033[00m
Oct  2 09:15:34 np0005465988 nova_compute[236126]: 2025-10-02 13:15:34.886 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:15:34 np0005465988 nova_compute[236126]: 2025-10-02 13:15:34.891 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:15:34 np0005465988 nova_compute[236126]: 2025-10-02 13:15:34.919 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.025 2 DEBUG nova.network.neutron [req-05a4b827-ca57-470e-bc0d-739e1edfc10b req-c2df3991-6b11-472e-bc0c-f4345bce9b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Updated VIF entry in instance network info cache for port a3e548ad-dc1c-45a0-bb2c-54476fc1f716. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.026 2 DEBUG nova.network.neutron [req-05a4b827-ca57-470e-bc0d-739e1edfc10b req-c2df3991-6b11-472e-bc0c-f4345bce9b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Updating instance_info_cache with network_info: [{"id": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "address": "fa:16:3e:59:38:d6", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3e548ad-dc", "ovs_interfaceid": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.043 2 DEBUG nova.compute.manager [req-5e172ff5-2242-4410-bbb9-97789b1011db req-4376342d-898e-43da-8389-0fc9428de6c5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Received event network-vif-plugged-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.044 2 DEBUG oslo_concurrency.lockutils [req-5e172ff5-2242-4410-bbb9-97789b1011db req-4376342d-898e-43da-8389-0fc9428de6c5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.045 2 DEBUG oslo_concurrency.lockutils [req-5e172ff5-2242-4410-bbb9-97789b1011db req-4376342d-898e-43da-8389-0fc9428de6c5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.045 2 DEBUG oslo_concurrency.lockutils [req-5e172ff5-2242-4410-bbb9-97789b1011db req-4376342d-898e-43da-8389-0fc9428de6c5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.045 2 DEBUG nova.compute.manager [req-5e172ff5-2242-4410-bbb9-97789b1011db req-4376342d-898e-43da-8389-0fc9428de6c5 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Processing event network-vif-plugged-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.047 2 DEBUG nova.compute.manager [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.051 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759410935.0513377, 3a7474e0-ede0-4d42-adf1-28b16d03074b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.052 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] VM Resumed (Lifecycle Event)#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.054 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.058 2 INFO nova.virt.libvirt.driver [-] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Instance spawned successfully.#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.058 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.069 2 DEBUG oslo_concurrency.lockutils [req-05a4b827-ca57-470e-bc0d-739e1edfc10b req-c2df3991-6b11-472e-bc0c-f4345bce9b6f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.097 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.100 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.108 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.108 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.109 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.109 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.110 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.111 2 DEBUG nova.virt.libvirt.driver [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.136 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:15:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:15:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:35.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:15:35 np0005465988 podman[343820]: 2025-10-02 13:15:35.219231216 +0000 UTC m=+0.073703546 container create d1f4a16362f0d5684a24ff58ebcc56a2afd7e5c06de05c0c9a576c71ed2e3bd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.235 2 INFO nova.compute.manager [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Took 6.90 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.235 2 DEBUG nova.compute.manager [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:15:35 np0005465988 systemd[1]: Started libpod-conmon-d1f4a16362f0d5684a24ff58ebcc56a2afd7e5c06de05c0c9a576c71ed2e3bd1.scope.
Oct  2 09:15:35 np0005465988 podman[343820]: 2025-10-02 13:15:35.172425909 +0000 UTC m=+0.026898249 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 09:15:35 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:15:35 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b7b23a86506f16790c30b1e839d9ba000bb2b2b5864a175ea0ee7dadb73d9e9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 09:15:35 np0005465988 podman[343820]: 2025-10-02 13:15:35.315102134 +0000 UTC m=+0.169574504 container init d1f4a16362f0d5684a24ff58ebcc56a2afd7e5c06de05c0c9a576c71ed2e3bd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 09:15:35 np0005465988 podman[343820]: 2025-10-02 13:15:35.325158841 +0000 UTC m=+0.179631181 container start d1f4a16362f0d5684a24ff58ebcc56a2afd7e5c06de05c0c9a576c71ed2e3bd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.327 2 INFO nova.compute.manager [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Took 11.41 seconds to build instance.#033[00m
Oct  2 09:15:35 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[343837]: [NOTICE]   (343841) : New worker (343843) forked
Oct  2 09:15:35 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[343837]: [NOTICE]   (343841) : Loading success.
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.367 2 DEBUG oslo_concurrency.lockutils [None req-4bbfad0d-60e5-40a1-87eb-03c8b0acf502 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.526 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.527 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.527 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.527 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:15:35 np0005465988 nova_compute[236126]: 2025-10-02 13:15:35.527 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:15:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:15:35 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2077986245' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.009 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.121 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000de as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.122 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000de as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.283 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.284 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3940MB free_disk=20.967098236083984GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.285 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.285 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:15:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:15:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:15:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:36.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.453 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 3a7474e0-ede0-4d42-adf1-28b16d03074b actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.454 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.454 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.519 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:15:36 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1796848928' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.973 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:15:36 np0005465988 nova_compute[236126]: 2025-10-02 13:15:36.983 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:15:37 np0005465988 nova_compute[236126]: 2025-10-02 13:15:37.019 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:15:37 np0005465988 nova_compute[236126]: 2025-10-02 13:15:37.073 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:15:37 np0005465988 nova_compute[236126]: 2025-10-02 13:15:37.074 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.789s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:15:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:37.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:37 np0005465988 nova_compute[236126]: 2025-10-02 13:15:37.168 2 DEBUG nova.compute.manager [req-1009524e-c6bc-4d8d-b4ca-fc5cdd23b6b9 req-eac63638-0b9a-4389-8133-b3793d129aad d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Received event network-vif-plugged-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:15:37 np0005465988 nova_compute[236126]: 2025-10-02 13:15:37.168 2 DEBUG oslo_concurrency.lockutils [req-1009524e-c6bc-4d8d-b4ca-fc5cdd23b6b9 req-eac63638-0b9a-4389-8133-b3793d129aad d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:15:37 np0005465988 nova_compute[236126]: 2025-10-02 13:15:37.169 2 DEBUG oslo_concurrency.lockutils [req-1009524e-c6bc-4d8d-b4ca-fc5cdd23b6b9 req-eac63638-0b9a-4389-8133-b3793d129aad d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:15:37 np0005465988 nova_compute[236126]: 2025-10-02 13:15:37.169 2 DEBUG oslo_concurrency.lockutils [req-1009524e-c6bc-4d8d-b4ca-fc5cdd23b6b9 req-eac63638-0b9a-4389-8133-b3793d129aad d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:15:37 np0005465988 nova_compute[236126]: 2025-10-02 13:15:37.169 2 DEBUG nova.compute.manager [req-1009524e-c6bc-4d8d-b4ca-fc5cdd23b6b9 req-eac63638-0b9a-4389-8133-b3793d129aad d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] No waiting events found dispatching network-vif-plugged-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:15:37 np0005465988 nova_compute[236126]: 2025-10-02 13:15:37.169 2 WARNING nova.compute.manager [req-1009524e-c6bc-4d8d-b4ca-fc5cdd23b6b9 req-eac63638-0b9a-4389-8133-b3793d129aad d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Received unexpected event network-vif-plugged-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 for instance with vm_state active and task_state None.#033[00m
Oct  2 09:15:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:38.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:39.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:15:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:40.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:15:40 np0005465988 podman[344073]: 2025-10-02 13:15:40.594154656 +0000 UTC m=+0.068965530 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct  2 09:15:40 np0005465988 podman[344073]: 2025-10-02 13:15:40.696868349 +0000 UTC m=+0.171679223 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Oct  2 09:15:40 np0005465988 nova_compute[236126]: 2025-10-02 13:15:40.697 2 DEBUG nova.compute.manager [req-3cfbcdbc-91bd-4ac3-8cd2-e1fd341d9952 req-06978748-a1d2-4229-9a20-e005fcc3dd68 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Received event network-changed-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:15:40 np0005465988 nova_compute[236126]: 2025-10-02 13:15:40.699 2 DEBUG nova.compute.manager [req-3cfbcdbc-91bd-4ac3-8cd2-e1fd341d9952 req-06978748-a1d2-4229-9a20-e005fcc3dd68 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Refreshing instance network info cache due to event network-changed-a3e548ad-dc1c-45a0-bb2c-54476fc1f716. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:15:40 np0005465988 nova_compute[236126]: 2025-10-02 13:15:40.699 2 DEBUG oslo_concurrency.lockutils [req-3cfbcdbc-91bd-4ac3-8cd2-e1fd341d9952 req-06978748-a1d2-4229-9a20-e005fcc3dd68 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:15:40 np0005465988 nova_compute[236126]: 2025-10-02 13:15:40.699 2 DEBUG oslo_concurrency.lockutils [req-3cfbcdbc-91bd-4ac3-8cd2-e1fd341d9952 req-06978748-a1d2-4229-9a20-e005fcc3dd68 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:15:40 np0005465988 nova_compute[236126]: 2025-10-02 13:15:40.700 2 DEBUG nova.network.neutron [req-3cfbcdbc-91bd-4ac3-8cd2-e1fd341d9952 req-06978748-a1d2-4229-9a20-e005fcc3dd68 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Refreshing network info cache for port a3e548ad-dc1c-45a0-bb2c-54476fc1f716 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:15:41 np0005465988 nova_compute[236126]: 2025-10-02 13:15:41.075 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:15:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:41.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:41 np0005465988 podman[344209]: 2025-10-02 13:15:41.295470914 +0000 UTC m=+0.060867990 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 09:15:41 np0005465988 podman[344209]: 2025-10-02 13:15:41.306829828 +0000 UTC m=+0.072226914 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 09:15:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:15:41 np0005465988 nova_compute[236126]: 2025-10-02 13:15:41.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:41 np0005465988 podman[344271]: 2025-10-02 13:15:41.555961252 +0000 UTC m=+0.067929651 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, vendor=Red Hat, Inc., io.openshift.expose-services=, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, architecture=x86_64, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived)
Oct  2 09:15:41 np0005465988 podman[344271]: 2025-10-02 13:15:41.565425672 +0000 UTC m=+0.077394041 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, vendor=Red Hat, Inc., name=keepalived, release=1793, vcs-type=git, version=2.2.4, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, distribution-scope=public, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64)
Oct  2 09:15:41 np0005465988 nova_compute[236126]: 2025-10-02 13:15:41.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:15:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:42.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:15:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:15:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:15:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:15:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:15:42 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:15:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:43.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:43 np0005465988 nova_compute[236126]: 2025-10-02 13:15:43.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:15:43 np0005465988 nova_compute[236126]: 2025-10-02 13:15:43.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:15:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:44.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:44 np0005465988 nova_compute[236126]: 2025-10-02 13:15:44.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:15:44 np0005465988 nova_compute[236126]: 2025-10-02 13:15:44.472 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:15:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:45.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:45 np0005465988 nova_compute[236126]: 2025-10-02 13:15:45.419 2 DEBUG nova.network.neutron [req-3cfbcdbc-91bd-4ac3-8cd2-e1fd341d9952 req-06978748-a1d2-4229-9a20-e005fcc3dd68 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Updated VIF entry in instance network info cache for port a3e548ad-dc1c-45a0-bb2c-54476fc1f716. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:15:45 np0005465988 nova_compute[236126]: 2025-10-02 13:15:45.420 2 DEBUG nova.network.neutron [req-3cfbcdbc-91bd-4ac3-8cd2-e1fd341d9952 req-06978748-a1d2-4229-9a20-e005fcc3dd68 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Updating instance_info_cache with network_info: [{"id": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "address": "fa:16:3e:59:38:d6", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3e548ad-dc", "ovs_interfaceid": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:15:45 np0005465988 nova_compute[236126]: 2025-10-02 13:15:45.447 2 DEBUG oslo_concurrency.lockutils [req-3cfbcdbc-91bd-4ac3-8cd2-e1fd341d9952 req-06978748-a1d2-4229-9a20-e005fcc3dd68 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:15:45 np0005465988 nova_compute[236126]: 2025-10-02 13:15:45.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:15:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:15:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:15:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:46.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:15:46 np0005465988 nova_compute[236126]: 2025-10-02 13:15:46.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:46 np0005465988 nova_compute[236126]: 2025-10-02 13:15:46.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:15:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:47.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:15:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:48.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:48 np0005465988 nova_compute[236126]: 2025-10-02 13:15:48.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:15:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:49.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:49 np0005465988 ovn_controller[132601]: 2025-10-02T13:15:49Z|00131|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:59:38:d6 10.100.0.9
Oct  2 09:15:49 np0005465988 ovn_controller[132601]: 2025-10-02T13:15:49Z|00132|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:59:38:d6 10.100.0.9
Oct  2 09:15:49 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:15:49 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:15:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:15:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:50.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:15:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:15:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:51.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:15:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:15:51 np0005465988 nova_compute[236126]: 2025-10-02 13:15:51.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:15:51 np0005465988 nova_compute[236126]: 2025-10-02 13:15:51.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:15:51 np0005465988 nova_compute[236126]: 2025-10-02 13:15:51.476 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:15:51 np0005465988 nova_compute[236126]: 2025-10-02 13:15:51.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:51 np0005465988 nova_compute[236126]: 2025-10-02 13:15:51.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:51 np0005465988 nova_compute[236126]: 2025-10-02 13:15:51.805 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:15:51 np0005465988 nova_compute[236126]: 2025-10-02 13:15:51.805 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquired lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:15:51 np0005465988 nova_compute[236126]: 2025-10-02 13:15:51.805 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 09:15:51 np0005465988 nova_compute[236126]: 2025-10-02 13:15:51.806 2 DEBUG nova.objects.instance [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3a7474e0-ede0-4d42-adf1-28b16d03074b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:15:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:15:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:52.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:15:52 np0005465988 podman[344556]: 2025-10-02 13:15:52.569588498 +0000 UTC m=+0.086580953 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  2 09:15:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:53.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:53 np0005465988 nova_compute[236126]: 2025-10-02 13:15:53.768 2 DEBUG nova.network.neutron [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Updating instance_info_cache with network_info: [{"id": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "address": "fa:16:3e:59:38:d6", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3e548ad-dc", "ovs_interfaceid": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:15:53 np0005465988 nova_compute[236126]: 2025-10-02 13:15:53.900 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Releasing lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:15:53 np0005465988 nova_compute[236126]: 2025-10-02 13:15:53.900 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 09:15:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:15:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:54.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:15:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:15:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:55.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:15:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:15:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:15:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:56.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:15:56 np0005465988 nova_compute[236126]: 2025-10-02 13:15:56.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:56.551 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=87, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=86) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:15:56 np0005465988 nova_compute[236126]: 2025-10-02 13:15:56.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:56 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:56.553 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:15:56 np0005465988 nova_compute[236126]: 2025-10-02 13:15:56.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:56 np0005465988 nova_compute[236126]: 2025-10-02 13:15:56.874 2 DEBUG nova.compute.manager [req-50a80a3c-de3a-445c-b53f-ca86a1c149be req-1d237f90-7246-4111-baee-69db6188020f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Received event network-changed-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:15:56 np0005465988 nova_compute[236126]: 2025-10-02 13:15:56.874 2 DEBUG nova.compute.manager [req-50a80a3c-de3a-445c-b53f-ca86a1c149be req-1d237f90-7246-4111-baee-69db6188020f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Refreshing instance network info cache due to event network-changed-a3e548ad-dc1c-45a0-bb2c-54476fc1f716. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:15:56 np0005465988 nova_compute[236126]: 2025-10-02 13:15:56.875 2 DEBUG oslo_concurrency.lockutils [req-50a80a3c-de3a-445c-b53f-ca86a1c149be req-1d237f90-7246-4111-baee-69db6188020f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:15:56 np0005465988 nova_compute[236126]: 2025-10-02 13:15:56.875 2 DEBUG oslo_concurrency.lockutils [req-50a80a3c-de3a-445c-b53f-ca86a1c149be req-1d237f90-7246-4111-baee-69db6188020f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:15:56 np0005465988 nova_compute[236126]: 2025-10-02 13:15:56.875 2 DEBUG nova.network.neutron [req-50a80a3c-de3a-445c-b53f-ca86a1c149be req-1d237f90-7246-4111-baee-69db6188020f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Refreshing network info cache for port a3e548ad-dc1c-45a0-bb2c-54476fc1f716 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.076 2 DEBUG oslo_concurrency.lockutils [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "3a7474e0-ede0-4d42-adf1-28b16d03074b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.077 2 DEBUG oslo_concurrency.lockutils [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.077 2 DEBUG oslo_concurrency.lockutils [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.078 2 DEBUG oslo_concurrency.lockutils [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.078 2 DEBUG oslo_concurrency.lockutils [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.080 2 INFO nova.compute.manager [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Terminating instance#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.082 2 DEBUG nova.compute.manager [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 09:15:57 np0005465988 kernel: tapa3e548ad-dc (unregistering): left promiscuous mode
Oct  2 09:15:57 np0005465988 NetworkManager[45041]: <info>  [1759410957.1626] device (tapa3e548ad-dc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:57 np0005465988 ovn_controller[132601]: 2025-10-02T13:15:57Z|01004|binding|INFO|Releasing lport a3e548ad-dc1c-45a0-bb2c-54476fc1f716 from this chassis (sb_readonly=0)
Oct  2 09:15:57 np0005465988 ovn_controller[132601]: 2025-10-02T13:15:57Z|01005|binding|INFO|Setting lport a3e548ad-dc1c-45a0-bb2c-54476fc1f716 down in Southbound
Oct  2 09:15:57 np0005465988 ovn_controller[132601]: 2025-10-02T13:15:57Z|01006|binding|INFO|Removing iface tapa3e548ad-dc ovn-installed in OVS
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:57.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:57.192 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:38:d6 10.100.0.9'], port_security=['fa:16:3e:59:38:d6 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '3a7474e0-ede0-4d42-adf1-28b16d03074b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-150508fb-9217-4982-8468-977a3b53121a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbbc6cb494464fd9b31f64c1ad75fa6b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9a538a4f-f761-421e-aa00-1341aedd2ba6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d5e391d-23a7-4f5a-8146-0f24141a74f2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=a3e548ad-dc1c-45a0-bb2c-54476fc1f716) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:15:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:57.194 142124 INFO neutron.agent.ovn.metadata.agent [-] Port a3e548ad-dc1c-45a0-bb2c-54476fc1f716 in datapath 150508fb-9217-4982-8468-977a3b53121a unbound from our chassis#033[00m
Oct  2 09:15:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:57.196 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 150508fb-9217-4982-8468-977a3b53121a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:57.198 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b342388c-7d87-4b28-8947-e7f67a41b4da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:57.200 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-150508fb-9217-4982-8468-977a3b53121a namespace which is not needed anymore#033[00m
Oct  2 09:15:57 np0005465988 systemd[1]: machine-qemu\x2d105\x2dinstance\x2d000000de.scope: Deactivated successfully.
Oct  2 09:15:57 np0005465988 systemd[1]: machine-qemu\x2d105\x2dinstance\x2d000000de.scope: Consumed 15.002s CPU time.
Oct  2 09:15:57 np0005465988 systemd-machined[192594]: Machine qemu-105-instance-000000de terminated.
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.322 2 INFO nova.virt.libvirt.driver [-] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Instance destroyed successfully.#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.322 2 DEBUG nova.objects.instance [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lazy-loading 'resources' on Instance uuid 3a7474e0-ede0-4d42-adf1-28b16d03074b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:15:57 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[343837]: [NOTICE]   (343841) : haproxy version is 2.8.14-c23fe91
Oct  2 09:15:57 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[343837]: [NOTICE]   (343841) : path to executable is /usr/sbin/haproxy
Oct  2 09:15:57 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[343837]: [WARNING]  (343841) : Exiting Master process...
Oct  2 09:15:57 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[343837]: [ALERT]    (343841) : Current worker (343843) exited with code 143 (Terminated)
Oct  2 09:15:57 np0005465988 neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a[343837]: [WARNING]  (343841) : All workers exited. Exiting... (0)
Oct  2 09:15:57 np0005465988 systemd[1]: libpod-d1f4a16362f0d5684a24ff58ebcc56a2afd7e5c06de05c0c9a576c71ed2e3bd1.scope: Deactivated successfully.
Oct  2 09:15:57 np0005465988 podman[344604]: 2025-10-02 13:15:57.396945202 +0000 UTC m=+0.082148667 container died d1f4a16362f0d5684a24ff58ebcc56a2afd7e5c06de05c0c9a576c71ed2e3bd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.397 2 DEBUG nova.virt.libvirt.vif [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:15:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-357187967',display_name='tempest-TestVolumeBootPattern-server-357187967',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-357187967',id=222,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBTAKk7vcjhR2v3hQpxHbnD8D+5EFYQASqHngnH89TfDPr9LwfPo4GlBaSvBU1kSzEsKlDYOjmvBACnYkU3g9qIDzQdk5Sxb5IqNRVXiy650FCjpN5wXe8XSUYo7rJct4A==',key_name='tempest-TestVolumeBootPattern-892734420',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:15:35Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fbbc6cb494464fd9b31f64c1ad75fa6b',ramdisk_id='',reservation_id='r-l3gxm8w1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-1200415020',owner_user_name='tempest-TestVolumeBootPattern-1200415020-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:15:35Z,user_data=None,user_id='c10de71fef00497981b8b7cec6a3fff3',uuid=3a7474e0-ede0-4d42-adf1-28b16d03074b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "address": "fa:16:3e:59:38:d6", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3e548ad-dc", "ovs_interfaceid": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.398 2 DEBUG nova.network.os_vif_util [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converting VIF {"id": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "address": "fa:16:3e:59:38:d6", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3e548ad-dc", "ovs_interfaceid": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.399 2 DEBUG nova.network.os_vif_util [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:59:38:d6,bridge_name='br-int',has_traffic_filtering=True,id=a3e548ad-dc1c-45a0-bb2c-54476fc1f716,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3e548ad-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.399 2 DEBUG os_vif [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:38:d6,bridge_name='br-int',has_traffic_filtering=True,id=a3e548ad-dc1c-45a0-bb2c-54476fc1f716,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3e548ad-dc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.403 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3e548ad-dc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.412 2 INFO os_vif [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:38:d6,bridge_name='br-int',has_traffic_filtering=True,id=a3e548ad-dc1c-45a0-bb2c-54476fc1f716,network=Network(150508fb-9217-4982-8468-977a3b53121a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3e548ad-dc')#033[00m
Oct  2 09:15:57 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d1f4a16362f0d5684a24ff58ebcc56a2afd7e5c06de05c0c9a576c71ed2e3bd1-userdata-shm.mount: Deactivated successfully.
Oct  2 09:15:57 np0005465988 systemd[1]: var-lib-containers-storage-overlay-7b7b23a86506f16790c30b1e839d9ba000bb2b2b5864a175ea0ee7dadb73d9e9-merged.mount: Deactivated successfully.
Oct  2 09:15:57 np0005465988 podman[344604]: 2025-10-02 13:15:57.453302591 +0000 UTC m=+0.138506046 container cleanup d1f4a16362f0d5684a24ff58ebcc56a2afd7e5c06de05c0c9a576c71ed2e3bd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:15:57 np0005465988 systemd[1]: libpod-conmon-d1f4a16362f0d5684a24ff58ebcc56a2afd7e5c06de05c0c9a576c71ed2e3bd1.scope: Deactivated successfully.
Oct  2 09:15:57 np0005465988 podman[344662]: 2025-10-02 13:15:57.541076478 +0000 UTC m=+0.063147365 container remove d1f4a16362f0d5684a24ff58ebcc56a2afd7e5c06de05c0c9a576c71ed2e3bd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 09:15:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:57.549 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d63a191a-a468-45c8-993c-13db97ff27f8]: (4, ('Thu Oct  2 01:15:57 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a (d1f4a16362f0d5684a24ff58ebcc56a2afd7e5c06de05c0c9a576c71ed2e3bd1)\nd1f4a16362f0d5684a24ff58ebcc56a2afd7e5c06de05c0c9a576c71ed2e3bd1\nThu Oct  2 01:15:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-150508fb-9217-4982-8468-977a3b53121a (d1f4a16362f0d5684a24ff58ebcc56a2afd7e5c06de05c0c9a576c71ed2e3bd1)\nd1f4a16362f0d5684a24ff58ebcc56a2afd7e5c06de05c0c9a576c71ed2e3bd1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:57.551 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[c085847d-0602-40c9-8c09-9432fff8a688]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:57.552 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap150508fb-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:15:57 np0005465988 kernel: tap150508fb-90: left promiscuous mode
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:57.598 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b238eef1-6a6a-4dc1-ab41-d757cdd2c3dd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:15:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:57.630 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9ace7e98-5309-49bf-bef6-964578131635]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:57.633 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d676f625-518b-432c-9c92-611890f8019b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.647 2 DEBUG nova.compute.manager [req-4ec27548-fb7f-45f6-86b3-9b80d5f58dc8 req-914f15ac-22fc-40a5-8654-2aafe363d6f6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Received event network-vif-unplugged-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.647 2 DEBUG oslo_concurrency.lockutils [req-4ec27548-fb7f-45f6-86b3-9b80d5f58dc8 req-914f15ac-22fc-40a5-8654-2aafe363d6f6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.648 2 DEBUG oslo_concurrency.lockutils [req-4ec27548-fb7f-45f6-86b3-9b80d5f58dc8 req-914f15ac-22fc-40a5-8654-2aafe363d6f6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.648 2 DEBUG oslo_concurrency.lockutils [req-4ec27548-fb7f-45f6-86b3-9b80d5f58dc8 req-914f15ac-22fc-40a5-8654-2aafe363d6f6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.648 2 DEBUG nova.compute.manager [req-4ec27548-fb7f-45f6-86b3-9b80d5f58dc8 req-914f15ac-22fc-40a5-8654-2aafe363d6f6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] No waiting events found dispatching network-vif-unplugged-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.648 2 DEBUG nova.compute.manager [req-4ec27548-fb7f-45f6-86b3-9b80d5f58dc8 req-914f15ac-22fc-40a5-8654-2aafe363d6f6 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Received event network-vif-unplugged-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 09:15:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:57.650 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[8100fe3e-ea46-456f-b1b8-3794f01d1d2f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 899479, 'reachable_time': 36508, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 344676, 'error': None, 'target': 'ovnmeta-150508fb-9217-4982-8468-977a3b53121a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:57 np0005465988 systemd[1]: run-netns-ovnmeta\x2d150508fb\x2d9217\x2d4982\x2d8468\x2d977a3b53121a.mount: Deactivated successfully.
Oct  2 09:15:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:57.655 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-150508fb-9217-4982-8468-977a3b53121a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 09:15:57 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:15:57.656 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[560d0aa9-8ab0-44c9-becb-12d34e891f2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.749 2 INFO nova.virt.libvirt.driver [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Deleting instance files /var/lib/nova/instances/3a7474e0-ede0-4d42-adf1-28b16d03074b_del#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.751 2 INFO nova.virt.libvirt.driver [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Deletion of /var/lib/nova/instances/3a7474e0-ede0-4d42-adf1-28b16d03074b_del complete#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.869 2 INFO nova.compute.manager [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Took 0.79 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.870 2 DEBUG oslo.service.loopingcall [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.871 2 DEBUG nova.compute.manager [-] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 09:15:57 np0005465988 nova_compute[236126]: 2025-10-02 13:15:57.871 2 DEBUG nova.network.neutron [-] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 09:15:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:15:58.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:15:58 np0005465988 nova_compute[236126]: 2025-10-02 13:15:58.795 2 DEBUG nova.network.neutron [req-50a80a3c-de3a-445c-b53f-ca86a1c149be req-1d237f90-7246-4111-baee-69db6188020f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Updated VIF entry in instance network info cache for port a3e548ad-dc1c-45a0-bb2c-54476fc1f716. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:15:58 np0005465988 nova_compute[236126]: 2025-10-02 13:15:58.796 2 DEBUG nova.network.neutron [req-50a80a3c-de3a-445c-b53f-ca86a1c149be req-1d237f90-7246-4111-baee-69db6188020f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Updating instance_info_cache with network_info: [{"id": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "address": "fa:16:3e:59:38:d6", "network": {"id": "150508fb-9217-4982-8468-977a3b53121a", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1348951324-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fbbc6cb494464fd9b31f64c1ad75fa6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3e548ad-dc", "ovs_interfaceid": "a3e548ad-dc1c-45a0-bb2c-54476fc1f716", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:15:58 np0005465988 nova_compute[236126]: 2025-10-02 13:15:58.881 2 DEBUG oslo_concurrency.lockutils [req-50a80a3c-de3a-445c-b53f-ca86a1c149be req-1d237f90-7246-4111-baee-69db6188020f d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-3a7474e0-ede0-4d42-adf1-28b16d03074b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:15:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:15:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:15:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:15:59.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.124 2 DEBUG nova.compute.manager [req-53d83cb3-bf9a-4aa4-8a43-7ca4219841d7 req-7281f1d7-aa39-4949-b990-8ac5bea69dd4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Received event network-vif-plugged-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.124 2 DEBUG oslo_concurrency.lockutils [req-53d83cb3-bf9a-4aa4-8a43-7ca4219841d7 req-7281f1d7-aa39-4949-b990-8ac5bea69dd4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.124 2 DEBUG oslo_concurrency.lockutils [req-53d83cb3-bf9a-4aa4-8a43-7ca4219841d7 req-7281f1d7-aa39-4949-b990-8ac5bea69dd4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.125 2 DEBUG oslo_concurrency.lockutils [req-53d83cb3-bf9a-4aa4-8a43-7ca4219841d7 req-7281f1d7-aa39-4949-b990-8ac5bea69dd4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.125 2 DEBUG nova.compute.manager [req-53d83cb3-bf9a-4aa4-8a43-7ca4219841d7 req-7281f1d7-aa39-4949-b990-8ac5bea69dd4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] No waiting events found dispatching network-vif-plugged-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.125 2 WARNING nova.compute.manager [req-53d83cb3-bf9a-4aa4-8a43-7ca4219841d7 req-7281f1d7-aa39-4949-b990-8ac5bea69dd4 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Received unexpected event network-vif-plugged-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.249 2 DEBUG nova.network.neutron [-] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.365 2 DEBUG nova.compute.manager [req-9682c67b-96b5-4db8-ad91-0e005859d4a3 req-17010ed7-4ed8-4e60-97bf-8467e337e79c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Received event network-vif-deleted-a3e548ad-dc1c-45a0-bb2c-54476fc1f716 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.366 2 INFO nova.compute.manager [req-9682c67b-96b5-4db8-ad91-0e005859d4a3 req-17010ed7-4ed8-4e60-97bf-8467e337e79c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Neutron deleted interface a3e548ad-dc1c-45a0-bb2c-54476fc1f716; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.366 2 DEBUG nova.network.neutron [req-9682c67b-96b5-4db8-ad91-0e005859d4a3 req-17010ed7-4ed8-4e60-97bf-8467e337e79c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.369 2 INFO nova.compute.manager [-] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Took 2.50 seconds to deallocate network for instance.#033[00m
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.412 2 DEBUG nova.compute.manager [req-9682c67b-96b5-4db8-ad91-0e005859d4a3 req-17010ed7-4ed8-4e60-97bf-8467e337e79c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Detach interface failed, port_id=a3e548ad-dc1c-45a0-bb2c-54476fc1f716, reason: Instance 3a7474e0-ede0-4d42-adf1-28b16d03074b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 09:16:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:16:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:00.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.747 2 INFO nova.compute.manager [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Took 0.38 seconds to detach 1 volumes for instance.#033[00m
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.822 2 DEBUG oslo_concurrency.lockutils [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.823 2 DEBUG oslo_concurrency.lockutils [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:16:00 np0005465988 nova_compute[236126]: 2025-10-02 13:16:00.891 2 DEBUG oslo_concurrency.processutils [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:16:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:01.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:16:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1743447586' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:16:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:16:01 np0005465988 nova_compute[236126]: 2025-10-02 13:16:01.382 2 DEBUG oslo_concurrency.processutils [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:16:01 np0005465988 nova_compute[236126]: 2025-10-02 13:16:01.390 2 DEBUG nova.compute.provider_tree [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:16:01 np0005465988 nova_compute[236126]: 2025-10-02 13:16:01.412 2 DEBUG nova.scheduler.client.report [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:16:01 np0005465988 nova_compute[236126]: 2025-10-02 13:16:01.454 2 DEBUG oslo_concurrency.lockutils [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:16:01 np0005465988 nova_compute[236126]: 2025-10-02 13:16:01.495 2 INFO nova.scheduler.client.report [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Deleted allocations for instance 3a7474e0-ede0-4d42-adf1-28b16d03074b#033[00m
Oct  2 09:16:01 np0005465988 nova_compute[236126]: 2025-10-02 13:16:01.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:01 np0005465988 podman[344703]: 2025-10-02 13:16:01.524611407 +0000 UTC m=+0.058721388 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:16:01 np0005465988 podman[344704]: 2025-10-02 13:16:01.534835259 +0000 UTC m=+0.067827088 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 09:16:01 np0005465988 podman[344702]: 2025-10-02 13:16:01.590317954 +0000 UTC m=+0.125753452 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 09:16:01 np0005465988 nova_compute[236126]: 2025-10-02 13:16:01.603 2 DEBUG oslo_concurrency.lockutils [None req-37e09705-216e-404e-a6fe-25fc4ddee355 c10de71fef00497981b8b7cec6a3fff3 fbbc6cb494464fd9b31f64c1ad75fa6b - - default default] Lock "3a7474e0-ede0-4d42-adf1-28b16d03074b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.526s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:16:02 np0005465988 nova_compute[236126]: 2025-10-02 13:16:02.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:02.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:16:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:03.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:16:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:03.556 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '87'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:16:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:04.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e427 e427: 3 total, 3 up, 3 in
Oct  2 09:16:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:05.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:16:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:06.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:06 np0005465988 nova_compute[236126]: 2025-10-02 13:16:06.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:07.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:07 np0005465988 nova_compute[236126]: 2025-10-02 13:16:07.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:08 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [O-0] New memtable created with log file: #60. Immutable memtables: 0.
Oct  2 09:16:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:08.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:16:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:09.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:16:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:10.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:16:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:11.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:16:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e428 e428: 3 total, 3 up, 3 in
Oct  2 09:16:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e428 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:16:11 np0005465988 nova_compute[236126]: 2025-10-02 13:16:11.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:12 np0005465988 nova_compute[236126]: 2025-10-02 13:16:12.320 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759410957.3184438, 3a7474e0-ede0-4d42-adf1-28b16d03074b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:16:12 np0005465988 nova_compute[236126]: 2025-10-02 13:16:12.320 2 INFO nova.compute.manager [-] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] VM Stopped (Lifecycle Event)#033[00m
Oct  2 09:16:12 np0005465988 nova_compute[236126]: 2025-10-02 13:16:12.340 2 DEBUG nova.compute.manager [None req-c7254e54-5c3e-45d9-a4f7-05dc3d2a606f - - - - - -] [instance: 3a7474e0-ede0-4d42-adf1-28b16d03074b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:16:12 np0005465988 nova_compute[236126]: 2025-10-02 13:16:12.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:16:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:12.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:16:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e429 e429: 3 total, 3 up, 3 in
Oct  2 09:16:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:13.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:16:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:14.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:16:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:15.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e429 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:16:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:16.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:16 np0005465988 nova_compute[236126]: 2025-10-02 13:16:16.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:17.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:17 np0005465988 nova_compute[236126]: 2025-10-02 13:16:17.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:18 np0005465988 nova_compute[236126]: 2025-10-02 13:16:18.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:18 np0005465988 nova_compute[236126]: 2025-10-02 13:16:18.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:18.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:19.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:20.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:21.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 e430: 3 total, 3 up, 3 in
Oct  2 09:16:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:16:21 np0005465988 nova_compute[236126]: 2025-10-02 13:16:21.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:22 np0005465988 nova_compute[236126]: 2025-10-02 13:16:22.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:22.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:23.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:23 np0005465988 podman[344829]: 2025-10-02 13:16:23.545584973 +0000 UTC m=+0.080974364 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 09:16:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:16:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:24.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:16:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:25.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:16:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:26.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:26 np0005465988 nova_compute[236126]: 2025-10-02 13:16:26.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:16:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:27.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:16:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:27.424 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:16:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:27.425 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:16:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:27.425 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:16:27 np0005465988 nova_compute[236126]: 2025-10-02 13:16:27.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:28.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:29.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:30.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:31.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:16:31 np0005465988 nova_compute[236126]: 2025-10-02 13:16:31.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:32 np0005465988 nova_compute[236126]: 2025-10-02 13:16:32.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:32 np0005465988 nova_compute[236126]: 2025-10-02 13:16:32.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:16:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:32.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:32 np0005465988 podman[344904]: 2025-10-02 13:16:32.547733749 +0000 UTC m=+0.071683789 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:16:32 np0005465988 podman[344903]: 2025-10-02 13:16:32.577606232 +0000 UTC m=+0.097535897 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 09:16:32 np0005465988 podman[344902]: 2025-10-02 13:16:32.580451293 +0000 UTC m=+0.116283932 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Oct  2 09:16:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:33.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:34.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:35.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:35 np0005465988 nova_compute[236126]: 2025-10-02 13:16:35.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:16:35 np0005465988 nova_compute[236126]: 2025-10-02 13:16:35.496 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:16:35 np0005465988 nova_compute[236126]: 2025-10-02 13:16:35.497 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:16:35 np0005465988 nova_compute[236126]: 2025-10-02 13:16:35.497 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:16:35 np0005465988 nova_compute[236126]: 2025-10-02 13:16:35.497 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:16:35 np0005465988 nova_compute[236126]: 2025-10-02 13:16:35.498 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:16:35 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:16:35 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1757148826' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:16:35 np0005465988 nova_compute[236126]: 2025-10-02 13:16:35.993 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.200 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.202 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4017MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.203 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.203 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.273 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.274 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.287 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.302 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.303 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 09:16:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.449 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.472 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.493 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:16:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:36.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:16:36 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2173683488' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.979 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:16:36 np0005465988 nova_compute[236126]: 2025-10-02 13:16:36.987 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:16:37 np0005465988 nova_compute[236126]: 2025-10-02 13:16:37.013 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:16:37 np0005465988 nova_compute[236126]: 2025-10-02 13:16:37.044 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:16:37 np0005465988 nova_compute[236126]: 2025-10-02 13:16:37.045 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.841s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:16:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:16:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:37.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:16:37 np0005465988 nova_compute[236126]: 2025-10-02 13:16:37.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:38.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:39.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:16:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:40.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:16:41 np0005465988 nova_compute[236126]: 2025-10-02 13:16:41.045 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:16:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:41.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:16:41 np0005465988 nova_compute[236126]: 2025-10-02 13:16:41.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:42 np0005465988 nova_compute[236126]: 2025-10-02 13:16:42.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:42.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:16:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:43.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:16:44 np0005465988 nova_compute[236126]: 2025-10-02 13:16:44.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:16:44 np0005465988 nova_compute[236126]: 2025-10-02 13:16:44.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:16:44 np0005465988 nova_compute[236126]: 2025-10-02 13:16:44.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:16:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:44.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:45.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:16:46 np0005465988 nova_compute[236126]: 2025-10-02 13:16:46.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:16:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:46.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:46 np0005465988 nova_compute[236126]: 2025-10-02 13:16:46.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:47.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:47 np0005465988 nova_compute[236126]: 2025-10-02 13:16:47.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:16:47 np0005465988 nova_compute[236126]: 2025-10-02 13:16:47.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:48 np0005465988 nova_compute[236126]: 2025-10-02 13:16:48.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:16:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:48.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:16:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:49.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:16:50 np0005465988 nova_compute[236126]: 2025-10-02 13:16:50.549 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "09e6757a-c973-4e60-bc00-5f66b22c4f51" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:16:50 np0005465988 nova_compute[236126]: 2025-10-02 13:16:50.551 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:16:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.003000085s ======
Oct  2 09:16:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:50.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000085s
Oct  2 09:16:50 np0005465988 nova_compute[236126]: 2025-10-02 13:16:50.568 2 DEBUG nova.compute.manager [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 09:16:50 np0005465988 nova_compute[236126]: 2025-10-02 13:16:50.642 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:16:50 np0005465988 nova_compute[236126]: 2025-10-02 13:16:50.643 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:16:50 np0005465988 nova_compute[236126]: 2025-10-02 13:16:50.650 2 DEBUG nova.virt.hardware [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 09:16:50 np0005465988 nova_compute[236126]: 2025-10-02 13:16:50.650 2 INFO nova.compute.claims [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 09:16:50 np0005465988 nova_compute[236126]: 2025-10-02 13:16:50.773 2 DEBUG oslo_concurrency.processutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:16:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:16:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:16:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:16:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:51.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:16:51 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2965026992' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.289 2 DEBUG oslo_concurrency.processutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.295 2 DEBUG nova.compute.provider_tree [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.310 2 DEBUG nova.scheduler.client.report [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.343 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.700s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.344 2 DEBUG nova.compute.manager [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 09:16:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.408 2 DEBUG nova.compute.manager [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.409 2 DEBUG nova.network.neutron [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.432 2 INFO nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.455 2 DEBUG nova.compute.manager [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.476 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.499 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.499 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.556 2 DEBUG nova.compute.manager [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.558 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.558 2 INFO nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Creating image(s)
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.592 2 DEBUG nova.storage.rbd_utils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] rbd image 09e6757a-c973-4e60-bc00-5f66b22c4f51_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.629 2 DEBUG nova.storage.rbd_utils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] rbd image 09e6757a-c973-4e60-bc00-5f66b22c4f51_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.663 2 DEBUG nova.storage.rbd_utils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] rbd image 09e6757a-c973-4e60-bc00-5f66b22c4f51_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.668 2 DEBUG oslo_concurrency.processutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.755 2 DEBUG oslo_concurrency.processutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.756 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.757 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.757 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "50c3d0e01c5fd68886c717f1fdd053015a0fe968" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.792 2 DEBUG nova.storage.rbd_utils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] rbd image 09e6757a-c973-4e60-bc00-5f66b22c4f51_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct  2 09:16:51 np0005465988 nova_compute[236126]: 2025-10-02 13:16:51.798 2 DEBUG oslo_concurrency.processutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 09e6757a-c973-4e60-bc00-5f66b22c4f51_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:16:52 np0005465988 nova_compute[236126]: 2025-10-02 13:16:52.108 2 DEBUG oslo_concurrency.processutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/50c3d0e01c5fd68886c717f1fdd053015a0fe968 09e6757a-c973-4e60-bc00-5f66b22c4f51_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.310s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 09:16:52 np0005465988 nova_compute[236126]: 2025-10-02 13:16:52.204 2 DEBUG nova.storage.rbd_utils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] resizing rbd image 09e6757a-c973-4e60-bc00-5f66b22c4f51_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct  2 09:16:52 np0005465988 nova_compute[236126]: 2025-10-02 13:16:52.249 2 DEBUG nova.policy [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ffe4d737e4414fb3a3e358f8ca3f3e1e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '08e102ae48244af2ab448a2e1ff757df', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 09:16:52 np0005465988 nova_compute[236126]: 2025-10-02 13:16:52.321 2 DEBUG nova.objects.instance [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'migration_context' on Instance uuid 09e6757a-c973-4e60-bc00-5f66b22c4f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 09:16:52 np0005465988 nova_compute[236126]: 2025-10-02 13:16:52.337 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 09:16:52 np0005465988 nova_compute[236126]: 2025-10-02 13:16:52.338 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Ensure instance console log exists: /var/lib/nova/instances/09e6757a-c973-4e60-bc00-5f66b22c4f51/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 09:16:52 np0005465988 nova_compute[236126]: 2025-10-02 13:16:52.338 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:16:52 np0005465988 nova_compute[236126]: 2025-10-02 13:16:52.338 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:16:52 np0005465988 nova_compute[236126]: 2025-10-02 13:16:52.339 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:16:52 np0005465988 nova_compute[236126]: 2025-10-02 13:16:52.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:16:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:52.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:53.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:54 np0005465988 nova_compute[236126]: 2025-10-02 13:16:54.273 2 DEBUG nova.network.neutron [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Successfully created port: f0875847-ec73-40e0-960f-0e3d0cd9a411 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 09:16:54 np0005465988 podman[345388]: 2025-10-02 13:16:54.530615773 +0000 UTC m=+0.061408625 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 09:16:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:16:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:54.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:16:55 np0005465988 nova_compute[236126]: 2025-10-02 13:16:55.007 2 DEBUG nova.network.neutron [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Successfully updated port: f0875847-ec73-40e0-960f-0e3d0cd9a411 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 09:16:55 np0005465988 nova_compute[236126]: 2025-10-02 13:16:55.028 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 09:16:55 np0005465988 nova_compute[236126]: 2025-10-02 13:16:55.028 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquired lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 09:16:55 np0005465988 nova_compute[236126]: 2025-10-02 13:16:55.029 2 DEBUG nova.network.neutron [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 09:16:55 np0005465988 nova_compute[236126]: 2025-10-02 13:16:55.119 2 DEBUG nova.compute.manager [req-8bef0aa1-573c-4925-a78a-be7b79885961 req-4e039fc4-f103-4c4c-9a23-08e8bd716a85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received event network-changed-f0875847-ec73-40e0-960f-0e3d0cd9a411 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 09:16:55 np0005465988 nova_compute[236126]: 2025-10-02 13:16:55.120 2 DEBUG nova.compute.manager [req-8bef0aa1-573c-4925-a78a-be7b79885961 req-4e039fc4-f103-4c4c-9a23-08e8bd716a85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Refreshing instance network info cache due to event network-changed-f0875847-ec73-40e0-960f-0e3d0cd9a411. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 09:16:55 np0005465988 nova_compute[236126]: 2025-10-02 13:16:55.120 2 DEBUG oslo_concurrency.lockutils [req-8bef0aa1-573c-4925-a78a-be7b79885961 req-4e039fc4-f103-4c4c-9a23-08e8bd716a85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 09:16:55 np0005465988 nova_compute[236126]: 2025-10-02 13:16:55.174 2 DEBUG nova.network.neutron [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 09:16:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:55.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:55 np0005465988 nova_compute[236126]: 2025-10-02 13:16:55.856 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:16:55 np0005465988 nova_compute[236126]: 2025-10-02 13:16:55.880 2 WARNING nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Oct  2 09:16:55 np0005465988 nova_compute[236126]: 2025-10-02 13:16:55.881 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Triggering sync for uuid 09e6757a-c973-4e60-bc00-5f66b22c4f51 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Oct  2 09:16:55 np0005465988 nova_compute[236126]: 2025-10-02 13:16:55.881 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "09e6757a-c973-4e60-bc00-5f66b22c4f51" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.199 2 DEBUG nova.network.neutron [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Updating instance_info_cache with network_info: [{"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.220 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Releasing lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.220 2 DEBUG nova.compute.manager [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Instance network_info: |[{"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.221 2 DEBUG oslo_concurrency.lockutils [req-8bef0aa1-573c-4925-a78a-be7b79885961 req-4e039fc4-f103-4c4c-9a23-08e8bd716a85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.222 2 DEBUG nova.network.neutron [req-8bef0aa1-573c-4925-a78a-be7b79885961 req-4e039fc4-f103-4c4c-9a23-08e8bd716a85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Refreshing network info cache for port f0875847-ec73-40e0-960f-0e3d0cd9a411 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.225 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Start _get_guest_xml network_info=[{"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.232 2 WARNING nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.237 2 DEBUG nova.virt.libvirt.host [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.238 2 DEBUG nova.virt.libvirt.host [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.248 2 DEBUG nova.virt.libvirt.host [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.248 2 DEBUG nova.virt.libvirt.host [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.250 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.251 2 DEBUG nova.virt.hardware [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:57:41Z,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1533ac528d35434c826050eed402afba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:57:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.251 2 DEBUG nova.virt.hardware [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.251 2 DEBUG nova.virt.hardware [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.252 2 DEBUG nova.virt.hardware [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.252 2 DEBUG nova.virt.hardware [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.252 2 DEBUG nova.virt.hardware [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.252 2 DEBUG nova.virt.hardware [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.253 2 DEBUG nova.virt.hardware [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.253 2 DEBUG nova.virt.hardware [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.253 2 DEBUG nova.virt.hardware [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.253 2 DEBUG nova.virt.hardware [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.257 2 DEBUG oslo_concurrency.processutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:16:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:16:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:16:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:56.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:16:56 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3897488117' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.825 2 DEBUG oslo_concurrency.processutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.568s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.858 2 DEBUG nova.storage.rbd_utils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] rbd image 09e6757a-c973-4e60-bc00-5f66b22c4f51_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:16:56 np0005465988 nova_compute[236126]: 2025-10-02 13:16:56.864 2 DEBUG oslo_concurrency.processutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:16:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:16:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:57.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:16:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:16:57 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1857333575' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.315 2 DEBUG oslo_concurrency.processutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.317 2 DEBUG nova.virt.libvirt.vif [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:16:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-122576717',display_name='tempest-TestNetworkAdvancedServerOps-server-122576717',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-122576717',id=223,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMf7reumBTydD8Bvj9fduWIu4RpBeEqz1kj5y/au6S9Ayi0aksKCGdFTLh16toR0AjipEWmJ4nHDHJ2lgdfLPKA/NumJYC5Za7YiBxtc1LNqsXSZIPPvfd4KUEHXCwXjIQ==',key_name='tempest-TestNetworkAdvancedServerOps-168802950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='08e102ae48244af2ab448a2e1ff757df',ramdisk_id='',reservation_id='r-lbbdviwb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1527846432',owner_user_name='tempest-TestNetworkAdvancedServerOps-1527846432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:16:51Z,user_data=None,user_id='ffe4d737e4414fb3a3e358f8ca3f3e1e',uuid=09e6757a-c973-4e60-bc00-5f66b22c4f51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.317 2 DEBUG nova.network.os_vif_util [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converting VIF {"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.318 2 DEBUG nova.network.os_vif_util [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:74,bridge_name='br-int',has_traffic_filtering=True,id=f0875847-ec73-40e0-960f-0e3d0cd9a411,network=Network(f64db62a-2972-4b2e-a8f5-e51887945e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0875847-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.319 2 DEBUG nova.objects.instance [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'pci_devices' on Instance uuid 09e6757a-c973-4e60-bc00-5f66b22c4f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.337 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] End _get_guest_xml xml=<domain type="kvm">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  <uuid>09e6757a-c973-4e60-bc00-5f66b22c4f51</uuid>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  <name>instance-000000df</name>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-122576717</nova:name>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 13:16:56</nova:creationTime>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <nova:user uuid="ffe4d737e4414fb3a3e358f8ca3f3e1e">tempest-TestNetworkAdvancedServerOps-1527846432-project-member</nova:user>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <nova:project uuid="08e102ae48244af2ab448a2e1ff757df">tempest-TestNetworkAdvancedServerOps-1527846432</nova:project>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <nova:port uuid="f0875847-ec73-40e0-960f-0e3d0cd9a411">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <system>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <entry name="serial">09e6757a-c973-4e60-bc00-5f66b22c4f51</entry>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <entry name="uuid">09e6757a-c973-4e60-bc00-5f66b22c4f51</entry>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    </system>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  <os>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  </os>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  <features>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  </features>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  </clock>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  <devices>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/09e6757a-c973-4e60-bc00-5f66b22c4f51_disk">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/09e6757a-c973-4e60-bc00-5f66b22c4f51_disk.config">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:53:9d:74"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <target dev="tapf0875847-ec"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    </interface>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/09e6757a-c973-4e60-bc00-5f66b22c4f51/console.log" append="off"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    </serial>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <video>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    </video>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    </rng>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 09:16:57 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 09:16:57 np0005465988 nova_compute[236126]:  </devices>
Oct  2 09:16:57 np0005465988 nova_compute[236126]: </domain>
Oct  2 09:16:57 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.339 2 DEBUG nova.compute.manager [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Preparing to wait for external event network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.339 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.339 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.340 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.340 2 DEBUG nova.virt.libvirt.vif [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T13:16:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-122576717',display_name='tempest-TestNetworkAdvancedServerOps-server-122576717',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-122576717',id=223,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMf7reumBTydD8Bvj9fduWIu4RpBeEqz1kj5y/au6S9Ayi0aksKCGdFTLh16toR0AjipEWmJ4nHDHJ2lgdfLPKA/NumJYC5Za7YiBxtc1LNqsXSZIPPvfd4KUEHXCwXjIQ==',key_name='tempest-TestNetworkAdvancedServerOps-168802950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='08e102ae48244af2ab448a2e1ff757df',ramdisk_id='',reservation_id='r-lbbdviwb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1527846432',owner_user_name='tempest-TestNetworkAdvancedServerOps-1527846432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T13:16:51Z,user_data=None,user_id='ffe4d737e4414fb3a3e358f8ca3f3e1e',uuid=09e6757a-c973-4e60-bc00-5f66b22c4f51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.341 2 DEBUG nova.network.os_vif_util [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converting VIF {"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.341 2 DEBUG nova.network.os_vif_util [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:74,bridge_name='br-int',has_traffic_filtering=True,id=f0875847-ec73-40e0-960f-0e3d0cd9a411,network=Network(f64db62a-2972-4b2e-a8f5-e51887945e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0875847-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.341 2 DEBUG os_vif [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:74,bridge_name='br-int',has_traffic_filtering=True,id=f0875847-ec73-40e0-960f-0e3d0cd9a411,network=Network(f64db62a-2972-4b2e-a8f5-e51887945e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0875847-ec') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.342 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.343 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.346 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0875847-ec, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.346 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf0875847-ec, col_values=(('external_ids', {'iface-id': 'f0875847-ec73-40e0-960f-0e3d0cd9a411', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:53:9d:74', 'vm-uuid': '09e6757a-c973-4e60-bc00-5f66b22c4f51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:57 np0005465988 NetworkManager[45041]: <info>  [1759411017.3505] manager: (tapf0875847-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/450)
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.356 2 INFO os_vif [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:74,bridge_name='br-int',has_traffic_filtering=True,id=f0875847-ec73-40e0-960f-0e3d0cd9a411,network=Network(f64db62a-2972-4b2e-a8f5-e51887945e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0875847-ec')#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.417 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.418 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.418 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] No VIF found with MAC fa:16:3e:53:9d:74, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.419 2 INFO nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Using config drive#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.452 2 DEBUG nova.storage.rbd_utils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] rbd image 09e6757a-c973-4e60-bc00-5f66b22c4f51_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:16:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:16:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.712 2 DEBUG nova.network.neutron [req-8bef0aa1-573c-4925-a78a-be7b79885961 req-4e039fc4-f103-4c4c-9a23-08e8bd716a85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Updated VIF entry in instance network info cache for port f0875847-ec73-40e0-960f-0e3d0cd9a411. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.713 2 DEBUG nova.network.neutron [req-8bef0aa1-573c-4925-a78a-be7b79885961 req-4e039fc4-f103-4c4c-9a23-08e8bd716a85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Updating instance_info_cache with network_info: [{"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.740 2 DEBUG oslo_concurrency.lockutils [req-8bef0aa1-573c-4925-a78a-be7b79885961 req-4e039fc4-f103-4c4c-9a23-08e8bd716a85 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.924 2 INFO nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Creating config drive at /var/lib/nova/instances/09e6757a-c973-4e60-bc00-5f66b22c4f51/disk.config#033[00m
Oct  2 09:16:57 np0005465988 nova_compute[236126]: 2025-10-02 13:16:57.929 2 DEBUG oslo_concurrency.processutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/09e6757a-c973-4e60-bc00-5f66b22c4f51/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqsvkmtdm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.072 2 DEBUG oslo_concurrency.processutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/09e6757a-c973-4e60-bc00-5f66b22c4f51/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqsvkmtdm" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.108 2 DEBUG nova.storage.rbd_utils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] rbd image 09e6757a-c973-4e60-bc00-5f66b22c4f51_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.113 2 DEBUG oslo_concurrency.processutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/09e6757a-c973-4e60-bc00-5f66b22c4f51/disk.config 09e6757a-c973-4e60-bc00-5f66b22c4f51_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.300 2 DEBUG oslo_concurrency.processutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/09e6757a-c973-4e60-bc00-5f66b22c4f51/disk.config 09e6757a-c973-4e60-bc00-5f66b22c4f51_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.187s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.301 2 INFO nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Deleting local config drive /var/lib/nova/instances/09e6757a-c973-4e60-bc00-5f66b22c4f51/disk.config because it was imported into RBD.#033[00m
Oct  2 09:16:58 np0005465988 kernel: tapf0875847-ec: entered promiscuous mode
Oct  2 09:16:58 np0005465988 NetworkManager[45041]: <info>  [1759411018.3554] manager: (tapf0875847-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/451)
Oct  2 09:16:58 np0005465988 ovn_controller[132601]: 2025-10-02T13:16:58Z|01007|binding|INFO|Claiming lport f0875847-ec73-40e0-960f-0e3d0cd9a411 for this chassis.
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:58 np0005465988 ovn_controller[132601]: 2025-10-02T13:16:58Z|01008|binding|INFO|f0875847-ec73-40e0-960f-0e3d0cd9a411: Claiming fa:16:3e:53:9d:74 10.100.0.3
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.374 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:9d:74 10.100.0.3'], port_security=['fa:16:3e:53:9d:74 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '09e6757a-c973-4e60-bc00-5f66b22c4f51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f64db62a-2972-4b2e-a8f5-e51887945e90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08e102ae48244af2ab448a2e1ff757df', 'neutron:revision_number': '2', 'neutron:security_group_ids': '57a62add-9cfb-4284-af25-abd70022edf9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c562599-5308-42d3-bd7e-fd65d6049e08, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f0875847-ec73-40e0-960f-0e3d0cd9a411) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.376 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f0875847-ec73-40e0-960f-0e3d0cd9a411 in datapath f64db62a-2972-4b2e-a8f5-e51887945e90 bound to our chassis#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.378 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f64db62a-2972-4b2e-a8f5-e51887945e90#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.390 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7a8c9b2e-8aa7-41d2-9f1c-3312e40d2bab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.390 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf64db62a-21 in ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.395 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf64db62a-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.395 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[5d203366-8c98-4280-8f70-2362fe45d7c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.395 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dec4f044-7dfe-4874-88b2-02cb89fe3e50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 systemd-machined[192594]: New machine qemu-106-instance-000000df.
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.407 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[d1580e28-e574-4960-b1b5-b41e57f5b5d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 systemd[1]: Started Virtual Machine qemu-106-instance-000000df.
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.435 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ae3a160c-734f-4e6b-b01e-8e736deffc32]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 ovn_controller[132601]: 2025-10-02T13:16:58Z|01009|binding|INFO|Setting lport f0875847-ec73-40e0-960f-0e3d0cd9a411 ovn-installed in OVS
Oct  2 09:16:58 np0005465988 ovn_controller[132601]: 2025-10-02T13:16:58Z|01010|binding|INFO|Setting lport f0875847-ec73-40e0-960f-0e3d0cd9a411 up in Southbound
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:58 np0005465988 systemd-udevd[345597]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:16:58 np0005465988 NetworkManager[45041]: <info>  [1759411018.4526] device (tapf0875847-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:16:58 np0005465988 NetworkManager[45041]: <info>  [1759411018.4535] device (tapf0875847-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.472 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[88391bf4-f8cf-4458-9d8f-fa83b610d22f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 NetworkManager[45041]: <info>  [1759411018.4803] manager: (tapf64db62a-20): new Veth device (/org/freedesktop/NetworkManager/Devices/452)
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.479 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[1733b7f8-db79-4986-a71b-e3413f4190e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.514 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[63699761-0781-4cec-8431-dcea8f2964d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.517 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[b73b32ec-1ef8-40ce-9beb-0caa7f481665]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 NetworkManager[45041]: <info>  [1759411018.5396] device (tapf64db62a-20): carrier: link connected
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.544 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb111d9-5369-417b-a026-57b48b6b684b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.564 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[91ef1bd6-edfa-44df-9d2b-ee5c8e111b07]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf64db62a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3b:d1:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 298], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 907887, 'reachable_time': 43726, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 345627, 'error': None, 'target': 'ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:16:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:16:58.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.579 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[d088c662-904b-444c-b0f3-63f47ebd5431]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3b:d13d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 907887, 'tstamp': 907887}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 345628, 'error': None, 'target': 'ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.596 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a869c5-b797-44dd-892d-2c4623a0433a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf64db62a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3b:d1:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 298], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 907887, 'reachable_time': 43726, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 345629, 'error': None, 'target': 'ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.623 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9070f717-20f7-4e36-b632-fb8c1ab1ec5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.660 2 DEBUG nova.compute.manager [req-738428c2-4280-447d-abcc-ac5becc644f8 req-bd4becd1-1451-4d55-949c-4c9cb253fe5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received event network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.661 2 DEBUG oslo_concurrency.lockutils [req-738428c2-4280-447d-abcc-ac5becc644f8 req-bd4becd1-1451-4d55-949c-4c9cb253fe5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.661 2 DEBUG oslo_concurrency.lockutils [req-738428c2-4280-447d-abcc-ac5becc644f8 req-bd4becd1-1451-4d55-949c-4c9cb253fe5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.662 2 DEBUG oslo_concurrency.lockutils [req-738428c2-4280-447d-abcc-ac5becc644f8 req-bd4becd1-1451-4d55-949c-4c9cb253fe5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.662 2 DEBUG nova.compute.manager [req-738428c2-4280-447d-abcc-ac5becc644f8 req-bd4becd1-1451-4d55-949c-4c9cb253fe5d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Processing event network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.698 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dfc7d335-ed4a-4cf7-8d41-b87fbbad1699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.699 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf64db62a-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.700 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.700 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf64db62a-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:16:58 np0005465988 NetworkManager[45041]: <info>  [1759411018.7031] manager: (tapf64db62a-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/453)
Oct  2 09:16:58 np0005465988 kernel: tapf64db62a-20: entered promiscuous mode
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.705 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf64db62a-20, col_values=(('external_ids', {'iface-id': '0446a480-1e12-4c33-972d-ae44155e006e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:16:58 np0005465988 ovn_controller[132601]: 2025-10-02T13:16:58Z|01011|binding|INFO|Releasing lport 0446a480-1e12-4c33-972d-ae44155e006e from this chassis (sb_readonly=0)
Oct  2 09:16:58 np0005465988 nova_compute[236126]: 2025-10-02 13:16:58.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.719 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f64db62a-2972-4b2e-a8f5-e51887945e90.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f64db62a-2972-4b2e-a8f5-e51887945e90.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.720 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bebcc325-9ee6-46e9-aea8-2dfebcbd6e53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.720 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-f64db62a-2972-4b2e-a8f5-e51887945e90
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/f64db62a-2972-4b2e-a8f5-e51887945e90.pid.haproxy
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID f64db62a-2972-4b2e-a8f5-e51887945e90
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 09:16:58 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:16:58.721 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90', 'env', 'PROCESS_TAG=haproxy-f64db62a-2972-4b2e-a8f5-e51887945e90', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f64db62a-2972-4b2e-a8f5-e51887945e90.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 09:16:59 np0005465988 podman[345677]: 2025-10-02 13:16:59.110783699 +0000 UTC m=+0.058613555 container create 62ebe793ff019cc1e46e3e0544b19ddb53858fdd49489ce63b46915b6888e1d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 09:16:59 np0005465988 systemd[1]: Started libpod-conmon-62ebe793ff019cc1e46e3e0544b19ddb53858fdd49489ce63b46915b6888e1d1.scope.
Oct  2 09:16:59 np0005465988 podman[345677]: 2025-10-02 13:16:59.086195427 +0000 UTC m=+0.034025313 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 09:16:59 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:16:59 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4b51808a5f9909a9a2c09c5f042b40e12bf116b1da36a578185efb001bca537/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 09:16:59 np0005465988 podman[345677]: 2025-10-02 13:16:59.207198933 +0000 UTC m=+0.155028789 container init 62ebe793ff019cc1e46e3e0544b19ddb53858fdd49489ce63b46915b6888e1d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:16:59 np0005465988 podman[345677]: 2025-10-02 13:16:59.215756827 +0000 UTC m=+0.163586683 container start 62ebe793ff019cc1e46e3e0544b19ddb53858fdd49489ce63b46915b6888e1d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 09:16:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:16:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:16:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:16:59.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:16:59 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[345717]: [NOTICE]   (345723) : New worker (345725) forked
Oct  2 09:16:59 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[345717]: [NOTICE]   (345723) : Loading success.
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.494 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.556 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759411019.5563028, 09e6757a-c973-4e60-bc00-5f66b22c4f51 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.557 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] VM Started (Lifecycle Event)#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.559 2 DEBUG nova.compute.manager [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.563 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.566 2 INFO nova.virt.libvirt.driver [-] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Instance spawned successfully.#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.566 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.580 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.588 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.595 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.596 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.597 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.597 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.598 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.598 2 DEBUG nova.virt.libvirt.driver [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.607 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.608 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759411019.5572593, 09e6757a-c973-4e60-bc00-5f66b22c4f51 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.608 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] VM Paused (Lifecycle Event)#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.631 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.636 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759411019.5614564, 09e6757a-c973-4e60-bc00-5f66b22c4f51 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.639 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] VM Resumed (Lifecycle Event)#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.653 2 INFO nova.compute.manager [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Took 8.10 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.654 2 DEBUG nova.compute.manager [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.663 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.667 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.705 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.728 2 INFO nova.compute.manager [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Took 9.11 seconds to build instance.#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.747 2 DEBUG oslo_concurrency.lockutils [None req-37af9a88-6d3c-4a24-b223-5d69afe741b5 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.748 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 3.867s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.748 2 INFO nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 09:16:59 np0005465988 nova_compute[236126]: 2025-10-02 13:16:59.749 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:17:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:00.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:17:00 np0005465988 nova_compute[236126]: 2025-10-02 13:17:00.756 2 DEBUG nova.compute.manager [req-bb0bd1fa-f859-4f6e-9ce2-28797ca95b6a req-8513b078-632f-4e9c-8c58-3fe2c7f52a5a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received event network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:17:00 np0005465988 nova_compute[236126]: 2025-10-02 13:17:00.757 2 DEBUG oslo_concurrency.lockutils [req-bb0bd1fa-f859-4f6e-9ce2-28797ca95b6a req-8513b078-632f-4e9c-8c58-3fe2c7f52a5a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:17:00 np0005465988 nova_compute[236126]: 2025-10-02 13:17:00.757 2 DEBUG oslo_concurrency.lockutils [req-bb0bd1fa-f859-4f6e-9ce2-28797ca95b6a req-8513b078-632f-4e9c-8c58-3fe2c7f52a5a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:17:00 np0005465988 nova_compute[236126]: 2025-10-02 13:17:00.758 2 DEBUG oslo_concurrency.lockutils [req-bb0bd1fa-f859-4f6e-9ce2-28797ca95b6a req-8513b078-632f-4e9c-8c58-3fe2c7f52a5a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:00 np0005465988 nova_compute[236126]: 2025-10-02 13:17:00.758 2 DEBUG nova.compute.manager [req-bb0bd1fa-f859-4f6e-9ce2-28797ca95b6a req-8513b078-632f-4e9c-8c58-3fe2c7f52a5a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] No waiting events found dispatching network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:17:00 np0005465988 nova_compute[236126]: 2025-10-02 13:17:00.758 2 WARNING nova.compute.manager [req-bb0bd1fa-f859-4f6e-9ce2-28797ca95b6a req-8513b078-632f-4e9c-8c58-3fe2c7f52a5a d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received unexpected event network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 for instance with vm_state active and task_state None.#033[00m
Oct  2 09:17:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:01.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #175. Immutable memtables: 0.
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.390724) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 111] Flushing memtable with next log file: 175
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411021390821, "job": 111, "event": "flush_started", "num_memtables": 1, "num_entries": 1339, "num_deletes": 257, "total_data_size": 2833841, "memory_usage": 2874040, "flush_reason": "Manual Compaction"}
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 111] Level-0 flush table #176: started
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411021407947, "cf_name": "default", "job": 111, "event": "table_file_creation", "file_number": 176, "file_size": 1857610, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 84932, "largest_seqno": 86266, "table_properties": {"data_size": 1851818, "index_size": 3122, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12837, "raw_average_key_size": 20, "raw_value_size": 1839913, "raw_average_value_size": 2874, "num_data_blocks": 137, "num_entries": 640, "num_filter_entries": 640, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759410922, "oldest_key_time": 1759410922, "file_creation_time": 1759411021, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 176, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 111] Flush lasted 17702 microseconds, and 6007 cpu microseconds.
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.408440) [db/flush_job.cc:967] [default] [JOB 111] Level-0 flush table #176: 1857610 bytes OK
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.408575) [db/memtable_list.cc:519] [default] Level-0 commit table #176 started
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.410621) [db/memtable_list.cc:722] [default] Level-0 commit table #176: memtable #1 done
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.410662) EVENT_LOG_v1 {"time_micros": 1759411021410651, "job": 111, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.410689) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 111] Try to delete WAL files size 2827499, prev total WAL file size 2827499, number of live WAL files 2.
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000172.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.412402) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033323630' seq:72057594037927935, type:22 .. '6C6F676D0033353131' seq:0, type:0; will stop at (end)
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 112] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 111 Base level 0, inputs: [176(1814KB)], [174(10MB)]
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411021412473, "job": 112, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [176], "files_L6": [174], "score": -1, "input_data_size": 13159009, "oldest_snapshot_seqno": -1}
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 112] Generated table #177: 10501 keys, 13019042 bytes, temperature: kUnknown
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411021500736, "cf_name": "default", "job": 112, "event": "table_file_creation", "file_number": 177, "file_size": 13019042, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12951863, "index_size": 39802, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26309, "raw_key_size": 277802, "raw_average_key_size": 26, "raw_value_size": 12768742, "raw_average_value_size": 1215, "num_data_blocks": 1511, "num_entries": 10501, "num_filter_entries": 10501, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759411021, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 177, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.501016) [db/compaction/compaction_job.cc:1663] [default] [JOB 112] Compacted 1@0 + 1@6 files to L6 => 13019042 bytes
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.502283) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 149.0 rd, 147.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 10.8 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(14.1) write-amplify(7.0) OK, records in: 11034, records dropped: 533 output_compression: NoCompression
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.502298) EVENT_LOG_v1 {"time_micros": 1759411021502290, "job": 112, "event": "compaction_finished", "compaction_time_micros": 88332, "compaction_time_cpu_micros": 34044, "output_level": 6, "num_output_files": 1, "total_output_size": 13019042, "num_input_records": 11034, "num_output_records": 10501, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000176.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411021502718, "job": 112, "event": "table_file_deletion", "file_number": 176}
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000174.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411021504529, "job": 112, "event": "table_file_deletion", "file_number": 174}
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.412247) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.505796) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.505805) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.505809) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.505813) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:17:01 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:17:01.505816) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:17:01 np0005465988 nova_compute[236126]: 2025-10-02 13:17:01.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:02 np0005465988 nova_compute[236126]: 2025-10-02 13:17:02.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:17:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:02.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:17:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:17:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:03.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:17:03 np0005465988 podman[345738]: 2025-10-02 13:17:03.525664306 +0000 UTC m=+0.062008761 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 09:17:03 np0005465988 podman[345737]: 2025-10-02 13:17:03.557120454 +0000 UTC m=+0.093694267 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 09:17:03 np0005465988 podman[345739]: 2025-10-02 13:17:03.564791853 +0000 UTC m=+0.094692045 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  2 09:17:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:03.700 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=88, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=87) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:17:03 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:03.702 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:17:03 np0005465988 nova_compute[236126]: 2025-10-02 13:17:03.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:04 np0005465988 nova_compute[236126]: 2025-10-02 13:17:04.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:04 np0005465988 NetworkManager[45041]: <info>  [1759411024.4958] manager: (patch-br-int-to-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/454)
Oct  2 09:17:04 np0005465988 NetworkManager[45041]: <info>  [1759411024.4974] manager: (patch-provnet-d4c15165-6ca5-4aca-85b0-69b58109c521-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/455)
Oct  2 09:17:04 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:04Z|01012|binding|INFO|Releasing lport 0446a480-1e12-4c33-972d-ae44155e006e from this chassis (sb_readonly=0)
Oct  2 09:17:04 np0005465988 nova_compute[236126]: 2025-10-02 13:17:04.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:04 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:04Z|01013|binding|INFO|Releasing lport 0446a480-1e12-4c33-972d-ae44155e006e from this chassis (sb_readonly=0)
Oct  2 09:17:04 np0005465988 nova_compute[236126]: 2025-10-02 13:17:04.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:17:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:04.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:17:04 np0005465988 nova_compute[236126]: 2025-10-02 13:17:04.864 2 DEBUG nova.compute.manager [req-ac251d0d-d42c-46f8-adbc-7156e960fafd req-e3eda776-2aab-448b-bcdb-4864d32f5cdb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received event network-changed-f0875847-ec73-40e0-960f-0e3d0cd9a411 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:17:04 np0005465988 nova_compute[236126]: 2025-10-02 13:17:04.865 2 DEBUG nova.compute.manager [req-ac251d0d-d42c-46f8-adbc-7156e960fafd req-e3eda776-2aab-448b-bcdb-4864d32f5cdb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Refreshing instance network info cache due to event network-changed-f0875847-ec73-40e0-960f-0e3d0cd9a411. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:17:04 np0005465988 nova_compute[236126]: 2025-10-02 13:17:04.866 2 DEBUG oslo_concurrency.lockutils [req-ac251d0d-d42c-46f8-adbc-7156e960fafd req-e3eda776-2aab-448b-bcdb-4864d32f5cdb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:17:04 np0005465988 nova_compute[236126]: 2025-10-02 13:17:04.866 2 DEBUG oslo_concurrency.lockutils [req-ac251d0d-d42c-46f8-adbc-7156e960fafd req-e3eda776-2aab-448b-bcdb-4864d32f5cdb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:17:04 np0005465988 nova_compute[236126]: 2025-10-02 13:17:04.866 2 DEBUG nova.network.neutron [req-ac251d0d-d42c-46f8-adbc-7156e960fafd req-e3eda776-2aab-448b-bcdb-4864d32f5cdb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Refreshing network info cache for port f0875847-ec73-40e0-960f-0e3d0cd9a411 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:17:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:17:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:05.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:17:06 np0005465988 nova_compute[236126]: 2025-10-02 13:17:06.231 2 DEBUG nova.network.neutron [req-ac251d0d-d42c-46f8-adbc-7156e960fafd req-e3eda776-2aab-448b-bcdb-4864d32f5cdb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Updated VIF entry in instance network info cache for port f0875847-ec73-40e0-960f-0e3d0cd9a411. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:17:06 np0005465988 nova_compute[236126]: 2025-10-02 13:17:06.233 2 DEBUG nova.network.neutron [req-ac251d0d-d42c-46f8-adbc-7156e960fafd req-e3eda776-2aab-448b-bcdb-4864d32f5cdb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Updating instance_info_cache with network_info: [{"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:17:06 np0005465988 nova_compute[236126]: 2025-10-02 13:17:06.256 2 DEBUG oslo_concurrency.lockutils [req-ac251d0d-d42c-46f8-adbc-7156e960fafd req-e3eda776-2aab-448b-bcdb-4864d32f5cdb d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:17:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:17:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:17:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:06.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:17:06 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:06.706 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '88'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:17:06 np0005465988 nova_compute[236126]: 2025-10-02 13:17:06.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:17:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:07.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:17:07 np0005465988 nova_compute[236126]: 2025-10-02 13:17:07.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:08.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:09.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:17:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:10.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:17:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:17:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:11.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:17:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:17:11 np0005465988 nova_compute[236126]: 2025-10-02 13:17:11.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:12 np0005465988 nova_compute[236126]: 2025-10-02 13:17:12.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:12.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:17:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:13.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:17:13 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:13Z|00133|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:53:9d:74 10.100.0.3
Oct  2 09:17:13 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:13Z|00134|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:53:9d:74 10.100.0.3
Oct  2 09:17:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:14.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:15.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:17:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:16.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:16 np0005465988 nova_compute[236126]: 2025-10-02 13:17:16.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:17.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:17 np0005465988 nova_compute[236126]: 2025-10-02 13:17:17.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:18 np0005465988 nova_compute[236126]: 2025-10-02 13:17:18.035 2 INFO nova.compute.manager [None req-e1516bed-5220-4f71-9816-4a272f59aa63 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Get console output#033[00m
Oct  2 09:17:18 np0005465988 nova_compute[236126]: 2025-10-02 13:17:18.043 15591 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 09:17:18 np0005465988 nova_compute[236126]: 2025-10-02 13:17:18.349 2 DEBUG oslo_concurrency.lockutils [None req-b40aa8fe-00ca-40b9-8b0a-fbed64c36124 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "09e6757a-c973-4e60-bc00-5f66b22c4f51" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:17:18 np0005465988 nova_compute[236126]: 2025-10-02 13:17:18.350 2 DEBUG oslo_concurrency.lockutils [None req-b40aa8fe-00ca-40b9-8b0a-fbed64c36124 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:17:18 np0005465988 nova_compute[236126]: 2025-10-02 13:17:18.350 2 DEBUG nova.compute.manager [None req-b40aa8fe-00ca-40b9-8b0a-fbed64c36124 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:17:18 np0005465988 nova_compute[236126]: 2025-10-02 13:17:18.354 2 DEBUG nova.compute.manager [None req-b40aa8fe-00ca-40b9-8b0a-fbed64c36124 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Oct  2 09:17:18 np0005465988 nova_compute[236126]: 2025-10-02 13:17:18.354 2 DEBUG nova.objects.instance [None req-b40aa8fe-00ca-40b9-8b0a-fbed64c36124 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'flavor' on Instance uuid 09e6757a-c973-4e60-bc00-5f66b22c4f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:17:18 np0005465988 nova_compute[236126]: 2025-10-02 13:17:18.384 2 DEBUG nova.virt.libvirt.driver [None req-b40aa8fe-00ca-40b9-8b0a-fbed64c36124 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 09:17:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:18.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:19.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:20.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:20 np0005465988 kernel: tapf0875847-ec (unregistering): left promiscuous mode
Oct  2 09:17:20 np0005465988 NetworkManager[45041]: <info>  [1759411040.7647] device (tapf0875847-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:17:20 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:20Z|01014|binding|INFO|Releasing lport f0875847-ec73-40e0-960f-0e3d0cd9a411 from this chassis (sb_readonly=0)
Oct  2 09:17:20 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:20Z|01015|binding|INFO|Setting lport f0875847-ec73-40e0-960f-0e3d0cd9a411 down in Southbound
Oct  2 09:17:20 np0005465988 nova_compute[236126]: 2025-10-02 13:17:20.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:20 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:20Z|01016|binding|INFO|Removing iface tapf0875847-ec ovn-installed in OVS
Oct  2 09:17:20 np0005465988 nova_compute[236126]: 2025-10-02 13:17:20.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:20.781 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:9d:74 10.100.0.3'], port_security=['fa:16:3e:53:9d:74 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '09e6757a-c973-4e60-bc00-5f66b22c4f51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f64db62a-2972-4b2e-a8f5-e51887945e90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08e102ae48244af2ab448a2e1ff757df', 'neutron:revision_number': '4', 'neutron:security_group_ids': '57a62add-9cfb-4284-af25-abd70022edf9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.187'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c562599-5308-42d3-bd7e-fd65d6049e08, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f0875847-ec73-40e0-960f-0e3d0cd9a411) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:17:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:20.782 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f0875847-ec73-40e0-960f-0e3d0cd9a411 in datapath f64db62a-2972-4b2e-a8f5-e51887945e90 unbound from our chassis#033[00m
Oct  2 09:17:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:20.783 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f64db62a-2972-4b2e-a8f5-e51887945e90, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 09:17:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:20.785 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[57ca4f31-cf5c-4e34-9091-ae4f57b0ddab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:20 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:20.786 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90 namespace which is not needed anymore#033[00m
Oct  2 09:17:20 np0005465988 nova_compute[236126]: 2025-10-02 13:17:20.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:20 np0005465988 systemd[1]: machine-qemu\x2d106\x2dinstance\x2d000000df.scope: Deactivated successfully.
Oct  2 09:17:20 np0005465988 systemd[1]: machine-qemu\x2d106\x2dinstance\x2d000000df.scope: Consumed 14.422s CPU time.
Oct  2 09:17:20 np0005465988 systemd-machined[192594]: Machine qemu-106-instance-000000df terminated.
Oct  2 09:17:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:17:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:21.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:17:21 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[345717]: [NOTICE]   (345723) : haproxy version is 2.8.14-c23fe91
Oct  2 09:17:21 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[345717]: [NOTICE]   (345723) : path to executable is /usr/sbin/haproxy
Oct  2 09:17:21 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[345717]: [WARNING]  (345723) : Exiting Master process...
Oct  2 09:17:21 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[345717]: [WARNING]  (345723) : Exiting Master process...
Oct  2 09:17:21 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[345717]: [ALERT]    (345723) : Current worker (345725) exited with code 143 (Terminated)
Oct  2 09:17:21 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[345717]: [WARNING]  (345723) : All workers exited. Exiting... (0)
Oct  2 09:17:21 np0005465988 systemd[1]: libpod-62ebe793ff019cc1e46e3e0544b19ddb53858fdd49489ce63b46915b6888e1d1.scope: Deactivated successfully.
Oct  2 09:17:21 np0005465988 podman[345880]: 2025-10-02 13:17:21.285254596 +0000 UTC m=+0.393933900 container died 62ebe793ff019cc1e46e3e0544b19ddb53858fdd49489ce63b46915b6888e1d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:17:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:17:21 np0005465988 nova_compute[236126]: 2025-10-02 13:17:21.407 2 INFO nova.virt.libvirt.driver [None req-b40aa8fe-00ca-40b9-8b0a-fbed64c36124 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 09:17:21 np0005465988 nova_compute[236126]: 2025-10-02 13:17:21.413 2 DEBUG nova.compute.manager [req-8d32228d-8125-42cf-af2b-1c2e68d0809e req-4442345d-171c-4d92-8be8-b90b45358823 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received event network-vif-unplugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:17:21 np0005465988 nova_compute[236126]: 2025-10-02 13:17:21.414 2 DEBUG oslo_concurrency.lockutils [req-8d32228d-8125-42cf-af2b-1c2e68d0809e req-4442345d-171c-4d92-8be8-b90b45358823 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:17:21 np0005465988 nova_compute[236126]: 2025-10-02 13:17:21.414 2 DEBUG oslo_concurrency.lockutils [req-8d32228d-8125-42cf-af2b-1c2e68d0809e req-4442345d-171c-4d92-8be8-b90b45358823 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:17:21 np0005465988 nova_compute[236126]: 2025-10-02 13:17:21.414 2 DEBUG oslo_concurrency.lockutils [req-8d32228d-8125-42cf-af2b-1c2e68d0809e req-4442345d-171c-4d92-8be8-b90b45358823 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:21 np0005465988 nova_compute[236126]: 2025-10-02 13:17:21.415 2 DEBUG nova.compute.manager [req-8d32228d-8125-42cf-af2b-1c2e68d0809e req-4442345d-171c-4d92-8be8-b90b45358823 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] No waiting events found dispatching network-vif-unplugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:17:21 np0005465988 nova_compute[236126]: 2025-10-02 13:17:21.415 2 WARNING nova.compute.manager [req-8d32228d-8125-42cf-af2b-1c2e68d0809e req-4442345d-171c-4d92-8be8-b90b45358823 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received unexpected event network-vif-unplugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 for instance with vm_state active and task_state powering-off.#033[00m
Oct  2 09:17:21 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62ebe793ff019cc1e46e3e0544b19ddb53858fdd49489ce63b46915b6888e1d1-userdata-shm.mount: Deactivated successfully.
Oct  2 09:17:21 np0005465988 systemd[1]: var-lib-containers-storage-overlay-f4b51808a5f9909a9a2c09c5f042b40e12bf116b1da36a578185efb001bca537-merged.mount: Deactivated successfully.
Oct  2 09:17:21 np0005465988 nova_compute[236126]: 2025-10-02 13:17:21.421 2 INFO nova.virt.libvirt.driver [-] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Instance destroyed successfully.#033[00m
Oct  2 09:17:21 np0005465988 nova_compute[236126]: 2025-10-02 13:17:21.421 2 DEBUG nova.objects.instance [None req-b40aa8fe-00ca-40b9-8b0a-fbed64c36124 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'numa_topology' on Instance uuid 09e6757a-c973-4e60-bc00-5f66b22c4f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:17:21 np0005465988 podman[345880]: 2025-10-02 13:17:21.43143585 +0000 UTC m=+0.540115154 container cleanup 62ebe793ff019cc1e46e3e0544b19ddb53858fdd49489ce63b46915b6888e1d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:17:21 np0005465988 nova_compute[236126]: 2025-10-02 13:17:21.437 2 DEBUG nova.compute.manager [None req-b40aa8fe-00ca-40b9-8b0a-fbed64c36124 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:17:21 np0005465988 systemd[1]: libpod-conmon-62ebe793ff019cc1e46e3e0544b19ddb53858fdd49489ce63b46915b6888e1d1.scope: Deactivated successfully.
Oct  2 09:17:21 np0005465988 nova_compute[236126]: 2025-10-02 13:17:21.490 2 DEBUG oslo_concurrency.lockutils [None req-b40aa8fe-00ca-40b9-8b0a-fbed64c36124 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:21 np0005465988 podman[345921]: 2025-10-02 13:17:21.499329949 +0000 UTC m=+0.044279975 container remove 62ebe793ff019cc1e46e3e0544b19ddb53858fdd49489ce63b46915b6888e1d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 09:17:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:21.505 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fd652241-aeb1-4dfc-849d-2b33698a49da]: (4, ('Thu Oct  2 01:17:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90 (62ebe793ff019cc1e46e3e0544b19ddb53858fdd49489ce63b46915b6888e1d1)\n62ebe793ff019cc1e46e3e0544b19ddb53858fdd49489ce63b46915b6888e1d1\nThu Oct  2 01:17:21 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90 (62ebe793ff019cc1e46e3e0544b19ddb53858fdd49489ce63b46915b6888e1d1)\n62ebe793ff019cc1e46e3e0544b19ddb53858fdd49489ce63b46915b6888e1d1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:21.507 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[a4fb99f9-9cd8-4265-bebf-61d7fa5f857e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:21.509 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf64db62a-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:17:21 np0005465988 nova_compute[236126]: 2025-10-02 13:17:21.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:21 np0005465988 kernel: tapf64db62a-20: left promiscuous mode
Oct  2 09:17:21 np0005465988 nova_compute[236126]: 2025-10-02 13:17:21.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:21.538 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[699c5ac1-56bc-4bbe-b3af-231b68c734ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:21.582 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[dae533f7-a0bd-4775-af80-300f2c8bbb60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:21.584 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[611a03f0-8739-4fe8-872e-03e3849cfe59]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:21.602 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7380efbe-9887-4a15-bade-7b741b128f6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 907879, 'reachable_time': 44527, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 345939, 'error': None, 'target': 'ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:21.606 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 09:17:21 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:21.606 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[e142790c-9242-453e-abe5-912ad92c6667]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:21 np0005465988 systemd[1]: run-netns-ovnmeta\x2df64db62a\x2d2972\x2d4b2e\x2da8f5\x2de51887945e90.mount: Deactivated successfully.
Oct  2 09:17:21 np0005465988 nova_compute[236126]: 2025-10-02 13:17:21.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:22 np0005465988 nova_compute[236126]: 2025-10-02 13:17:22.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:22.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:23.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:23 np0005465988 nova_compute[236126]: 2025-10-02 13:17:23.530 2 DEBUG nova.compute.manager [req-8e2a7f71-82c5-4ca5-b1ef-707b69ba851f req-602531f2-fe19-4f4c-b5d4-372bd406bb2e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received event network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:17:23 np0005465988 nova_compute[236126]: 2025-10-02 13:17:23.530 2 DEBUG oslo_concurrency.lockutils [req-8e2a7f71-82c5-4ca5-b1ef-707b69ba851f req-602531f2-fe19-4f4c-b5d4-372bd406bb2e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:17:23 np0005465988 nova_compute[236126]: 2025-10-02 13:17:23.531 2 DEBUG oslo_concurrency.lockutils [req-8e2a7f71-82c5-4ca5-b1ef-707b69ba851f req-602531f2-fe19-4f4c-b5d4-372bd406bb2e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:17:23 np0005465988 nova_compute[236126]: 2025-10-02 13:17:23.531 2 DEBUG oslo_concurrency.lockutils [req-8e2a7f71-82c5-4ca5-b1ef-707b69ba851f req-602531f2-fe19-4f4c-b5d4-372bd406bb2e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:23 np0005465988 nova_compute[236126]: 2025-10-02 13:17:23.531 2 DEBUG nova.compute.manager [req-8e2a7f71-82c5-4ca5-b1ef-707b69ba851f req-602531f2-fe19-4f4c-b5d4-372bd406bb2e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] No waiting events found dispatching network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:17:23 np0005465988 nova_compute[236126]: 2025-10-02 13:17:23.532 2 WARNING nova.compute.manager [req-8e2a7f71-82c5-4ca5-b1ef-707b69ba851f req-602531f2-fe19-4f4c-b5d4-372bd406bb2e d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received unexpected event network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 for instance with vm_state stopped and task_state None.#033[00m
Oct  2 09:17:24 np0005465988 nova_compute[236126]: 2025-10-02 13:17:24.590 2 INFO nova.compute.manager [None req-b0e0bbaa-71cd-4200-86a9-9a228b38fba3 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Get console output#033[00m
Oct  2 09:17:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:24.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:24 np0005465988 nova_compute[236126]: 2025-10-02 13:17:24.829 2 DEBUG nova.objects.instance [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'flavor' on Instance uuid 09e6757a-c973-4e60-bc00-5f66b22c4f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:17:24 np0005465988 nova_compute[236126]: 2025-10-02 13:17:24.856 2 DEBUG oslo_concurrency.lockutils [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:17:24 np0005465988 nova_compute[236126]: 2025-10-02 13:17:24.857 2 DEBUG oslo_concurrency.lockutils [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquired lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:17:24 np0005465988 nova_compute[236126]: 2025-10-02 13:17:24.857 2 DEBUG nova.network.neutron [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 09:17:24 np0005465988 nova_compute[236126]: 2025-10-02 13:17:24.857 2 DEBUG nova.objects.instance [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'info_cache' on Instance uuid 09e6757a-c973-4e60-bc00-5f66b22c4f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:17:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:25.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:25 np0005465988 podman[345942]: 2025-10-02 13:17:25.539660199 +0000 UTC m=+0.069044053 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:17:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:17:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:26.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:26 np0005465988 nova_compute[236126]: 2025-10-02 13:17:26.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:27.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:27 np0005465988 nova_compute[236126]: 2025-10-02 13:17:27.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:27.425 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:17:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:27.426 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:17:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:27.426 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:27 np0005465988 nova_compute[236126]: 2025-10-02 13:17:27.941 2 DEBUG nova.network.neutron [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Updating instance_info_cache with network_info: [{"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:17:27 np0005465988 nova_compute[236126]: 2025-10-02 13:17:27.966 2 DEBUG oslo_concurrency.lockutils [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Releasing lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:17:27 np0005465988 nova_compute[236126]: 2025-10-02 13:17:27.991 2 INFO nova.virt.libvirt.driver [-] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Instance destroyed successfully.#033[00m
Oct  2 09:17:27 np0005465988 nova_compute[236126]: 2025-10-02 13:17:27.992 2 DEBUG nova.objects.instance [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'numa_topology' on Instance uuid 09e6757a-c973-4e60-bc00-5f66b22c4f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.005 2 DEBUG nova.objects.instance [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'resources' on Instance uuid 09e6757a-c973-4e60-bc00-5f66b22c4f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.017 2 DEBUG nova.virt.libvirt.vif [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:16:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-122576717',display_name='tempest-TestNetworkAdvancedServerOps-server-122576717',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-122576717',id=223,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMf7reumBTydD8Bvj9fduWIu4RpBeEqz1kj5y/au6S9Ayi0aksKCGdFTLh16toR0AjipEWmJ4nHDHJ2lgdfLPKA/NumJYC5Za7YiBxtc1LNqsXSZIPPvfd4KUEHXCwXjIQ==',key_name='tempest-TestNetworkAdvancedServerOps-168802950',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:16:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='08e102ae48244af2ab448a2e1ff757df',ramdisk_id='',reservation_id='r-lbbdviwb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1527846432',owner_user_name='tempest-TestNetworkAdvancedServerOps-1527846432-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:17:21Z,user_data=None,user_id='ffe4d737e4414fb3a3e358f8ca3f3e1e',uuid=09e6757a-c973-4e60-bc00-5f66b22c4f51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.018 2 DEBUG nova.network.os_vif_util [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converting VIF {"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.018 2 DEBUG nova.network.os_vif_util [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:74,bridge_name='br-int',has_traffic_filtering=True,id=f0875847-ec73-40e0-960f-0e3d0cd9a411,network=Network(f64db62a-2972-4b2e-a8f5-e51887945e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0875847-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.019 2 DEBUG os_vif [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:74,bridge_name='br-int',has_traffic_filtering=True,id=f0875847-ec73-40e0-960f-0e3d0cd9a411,network=Network(f64db62a-2972-4b2e-a8f5-e51887945e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0875847-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.022 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0875847-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.028 2 INFO os_vif [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:74,bridge_name='br-int',has_traffic_filtering=True,id=f0875847-ec73-40e0-960f-0e3d0cd9a411,network=Network(f64db62a-2972-4b2e-a8f5-e51887945e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0875847-ec')#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.035 2 DEBUG nova.virt.libvirt.driver [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Start _get_guest_xml network_info=[{"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'c2d0c2bc-fe21-4689-86ae-d6728c15874c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.039 2 WARNING nova.virt.libvirt.driver [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.049 2 DEBUG nova.virt.libvirt.host [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.049 2 DEBUG nova.virt.libvirt.host [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.052 2 DEBUG nova.virt.libvirt.host [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.053 2 DEBUG nova.virt.libvirt.host [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.054 2 DEBUG nova.virt.libvirt.driver [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.054 2 DEBUG nova.virt.hardware [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:57:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cef129e5-cce4-4465-9674-03d3559e8a14',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=c2d0c2bc-fe21-4689-86ae-d6728c15874c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.054 2 DEBUG nova.virt.hardware [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.055 2 DEBUG nova.virt.hardware [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.055 2 DEBUG nova.virt.hardware [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.055 2 DEBUG nova.virt.hardware [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.055 2 DEBUG nova.virt.hardware [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.055 2 DEBUG nova.virt.hardware [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.056 2 DEBUG nova.virt.hardware [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.056 2 DEBUG nova.virt.hardware [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.056 2 DEBUG nova.virt.hardware [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.056 2 DEBUG nova.virt.hardware [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.057 2 DEBUG nova.objects.instance [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'vcpu_model' on Instance uuid 09e6757a-c973-4e60-bc00-5f66b22c4f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.072 2 DEBUG oslo_concurrency.processutils [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:17:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:17:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1594647742' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.527 2 DEBUG oslo_concurrency.processutils [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:17:28 np0005465988 nova_compute[236126]: 2025-10-02 13:17:28.572 2 DEBUG oslo_concurrency.processutils [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:17:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:28.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:29 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Oct  2 09:17:29 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3463319671' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.035 2 DEBUG oslo_concurrency.processutils [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.039 2 DEBUG nova.virt.libvirt.vif [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:16:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-122576717',display_name='tempest-TestNetworkAdvancedServerOps-server-122576717',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-122576717',id=223,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMf7reumBTydD8Bvj9fduWIu4RpBeEqz1kj5y/au6S9Ayi0aksKCGdFTLh16toR0AjipEWmJ4nHDHJ2lgdfLPKA/NumJYC5Za7YiBxtc1LNqsXSZIPPvfd4KUEHXCwXjIQ==',key_name='tempest-TestNetworkAdvancedServerOps-168802950',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:16:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='08e102ae48244af2ab448a2e1ff757df',ramdisk_id='',reservation_id='r-lbbdviwb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1527846432',owner_user_name='tempest-TestNetworkAdvancedServerOps-1527846432-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:17:21Z,user_data=None,user_id='ffe4d737e4414fb3a3e358f8ca3f3e1e',uuid=09e6757a-c973-4e60-bc00-5f66b22c4f51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.040 2 DEBUG nova.network.os_vif_util [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converting VIF {"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.041 2 DEBUG nova.network.os_vif_util [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:74,bridge_name='br-int',has_traffic_filtering=True,id=f0875847-ec73-40e0-960f-0e3d0cd9a411,network=Network(f64db62a-2972-4b2e-a8f5-e51887945e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0875847-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.043 2 DEBUG nova.objects.instance [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'pci_devices' on Instance uuid 09e6757a-c973-4e60-bc00-5f66b22c4f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.067 2 DEBUG nova.virt.libvirt.driver [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] End _get_guest_xml xml=<domain type="kvm">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  <uuid>09e6757a-c973-4e60-bc00-5f66b22c4f51</uuid>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  <name>instance-000000df</name>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  <memory>131072</memory>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  <vcpu>1</vcpu>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  <metadata>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-122576717</nova:name>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <nova:creationTime>2025-10-02 13:17:28</nova:creationTime>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <nova:flavor name="m1.nano">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <nova:memory>128</nova:memory>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <nova:disk>1</nova:disk>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <nova:swap>0</nova:swap>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <nova:vcpus>1</nova:vcpus>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      </nova:flavor>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <nova:owner>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <nova:user uuid="ffe4d737e4414fb3a3e358f8ca3f3e1e">tempest-TestNetworkAdvancedServerOps-1527846432-project-member</nova:user>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <nova:project uuid="08e102ae48244af2ab448a2e1ff757df">tempest-TestNetworkAdvancedServerOps-1527846432</nova:project>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      </nova:owner>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <nova:root type="image" uuid="c2d0c2bc-fe21-4689-86ae-d6728c15874c"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <nova:ports>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <nova:port uuid="f0875847-ec73-40e0-960f-0e3d0cd9a411">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        </nova:port>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      </nova:ports>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    </nova:instance>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  </metadata>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  <sysinfo type="smbios">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <system>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <entry name="manufacturer">RDO</entry>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <entry name="product">OpenStack Compute</entry>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <entry name="serial">09e6757a-c973-4e60-bc00-5f66b22c4f51</entry>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <entry name="uuid">09e6757a-c973-4e60-bc00-5f66b22c4f51</entry>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <entry name="family">Virtual Machine</entry>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    </system>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  </sysinfo>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  <os>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <boot dev="hd"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <smbios mode="sysinfo"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  </os>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  <features>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <acpi/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <apic/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <vmcoreinfo/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  </features>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  <clock offset="utc">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <timer name="hpet" present="no"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  </clock>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  <cpu mode="custom" match="exact">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <model>Nehalem</model>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  </cpu>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  <devices>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <disk type="network" device="disk">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/09e6757a-c973-4e60-bc00-5f66b22c4f51_disk">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <target dev="vda" bus="virtio"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <disk type="network" device="cdrom">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <driver type="raw" cache="none"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <source protocol="rbd" name="vms/09e6757a-c973-4e60-bc00-5f66b22c4f51_disk.config">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.100" port="6789"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.102" port="6789"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <host name="192.168.122.101" port="6789"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      </source>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <auth username="openstack">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:        <secret type="ceph" uuid="fd4c5763-22d1-50ea-ad0b-96a3dc3040b2"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      </auth>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <target dev="sda" bus="sata"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    </disk>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <interface type="ethernet">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <mac address="fa:16:3e:53:9d:74"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <mtu size="1442"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <target dev="tapf0875847-ec"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    </interface>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <serial type="pty">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <log file="/var/lib/nova/instances/09e6757a-c973-4e60-bc00-5f66b22c4f51/console.log" append="off"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    </serial>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <video>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <model type="virtio"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    </video>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <input type="tablet" bus="usb"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <input type="keyboard" bus="usb"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <rng model="virtio">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <backend model="random">/dev/urandom</backend>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    </rng>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <controller type="usb" index="0"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    <memballoon model="virtio">
Oct  2 09:17:29 np0005465988 nova_compute[236126]:      <stats period="10"/>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:    </memballoon>
Oct  2 09:17:29 np0005465988 nova_compute[236126]:  </devices>
Oct  2 09:17:29 np0005465988 nova_compute[236126]: </domain>
Oct  2 09:17:29 np0005465988 nova_compute[236126]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.069 2 DEBUG nova.virt.libvirt.driver [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] skipping disk for instance-000000df as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.070 2 DEBUG nova.virt.libvirt.driver [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] skipping disk for instance-000000df as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.071 2 DEBUG nova.virt.libvirt.vif [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:16:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-122576717',display_name='tempest-TestNetworkAdvancedServerOps-server-122576717',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-122576717',id=223,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMf7reumBTydD8Bvj9fduWIu4RpBeEqz1kj5y/au6S9Ayi0aksKCGdFTLh16toR0AjipEWmJ4nHDHJ2lgdfLPKA/NumJYC5Za7YiBxtc1LNqsXSZIPPvfd4KUEHXCwXjIQ==',key_name='tempest-TestNetworkAdvancedServerOps-168802950',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:16:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='08e102ae48244af2ab448a2e1ff757df',ramdisk_id='',reservation_id='r-lbbdviwb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1527846432',owner_user_name='tempest-TestNetworkAdvancedServerOps-1527846432-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:17:21Z,user_data=None,user_id='ffe4d737e4414fb3a3e358f8ca3f3e1e',uuid=09e6757a-c973-4e60-bc00-5f66b22c4f51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.072 2 DEBUG nova.network.os_vif_util [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converting VIF {"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.073 2 DEBUG nova.network.os_vif_util [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:74,bridge_name='br-int',has_traffic_filtering=True,id=f0875847-ec73-40e0-960f-0e3d0cd9a411,network=Network(f64db62a-2972-4b2e-a8f5-e51887945e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0875847-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.073 2 DEBUG os_vif [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:74,bridge_name='br-int',has_traffic_filtering=True,id=f0875847-ec73-40e0-960f-0e3d0cd9a411,network=Network(f64db62a-2972-4b2e-a8f5-e51887945e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0875847-ec') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.074 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.074 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.081 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0875847-ec, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.082 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf0875847-ec, col_values=(('external_ids', {'iface-id': 'f0875847-ec73-40e0-960f-0e3d0cd9a411', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:53:9d:74', 'vm-uuid': '09e6757a-c973-4e60-bc00-5f66b22c4f51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:17:29 np0005465988 NetworkManager[45041]: <info>  [1759411049.0846] manager: (tapf0875847-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/456)
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.090 2 INFO os_vif [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:74,bridge_name='br-int',has_traffic_filtering=True,id=f0875847-ec73-40e0-960f-0e3d0cd9a411,network=Network(f64db62a-2972-4b2e-a8f5-e51887945e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0875847-ec')#033[00m
Oct  2 09:17:29 np0005465988 kernel: tapf0875847-ec: entered promiscuous mode
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:29 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:29Z|01017|binding|INFO|Claiming lport f0875847-ec73-40e0-960f-0e3d0cd9a411 for this chassis.
Oct  2 09:17:29 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:29Z|01018|binding|INFO|f0875847-ec73-40e0-960f-0e3d0cd9a411: Claiming fa:16:3e:53:9d:74 10.100.0.3
Oct  2 09:17:29 np0005465988 NetworkManager[45041]: <info>  [1759411049.1851] manager: (tapf0875847-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/457)
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.189 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:9d:74 10.100.0.3'], port_security=['fa:16:3e:53:9d:74 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '09e6757a-c973-4e60-bc00-5f66b22c4f51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f64db62a-2972-4b2e-a8f5-e51887945e90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08e102ae48244af2ab448a2e1ff757df', 'neutron:revision_number': '5', 'neutron:security_group_ids': '57a62add-9cfb-4284-af25-abd70022edf9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.187'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c562599-5308-42d3-bd7e-fd65d6049e08, chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f0875847-ec73-40e0-960f-0e3d0cd9a411) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.191 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f0875847-ec73-40e0-960f-0e3d0cd9a411 in datapath f64db62a-2972-4b2e-a8f5-e51887945e90 bound to our chassis#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.192 142124 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f64db62a-2972-4b2e-a8f5-e51887945e90#033[00m
Oct  2 09:17:29 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:29Z|01019|binding|INFO|Setting lport f0875847-ec73-40e0-960f-0e3d0cd9a411 ovn-installed in OVS
Oct  2 09:17:29 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:29Z|01020|binding|INFO|Setting lport f0875847-ec73-40e0-960f-0e3d0cd9a411 up in Southbound
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.210 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[e95fd5fc-a168-4efe-9f5e-df8dd9f35352]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.211 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf64db62a-21 in ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.213 239912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf64db62a-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.213 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b5af1986-9c0d-4e9c-ab27-7f01f6469730]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.214 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[81d0592b-9dc7-44d8-a1db-203ea01f9dc6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 systemd-udevd[346041]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.227 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[86675325-87be-4585-a0d3-d1be643a7c12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 NetworkManager[45041]: <info>  [1759411049.2370] device (tapf0875847-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 09:17:29 np0005465988 NetworkManager[45041]: <info>  [1759411049.2384] device (tapf0875847-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.248 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[01720064-209b-4870-b846-b9d2526f4145]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 systemd-machined[192594]: New machine qemu-107-instance-000000df.
Oct  2 09:17:29 np0005465988 systemd[1]: Started Virtual Machine qemu-107-instance-000000df.
Oct  2 09:17:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:29.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.288 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[9cf4519c-af9b-4647-99d8-e5706067faa3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 systemd-udevd[346045]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 09:17:29 np0005465988 NetworkManager[45041]: <info>  [1759411049.2976] manager: (tapf64db62a-20): new Veth device (/org/freedesktop/NetworkManager/Devices/458)
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.296 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc1234c-b527-4b7a-866b-71e57d537c20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.335 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[76e796eb-277f-450d-9b92-a63ea994e318]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.338 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[4b7d81b6-05d5-445c-9412-43aa42c97328]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 NetworkManager[45041]: <info>  [1759411049.3681] device (tapf64db62a-20): carrier: link connected
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.378 239928 DEBUG oslo.privsep.daemon [-] privsep: reply[90624efb-28d1-47b4-8742-9fe9b25464ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.399 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[70acf654-fe31-4eec-aabb-45296b5780de]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf64db62a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3b:d1:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 301], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 910969, 'reachable_time': 44266, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 346074, 'error': None, 'target': 'ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.415 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[70eb8e7b-58dd-4c9a-8c28-1fad665a9a05]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3b:d13d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 910969, 'tstamp': 910969}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 346075, 'error': None, 'target': 'ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.431 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[341c4961-a019-4ead-8cd6-8e3ccaa255da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf64db62a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3b:d1:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 301], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 910969, 'reachable_time': 44266, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 346076, 'error': None, 'target': 'ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.458 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[2744e470-b8d7-4f19-836c-3a3e0c7f9abe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.513 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[0f6e46c4-dd7f-4d6d-88fb-e7678e8388eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.515 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf64db62a-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.515 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.516 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf64db62a-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:29 np0005465988 NetworkManager[45041]: <info>  [1759411049.5193] manager: (tapf64db62a-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/459)
Oct  2 09:17:29 np0005465988 kernel: tapf64db62a-20: entered promiscuous mode
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.522 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf64db62a-20, col_values=(('external_ids', {'iface-id': '0446a480-1e12-4c33-972d-ae44155e006e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:29 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:29Z|01021|binding|INFO|Releasing lport 0446a480-1e12-4c33-972d-ae44155e006e from this chassis (sb_readonly=0)
Oct  2 09:17:29 np0005465988 nova_compute[236126]: 2025-10-02 13:17:29.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.547 142124 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f64db62a-2972-4b2e-a8f5-e51887945e90.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f64db62a-2972-4b2e-a8f5-e51887945e90.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.548 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[7b98aeaa-d5f1-4d89-82c9-49470a0c93cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.549 142124 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: global
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    log         /dev/log local0 debug
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    log-tag     haproxy-metadata-proxy-f64db62a-2972-4b2e-a8f5-e51887945e90
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    user        root
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    group       root
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    maxconn     1024
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    pidfile     /var/lib/neutron/external/pids/f64db62a-2972-4b2e-a8f5-e51887945e90.pid.haproxy
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    daemon
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: defaults
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    log global
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    mode http
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    option httplog
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    option dontlognull
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    option http-server-close
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    option forwardfor
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    retries                 3
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    timeout http-request    30s
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    timeout connect         30s
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    timeout client          32s
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    timeout server          32s
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    timeout http-keep-alive 30s
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: listen listener
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    bind 169.254.169.254:80
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]:    http-request add-header X-OVN-Network-ID f64db62a-2972-4b2e-a8f5-e51887945e90
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 09:17:29 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:29.550 142124 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90', 'env', 'PROCESS_TAG=haproxy-f64db62a-2972-4b2e-a8f5-e51887945e90', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f64db62a-2972-4b2e-a8f5-e51887945e90.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 09:17:29 np0005465988 podman[346108]: 2025-10-02 13:17:29.916285872 +0000 UTC m=+0.051640546 container create 5d04d210f312d60ab0cefb4fb59f7b61a3007bc7e96bb10ed1ce48a7b6cbe212 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:17:29 np0005465988 systemd[1]: Started libpod-conmon-5d04d210f312d60ab0cefb4fb59f7b61a3007bc7e96bb10ed1ce48a7b6cbe212.scope.
Oct  2 09:17:29 np0005465988 systemd[1]: Started libcrun container.
Oct  2 09:17:29 np0005465988 podman[346108]: 2025-10-02 13:17:29.889490467 +0000 UTC m=+0.024845161 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 09:17:29 np0005465988 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06b3ff283c143219e430e85e7c49dbfc428f8df0774587085659e6b7da835f2b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 09:17:30 np0005465988 podman[346108]: 2025-10-02 13:17:30.036453724 +0000 UTC m=+0.171808428 container init 5d04d210f312d60ab0cefb4fb59f7b61a3007bc7e96bb10ed1ce48a7b6cbe212 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 09:17:30 np0005465988 podman[346108]: 2025-10-02 13:17:30.043517925 +0000 UTC m=+0.178872599 container start 5d04d210f312d60ab0cefb4fb59f7b61a3007bc7e96bb10ed1ce48a7b6cbe212 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 09:17:30 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[346123]: [NOTICE]   (346146) : New worker (346148) forked
Oct  2 09:17:30 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[346123]: [NOTICE]   (346146) : Loading success.
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.459 2 DEBUG nova.compute.manager [req-f19f9e7c-84d0-4b79-a584-28d3c1363271 req-dfa0973c-bfa6-4f8d-aaaa-73f235f08f4c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received event network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.459 2 DEBUG oslo_concurrency.lockutils [req-f19f9e7c-84d0-4b79-a584-28d3c1363271 req-dfa0973c-bfa6-4f8d-aaaa-73f235f08f4c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.459 2 DEBUG oslo_concurrency.lockutils [req-f19f9e7c-84d0-4b79-a584-28d3c1363271 req-dfa0973c-bfa6-4f8d-aaaa-73f235f08f4c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.460 2 DEBUG oslo_concurrency.lockutils [req-f19f9e7c-84d0-4b79-a584-28d3c1363271 req-dfa0973c-bfa6-4f8d-aaaa-73f235f08f4c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.460 2 DEBUG nova.compute.manager [req-f19f9e7c-84d0-4b79-a584-28d3c1363271 req-dfa0973c-bfa6-4f8d-aaaa-73f235f08f4c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] No waiting events found dispatching network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.460 2 WARNING nova.compute.manager [req-f19f9e7c-84d0-4b79-a584-28d3c1363271 req-dfa0973c-bfa6-4f8d-aaaa-73f235f08f4c d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received unexpected event network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 for instance with vm_state stopped and task_state powering-on.#033[00m
Oct  2 09:17:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:17:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:30.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.726 2 DEBUG nova.virt.libvirt.host [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Removed pending event for 09e6757a-c973-4e60-bc00-5f66b22c4f51 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.726 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759411050.7256205, 09e6757a-c973-4e60-bc00-5f66b22c4f51 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.727 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] VM Resumed (Lifecycle Event)#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.730 2 DEBUG nova.compute.manager [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.734 2 INFO nova.virt.libvirt.driver [-] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Instance rebooted successfully.#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.734 2 DEBUG nova.compute.manager [None req-4c67d2a3-b5e7-4adf-b770-9bcba1195d4a ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.763 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.767 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.798 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.798 2 DEBUG nova.virt.driver [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] Emitting event <LifecycleEvent: 1759411050.7284875, 09e6757a-c973-4e60-bc00-5f66b22c4f51 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.798 2 INFO nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] VM Started (Lifecycle Event)#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.821 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:17:30 np0005465988 nova_compute[236126]: 2025-10-02 13:17:30.825 2 DEBUG nova.compute.manager [None req-ae2a7c55-6c7d-43cd-bff4-fdc2d7018fb2 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 09:17:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:31.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:17:31 np0005465988 nova_compute[236126]: 2025-10-02 13:17:31.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:32 np0005465988 nova_compute[236126]: 2025-10-02 13:17:32.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:17:32 np0005465988 nova_compute[236126]: 2025-10-02 13:17:32.565 2 DEBUG nova.compute.manager [req-d8eb890f-59b6-4502-b316-9a49263051e3 req-2bf46e6c-0ef4-4ee3-95a6-dc57b5f1049d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received event network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:17:32 np0005465988 nova_compute[236126]: 2025-10-02 13:17:32.565 2 DEBUG oslo_concurrency.lockutils [req-d8eb890f-59b6-4502-b316-9a49263051e3 req-2bf46e6c-0ef4-4ee3-95a6-dc57b5f1049d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:17:32 np0005465988 nova_compute[236126]: 2025-10-02 13:17:32.565 2 DEBUG oslo_concurrency.lockutils [req-d8eb890f-59b6-4502-b316-9a49263051e3 req-2bf46e6c-0ef4-4ee3-95a6-dc57b5f1049d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:17:32 np0005465988 nova_compute[236126]: 2025-10-02 13:17:32.566 2 DEBUG oslo_concurrency.lockutils [req-d8eb890f-59b6-4502-b316-9a49263051e3 req-2bf46e6c-0ef4-4ee3-95a6-dc57b5f1049d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:32 np0005465988 nova_compute[236126]: 2025-10-02 13:17:32.566 2 DEBUG nova.compute.manager [req-d8eb890f-59b6-4502-b316-9a49263051e3 req-2bf46e6c-0ef4-4ee3-95a6-dc57b5f1049d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] No waiting events found dispatching network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:17:32 np0005465988 nova_compute[236126]: 2025-10-02 13:17:32.566 2 WARNING nova.compute.manager [req-d8eb890f-59b6-4502-b316-9a49263051e3 req-2bf46e6c-0ef4-4ee3-95a6-dc57b5f1049d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received unexpected event network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 for instance with vm_state active and task_state None.#033[00m
Oct  2 09:17:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:32.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:33.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:34 np0005465988 nova_compute[236126]: 2025-10-02 13:17:34.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:34 np0005465988 podman[346234]: 2025-10-02 13:17:34.561332229 +0000 UTC m=+0.076262569 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 09:17:34 np0005465988 podman[346235]: 2025-10-02 13:17:34.566061784 +0000 UTC m=+0.082948499 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:17:34 np0005465988 podman[346233]: 2025-10-02 13:17:34.586782656 +0000 UTC m=+0.109370194 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 09:17:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:34.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:35.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:17:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:36.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:36 np0005465988 nova_compute[236126]: 2025-10-02 13:17:36.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:37.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:37 np0005465988 nova_compute[236126]: 2025-10-02 13:17:37.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:17:37 np0005465988 nova_compute[236126]: 2025-10-02 13:17:37.516 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:17:37 np0005465988 nova_compute[236126]: 2025-10-02 13:17:37.518 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:17:37 np0005465988 nova_compute[236126]: 2025-10-02 13:17:37.518 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:37 np0005465988 nova_compute[236126]: 2025-10-02 13:17:37.518 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:17:37 np0005465988 nova_compute[236126]: 2025-10-02 13:17:37.518 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:17:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:17:37 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1240062327' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:17:38 np0005465988 nova_compute[236126]: 2025-10-02 13:17:38.019 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:17:38 np0005465988 nova_compute[236126]: 2025-10-02 13:17:38.109 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000df as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:17:38 np0005465988 nova_compute[236126]: 2025-10-02 13:17:38.110 2 DEBUG nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] skipping disk for instance-000000df as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct  2 09:17:38 np0005465988 nova_compute[236126]: 2025-10-02 13:17:38.296 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:17:38 np0005465988 nova_compute[236126]: 2025-10-02 13:17:38.297 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3880MB free_disk=20.94272232055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:17:38 np0005465988 nova_compute[236126]: 2025-10-02 13:17:38.297 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:17:38 np0005465988 nova_compute[236126]: 2025-10-02 13:17:38.298 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:17:38 np0005465988 nova_compute[236126]: 2025-10-02 13:17:38.410 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Instance 09e6757a-c973-4e60-bc00-5f66b22c4f51 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 09:17:38 np0005465988 nova_compute[236126]: 2025-10-02 13:17:38.410 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:17:38 np0005465988 nova_compute[236126]: 2025-10-02 13:17:38.410 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:17:38 np0005465988 nova_compute[236126]: 2025-10-02 13:17:38.615 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:17:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:38.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:39 np0005465988 nova_compute[236126]: 2025-10-02 13:17:39.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000057s ======
Oct  2 09:17:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:39.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Oct  2 09:17:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:17:39 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1698312112' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:17:39 np0005465988 nova_compute[236126]: 2025-10-02 13:17:39.354 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.739s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:17:39 np0005465988 nova_compute[236126]: 2025-10-02 13:17:39.364 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:17:39 np0005465988 nova_compute[236126]: 2025-10-02 13:17:39.382 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:17:39 np0005465988 nova_compute[236126]: 2025-10-02 13:17:39.406 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:17:39 np0005465988 nova_compute[236126]: 2025-10-02 13:17:39.407 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:40.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:41.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:17:41 np0005465988 nova_compute[236126]: 2025-10-02 13:17:41.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:42.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:43.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:43 np0005465988 nova_compute[236126]: 2025-10-02 13:17:43.406 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:17:43 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:43Z|00135|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:53:9d:74 10.100.0.3
Oct  2 09:17:44 np0005465988 nova_compute[236126]: 2025-10-02 13:17:44.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:44 np0005465988 nova_compute[236126]: 2025-10-02 13:17:44.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:17:44 np0005465988 nova_compute[236126]: 2025-10-02 13:17:44.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:17:44 np0005465988 nova_compute[236126]: 2025-10-02 13:17:44.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:17:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:44.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:17:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:45.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:17:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:17:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:46.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:46 np0005465988 nova_compute[236126]: 2025-10-02 13:17:46.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:47.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:47 np0005465988 nova_compute[236126]: 2025-10-02 13:17:47.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:17:48 np0005465988 nova_compute[236126]: 2025-10-02 13:17:48.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:17:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:48.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:49 np0005465988 nova_compute[236126]: 2025-10-02 13:17:49.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:49 np0005465988 nova_compute[236126]: 2025-10-02 13:17:49.174 2 INFO nova.compute.manager [None req-6657b9a0-55c2-4871-8a26-55f3509671e0 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Get console output#033[00m
Oct  2 09:17:49 np0005465988 nova_compute[236126]: 2025-10-02 13:17:49.181 15591 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 09:17:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:49.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:49 np0005465988 nova_compute[236126]: 2025-10-02 13:17:49.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:17:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:50.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:50 np0005465988 nova_compute[236126]: 2025-10-02 13:17:50.833 2 DEBUG nova.compute.manager [req-90d1433e-a5fd-4250-abac-ef0f0c2745bd req-bc5b422f-c1b9-4d3d-97bd-cc6a451d46c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received event network-changed-f0875847-ec73-40e0-960f-0e3d0cd9a411 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:17:50 np0005465988 nova_compute[236126]: 2025-10-02 13:17:50.834 2 DEBUG nova.compute.manager [req-90d1433e-a5fd-4250-abac-ef0f0c2745bd req-bc5b422f-c1b9-4d3d-97bd-cc6a451d46c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Refreshing instance network info cache due to event network-changed-f0875847-ec73-40e0-960f-0e3d0cd9a411. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 09:17:50 np0005465988 nova_compute[236126]: 2025-10-02 13:17:50.834 2 DEBUG oslo_concurrency.lockutils [req-90d1433e-a5fd-4250-abac-ef0f0c2745bd req-bc5b422f-c1b9-4d3d-97bd-cc6a451d46c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 09:17:50 np0005465988 nova_compute[236126]: 2025-10-02 13:17:50.834 2 DEBUG oslo_concurrency.lockutils [req-90d1433e-a5fd-4250-abac-ef0f0c2745bd req-bc5b422f-c1b9-4d3d-97bd-cc6a451d46c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquired lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 09:17:50 np0005465988 nova_compute[236126]: 2025-10-02 13:17:50.835 2 DEBUG nova.network.neutron [req-90d1433e-a5fd-4250-abac-ef0f0c2745bd req-bc5b422f-c1b9-4d3d-97bd-cc6a451d46c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Refreshing network info cache for port f0875847-ec73-40e0-960f-0e3d0cd9a411 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 09:17:50 np0005465988 nova_compute[236126]: 2025-10-02 13:17:50.945 2 DEBUG oslo_concurrency.lockutils [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "09e6757a-c973-4e60-bc00-5f66b22c4f51" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:17:50 np0005465988 nova_compute[236126]: 2025-10-02 13:17:50.945 2 DEBUG oslo_concurrency.lockutils [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:17:50 np0005465988 nova_compute[236126]: 2025-10-02 13:17:50.946 2 DEBUG oslo_concurrency.lockutils [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:17:50 np0005465988 nova_compute[236126]: 2025-10-02 13:17:50.946 2 DEBUG oslo_concurrency.lockutils [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:17:50 np0005465988 nova_compute[236126]: 2025-10-02 13:17:50.946 2 DEBUG oslo_concurrency.lockutils [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:50 np0005465988 nova_compute[236126]: 2025-10-02 13:17:50.948 2 INFO nova.compute.manager [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Terminating instance#033[00m
Oct  2 09:17:50 np0005465988 nova_compute[236126]: 2025-10-02 13:17:50.949 2 DEBUG nova.compute.manager [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 09:17:51 np0005465988 kernel: tapf0875847-ec (unregistering): left promiscuous mode
Oct  2 09:17:51 np0005465988 NetworkManager[45041]: <info>  [1759411071.0048] device (tapf0875847-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:51 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:51Z|01022|binding|INFO|Releasing lport f0875847-ec73-40e0-960f-0e3d0cd9a411 from this chassis (sb_readonly=0)
Oct  2 09:17:51 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:51Z|01023|binding|INFO|Setting lport f0875847-ec73-40e0-960f-0e3d0cd9a411 down in Southbound
Oct  2 09:17:51 np0005465988 ovn_controller[132601]: 2025-10-02T13:17:51Z|01024|binding|INFO|Removing iface tapf0875847-ec ovn-installed in OVS
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:51.028 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:9d:74 10.100.0.3'], port_security=['fa:16:3e:53:9d:74 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '09e6757a-c973-4e60-bc00-5f66b22c4f51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f64db62a-2972-4b2e-a8f5-e51887945e90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08e102ae48244af2ab448a2e1ff757df', 'neutron:revision_number': '6', 'neutron:security_group_ids': '57a62add-9cfb-4284-af25-abd70022edf9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c562599-5308-42d3-bd7e-fd65d6049e08, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>], logical_port=f0875847-ec73-40e0-960f-0e3d0cd9a411) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcdc7036850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:17:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:51.031 142124 INFO neutron.agent.ovn.metadata.agent [-] Port f0875847-ec73-40e0-960f-0e3d0cd9a411 in datapath f64db62a-2972-4b2e-a8f5-e51887945e90 unbound from our chassis#033[00m
Oct  2 09:17:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:51.033 142124 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f64db62a-2972-4b2e-a8f5-e51887945e90, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 09:17:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:51.035 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[deb4ed30-4f61-4c01-bb17-3b5aeb6475b3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:51.036 142124 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90 namespace which is not needed anymore#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:51 np0005465988 systemd[1]: machine-qemu\x2d107\x2dinstance\x2d000000df.scope: Deactivated successfully.
Oct  2 09:17:51 np0005465988 systemd[1]: machine-qemu\x2d107\x2dinstance\x2d000000df.scope: Consumed 14.613s CPU time.
Oct  2 09:17:51 np0005465988 systemd-machined[192594]: Machine qemu-107-instance-000000df terminated.
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.190 2 INFO nova.virt.libvirt.driver [-] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Instance destroyed successfully.#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.191 2 DEBUG nova.objects.instance [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lazy-loading 'resources' on Instance uuid 09e6757a-c973-4e60-bc00-5f66b22c4f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 09:17:51 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[346123]: [NOTICE]   (346146) : haproxy version is 2.8.14-c23fe91
Oct  2 09:17:51 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[346123]: [NOTICE]   (346146) : path to executable is /usr/sbin/haproxy
Oct  2 09:17:51 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[346123]: [WARNING]  (346146) : Exiting Master process...
Oct  2 09:17:51 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[346123]: [ALERT]    (346146) : Current worker (346148) exited with code 143 (Terminated)
Oct  2 09:17:51 np0005465988 neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90[346123]: [WARNING]  (346146) : All workers exited. Exiting... (0)
Oct  2 09:17:51 np0005465988 systemd[1]: libpod-5d04d210f312d60ab0cefb4fb59f7b61a3007bc7e96bb10ed1ce48a7b6cbe212.scope: Deactivated successfully.
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.206 2 DEBUG nova.virt.libvirt.vif [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T13:16:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-122576717',display_name='tempest-TestNetworkAdvancedServerOps-server-122576717',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-122576717',id=223,image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMf7reumBTydD8Bvj9fduWIu4RpBeEqz1kj5y/au6S9Ayi0aksKCGdFTLh16toR0AjipEWmJ4nHDHJ2lgdfLPKA/NumJYC5Za7YiBxtc1LNqsXSZIPPvfd4KUEHXCwXjIQ==',key_name='tempest-TestNetworkAdvancedServerOps-168802950',keypairs=<?>,launch_index=0,launched_at=2025-10-02T13:16:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='08e102ae48244af2ab448a2e1ff757df',ramdisk_id='',reservation_id='r-lbbdviwb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c2d0c2bc-fe21-4689-86ae-d6728c15874c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1527846432',owner_user_name='tempest-TestNetworkAdvancedServerOps-1527846432-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T13:17:30Z,user_data=None,user_id='ffe4d737e4414fb3a3e358f8ca3f3e1e',uuid=09e6757a-c973-4e60-bc00-5f66b22c4f51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.207 2 DEBUG nova.network.os_vif_util [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converting VIF {"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.208 2 DEBUG nova.network.os_vif_util [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:74,bridge_name='br-int',has_traffic_filtering=True,id=f0875847-ec73-40e0-960f-0e3d0cd9a411,network=Network(f64db62a-2972-4b2e-a8f5-e51887945e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0875847-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.208 2 DEBUG os_vif [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:74,bridge_name='br-int',has_traffic_filtering=True,id=f0875847-ec73-40e0-960f-0e3d0cd9a411,network=Network(f64db62a-2972-4b2e-a8f5-e51887945e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0875847-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.210 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0875847-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:17:51 np0005465988 podman[346425]: 2025-10-02 13:17:51.210279594 +0000 UTC m=+0.053545001 container died 5d04d210f312d60ab0cefb4fb59f7b61a3007bc7e96bb10ed1ce48a7b6cbe212 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.217 2 INFO os_vif [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:74,bridge_name='br-int',has_traffic_filtering=True,id=f0875847-ec73-40e0-960f-0e3d0cd9a411,network=Network(f64db62a-2972-4b2e-a8f5-e51887945e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0875847-ec')#033[00m
Oct  2 09:17:51 np0005465988 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d04d210f312d60ab0cefb4fb59f7b61a3007bc7e96bb10ed1ce48a7b6cbe212-userdata-shm.mount: Deactivated successfully.
Oct  2 09:17:51 np0005465988 systemd[1]: var-lib-containers-storage-overlay-06b3ff283c143219e430e85e7c49dbfc428f8df0774587085659e6b7da835f2b-merged.mount: Deactivated successfully.
Oct  2 09:17:51 np0005465988 podman[346425]: 2025-10-02 13:17:51.267976501 +0000 UTC m=+0.111241918 container cleanup 5d04d210f312d60ab0cefb4fb59f7b61a3007bc7e96bb10ed1ce48a7b6cbe212 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.273 2 DEBUG nova.compute.manager [req-c24e6465-e714-474f-a4cb-2fe42b221741 req-c080d736-1301-4907-993d-204236442138 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received event network-vif-unplugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.274 2 DEBUG oslo_concurrency.lockutils [req-c24e6465-e714-474f-a4cb-2fe42b221741 req-c080d736-1301-4907-993d-204236442138 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.274 2 DEBUG oslo_concurrency.lockutils [req-c24e6465-e714-474f-a4cb-2fe42b221741 req-c080d736-1301-4907-993d-204236442138 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.275 2 DEBUG oslo_concurrency.lockutils [req-c24e6465-e714-474f-a4cb-2fe42b221741 req-c080d736-1301-4907-993d-204236442138 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.275 2 DEBUG nova.compute.manager [req-c24e6465-e714-474f-a4cb-2fe42b221741 req-c080d736-1301-4907-993d-204236442138 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] No waiting events found dispatching network-vif-unplugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.276 2 DEBUG nova.compute.manager [req-c24e6465-e714-474f-a4cb-2fe42b221741 req-c080d736-1301-4907-993d-204236442138 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received event network-vif-unplugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 09:17:51 np0005465988 systemd[1]: libpod-conmon-5d04d210f312d60ab0cefb4fb59f7b61a3007bc7e96bb10ed1ce48a7b6cbe212.scope: Deactivated successfully.
Oct  2 09:17:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:51.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:51 np0005465988 podman[346482]: 2025-10-02 13:17:51.359882036 +0000 UTC m=+0.067538550 container remove 5d04d210f312d60ab0cefb4fb59f7b61a3007bc7e96bb10ed1ce48a7b6cbe212 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 09:17:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:51.367 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[9db0a5a6-6c69-448c-b130-3c3f199092c9]: (4, ('Thu Oct  2 01:17:51 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90 (5d04d210f312d60ab0cefb4fb59f7b61a3007bc7e96bb10ed1ce48a7b6cbe212)\n5d04d210f312d60ab0cefb4fb59f7b61a3007bc7e96bb10ed1ce48a7b6cbe212\nThu Oct  2 01:17:51 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90 (5d04d210f312d60ab0cefb4fb59f7b61a3007bc7e96bb10ed1ce48a7b6cbe212)\n5d04d210f312d60ab0cefb4fb59f7b61a3007bc7e96bb10ed1ce48a7b6cbe212\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:51.369 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[b6ccf53d-7c9d-4a49-ab2f-7e685a0830d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:51.370 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf64db62a-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:51 np0005465988 kernel: tapf64db62a-20: left promiscuous mode
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:51.391 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[bb7efc75-38e7-42be-939a-879e81f1fcca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:51.423 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc10cb2-bede-4b0e-a817-11aaee38abb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:51.425 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[31f56de4-7d9b-4745-bdea-6b71b701eb2d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:51.445 239912 DEBUG oslo.privsep.daemon [-] privsep: reply[fc534650-d206-4524-95d2-da3ea3e070a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 910961, 'reachable_time': 36632, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 346497, 'error': None, 'target': 'ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:51 np0005465988 systemd[1]: run-netns-ovnmeta\x2df64db62a\x2d2972\x2d4b2e\x2da8f5\x2de51887945e90.mount: Deactivated successfully.
Oct  2 09:17:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:51.451 142246 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f64db62a-2972-4b2e-a8f5-e51887945e90 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 09:17:51 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:51.451 142246 DEBUG oslo.privsep.daemon [-] privsep: reply[197edf58-8d18-4d83-9cd7-0711cc61fcf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.490 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.491 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:17:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.714 2 INFO nova.virt.libvirt.driver [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Deleting instance files /var/lib/nova/instances/09e6757a-c973-4e60-bc00-5f66b22c4f51_del#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.715 2 INFO nova.virt.libvirt.driver [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Deletion of /var/lib/nova/instances/09e6757a-c973-4e60-bc00-5f66b22c4f51_del complete#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.764 2 INFO nova.compute.manager [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Took 0.81 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.765 2 DEBUG oslo.service.loopingcall [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.765 2 DEBUG nova.compute.manager [-] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 09:17:51 np0005465988 nova_compute[236126]: 2025-10-02 13:17:51.765 2 DEBUG nova.network.neutron [-] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 09:17:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:52.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:53.211 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=89, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=88) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:53 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:17:53.212 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.230 2 DEBUG nova.network.neutron [-] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.260 2 INFO nova.compute.manager [-] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Took 1.50 seconds to deallocate network for instance.#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.274 2 DEBUG nova.network.neutron [req-90d1433e-a5fd-4250-abac-ef0f0c2745bd req-bc5b422f-c1b9-4d3d-97bd-cc6a451d46c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Updated VIF entry in instance network info cache for port f0875847-ec73-40e0-960f-0e3d0cd9a411. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.274 2 DEBUG nova.network.neutron [req-90d1433e-a5fd-4250-abac-ef0f0c2745bd req-bc5b422f-c1b9-4d3d-97bd-cc6a451d46c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Updating instance_info_cache with network_info: [{"id": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "address": "fa:16:3e:53:9d:74", "network": {"id": "f64db62a-2972-4b2e-a8f5-e51887945e90", "bridge": "br-int", "label": "tempest-network-smoke--709946045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08e102ae48244af2ab448a2e1ff757df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0875847-ec", "ovs_interfaceid": "f0875847-ec73-40e0-960f-0e3d0cd9a411", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:17:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:53.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.305 2 DEBUG oslo_concurrency.lockutils [req-90d1433e-a5fd-4250-abac-ef0f0c2745bd req-bc5b422f-c1b9-4d3d-97bd-cc6a451d46c2 d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Releasing lock "refresh_cache-09e6757a-c973-4e60-bc00-5f66b22c4f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.312 2 DEBUG oslo_concurrency.lockutils [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.313 2 DEBUG oslo_concurrency.lockutils [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.315 2 DEBUG nova.compute.manager [req-c58280da-8c89-4dd3-8639-0e57a493340f req-f7f06a18-c6f5-4dd0-a7a4-165affc3842d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received event network-vif-deleted-f0875847-ec73-40e0-960f-0e3d0cd9a411 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.315 2 INFO nova.compute.manager [req-c58280da-8c89-4dd3-8639-0e57a493340f req-f7f06a18-c6f5-4dd0-a7a4-165affc3842d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Neutron deleted interface f0875847-ec73-40e0-960f-0e3d0cd9a411; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.316 2 DEBUG nova.network.neutron [req-c58280da-8c89-4dd3-8639-0e57a493340f req-f7f06a18-c6f5-4dd0-a7a4-165affc3842d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.337 2 DEBUG nova.compute.manager [req-c58280da-8c89-4dd3-8639-0e57a493340f req-f7f06a18-c6f5-4dd0-a7a4-165affc3842d d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Detach interface failed, port_id=f0875847-ec73-40e0-960f-0e3d0cd9a411, reason: Instance 09e6757a-c973-4e60-bc00-5f66b22c4f51 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.361 2 DEBUG oslo_concurrency.processutils [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.411 2 DEBUG nova.compute.manager [req-744cc739-0d8d-4049-955e-5d517caf219f req-425b4c61-4b3b-4e27-8323-4e3b923eb7bf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received event network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.412 2 DEBUG oslo_concurrency.lockutils [req-744cc739-0d8d-4049-955e-5d517caf219f req-425b4c61-4b3b-4e27-8323-4e3b923eb7bf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Acquiring lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.412 2 DEBUG oslo_concurrency.lockutils [req-744cc739-0d8d-4049-955e-5d517caf219f req-425b4c61-4b3b-4e27-8323-4e3b923eb7bf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.412 2 DEBUG oslo_concurrency.lockutils [req-744cc739-0d8d-4049-955e-5d517caf219f req-425b4c61-4b3b-4e27-8323-4e3b923eb7bf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.412 2 DEBUG nova.compute.manager [req-744cc739-0d8d-4049-955e-5d517caf219f req-425b4c61-4b3b-4e27-8323-4e3b923eb7bf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] No waiting events found dispatching network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.413 2 WARNING nova.compute.manager [req-744cc739-0d8d-4049-955e-5d517caf219f req-425b4c61-4b3b-4e27-8323-4e3b923eb7bf d55b3ee250d1468fbcf33643dda37adb aebeb47424cc4f05b6c098503009ac0d - - default default] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Received unexpected event network-vif-plugged-f0875847-ec73-40e0-960f-0e3d0cd9a411 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 09:17:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:17:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2595809564' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.866 2 DEBUG oslo_concurrency.processutils [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.874 2 DEBUG nova.compute.provider_tree [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.893 2 DEBUG nova.scheduler.client.report [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.928 2 DEBUG oslo_concurrency.lockutils [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:53 np0005465988 nova_compute[236126]: 2025-10-02 13:17:53.974 2 INFO nova.scheduler.client.report [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Deleted allocations for instance 09e6757a-c973-4e60-bc00-5f66b22c4f51#033[00m
Oct  2 09:17:54 np0005465988 nova_compute[236126]: 2025-10-02 13:17:54.071 2 DEBUG oslo_concurrency.lockutils [None req-b2e03fac-6e4b-4934-8585-5b1aa0876318 ffe4d737e4414fb3a3e358f8ca3f3e1e 08e102ae48244af2ab448a2e1ff757df - - default default] Lock "09e6757a-c973-4e60-bc00-5f66b22c4f51" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:17:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:17:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:54.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:17:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:55.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:56 np0005465988 nova_compute[236126]: 2025-10-02 13:17:56.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:17:56 np0005465988 podman[346523]: 2025-10-02 13:17:56.541699693 +0000 UTC m=+0.079356668 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 09:17:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:17:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:56.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:17:56 np0005465988 nova_compute[236126]: 2025-10-02 13:17:56.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:17:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:57.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:17:58 np0005465988 nova_compute[236126]: 2025-10-02 13:17:58.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:58 np0005465988 nova_compute[236126]: 2025-10-02 13:17:58.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:17:58 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:17:58 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:17:58 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:17:58 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:17:58 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:17:58 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:17:58 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:17:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:17:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:17:58.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:17:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:17:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:17:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:17:59.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:18:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:00.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:01 np0005465988 nova_compute[236126]: 2025-10-02 13:18:01.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:18:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:01.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:18:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:18:01 np0005465988 nova_compute[236126]: 2025-10-02 13:18:01.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:02 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:18:02.215 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '89'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:18:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:02.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:18:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:03.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:18:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:04.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:04 np0005465988 podman[346703]: 2025-10-02 13:18:04.704834966 +0000 UTC m=+0.068180728 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  2 09:18:04 np0005465988 podman[346702]: 2025-10-02 13:18:04.746326261 +0000 UTC m=+0.114468660 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 09:18:04 np0005465988 podman[346701]: 2025-10-02 13:18:04.792706225 +0000 UTC m=+0.161638527 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 09:18:05 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:18:05 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:18:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:05.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:06 np0005465988 nova_compute[236126]: 2025-10-02 13:18:06.189 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759411071.1875746, 09e6757a-c973-4e60-bc00-5f66b22c4f51 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 09:18:06 np0005465988 nova_compute[236126]: 2025-10-02 13:18:06.190 2 INFO nova.compute.manager [-] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] VM Stopped (Lifecycle Event)#033[00m
Oct  2 09:18:06 np0005465988 nova_compute[236126]: 2025-10-02 13:18:06.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:06 np0005465988 nova_compute[236126]: 2025-10-02 13:18:06.237 2 DEBUG nova.compute.manager [None req-55b16c85-d657-4f81-832d-5a5d1ad6ec08 - - - - - -] [instance: 09e6757a-c973-4e60-bc00-5f66b22c4f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 09:18:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:18:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:06.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:06 np0005465988 nova_compute[236126]: 2025-10-02 13:18:06.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:07.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:08.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:18:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:09.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:18:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:10.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:11 np0005465988 nova_compute[236126]: 2025-10-02 13:18:11.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:11.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:18:11 np0005465988 nova_compute[236126]: 2025-10-02 13:18:11.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:12.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:13.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:14.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:15.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:16 np0005465988 nova_compute[236126]: 2025-10-02 13:18:16.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:18:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:16.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:16 np0005465988 nova_compute[236126]: 2025-10-02 13:18:16.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:18:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:17.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:18:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:18.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:19.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:18:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:20.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:18:21 np0005465988 nova_compute[236126]: 2025-10-02 13:18:21.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:21.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:18:21 np0005465988 nova_compute[236126]: 2025-10-02 13:18:21.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:22.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:23.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:24.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:25.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:26 np0005465988 nova_compute[236126]: 2025-10-02 13:18:26.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:18:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:26.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:26 np0005465988 nova_compute[236126]: 2025-10-02 13:18:26.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:18:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:27.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:18:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:18:27.426 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:18:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:18:27.427 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:18:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:18:27.427 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:18:27 np0005465988 podman[346856]: 2025-10-02 13:18:27.554378811 +0000 UTC m=+0.085185433 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Oct  2 09:18:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:28.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:18:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:29.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:18:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:30.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:31 np0005465988 nova_compute[236126]: 2025-10-02 13:18:31.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:18:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:31.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:18:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:18:31 np0005465988 nova_compute[236126]: 2025-10-02 13:18:31.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:32.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:33.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:34 np0005465988 nova_compute[236126]: 2025-10-02 13:18:34.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:18:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:34.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:35.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:35 np0005465988 podman[346934]: 2025-10-02 13:18:35.539446324 +0000 UTC m=+0.060685714 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 09:18:35 np0005465988 podman[346933]: 2025-10-02 13:18:35.551478817 +0000 UTC m=+0.080255892 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3)
Oct  2 09:18:35 np0005465988 podman[346932]: 2025-10-02 13:18:35.569929894 +0000 UTC m=+0.103900628 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Oct  2 09:18:36 np0005465988 nova_compute[236126]: 2025-10-02 13:18:36.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:18:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:18:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:36.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:18:36 np0005465988 nova_compute[236126]: 2025-10-02 13:18:36.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:18:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:37.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:18:37 np0005465988 nova_compute[236126]: 2025-10-02 13:18:37.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:18:37 np0005465988 nova_compute[236126]: 2025-10-02 13:18:37.509 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:18:37 np0005465988 nova_compute[236126]: 2025-10-02 13:18:37.510 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:18:37 np0005465988 nova_compute[236126]: 2025-10-02 13:18:37.510 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:18:37 np0005465988 nova_compute[236126]: 2025-10-02 13:18:37.510 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:18:37 np0005465988 nova_compute[236126]: 2025-10-02 13:18:37.510 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:18:37 np0005465988 ovn_controller[132601]: 2025-10-02T13:18:37Z|01025|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct  2 09:18:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:18:37 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3402411611' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:18:37 np0005465988 nova_compute[236126]: 2025-10-02 13:18:37.954 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:18:38 np0005465988 nova_compute[236126]: 2025-10-02 13:18:38.141 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:18:38 np0005465988 nova_compute[236126]: 2025-10-02 13:18:38.142 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3997MB free_disk=20.95880889892578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:18:38 np0005465988 nova_compute[236126]: 2025-10-02 13:18:38.143 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:18:38 np0005465988 nova_compute[236126]: 2025-10-02 13:18:38.143 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:18:38 np0005465988 nova_compute[236126]: 2025-10-02 13:18:38.223 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:18:38 np0005465988 nova_compute[236126]: 2025-10-02 13:18:38.223 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:18:38 np0005465988 nova_compute[236126]: 2025-10-02 13:18:38.253 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:18:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:18:38 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4179534562' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:18:38 np0005465988 nova_compute[236126]: 2025-10-02 13:18:38.710 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:18:38 np0005465988 nova_compute[236126]: 2025-10-02 13:18:38.716 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:18:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:38.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:38 np0005465988 nova_compute[236126]: 2025-10-02 13:18:38.733 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:18:38 np0005465988 nova_compute[236126]: 2025-10-02 13:18:38.759 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:18:38 np0005465988 nova_compute[236126]: 2025-10-02 13:18:38.760 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.617s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:18:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:18:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:39.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:18:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:40.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:41 np0005465988 nova_compute[236126]: 2025-10-02 13:18:41.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:18:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:41.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:18:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:18:41 np0005465988 nova_compute[236126]: 2025-10-02 13:18:41.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:42.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:43.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:44.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:44 np0005465988 nova_compute[236126]: 2025-10-02 13:18:44.761 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:18:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:18:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:45.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:18:46 np0005465988 nova_compute[236126]: 2025-10-02 13:18:46.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:46 np0005465988 nova_compute[236126]: 2025-10-02 13:18:46.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:18:46 np0005465988 nova_compute[236126]: 2025-10-02 13:18:46.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:18:46 np0005465988 nova_compute[236126]: 2025-10-02 13:18:46.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:18:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:18:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:18:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:46.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:18:46 np0005465988 nova_compute[236126]: 2025-10-02 13:18:46.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:18:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:47.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:18:47 np0005465988 nova_compute[236126]: 2025-10-02 13:18:47.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:18:48 np0005465988 nova_compute[236126]: 2025-10-02 13:18:48.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:18:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:18:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:48.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:18:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:49.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:50 np0005465988 nova_compute[236126]: 2025-10-02 13:18:50.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:18:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:18:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:50.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:18:51 np0005465988 nova_compute[236126]: 2025-10-02 13:18:51.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:18:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:51.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:18:51 np0005465988 nova_compute[236126]: 2025-10-02 13:18:51.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:18:51 np0005465988 nova_compute[236126]: 2025-10-02 13:18:51.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:18:51 np0005465988 nova_compute[236126]: 2025-10-02 13:18:51.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:18:51 np0005465988 nova_compute[236126]: 2025-10-02 13:18:51.495 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:18:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:18:51 np0005465988 nova_compute[236126]: 2025-10-02 13:18:51.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #178. Immutable memtables: 0.
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.167998) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 113] Flushing memtable with next log file: 178
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411132168128, "job": 113, "event": "flush_started", "num_memtables": 1, "num_entries": 1302, "num_deletes": 251, "total_data_size": 2920761, "memory_usage": 2960168, "flush_reason": "Manual Compaction"}
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 113] Level-0 flush table #179: started
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411132206812, "cf_name": "default", "job": 113, "event": "table_file_creation", "file_number": 179, "file_size": 1915902, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 86271, "largest_seqno": 87568, "table_properties": {"data_size": 1910260, "index_size": 3036, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11991, "raw_average_key_size": 19, "raw_value_size": 1899048, "raw_average_value_size": 3154, "num_data_blocks": 135, "num_entries": 602, "num_filter_entries": 602, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759411021, "oldest_key_time": 1759411021, "file_creation_time": 1759411132, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 179, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 113] Flush lasted 38861 microseconds, and 6082 cpu microseconds.
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.206875) [db/flush_job.cc:967] [default] [JOB 113] Level-0 flush table #179: 1915902 bytes OK
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.206904) [db/memtable_list.cc:519] [default] Level-0 commit table #179 started
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.235891) [db/memtable_list.cc:722] [default] Level-0 commit table #179: memtable #1 done
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.235968) EVENT_LOG_v1 {"time_micros": 1759411132235954, "job": 113, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.236002) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 113] Try to delete WAL files size 2914684, prev total WAL file size 2914684, number of live WAL files 2.
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000175.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.237486) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037353330' seq:72057594037927935, type:22 .. '7061786F730037373832' seq:0, type:0; will stop at (end)
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 114] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 113 Base level 0, inputs: [179(1870KB)], [177(12MB)]
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411132237600, "job": 114, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [179], "files_L6": [177], "score": -1, "input_data_size": 14934944, "oldest_snapshot_seqno": -1}
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 114] Generated table #180: 10586 keys, 13020852 bytes, temperature: kUnknown
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411132368834, "cf_name": "default", "job": 114, "event": "table_file_creation", "file_number": 180, "file_size": 13020852, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12953165, "index_size": 40118, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26501, "raw_key_size": 280223, "raw_average_key_size": 26, "raw_value_size": 12768684, "raw_average_value_size": 1206, "num_data_blocks": 1521, "num_entries": 10586, "num_filter_entries": 10586, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759411132, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 180, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.369243) [db/compaction/compaction_job.cc:1663] [default] [JOB 114] Compacted 1@0 + 1@6 files to L6 => 13020852 bytes
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.374263) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 113.7 rd, 99.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 12.4 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(14.6) write-amplify(6.8) OK, records in: 11103, records dropped: 517 output_compression: NoCompression
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.374327) EVENT_LOG_v1 {"time_micros": 1759411132374298, "job": 114, "event": "compaction_finished", "compaction_time_micros": 131342, "compaction_time_cpu_micros": 33245, "output_level": 6, "num_output_files": 1, "total_output_size": 13020852, "num_input_records": 11103, "num_output_records": 10586, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000179.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411132374980, "job": 114, "event": "table_file_deletion", "file_number": 179}
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000177.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411132378483, "job": 114, "event": "table_file_deletion", "file_number": 177}
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.237202) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.378614) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.378624) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.378627) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.378630) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:18:52 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:18:52.378632) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:18:52.502 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=90, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=89) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:18:52 np0005465988 nova_compute[236126]: 2025-10-02 13:18:52.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:18:52.504 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:18:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:52.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:18:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:53.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:18:54 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:18:54.507 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '90'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:18:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:18:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:54.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:18:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:55.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:56 np0005465988 nova_compute[236126]: 2025-10-02 13:18:56.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:18:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:56.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:56 np0005465988 nova_compute[236126]: 2025-10-02 13:18:56.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:18:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:18:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:57.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:18:58 np0005465988 podman[347103]: 2025-10-02 13:18:58.502268075 +0000 UTC m=+0.046909820 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 09:18:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:18:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:18:58.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:18:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:18:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:18:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:18:59.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:19:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:00.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:01 np0005465988 nova_compute[236126]: 2025-10-02 13:19:01.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:01.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:01 np0005465988 nova_compute[236126]: 2025-10-02 13:19:01.490 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:19:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:19:01 np0005465988 nova_compute[236126]: 2025-10-02 13:19:01.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:02.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:02 np0005465988 ceph-mgr[76715]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3158772141
Oct  2 09:19:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:03.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:04.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:05.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:05 np0005465988 podman[347321]: 2025-10-02 13:19:05.670277903 +0000 UTC m=+0.064396200 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  2 09:19:05 np0005465988 podman[347320]: 2025-10-02 13:19:05.699695103 +0000 UTC m=+0.097380902 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3)
Oct  2 09:19:05 np0005465988 podman[347322]: 2025-10-02 13:19:05.727087935 +0000 UTC m=+0.120494612 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 09:19:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:19:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:19:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct  2 09:19:06 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct  2 09:19:06 np0005465988 nova_compute[236126]: 2025-10-02 13:19:06.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:19:06 np0005465988 nova_compute[236126]: 2025-10-02 13:19:06.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:06.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:07 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct  2 09:19:07 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:19:07 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:19:07 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:19:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:19:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:07.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:19:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:19:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:08.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:19:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:09.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:10.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:11 np0005465988 nova_compute[236126]: 2025-10-02 13:19:11.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:11.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:19:11 np0005465988 nova_compute[236126]: 2025-10-02 13:19:11.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:19:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:19:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:19:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:12.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:19:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:13.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:14.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:15.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:16 np0005465988 nova_compute[236126]: 2025-10-02 13:19:16.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:19:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:16.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:16 np0005465988 nova_compute[236126]: 2025-10-02 13:19:16.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:17.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:19:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:18.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:19:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:19:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:19.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:19:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:19:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:20.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:19:21 np0005465988 nova_compute[236126]: 2025-10-02 13:19:21.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:21.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:19:21 np0005465988 nova_compute[236126]: 2025-10-02 13:19:21.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:22.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:23.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:19:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:24.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:19:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:25.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:26 np0005465988 nova_compute[236126]: 2025-10-02 13:19:26.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:19:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:26.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:26 np0005465988 nova_compute[236126]: 2025-10-02 13:19:26.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:27.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:19:27.427 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:19:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:19:27.427 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:19:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:19:27.428 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:19:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:28.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:29.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:29 np0005465988 podman[347551]: 2025-10-02 13:19:29.514116662 +0000 UTC m=+0.054699253 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 09:19:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:19:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:30.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:19:31 np0005465988 nova_compute[236126]: 2025-10-02 13:19:31.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:31.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:31 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:19:31 np0005465988 nova_compute[236126]: 2025-10-02 13:19:31.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:19:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:32.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:19:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:19:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:33.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:19:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:34.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:35.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:36 np0005465988 nova_compute[236126]: 2025-10-02 13:19:36.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:36 np0005465988 nova_compute[236126]: 2025-10-02 13:19:36.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:19:36 np0005465988 podman[347626]: 2025-10-02 13:19:36.554413862 +0000 UTC m=+0.072201053 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20251001)
Oct  2 09:19:36 np0005465988 podman[347624]: 2025-10-02 13:19:36.587475766 +0000 UTC m=+0.119816653 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 09:19:36 np0005465988 podman[347625]: 2025-10-02 13:19:36.587476056 +0000 UTC m=+0.110909368 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Oct  2 09:19:36 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:19:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:36.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:36 np0005465988 nova_compute[236126]: 2025-10-02 13:19:36.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:37.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:38.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:19:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:39.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:19:39 np0005465988 nova_compute[236126]: 2025-10-02 13:19:39.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:19:39 np0005465988 nova_compute[236126]: 2025-10-02 13:19:39.598 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:19:39 np0005465988 nova_compute[236126]: 2025-10-02 13:19:39.599 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:19:39 np0005465988 nova_compute[236126]: 2025-10-02 13:19:39.599 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:19:39 np0005465988 nova_compute[236126]: 2025-10-02 13:19:39.600 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:19:39 np0005465988 nova_compute[236126]: 2025-10-02 13:19:39.600 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:19:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:19:40 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3724228417' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:19:40 np0005465988 nova_compute[236126]: 2025-10-02 13:19:40.069 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:19:40 np0005465988 nova_compute[236126]: 2025-10-02 13:19:40.255 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:19:40 np0005465988 nova_compute[236126]: 2025-10-02 13:19:40.258 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3996MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:19:40 np0005465988 nova_compute[236126]: 2025-10-02 13:19:40.259 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:19:40 np0005465988 nova_compute[236126]: 2025-10-02 13:19:40.259 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:19:40 np0005465988 nova_compute[236126]: 2025-10-02 13:19:40.338 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:19:40 np0005465988 nova_compute[236126]: 2025-10-02 13:19:40.338 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:19:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:40.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:40 np0005465988 nova_compute[236126]: 2025-10-02 13:19:40.823 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:19:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:19:41 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2023800697' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:19:41 np0005465988 nova_compute[236126]: 2025-10-02 13:19:41.285 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:19:41 np0005465988 nova_compute[236126]: 2025-10-02 13:19:41.291 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:19:41 np0005465988 nova_compute[236126]: 2025-10-02 13:19:41.312 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:19:41 np0005465988 nova_compute[236126]: 2025-10-02 13:19:41.313 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:19:41 np0005465988 nova_compute[236126]: 2025-10-02 13:19:41.313 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:19:41 np0005465988 nova_compute[236126]: 2025-10-02 13:19:41.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:19:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:41.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:19:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:19:41 np0005465988 nova_compute[236126]: 2025-10-02 13:19:41.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:42.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.003000085s ======
Oct  2 09:19:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:43.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000085s
Oct  2 09:19:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:44.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:45 np0005465988 nova_compute[236126]: 2025-10-02 13:19:45.315 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:19:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:45.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:46 np0005465988 nova_compute[236126]: 2025-10-02 13:19:46.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:19:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:46.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:46 np0005465988 nova_compute[236126]: 2025-10-02 13:19:46.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:47.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:47 np0005465988 nova_compute[236126]: 2025-10-02 13:19:47.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:19:47 np0005465988 nova_compute[236126]: 2025-10-02 13:19:47.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:19:47 np0005465988 nova_compute[236126]: 2025-10-02 13:19:47.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:19:48 np0005465988 nova_compute[236126]: 2025-10-02 13:19:48.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:19:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:48.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:49.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:49 np0005465988 nova_compute[236126]: 2025-10-02 13:19:49.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:19:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:50.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:51 np0005465988 nova_compute[236126]: 2025-10-02 13:19:51.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:51.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:51 np0005465988 nova_compute[236126]: 2025-10-02 13:19:51.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:19:51 np0005465988 nova_compute[236126]: 2025-10-02 13:19:51.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 09:19:51 np0005465988 nova_compute[236126]: 2025-10-02 13:19:51.503 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 09:19:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:19:51 np0005465988 nova_compute[236126]: 2025-10-02 13:19:51.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:52 np0005465988 nova_compute[236126]: 2025-10-02 13:19:52.503 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:19:52 np0005465988 nova_compute[236126]: 2025-10-02 13:19:52.505 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:19:52 np0005465988 nova_compute[236126]: 2025-10-02 13:19:52.505 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:19:52 np0005465988 nova_compute[236126]: 2025-10-02 13:19:52.541 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:19:52 np0005465988 nova_compute[236126]: 2025-10-02 13:19:52.541 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:19:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:52.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:53.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:19:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:54.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:19:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:55.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:56 np0005465988 nova_compute[236126]: 2025-10-02 13:19:56.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:19:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:56.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:56 np0005465988 nova_compute[236126]: 2025-10-02 13:19:56.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:19:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:19:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:57.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:19:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:19:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:19:58.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:19:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:19:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:19:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:19:59.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:20:00 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 09:20:00 np0005465988 podman[347795]: 2025-10-02 13:20:00.51970933 +0000 UTC m=+0.061799476 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 09:20:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:00.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:01 np0005465988 nova_compute[236126]: 2025-10-02 13:20:01.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:01.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:20:01 np0005465988 nova_compute[236126]: 2025-10-02 13:20:01.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:02.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:03.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:04.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:05.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:06 np0005465988 nova_compute[236126]: 2025-10-02 13:20:06.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:06 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:20:06 np0005465988 nova_compute[236126]: 2025-10-02 13:20:06.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:06.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:07.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:07 np0005465988 podman[347821]: 2025-10-02 13:20:07.53551576 +0000 UTC m=+0.064569935 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct  2 09:20:07 np0005465988 podman[347820]: 2025-10-02 13:20:07.544069905 +0000 UTC m=+0.067326544 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 09:20:07 np0005465988 podman[347819]: 2025-10-02 13:20:07.560449192 +0000 UTC m=+0.095663732 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 09:20:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:08.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:09.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:20:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:10.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:20:11 np0005465988 nova_compute[236126]: 2025-10-02 13:20:11.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:20:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:11.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:20:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:20:11 np0005465988 nova_compute[236126]: 2025-10-02 13:20:11.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000057s ======
Oct  2 09:20:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:12.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Oct  2 09:20:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:20:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:13.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:20:14 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:20:14 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:20:14 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:20:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:20:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:14.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:20:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:20:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:15.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:20:15 np0005465988 nova_compute[236126]: 2025-10-02 13:20:15.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:20:16 np0005465988 nova_compute[236126]: 2025-10-02 13:20:16.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:20:16 np0005465988 nova_compute[236126]: 2025-10-02 13:20:16.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:20:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:16.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:20:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:20:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:17.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:20:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:18.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:19.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:20 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:20:20 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:20:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:20.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:21 np0005465988 nova_compute[236126]: 2025-10-02 13:20:21.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:21.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:20:21 np0005465988 nova_compute[236126]: 2025-10-02 13:20:21.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:22.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:23.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:24.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:20:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:25.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:20:26 np0005465988 nova_compute[236126]: 2025-10-02 13:20:26.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:20:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:26.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:26 np0005465988 nova_compute[236126]: 2025-10-02 13:20:26.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:20:27.427 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:20:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:20:27.428 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:20:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:20:27.429 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:20:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:20:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:27.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:20:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:20:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:28.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:20:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:20:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:29.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:20:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:20:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:30.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:20:31 np0005465988 nova_compute[236126]: 2025-10-02 13:20:31.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:31.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:31 np0005465988 podman[348127]: 2025-10-02 13:20:31.528998982 +0000 UTC m=+0.064873293 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:20:31 np0005465988 nova_compute[236126]: 2025-10-02 13:20:31.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:20:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:32.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:33.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:34 np0005465988 nova_compute[236126]: 2025-10-02 13:20:34.609 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:20:34 np0005465988 nova_compute[236126]: 2025-10-02 13:20:34.610 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 09:20:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:20:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:34.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:20:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:35.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:36 np0005465988 nova_compute[236126]: 2025-10-02 13:20:36.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:36 np0005465988 nova_compute[236126]: 2025-10-02 13:20:36.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:20:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:36.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:20:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:20:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:37.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:37 np0005465988 nova_compute[236126]: 2025-10-02 13:20:37.491 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:20:38 np0005465988 podman[348202]: 2025-10-02 13:20:38.559320597 +0000 UTC m=+0.096724153 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:20:38 np0005465988 podman[348203]: 2025-10-02 13:20:38.569476127 +0000 UTC m=+0.097203677 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 09:20:38 np0005465988 podman[348201]: 2025-10-02 13:20:38.570032333 +0000 UTC m=+0.113750189 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 09:20:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:38.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:39 np0005465988 nova_compute[236126]: 2025-10-02 13:20:39.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:20:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:39.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:39 np0005465988 nova_compute[236126]: 2025-10-02 13:20:39.521 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:20:39 np0005465988 nova_compute[236126]: 2025-10-02 13:20:39.521 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:20:39 np0005465988 nova_compute[236126]: 2025-10-02 13:20:39.522 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:20:39 np0005465988 nova_compute[236126]: 2025-10-02 13:20:39.522 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:20:39 np0005465988 nova_compute[236126]: 2025-10-02 13:20:39.523 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:20:39 np0005465988 systemd-logind[827]: New session 59 of user zuul.
Oct  2 09:20:39 np0005465988 systemd[1]: Started Session 59 of User zuul.
Oct  2 09:20:39 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:20:39 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/347551512' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:20:39 np0005465988 nova_compute[236126]: 2025-10-02 13:20:39.982 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:20:40 np0005465988 nova_compute[236126]: 2025-10-02 13:20:40.138 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:20:40 np0005465988 nova_compute[236126]: 2025-10-02 13:20:40.139 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3979MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:20:40 np0005465988 nova_compute[236126]: 2025-10-02 13:20:40.139 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:20:40 np0005465988 nova_compute[236126]: 2025-10-02 13:20:40.139 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:20:40 np0005465988 nova_compute[236126]: 2025-10-02 13:20:40.275 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:20:40 np0005465988 nova_compute[236126]: 2025-10-02 13:20:40.276 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:20:40 np0005465988 nova_compute[236126]: 2025-10-02 13:20:40.310 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:20:40 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:20:40 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1262707481' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:20:40 np0005465988 nova_compute[236126]: 2025-10-02 13:20:40.795 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:20:40 np0005465988 nova_compute[236126]: 2025-10-02 13:20:40.803 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:20:40 np0005465988 nova_compute[236126]: 2025-10-02 13:20:40.833 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:20:40 np0005465988 nova_compute[236126]: 2025-10-02 13:20:40.835 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:20:40 np0005465988 nova_compute[236126]: 2025-10-02 13:20:40.835 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:20:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:40.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:41 np0005465988 nova_compute[236126]: 2025-10-02 13:20:41.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:41.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:41 np0005465988 nova_compute[236126]: 2025-10-02 13:20:41.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:20:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:20:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:42.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:20:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:20:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:43.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:20:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:44.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:45.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Oct  2 09:20:46 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3080946224' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct  2 09:20:46 np0005465988 nova_compute[236126]: 2025-10-02 13:20:46.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:46 np0005465988 nova_compute[236126]: 2025-10-02 13:20:46.837 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:20:46 np0005465988 nova_compute[236126]: 2025-10-02 13:20:46.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:20:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:46.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:20:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:20:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:47.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:48.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:49 np0005465988 nova_compute[236126]: 2025-10-02 13:20:49.470 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:20:49 np0005465988 nova_compute[236126]: 2025-10-02 13:20:49.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:20:49 np0005465988 nova_compute[236126]: 2025-10-02 13:20:49.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:20:49 np0005465988 nova_compute[236126]: 2025-10-02 13:20:49.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:20:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:49.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:50 np0005465988 ovs-vsctl[348605]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct  2 09:20:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:20:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:50.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:20:51 np0005465988 nova_compute[236126]: 2025-10-02 13:20:51.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:51 np0005465988 nova_compute[236126]: 2025-10-02 13:20:51.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:20:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:51.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:51 np0005465988 virtqemud[235689]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct  2 09:20:51 np0005465988 virtqemud[235689]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct  2 09:20:51 np0005465988 virtqemud[235689]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  2 09:20:51 np0005465988 nova_compute[236126]: 2025-10-02 13:20:51.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:52 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: cache status {prefix=cache status} (starting...)
Oct  2 09:20:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:20:52 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: client ls {prefix=client ls} (starting...)
Oct  2 09:20:52 np0005465988 lvm[349018]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct  2 09:20:52 np0005465988 lvm[349018]: VG ceph_vg0 finished
Oct  2 09:20:52 np0005465988 kernel: block dm-0: the capability attribute has been deprecated.
Oct  2 09:20:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:20:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:52.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:20:52 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: damage ls {prefix=damage ls} (starting...)
Oct  2 09:20:53 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: dump loads {prefix=dump loads} (starting...)
Oct  2 09:20:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Oct  2 09:20:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/514734843' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct  2 09:20:53 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Oct  2 09:20:53 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Oct  2 09:20:53 np0005465988 nova_compute[236126]: 2025-10-02 13:20:53.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:20:53 np0005465988 nova_compute[236126]: 2025-10-02 13:20:53.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:20:53 np0005465988 nova_compute[236126]: 2025-10-02 13:20:53.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:20:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:53.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:53 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Oct  2 09:20:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Oct  2 09:20:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1516179473' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct  2 09:20:53 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Oct  2 09:20:53 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Oct  2 09:20:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Oct  2 09:20:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1412689173' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Oct  2 09:20:54 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: get subtrees {prefix=get subtrees} (starting...)
Oct  2 09:20:54 np0005465988 nova_compute[236126]: 2025-10-02 13:20:54.209 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:20:54 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: ops {prefix=ops} (starting...)
Oct  2 09:20:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Oct  2 09:20:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1610077542' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Oct  2 09:20:54 np0005465988 nova_compute[236126]: 2025-10-02 13:20:54.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:20:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Oct  2 09:20:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2665894373' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Oct  2 09:20:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Oct  2 09:20:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4041968854' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct  2 09:20:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:54.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:55 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: session ls {prefix=session ls} (starting...)
Oct  2 09:20:55 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: status {prefix=status} (starting...)
Oct  2 09:20:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Oct  2 09:20:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2041360105' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct  2 09:20:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:55.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Oct  2 09:20:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/348385409' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct  2 09:20:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Oct  2 09:20:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1174465865' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct  2 09:20:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Oct  2 09:20:56 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3174335008' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct  2 09:20:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Oct  2 09:20:56 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/274695586' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Oct  2 09:20:56 np0005465988 nova_compute[236126]: 2025-10-02 13:20:56.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:56 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Oct  2 09:20:56 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3949948913' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Oct  2 09:20:56 np0005465988 nova_compute[236126]: 2025-10-02 13:20:56.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:20:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:20:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:56.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:20:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Oct  2 09:20:57 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4060181063' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Oct  2 09:20:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Oct  2 09:20:57 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1842640529' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct  2 09:20:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Oct  2 09:20:57 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2522821367' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Oct  2 09:20:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:20:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:57.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Oct  2 09:20:57 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2022479561' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Oct  2 09:20:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Oct  2 09:20:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2391751382' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct  2 09:20:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Oct  2 09:20:58 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2259120394' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536256512 unmapped: 47046656 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280e91800 session 0x5612813f7860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280fccc00 session 0x56127e9a43c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280fccc00 session 0x561281c77e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 47022080 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 47022080 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 47022080 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6774192 data_alloc: 285212672 data_used: 79626240
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 47022080 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536289280 unmapped: 47013888 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x19339e000/0x0/0x1bfc00000, data 0xd029df3/0xd22f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536289280 unmapped: 47013888 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536289280 unmapped: 47013888 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536289280 unmapped: 47013888 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6771728 data_alloc: 285212672 data_used: 79626240
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536322048 unmapped: 46981120 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536322048 unmapped: 46981120 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x19339f000/0x0/0x1bfc00000, data 0xd029df3/0xd22f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536322048 unmapped: 46981120 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536330240 unmapped: 46972928 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127e02c400 session 0x56127e8d52c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.662735939s of 12.217157364s, submitted: 33
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127e9bdc00 session 0x561281c77680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536338432 unmapped: 46964736 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6770716 data_alloc: 285212672 data_used: 79613952
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531939328 unmapped: 51363840 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531947520 unmapped: 51355648 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127f49e400 session 0x56127ea1e960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946b9000/0x0/0x1bfc00000, data 0xbd10de3/0xbf15000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533004288 unmapped: 50298880 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6536999 data_alloc: 285212672 data_used: 71221248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946dd000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946dd000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946dd000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946dd000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6536999 data_alloc: 285212672 data_used: 71221248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946dd000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6536999 data_alloc: 285212672 data_used: 71221248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946dd000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946dd000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6536999 data_alloc: 285212672 data_used: 71221248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.636241913s of 21.724683762s, submitted: 40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533012480 unmapped: 50290688 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946de000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533020672 unmapped: 50282496 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946de000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533020672 unmapped: 50282496 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533020672 unmapped: 50282496 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6536955 data_alloc: 285212672 data_used: 71221248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533020672 unmapped: 50282496 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533020672 unmapped: 50282496 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533020672 unmapped: 50282496 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533020672 unmapped: 50282496 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946de000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533020672 unmapped: 50282496 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946de000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6536955 data_alloc: 285212672 data_used: 71221248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533028864 unmapped: 50274304 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533028864 unmapped: 50274304 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533028864 unmapped: 50274304 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946de000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533028864 unmapped: 50274304 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533028864 unmapped: 50274304 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6536955 data_alloc: 285212672 data_used: 71221248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533028864 unmapped: 50274304 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533028864 unmapped: 50274304 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533037056 unmapped: 50266112 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533045248 unmapped: 50257920 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946de000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533045248 unmapped: 50257920 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6536955 data_alloc: 285212672 data_used: 71221248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533045248 unmapped: 50257920 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946de000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533045248 unmapped: 50257920 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533045248 unmapped: 50257920 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533045248 unmapped: 50257920 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533045248 unmapped: 50257920 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6536955 data_alloc: 285212672 data_used: 71221248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533045248 unmapped: 50257920 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946de000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533053440 unmapped: 50249728 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946de000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533053440 unmapped: 50249728 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533053440 unmapped: 50249728 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533053440 unmapped: 50249728 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6536955 data_alloc: 285212672 data_used: 71221248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533053440 unmapped: 50249728 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946de000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533053440 unmapped: 50249728 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533061632 unmapped: 50241536 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533061632 unmapped: 50241536 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533069824 unmapped: 50233344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946de000/0x0/0x1bfc00000, data 0xbcecdc0/0xbef0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6536955 data_alloc: 285212672 data_used: 71221248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533069824 unmapped: 50233344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533069824 unmapped: 50233344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533069824 unmapped: 50233344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 37.515026093s of 37.573341370s, submitted: 2
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533069824 unmapped: 50233344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946ce000/0x0/0x1bfc00000, data 0xbcfadc0/0xbefe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533069824 unmapped: 50233344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x5612806aa800 session 0x561280f81860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x5612850f3000 session 0x56127f8874a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x5612804a7800 session 0x56127ecef0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6537447 data_alloc: 285212672 data_used: 71217152
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533069824 unmapped: 50233344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127e02c400 session 0x56128329ef00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946d0000/0x0/0x1bfc00000, data 0xbcfadc0/0xbefe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533069824 unmapped: 50233344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533069824 unmapped: 50233344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1946d0000/0x0/0x1bfc00000, data 0xbcfadc0/0xbefe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127e9bdc00 session 0x5612813e7e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127f49e400 session 0x561280b0bc20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533069824 unmapped: 50233344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127f49e400 session 0x561281c76960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127e02c400 session 0x5612813e65a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127e9bdc00 session 0x56127e5a5e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x5612804a7800 session 0x56127e8c74a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533078016 unmapped: 50225152 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6542861 data_alloc: 285212672 data_used: 71217152
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 534478848 unmapped: 48824320 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x5612850f3000 session 0x561280c0f860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127e02c400 session 0x561280f80780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127e9bdc00 session 0x56128127e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x5612804a7800 session 0x5612813f6b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127f49e400 session 0x561280f805a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280528400 session 0x561280fa3860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280fccc00 session 0x561285cd1e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127e02c400 session 0x5612813f61e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127e9bdc00 session 0x56127e8c7a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 534421504 unmapped: 48881664 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127f49e400 session 0x56128329fc20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 534421504 unmapped: 48881664 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 534421504 unmapped: 48881664 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x194109000/0x0/0x1bfc00000, data 0xc2bfe32/0xc4c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 534421504 unmapped: 48881664 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6592115 data_alloc: 285212672 data_used: 71221248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.947784424s of 12.696119308s, submitted: 68
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 534421504 unmapped: 48881664 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x5612804a7800 session 0x56127e997c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 534421504 unmapped: 48881664 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x194109000/0x0/0x1bfc00000, data 0xc2bfe32/0xc4c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 534429696 unmapped: 48873472 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 535224320 unmapped: 48078848 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536141824 unmapped: 47161344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6667855 data_alloc: 285212672 data_used: 79691776
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536141824 unmapped: 47161344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536141824 unmapped: 47161344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280fccc00 session 0x561281bea1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536141824 unmapped: 47161344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1940e5000/0x0/0x1bfc00000, data 0xc2e3e32/0xc4e9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 4801.0 total, 600.0 interval
Cumulative writes: 66K writes, 268K keys, 66K commit groups, 1.0 writes per commit group, ingest: 0.27 GB, 0.06 MB/s
Cumulative WAL: 66K writes, 24K syncs, 2.72 writes per sync, written: 0.27 GB, 0.06 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 8463 writes, 34K keys, 8463 commit groups, 1.0 writes per commit group, ingest: 40.77 MB, 0.07 MB/s
Interval WAL: 8463 writes, 3047 syncs, 2.78 writes per sync, written: 0.04 GB, 0.07 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536141824 unmapped: 47161344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536141824 unmapped: 47161344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6670903 data_alloc: 285212672 data_used: 79691776
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536141824 unmapped: 47161344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536141824 unmapped: 47161344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536141824 unmapped: 47161344 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1940e5000/0x0/0x1bfc00000, data 0xc2e3e32/0xc4e9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.724725723s of 13.231598854s, submitted: 8
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537362432 unmapped: 45940736 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x5612804a7800 session 0x56127e612780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127e02c400 session 0x561280bd8780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127e9bdc00 session 0x56127e9945a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127f49e400 session 0x56127f33e1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537845760 unmapped: 45457408 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6711759 data_alloc: 301989888 data_used: 85368832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1940e5000/0x0/0x1bfc00000, data 0xc2e3e32/0xc4e9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537845760 unmapped: 45457408 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537845760 unmapped: 45457408 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537845760 unmapped: 45457408 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280fccc00 session 0x561280c0ed20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537845760 unmapped: 45457408 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537845760 unmapped: 45457408 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6713143 data_alloc: 301989888 data_used: 85405696
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1940e5000/0x0/0x1bfc00000, data 0xc2e3e32/0xc4e9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537845760 unmapped: 45457408 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537845760 unmapped: 45457408 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537845760 unmapped: 45457408 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1940e5000/0x0/0x1bfc00000, data 0xc2e3e32/0xc4e9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537845760 unmapped: 45457408 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537845760 unmapped: 45457408 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6713143 data_alloc: 301989888 data_used: 85405696
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.359648705s of 11.822547913s, submitted: 9
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539566080 unmapped: 43737088 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x1940e5000/0x0/0x1bfc00000, data 0xc2e3e32/0xc4e9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,1] op hist [0,0,0,0,0,5,0,21])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 542318592 unmapped: 40984576 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 542302208 unmapped: 41000960 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #57. Immutable memtables: 13.
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545153024 unmapped: 38150144 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545349632 unmapped: 37953536 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6809675 data_alloc: 301989888 data_used: 87769088
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x191504000/0x0/0x1bfc00000, data 0xcb84e32/0xcd8a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2196f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545349632 unmapped: 37953536 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x191504000/0x0/0x1bfc00000, data 0xcb84e32/0xcd8a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2196f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545349632 unmapped: 37953536 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x5612805fbc00 session 0x561280bd8960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280fcd400 session 0x5612813e6960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561282c77c00 session 0x561281beb860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127ea3ac00 session 0x5612813e7860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127ea3ac00 session 0x561280f81c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x5612805fbc00 session 0x56127e613c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545587200 unmapped: 37715968 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280fccc00 session 0x56127e9970e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280fcd400 session 0x561280f814a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561282c77c00 session 0x561280c0fc20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x190c0a000/0x0/0x1bfc00000, data 0xd47ee32/0xd684000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2196f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545587200 unmapped: 37715968 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545603584 unmapped: 37699584 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6883916 data_alloc: 301989888 data_used: 87769088
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545603584 unmapped: 37699584 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127ea3ac00 session 0x5612813e63c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.830035210s of 11.250144005s, submitted: 183
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 37683200 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280e91800 session 0x5612813e7a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280528c00 session 0x5612832a23c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x5612805fbc00 session 0x56127e8e8f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280fcd400 session 0x56128329fc20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545677312 unmapped: 37625856 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127ea3ac00 session 0x56127e8c7a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280fccc00 session 0x561280c0e000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545734656 unmapped: 37568512 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x191a95000/0x0/0x1bfc00000, data 0xc5f4dc0/0xc7f8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2196f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545734656 unmapped: 37568512 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6703104 data_alloc: 285212672 data_used: 79749120
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545734656 unmapped: 37568512 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545751040 unmapped: 37552128 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545751040 unmapped: 37552128 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545775616 unmapped: 37527552 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545849344 unmapped: 37453824 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x191a96000/0x0/0x1bfc00000, data 0xc5f4dc0/0xc7f8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2196f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6702928 data_alloc: 285212672 data_used: 79749120
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546955264 unmapped: 36347904 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127e9bdc00 session 0x5612813f7860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 547971072 unmapped: 35332096 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 heartbeat osd_stat(store_statfs(0x191a96000/0x0/0x1bfc00000, data 0xc5f4dc0/0xc7f8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2196f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 547971072 unmapped: 35332096 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x561280675c00 session 0x56127e7981e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.280895233s of 11.670135498s, submitted: 291
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 ms_handle_reset con 0x56127ea37400 session 0x56128329fe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 382 handle_osd_map epochs [382,383], i have 382, src has [1,383]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 547971072 unmapped: 35332096 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 383 ms_handle_reset con 0x5612805fbc00 session 0x561280bd85a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 383 ms_handle_reset con 0x56127e9bdc00 session 0x5612813e6780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536895488 unmapped: 46407680 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 6347028 data_alloc: 285212672 data_used: 67551232
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 383 ms_handle_reset con 0x56127e02c400 session 0x561280bd8d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 383 ms_handle_reset con 0x5612804a7800 session 0x561285cd05a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536895488 unmapped: 46407680 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 383 heartbeat osd_stat(store_statfs(0x19404b000/0x0/0x1bfc00000, data 0xa03ea3a/0xa241000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2196f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536895488 unmapped: 46407680 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536895488 unmapped: 46407680 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536895488 unmapped: 46407680 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 383 heartbeat osd_stat(store_statfs(0x19404d000/0x0/0x1bfc00000, data 0xa03ea3a/0xa241000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2196f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536903680 unmapped: 46399488 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6147942 data_alloc: 268435456 data_used: 53358592
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 383 ms_handle_reset con 0x56127ea37400 session 0x56127e996780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531734528 unmapped: 51568640 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 383 handle_osd_map epochs [383,384], i have 383, src has [1,384]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 383 handle_osd_map epochs [384,384], i have 384, src has [1,384]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533102592 unmapped: 50200576 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 384 heartbeat osd_stat(store_statfs(0x194695000/0x0/0x1bfc00000, data 0x99ed569/0x9bf0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2196f9c6), peers [0,1] op hist [0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 532185088 unmapped: 51118080 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 532185088 unmapped: 51118080 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 532185088 unmapped: 51118080 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6204728 data_alloc: 268435456 data_used: 53317632
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.655267715s of 12.241396904s, submitted: 181
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 384 ms_handle_reset con 0x56127ea37400 session 0x561280bd8f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 532209664 unmapped: 51093504 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 384 ms_handle_reset con 0x56127e02c400 session 0x561280fa3860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 532217856 unmapped: 51085312 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 384 heartbeat osd_stat(store_statfs(0x194621000/0x0/0x1bfc00000, data 0x9a6b507/0x9c6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2196f9c6), peers [0,1] op hist [0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 384 handle_osd_map epochs [384,385], i have 384, src has [1,385]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 385 ms_handle_reset con 0x56127e9bdc00 session 0x56127e8e90e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 526770176 unmapped: 56532992 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 385 ms_handle_reset con 0x56127eb1c000 session 0x5612832a34a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 526770176 unmapped: 56532992 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 385 ms_handle_reset con 0x5612804a7800 session 0x561280c0e780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 519151616 unmapped: 64151552 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5792188 data_alloc: 251658240 data_used: 34426880
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 519151616 unmapped: 64151552 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 385 heartbeat osd_stat(store_statfs(0x1968fe000/0x0/0x1bfc00000, data 0x778f0f0/0x7990000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2196f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 519151616 unmapped: 64151552 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 519151616 unmapped: 64151552 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 385 handle_osd_map epochs [385,386], i have 385, src has [1,386]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 386 ms_handle_reset con 0x56127e02c400 session 0x56128329e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 520208384 unmapped: 63094784 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 386 ms_handle_reset con 0x56127f39bc00 session 0x5612832a3c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 386 ms_handle_reset con 0x5612805fa400 session 0x56127f497e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 386 heartbeat osd_stat(store_statfs(0x196baf000/0x0/0x1bfc00000, data 0x74dbd9d/0x76de000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2196f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518545408 unmapped: 64757760 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 386 ms_handle_reset con 0x56127e9bdc00 session 0x561285cd1a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5454662 data_alloc: 234881024 data_used: 29659136
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518561792 unmapped: 64741376 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 386 handle_osd_map epochs [386,387], i have 386, src has [1,387]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.970864296s of 10.838081360s, submitted: 158
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518561792 unmapped: 64741376 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518561792 unmapped: 64741376 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 387 heartbeat osd_stat(store_statfs(0x198602000/0x0/0x1bfc00000, data 0x5677873/0x5879000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518561792 unmapped: 64741376 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 387 ms_handle_reset con 0x56127eb1c000 session 0x56127e6134a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 387 ms_handle_reset con 0x56127e02c400 session 0x56127e5a4b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 387 ms_handle_reset con 0x56127e9bdc00 session 0x56127ea19860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 387 ms_handle_reset con 0x56127f39bc00 session 0x56127e8d45a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 387 handle_osd_map epochs [387,388], i have 387, src has [1,388]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 387 handle_osd_map epochs [388,388], i have 388, src has [1,388]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 524902400 unmapped: 58400768 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5533008 data_alloc: 234881024 data_used: 29667328
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 388 ms_handle_reset con 0x5612805fa400 session 0x56127e6130e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 388 ms_handle_reset con 0x5612805fbc00 session 0x5612813e72c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 388 ms_handle_reset con 0x56127e02c400 session 0x56127ea163c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 388 ms_handle_reset con 0x56127e9bdc00 session 0x56127e9a43c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 388 ms_handle_reset con 0x56127ea37400 session 0x561285cd1a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 388 ms_handle_reset con 0x56127f39bc00 session 0x561280c0e780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514080768 unmapped: 69222400 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 388 ms_handle_reset con 0x5612804a8c00 session 0x56127f2ba1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 388 ms_handle_reset con 0x56127ea39800 session 0x56127f8b50e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 388 handle_osd_map epochs [388,389], i have 388, src has [1,389]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510222336 unmapped: 73080832 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 389 ms_handle_reset con 0x56127e02c400 session 0x5612832a34a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 389 heartbeat osd_stat(store_statfs(0x1996fa000/0x0/0x1bfc00000, data 0x41ff039/0x4403000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510246912 unmapped: 73056256 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510246912 unmapped: 73056256 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510246912 unmapped: 73056256 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 389 ms_handle_reset con 0x56127e9bdc00 session 0x5612813e7a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5215238 data_alloc: 234881024 data_used: 19886080
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510246912 unmapped: 73056256 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 389 ms_handle_reset con 0x56127ea37400 session 0x561280c0fc20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 389 handle_osd_map epochs [389,390], i have 389, src has [1,390]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.269517899s of 10.011886597s, submitted: 103
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f39bc00 session 0x561280f814a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f39bc00 session 0x5612813e6960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x56127e8e8f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510091264 unmapped: 73211904 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e9bdc00 session 0x561280bd8d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x56127f33eb40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea37400 session 0x56127f33f680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea37400 session 0x56127e8e8960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510074880 unmapped: 73228288 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x1997c9000/0x0/0x1bfc00000, data 0x44adb88/0x46b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510083072 unmapped: 73220096 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510083072 unmapped: 73220096 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5257924 data_alloc: 234881024 data_used: 21065728
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509878272 unmapped: 73424896 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509878272 unmapped: 73424896 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x1997c9000/0x0/0x1bfc00000, data 0x44adb88/0x46b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509878272 unmapped: 73424896 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509812736 unmapped: 73490432 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509812736 unmapped: 73490432 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5285492 data_alloc: 234881024 data_used: 25055232
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509812736 unmapped: 73490432 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.733349800s of 10.016734123s, submitted: 44
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x1997c9000/0x0/0x1bfc00000, data 0x44adbab/0x46b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509984768 unmapped: 73318400 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x56127e8c70e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509984768 unmapped: 73318400 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x1997a5000/0x0/0x1bfc00000, data 0x44d1bab/0x46d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [0,0,0,0,0,1,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509984768 unmapped: 73318400 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509984768 unmapped: 73318400 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5300454 data_alloc: 234881024 data_used: 26152960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509992960 unmapped: 73310208 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19979f000/0x0/0x1bfc00000, data 0x44d7bab/0x46df000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [0,1,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 513368064 unmapped: 69935104 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514826240 unmapped: 68476928 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514826240 unmapped: 68476928 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x198c71000/0x0/0x1bfc00000, data 0x4ffcbab/0x5204000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514031616 unmapped: 69271552 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5403078 data_alloc: 234881024 data_used: 28106752
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514031616 unmapped: 69271552 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514031616 unmapped: 69271552 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x198c77000/0x0/0x1bfc00000, data 0x4fffbab/0x5207000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.625724792s of 11.518292427s, submitted: 103
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514031616 unmapped: 69271552 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514031616 unmapped: 69271552 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514031616 unmapped: 69271552 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5404350 data_alloc: 234881024 data_used: 28106752
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x561280f80d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e9bdc00 session 0x56127e7994a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 513998848 unmapped: 69304320 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x198c55000/0x0/0x1bfc00000, data 0x5021bab/0x5229000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509132800 unmapped: 74170368 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea3ac00 session 0x561280bd85a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x198496000/0x0/0x1bfc00000, data 0x4921b8b/0x4b27000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509091840 unmapped: 74211328 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 508067840 unmapped: 75235328 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19934b000/0x0/0x1bfc00000, data 0x492db8b/0x4b33000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 508067840 unmapped: 75235328 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5331191 data_alloc: 234881024 data_used: 23810048
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 508067840 unmapped: 75235328 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 508067840 unmapped: 75235328 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x199348000/0x0/0x1bfc00000, data 0x4930b8b/0x4b36000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 508067840 unmapped: 75235328 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.741526604s of 11.156961441s, submitted: 118
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 508067840 unmapped: 75235328 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x199348000/0x0/0x1bfc00000, data 0x4930b8b/0x4b36000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 508067840 unmapped: 75235328 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5331571 data_alloc: 234881024 data_used: 23810048
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 508067840 unmapped: 75235328 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 508067840 unmapped: 75235328 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 508067840 unmapped: 75235328 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 508067840 unmapped: 75235328 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561280528c00 session 0x56127e5a5e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x199348000/0x0/0x1bfc00000, data 0x4930b8b/0x4b36000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507994112 unmapped: 75309056 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5330083 data_alloc: 234881024 data_used: 23818240
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507994112 unmapped: 75309056 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505339904 unmapped: 77963264 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea3ac00 session 0x561282a77680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505339904 unmapped: 77963264 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a474000/0x0/0x1bfc00000, data 0x3804b8b/0x3a0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505339904 unmapped: 77963264 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505339904 unmapped: 77963264 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5147213 data_alloc: 234881024 data_used: 15740928
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x5612832a32c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e9bdc00 session 0x56127f8b5c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea37400 session 0x561282a77e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea37400 session 0x561285cd01e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505339904 unmapped: 77963264 heap: 583303168 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.502902985s of 12.202594757s, submitted: 27
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 515088384 unmapped: 71892992 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x56127e9a4d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e9bdc00 session 0x56128329fe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea3ac00 session 0x561285cd01e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561280528c00 session 0x5612832a32c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x561280f80d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507437056 unmapped: 79544320 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x199c19000/0x0/0x1bfc00000, data 0x405ebed/0x4265000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507437056 unmapped: 79544320 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x199c19000/0x0/0x1bfc00000, data 0x405ebed/0x4265000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507437056 unmapped: 79544320 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5213625 data_alloc: 234881024 data_used: 15740928
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507437056 unmapped: 79544320 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507437056 unmapped: 79544320 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507437056 unmapped: 79544320 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507437056 unmapped: 79544320 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507437056 unmapped: 79544320 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x199c19000/0x0/0x1bfc00000, data 0x405ebed/0x4265000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5213625 data_alloc: 218103808 data_used: 15740928
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507437056 unmapped: 79544320 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507437056 unmapped: 79544320 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507437056 unmapped: 79544320 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x199c19000/0x0/0x1bfc00000, data 0x405ebed/0x4265000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.639642715s of 12.630477905s, submitted: 38
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 506388480 unmapped: 80592896 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e9bdc00 session 0x56127f33f680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 506388480 unmapped: 80592896 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5213933 data_alloc: 218103808 data_used: 15740928
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea37400 session 0x561280bd8d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 506388480 unmapped: 80592896 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea3ac00 session 0x56127e8e8f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x561280f814a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 506200064 unmapped: 80781312 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x199c17000/0x0/0x1bfc00000, data 0x405fbfd/0x4267000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 506200064 unmapped: 80781312 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505790464 unmapped: 81190912 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505659392 unmapped: 81321984 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5278908 data_alloc: 234881024 data_used: 24514560
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505659392 unmapped: 81321984 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x199c17000/0x0/0x1bfc00000, data 0x405fbfd/0x4267000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505659392 unmapped: 81321984 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505659392 unmapped: 81321984 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505659392 unmapped: 81321984 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505659392 unmapped: 81321984 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5278908 data_alloc: 234881024 data_used: 24514560
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x199c17000/0x0/0x1bfc00000, data 0x405fbfd/0x4267000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505659392 unmapped: 81321984 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505659392 unmapped: 81321984 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.253059387s of 13.453433990s, submitted: 12
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x56127f2ba1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x56128329e960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 505659392 unmapped: 81321984 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x199c17000/0x0/0x1bfc00000, data 0x405fbfd/0x4267000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e9bdc00 session 0x56127e5a54a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504070144 unmapped: 82911232 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a472000/0x0/0x1bfc00000, data 0x3805b8b/0x3a0b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504070144 unmapped: 82911232 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5155859 data_alloc: 218103808 data_used: 15757312
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a472000/0x0/0x1bfc00000, data 0x3805b8b/0x3a0b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504070144 unmapped: 82911232 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a472000/0x0/0x1bfc00000, data 0x3805b8b/0x3a0b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504070144 unmapped: 82911232 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504070144 unmapped: 82911232 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504119296 unmapped: 82862080 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea37400 session 0x561280bd8960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea3ac00 session 0x56127e6130e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea3ac00 session 0x561285cd1a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x5612813e65a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e9bdc00 session 0x56127f8b5c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a101000/0x0/0x1bfc00000, data 0x3b77b8b/0x3d7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503824384 unmapped: 83156992 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5194594 data_alloc: 218103808 data_used: 15757312
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503824384 unmapped: 83156992 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503824384 unmapped: 83156992 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503824384 unmapped: 83156992 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a07e000/0x0/0x1bfc00000, data 0x3bfab8b/0x3e00000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503824384 unmapped: 83156992 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503824384 unmapped: 83156992 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5194594 data_alloc: 218103808 data_used: 15757312
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503824384 unmapped: 83156992 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503824384 unmapped: 83156992 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a07e000/0x0/0x1bfc00000, data 0x3bfab8b/0x3e00000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503824384 unmapped: 83156992 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea37400 session 0x56127f8874a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503824384 unmapped: 83156992 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a07e000/0x0/0x1bfc00000, data 0x3bfab8b/0x3e00000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x56127f8b5e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503824384 unmapped: 83156992 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5194594 data_alloc: 218103808 data_used: 15757312
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503824384 unmapped: 83156992 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503824384 unmapped: 83156992 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x561282a770e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.209194183s of 20.514221191s, submitted: 72
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x56127e8d43c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503980032 unmapped: 83001344 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503980032 unmapped: 83001344 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503980032 unmapped: 83001344 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a05a000/0x0/0x1bfc00000, data 0x3c1eb8b/0x3e24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5197802 data_alloc: 218103808 data_used: 15761408
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503980032 unmapped: 83001344 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503980032 unmapped: 83001344 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a05a000/0x0/0x1bfc00000, data 0x3c1eb8b/0x3e24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503980032 unmapped: 83001344 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503980032 unmapped: 83001344 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a05a000/0x0/0x1bfc00000, data 0x3c1eb8b/0x3e24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 502063104 unmapped: 84918272 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5217482 data_alloc: 218103808 data_used: 18505728
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 501932032 unmapped: 85049344 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a05a000/0x0/0x1bfc00000, data 0x3c1eb8b/0x3e24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a05a000/0x0/0x1bfc00000, data 0x3c1eb8b/0x3e24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 501882880 unmapped: 85098496 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 501882880 unmapped: 85098496 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a05a000/0x0/0x1bfc00000, data 0x3c1eb8b/0x3e24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 501882880 unmapped: 85098496 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a05a000/0x0/0x1bfc00000, data 0x3c1eb8b/0x3e24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a05a000/0x0/0x1bfc00000, data 0x3c1eb8b/0x3e24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 501882880 unmapped: 85098496 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5218602 data_alloc: 218103808 data_used: 18632704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 501882880 unmapped: 85098496 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 501882880 unmapped: 85098496 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 501882880 unmapped: 85098496 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 501882880 unmapped: 85098496 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 501882880 unmapped: 85098496 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5219082 data_alloc: 218103808 data_used: 18644992
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a05a000/0x0/0x1bfc00000, data 0x3c1eb8b/0x3e24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.044145584s of 18.056701660s, submitted: 3
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509034496 unmapped: 77946880 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503439360 unmapped: 83542016 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19964f000/0x0/0x1bfc00000, data 0x4622b8b/0x4828000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 503119872 unmapped: 83861504 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504389632 unmapped: 82591744 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x1995bf000/0x0/0x1bfc00000, data 0x46b0b8b/0x48b6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,9])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504397824 unmapped: 82583552 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5317672 data_alloc: 218103808 data_used: 19050496
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504397824 unmapped: 82583552 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x1995bf000/0x0/0x1bfc00000, data 0x46b0b8b/0x48b6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504397824 unmapped: 82583552 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x1995bf000/0x0/0x1bfc00000, data 0x46b0b8b/0x48b6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504397824 unmapped: 82583552 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504397824 unmapped: 82583552 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504397824 unmapped: 82583552 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5312380 data_alloc: 218103808 data_used: 19050496
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f49f000 session 0x56127dde5a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x5612822d7400 session 0x56127dde54a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56128458a400 session 0x561280b0a960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56128458a400 session 0x561280b0be00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.378281116s of 10.199616432s, submitted: 89
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x561280b0a780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504463360 unmapped: 82518016 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x198f16000/0x0/0x1bfc00000, data 0x4d62b8b/0x4f68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504463360 unmapped: 82518016 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504463360 unmapped: 82518016 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504463360 unmapped: 82518016 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504463360 unmapped: 82518016 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5367688 data_alloc: 218103808 data_used: 19050496
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504463360 unmapped: 82518016 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x561281c76b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f49f000 session 0x561281c76000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x5612822d7400 session 0x561282f9f680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x56127e8c6000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x561280f80780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f49f000 session 0x5612832a2b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56128458a400 session 0x56127f2ba5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561289ecb800 session 0x561281bebc20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561289ecb800 session 0x561282a76b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504635392 unmapped: 82345984 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x198f16000/0x0/0x1bfc00000, data 0x4d62b8b/0x4f68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x561285cd1e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x5612813e6b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f49f000 session 0x561280f810e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56128458a400 session 0x56127ea17680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x5612813e6000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56128458a400 session 0x56127e8c70e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x56127f3de1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f49f000 session 0x561280fa2000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561289ecb800 session 0x561280c0e000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561289ecb800 session 0x561281bea780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504619008 unmapped: 82362368 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x56127f886780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504619008 unmapped: 82362368 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x561280c0f680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504619008 unmapped: 82362368 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f49f000 session 0x561282a76d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5534294 data_alloc: 218103808 data_used: 19050496
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.894186020s of 10.279652596s, submitted: 82
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504537088 unmapped: 82444288 heap: 586981376 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea36800 session 0x56127f886000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x561282a77c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x56127f7192c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f49f000 session 0x56127e8d4b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561289ecb800 session 0x561280f81e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x5612804a8800 session 0x561280fa3e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x5612832a3a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x197b57000/0x0/0x1bfc00000, data 0x611dc1d/0x6327000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504782848 unmapped: 86401024 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x561280c0f860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f49f000 session 0x561280b0a5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x196f5d000/0x0/0x1bfc00000, data 0x6d17c1d/0x6f21000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504725504 unmapped: 86458368 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561289ecb800 session 0x5612813e7e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561286217c00 session 0x561285cd1680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504725504 unmapped: 86458368 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x56127ea1fa40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f49f000 session 0x56127e5a54a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x56128329f2c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504700928 unmapped: 86482944 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561286217c00 session 0x56127f2bba40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5671787 data_alloc: 234881024 data_used: 24633344
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504700928 unmapped: 86482944 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56128458b000 session 0x561282f9f2c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x561280fa30e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504889344 unmapped: 86294528 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x196f30000/0x0/0x1bfc00000, data 0x6d41c60/0x6f4e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 504897536 unmapped: 86286336 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561286217c00 session 0x56127e8e8f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 506503168 unmapped: 84680704 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561280e87000 session 0x561280bd8d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507822080 unmapped: 83361792 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5808219 data_alloc: 251658240 data_used: 40206336
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x5612823e1c00 session 0x56127f33f680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561280fcc800 session 0x5612832a32c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507822080 unmapped: 83361792 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.234940529s of 10.610002518s, submitted: 48
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507822080 unmapped: 83361792 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 508510208 unmapped: 82673664 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x196f2f000/0x0/0x1bfc00000, data 0x6d41c70/0x6f4f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [0,0,0,0,1,1,0,12])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 513130496 unmapped: 78053376 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 513146880 unmapped: 78036992 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5972071 data_alloc: 251658240 data_used: 43220992
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x1960bb000/0x0/0x1bfc00000, data 0x7bacc70/0x7dba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512491520 unmapped: 78692352 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x56128329fe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561280e87000 session 0x56127e8e8000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x5612823e1c00 session 0x56127ea1e1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512491520 unmapped: 78692352 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512753664 unmapped: 78430208 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518144000 unmapped: 73039872 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561289ecb800 session 0x561281beb4a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561280529000 session 0x56128329e960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 520224768 unmapped: 70959104 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5960494 data_alloc: 234881024 data_used: 35905536
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561280529000 session 0x561282f9fe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 520093696 unmapped: 71090176 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x196f15000/0x0/0x1bfc00000, data 0x6d5ec1d/0x6f68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.147245407s of 10.043905258s, submitted: 374
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 520118272 unmapped: 71065600 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 520118272 unmapped: 71065600 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 520118272 unmapped: 71065600 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56128458a400 session 0x56127e996780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea37800 session 0x561280f81a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 520118272 unmapped: 71065600 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x196f09000/0x0/0x1bfc00000, data 0x6d6ac1d/0x6f74000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5555635 data_alloc: 234881024 data_used: 22663168
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x56127f2ba5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517128192 unmapped: 74055680 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517128192 unmapped: 74055680 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517128192 unmapped: 74055680 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561280e87000 session 0x56127e5a52c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517136384 unmapped: 74047488 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x198416000/0x0/0x1bfc00000, data 0x585ec6f/0x5a68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517136384 unmapped: 74047488 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5556288 data_alloc: 234881024 data_used: 22671360
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517136384 unmapped: 74047488 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517136384 unmapped: 74047488 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x198416000/0x0/0x1bfc00000, data 0x585ec6f/0x5a68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517136384 unmapped: 74047488 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517136384 unmapped: 74047488 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.409004211s of 12.957763672s, submitted: 29
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x5612832a2780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f49f000 session 0x561281c76b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561280e87000 session 0x56127e997680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517136384 unmapped: 74047488 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5556376 data_alloc: 234881024 data_used: 22675456
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517136384 unmapped: 74047488 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x198416000/0x0/0x1bfc00000, data 0x585ec6f/0x5a68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517136384 unmapped: 74047488 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517136384 unmapped: 74047488 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517136384 unmapped: 74047488 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x198416000/0x0/0x1bfc00000, data 0x585ec6f/0x5a68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517136384 unmapped: 74047488 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5556376 data_alloc: 234881024 data_used: 22675456
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x56128127e1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517144576 unmapped: 74039296 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e9bdc00 session 0x561282a76f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea37400 session 0x5612832a2000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e9bdc00 session 0x56128329ef00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512614400 unmapped: 78569472 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512614400 unmapped: 78569472 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512622592 unmapped: 78561280 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x1992e5000/0x0/0x1bfc00000, data 0x498fc6f/0x4b99000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512630784 unmapped: 78553088 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5432604 data_alloc: 234881024 data_used: 23650304
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512630784 unmapped: 78553088 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512630784 unmapped: 78553088 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512630784 unmapped: 78553088 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512630784 unmapped: 78553088 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x1992e5000/0x0/0x1bfc00000, data 0x498fc6f/0x4b99000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512630784 unmapped: 78553088 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5432604 data_alloc: 234881024 data_used: 23650304
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512630784 unmapped: 78553088 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512630784 unmapped: 78553088 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512630784 unmapped: 78553088 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.969930649s of 19.254877090s, submitted: 31
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f39bc00 session 0x56127ecefa40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x5612805fa400 session 0x561280c0f4a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512638976 unmapped: 78544896 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x1992e5000/0x0/0x1bfc00000, data 0x498fc6f/0x4b99000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512663552 unmapped: 78520320 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5307774 data_alloc: 234881024 data_used: 22913024
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512671744 unmapped: 78512128 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x561280e87000 session 0x5612813e7860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a050000/0x0/0x1bfc00000, data 0x3c24c6f/0x3e2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5303442 data_alloc: 234881024 data_used: 23539712
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a050000/0x0/0x1bfc00000, data 0x3c24c4c/0x3e2d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5303442 data_alloc: 234881024 data_used: 23539712
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a050000/0x0/0x1bfc00000, data 0x3c24c4c/0x3e2d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5303442 data_alloc: 234881024 data_used: 23539712
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a050000/0x0/0x1bfc00000, data 0x3c24c4c/0x3e2d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5303442 data_alloc: 234881024 data_used: 23539712
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a050000/0x0/0x1bfc00000, data 0x3c24c4c/0x3e2d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a050000/0x0/0x1bfc00000, data 0x3c24c4c/0x3e2d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x56127f8112c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5303442 data_alloc: 234881024 data_used: 23539712
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512679936 unmapped: 78503936 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.276937485s of 28.337133408s, submitted: 55
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e9bdc00 session 0x5612832a2b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518832128 unmapped: 72351744 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea37400 session 0x56127f8b5c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a050000/0x0/0x1bfc00000, data 0x3c24c4c/0x3e2d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518832128 unmapped: 72351744 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a051000/0x0/0x1bfc00000, data 0x3c24c4c/0x3e2d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518832128 unmapped: 72351744 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x561280c0e000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f49f000 session 0x561281bea780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127f49f000 session 0x56127e6130e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518832128 unmapped: 72351744 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5304702 data_alloc: 234881024 data_used: 29368320
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518832128 unmapped: 72351744 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518832128 unmapped: 72351744 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x56127e8c74a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e9bdc00 session 0x56127e95f0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518832128 unmapped: 72351744 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea37400 session 0x56127e95fa40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea39800 session 0x561281c761e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 heartbeat osd_stat(store_statfs(0x19a050000/0x0/0x1bfc00000, data 0x3c24c4c/0x3e2d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518832128 unmapped: 72351744 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518832128 unmapped: 72351744 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5347828 data_alloc: 234881024 data_used: 29368320
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127e02c400 session 0x561284543a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518832128 unmapped: 72351744 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518832128 unmapped: 72351744 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x5612805fa400 session 0x56127e8e83c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 handle_osd_map epochs [390,391], i have 390, src has [1,391]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 390 ms_handle_reset con 0x56127ea37800 session 0x56127f810f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.048686028s of 11.159454346s, submitted: 15
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 391 ms_handle_reset con 0x561280529000 session 0x561282f9ef00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 391 ms_handle_reset con 0x56128458a400 session 0x561280c0e000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518832128 unmapped: 72351744 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 391 ms_handle_reset con 0x56127e02c400 session 0x56127f8b5c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 391 ms_handle_reset con 0x56127ea37800 session 0x561281c76b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 391 ms_handle_reset con 0x561280529000 session 0x5612813e7e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 391 ms_handle_reset con 0x5612805fa400 session 0x56128329e1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 391 ms_handle_reset con 0x5612823e1c00 session 0x56127e8e9680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 391 handle_osd_map epochs [391,392], i have 391, src has [1,392]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 392 ms_handle_reset con 0x56127f39bc00 session 0x561280bd8b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 519020544 unmapped: 72163328 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 392 heartbeat osd_stat(store_statfs(0x19947b000/0x0/0x1bfc00000, data 0x47f88b5/0x4a03000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 392 handle_osd_map epochs [392,393], i have 392, src has [1,393]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 519020544 unmapped: 72163328 heap: 591183872 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 393 heartbeat osd_stat(store_statfs(0x199476000/0x0/0x1bfc00000, data 0x47fa572/0x4a07000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5430395 data_alloc: 234881024 data_used: 29388800
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 393 ms_handle_reset con 0x56127e02c400 session 0x56127ea174a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 393 handle_osd_map epochs [393,394], i have 393, src has [1,394]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517906432 unmapped: 76947456 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 394 ms_handle_reset con 0x56127ea37800 session 0x5612832a2000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 394 ms_handle_reset con 0x56127f49f000 session 0x56127f2ba1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 394 heartbeat osd_stat(store_statfs(0x1988bf000/0x0/0x1bfc00000, data 0x53ae249/0x55bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 394 ms_handle_reset con 0x561280529000 session 0x561280bd9a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517906432 unmapped: 76947456 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 394 ms_handle_reset con 0x56127e02c400 session 0x56127e995c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517906432 unmapped: 76947456 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 394 ms_handle_reset con 0x56127f49f000 session 0x56128127fa40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 394 ms_handle_reset con 0x56127f39bc00 session 0x56127f4c2000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 394 ms_handle_reset con 0x561289ecb800 session 0x5612832a3c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 519028736 unmapped: 75825152 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 394 ms_handle_reset con 0x5612805fa400 session 0x561280b0ab40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 394 ms_handle_reset con 0x56127e02c400 session 0x5612813e74a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517595136 unmapped: 77258752 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 394 ms_handle_reset con 0x56127f39bc00 session 0x56127e799c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5565790 data_alloc: 234881024 data_used: 29388800
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 394 handle_osd_map epochs [394,395], i have 394, src has [1,395]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 395 handle_osd_map epochs [395,396], i have 395, src has [1,396]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x561286217c00 session 0x561280b0a000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x5612862ad400 session 0x561285cd1860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517611520 unmapped: 77242368 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127ea37800 session 0x56127f33e3c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517619712 unmapped: 77234176 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x198248000/0x0/0x1bfc00000, data 0x5a1f6a5/0x5c34000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 517619712 unmapped: 77234176 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.542555809s of 11.018893242s, submitted: 93
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518438912 unmapped: 76414976 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518479872 unmapped: 76374016 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5623414 data_alloc: 251658240 data_used: 36331520
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518479872 unmapped: 76374016 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x198248000/0x0/0x1bfc00000, data 0x5a1f6a5/0x5c34000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518479872 unmapped: 76374016 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518479872 unmapped: 76374016 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127f49f000 session 0x56127f887680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x561289ecb800 session 0x561280bd8f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 518479872 unmapped: 76374016 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516374528 unmapped: 78479360 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x198914000/0x0/0x1bfc00000, data 0x53556a5/0x556a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5518276 data_alloc: 234881024 data_used: 29405184
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x198914000/0x0/0x1bfc00000, data 0x53556a5/0x556a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127e02c400 session 0x56127ea1e1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516374528 unmapped: 78479360 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516374528 unmapped: 78479360 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516374528 unmapped: 78479360 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127e9bdc00 session 0x56128329ed20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127ea37400 session 0x56127e9974a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.612201691s of 10.155220032s, submitted: 39
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516374528 unmapped: 78479360 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507781120 unmapped: 87072768 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5288414 data_alloc: 234881024 data_used: 17674240
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127ea37400 session 0x56127e5a5e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x19948a000/0x0/0x1bfc00000, data 0x41cb5f0/0x43db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507781120 unmapped: 87072768 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507781120 unmapped: 87072768 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507781120 unmapped: 87072768 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x19948a000/0x0/0x1bfc00000, data 0x41cb5f0/0x43db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x19948a000/0x0/0x1bfc00000, data 0x41cb5f0/0x43db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507781120 unmapped: 87072768 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x19948a000/0x0/0x1bfc00000, data 0x41cb5f0/0x43db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507781120 unmapped: 87072768 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5288414 data_alloc: 234881024 data_used: 17674240
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127e02c400 session 0x561281c76780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507781120 unmapped: 87072768 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507781120 unmapped: 87072768 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 507781120 unmapped: 87072768 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509362176 unmapped: 85491712 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509362176 unmapped: 85491712 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x199aa2000/0x0/0x1bfc00000, data 0x41cb613/0x43dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5330416 data_alloc: 234881024 data_used: 23080960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509362176 unmapped: 85491712 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.426215172s of 13.164875984s, submitted: 37
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 509362176 unmapped: 85491712 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x561289ecb800 session 0x561282f9eb40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510795776 unmapped: 84058112 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x199134000/0x0/0x1bfc00000, data 0x4b39613/0x4d4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510795776 unmapped: 84058112 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510795776 unmapped: 84058112 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5408148 data_alloc: 234881024 data_used: 23085056
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510795776 unmapped: 84058112 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510795776 unmapped: 84058112 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x1990f3000/0x0/0x1bfc00000, data 0x4b7a613/0x4d8b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127f39bc00 session 0x56127e9a4000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510795776 unmapped: 84058112 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x561286217c00 session 0x56127e995680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127e02c400 session 0x56127e8d43c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127ea37400 session 0x561285cd1860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127f39bc00 session 0x56127ea1e1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x561289ecb800 session 0x561280bd8f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 512417792 unmapped: 82436096 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x5612862ad400 session 0x56127e8c6000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127e02c400 session 0x561280bd94a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127ea37400 session 0x5612832a23c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127f39bc00 session 0x56128329f4a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x561289ecb800 session 0x56127e8d52c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 513417216 unmapped: 81436672 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563248 data_alloc: 234881024 data_used: 23097344
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127ea38c00 session 0x56127e8d4d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127e02c400 session 0x56127e9a43c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 513449984 unmapped: 81403904 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x1982af000/0x0/0x1bfc00000, data 0x6027646/0x5bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 513449984 unmapped: 81403904 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x1982af000/0x0/0x1bfc00000, data 0x6027646/0x5bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x1982af000/0x0/0x1bfc00000, data 0x6027646/0x5bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 513458176 unmapped: 81395712 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 515063808 unmapped: 79790080 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127f39bc00 session 0x56127e613a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127ea37400 session 0x5612813e6f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 515063808 unmapped: 79790080 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x1982af000/0x0/0x1bfc00000, data 0x6027646/0x5bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.074668884s of 13.138185501s, submitted: 87
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5509805 data_alloc: 234881024 data_used: 23097344
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x561289ecb800 session 0x56127e7983c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127ea35c00 session 0x5612813f63c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 513867776 unmapped: 80986112 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127e02c400 session 0x56127e9a4d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 513867776 unmapped: 80986112 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127ea37400 session 0x561280f81e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127f39bc00 session 0x561280b0b680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127e9bdc00 session 0x56127f718960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127f49f000 session 0x561280c0e1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514220032 unmapped: 80633856 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 ms_handle_reset con 0x56127e02c400 session 0x5612813f72c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x198c62000/0x0/0x1bfc00000, data 0x5675623/0x521c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514260992 unmapped: 80592896 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514260992 unmapped: 80592896 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5513432 data_alloc: 234881024 data_used: 23093248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514203648 unmapped: 80650240 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 heartbeat osd_stat(store_statfs(0x198c37000/0x0/0x1bfc00000, data 0x569f610/0x5246000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x21d7f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514203648 unmapped: 80650240 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 396 handle_osd_map epochs [396,397], i have 396, src has [1,397]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 397 ms_handle_reset con 0x56127f39bc00 session 0x56127f2ba1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 397 ms_handle_reset con 0x56127ea34c00 session 0x56127e995c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 397 ms_handle_reset con 0x561289ecb800 session 0x561280bd9c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510967808 unmapped: 83886080 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 397 handle_osd_map epochs [397,398], i have 397, src has [1,398]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 397 handle_osd_map epochs [398,398], i have 398, src has [1,398]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 398 ms_handle_reset con 0x56128458b400 session 0x5612832a2000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 398 ms_handle_reset con 0x56128458b400 session 0x56127e799c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510976000 unmapped: 83877888 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510976000 unmapped: 83877888 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5279698 data_alloc: 234881024 data_used: 25096192
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510976000 unmapped: 83877888 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510976000 unmapped: 83877888 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 398 heartbeat osd_stat(store_statfs(0x19a50c000/0x0/0x1bfc00000, data 0x334f0aa/0x3560000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 398 heartbeat osd_stat(store_statfs(0x19a50c000/0x0/0x1bfc00000, data 0x334f0aa/0x3560000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510976000 unmapped: 83877888 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510976000 unmapped: 83877888 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 398 heartbeat osd_stat(store_statfs(0x19a50c000/0x0/0x1bfc00000, data 0x334f0aa/0x3560000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510976000 unmapped: 83877888 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5279698 data_alloc: 234881024 data_used: 25096192
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 398 handle_osd_map epochs [398,399], i have 398, src has [1,399]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.787668228s of 15.310669899s, submitted: 117
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 510984192 unmapped: 83869696 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 511049728 unmapped: 83804160 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19a50a000/0x0/0x1bfc00000, data 0x3350c21/0x3563000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 513957888 unmapped: 80896000 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 515072000 unmapped: 79781888 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 515072000 unmapped: 79781888 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5401550 data_alloc: 234881024 data_used: 26066944
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 515072000 unmapped: 79781888 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1997a3000/0x0/0x1bfc00000, data 0x40afc21/0x42c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 515072000 unmapped: 79781888 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1997a3000/0x0/0x1bfc00000, data 0x40afc21/0x42c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 515072000 unmapped: 79781888 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514015232 unmapped: 80838656 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19978a000/0x0/0x1bfc00000, data 0x40d1c21/0x42e4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514015232 unmapped: 80838656 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5394310 data_alloc: 234881024 data_used: 26066944
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514015232 unmapped: 80838656 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514015232 unmapped: 80838656 heap: 594853888 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x56128329f2c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x56127e8c70e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f39bc00 session 0x561281c76000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289ecb800 session 0x561285cd0000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.992821693s of 12.398694038s, submitted: 127
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289ecb800 session 0x561282f9e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514211840 unmapped: 84320256 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x56127e612780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198c4a000/0x0/0x1bfc00000, data 0x4c10c83/0x4e24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514138112 unmapped: 84393984 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514138112 unmapped: 84393984 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5484393 data_alloc: 234881024 data_used: 26066944
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x561282f9ef00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514138112 unmapped: 84393984 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f39bc00 session 0x56127f4c2000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514138112 unmapped: 84393984 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514138112 unmapped: 84393984 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128458b400 session 0x56127f886000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x56127f887e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514138112 unmapped: 84393984 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198c49000/0x0/0x1bfc00000, data 0x4c10ca6/0x4e25000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514138112 unmapped: 84393984 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5486046 data_alloc: 234881024 data_used: 26066944
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 514408448 unmapped: 84123648 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198c49000/0x0/0x1bfc00000, data 0x4c10ca6/0x4e25000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516603904 unmapped: 81928192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516603904 unmapped: 81928192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516603904 unmapped: 81928192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f39bc00 session 0x56127ea1e000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516603904 unmapped: 81928192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5569594 data_alloc: 251658240 data_used: 37687296
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198c46000/0x0/0x1bfc00000, data 0x4c13ca6/0x4e28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516603904 unmapped: 81928192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516603904 unmapped: 81928192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516603904 unmapped: 81928192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516603904 unmapped: 81928192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea3b400 session 0x5612832a3860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.679225922s of 16.973186493s, submitted: 43
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516603904 unmapped: 81928192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198c46000/0x0/0x1bfc00000, data 0x4c13ca6/0x4e28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5569678 data_alloc: 251658240 data_used: 37687296
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 516603904 unmapped: 81928192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 519241728 unmapped: 79290368 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523132928 unmapped: 75399168 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289ecb800 session 0x561280b0a960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523272192 unmapped: 75259904 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522682368 unmapped: 75849728 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5681148 data_alloc: 251658240 data_used: 39047168
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197e6a000/0x0/0x1bfc00000, data 0x59efca6/0x5c04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea3bc00 session 0x56127ea17680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522682368 unmapped: 75849728 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522682368 unmapped: 75849728 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197e6a000/0x0/0x1bfc00000, data 0x59efca6/0x5c04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e9bdc00 session 0x561282a774a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea37400 session 0x56127f8112c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522682368 unmapped: 75849728 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea3bc00 session 0x56127e996d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522698752 unmapped: 75833344 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.229879379s of 10.384419441s, submitted: 168
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522698752 unmapped: 75833344 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5450491 data_alloc: 234881024 data_used: 30593024
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x199498000/0x0/0x1bfc00000, data 0x43c3c86/0x45d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522698752 unmapped: 75833344 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x199498000/0x0/0x1bfc00000, data 0x43c3c86/0x45d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522698752 unmapped: 75833344 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x199498000/0x0/0x1bfc00000, data 0x43c3c86/0x45d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522698752 unmapped: 75833344 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522698752 unmapped: 75833344 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522698752 unmapped: 75833344 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5450491 data_alloc: 234881024 data_used: 30593024
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x561280c0fe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x199498000/0x0/0x1bfc00000, data 0x43c3c86/0x45d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522706944 unmapped: 75825152 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea3b400 session 0x56127f33e3c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x56127ea1fa40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e9bdc00 session 0x561281c761e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522723328 unmapped: 75808768 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x561280f812c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea37400 session 0x561280f81c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522723328 unmapped: 75808768 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522723328 unmapped: 75808768 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea3bc00 session 0x56128329eb40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x561280b0a780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb3000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522723328 unmapped: 75808768 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5175846 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb4000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522723328 unmapped: 75808768 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522723328 unmapped: 75808768 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522723328 unmapped: 75808768 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522723328 unmapped: 75808768 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522723328 unmapped: 75808768 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5175846 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522723328 unmapped: 75808768 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb4000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522723328 unmapped: 75808768 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522731520 unmapped: 75800576 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522731520 unmapped: 75800576 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb4000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522739712 unmapped: 75792384 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5175846 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522739712 unmapped: 75792384 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522739712 unmapped: 75792384 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522739712 unmapped: 75792384 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb4000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522739712 unmapped: 75792384 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522739712 unmapped: 75792384 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5175846 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522747904 unmapped: 75784192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522747904 unmapped: 75784192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522747904 unmapped: 75784192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522747904 unmapped: 75784192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb4000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522747904 unmapped: 75784192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5175846 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522747904 unmapped: 75784192 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522756096 unmapped: 75776000 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522756096 unmapped: 75776000 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522764288 unmapped: 75767808 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522764288 unmapped: 75767808 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5175846 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb4000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522764288 unmapped: 75767808 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522764288 unmapped: 75767808 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522764288 unmapped: 75767808 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb4000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522764288 unmapped: 75767808 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522764288 unmapped: 75767808 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5175846 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522764288 unmapped: 75767808 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522772480 unmapped: 75759616 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb4000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522772480 unmapped: 75759616 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522772480 unmapped: 75759616 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522772480 unmapped: 75759616 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5175846 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb4000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522772480 unmapped: 75759616 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522780672 unmapped: 75751424 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522780672 unmapped: 75751424 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb4000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522780672 unmapped: 75751424 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522780672 unmapped: 75751424 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5175846 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522780672 unmapped: 75751424 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522780672 unmapped: 75751424 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb4000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522780672 unmapped: 75751424 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522780672 unmapped: 75751424 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522780672 unmapped: 75751424 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5175846 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522780672 unmapped: 75751424 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb4000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522780672 unmapped: 75751424 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb4000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522788864 unmapped: 75743232 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522788864 unmapped: 75743232 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522788864 unmapped: 75743232 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5175846 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522788864 unmapped: 75743232 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522788864 unmapped: 75743232 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19adb4000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522788864 unmapped: 75743232 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 522788864 unmapped: 75743232 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 64.420852661s of 64.688301086s, submitted: 43
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x561281c76d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e9bdc00 session 0x561280f80780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea37400 session 0x561285cd1a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f39bc00 session 0x56127e995c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f39bc00 session 0x56127e5a52c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x56127ea18b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x56128329ef00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e9bdc00 session 0x56127e5a5c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea37400 session 0x5612813f6b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 527269888 unmapped: 71262208 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19a87b000/0x0/0x1bfc00000, data 0x2fe1c11/0x31f3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [0,0,0,1,0,8,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289ecb800 session 0x56128329ef00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea37400 session 0x561280b0a780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5299175 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x561280f812c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e9bdc00 session 0x56127f33e3c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x561282a774a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523239424 unmapped: 75292672 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523239424 unmapped: 75292672 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523239424 unmapped: 75292672 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523239424 unmapped: 75292672 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523239424 unmapped: 75292672 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5299175 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x199fc0000/0x0/0x1bfc00000, data 0x389cc11/0x3aae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523239424 unmapped: 75292672 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x199fc0000/0x0/0x1bfc00000, data 0x389cc11/0x3aae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523239424 unmapped: 75292672 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523239424 unmapped: 75292672 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x199fc0000/0x0/0x1bfc00000, data 0x389cc11/0x3aae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x199fc0000/0x0/0x1bfc00000, data 0x389cc11/0x3aae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523247616 unmapped: 75284480 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x5612832a3860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e9bdc00 session 0x56127ea1e000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x199fc0000/0x0/0x1bfc00000, data 0x389cc11/0x3aae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523247616 unmapped: 75284480 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea37400 session 0x56127f886000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5299175 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289ecb800 session 0x56127f4c2000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f39bc00 session 0x561282f9ef00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f39bc00 session 0x56127e612780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.230807304s of 11.846506119s, submitted: 67
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523247616 unmapped: 75284480 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x561282f9e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e9bdc00 session 0x561281c76000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523255808 unmapped: 75276288 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523255808 unmapped: 75276288 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 523223040 unmapped: 75309056 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 524345344 unmapped: 74186752 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5371171 data_alloc: 234881024 data_used: 26710016
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x199f9a000/0x0/0x1bfc00000, data 0x38c0c44/0x3ad4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 524345344 unmapped: 74186752 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 524345344 unmapped: 74186752 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x199f9a000/0x0/0x1bfc00000, data 0x38c0c44/0x3ad4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 524673024 unmapped: 73859072 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x199f9a000/0x0/0x1bfc00000, data 0x38c0c44/0x3ad4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 524869632 unmapped: 73662464 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 524869632 unmapped: 73662464 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5401411 data_alloc: 251658240 data_used: 31014912
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 524869632 unmapped: 73662464 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x199f9a000/0x0/0x1bfc00000, data 0x38c0c44/0x3ad4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 524869632 unmapped: 73662464 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 524869632 unmapped: 73662464 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 524869632 unmapped: 73662464 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.869935036s of 13.460103989s, submitted: 16
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531423232 unmapped: 67108864 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5532023 data_alloc: 251658240 data_used: 33079296
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197e3d000/0x0/0x1bfc00000, data 0x4875c44/0x4a89000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531759104 unmapped: 66772992 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531759104 unmapped: 66772992 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531759104 unmapped: 66772992 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 532414464 unmapped: 66117632 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 534429696 unmapped: 64102400 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5622495 data_alloc: 251658240 data_used: 34181120
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197627000/0x0/0x1bfc00000, data 0x5092c44/0x52a6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 534437888 unmapped: 64094208 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533913600 unmapped: 64618496 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533913600 unmapped: 64618496 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533913600 unmapped: 64618496 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197600000/0x0/0x1bfc00000, data 0x50bac44/0x52ce000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533913600 unmapped: 64618496 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5627453 data_alloc: 251658240 data_used: 34328576
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.471938133s of 11.465478897s, submitted: 223
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533913600 unmapped: 64618496 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533913600 unmapped: 64618496 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533913600 unmapped: 64618496 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561280e87000 session 0x561281bebe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533913600 unmapped: 64618496 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1975fe000/0x0/0x1bfc00000, data 0x50bcc44/0x52d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533913600 unmapped: 64618496 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5629639 data_alloc: 251658240 data_used: 34336768
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533913600 unmapped: 64618496 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533913600 unmapped: 64618496 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533921792 unmapped: 64610304 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1975fd000/0x0/0x1bfc00000, data 0x50bdc44/0x52d1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533921792 unmapped: 64610304 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533921792 unmapped: 64610304 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5629635 data_alloc: 251658240 data_used: 34336768
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.803511620s of 10.216893196s, submitted: 5
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533921792 unmapped: 64610304 heap: 598532096 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128458ac00 session 0x561280f801e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533995520 unmapped: 68739072 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533995520 unmapped: 68739072 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533995520 unmapped: 68739072 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196c38000/0x0/0x1bfc00000, data 0x5a82c44/0x5c96000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533995520 unmapped: 68739072 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5711913 data_alloc: 251658240 data_used: 34336768
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533995520 unmapped: 68739072 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533995520 unmapped: 68739072 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196c38000/0x0/0x1bfc00000, data 0x5a82c44/0x5c96000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533995520 unmapped: 68739072 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533995520 unmapped: 68739072 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533995520 unmapped: 68739072 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5711913 data_alloc: 251658240 data_used: 34336768
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533995520 unmapped: 68739072 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196c38000/0x0/0x1bfc00000, data 0x5a82c44/0x5c96000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.403929710s of 10.415681839s, submitted: 16
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x561282a77860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e9bdc00 session 0x561280f80b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533995520 unmapped: 68739072 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f39bc00 session 0x56127e5a4b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 534020096 unmapped: 68714496 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561280e87000 session 0x56127e8d41e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 534020096 unmapped: 68714496 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 534020096 unmapped: 68714496 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5713404 data_alloc: 251658240 data_used: 34332672
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196c37000/0x0/0x1bfc00000, data 0x5a82c67/0x5c97000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 68550656 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536436736 unmapped: 66297856 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536436736 unmapped: 66297856 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536436736 unmapped: 66297856 heap: 602734592 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561280fcd400 session 0x56127f2bbe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x561280fa3680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e9bdc00 session 0x56127f2bad20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f39bc00 session 0x56127f887a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561280e87000 session 0x561281c76f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea38800 session 0x5612813f7a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x561282f9ed20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536682496 unmapped: 69730304 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e9bdc00 session 0x56127e997a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f39bc00 session 0x561281c77860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5882909 data_alloc: 251658240 data_used: 41979904
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536682496 unmapped: 69730304 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196017000/0x0/0x1bfc00000, data 0x66a0cd9/0x68b7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536682496 unmapped: 69730304 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536682496 unmapped: 69730304 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536682496 unmapped: 69730304 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536682496 unmapped: 69730304 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5882909 data_alloc: 251658240 data_used: 41979904
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536682496 unmapped: 69730304 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.116009712s of 14.997079849s, submitted: 46
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #58. Immutable memtables: 14.
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541392896 unmapped: 65019904 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1957ee000/0x0/0x1bfc00000, data 0x6ec1cd9/0x70d8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541425664 unmapped: 64987136 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540770304 unmapped: 65642496 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x194121000/0x0/0x1bfc00000, data 0x73eecd9/0x7605000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540770304 unmapped: 65642496 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5995451 data_alloc: 251658240 data_used: 43220992
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561280e87000 session 0x5612813e7e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540966912 unmapped: 65445888 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540966912 unmapped: 65445888 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1940f6000/0x0/0x1bfc00000, data 0x7418cfc/0x7630000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540966912 unmapped: 65445888 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 547872768 unmapped: 58540032 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 547872768 unmapped: 58540032 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6085159 data_alloc: 268435456 data_used: 55738368
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 547872768 unmapped: 58540032 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 547872768 unmapped: 58540032 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1940fe000/0x0/0x1bfc00000, data 0x7418cfc/0x7630000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 547872768 unmapped: 58540032 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.635467529s of 12.613275528s, submitted: 132
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 547872768 unmapped: 58540032 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 547872768 unmapped: 58540032 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6085479 data_alloc: 268435456 data_used: 55746560
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 547872768 unmapped: 58540032 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 547954688 unmapped: 58458112 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1940fe000/0x0/0x1bfc00000, data 0x7418cfc/0x7630000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 547962880 unmapped: 58449920 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 547962880 unmapped: 58449920 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555253760 unmapped: 51159040 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6197769 data_alloc: 268435456 data_used: 55754752
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 551305216 unmapped: 55107584 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552304640 unmapped: 54108160 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552378368 unmapped: 54034432 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19332d000/0x0/0x1bfc00000, data 0x81e9cfc/0x8401000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [0,0,0,0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 6.645711899s of 10.220244408s, submitted: 127
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552599552 unmapped: 53813248 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552681472 unmapped: 53731328 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6218377 data_alloc: 268435456 data_used: 56815616
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612805fb800 session 0x56127ea16960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19328b000/0x0/0x1bfc00000, data 0x828bcfc/0x84a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552689664 unmapped: 53723136 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552689664 unmapped: 53723136 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546562048 unmapped: 59850752 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x56127ea16960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546562048 unmapped: 59850752 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546562048 unmapped: 59850752 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5970630 data_alloc: 251658240 data_used: 45506560
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19497a000/0x0/0x1bfc00000, data 0x6b9dcd9/0x6db4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546570240 unmapped: 59842560 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546570240 unmapped: 59842560 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546570240 unmapped: 59842560 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546570240 unmapped: 59842560 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.029835701s of 10.683382988s, submitted: 47
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19497a000/0x0/0x1bfc00000, data 0x6b9dcd9/0x6db4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [0,0,0,0,0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49cc00 session 0x56127e613c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127eb1d800 session 0x5612813e7860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546578432 unmapped: 59834368 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5969194 data_alloc: 251658240 data_used: 45506560
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546586624 unmapped: 59826176 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546619392 unmapped: 59793408 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1956f0000/0x0/0x1bfc00000, data 0x5e27c77/0x603d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [0,0,0,0,0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e9bdc00 session 0x56127f2bbe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5813740 data_alloc: 251658240 data_used: 40177664
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195713000/0x0/0x1bfc00000, data 0x5e06c54/0x601b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195713000/0x0/0x1bfc00000, data 0x5e06c54/0x601b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195713000/0x0/0x1bfc00000, data 0x5e06c54/0x601b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5814104 data_alloc: 251658240 data_used: 40185856
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.736224651s of 11.216956139s, submitted: 64
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea38400 session 0x56127ea16d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49c800 session 0x56127f33f4a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x56128329f680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196e23000/0x0/0x1bfc00000, data 0x434dbe2/0x4560000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543047680 unmapped: 63365120 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196e23000/0x0/0x1bfc00000, data 0x434dbe2/0x4560000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543047680 unmapped: 63365120 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196e4d000/0x0/0x1bfc00000, data 0x4323bbf/0x4535000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543047680 unmapped: 63365120 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543047680 unmapped: 63365120 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5507754 data_alloc: 234881024 data_used: 26574848
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543047680 unmapped: 63365120 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543055872 unmapped: 63356928 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543055872 unmapped: 63356928 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128056e400 session 0x561280c0e000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543055872 unmapped: 63356928 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196e4d000/0x0/0x1bfc00000, data 0x4323bbf/0x4535000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543064064 unmapped: 63348736 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5507622 data_alloc: 234881024 data_used: 26574848
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196e4d000/0x0/0x1bfc00000, data 0x4323bbf/0x4535000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543064064 unmapped: 63348736 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.698352814s of 11.074827194s, submitted: 53
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea37400 session 0x56127f3df0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289ecb800 session 0x561280b0a5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543064064 unmapped: 63348736 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1971f9000/0x0/0x1bfc00000, data 0x4323bbf/0x4535000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543064064 unmapped: 63348736 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543072256 unmapped: 63340544 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5240353 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289ecb800 session 0x561280f80000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198a74000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5240353 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 5401.0 total, 600.0 interval
Cumulative writes: 72K writes, 293K keys, 72K commit groups, 1.0 writes per commit group, ingest: 0.29 GB, 0.06 MB/s
Cumulative WAL: 72K writes, 26K syncs, 2.71 writes per sync, written: 0.29 GB, 0.06 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 5949 writes, 24K keys, 5949 commit groups, 1.0 writes per commit group, ingest: 24.13 MB, 0.04 MB/s
Interval WAL: 5949 writes, 2301 syncs, 2.59 writes per sync, written: 0.02 GB, 0.04 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198a74000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5240353 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: mgrc ms_handle_reset ms_handle_reset con 0x56127e5c1400
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3158772141
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3158772141,v1:192.168.122.100:6801/3158772141]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: mgrc handle_mgr_configure stats_period=5
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198a74000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198a74000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538681344 unmapped: 67731456 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538681344 unmapped: 67731456 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5240353 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198a74000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538681344 unmapped: 67731456 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538681344 unmapped: 67731456 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198a74000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538689536 unmapped: 67723264 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538689536 unmapped: 67723264 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198a74000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612862acc00 session 0x561285cd0960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128056ec00 session 0x56128127f4a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612862ad800 session 0x561280b0a000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612834c6400 session 0x56127e8c7e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538689536 unmapped: 67723264 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.682285309s of 23.394338608s, submitted: 33
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5303640 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128056ec00 session 0x561280c0fe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612862acc00 session 0x561285cd0f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612862ad800 session 0x56127f8b5e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289ecb800 session 0x56127e8d4b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539795456 unmapped: 66617344 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197b48000/0x0/0x1bfc00000, data 0x39d4c63/0x3be6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539803648 unmapped: 66609152 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539836416 unmapped: 66576384 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539836416 unmapped: 66576384 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539836416 unmapped: 66576384 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5365357 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539836416 unmapped: 66576384 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561286920000 session 0x561280f803c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539836416 unmapped: 66576384 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128056ec00 session 0x56127f3ee780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197b48000/0x0/0x1bfc00000, data 0x39d4c63/0x3be6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197b48000/0x0/0x1bfc00000, data 0x39d4c63/0x3be6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612850f3000 session 0x56127f3de1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea5d000 session 0x561280b0b2c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561295f45000 session 0x56127f2bb0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539844608 unmapped: 66568192 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197b22000/0x0/0x1bfc00000, data 0x39f8c96/0x3c0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539844608 unmapped: 66568192 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539885568 unmapped: 66527232 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5439220 data_alloc: 234881024 data_used: 26955776
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197b22000/0x0/0x1bfc00000, data 0x39f8c96/0x3c0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5476980 data_alloc: 251658240 data_used: 32296960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197b22000/0x0/0x1bfc00000, data 0x39f8c96/0x3c0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197b22000/0x0/0x1bfc00000, data 0x39f8c96/0x3c0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.117702484s of 19.345458984s, submitted: 68
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540090368 unmapped: 66322432 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5476756 data_alloc: 251658240 data_used: 32301056
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545300480 unmapped: 61112320 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545357824 unmapped: 61054976 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196a2e000/0x0/0x1bfc00000, data 0x4aebc96/0x4cff000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5639368 data_alloc: 251658240 data_used: 33820672
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196a2c000/0x0/0x1bfc00000, data 0x4aeec96/0x4d02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.879464149s of 10.096103668s, submitted: 423
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612810bb800 session 0x56127e95e960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5638532 data_alloc: 251658240 data_used: 33828864
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196a2b000/0x0/0x1bfc00000, data 0x4aeecf8/0x4d03000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196a2b000/0x0/0x1bfc00000, data 0x4aeecf8/0x4d03000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561280e86c00 session 0x56127f3de3c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612810bac00 session 0x56127e997a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 542588928 unmapped: 63823872 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612806aa400 session 0x56127e8c7a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5424450 data_alloc: 234881024 data_used: 24637440
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612841f8000 session 0x561280b0a960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 542588928 unmapped: 63823872 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289eca800 session 0x56127e799e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128b8bd800 session 0x561280b0b860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 542588928 unmapped: 63823872 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612806aa400 session 0x561285cd05a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198436000/0x0/0x1bfc00000, data 0x2aaac24/0x2cbc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5270901 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198436000/0x0/0x1bfc00000, data 0x2aaac01/0x2cbb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5270901 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198436000/0x0/0x1bfc00000, data 0x2aaac01/0x2cbb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198436000/0x0/0x1bfc00000, data 0x2aaac01/0x2cbb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5270901 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530128896 unmapped: 76283904 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530128896 unmapped: 76283904 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530128896 unmapped: 76283904 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49d400 session 0x56127e8e83c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127fa4ac00 session 0x56127ea17e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612841f8400 session 0x56127f33e3c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49d400 session 0x56127ecee780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 24.085548401s of 24.294006348s, submitted: 62
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530112512 unmapped: 76300288 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127fa4ac00 session 0x561280c0e1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198436000/0x0/0x1bfc00000, data 0x2aaac01/0x2cbb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612806aa400 session 0x56128127e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128b8bd800 session 0x561281bebe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128fb9f400 session 0x56127e9a5860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49d400 session 0x5612813e7e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5309808 data_alloc: 234881024 data_used: 17612800
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198259000/0x0/0x1bfc00000, data 0x2eb2c73/0x30c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5309808 data_alloc: 234881024 data_used: 17612800
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530022400 unmapped: 76390400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530022400 unmapped: 76390400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.395060539s of 11.491009712s, submitted: 29
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530022400 unmapped: 76390400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5326945 data_alloc: 234881024 data_used: 17612800
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198259000/0x0/0x1bfc00000, data 0x2eb2c73/0x30c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561285e00c00 session 0x56127e8d52c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530071552 unmapped: 83689472 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49cc00 session 0x5612813f7a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561286218400 session 0x56127ea1e1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e9bc400 session 0x561281bea960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530006016 unmapped: 83755008 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289a6dc00 session 0x561280f80780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 83746816 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561295f42400 session 0x5612813f63c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127eb1d800 session 0x561280f81680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19772a000/0x0/0x1bfc00000, data 0x39ded08/0x3bf4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530178048 unmapped: 83582976 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530178048 unmapped: 83582976 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5457726 data_alloc: 234881024 data_used: 23810048
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19772a000/0x0/0x1bfc00000, data 0x39ded08/0x3bf4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612810ba400 session 0x56127f33e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561280c31400 session 0x561282a77c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5515354 data_alloc: 251658240 data_used: 31928320
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.525037766s of 13.965292931s, submitted: 61
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19772a000/0x0/0x1bfc00000, data 0x39ded08/0x3bf4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [0,0,0,1,0,3])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 77807616 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5597972 data_alloc: 251658240 data_used: 32464896
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x56127f8b50e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612805fa400 session 0x56128329f0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 535986176 unmapped: 77774848 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196d2c000/0x0/0x1bfc00000, data 0x43d4d08/0x45ea000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5604106 data_alloc: 251658240 data_used: 32288768
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196d20000/0x0/0x1bfc00000, data 0x43e2d08/0x45f8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561285602800 session 0x56127f8103c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561280674c00 session 0x56128127f4a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561286216400 session 0x56127e8d4b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612806afc00 session 0x56127e8c6960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 77750272 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5448333 data_alloc: 234881024 data_used: 22208512
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 77750272 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197824000/0x0/0x1bfc00000, data 0x38e3d08/0x3af9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 77750272 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 77750272 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 77750272 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x56127e9945a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197824000/0x0/0x1bfc00000, data 0x38e3d08/0x3af9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.898028374s of 15.850214005s, submitted: 141
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49f400 session 0x56128329f860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561280528c00 session 0x56127f886b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 77750272 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5447601 data_alloc: 234881024 data_used: 22208512
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533331968 unmapped: 80429056 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533348352 unmapped: 80412672 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e5c0c00 session 0x561280b0ab40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19865f000/0x0/0x1bfc00000, data 0x2aaac73/0x2cbd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533348352 unmapped: 80412672 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533348352 unmapped: 80412672 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19865f000/0x0/0x1bfc00000, data 0x2aaac73/0x2cbd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533348352 unmapped: 80412672 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5299051 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533348352 unmapped: 80412672 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x5612832a3680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49f400 session 0x56127ea1e780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533356544 unmapped: 80404480 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533356544 unmapped: 80404480 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612805fbc00 session 0x56127f8b4780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561295f42800 session 0x561280fa25a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533389312 unmapped: 80371712 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.706570625s of 10.357423782s, submitted: 79
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533389312 unmapped: 80371712 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289a6c400 session 0x561280f80b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5296296 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x56127f8b4780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533397504 unmapped: 80363520 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533397504 unmapped: 80363520 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533397504 unmapped: 80363520 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533397504 unmapped: 80363520 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533397504 unmapped: 80363520 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5295481 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533397504 unmapped: 80363520 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533397504 unmapped: 80363520 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533405696 unmapped: 80355328 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533405696 unmapped: 80355328 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533405696 unmapped: 80355328 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5295481 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533405696 unmapped: 80355328 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533405696 unmapped: 80355328 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533405696 unmapped: 80355328 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533413888 unmapped: 80347136 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533413888 unmapped: 80347136 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5295481 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533413888 unmapped: 80347136 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533422080 unmapped: 80338944 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea5d000 session 0x56128329f0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561283958800 session 0x56127f8b50e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561284537800 session 0x561282a77c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612806ae800 session 0x56127f33e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.242624283s of 17.551719666s, submitted: 16
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533422080 unmapped: 80338944 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x5612813f63c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea5d000 session 0x5612813e7e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5397174 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128a0d4000 session 0x561281bebe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19794e000/0x0/0x1bfc00000, data 0x37bfc01/0x39d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02d000 session 0x56128127e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612862ac800 session 0x56127ecee780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 84525056 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5399012 data_alloc: 234881024 data_used: 17608704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19794e000/0x0/0x1bfc00000, data 0x37bfc01/0x39d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533422080 unmapped: 84541440 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02d000 session 0x56127f33e3c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19794c000/0x0/0x1bfc00000, data 0x37bfc34/0x39d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49cc00 session 0x56127e5a54a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128458a800 session 0x561285cd0f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561285e01400 session 0x561280b0a960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561295f1fc00 session 0x56127e613a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.739593506s of 10.920108795s, submitted: 44
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02d000 session 0x561280c0f860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49cc00 session 0x561280c0f2c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128458a800 session 0x561280c0f0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561285e01400 session 0x561280c0e960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea35c00 session 0x561281c77860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 535896064 unmapped: 82067456 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536748032 unmapped: 81215488 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5581490 data_alloc: 251658240 data_used: 31318016
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536748032 unmapped: 81215488 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536748032 unmapped: 81215488 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561286216800 session 0x561281c77e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196f0e000/0x0/0x1bfc00000, data 0x41fcc44/0x4410000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49e400 session 0x561281c765a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536748032 unmapped: 81215488 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128a0d5000 session 0x561281c77680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536748032 unmapped: 81215488 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561295f45c00 session 0x561282a772c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536756224 unmapped: 81207296 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5583174 data_alloc: 251658240 data_used: 31318016
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536756224 unmapped: 81207296 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196eea000/0x0/0x1bfc00000, data 0x4220c44/0x4434000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537149440 unmapped: 80814080 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539803648 unmapped: 78159872 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539803648 unmapped: 78159872 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.370302200s of 11.185728073s, submitted: 17
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 542154752 unmapped: 75808768 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5701180 data_alloc: 251658240 data_used: 41771008
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 542613504 unmapped: 75350016 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546119680 unmapped: 71843840 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1965eb000/0x0/0x1bfc00000, data 0x4b1fc44/0x4d33000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [0,0,0,0,0,0,1,0,0,34,2])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544186368 unmapped: 73777152 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543498240 unmapped: 74465280 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543498240 unmapped: 74465280 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5757618 data_alloc: 251658240 data_used: 42098688
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1961fb000/0x0/0x1bfc00000, data 0x4f0fc44/0x5123000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543498240 unmapped: 74465280 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1961f6000/0x0/0x1bfc00000, data 0x4f14c44/0x5128000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544563200 unmapped: 73400320 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546381824 unmapped: 71581696 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545767424 unmapped: 72196096 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195dfb000/0x0/0x1bfc00000, data 0x5307c44/0x551b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 3.628212929s of 10.026289940s, submitted: 155
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545988608 unmapped: 71974912 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5819230 data_alloc: 251658240 data_used: 42954752
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545988608 unmapped: 71974912 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545996800 unmapped: 71966720 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545996800 unmapped: 71966720 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545996800 unmapped: 71966720 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195d84000/0x0/0x1bfc00000, data 0x5386c44/0x559a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545996800 unmapped: 71966720 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5818434 data_alloc: 251658240 data_used: 42975232
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545996800 unmapped: 71966720 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195d62000/0x0/0x1bfc00000, data 0x53a8c44/0x55bc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5819290 data_alloc: 251658240 data_used: 42975232
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195d62000/0x0/0x1bfc00000, data 0x53a8c44/0x55bc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.674377441s of 12.980909348s, submitted: 18
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195d5f000/0x0/0x1bfc00000, data 0x53abc44/0x55bf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546013184 unmapped: 71950336 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5820618 data_alloc: 251658240 data_used: 42979328
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546013184 unmapped: 71950336 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546013184 unmapped: 71950336 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea39800 session 0x561280bd9c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612862af000 session 0x561280b0b2c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561295f1fc00 session 0x5612813e7c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49c800 session 0x561282f9f0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546013184 unmapped: 71950336 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49e400 session 0x561285cd0d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197496000/0x0/0x1bfc00000, data 0x3983baf/0x3b94000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546062336 unmapped: 71901184 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546062336 unmapped: 71901184 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5526921 data_alloc: 234881024 data_used: 28925952
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546062336 unmapped: 71901184 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561286216800 session 0x561282a77a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 399 handle_osd_map epochs [399,400], i have 399, src has [1,400]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 400 ms_handle_reset con 0x56127f49c800 session 0x561281c774a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 400 ms_handle_reset con 0x5612823e1400 session 0x561281beab40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 400 ms_handle_reset con 0x56127eb1dc00 session 0x5612813e72c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546062336 unmapped: 71901184 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546070528 unmapped: 71892992 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546070528 unmapped: 71892992 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 400 heartbeat osd_stat(store_statfs(0x197785000/0x0/0x1bfc00000, data 0x398586a/0x3b98000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5533519 data_alloc: 234881024 data_used: 28995584
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 400 heartbeat osd_stat(store_statfs(0x197785000/0x0/0x1bfc00000, data 0x398586a/0x3b98000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5533519 data_alloc: 234881024 data_used: 28995584
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 400 heartbeat osd_stat(store_statfs(0x197785000/0x0/0x1bfc00000, data 0x398586a/0x3b98000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 400 ms_handle_reset con 0x56127f39bc00 session 0x561280f81c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.182943344s of 19.346008301s, submitted: 60
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 400 handle_osd_map epochs [400,401], i have 400, src has [1,401]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 401 ms_handle_reset con 0x56128b8bc400 session 0x56128127eb40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 401 heartbeat osd_stat(store_statfs(0x197783000/0x0/0x1bfc00000, data 0x39874b5/0x3b9a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5535477 data_alloc: 234881024 data_used: 29016064
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 401 heartbeat osd_stat(store_statfs(0x197784000/0x0/0x1bfc00000, data 0x39874b5/0x3b9a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 401 heartbeat osd_stat(store_statfs(0x197784000/0x0/0x1bfc00000, data 0x39874b5/0x3b9a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 401 handle_osd_map epochs [401,402], i have 401, src has [1,402]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5544371 data_alloc: 234881024 data_used: 29605888
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x197780000/0x0/0x1bfc00000, data 0x3988ff4/0x3b9d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x19777c000/0x0/0x1bfc00000, data 0x398dff4/0x3ba2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5546969 data_alloc: 234881024 data_used: 29601792
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x19777c000/0x0/0x1bfc00000, data 0x398dff4/0x3ba2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561295f42800 session 0x561280c0ed20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280674c00 session 0x561281c77c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.279945374s of 15.460276604s, submitted: 45
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x56127f49f400 session 0x56127f33f4a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x19865b000/0x0/0x1bfc00000, data 0x2aaffe4/0x2cc3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5332809 data_alloc: 234881024 data_used: 17637376
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x19865b000/0x0/0x1bfc00000, data 0x2aaffe4/0x2cc3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5332809 data_alloc: 234881024 data_used: 17637376
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x19865b000/0x0/0x1bfc00000, data 0x2aaffe4/0x2cc3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5332809 data_alloc: 234881024 data_used: 17637376
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x19865b000/0x0/0x1bfc00000, data 0x2aaffe4/0x2cc3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538255360 unmapped: 79708160 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538255360 unmapped: 79708160 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x19865b000/0x0/0x1bfc00000, data 0x2aaffe4/0x2cc3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538255360 unmapped: 79708160 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5332809 data_alloc: 234881024 data_used: 17637376
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538255360 unmapped: 79708160 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538255360 unmapped: 79708160 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561295f45800 session 0x561280b0af00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280e86000 session 0x56127f7192c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x56127f49f400 session 0x56128127fe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280674c00 session 0x56127e798b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.710439682s of 20.854120255s, submitted: 13
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280e86000 session 0x56128127e000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561295f42800 session 0x5612832a3c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561295f45800 session 0x561281bea5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539435008 unmapped: 78528512 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x56127f49f400 session 0x5612832a21e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280674c00 session 0x5612813e7e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280e86000 session 0x561285cd0f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561295f42800 session 0x561280c0f0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561284536800 session 0x561281bea960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x56127f49f400 session 0x561281c77680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280674c00 session 0x561280fa2960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280e86000 session 0x561280bd9860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561284536800 session 0x56127f3ee780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561295f42800 session 0x56128127e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x56127f49f400 session 0x56128329f4a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x197e44000/0x0/0x1bfc00000, data 0x32c4066/0x34da000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5411862 data_alloc: 234881024 data_used: 17637376
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561289ecb800 session 0x561281beba40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x56128458a800 session 0x56127e8c7a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280e87000 session 0x561280bd9e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280d06000 session 0x561280c0e960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x56127f49f400 session 0x56127e8d4d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539484160 unmapped: 78479360 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539443200 unmapped: 78520320 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x197dfa000/0x0/0x1bfc00000, data 0x330c099/0x3524000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5472521 data_alloc: 234881024 data_used: 24907776
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x197dfa000/0x0/0x1bfc00000, data 0x330c099/0x3524000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5472521 data_alloc: 234881024 data_used: 24907776
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.624403000s of 17.017063141s, submitted: 70
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 542777344 unmapped: 75186176 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5524501 data_alloc: 234881024 data_used: 25411584
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x197dfa000/0x0/0x1bfc00000, data 0x330c099/0x3524000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544989184 unmapped: 72974336 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x196b9a000/0x0/0x1bfc00000, data 0x4563099/0x477b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5643323 data_alloc: 234881024 data_used: 25501696
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x196b9a000/0x0/0x1bfc00000, data 0x4563099/0x477b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5633579 data_alloc: 234881024 data_used: 25505792
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.728960991s of 11.184023857s, submitted: 164
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545243136 unmapped: 72720384 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x196ba0000/0x0/0x1bfc00000, data 0x4566099/0x477e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x196ba0000/0x0/0x1bfc00000, data 0x4566099/0x477e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545243136 unmapped: 72720384 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x5612822d6800 session 0x561280c0eb40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545251328 unmapped: 72712192 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 402 handle_osd_map epochs [402,403], i have 402, src has [1,403]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 403 handle_osd_map epochs [403,403], i have 403, src has [1,403]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545251328 unmapped: 72712192 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 403 ms_handle_reset con 0x561280d07c00 session 0x56127dde54a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 403 ms_handle_reset con 0x561285e01c00 session 0x561281beb2c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 403 ms_handle_reset con 0x561283958800 session 0x56127e994780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 403 heartbeat osd_stat(store_statfs(0x194df5000/0x0/0x1bfc00000, data 0x630fcf2/0x6529000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 403 ms_handle_reset con 0x561283958800 session 0x56127f33e1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 403 handle_osd_map epochs [403,404], i have 403, src has [1,404]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 403 handle_osd_map epochs [404,404], i have 404, src has [1,404]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 404 ms_handle_reset con 0x56127f49f400 session 0x5612813f7860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557522944 unmapped: 83746816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5894637 data_alloc: 251658240 data_used: 32055296
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 404 handle_osd_map epochs [404,405], i have 404, src has [1,405]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557531136 unmapped: 83738624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x561280d07c00 session 0x561281c76b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 405 heartbeat osd_stat(store_statfs(0x194df1000/0x0/0x1bfc00000, data 0x631199f/0x652c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557531136 unmapped: 83738624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x56127ea36000 session 0x561280b0ba40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x561289a6d000 session 0x56127f886000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x56127ea36000 session 0x561282a76780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 83730432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 83730432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 405 heartbeat osd_stat(store_statfs(0x194deb000/0x0/0x1bfc00000, data 0x6315676/0x6532000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552140800 unmapped: 89128960 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5885083 data_alloc: 251658240 data_used: 32055296
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552140800 unmapped: 89128960 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.128082275s of 11.069601059s, submitted: 70
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x56128458a800 session 0x561280c0fe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x561289ecb800 session 0x56128127f2c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552140800 unmapped: 89128960 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552140800 unmapped: 89128960 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x56127ea35400 session 0x56127ea1fa40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x56128056e400 session 0x56127e8c72c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x56127ea35400 session 0x561281bea780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 405 heartbeat osd_stat(store_statfs(0x194dec000/0x0/0x1bfc00000, data 0x6315676/0x6532000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552140800 unmapped: 89128960 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 405 handle_osd_map epochs [405,406], i have 405, src has [1,406]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 406 ms_handle_reset con 0x56127ea36000 session 0x56127ea1f4a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 406 ms_handle_reset con 0x561289ecb800 session 0x56127e612960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 406 ms_handle_reset con 0x56128458a800 session 0x56127e5a4b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 406 ms_handle_reset con 0x561286f37800 session 0x561281c761e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 406 ms_handle_reset con 0x56127ea35400 session 0x561285cd03c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 406 ms_handle_reset con 0x56127ea36000 session 0x56127e8e83c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552148992 unmapped: 89120768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5891905 data_alloc: 251658240 data_used: 32067584
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552148992 unmapped: 89120768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 406 heartbeat osd_stat(store_statfs(0x194de7000/0x0/0x1bfc00000, data 0x6317217/0x6536000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552148992 unmapped: 89120768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552148992 unmapped: 89120768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552148992 unmapped: 89120768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552148992 unmapped: 89120768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5891905 data_alloc: 251658240 data_used: 32067584
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 406 heartbeat osd_stat(store_statfs(0x194de7000/0x0/0x1bfc00000, data 0x6317217/0x6536000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 406 ms_handle_reset con 0x561295f45000 session 0x56128329eb40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552148992 unmapped: 89120768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 406 handle_osd_map epochs [406,407], i have 406, src has [1,407]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.107981682s of 10.156295776s, submitted: 26
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 406 handle_osd_map epochs [407,407], i have 407, src has [1,407]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 handle_osd_map epochs [407,407], i have 407, src has [1,407]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x561280e87c00 session 0x561281c770e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552173568 unmapped: 89096192 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x561289a6c000 session 0x561280f80000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x56127ea35400 session 0x56128329fe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x56127ea36000 session 0x5612813f70e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552173568 unmapped: 89096192 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x561280e87c00 session 0x56127f497c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552189952 unmapped: 89079808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 heartbeat osd_stat(store_statfs(0x194de3000/0x0/0x1bfc00000, data 0x6318ed2/0x653a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552198144 unmapped: 89071616 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5897451 data_alloc: 251658240 data_used: 32100352
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552206336 unmapped: 89063424 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x56127eb1d800 session 0x56127e996000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 heartbeat osd_stat(store_statfs(0x194de3000/0x0/0x1bfc00000, data 0x6318ed2/0x653a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552206336 unmapped: 89063424 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552206336 unmapped: 89063424 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 553230336 unmapped: 88039424 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 heartbeat osd_stat(store_statfs(0x194de3000/0x0/0x1bfc00000, data 0x6318ef5/0x653b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x561295f45000 session 0x56127f497680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558751744 unmapped: 82518016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x561289a6d400 session 0x5612832a32c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6001264 data_alloc: 251658240 data_used: 46129152
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558751744 unmapped: 82518016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 heartbeat osd_stat(store_statfs(0x194de3000/0x0/0x1bfc00000, data 0x6318ef5/0x653b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558751744 unmapped: 82518016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558751744 unmapped: 82518016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x5612823e1000 session 0x56127e9974a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.553009987s of 11.791515350s, submitted: 16
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x561286218400 session 0x561285cd0b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 heartbeat osd_stat(store_statfs(0x194de3000/0x0/0x1bfc00000, data 0x6318ef5/0x653b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558891008 unmapped: 82378752 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 heartbeat osd_stat(store_statfs(0x194de4000/0x0/0x1bfc00000, data 0x73d9e93/0x653a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 407 handle_osd_map epochs [407,408], i have 407, src has [1,408]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 408 handle_osd_map epochs [408,408], i have 408, src has [1,408]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 408 ms_handle_reset con 0x561286921000 session 0x561282a76d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559955968 unmapped: 81313792 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6074674 data_alloc: 251658240 data_used: 46137344
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 408 ms_handle_reset con 0x56127e9bdc00 session 0x56128127f0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 408 heartbeat osd_stat(store_statfs(0x194de1000/0x0/0x1bfc00000, data 0x631ab40/0x653d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559955968 unmapped: 81313792 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 408 heartbeat osd_stat(store_statfs(0x194de1000/0x0/0x1bfc00000, data 0x631ab40/0x653d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559955968 unmapped: 81313792 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 408 heartbeat osd_stat(store_statfs(0x194de1000/0x0/0x1bfc00000, data 0x631ab40/0x653d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559964160 unmapped: 81305600 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 408 handle_osd_map epochs [408,409], i have 408, src has [1,409]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559972352 unmapped: 81297408 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x194ddd000/0x0/0x1bfc00000, data 0x631c67f/0x6540000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559742976 unmapped: 81526784 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6079926 data_alloc: 251658240 data_used: 46387200
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559742976 unmapped: 81526784 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559775744 unmapped: 81494016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559775744 unmapped: 81494016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559775744 unmapped: 81494016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x194ce1000/0x0/0x1bfc00000, data 0x641867f/0x663c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559775744 unmapped: 81494016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6090966 data_alloc: 251658240 data_used: 47108096
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.032366753s of 12.322899818s, submitted: 81
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559775744 unmapped: 81494016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559783936 unmapped: 81485824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559783936 unmapped: 81485824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559923200 unmapped: 81346560 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x194ce0000/0x0/0x1bfc00000, data 0x641867f/0x663c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6101928 data_alloc: 251658240 data_used: 48390144
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x194ce0000/0x0/0x1bfc00000, data 0x641867f/0x663c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x194ce2000/0x0/0x1bfc00000, data 0x641867f/0x663c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x194cdc000/0x0/0x1bfc00000, data 0x641e67f/0x6642000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6104846 data_alloc: 251658240 data_used: 48398336
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.983446121s of 11.115625381s, submitted: 15
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 ms_handle_reset con 0x5612823e0c00 session 0x5612832a2000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 ms_handle_reset con 0x5612810bac00 session 0x56127e612f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 ms_handle_reset con 0x5612822d7800 session 0x56127e798d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x194cdc000/0x0/0x1bfc00000, data 0x641e67f/0x6642000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x195dc9000/0x0/0x1bfc00000, data 0x533363c/0x5554000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5882971 data_alloc: 251658240 data_used: 42958848
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x195dc9000/0x0/0x1bfc00000, data 0x533363c/0x5554000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5891131 data_alloc: 251658240 data_used: 43483136
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 ms_handle_reset con 0x561280c30800 session 0x561282f9e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x195dc9000/0x0/0x1bfc00000, data 0x533363c/0x5554000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 409 handle_osd_map epochs [409,410], i have 409, src has [1,410]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 410 handle_osd_map epochs [410,410], i have 410, src has [1,410]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 410 ms_handle_reset con 0x561280e86800 session 0x56127ecee5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.808508873s of 10.125526428s, submitted: 52
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 410 ms_handle_reset con 0x5612810bac00 session 0x56127ea1e960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 410 ms_handle_reset con 0x561280c30800 session 0x5612813e65a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 410 ms_handle_reset con 0x5612822d7800 session 0x56127e613c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 410 handle_osd_map epochs [410,411], i have 410, src has [1,411]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 411 handle_osd_map epochs [411,411], i have 411, src has [1,411]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 411 ms_handle_reset con 0x5612823e0c00 session 0x56127dde54a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 77168640 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 411 handle_osd_map epochs [411,412], i have 411, src has [1,412]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 77570048 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 412 ms_handle_reset con 0x561295f42000 session 0x561280b0bc20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 77570048 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6016353 data_alloc: 268435456 data_used: 46612480
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 412 ms_handle_reset con 0x561280c30800 session 0x561280bd9680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 412 heartbeat osd_stat(store_statfs(0x194951000/0x0/0x1bfc00000, data 0x67a7bb7/0x69cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 412 ms_handle_reset con 0x5612810bac00 session 0x561282f9ed20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 412 ms_handle_reset con 0x5612822d7800 session 0x561280f80f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 77570048 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563707904 unmapped: 77561856 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563707904 unmapped: 77561856 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 412 handle_osd_map epochs [412,413], i have 412, src has [1,413]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 413 ms_handle_reset con 0x56127f49fc00 session 0x56127e9945a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563716096 unmapped: 77553664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 413 heartbeat osd_stat(store_statfs(0x195dbd000/0x0/0x1bfc00000, data 0x533a880/0x5560000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 413 ms_handle_reset con 0x5612810bb800 session 0x56127f810000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563716096 unmapped: 77553664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5877483 data_alloc: 268435456 data_used: 46600192
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 413 ms_handle_reset con 0x5612810bb800 session 0x56127e9a5860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563724288 unmapped: 77545472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563724288 unmapped: 77545472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563724288 unmapped: 77545472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.639679909s of 11.257122040s, submitted: 116
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 413 handle_osd_map epochs [413,414], i have 413, src has [1,414]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 414 handle_osd_map epochs [414,415], i have 414, src has [1,415]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x561286920c00 session 0x5612813e7a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 551043072 unmapped: 90226688 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 heartbeat osd_stat(store_statfs(0x197c5d000/0x0/0x1bfc00000, data 0x3499ff5/0x36bf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 551043072 unmapped: 90226688 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5494105 data_alloc: 234881024 data_used: 18702336
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x561280d06000 session 0x56127e9965a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x561280e87000 session 0x561280c0f860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x56128458b000 session 0x5612813e6960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538820608 unmapped: 102449152 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538820608 unmapped: 102449152 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 550281216 unmapped: 90988544 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x561280d06000 session 0x561280f803c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x561280e87000 session 0x56127f33e3c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x5612810bb800 session 0x561281c76780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x561286920c00 session 0x561281c77680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x5612862ac800 session 0x56127e8c6960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540090368 unmapped: 101179392 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 heartbeat osd_stat(store_statfs(0x1978f0000/0x0/0x1bfc00000, data 0x380af83/0x3a2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540090368 unmapped: 101179392 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5473510 data_alloc: 218103808 data_used: 8441856
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 heartbeat osd_stat(store_statfs(0x1978f0000/0x0/0x1bfc00000, data 0x380af83/0x3a2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540090368 unmapped: 101179392 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 heartbeat osd_stat(store_statfs(0x1978f0000/0x0/0x1bfc00000, data 0x380af83/0x3a2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540090368 unmapped: 101179392 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x56128056e800 session 0x5612813f72c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540090368 unmapped: 101179392 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x5612834c7c00 session 0x56127ea17e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x561280d07400 session 0x56127ea16d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.646598816s of 10.607299805s, submitted: 112
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 415 handle_osd_map epochs [415,416], i have 415, src has [1,416]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x5612806aa400 session 0x561280b0b0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540098560 unmapped: 101171200 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540098560 unmapped: 101171200 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5482439 data_alloc: 218103808 data_used: 8450048
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x1978c8000/0x0/0x1bfc00000, data 0x3830ad2/0x3a56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 101408768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x1978c8000/0x0/0x1bfc00000, data 0x3830ad2/0x3a56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x1978c8000/0x0/0x1bfc00000, data 0x3830ad2/0x3a56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5581319 data_alloc: 234881024 data_used: 20127744
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x1978c8000/0x0/0x1bfc00000, data 0x3830ad2/0x3a56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5581799 data_alloc: 234881024 data_used: 20140032
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x1978c8000/0x0/0x1bfc00000, data 0x3830ad2/0x3a56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.785026550s of 11.835221291s, submitted: 20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541966336 unmapped: 99303424 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544423936 unmapped: 96845824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x196c65000/0x0/0x1bfc00000, data 0x4485ad2/0x46ab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x196c65000/0x0/0x1bfc00000, data 0x4485ad2/0x46ab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5706971 data_alloc: 234881024 data_used: 21835776
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x196c65000/0x0/0x1bfc00000, data 0x4485ad2/0x46ab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x196c65000/0x0/0x1bfc00000, data 0x4485ad2/0x46ab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5698087 data_alloc: 234881024 data_used: 21835776
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x196c71000/0x0/0x1bfc00000, data 0x4487ad2/0x46ad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.116588593s of 13.764609337s, submitted: 132
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5698635 data_alloc: 234881024 data_used: 21843968
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x196c70000/0x0/0x1bfc00000, data 0x4488ad2/0x46ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x196c70000/0x0/0x1bfc00000, data 0x4488ad2/0x46ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5698635 data_alloc: 234881024 data_used: 21843968
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561289a6d400 session 0x561280f814a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x19681f000/0x0/0x1bfc00000, data 0x44c8b35/0x46ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.007428169s of 10.032960892s, submitted: 6
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x19681f000/0x0/0x1bfc00000, data 0x44c8b35/0x46ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5703798 data_alloc: 234881024 data_used: 21852160
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56127eb1dc00 session 0x561285cd1a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56128056dc00 session 0x561280c0f2c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x19681f000/0x0/0x1bfc00000, data 0x44c8b35/0x46ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x19681f000/0x0/0x1bfc00000, data 0x44c8b35/0x46ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5703442 data_alloc: 234881024 data_used: 21852160
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x19681f000/0x0/0x1bfc00000, data 0x44c8b35/0x46ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561285e01400 session 0x56128127f0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56127ea37000 session 0x56127e996780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561289a6cc00 session 0x561280b0a960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.614952087s of 10.691198349s, submitted: 9
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561289a6cc00 session 0x56127e5a5680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5704062 data_alloc: 234881024 data_used: 21852160
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56128056e800 session 0x56127f886b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x5612806aa400 session 0x56128329e960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x19681f000/0x0/0x1bfc00000, data 0x44c8b35/0x46ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544759808 unmapped: 96509952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561295f44400 session 0x56127e799680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5390803 data_alloc: 218103808 data_used: 8716288
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x1981e0000/0x0/0x1bfc00000, data 0x2b08b25/0x2d2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x5612834c7c00 session 0x5612813f7a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56127f49fc00 session 0x561280f803c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x1981e0000/0x0/0x1bfc00000, data 0x2b08b25/0x2d2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x5612834c7c00 session 0x561280c0f860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56128056e800 session 0x56127e9965a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5386741 data_alloc: 218103808 data_used: 8450048
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x198220000/0x0/0x1bfc00000, data 0x2ac8ac2/0x2ced000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.404942513s of 12.867709160s, submitted: 60
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561280c31800 session 0x56127e9945a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539107328 unmapped: 102162432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561280d06000 session 0x561280bd9680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x198221000/0x0/0x1bfc00000, data 0x2ac8ac2/0x2ced000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539107328 unmapped: 102162432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56127f49fc00 session 0x56127e798d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539107328 unmapped: 102162432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5401867 data_alloc: 234881024 data_used: 13656064
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x198220000/0x0/0x1bfc00000, data 0x2ac8ad2/0x2cee000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539107328 unmapped: 102162432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56127fa4b800 session 0x5612832a2000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539107328 unmapped: 102162432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561295f1f800 session 0x5612832a2b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x5612841f8c00 session 0x561280c0e000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56128a0d5000 session 0x561282f9f0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539410432 unmapped: 101859328 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x197679000/0x0/0x1bfc00000, data 0x366fad2/0x3895000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539410432 unmapped: 101859328 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 416 handle_osd_map epochs [416,417], i have 416, src has [1,417]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127f49fc00 session 0x56127ea1fa40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x197674000/0x0/0x1bfc00000, data 0x367178d/0x3899000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [0,0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544006144 unmapped: 97263616 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5540474 data_alloc: 234881024 data_used: 13664256
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127fa4b800 session 0x56127ea19860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x5612841f8c00 session 0x5612832a21e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561295f1f800 session 0x5612832a30e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127ea5d000 session 0x561280bd8960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127f49fc00 session 0x561280c0f0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x197674000/0x0/0x1bfc00000, data 0x367178d/0x3899000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540819456 unmapped: 100450304 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540819456 unmapped: 100450304 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561286219800 session 0x56127e613c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561284537400 session 0x56127e612780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540819456 unmapped: 100450304 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561285e01400 session 0x56127e613a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.703686714s of 11.037638664s, submitted: 92
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56128056c000 session 0x561281bea960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540819456 unmapped: 100450304 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561284537400 session 0x56127ea16960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127f49fc00 session 0x5612813e63c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540835840 unmapped: 100433920 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5577375 data_alloc: 234881024 data_used: 13672448
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540712960 unmapped: 100556800 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x196e5c000/0x0/0x1bfc00000, data 0x3e8780f/0x40b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541147136 unmapped: 100122624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541147136 unmapped: 100122624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541147136 unmapped: 100122624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x5612862ae800 session 0x56127ea1e780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56128fb9fc00 session 0x56127e5a41e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541147136 unmapped: 100122624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5636415 data_alloc: 234881024 data_used: 21962752
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x196e5c000/0x0/0x1bfc00000, data 0x3e8780f/0x40b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561285e00400 session 0x56127e9970e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127f49fc00 session 0x56128329e3c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541147136 unmapped: 100122624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x196e5c000/0x0/0x1bfc00000, data 0x3e8780f/0x40b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541147136 unmapped: 100122624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543899648 unmapped: 97370112 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x196e37000/0x0/0x1bfc00000, data 0x3eab81f/0x40d7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 548061184 unmapped: 93208576 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x196e37000/0x0/0x1bfc00000, data 0x3eab81f/0x40d7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 548061184 unmapped: 93208576 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5728177 data_alloc: 251658240 data_used: 34103296
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.103850365s of 12.316456795s, submitted: 13
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 548061184 unmapped: 93208576 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561284537400 session 0x56127ea1eb40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561285e00400 session 0x561280fa3c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554704896 unmapped: 86564864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56128056dc00 session 0x56127ea17680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555008000 unmapped: 86261760 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x195ca5000/0x0/0x1bfc00000, data 0x4ed880f/0x5103000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555286528 unmapped: 85983232 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561295f1f800 session 0x56127e8d43c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555286528 unmapped: 85983232 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5856242 data_alloc: 251658240 data_used: 34967552
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127f49fc00 session 0x561282a76000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555286528 unmapped: 85983232 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127f39bc00 session 0x561280f814a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555286528 unmapped: 85983232 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 417 handle_osd_map epochs [417,418], i have 417, src has [1,418]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 418 ms_handle_reset con 0x5612810bb400 session 0x561280f81e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 418 heartbeat osd_stat(store_statfs(0x195ca6000/0x0/0x1bfc00000, data 0x4ed87ff/0x5102000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555302912 unmapped: 85966848 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 418 ms_handle_reset con 0x561295f42000 session 0x56127e612960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 418 ms_handle_reset con 0x561295f1e400 session 0x56127ea18f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554328064 unmapped: 86941696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554328064 unmapped: 86941696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5682012 data_alloc: 234881024 data_used: 22839296
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554328064 unmapped: 86941696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.493256569s of 11.112817764s, submitted: 175
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554328064 unmapped: 86941696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 418 ms_handle_reset con 0x5612834c6800 session 0x56127f33e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 418 ms_handle_reset con 0x5612804a6400 session 0x561282f9eb40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 418 heartbeat osd_stat(store_statfs(0x196852000/0x0/0x1bfc00000, data 0x433343a/0x455c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552828928 unmapped: 88440832 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 418 ms_handle_reset con 0x56127f39a400 session 0x56127f8103c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552828928 unmapped: 88440832 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552828928 unmapped: 88440832 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5689400 data_alloc: 234881024 data_used: 23822336
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552828928 unmapped: 88440832 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552828928 unmapped: 88440832 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 418 ms_handle_reset con 0x561285603800 session 0x56127ecee780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552828928 unmapped: 88440832 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 418 handle_osd_map epochs [418,419], i have 418, src has [1,419]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 418 handle_osd_map epochs [419,419], i have 419, src has [1,419]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 418 heartbeat osd_stat(store_statfs(0x196851000/0x0/0x1bfc00000, data 0x433349c/0x455d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [0,0,0,0,1,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 418 handle_osd_map epochs [419,419], i have 419, src has [1,419]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x56127f49e800 session 0x56128329ed20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561285e01400 session 0x5612813e74a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561286219800 session 0x5612813f65a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 553877504 unmapped: 87392256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x196851000/0x0/0x1bfc00000, data 0x433349c/0x455d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x56127f39a400 session 0x56127f2bba40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x5612834c6800 session 0x56127e8d4b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x1964cd000/0x0/0x1bfc00000, data 0x46b4f88/0x48e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [0,0,0,0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552853504 unmapped: 88416256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5718846 data_alloc: 234881024 data_used: 23830528
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x5612804a6400 session 0x56128329fe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x19798d000/0x0/0x1bfc00000, data 0x2e4df88/0x3079000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5471915 data_alloc: 234881024 data_used: 14663680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x197d37000/0x0/0x1bfc00000, data 0x2e4df16/0x3077000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x197d37000/0x0/0x1bfc00000, data 0x2e4df16/0x3077000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549609472 unmapped: 91660288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549609472 unmapped: 91660288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5471915 data_alloc: 234881024 data_used: 14663680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549609472 unmapped: 91660288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x197d37000/0x0/0x1bfc00000, data 0x2e4df16/0x3077000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549609472 unmapped: 91660288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561285e01c00 session 0x561281beba40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549609472 unmapped: 91660288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x56128a0d5400 session 0x561282f9e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549609472 unmapped: 91660288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561280e91c00 session 0x56127f7185a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.033617020s of 22.532424927s, submitted: 104
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561284536800 session 0x5612832a3c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x197d37000/0x0/0x1bfc00000, data 0x2e4df16/0x3077000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5478936 data_alloc: 234881024 data_used: 14663680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x197d0b000/0x0/0x1bfc00000, data 0x2e77f49/0x30a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5505336 data_alloc: 234881024 data_used: 18341888
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x197d0b000/0x0/0x1bfc00000, data 0x2e77f49/0x30a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5505336 data_alloc: 234881024 data_used: 18341888
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x197d0b000/0x0/0x1bfc00000, data 0x2e77f49/0x30a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.706553459s of 12.811210632s, submitted: 7
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552312832 unmapped: 88956928 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561284536c00 session 0x561281c77680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561280e91c00 session 0x561282f9ef00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561284536800 session 0x5612813f72c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561284536c00 session 0x56127e8d41e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552476672 unmapped: 88793088 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561285e01c00 session 0x561282f9f860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x56128a0d5400 session 0x5612813e7860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561280e91c00 session 0x5612832a3a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561284536800 session 0x561280bd92c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561284536c00 session 0x56127f886d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5608619 data_alloc: 234881024 data_used: 18997248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x1971a4000/0x0/0x1bfc00000, data 0x39ddf59/0x3c0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x1971a4000/0x0/0x1bfc00000, data 0x39ddf59/0x3c0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5607251 data_alloc: 234881024 data_used: 18997248
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x56128056e800 session 0x56127f3de780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x1971a1000/0x0/0x1bfc00000, data 0x39e0f59/0x3c0d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 handle_osd_map epochs [419,420], i have 419, src has [1,420]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.388351440s of 10.914187431s, submitted: 75
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552493056 unmapped: 88776704 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 419 handle_osd_map epochs [420,420], i have 420, src has [1,420]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 420 ms_handle_reset con 0x561284536400 session 0x5612832a3a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 420 heartbeat osd_stat(store_statfs(0x1971a1000/0x0/0x1bfc00000, data 0x39e0f59/0x3c0d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552591360 unmapped: 88678400 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 420 handle_osd_map epochs [420,421], i have 420, src has [1,421]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x56127fa4b800 session 0x561280fa2d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5646425 data_alloc: 234881024 data_used: 23064576
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552599552 unmapped: 88670208 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552607744 unmapped: 88662016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552607744 unmapped: 88662016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x197198000/0x0/0x1bfc00000, data 0x39e486d/0x3c14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552607744 unmapped: 88662016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552607744 unmapped: 88662016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x197198000/0x0/0x1bfc00000, data 0x39e486d/0x3c14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5646425 data_alloc: 234881024 data_used: 23064576
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552607744 unmapped: 88662016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552607744 unmapped: 88662016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x197198000/0x0/0x1bfc00000, data 0x39e486d/0x3c14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x56128544c400 session 0x56127ea1f0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x5612810bb800 session 0x561282f9f860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552607744 unmapped: 88662016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552615936 unmapped: 88653824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552615936 unmapped: 88653824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.998759270s of 11.290341377s, submitted: 9
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5697957 data_alloc: 234881024 data_used: 24182784
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555933696 unmapped: 85336064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555941888 unmapped: 85327872 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #59. Immutable memtables: 15.
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558292992 unmapped: 82976768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195966000/0x0/0x1bfc00000, data 0x40698cf/0x429a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x561285602000 session 0x561281c77680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x56128b8bc400 session 0x5612832a3c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558292992 unmapped: 82976768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195962000/0x0/0x1bfc00000, data 0x406d8cf/0x429e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558309376 unmapped: 82960384 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x56127fa4b800 session 0x561282f9e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5707589 data_alloc: 234881024 data_used: 24141824
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x5612810bb800 session 0x561281beba40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558325760 unmapped: 82944000 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558325760 unmapped: 82944000 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558473216 unmapped: 82796544 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558481408 unmapped: 82788352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195949000/0x0/0x1bfc00000, data 0x4092902/0x42c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195949000/0x0/0x1bfc00000, data 0x4092902/0x42c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558489600 unmapped: 82780160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x19593c000/0x0/0x1bfc00000, data 0x409f902/0x42d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x19593c000/0x0/0x1bfc00000, data 0x409f902/0x42d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5715485 data_alloc: 234881024 data_used: 24166400
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558497792 unmapped: 82771968 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.514921188s of 11.821525574s, submitted: 162
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558497792 unmapped: 82771968 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x56128b8bcc00 session 0x561285cd01e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558497792 unmapped: 82771968 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558497792 unmapped: 82771968 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x19593b000/0x0/0x1bfc00000, data 0x40a0902/0x42d3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558505984 unmapped: 82763776 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5729985 data_alloc: 234881024 data_used: 24281088
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558505984 unmapped: 82763776 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558505984 unmapped: 82763776 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558505984 unmapped: 82763776 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x19593a000/0x0/0x1bfc00000, data 0x42cb902/0x42d4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558505984 unmapped: 82763776 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557793280 unmapped: 83476480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5766101 data_alloc: 234881024 data_used: 25194496
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557793280 unmapped: 83476480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557793280 unmapped: 83476480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557793280 unmapped: 83476480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557801472 unmapped: 83468288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.717605591s of 12.977429390s, submitted: 17
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557801472 unmapped: 83468288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:20:58.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5776481 data_alloc: 234881024 data_used: 25587712
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5781553 data_alloc: 234881024 data_used: 26611712
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558268416 unmapped: 83001344 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5788209 data_alloc: 251658240 data_used: 28356608
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558268416 unmapped: 83001344 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.078017235s of 11.319710732s, submitted: 34
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x56127ea36c00 session 0x56128329eb40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [0,0,0,0,0,0,0,6])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557383680 unmapped: 83886080 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x561289ecb400 session 0x56127ea183c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557400064 unmapped: 83869696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x196167000/0x0/0x1bfc00000, data 0x3a9f8f2/0x3aa7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557400064 unmapped: 83869696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x196167000/0x0/0x1bfc00000, data 0x3a9f8f2/0x3aa7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557400064 unmapped: 83869696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x196167000/0x0/0x1bfc00000, data 0x3a9f8f2/0x3aa7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5651843 data_alloc: 234881024 data_used: 21692416
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557400064 unmapped: 83869696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557400064 unmapped: 83869696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x561285e00000 session 0x56128329ed20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x5612862acc00 session 0x56127f8b5a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557416448 unmapped: 83853312 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557416448 unmapped: 83853312 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x561280e91c00 session 0x561281bea5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557416448 unmapped: 83853312 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x196168000/0x0/0x1bfc00000, data 0x3a9f8e2/0x3aa6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5648874 data_alloc: 234881024 data_used: 21688320
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557416448 unmapped: 83853312 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557416448 unmapped: 83853312 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.224549294s of 10.959377289s, submitted: 57
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x196168000/0x0/0x1bfc00000, data 0x3a9f8bf/0x3aa5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x561289ecbc00 session 0x56127ea18b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557441024 unmapped: 83828736 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557449216 unmapped: 83820544 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557449216 unmapped: 83820544 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x561295f1f400 session 0x56127f2bb0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5621088 data_alloc: 234881024 data_used: 21508096
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 83812352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x19641c000/0x0/0x1bfc00000, data 0x37ec85d/0x37f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 83812352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x561289eca000 session 0x561280fa2000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 83812352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 handle_osd_map epochs [421,422], i have 421, src has [1,422]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x56128b8bec00 session 0x56127e996000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 83804160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 422 ms_handle_reset con 0x5612862af800 session 0x56127e9961e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 422 ms_handle_reset con 0x5612823dec00 session 0x56128127f680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 422 heartbeat osd_stat(store_statfs(0x19641b000/0x0/0x1bfc00000, data 0x35c34a8/0x37f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 83804160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5612415 data_alloc: 234881024 data_used: 21512192
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 83804160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 422 ms_handle_reset con 0x561295f1e000 session 0x561282a76000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 83804160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 83804160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 422 handle_osd_map epochs [422,423], i have 422, src has [1,423]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 6.168501377s of 11.982603073s, submitted: 52
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 422 handle_osd_map epochs [423,423], i have 423, src has [1,423]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 83804160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 423 ms_handle_reset con 0x5612823dec00 session 0x56128329e3c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 423 heartbeat osd_stat(store_statfs(0x196447000/0x0/0x1bfc00000, data 0x3599475/0x37c6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 423 ms_handle_reset con 0x5612862af800 session 0x561285cd0f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554647552 unmapped: 86622208 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5483119 data_alloc: 234881024 data_used: 14696448
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554647552 unmapped: 86622208 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554655744 unmapped: 86614016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 423 heartbeat osd_stat(store_statfs(0x196f0b000/0x0/0x1bfc00000, data 0x2ad4fa5/0x2d02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554663936 unmapped: 86605824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 423 heartbeat osd_stat(store_statfs(0x196f0b000/0x0/0x1bfc00000, data 0x2ad4fa5/0x2d02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554663936 unmapped: 86605824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 423 handle_osd_map epochs [423,424], i have 423, src has [1,424]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 424 ms_handle_reset con 0x56127f49fc00 session 0x561280c0fa40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554663936 unmapped: 86605824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5486093 data_alloc: 234881024 data_used: 14696448
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554663936 unmapped: 86605824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 424 ms_handle_reset con 0x561280e87000 session 0x561281beb860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 424 heartbeat osd_stat(store_statfs(0x196f08000/0x0/0x1bfc00000, data 0x2ad6c52/0x2d05000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556236800 unmapped: 85032960 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556236800 unmapped: 85032960 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 424 handle_osd_map epochs [424,425], i have 424, src has [1,425]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556253184 unmapped: 85016576 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556261376 unmapped: 85008384 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5531035 data_alloc: 234881024 data_used: 14704640
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556261376 unmapped: 85008384 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56128b8bfc00 session 0x56127e8d41e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127f49fc00 session 0x561280c0e780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556261376 unmapped: 85008384 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x196abe000/0x0/0x1bfc00000, data 0x2f1f791/0x314f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561280e87000 session 0x561282a774a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556261376 unmapped: 85008384 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.842556000s of 14.146264076s, submitted: 54
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x5612823dec00 session 0x561282f9f4a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556269568 unmapped: 85000192 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561284536400 session 0x56127ecee5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561289ecbc00 session 0x561282f9f4a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554803200 unmapped: 86466560 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5553854 data_alloc: 234881024 data_used: 17833984
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555171840 unmapped: 86097920 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561286920400 session 0x56127ea183c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555171840 unmapped: 86097920 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555171840 unmapped: 86097920 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x196abd000/0x0/0x1bfc00000, data 0x2f1f7c4/0x3151000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555171840 unmapped: 86097920 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561285e00800 session 0x56128329eb40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561289eca400 session 0x561285cd01e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 85860352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561289eca000 session 0x561282f9e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56128b8be800 session 0x5612832a3a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595329 data_alloc: 234881024 data_used: 17776640
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 85860352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 85860352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x196642000/0x0/0x1bfc00000, data 0x3399826/0x35cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 85860352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 85860352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 85860352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595329 data_alloc: 234881024 data_used: 17776640
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 85860352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.714643478s of 13.027671814s, submitted: 42
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x19612b000/0x0/0x1bfc00000, data 0x3828826/0x3a5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,3])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560390144 unmapped: 80879616 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558743552 unmapped: 82526208 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559800320 unmapped: 81469440 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560029696 unmapped: 81240064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5716325 data_alloc: 234881024 data_used: 18632704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560029696 unmapped: 81240064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560029696 unmapped: 81240064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x5612834c7000 session 0x56127f886d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x1958cb000/0x0/0x1bfc00000, data 0x4110826/0x4343000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559931392 unmapped: 81338368 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559931392 unmapped: 81338368 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6001.0 total, 600.0 interval#012Cumulative writes: 77K writes, 314K keys, 77K commit groups, 1.0 writes per commit group, ingest: 0.31 GB, 0.05 MB/s#012Cumulative WAL: 77K writes, 28K syncs, 2.70 writes per sync, written: 0.31 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5288 writes, 20K keys, 5288 commit groups, 1.0 writes per commit group, ingest: 21.34 MB, 0.04 MB/s#012Interval WAL: 5288 writes, 2082 syncs, 2.54 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6001.0 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x56127d005770#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6001.0 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x56127d005770#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6001.0 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 me
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560037888 unmapped: 81231872 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5733101 data_alloc: 234881024 data_used: 18702336
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560037888 unmapped: 81231872 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.124117851s of 10.161478043s, submitted: 137
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561094656 unmapped: 80175104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195897000/0x0/0x1bfc00000, data 0x4144826/0x4377000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561094656 unmapped: 80175104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x5612862af800 session 0x561282f9ed20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561295f1f000 session 0x561280fa2780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561094656 unmapped: 80175104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561285e01800 session 0x5612832a23c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561094656 unmapped: 80175104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x5612862acc00 session 0x561285cd10e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5639210 data_alloc: 234881024 data_used: 14770176
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127fa4b000 session 0x56127ea16d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561094656 unmapped: 80175104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127ea35c00 session 0x56128329fe00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561094656 unmapped: 80175104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127e5c0400 session 0x561280b0bc20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127ea3bc00 session 0x561280bd83c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561250304 unmapped: 80019456 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x1961d4000/0x0/0x1bfc00000, data 0x3807826/0x3a3a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561250304 unmapped: 80019456 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x1961d4000/0x0/0x1bfc00000, data 0x3807826/0x3a3a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563437568 unmapped: 77832192 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5753436 data_alloc: 234881024 data_used: 23560192
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564355072 unmapped: 76914688 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.122691154s of 10.414531708s, submitted: 119
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564363264 unmapped: 76906496 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195c0b000/0x0/0x1bfc00000, data 0x3dc8826/0x3ffb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [0,0,0,0,0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564600832 unmapped: 76668928 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195b83000/0x0/0x1bfc00000, data 0x3e4f826/0x4082000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564600832 unmapped: 76668928 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564600832 unmapped: 76668928 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195b83000/0x0/0x1bfc00000, data 0x3e4f826/0x4082000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5769892 data_alloc: 234881024 data_used: 23609344
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564600832 unmapped: 76668928 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564600832 unmapped: 76668928 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195b83000/0x0/0x1bfc00000, data 0x3e4f826/0x4082000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [0,0,0,0,0,0,1])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564600832 unmapped: 76668928 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563929088 unmapped: 77340672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127ea3b800 session 0x56127ecefa40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563929088 unmapped: 77340672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561286f37c00 session 0x56127f8b4780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127ea36000 session 0x561281c76780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5826668 data_alloc: 234881024 data_used: 23732224
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564879360 unmapped: 76390400 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.159999847s of 10.091464996s, submitted: 89
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565927936 unmapped: 75341824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x19536c000/0x0/0x1bfc00000, data 0x4667826/0x489a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565256192 unmapped: 76013568 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565264384 unmapped: 76005376 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565264384 unmapped: 76005376 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5834934 data_alloc: 234881024 data_used: 24035328
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565264384 unmapped: 76005376 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565264384 unmapped: 76005376 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195361000/0x0/0x1bfc00000, data 0x467a826/0x48ad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565264384 unmapped: 76005376 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565272576 unmapped: 75997184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565272576 unmapped: 75997184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195361000/0x0/0x1bfc00000, data 0x467a826/0x48ad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5834950 data_alloc: 234881024 data_used: 24035328
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565272576 unmapped: 75997184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195361000/0x0/0x1bfc00000, data 0x467a826/0x48ad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565288960 unmapped: 75980800 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.978540421s of 11.017058372s, submitted: 14
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561285e01800 session 0x56127e8d4d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127f49c400 session 0x56127f886780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565297152 unmapped: 75972608 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x5612823e1c00 session 0x561280b0a5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554508288 unmapped: 86761472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554532864 unmapped: 86736896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5633869 data_alloc: 234881024 data_used: 14770176
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554467328 unmapped: 86802432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554491904 unmapped: 86777856 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x196001000/0x0/0x1bfc00000, data 0x35cc7f3/0x37fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x196001000/0x0/0x1bfc00000, data 0x35cc7f3/0x37fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561280e86c00 session 0x561280fa30e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554532864 unmapped: 86736896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x5612804a6000 session 0x561282f9e000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554532864 unmapped: 86736896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x5612862ae400 session 0x561281beba40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561285603800 session 0x56127e9974a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5640039 data_alloc: 234881024 data_used: 14770176
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fdc000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5641451 data_alloc: 234881024 data_used: 14827520
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554557440 unmapped: 86712320 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554557440 unmapped: 86712320 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fdc000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554557440 unmapped: 86712320 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5641451 data_alloc: 234881024 data_used: 14827520
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554557440 unmapped: 86712320 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554557440 unmapped: 86712320 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fdc000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554557440 unmapped: 86712320 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.556997299s of 20.875722885s, submitted: 282
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554565632 unmapped: 86704128 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fda000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554565632 unmapped: 86704128 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5656051 data_alloc: 234881024 data_used: 15351808
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fda000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5656051 data_alloc: 234881024 data_used: 15347712
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fdc000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fdc000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fdc000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5661011 data_alloc: 234881024 data_used: 16093184
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 425 handle_osd_map epochs [425,426], i have 425, src has [1,426]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.668267250s of 13.934776306s, submitted: 10
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195fdc000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x561285e01c00 session 0x561282a76f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554516480 unmapped: 86753280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554516480 unmapped: 86753280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x561280e86c00 session 0x56127f887680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612804a6000 session 0x56127e997a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554516480 unmapped: 86753280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195fcd000/0x0/0x1bfc00000, data 0x36a84d1/0x3830000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5676383 data_alloc: 234881024 data_used: 16031744
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554516480 unmapped: 86753280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554516480 unmapped: 86753280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554516480 unmapped: 86753280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554532864 unmapped: 86736896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56128056d000 session 0x56127e8c7c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612806aa400 session 0x56127e997e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x561286219c00 session 0x561282f9e960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612804a6000 session 0x561280b0b0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56128056d000 session 0x561281beab40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554541056 unmapped: 86728704 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5719484 data_alloc: 234881024 data_used: 16035840
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554541056 unmapped: 86728704 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195aef000/0x0/0x1bfc00000, data 0x3b874d1/0x3d0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554541056 unmapped: 86728704 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554541056 unmapped: 86728704 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x561284537800 session 0x56128127f4a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554541056 unmapped: 86728704 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56127fa4a800 session 0x561282f9e960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612841f9c00 session 0x56127e8c7c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.336693764s of 13.468007088s, submitted: 36
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56127fa4a800 session 0x56127e997a40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5719484 data_alloc: 234881024 data_used: 16035840
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554557440 unmapped: 86712320 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195aef000/0x0/0x1bfc00000, data 0x3b874d1/0x3d0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554565632 unmapped: 86704128 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554573824 unmapped: 86695936 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56128056d000 session 0x5612813e72c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554573824 unmapped: 86695936 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x561284537800 session 0x561281c77c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554573824 unmapped: 86695936 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195ae9000/0x0/0x1bfc00000, data 0x3b914d1/0x3d15000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56127e5c1c00 session 0x5612813e7c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56128458ac00 session 0x561285cd1c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5751069 data_alloc: 234881024 data_used: 19804160
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195ae9000/0x0/0x1bfc00000, data 0x3b914d1/0x3d15000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554582016 unmapped: 86687744 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554590208 unmapped: 86679552 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554590208 unmapped: 86679552 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195ae8000/0x0/0x1bfc00000, data 0x3b914e1/0x3d16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195ae8000/0x0/0x1bfc00000, data 0x3b914e1/0x3d16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554598400 unmapped: 86671360 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554598400 unmapped: 86671360 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5751653 data_alloc: 234881024 data_used: 19816448
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554598400 unmapped: 86671360 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554598400 unmapped: 86671360 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554598400 unmapped: 86671360 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.716708183s of 12.770442009s, submitted: 14
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557015040 unmapped: 84254720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x1956b0000/0x0/0x1bfc00000, data 0x3fc84e1/0x414d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556990464 unmapped: 84279296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5810889 data_alloc: 234881024 data_used: 19959808
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556990464 unmapped: 84279296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556990464 unmapped: 84279296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556990464 unmapped: 84279296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556990464 unmapped: 84279296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556351488 unmapped: 84918272 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x19554f000/0x0/0x1bfc00000, data 0x412a4e1/0x42af000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5823081 data_alloc: 234881024 data_used: 20680704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556351488 unmapped: 84918272 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x19552b000/0x0/0x1bfc00000, data 0x414e4e1/0x42d3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556351488 unmapped: 84918272 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556351488 unmapped: 84918272 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x19552b000/0x0/0x1bfc00000, data 0x414e4e1/0x42d3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556359680 unmapped: 84910080 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556359680 unmapped: 84910080 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.680471420s of 12.101355553s, submitted: 82
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x19552b000/0x0/0x1bfc00000, data 0x414e4e1/0x42d3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5823465 data_alloc: 234881024 data_used: 20676608
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557408256 unmapped: 83861504 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557408256 unmapped: 83861504 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557408256 unmapped: 83861504 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557416448 unmapped: 83853312 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56127e5c1c00 session 0x561280b0ad20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56127fa4a800 session 0x56128127e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612862af800 session 0x56127f7192c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557424640 unmapped: 83845120 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195525000/0x0/0x1bfc00000, data 0x41544e1/0x42d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5823276 data_alloc: 234881024 data_used: 20680704
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558473216 unmapped: 82796544 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558473216 unmapped: 82796544 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612804a6000 session 0x56127f887680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558407680 unmapped: 82862080 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558407680 unmapped: 82862080 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195513000/0x0/0x1bfc00000, data 0x43274d1/0x42eb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558407680 unmapped: 82862080 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612822d6400 session 0x56127ea1fa40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56127e5c1c00 session 0x561280c0f0e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5827202 data_alloc: 234881024 data_used: 20672512
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558432256 unmapped: 82837504 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56127fa4a800 session 0x561282a76f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.686823845s of 11.059775352s, submitted: 71
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x1955c9000/0x0/0x1bfc00000, data 0x427146f/0x4234000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612862af800 session 0x56127e8c74a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558440448 unmapped: 82829312 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 handle_osd_map epochs [426,427], i have 426, src has [1,427]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 426 handle_osd_map epochs [427,427], i have 427, src has [1,427]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 427 ms_handle_reset con 0x5612804a6000 session 0x561280c0e3c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 427 ms_handle_reset con 0x5612823de400 session 0x56127f887e00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 427 ms_handle_reset con 0x56127e5c1c00 session 0x5612813e65a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 427 ms_handle_reset con 0x56127fa4a800 session 0x56127e613c20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558465024 unmapped: 82804736 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 427 heartbeat osd_stat(store_statfs(0x1955d4000/0x0/0x1bfc00000, data 0x3ff426e/0x4228000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558465024 unmapped: 82804736 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 427 ms_handle_reset con 0x561286219800 session 0x56127e8e9680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 427 ms_handle_reset con 0x5612862ad400 session 0x561280b0b4a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558489600 unmapped: 82780160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [O-0] New memtable created with log file: #60. Immutable memtables: 0.
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 427 ms_handle_reset con 0x5612804a6000 session 0x5612832a2780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5808686 data_alloc: 234881024 data_used: 20566016
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559554560 unmapped: 81715200 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 427 heartbeat osd_stat(store_statfs(0x19445a000/0x0/0x1bfc00000, data 0x3fd024b/0x4203000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559562752 unmapped: 81707008 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559562752 unmapped: 81707008 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 427 handle_osd_map epochs [427,428], i have 427, src has [1,428]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559562752 unmapped: 81707008 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 428 ms_handle_reset con 0x56127e5c1c00 session 0x56128329f680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 428 handle_osd_map epochs [428,429], i have 428, src has [1,429]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 429 handle_osd_map epochs [429,429], i have 429, src has [1,429]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 429 ms_handle_reset con 0x5612804a6000 session 0x56127f2ba1e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 429 ms_handle_reset con 0x56127fa4a800 session 0x56128329f860
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560619520 unmapped: 80650240 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 429 ms_handle_reset con 0x561280c31800 session 0x56127f3de3c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5675450 data_alloc: 234881024 data_used: 15044608
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558989312 unmapped: 82280448 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558989312 unmapped: 82280448 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 429 heartbeat osd_stat(store_statfs(0x194f4d000/0x0/0x1bfc00000, data 0x34da9d5/0x370f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 429 heartbeat osd_stat(store_statfs(0x194f4d000/0x0/0x1bfc00000, data 0x34da9d5/0x370f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5675450 data_alloc: 234881024 data_used: 15044608
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 429 heartbeat osd_stat(store_statfs(0x194f4d000/0x0/0x1bfc00000, data 0x34da9d5/0x370f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.988192558s of 15.492597580s, submitted: 96
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 429 handle_osd_map epochs [429,430], i have 429, src has [1,430]
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5686916 data_alloc: 234881024 data_used: 15859712
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559005696 unmapped: 82264064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194f4b000/0x0/0x1bfc00000, data 0x34dc514/0x3712000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559005696 unmapped: 82264064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559005696 unmapped: 82264064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194f46000/0x0/0x1bfc00000, data 0x34e2514/0x3718000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559005696 unmapped: 82264064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559005696 unmapped: 82264064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612862af800 session 0x561282a774a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563896 data_alloc: 234881024 data_used: 11071488
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612862af800 session 0x56127ea18b40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559030272 unmapped: 82239488 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559030272 unmapped: 82239488 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559030272 unmapped: 82239488 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563896 data_alloc: 234881024 data_used: 11071488
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5564056 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5564056 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559054848 unmapped: 82214912 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5564056 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559054848 unmapped: 82214912 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559054848 unmapped: 82214912 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559054848 unmapped: 82214912 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612806aa400 session 0x561280c0eb40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56127e5c2800 session 0x561280c0e5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612834c6800 session 0x561280fa2000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612841f9400 session 0x5612832a30e0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.846471786s of 31.994029999s, submitted: 58
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56127e5c2800 session 0x56127f4c2f00
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612806aa400 session 0x561282f9fc20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612834c6800 session 0x561281bea5a0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612862af800 session 0x561282f9e3c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612822d6000 session 0x561282f9f680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d8000/0x0/0x1bfc00000, data 0x324f524/0x3486000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5623978 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561295f1fc00 session 0x561282f9fa40
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d8000/0x0/0x1bfc00000, data 0x324f524/0x3486000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612862afc00 session 0x56128329e780
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612804a7c00 session 0x56127e798000
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56128b8bec00 session 0x56127e6132c0
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5627085 data_alloc: 234881024 data_used: 11083776
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d6000/0x0/0x1bfc00000, data 0x324f557/0x3488000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d6000/0x0/0x1bfc00000, data 0x324f557/0x3488000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d6000/0x0/0x1bfc00000, data 0x324f557/0x3488000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5674605 data_alloc: 234881024 data_used: 17743872
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d6000/0x0/0x1bfc00000, data 0x324f557/0x3488000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d6000/0x0/0x1bfc00000, data 0x324f557/0x3488000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d6000/0x0/0x1bfc00000, data 0x324f557/0x3488000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5674605 data_alloc: 234881024 data_used: 17743872
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.016057968s of 19.102005005s, submitted: 18
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560381952 unmapped: 80887808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559685632 unmapped: 81584128 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5729831 data_alloc: 234881024 data_used: 18255872
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c05000/0x0/0x1bfc00000, data 0x381f557/0x3a58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c05000/0x0/0x1bfc00000, data 0x381f557/0x3a58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5729423 data_alloc: 234881024 data_used: 18276352
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612823e0800 session 0x5612813e7680
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56127ea38c00 session 0x56127e8d4d20
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c00000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5729423 data_alloc: 234881024 data_used: 18276352
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c00000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559628288 unmapped: 81641472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559628288 unmapped: 81641472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561284536000 session 0x561282f9e960
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5729583 data_alloc: 234881024 data_used: 18280448
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559628288 unmapped: 81641472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559628288 unmapped: 81641472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559628288 unmapped: 81641472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c00000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c00000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5730223 data_alloc: 234881024 data_used: 18345984
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c00000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:58 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5730223 data_alloc: 234881024 data_used: 18345984
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c00000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.599523544s of 31.511573792s, submitted: 47
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194bfe000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5733487 data_alloc: 234881024 data_used: 18886656
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194bfe000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5733647 data_alloc: 234881024 data_used: 18890752
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561286f36400 session 0x561281beab40
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56128458b800 session 0x56128329e5a0
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c02000/0x0/0x1bfc00000, data 0x3824547/0x3a5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56127ea38000 session 0x56127f886d20
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5574928 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5574928 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5574928 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5574928 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5574928 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561280c31800 session 0x56127f3eef00
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56127ea38400 session 0x5612813e7e00
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612834c7000 session 0x5612813e7c20
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561289a6cc00 session 0x56127f887680
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 32.226699829s of 32.331203461s, submitted: 22
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612804a7000 session 0x561280fa3a40
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56127ea38400 session 0x561280f80f00
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194814000/0x0/0x1bfc00000, data 0x3c1254d/0x3e4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561280c31800 session 0x561285cd1860
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612834c7000 session 0x56128127fe00
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561289a6cc00 session 0x561280bd8960
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194814000/0x0/0x1bfc00000, data 0x3c1254d/0x3e4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 85934080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 85934080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 85934080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194814000/0x0/0x1bfc00000, data 0x3c12586/0x3e4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 85934080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5712775 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194814000/0x0/0x1bfc00000, data 0x3c12586/0x3e4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561285603c00 session 0x5612832a3a40
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 85934080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56127ea38400 session 0x56127f2bad20
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 85934080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561280c31800 session 0x561281c76d20
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194814000/0x0/0x1bfc00000, data 0x3c12586/0x3e4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612834c7000 session 0x561280b0ba40
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 85901312 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 85901312 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194812000/0x0/0x1bfc00000, data 0x3c125b9/0x3e4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5844871 data_alloc: 251658240 data_used: 29020160
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194812000/0x0/0x1bfc00000, data 0x3c125b9/0x3e4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5844871 data_alloc: 251658240 data_used: 29020160
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.580867767s of 18.858497620s, submitted: 53
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194812000/0x0/0x1bfc00000, data 0x3c125b9/0x3e4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563322880 unmapped: 81625088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5973445 data_alloc: 251658240 data_used: 30908416
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x193887000/0x0/0x1bfc00000, data 0x4b975b9/0x4dd1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563765248 unmapped: 81182720 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x19385c000/0x0/0x1bfc00000, data 0x4bba5b9/0x4df4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x19385c000/0x0/0x1bfc00000, data 0x4bba5b9/0x4df4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x19385c000/0x0/0x1bfc00000, data 0x4bba5b9/0x4df4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5992247 data_alloc: 251658240 data_used: 31162368
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561289a6cc00 session 0x561281c774a0
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56128056d000 session 0x561281c76780
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x19386a000/0x0/0x1bfc00000, data 0x4bba5b9/0x4df4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5980779 data_alloc: 251658240 data_used: 31174656
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x19386a000/0x0/0x1bfc00000, data 0x4bba5b9/0x4df4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.893699646s of 14.352492332s, submitted: 153
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5981071 data_alloc: 251658240 data_used: 31178752
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x19386a000/0x0/0x1bfc00000, data 0x4bba5b9/0x4df4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56128b8bc000 session 0x561280b0ba40
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612810ba000 session 0x56127ea1fa40
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563978240 unmapped: 80969728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195944000/0x0/0x1bfc00000, data 0x2ae15a9/0x2d1a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5597967 data_alloc: 234881024 data_used: 11083776
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56128b8bec00 session 0x561281c76d20
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564011008 unmapped: 80936960 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564011008 unmapped: 80936960 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564011008 unmapped: 80936960 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564011008 unmapped: 80936960 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564011008 unmapped: 80936960 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564019200 unmapped: 80928768 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564019200 unmapped: 80928768 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564019200 unmapped: 80928768 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564043776 unmapped: 80904192 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564043776 unmapped: 80904192 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564043776 unmapped: 80904192 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564068352 unmapped: 80879616 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564068352 unmapped: 80879616 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564068352 unmapped: 80879616 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564068352 unmapped: 80879616 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564068352 unmapped: 80879616 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564068352 unmapped: 80879616 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564076544 unmapped: 80871424 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564076544 unmapped: 80871424 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564076544 unmapped: 80871424 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564092928 unmapped: 80855040 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564109312 unmapped: 80838656 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564109312 unmapped: 80838656 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564109312 unmapped: 80838656 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564109312 unmapped: 80838656 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564109312 unmapped: 80838656 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564109312 unmapped: 80838656 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564109312 unmapped: 80838656 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564125696 unmapped: 80822272 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564125696 unmapped: 80822272 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564133888 unmapped: 80814080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564133888 unmapped: 80814080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564133888 unmapped: 80814080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564166656 unmapped: 80781312 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564174848 unmapped: 80773120 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564174848 unmapped: 80773120 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564174848 unmapped: 80773120 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564183040 unmapped: 80764928 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564183040 unmapped: 80764928 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564183040 unmapped: 80764928 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564183040 unmapped: 80764928 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564191232 unmapped: 80756736 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564256768 unmapped: 80691200 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: do_command 'config diff' '{prefix=config diff}'
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: do_command 'config show' '{prefix=config show}'
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: do_command 'counter dump' '{prefix=counter dump}'
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: do_command 'counter schema' '{prefix=counter schema}'
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563953664 unmapped: 80994304 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563789824 unmapped: 81158144 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563224576 unmapped: 81723392 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:20:59 np0005465988 ceph-osd[79039]: do_command 'log dump' '{prefix=log dump}'
Oct  2 09:20:59 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 09:20:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:20:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:20:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:20:59.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:20:59 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Oct  2 09:20:59 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4139500642' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct  2 09:21:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Oct  2 09:21:00 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3775449968' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct  2 09:21:00 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Oct  2 09:21:00 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3958445490' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Oct  2 09:21:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:00.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct  2 09:21:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct  2 09:21:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct  2 09:21:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct  2 09:21:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Oct  2 09:21:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2753875291' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Oct  2 09:21:01 np0005465988 nova_compute[236126]: 2025-10-02 13:21:01.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:01.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Oct  2 09:21:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/556070882' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Oct  2 09:21:01 np0005465988 nova_compute[236126]: 2025-10-02 13:21:01.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Oct  2 09:21:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2910516409' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Oct  2 09:21:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:21:02 np0005465988 podman[350531]: 2025-10-02 13:21:02.544503311 +0000 UTC m=+0.073179310 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  2 09:21:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Oct  2 09:21:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2344270193' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Oct  2 09:21:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Oct  2 09:21:02 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/675020617' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Oct  2 09:21:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:02.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Oct  2 09:21:03 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2480337755' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Oct  2 09:21:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Oct  2 09:21:03 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1933385493' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Oct  2 09:21:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:03.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Oct  2 09:21:03 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1840728467' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Oct  2 09:21:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Oct  2 09:21:03 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2922471195' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Oct  2 09:21:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0) v1
Oct  2 09:21:04 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/301432652' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Oct  2 09:21:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Oct  2 09:21:04 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/933441506' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Oct  2 09:21:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Oct  2 09:21:04 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1509359919' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct  2 09:21:04 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Oct  2 09:21:04 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1223793828' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Oct  2 09:21:04 np0005465988 systemd[1]: Starting Hostname Service...
Oct  2 09:21:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:21:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:04.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:21:04 np0005465988 systemd[1]: Started Hostname Service.
Oct  2 09:21:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0) v1
Oct  2 09:21:05 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/479756167' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Oct  2 09:21:05 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Oct  2 09:21:05 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2043791923' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Oct  2 09:21:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:05.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:06 np0005465988 nova_compute[236126]: 2025-10-02 13:21:06.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:21:06 np0005465988 nova_compute[236126]: 2025-10-02 13:21:06.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:06 np0005465988 nova_compute[236126]: 2025-10-02 13:21:06.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:21:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:06.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:21:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0) v1
Oct  2 09:21:07 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3571929272' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Oct  2 09:21:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:21:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:07.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0) v1
Oct  2 09:21:08 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2609548251' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct  2 09:21:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0) v1
Oct  2 09:21:08 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/18796829' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Oct  2 09:21:08 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct  2 09:21:08 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct  2 09:21:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:08.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0) v1
Oct  2 09:21:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2560908634' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Oct  2 09:21:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:09.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:09 np0005465988 podman[351383]: 2025-10-02 13:21:09.554155576 +0000 UTC m=+0.072618724 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:21:09 np0005465988 podman[351384]: 2025-10-02 13:21:09.570095142 +0000 UTC m=+0.083980200 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 09:21:09 np0005465988 podman[351380]: 2025-10-02 13:21:09.58546122 +0000 UTC m=+0.107800949 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Oct  2 09:21:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0) v1
Oct  2 09:21:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3754853666' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Oct  2 09:21:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:21:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:10.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:21:11 np0005465988 nova_compute[236126]: 2025-10-02 13:21:11.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:11.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0) v1
Oct  2 09:21:11 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2602768956' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Oct  2 09:21:11 np0005465988 nova_compute[236126]: 2025-10-02 13:21:11.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:21:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0) v1
Oct  2 09:21:12 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1982109687' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Oct  2 09:21:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:21:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:12.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:21:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:13.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0) v1
Oct  2 09:21:14 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/442756606' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Oct  2 09:21:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0) v1
Oct  2 09:21:14 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/690376585' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Oct  2 09:21:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:14.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:21:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:15.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:21:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0) v1
Oct  2 09:21:15 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1827421971' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Oct  2 09:21:16 np0005465988 nova_compute[236126]: 2025-10-02 13:21:16.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:16 np0005465988 nova_compute[236126]: 2025-10-02 13:21:16.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:21:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:16.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:21:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:21:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0) v1
Oct  2 09:21:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2680411240' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Oct  2 09:21:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:17.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:18 np0005465988 ovs-appctl[353127]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  2 09:21:18 np0005465988 ovs-appctl[353145]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  2 09:21:18 np0005465988 ovs-appctl[353149]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  2 09:21:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0) v1
Oct  2 09:21:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3566435322' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Oct  2 09:21:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:21:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:18.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:21:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0) v1
Oct  2 09:21:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1625710437' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Oct  2 09:21:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:19.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:20.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:21 np0005465988 nova_compute[236126]: 2025-10-02 13:21:21.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:21.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Oct  2 09:21:21 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2397871864' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct  2 09:21:21 np0005465988 nova_compute[236126]: 2025-10-02 13:21:21.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0) v1
Oct  2 09:21:22 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3705838895' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Oct  2 09:21:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:21:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:21:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:21:22 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:21:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0) v1
Oct  2 09:21:22 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2646082702' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Oct  2 09:21:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:21:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:22.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:21:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0) v1
Oct  2 09:21:23 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/452093045' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct  2 09:21:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:23.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0) v1
Oct  2 09:21:23 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/975264910' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Oct  2 09:21:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0) v1
Oct  2 09:21:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1136109663' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Oct  2 09:21:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0) v1
Oct  2 09:21:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3567487335' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Oct  2 09:21:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:21:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:24.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:21:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:25.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:25 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0) v1
Oct  2 09:21:25 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/424226001' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Oct  2 09:21:26 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0) v1
Oct  2 09:21:26 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3536608000' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Oct  2 09:21:26 np0005465988 nova_compute[236126]: 2025-10-02 13:21:26.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:26 np0005465988 nova_compute[236126]: 2025-10-02 13:21:26.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:26.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:21:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:21:27.428 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:21:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:21:27.430 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:21:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:21:27.430 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:21:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:27.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0) v1
Oct  2 09:21:27 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2426649433' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Oct  2 09:21:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:21:28 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:21:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0) v1
Oct  2 09:21:28 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/343366598' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Oct  2 09:21:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:28.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:29.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0) v1
Oct  2 09:21:30 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1553975403' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct  2 09:21:30 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0) v1
Oct  2 09:21:30 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/772974085' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Oct  2 09:21:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:30.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:31 np0005465988 virtqemud[235689]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  2 09:21:31 np0005465988 nova_compute[236126]: 2025-10-02 13:21:31.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:31.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:31 np0005465988 nova_compute[236126]: 2025-10-02 13:21:31.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Oct  2 09:21:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3210879108' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Oct  2 09:21:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:21:32 np0005465988 systemd[1]: Starting Time & Date Service...
Oct  2 09:21:32 np0005465988 systemd[1]: Started Time & Date Service.
Oct  2 09:21:32 np0005465988 podman[355346]: 2025-10-02 13:21:32.65648479 +0000 UTC m=+0.064044940 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 09:21:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0) v1
Oct  2 09:21:32 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2566103676' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Oct  2 09:21:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:21:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:32.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:21:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:33.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:21:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:34.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:21:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:35.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:36 np0005465988 nova_compute[236126]: 2025-10-02 13:21:36.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:36 np0005465988 nova_compute[236126]: 2025-10-02 13:21:36.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:36.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:21:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:37.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:38 np0005465988 nova_compute[236126]: 2025-10-02 13:21:38.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:21:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:38.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:39.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:40 np0005465988 podman[355386]: 2025-10-02 13:21:40.541257156 +0000 UTC m=+0.074717244 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:21:40 np0005465988 podman[355387]: 2025-10-02 13:21:40.564234212 +0000 UTC m=+0.094303654 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct  2 09:21:40 np0005465988 podman[355385]: 2025-10-02 13:21:40.569915405 +0000 UTC m=+0.103360873 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Oct  2 09:21:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:40.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:41 np0005465988 nova_compute[236126]: 2025-10-02 13:21:41.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:21:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:41.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:41 np0005465988 nova_compute[236126]: 2025-10-02 13:21:41.548 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:21:41 np0005465988 nova_compute[236126]: 2025-10-02 13:21:41.549 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:21:41 np0005465988 nova_compute[236126]: 2025-10-02 13:21:41.549 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:21:41 np0005465988 nova_compute[236126]: 2025-10-02 13:21:41.549 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:21:41 np0005465988 nova_compute[236126]: 2025-10-02 13:21:41.550 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:21:41 np0005465988 nova_compute[236126]: 2025-10-02 13:21:41.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:41 np0005465988 nova_compute[236126]: 2025-10-02 13:21:41.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:41 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:21:41 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2394421373' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:21:42 np0005465988 nova_compute[236126]: 2025-10-02 13:21:42.012 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:21:42 np0005465988 nova_compute[236126]: 2025-10-02 13:21:42.197 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:21:42 np0005465988 nova_compute[236126]: 2025-10-02 13:21:42.199 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3802MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:21:42 np0005465988 nova_compute[236126]: 2025-10-02 13:21:42.199 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:21:42 np0005465988 nova_compute[236126]: 2025-10-02 13:21:42.199 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:21:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:21:42 np0005465988 nova_compute[236126]: 2025-10-02 13:21:42.377 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:21:42 np0005465988 nova_compute[236126]: 2025-10-02 13:21:42.378 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:21:42 np0005465988 nova_compute[236126]: 2025-10-02 13:21:42.404 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 09:21:42 np0005465988 nova_compute[236126]: 2025-10-02 13:21:42.425 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 09:21:42 np0005465988 nova_compute[236126]: 2025-10-02 13:21:42.426 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 09:21:42 np0005465988 nova_compute[236126]: 2025-10-02 13:21:42.443 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 09:21:42 np0005465988 nova_compute[236126]: 2025-10-02 13:21:42.500 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 09:21:42 np0005465988 nova_compute[236126]: 2025-10-02 13:21:42.548 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:21:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:21:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2190346141' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:21:42 np0005465988 nova_compute[236126]: 2025-10-02 13:21:42.992 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:21:43 np0005465988 nova_compute[236126]: 2025-10-02 13:21:43.000 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:21:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:43.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:43 np0005465988 nova_compute[236126]: 2025-10-02 13:21:43.038 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:21:43 np0005465988 nova_compute[236126]: 2025-10-02 13:21:43.041 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:21:43 np0005465988 nova_compute[236126]: 2025-10-02 13:21:43.041 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.841s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:21:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:43.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000057s ======
Oct  2 09:21:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:45.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Oct  2 09:21:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:45.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:46 np0005465988 nova_compute[236126]: 2025-10-02 13:21:46.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:46 np0005465988 nova_compute[236126]: 2025-10-02 13:21:46.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:21:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:47.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:21:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:21:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:47.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:48 np0005465988 nova_compute[236126]: 2025-10-02 13:21:48.042 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:21:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:21:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:49.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:21:49 np0005465988 nova_compute[236126]: 2025-10-02 13:21:49.471 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:21:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:49.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:51.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:51 np0005465988 nova_compute[236126]: 2025-10-02 13:21:51.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:21:51 np0005465988 nova_compute[236126]: 2025-10-02 13:21:51.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:21:51 np0005465988 nova_compute[236126]: 2025-10-02 13:21:51.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:21:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:21:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:51.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:21:51 np0005465988 nova_compute[236126]: 2025-10-02 13:21:51.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:51 np0005465988 nova_compute[236126]: 2025-10-02 13:21:51.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:21:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:21:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:53.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:53 np0005465988 nova_compute[236126]: 2025-10-02 13:21:53.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:21:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:53.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:55.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 09:21:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1819642134' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 09:21:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 09:21:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1819642134' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 09:21:55 np0005465988 nova_compute[236126]: 2025-10-02 13:21:55.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:21:55 np0005465988 nova_compute[236126]: 2025-10-02 13:21:55.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 09:21:55 np0005465988 nova_compute[236126]: 2025-10-02 13:21:55.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 09:21:55 np0005465988 nova_compute[236126]: 2025-10-02 13:21:55.512 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  2 09:21:55 np0005465988 nova_compute[236126]: 2025-10-02 13:21:55.513 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:21:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:21:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:55.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:21:56 np0005465988 nova_compute[236126]: 2025-10-02 13:21:56.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:21:56 np0005465988 nova_compute[236126]: 2025-10-02 13:21:56.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:21:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:57.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:21:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:57.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:21:59.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:21:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:21:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:21:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:21:59.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:22:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:01.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:22:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:22:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:01.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:22:01 np0005465988 nova_compute[236126]: 2025-10-02 13:22:01.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:22:01 np0005465988 nova_compute[236126]: 2025-10-02 13:22:01.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:22:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:22:02 np0005465988 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  2 09:22:02 np0005465988 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 09:22:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:03.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:22:03 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4138600610' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:22:03 np0005465988 podman[355558]: 2025-10-02 13:22:03.529188283 +0000 UTC m=+0.061157797 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct  2 09:22:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:03.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:05.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:05.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:06 np0005465988 nova_compute[236126]: 2025-10-02 13:22:06.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:22:06 np0005465988 nova_compute[236126]: 2025-10-02 13:22:06.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:22:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:22:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:07.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:22:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:22:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:22:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:07.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:22:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:22:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:09.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:22:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:09.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:10 np0005465988 podman[355581]: 2025-10-02 13:22:10.696304945 +0000 UTC m=+0.070550936 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:22:10 np0005465988 podman[355580]: 2025-10-02 13:22:10.719680112 +0000 UTC m=+0.096427675 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 09:22:10 np0005465988 podman[355582]: 2025-10-02 13:22:10.747240599 +0000 UTC m=+0.106688068 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 09:22:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:22:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:11.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:22:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:11.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:11 np0005465988 nova_compute[236126]: 2025-10-02 13:22:11.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:22:11 np0005465988 nova_compute[236126]: 2025-10-02 13:22:11.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:22:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:22:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:22:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:13.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:22:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:22:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:13.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:22:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:15.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:22:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:15.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #181. Immutable memtables: 0.
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.479116) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 115] Flushing memtable with next log file: 181
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411336479228, "job": 115, "event": "flush_started", "num_memtables": 1, "num_entries": 2637, "num_deletes": 250, "total_data_size": 5920265, "memory_usage": 5992952, "flush_reason": "Manual Compaction"}
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 115] Level-0 flush table #182: started
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411336497672, "cf_name": "default", "job": 115, "event": "table_file_creation", "file_number": 182, "file_size": 2412864, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 87574, "largest_seqno": 90205, "table_properties": {"data_size": 2403874, "index_size": 4907, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 26958, "raw_average_key_size": 22, "raw_value_size": 2382946, "raw_average_value_size": 2007, "num_data_blocks": 215, "num_entries": 1187, "num_filter_entries": 1187, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759411133, "oldest_key_time": 1759411133, "file_creation_time": 1759411336, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 182, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 115] Flush lasted 18628 microseconds, and 10341 cpu microseconds.
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.497757) [db/flush_job.cc:967] [default] [JOB 115] Level-0 flush table #182: 2412864 bytes OK
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.497793) [db/memtable_list.cc:519] [default] Level-0 commit table #182 started
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.499857) [db/memtable_list.cc:722] [default] Level-0 commit table #182: memtable #1 done
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.499876) EVENT_LOG_v1 {"time_micros": 1759411336499869, "job": 115, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.499904) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 115] Try to delete WAL files size 5907856, prev total WAL file size 5907856, number of live WAL files 2.
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000178.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.502732) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033303132' seq:72057594037927935, type:22 .. '6D6772737461740033323633' seq:0, type:0; will stop at (end)
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 116] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 115 Base level 0, inputs: [182(2356KB)], [180(12MB)]
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411336502851, "job": 116, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [182], "files_L6": [180], "score": -1, "input_data_size": 15433716, "oldest_snapshot_seqno": -1}
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 116] Generated table #183: 11354 keys, 13053596 bytes, temperature: kUnknown
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411336585650, "cf_name": "default", "job": 116, "event": "table_file_creation", "file_number": 183, "file_size": 13053596, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12982683, "index_size": 41424, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28421, "raw_key_size": 298565, "raw_average_key_size": 26, "raw_value_size": 12786798, "raw_average_value_size": 1126, "num_data_blocks": 1578, "num_entries": 11354, "num_filter_entries": 11354, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759411336, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 183, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.585888) [db/compaction/compaction_job.cc:1663] [default] [JOB 116] Compacted 1@0 + 1@6 files to L6 => 13053596 bytes
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.587099) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 186.2 rd, 157.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 12.4 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(11.8) write-amplify(5.4) OK, records in: 11773, records dropped: 419 output_compression: NoCompression
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.587116) EVENT_LOG_v1 {"time_micros": 1759411336587107, "job": 116, "event": "compaction_finished", "compaction_time_micros": 82875, "compaction_time_cpu_micros": 32021, "output_level": 6, "num_output_files": 1, "total_output_size": 13053596, "num_input_records": 11773, "num_output_records": 11354, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000182.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411336587585, "job": 116, "event": "table_file_deletion", "file_number": 182}
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000180.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411336589738, "job": 116, "event": "table_file_deletion", "file_number": 180}
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.502579) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.589945) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.589955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.589957) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.589959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:22:16 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:16.589961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:22:16 np0005465988 nova_compute[236126]: 2025-10-02 13:22:16.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:16 np0005465988 nova_compute[236126]: 2025-10-02 13:22:16.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:17.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:22:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:17.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:22:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:19.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:22:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:19.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:19 np0005465988 systemd[1]: session-59.scope: Deactivated successfully.
Oct  2 09:22:19 np0005465988 systemd[1]: session-59.scope: Consumed 2min 56.909s CPU time, 1.0G memory peak, read 443.1M from disk, written 295.4M to disk.
Oct  2 09:22:19 np0005465988 systemd-logind[827]: Session 59 logged out. Waiting for processes to exit.
Oct  2 09:22:19 np0005465988 systemd-logind[827]: Removed session 59.
Oct  2 09:22:19 np0005465988 systemd-logind[827]: New session 60 of user zuul.
Oct  2 09:22:19 np0005465988 systemd[1]: Started Session 60 of User zuul.
Oct  2 09:22:20 np0005465988 systemd[1]: session-60.scope: Deactivated successfully.
Oct  2 09:22:20 np0005465988 systemd-logind[827]: Session 60 logged out. Waiting for processes to exit.
Oct  2 09:22:20 np0005465988 systemd-logind[827]: Removed session 60.
Oct  2 09:22:20 np0005465988 systemd-logind[827]: New session 61 of user zuul.
Oct  2 09:22:20 np0005465988 systemd[1]: Started Session 61 of User zuul.
Oct  2 09:22:20 np0005465988 systemd[1]: session-61.scope: Deactivated successfully.
Oct  2 09:22:20 np0005465988 systemd-logind[827]: Session 61 logged out. Waiting for processes to exit.
Oct  2 09:22:20 np0005465988 systemd-logind[827]: Removed session 61.
Oct  2 09:22:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:22:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:21.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:22:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:21.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:21 np0005465988 nova_compute[236126]: 2025-10-02 13:22:21.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:21 np0005465988 nova_compute[236126]: 2025-10-02 13:22:21.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:22:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:22:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:23.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:22:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:22:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:23.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #184. Immutable memtables: 0.
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:23.724337) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 117] Flushing memtable with next log file: 184
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411343724518, "job": 117, "event": "flush_started", "num_memtables": 1, "num_entries": 322, "num_deletes": 251, "total_data_size": 213552, "memory_usage": 220792, "flush_reason": "Manual Compaction"}
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 117] Level-0 flush table #185: started
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411343807993, "cf_name": "default", "job": 117, "event": "table_file_creation", "file_number": 185, "file_size": 140566, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 90210, "largest_seqno": 90527, "table_properties": {"data_size": 138568, "index_size": 225, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5162, "raw_average_key_size": 18, "raw_value_size": 134605, "raw_average_value_size": 480, "num_data_blocks": 10, "num_entries": 280, "num_filter_entries": 280, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759411337, "oldest_key_time": 1759411337, "file_creation_time": 1759411343, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 185, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 117] Flush lasted 83661 microseconds, and 1952 cpu microseconds.
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:23.808082) [db/flush_job.cc:967] [default] [JOB 117] Level-0 flush table #185: 140566 bytes OK
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:23.808105) [db/memtable_list.cc:519] [default] Level-0 commit table #185 started
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:23.896129) [db/memtable_list.cc:722] [default] Level-0 commit table #185: memtable #1 done
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:23.896178) EVENT_LOG_v1 {"time_micros": 1759411343896166, "job": 117, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:23.896203) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 117] Try to delete WAL files size 211275, prev total WAL file size 211275, number of live WAL files 2.
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000181.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:23.896877) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037373831' seq:72057594037927935, type:22 .. '7061786F730038303333' seq:0, type:0; will stop at (end)
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 118] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 117 Base level 0, inputs: [185(137KB)], [183(12MB)]
Oct  2 09:22:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411343896949, "job": 118, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [185], "files_L6": [183], "score": -1, "input_data_size": 13194162, "oldest_snapshot_seqno": -1}
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 118] Generated table #186: 11124 keys, 11281011 bytes, temperature: kUnknown
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411344056021, "cf_name": "default", "job": 118, "event": "table_file_creation", "file_number": 186, "file_size": 11281011, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11213175, "index_size": 38923, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27845, "raw_key_size": 294473, "raw_average_key_size": 26, "raw_value_size": 11022747, "raw_average_value_size": 990, "num_data_blocks": 1465, "num_entries": 11124, "num_filter_entries": 11124, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759411343, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 186, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:24.056315) [db/compaction/compaction_job.cc:1663] [default] [JOB 118] Compacted 1@0 + 1@6 files to L6 => 11281011 bytes
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:24.111990) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 82.9 rd, 70.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 12.4 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(174.1) write-amplify(80.3) OK, records in: 11634, records dropped: 510 output_compression: NoCompression
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:24.112035) EVENT_LOG_v1 {"time_micros": 1759411344112017, "job": 118, "event": "compaction_finished", "compaction_time_micros": 159144, "compaction_time_cpu_micros": 30012, "output_level": 6, "num_output_files": 1, "total_output_size": 11281011, "num_input_records": 11634, "num_output_records": 11124, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000185.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411344112264, "job": 118, "event": "table_file_deletion", "file_number": 185}
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000183.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411344114883, "job": 118, "event": "table_file_deletion", "file_number": 183}
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:23.896735) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:24.115055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:24.115074) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:24.115075) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:24.115077) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:22:24 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:22:24.115079) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:22:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:22:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:25.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:22:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:25.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:26 np0005465988 nova_compute[236126]: 2025-10-02 13:22:26.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:26 np0005465988 nova_compute[236126]: 2025-10-02 13:22:26.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:22:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:27.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:22:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:22:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:22:27.431 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:22:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:22:27.431 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:22:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:22:27.431 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:22:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:27.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:29.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:29.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:22:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:22:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:22:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:22:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:22:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:22:30 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:22:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:22:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:31.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:22:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:31.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:31 np0005465988 nova_compute[236126]: 2025-10-02 13:22:31.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:31 np0005465988 nova_compute[236126]: 2025-10-02 13:22:31.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:22:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:33.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:33.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:34 np0005465988 podman[355949]: 2025-10-02 13:22:34.555633878 +0000 UTC m=+0.087965363 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 09:22:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:35.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:35.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:36 np0005465988 nova_compute[236126]: 2025-10-02 13:22:36.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:36 np0005465988 nova_compute[236126]: 2025-10-02 13:22:36.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:37.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:22:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:22:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:37.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:22:37 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:22:37 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:22:38 np0005465988 nova_compute[236126]: 2025-10-02 13:22:38.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:22:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:39.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:39.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:41.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:41 np0005465988 podman[356026]: 2025-10-02 13:22:41.529201013 +0000 UTC m=+0.062298189 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible)
Oct  2 09:22:41 np0005465988 podman[356025]: 2025-10-02 13:22:41.562154524 +0000 UTC m=+0.100561821 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:22:41 np0005465988 podman[356027]: 2025-10-02 13:22:41.569229736 +0000 UTC m=+0.097420692 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001)
Oct  2 09:22:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:41.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:41 np0005465988 nova_compute[236126]: 2025-10-02 13:22:41.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:41 np0005465988 nova_compute[236126]: 2025-10-02 13:22:41.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:22:42 np0005465988 nova_compute[236126]: 2025-10-02 13:22:42.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:22:42 np0005465988 nova_compute[236126]: 2025-10-02 13:22:42.538 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:22:42 np0005465988 nova_compute[236126]: 2025-10-02 13:22:42.538 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:22:42 np0005465988 nova_compute[236126]: 2025-10-02 13:22:42.538 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:22:42 np0005465988 nova_compute[236126]: 2025-10-02 13:22:42.539 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:22:42 np0005465988 nova_compute[236126]: 2025-10-02 13:22:42.539 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:22:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:22:42 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1731271340' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:22:43 np0005465988 nova_compute[236126]: 2025-10-02 13:22:43.012 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:22:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:43.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:43 np0005465988 nova_compute[236126]: 2025-10-02 13:22:43.191 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:22:43 np0005465988 nova_compute[236126]: 2025-10-02 13:22:43.193 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3947MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:22:43 np0005465988 nova_compute[236126]: 2025-10-02 13:22:43.193 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:22:43 np0005465988 nova_compute[236126]: 2025-10-02 13:22:43.193 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:22:43 np0005465988 nova_compute[236126]: 2025-10-02 13:22:43.359 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:22:43 np0005465988 nova_compute[236126]: 2025-10-02 13:22:43.360 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:22:43 np0005465988 nova_compute[236126]: 2025-10-02 13:22:43.478 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:22:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:43.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:22:43 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4131413981' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:22:43 np0005465988 nova_compute[236126]: 2025-10-02 13:22:43.931 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:22:43 np0005465988 nova_compute[236126]: 2025-10-02 13:22:43.937 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:22:43 np0005465988 nova_compute[236126]: 2025-10-02 13:22:43.983 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:22:43 np0005465988 nova_compute[236126]: 2025-10-02 13:22:43.984 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:22:43 np0005465988 nova_compute[236126]: 2025-10-02 13:22:43.985 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.791s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:22:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:45.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:45.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:46 np0005465988 nova_compute[236126]: 2025-10-02 13:22:46.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:46 np0005465988 nova_compute[236126]: 2025-10-02 13:22:46.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:22:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:47.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:22:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:22:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:47.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:47 np0005465988 nova_compute[236126]: 2025-10-02 13:22:47.986 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:22:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:49.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:49.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:50 np0005465988 nova_compute[236126]: 2025-10-02 13:22:50.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:22:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:51.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:51 np0005465988 nova_compute[236126]: 2025-10-02 13:22:51.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:22:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:51.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:51 np0005465988 nova_compute[236126]: 2025-10-02 13:22:51.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:51 np0005465988 nova_compute[236126]: 2025-10-02 13:22:51.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:22:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:53.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:53 np0005465988 nova_compute[236126]: 2025-10-02 13:22:53.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:22:53 np0005465988 nova_compute[236126]: 2025-10-02 13:22:53.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:22:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:53.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:22:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:55.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:22:55 np0005465988 nova_compute[236126]: 2025-10-02 13:22:55.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:22:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:55.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:56 np0005465988 nova_compute[236126]: 2025-10-02 13:22:56.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:56 np0005465988 nova_compute[236126]: 2025-10-02 13:22:56.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:22:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:22:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:57.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:22:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:22:57 np0005465988 nova_compute[236126]: 2025-10-02 13:22:57.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:22:57 np0005465988 nova_compute[236126]: 2025-10-02 13:22:57.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:22:57 np0005465988 nova_compute[236126]: 2025-10-02 13:22:57.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:22:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:57.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:22:57 np0005465988 nova_compute[236126]: 2025-10-02 13:22:57.671 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:22:57 np0005465988 nova_compute[236126]: 2025-10-02 13:22:57.672 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:22:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:22:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:22:59.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:22:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:22:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:22:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:22:59.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:01.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:23:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:01.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:23:01 np0005465988 nova_compute[236126]: 2025-10-02 13:23:01.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:02 np0005465988 nova_compute[236126]: 2025-10-02 13:23:01.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:23:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:03.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:23:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:03.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:23:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:05.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:05 np0005465988 podman[356196]: 2025-10-02 13:23:05.53894835 +0000 UTC m=+0.069938348 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:23:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:05.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:06 np0005465988 nova_compute[236126]: 2025-10-02 13:23:06.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:07 np0005465988 nova_compute[236126]: 2025-10-02 13:23:07.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:23:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:07.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:23:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:23:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:07.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:23:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:09.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:23:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:09.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:11.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:11.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:11 np0005465988 nova_compute[236126]: 2025-10-02 13:23:11.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:11 np0005465988 nova_compute[236126]: 2025-10-02 13:23:11.667 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:23:12 np0005465988 nova_compute[236126]: 2025-10-02 13:23:12.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:23:12 np0005465988 podman[356221]: 2025-10-02 13:23:12.525704652 +0000 UTC m=+0.061510537 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 09:23:12 np0005465988 podman[356220]: 2025-10-02 13:23:12.539484986 +0000 UTC m=+0.068515358 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 09:23:12 np0005465988 podman[356219]: 2025-10-02 13:23:12.544071137 +0000 UTC m=+0.085078231 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:23:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:23:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:13.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:23:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:13.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:23:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:15.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:23:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:15.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:16 np0005465988 nova_compute[236126]: 2025-10-02 13:23:16.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:17 np0005465988 nova_compute[236126]: 2025-10-02 13:23:17.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:23:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:17.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:23:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:23:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:17.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:19.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:19.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:21.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:21.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:21 np0005465988 nova_compute[236126]: 2025-10-02 13:23:21.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:22 np0005465988 nova_compute[236126]: 2025-10-02 13:23:22.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:23:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:23.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:23.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:25.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:25.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:26 np0005465988 nova_compute[236126]: 2025-10-02 13:23:26.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:27 np0005465988 nova_compute[236126]: 2025-10-02 13:23:27.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:23:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:27.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:23:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:23:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:23:27.432 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:23:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:23:27.432 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:23:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:23:27.432 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:23:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:23:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:27.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:23:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:29.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:23:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.0 total, 600.0 interval#012Cumulative writes: 18K writes, 91K keys, 18K commit groups, 1.0 writes per commit group, ingest: 0.18 GB, 0.03 MB/s#012Cumulative WAL: 18K writes, 18K syncs, 1.00 writes per sync, written: 0.18 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1555 writes, 7906 keys, 1555 commit groups, 1.0 writes per commit group, ingest: 15.95 MB, 0.03 MB/s#012Interval WAL: 1555 writes, 1555 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     55.9      1.99              0.39        59    0.034       0      0       0.0       0.0#012  L6      1/0   10.76 MB   0.0      0.7     0.1      0.6       0.6      0.0       0.0   5.5    113.7     97.5      6.20              2.08        58    0.107    458K    31K       0.0       0.0#012 Sum      1/0   10.76 MB   0.0      0.7     0.1      0.6       0.7      0.1       0.0   6.5     86.1     87.4      8.19              2.46       117    0.070    458K    31K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   9.0     78.7     76.7      1.03              0.23        12    0.086     67K   2966       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.7     0.1      0.6       0.6      0.0       0.0   0.0    113.7     97.5      6.20              2.08        58    0.107    458K    31K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     55.9      1.98              0.39        58    0.034       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6600.0 total, 600.0 interval#012Flush(GB): cumulative 0.108, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.70 GB write, 0.11 MB/s write, 0.69 GB read, 0.11 MB/s read, 8.2 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 1.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b3e97ad1f0#2 capacity: 304.00 MB usage: 78.14 MB table_size: 0 occupancy: 18446744073709551615 collections: 12 last_copies: 0 last_secs: 0.001073 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(4841,74.88 MB,24.6318%) FilterBlock(117,1.23 MB,0.40359%) IndexBlock(117,2.03 MB,0.669148%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct  2 09:23:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:29.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:31.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:31.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:31 np0005465988 nova_compute[236126]: 2025-10-02 13:23:31.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:32 np0005465988 nova_compute[236126]: 2025-10-02 13:23:32.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:23:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:33.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:33.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:23:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:35.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:23:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:35.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:36 np0005465988 podman[356395]: 2025-10-02 13:23:36.52109671 +0000 UTC m=+0.051635726 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  2 09:23:36 np0005465988 nova_compute[236126]: 2025-10-02 13:23:36.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:37 np0005465988 nova_compute[236126]: 2025-10-02 13:23:37.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:37.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:23:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:37.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:39 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:23:39 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:23:39 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:23:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:23:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:39.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:23:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:39.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:40 np0005465988 nova_compute[236126]: 2025-10-02 13:23:40.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:23:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:23:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:41.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:23:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:23:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:41.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:23:41 np0005465988 nova_compute[236126]: 2025-10-02 13:23:41.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:42 np0005465988 nova_compute[236126]: 2025-10-02 13:23:42.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:23:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:23:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:43.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:23:43 np0005465988 podman[356550]: 2025-10-02 13:23:43.527231684 +0000 UTC m=+0.062259429 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3)
Oct  2 09:23:43 np0005465988 podman[356551]: 2025-10-02 13:23:43.527503212 +0000 UTC m=+0.060383256 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 09:23:43 np0005465988 podman[356549]: 2025-10-02 13:23:43.561275726 +0000 UTC m=+0.098016140 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 09:23:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:43.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:23:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:23:44 np0005465988 nova_compute[236126]: 2025-10-02 13:23:44.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:23:44 np0005465988 nova_compute[236126]: 2025-10-02 13:23:44.564 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:23:44 np0005465988 nova_compute[236126]: 2025-10-02 13:23:44.564 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:23:44 np0005465988 nova_compute[236126]: 2025-10-02 13:23:44.564 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:23:44 np0005465988 nova_compute[236126]: 2025-10-02 13:23:44.565 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:23:44 np0005465988 nova_compute[236126]: 2025-10-02 13:23:44.565 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:23:44 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:23:44 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1656033096' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:23:45 np0005465988 nova_compute[236126]: 2025-10-02 13:23:45.001 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:23:45 np0005465988 nova_compute[236126]: 2025-10-02 13:23:45.185 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:23:45 np0005465988 nova_compute[236126]: 2025-10-02 13:23:45.187 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3950MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:23:45 np0005465988 nova_compute[236126]: 2025-10-02 13:23:45.187 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:23:45 np0005465988 nova_compute[236126]: 2025-10-02 13:23:45.187 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:23:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:23:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:45.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:23:45 np0005465988 nova_compute[236126]: 2025-10-02 13:23:45.250 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:23:45 np0005465988 nova_compute[236126]: 2025-10-02 13:23:45.251 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:23:45 np0005465988 nova_compute[236126]: 2025-10-02 13:23:45.428 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:23:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:45.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:23:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2475010573' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:23:45 np0005465988 nova_compute[236126]: 2025-10-02 13:23:45.907 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:23:45 np0005465988 nova_compute[236126]: 2025-10-02 13:23:45.912 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:23:45 np0005465988 nova_compute[236126]: 2025-10-02 13:23:45.938 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:23:45 np0005465988 nova_compute[236126]: 2025-10-02 13:23:45.940 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:23:45 np0005465988 nova_compute[236126]: 2025-10-02 13:23:45.940 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:23:46 np0005465988 nova_compute[236126]: 2025-10-02 13:23:46.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:47 np0005465988 nova_compute[236126]: 2025-10-02 13:23:47.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:23:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:47.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:23:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:23:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:47.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:48 np0005465988 nova_compute[236126]: 2025-10-02 13:23:48.940 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:23:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:49.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:49.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:50 np0005465988 nova_compute[236126]: 2025-10-02 13:23:50.468 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:23:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:51.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:51.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:51 np0005465988 nova_compute[236126]: 2025-10-02 13:23:51.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:52 np0005465988 nova_compute[236126]: 2025-10-02 13:23:52.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:23:52 np0005465988 nova_compute[236126]: 2025-10-02 13:23:52.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:23:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:23:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:53.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:23:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:23:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:53.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:23:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:55.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:55 np0005465988 nova_compute[236126]: 2025-10-02 13:23:55.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:23:55 np0005465988 nova_compute[236126]: 2025-10-02 13:23:55.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:23:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:55.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #187. Immutable memtables: 0.
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.584215) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 119] Flushing memtable with next log file: 187
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411436584283, "job": 119, "event": "flush_started", "num_memtables": 1, "num_entries": 1174, "num_deletes": 255, "total_data_size": 2549193, "memory_usage": 2588288, "flush_reason": "Manual Compaction"}
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 119] Level-0 flush table #188: started
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411436596347, "cf_name": "default", "job": 119, "event": "table_file_creation", "file_number": 188, "file_size": 1659869, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 90532, "largest_seqno": 91701, "table_properties": {"data_size": 1654697, "index_size": 2631, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11036, "raw_average_key_size": 19, "raw_value_size": 1644332, "raw_average_value_size": 2900, "num_data_blocks": 116, "num_entries": 567, "num_filter_entries": 567, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759411345, "oldest_key_time": 1759411345, "file_creation_time": 1759411436, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 188, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 119] Flush lasted 12228 microseconds, and 5566 cpu microseconds.
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.596451) [db/flush_job.cc:967] [default] [JOB 119] Level-0 flush table #188: 1659869 bytes OK
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.596477) [db/memtable_list.cc:519] [default] Level-0 commit table #188 started
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.599328) [db/memtable_list.cc:722] [default] Level-0 commit table #188: memtable #1 done
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.599354) EVENT_LOG_v1 {"time_micros": 1759411436599347, "job": 119, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.599391) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 119] Try to delete WAL files size 2543564, prev total WAL file size 2543564, number of live WAL files 2.
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000184.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.600066) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353130' seq:72057594037927935, type:22 .. '6C6F676D0033373631' seq:0, type:0; will stop at (end)
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 120] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 119 Base level 0, inputs: [188(1620KB)], [186(10MB)]
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411436600112, "job": 120, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [188], "files_L6": [186], "score": -1, "input_data_size": 12940880, "oldest_snapshot_seqno": -1}
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 120] Generated table #189: 11166 keys, 12802492 bytes, temperature: kUnknown
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411436723048, "cf_name": "default", "job": 120, "event": "table_file_creation", "file_number": 189, "file_size": 12802492, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12732485, "index_size": 40966, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27973, "raw_key_size": 296256, "raw_average_key_size": 26, "raw_value_size": 12539389, "raw_average_value_size": 1122, "num_data_blocks": 1551, "num_entries": 11166, "num_filter_entries": 11166, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759411436, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 189, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.723351) [db/compaction/compaction_job.cc:1663] [default] [JOB 120] Compacted 1@0 + 1@6 files to L6 => 12802492 bytes
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.732122) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 105.2 rd, 104.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 10.8 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(15.5) write-amplify(7.7) OK, records in: 11691, records dropped: 525 output_compression: NoCompression
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.732168) EVENT_LOG_v1 {"time_micros": 1759411436732151, "job": 120, "event": "compaction_finished", "compaction_time_micros": 123039, "compaction_time_cpu_micros": 43673, "output_level": 6, "num_output_files": 1, "total_output_size": 12802492, "num_input_records": 11691, "num_output_records": 11166, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000188.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411436732985, "job": 120, "event": "table_file_deletion", "file_number": 188}
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000186.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411436735983, "job": 120, "event": "table_file_deletion", "file_number": 186}
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.599995) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.736145) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.736153) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.736157) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.736162) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:23:56 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:23:56.736165) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:23:56 np0005465988 nova_compute[236126]: 2025-10-02 13:23:56.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:57 np0005465988 nova_compute[236126]: 2025-10-02 13:23:57.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:23:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:57.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:23:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:23:57 np0005465988 nova_compute[236126]: 2025-10-02 13:23:57.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:23:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:23:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:57.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:23:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:23:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:23:59.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:23:59 np0005465988 nova_compute[236126]: 2025-10-02 13:23:59.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:23:59 np0005465988 nova_compute[236126]: 2025-10-02 13:23:59.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:23:59 np0005465988 nova_compute[236126]: 2025-10-02 13:23:59.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:23:59 np0005465988 nova_compute[236126]: 2025-10-02 13:23:59.511 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:23:59 np0005465988 nova_compute[236126]: 2025-10-02 13:23:59.511 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:23:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:23:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:23:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:23:59.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:24:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:01.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:24:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:01.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:01 np0005465988 nova_compute[236126]: 2025-10-02 13:24:01.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:02 np0005465988 nova_compute[236126]: 2025-10-02 13:24:02.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:24:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:03.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:03.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:05.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:24:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:05.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:24:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:24:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6601.0 total, 600.0 interval#012Cumulative writes: 79K writes, 321K keys, 79K commit groups, 1.0 writes per commit group, ingest: 0.32 GB, 0.05 MB/s#012Cumulative WAL: 79K writes, 29K syncs, 2.69 writes per sync, written: 0.32 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2103 writes, 7442 keys, 2103 commit groups, 1.0 writes per commit group, ingest: 6.64 MB, 0.01 MB/s#012Interval WAL: 2103 writes, 881 syncs, 2.39 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct  2 09:24:06 np0005465988 nova_compute[236126]: 2025-10-02 13:24:06.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:07 np0005465988 nova_compute[236126]: 2025-10-02 13:24:07.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:07.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:24:07 np0005465988 podman[356771]: 2025-10-02 13:24:07.539073631 +0000 UTC m=+0.062301171 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 09:24:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:07.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:09.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:24:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:09.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:24:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:11.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:24:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:11.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:24:11 np0005465988 nova_compute[236126]: 2025-10-02 13:24:11.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:12 np0005465988 nova_compute[236126]: 2025-10-02 13:24:12.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:24:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:13.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:13 np0005465988 auditd[708]: Audit daemon rotating log files
Oct  2 09:24:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:24:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:13.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:24:14 np0005465988 podman[356845]: 2025-10-02 13:24:14.536382354 +0000 UTC m=+0.064546964 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:24:14 np0005465988 podman[356844]: 2025-10-02 13:24:14.550088654 +0000 UTC m=+0.083597687 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 09:24:14 np0005465988 podman[356843]: 2025-10-02 13:24:14.558175435 +0000 UTC m=+0.093777588 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 09:24:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:15.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:15.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:16 np0005465988 nova_compute[236126]: 2025-10-02 13:24:16.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:17 np0005465988 nova_compute[236126]: 2025-10-02 13:24:17.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:24:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:17.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:24:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:24:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:17.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:19.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:19.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:24:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:21.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:24:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:21.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:21 np0005465988 nova_compute[236126]: 2025-10-02 13:24:21.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:22 np0005465988 nova_compute[236126]: 2025-10-02 13:24:22.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:24:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:23.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:23.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:25.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:25.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:26 np0005465988 nova_compute[236126]: 2025-10-02 13:24:26.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:27 np0005465988 nova_compute[236126]: 2025-10-02 13:24:27.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:27.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:24:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:24:27.433 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:24:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:24:27.433 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:24:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:24:27.434 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:24:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:27.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:29.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:24:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:29.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:24:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:31.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:31.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:31 np0005465988 nova_compute[236126]: 2025-10-02 13:24:31.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:32 np0005465988 nova_compute[236126]: 2025-10-02 13:24:32.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:24:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:33.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:33.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:24:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:35.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:24:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:24:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:35.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:24:36 np0005465988 nova_compute[236126]: 2025-10-02 13:24:36.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:37 np0005465988 nova_compute[236126]: 2025-10-02 13:24:37.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:37.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:24:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:37.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:38 np0005465988 podman[356969]: 2025-10-02 13:24:38.519570164 +0000 UTC m=+0.060068677 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 09:24:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:24:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:39.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:24:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:39.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:41 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Oct  2 09:24:41 np0005465988 radosgw[82571]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Oct  2 09:24:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:24:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:41.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:24:41 np0005465988 nova_compute[236126]: 2025-10-02 13:24:41.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:24:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:24:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:41.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:24:41 np0005465988 nova_compute[236126]: 2025-10-02 13:24:41.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:42 np0005465988 nova_compute[236126]: 2025-10-02 13:24:42.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:24:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:24:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:43.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:24:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:24:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:43.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:24:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:24:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:24:45 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:24:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:24:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:45.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:24:45 np0005465988 nova_compute[236126]: 2025-10-02 13:24:45.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:24:45 np0005465988 nova_compute[236126]: 2025-10-02 13:24:45.519 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:24:45 np0005465988 nova_compute[236126]: 2025-10-02 13:24:45.519 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:24:45 np0005465988 nova_compute[236126]: 2025-10-02 13:24:45.520 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:24:45 np0005465988 nova_compute[236126]: 2025-10-02 13:24:45.520 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:24:45 np0005465988 nova_compute[236126]: 2025-10-02 13:24:45.520 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:24:45 np0005465988 podman[357123]: 2025-10-02 13:24:45.522216078 +0000 UTC m=+0.056859815 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 09:24:45 np0005465988 podman[357124]: 2025-10-02 13:24:45.523826174 +0000 UTC m=+0.057785141 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 09:24:45 np0005465988 podman[357122]: 2025-10-02 13:24:45.55975289 +0000 UTC m=+0.094320265 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct  2 09:24:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:24:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:45.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:24:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:24:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3288897796' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:24:45 np0005465988 nova_compute[236126]: 2025-10-02 13:24:45.966 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:24:46 np0005465988 nova_compute[236126]: 2025-10-02 13:24:46.125 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:24:46 np0005465988 nova_compute[236126]: 2025-10-02 13:24:46.127 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3974MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:24:46 np0005465988 nova_compute[236126]: 2025-10-02 13:24:46.128 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:24:46 np0005465988 nova_compute[236126]: 2025-10-02 13:24:46.128 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:24:46 np0005465988 nova_compute[236126]: 2025-10-02 13:24:46.244 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:24:46 np0005465988 nova_compute[236126]: 2025-10-02 13:24:46.244 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:24:46 np0005465988 nova_compute[236126]: 2025-10-02 13:24:46.269 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:24:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:24:46 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/467109670' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:24:46 np0005465988 nova_compute[236126]: 2025-10-02 13:24:46.708 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:24:46 np0005465988 nova_compute[236126]: 2025-10-02 13:24:46.714 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:24:46 np0005465988 nova_compute[236126]: 2025-10-02 13:24:46.753 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:24:46 np0005465988 nova_compute[236126]: 2025-10-02 13:24:46.756 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:24:46 np0005465988 nova_compute[236126]: 2025-10-02 13:24:46.756 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:24:46 np0005465988 nova_compute[236126]: 2025-10-02 13:24:46.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:47 np0005465988 nova_compute[236126]: 2025-10-02 13:24:47.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:47.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:24:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:47.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:24:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:49.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:24:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:24:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:49.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:24:50 np0005465988 nova_compute[236126]: 2025-10-02 13:24:50.758 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:24:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:51.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:51 np0005465988 nova_compute[236126]: 2025-10-02 13:24:51.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:24:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:51.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:51 np0005465988 nova_compute[236126]: 2025-10-02 13:24:51.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:52 np0005465988 nova_compute[236126]: 2025-10-02 13:24:52.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:24:52 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:24:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:24:52 np0005465988 nova_compute[236126]: 2025-10-02 13:24:52.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:24:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:53.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:53.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 09:24:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1555327155' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 09:24:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 09:24:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1555327155' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 09:24:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:24:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:55.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:24:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:55.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:56 np0005465988 nova_compute[236126]: 2025-10-02 13:24:56.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:24:56 np0005465988 nova_compute[236126]: 2025-10-02 13:24:56.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:24:56 np0005465988 nova_compute[236126]: 2025-10-02 13:24:56.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:57 np0005465988 nova_compute[236126]: 2025-10-02 13:24:57.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:24:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:57.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:24:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:24:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:24:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:57.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:24:58 np0005465988 nova_compute[236126]: 2025-10-02 13:24:58.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:24:58 np0005465988 nova_compute[236126]: 2025-10-02 13:24:58.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 09:24:58 np0005465988 nova_compute[236126]: 2025-10-02 13:24:58.504 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 09:24:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:24:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:24:59.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:24:59 np0005465988 nova_compute[236126]: 2025-10-02 13:24:59.504 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:24:59 np0005465988 nova_compute[236126]: 2025-10-02 13:24:59.505 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:24:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:24:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:24:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:24:59.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:00 np0005465988 nova_compute[236126]: 2025-10-02 13:25:00.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:25:00 np0005465988 nova_compute[236126]: 2025-10-02 13:25:00.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:25:00 np0005465988 nova_compute[236126]: 2025-10-02 13:25:00.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:25:00 np0005465988 nova_compute[236126]: 2025-10-02 13:25:00.648 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:25:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:01.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:25:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:01.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:25:01 np0005465988 nova_compute[236126]: 2025-10-02 13:25:01.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:02 np0005465988 nova_compute[236126]: 2025-10-02 13:25:02.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:25:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:03.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:03.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:05.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:25:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:05.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:25:06 np0005465988 nova_compute[236126]: 2025-10-02 13:25:06.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:07 np0005465988 nova_compute[236126]: 2025-10-02 13:25:07.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:07.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:25:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:07.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:09.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:09 np0005465988 nova_compute[236126]: 2025-10-02 13:25:09.435 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:25:09 np0005465988 podman[357343]: 2025-10-02 13:25:09.529738773 +0000 UTC m=+0.062262809 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 09:25:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:25:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:09.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:25:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:11.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:25:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:11.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:25:11 np0005465988 nova_compute[236126]: 2025-10-02 13:25:11.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:12 np0005465988 nova_compute[236126]: 2025-10-02 13:25:12.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:25:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:13.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:13 np0005465988 nova_compute[236126]: 2025-10-02 13:25:13.471 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:25:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:25:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:13.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:25:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:25:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:15.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:25:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:15.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:16 np0005465988 podman[357416]: 2025-10-02 13:25:16.536509145 +0000 UTC m=+0.061929119 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 09:25:16 np0005465988 podman[357417]: 2025-10-02 13:25:16.563760734 +0000 UTC m=+0.085889694 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct  2 09:25:16 np0005465988 podman[357415]: 2025-10-02 13:25:16.566287256 +0000 UTC m=+0.096734964 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  2 09:25:16 np0005465988 nova_compute[236126]: 2025-10-02 13:25:16.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:17 np0005465988 nova_compute[236126]: 2025-10-02 13:25:17.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:17.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:25:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:17.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:19.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:25:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:19.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:25:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:25:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:21.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:25:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:21.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:21 np0005465988 nova_compute[236126]: 2025-10-02 13:25:21.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:22 np0005465988 nova_compute[236126]: 2025-10-02 13:25:22.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:25:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:23.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:23.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:25.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:25:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:25.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:25:26 np0005465988 nova_compute[236126]: 2025-10-02 13:25:26.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:27 np0005465988 nova_compute[236126]: 2025-10-02 13:25:27.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:27.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:25:27.434 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:25:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:25:27.436 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:25:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:25:27.436 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:25:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:25:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:25:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:27.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:25:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:25:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:29.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:25:29 np0005465988 nova_compute[236126]: 2025-10-02 13:25:29.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:25:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:25:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:29.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:25:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:31.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:31.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:31 np0005465988 nova_compute[236126]: 2025-10-02 13:25:31.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:32 np0005465988 nova_compute[236126]: 2025-10-02 13:25:32.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:25:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:25:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:33.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:25:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:33.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:35.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:35 np0005465988 nova_compute[236126]: 2025-10-02 13:25:35.738 2 DEBUG oslo_concurrency.processutils [None req-5849a452-6b6a-4a68-83cf-321285824690 57823c2cf8a04b2abc574ed057efc3db 1533ac528d35434c826050eed402afba - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:25:35 np0005465988 nova_compute[236126]: 2025-10-02 13:25:35.798 2 DEBUG oslo_concurrency.processutils [None req-5849a452-6b6a-4a68-83cf-321285824690 57823c2cf8a04b2abc574ed057efc3db 1533ac528d35434c826050eed402afba - - default default] CMD "env LANG=C uptime" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:25:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:25:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:35.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:25:36 np0005465988 nova_compute[236126]: 2025-10-02 13:25:36.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:37 np0005465988 nova_compute[236126]: 2025-10-02 13:25:37.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:37.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:25:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:37.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:25:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:39.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:25:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:25:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:39.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:25:40 np0005465988 podman[357543]: 2025-10-02 13:25:40.533514701 +0000 UTC m=+0.055981440 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent)
Oct  2 09:25:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:41.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:25:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:41.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:25:41 np0005465988 nova_compute[236126]: 2025-10-02 13:25:41.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:42 np0005465988 nova_compute[236126]: 2025-10-02 13:25:42.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:25:42 np0005465988 nova_compute[236126]: 2025-10-02 13:25:42.536 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:25:42 np0005465988 nova_compute[236126]: 2025-10-02 13:25:42.537 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 09:25:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:25:42.775 142124 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=91, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e6:77:f5', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'f6:43:27:7a:f9:f1'}, ipsec=False) old=SB_Global(nb_cfg=90) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:25:42 np0005465988 nova_compute[236126]: 2025-10-02 13:25:42.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:42 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:25:42.777 142124 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:25:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:43.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:43 np0005465988 nova_compute[236126]: 2025-10-02 13:25:43.504 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:25:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:43.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:45.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:45 np0005465988 nova_compute[236126]: 2025-10-02 13:25:45.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:25:45 np0005465988 nova_compute[236126]: 2025-10-02 13:25:45.529 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:25:45 np0005465988 nova_compute[236126]: 2025-10-02 13:25:45.529 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:25:45 np0005465988 nova_compute[236126]: 2025-10-02 13:25:45.529 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:25:45 np0005465988 nova_compute[236126]: 2025-10-02 13:25:45.529 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:25:45 np0005465988 nova_compute[236126]: 2025-10-02 13:25:45.530 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:25:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:45.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:45 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:25:45 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/797989071' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:25:46 np0005465988 nova_compute[236126]: 2025-10-02 13:25:46.002 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:25:46 np0005465988 nova_compute[236126]: 2025-10-02 13:25:46.159 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:25:46 np0005465988 nova_compute[236126]: 2025-10-02 13:25:46.160 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3974MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:25:46 np0005465988 nova_compute[236126]: 2025-10-02 13:25:46.161 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:25:46 np0005465988 nova_compute[236126]: 2025-10-02 13:25:46.161 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:25:46 np0005465988 nova_compute[236126]: 2025-10-02 13:25:46.217 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:25:46 np0005465988 nova_compute[236126]: 2025-10-02 13:25:46.217 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:25:46 np0005465988 nova_compute[236126]: 2025-10-02 13:25:46.285 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:25:46 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:25:46 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3308627465' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:25:46 np0005465988 nova_compute[236126]: 2025-10-02 13:25:46.752 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:25:46 np0005465988 nova_compute[236126]: 2025-10-02 13:25:46.758 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:25:46 np0005465988 nova_compute[236126]: 2025-10-02 13:25:46.775 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:25:46 np0005465988 nova_compute[236126]: 2025-10-02 13:25:46.776 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:25:46 np0005465988 nova_compute[236126]: 2025-10-02 13:25:46.777 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:25:46 np0005465988 nova_compute[236126]: 2025-10-02 13:25:46.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:47 np0005465988 nova_compute[236126]: 2025-10-02 13:25:47.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:25:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:47.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:25:47 np0005465988 podman[357611]: 2025-10-02 13:25:47.553670606 +0000 UTC m=+0.090865166 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 09:25:47 np0005465988 podman[357612]: 2025-10-02 13:25:47.558756321 +0000 UTC m=+0.093953824 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 09:25:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:25:47 np0005465988 podman[357610]: 2025-10-02 13:25:47.577478646 +0000 UTC m=+0.118637549 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 09:25:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:47.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:49.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:49.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:51.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:51.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:51 np0005465988 nova_compute[236126]: 2025-10-02 13:25:51.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:52 np0005465988 nova_compute[236126]: 2025-10-02 13:25:52.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:25:52 np0005465988 podman[357847]: 2025-10-02 13:25:52.682201272 +0000 UTC m=+0.305153476 container exec 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Oct  2 09:25:52 np0005465988 nova_compute[236126]: 2025-10-02 13:25:52.772 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:25:52 np0005465988 nova_compute[236126]: 2025-10-02 13:25:52.772 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:25:52 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:25:52.779 142124 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90028908-5ebc-4bb4-8a1f-92ec79bb27aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '91'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:25:52 np0005465988 podman[357847]: 2025-10-02 13:25:52.827990845 +0000 UTC m=+0.450943049 container exec_died 6a8146c47c403b8de5691fa335025e708d58a4cea0cfae5e66fc0107fbae2c57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  2 09:25:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:53.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:53.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:53 np0005465988 podman[357986]: 2025-10-02 13:25:53.852104581 +0000 UTC m=+0.269573559 container exec 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 09:25:53 np0005465988 podman[358008]: 2025-10-02 13:25:53.923567932 +0000 UTC m=+0.052294924 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 09:25:53 np0005465988 podman[357986]: 2025-10-02 13:25:53.991810531 +0000 UTC m=+0.409279489 container exec_died 4568438c075030704fdfa5d8aaddd599c9c5707c282819f2822b716c2018daf4 (image=quay.io/ceph/haproxy:2.3, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-haproxy-rgw-default-compute-2-jycvzz)
Oct  2 09:25:54 np0005465988 podman[358104]: 2025-10-02 13:25:54.366087339 +0000 UTC m=+0.164431066 container exec 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, version=2.2.4, vcs-type=git, architecture=x86_64, distribution-scope=public, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph)
Oct  2 09:25:54 np0005465988 podman[358125]: 2025-10-02 13:25:54.456514411 +0000 UTC m=+0.062652110 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, com.redhat.component=keepalived-container, version=2.2.4, architecture=x86_64, name=keepalived, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.28.2, io.openshift.expose-services=, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793)
Oct  2 09:25:54 np0005465988 podman[358104]: 2025-10-02 13:25:54.464036866 +0000 UTC m=+0.262380583 container exec_died 6ade722ec10fbc804b29c502a4f2a506fef828a598e619c2f6258306279c45c4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-fd4c5763-22d1-50ea-ad0b-96a3dc3040b2-keepalived-rgw-default-compute-2-ahfyxt, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=keepalived for Ceph, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, release=1793, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=Ceph keepalived)
Oct  2 09:25:54 np0005465988 nova_compute[236126]: 2025-10-02 13:25:54.472 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:25:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:55.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:25:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:25:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:25:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:25:55 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:25:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:25:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:55.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:25:56 np0005465988 nova_compute[236126]: 2025-10-02 13:25:56.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:57 np0005465988 nova_compute[236126]: 2025-10-02 13:25:57.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:25:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:57.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:57 np0005465988 nova_compute[236126]: 2025-10-02 13:25:57.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:25:57 np0005465988 nova_compute[236126]: 2025-10-02 13:25:57.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:25:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:25:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:57.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:25:59.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:25:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:25:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:25:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:25:59.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:00 np0005465988 nova_compute[236126]: 2025-10-02 13:26:00.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:26:00 np0005465988 nova_compute[236126]: 2025-10-02 13:26:00.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:26:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:01.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:26:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:01.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:26:01 np0005465988 nova_compute[236126]: 2025-10-02 13:26:01.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #190. Immutable memtables: 0.
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.127268) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 121] Flushing memtable with next log file: 190
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411562127326, "job": 121, "event": "flush_started", "num_memtables": 1, "num_entries": 1439, "num_deletes": 251, "total_data_size": 3319642, "memory_usage": 3359248, "flush_reason": "Manual Compaction"}
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 121] Level-0 flush table #191: started
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411562169003, "cf_name": "default", "job": 121, "event": "table_file_creation", "file_number": 191, "file_size": 2179196, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 91706, "largest_seqno": 93140, "table_properties": {"data_size": 2173059, "index_size": 3399, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12937, "raw_average_key_size": 19, "raw_value_size": 2160817, "raw_average_value_size": 3339, "num_data_blocks": 150, "num_entries": 647, "num_filter_entries": 647, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759411437, "oldest_key_time": 1759411437, "file_creation_time": 1759411562, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 191, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 121] Flush lasted 41788 microseconds, and 5790 cpu microseconds.
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:26:02 np0005465988 nova_compute[236126]: 2025-10-02 13:26:02.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.169054) [db/flush_job.cc:967] [default] [JOB 121] Level-0 flush table #191: 2179196 bytes OK
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.169080) [db/memtable_list.cc:519] [default] Level-0 commit table #191 started
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.343183) [db/memtable_list.cc:722] [default] Level-0 commit table #191: memtable #1 done
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.343227) EVENT_LOG_v1 {"time_micros": 1759411562343217, "job": 121, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.343251) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 121] Try to delete WAL files size 3313026, prev total WAL file size 3313026, number of live WAL files 2.
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000187.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.344956) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038303332' seq:72057594037927935, type:22 .. '7061786F730038323834' seq:0, type:0; will stop at (end)
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 122] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 121 Base level 0, inputs: [191(2128KB)], [189(12MB)]
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411562345048, "job": 122, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [191], "files_L6": [189], "score": -1, "input_data_size": 14981688, "oldest_snapshot_seqno": -1}
Oct  2 09:26:02 np0005465988 nova_compute[236126]: 2025-10-02 13:26:02.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:26:02 np0005465988 nova_compute[236126]: 2025-10-02 13:26:02.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:26:02 np0005465988 nova_compute[236126]: 2025-10-02 13:26:02.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:26:02 np0005465988 nova_compute[236126]: 2025-10-02 13:26:02.495 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 122] Generated table #192: 11296 keys, 13035017 bytes, temperature: kUnknown
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411562498883, "cf_name": "default", "job": 122, "event": "table_file_creation", "file_number": 192, "file_size": 13035017, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12963991, "index_size": 41665, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28293, "raw_key_size": 299623, "raw_average_key_size": 26, "raw_value_size": 12768410, "raw_average_value_size": 1130, "num_data_blocks": 1576, "num_entries": 11296, "num_filter_entries": 11296, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759411562, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 192, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.499192) [db/compaction/compaction_job.cc:1663] [default] [JOB 122] Compacted 1@0 + 1@6 files to L6 => 13035017 bytes
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.551202) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 97.4 rd, 84.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 12.2 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(12.9) write-amplify(6.0) OK, records in: 11813, records dropped: 517 output_compression: NoCompression
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.551251) EVENT_LOG_v1 {"time_micros": 1759411562551233, "job": 122, "event": "compaction_finished", "compaction_time_micros": 153895, "compaction_time_cpu_micros": 44731, "output_level": 6, "num_output_files": 1, "total_output_size": 13035017, "num_input_records": 11813, "num_output_records": 11296, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000191.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411562552014, "job": 122, "event": "table_file_deletion", "file_number": 191}
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000189.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411562554944, "job": 122, "event": "table_file_deletion", "file_number": 189}
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.344666) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.555194) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.555217) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.555218) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.555220) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:26:02.555222) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:26:02 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:26:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:03.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:03.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:05.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:26:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:05.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:26:06 np0005465988 nova_compute[236126]: 2025-10-02 13:26:06.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:07 np0005465988 nova_compute[236126]: 2025-10-02 13:26:07.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:07.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:26:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:26:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:07.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:26:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:09.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:26:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:09.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:26:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:11.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:11 np0005465988 podman[358345]: 2025-10-02 13:26:11.588032058 +0000 UTC m=+0.103189648 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:26:11 np0005465988 nova_compute[236126]: 2025-10-02 13:26:11.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:11.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:12 np0005465988 nova_compute[236126]: 2025-10-02 13:26:12.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:26:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:13.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:13.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:15.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:15.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:16 np0005465988 nova_compute[236126]: 2025-10-02 13:26:16.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:17 np0005465988 nova_compute[236126]: 2025-10-02 13:26:17.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:17.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:26:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:17.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:18 np0005465988 podman[358419]: 2025-10-02 13:26:18.524210184 +0000 UTC m=+0.055288040 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 09:26:18 np0005465988 podman[358418]: 2025-10-02 13:26:18.546997105 +0000 UTC m=+0.080463679 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:26:18 np0005465988 podman[358417]: 2025-10-02 13:26:18.56330433 +0000 UTC m=+0.092598305 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:26:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:19.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:19.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:26:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:21.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:26:21 np0005465988 nova_compute[236126]: 2025-10-02 13:26:21.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:26:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:21.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:26:22 np0005465988 nova_compute[236126]: 2025-10-02 13:26:22.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:26:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:26:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:23.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:26:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:26:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:23.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:26:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:26:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:25.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:26:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:25.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:26 np0005465988 nova_compute[236126]: 2025-10-02 13:26:26.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:27 np0005465988 nova_compute[236126]: 2025-10-02 13:26:27.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:26:27.435 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:26:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:26:27.436 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:26:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:26:27.436 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:26:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:27.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:26:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:27.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:29.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:29.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:26:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:31.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:26:31 np0005465988 nova_compute[236126]: 2025-10-02 13:26:31.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:31.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:32 np0005465988 nova_compute[236126]: 2025-10-02 13:26:32.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:26:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:26:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:33.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:26:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:33.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:35.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:26:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:35.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:26:36 np0005465988 nova_compute[236126]: 2025-10-02 13:26:36.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:37 np0005465988 nova_compute[236126]: 2025-10-02 13:26:37.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:37.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:26:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:37.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:39.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:26:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:39.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:26:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:41.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:41 np0005465988 nova_compute[236126]: 2025-10-02 13:26:41.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:41.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:42 np0005465988 nova_compute[236126]: 2025-10-02 13:26:42.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:42 np0005465988 podman[358541]: 2025-10-02 13:26:42.522474913 +0000 UTC m=+0.059824779 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct  2 09:26:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:26:43 np0005465988 nova_compute[236126]: 2025-10-02 13:26:43.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:26:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:43.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:43.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:45.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:26:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:45.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:26:46 np0005465988 nova_compute[236126]: 2025-10-02 13:26:46.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:47 np0005465988 nova_compute[236126]: 2025-10-02 13:26:47.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:47 np0005465988 nova_compute[236126]: 2025-10-02 13:26:47.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:26:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:47.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:47 np0005465988 nova_compute[236126]: 2025-10-02 13:26:47.556 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:26:47 np0005465988 nova_compute[236126]: 2025-10-02 13:26:47.556 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:26:47 np0005465988 nova_compute[236126]: 2025-10-02 13:26:47.557 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:26:47 np0005465988 nova_compute[236126]: 2025-10-02 13:26:47.557 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:26:47 np0005465988 nova_compute[236126]: 2025-10-02 13:26:47.557 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:26:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:26:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:26:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:47.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:26:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:26:48 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1269393396' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:26:48 np0005465988 nova_compute[236126]: 2025-10-02 13:26:48.035 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:26:48 np0005465988 nova_compute[236126]: 2025-10-02 13:26:48.214 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:26:48 np0005465988 nova_compute[236126]: 2025-10-02 13:26:48.216 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3963MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:26:48 np0005465988 nova_compute[236126]: 2025-10-02 13:26:48.216 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:26:48 np0005465988 nova_compute[236126]: 2025-10-02 13:26:48.216 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:26:48 np0005465988 nova_compute[236126]: 2025-10-02 13:26:48.516 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:26:48 np0005465988 nova_compute[236126]: 2025-10-02 13:26:48.517 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:26:48 np0005465988 nova_compute[236126]: 2025-10-02 13:26:48.546 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 09:26:48 np0005465988 nova_compute[236126]: 2025-10-02 13:26:48.601 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 09:26:48 np0005465988 nova_compute[236126]: 2025-10-02 13:26:48.602 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 09:26:48 np0005465988 nova_compute[236126]: 2025-10-02 13:26:48.620 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 09:26:48 np0005465988 nova_compute[236126]: 2025-10-02 13:26:48.650 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 09:26:48 np0005465988 nova_compute[236126]: 2025-10-02 13:26:48.668 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:26:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:26:49 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2578479254' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:26:49 np0005465988 nova_compute[236126]: 2025-10-02 13:26:49.137 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:26:49 np0005465988 nova_compute[236126]: 2025-10-02 13:26:49.144 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:26:49 np0005465988 nova_compute[236126]: 2025-10-02 13:26:49.165 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:26:49 np0005465988 nova_compute[236126]: 2025-10-02 13:26:49.168 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:26:49 np0005465988 nova_compute[236126]: 2025-10-02 13:26:49.168 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.952s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:26:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:26:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:49.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:26:49 np0005465988 podman[358612]: 2025-10-02 13:26:49.538329626 +0000 UTC m=+0.063669720 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd)
Oct  2 09:26:49 np0005465988 podman[358611]: 2025-10-02 13:26:49.551029578 +0000 UTC m=+0.082131506 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 09:26:49 np0005465988 podman[358610]: 2025-10-02 13:26:49.58086489 +0000 UTC m=+0.114372077 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:26:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:49.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:51.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:51.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:51 np0005465988 nova_compute[236126]: 2025-10-02 13:26:51.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:52 np0005465988 nova_compute[236126]: 2025-10-02 13:26:52.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:26:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:26:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:53.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:26:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:53.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:54 np0005465988 nova_compute[236126]: 2025-10-02 13:26:54.169 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:26:54 np0005465988 nova_compute[236126]: 2025-10-02 13:26:54.170 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:26:55 np0005465988 nova_compute[236126]: 2025-10-02 13:26:55.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:26:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:55.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:26:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:55.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:26:56 np0005465988 nova_compute[236126]: 2025-10-02 13:26:56.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:57 np0005465988 nova_compute[236126]: 2025-10-02 13:26:57.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:26:57 np0005465988 nova_compute[236126]: 2025-10-02 13:26:57.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:26:57 np0005465988 nova_compute[236126]: 2025-10-02 13:26:57.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:26:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:57.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:26:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:26:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:57.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:26:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:26:59.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:26:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:26:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:26:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:26:59.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:00 np0005465988 nova_compute[236126]: 2025-10-02 13:27:00.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:27:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:27:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:01.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:27:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:27:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:01.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:27:01 np0005465988 nova_compute[236126]: 2025-10-02 13:27:01.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:02 np0005465988 nova_compute[236126]: 2025-10-02 13:27:02.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:02 np0005465988 nova_compute[236126]: 2025-10-02 13:27:02.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:27:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:27:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:03.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:27:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:27:03 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:27:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:27:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:03.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:27:04 np0005465988 nova_compute[236126]: 2025-10-02 13:27:04.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:27:04 np0005465988 nova_compute[236126]: 2025-10-02 13:27:04.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:27:04 np0005465988 nova_compute[236126]: 2025-10-02 13:27:04.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:27:04 np0005465988 nova_compute[236126]: 2025-10-02 13:27:04.537 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:27:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:27:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:05.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:27:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:05.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:06 np0005465988 nova_compute[236126]: 2025-10-02 13:27:06.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:07 np0005465988 nova_compute[236126]: 2025-10-02 13:27:07.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:27:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:07.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:27:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:27:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:27:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:07.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:27:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:27:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:09.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:27:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:09.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:27:10 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:27:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:27:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:11.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:27:11 np0005465988 nova_compute[236126]: 2025-10-02 13:27:11.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:27:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:11.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:27:12 np0005465988 nova_compute[236126]: 2025-10-02 13:27:12.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:27:12 np0005465988 nova_compute[236126]: 2025-10-02 13:27:12.858 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:27:13 np0005465988 podman[358918]: 2025-10-02 13:27:13.531780504 +0000 UTC m=+0.062068982 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  2 09:27:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:27:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:13.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:27:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:13.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:14 np0005465988 nova_compute[236126]: 2025-10-02 13:27:14.645 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:27:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:15.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:27:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:15.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:27:16 np0005465988 nova_compute[236126]: 2025-10-02 13:27:16.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:17 np0005465988 nova_compute[236126]: 2025-10-02 13:27:17.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:17.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:27:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:27:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:17.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:27:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:19.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:19.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:20 np0005465988 podman[358993]: 2025-10-02 13:27:20.574287087 +0000 UTC m=+0.065197773 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:27:20 np0005465988 podman[358994]: 2025-10-02 13:27:20.578029624 +0000 UTC m=+0.063742711 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 09:27:20 np0005465988 podman[358992]: 2025-10-02 13:27:20.628932047 +0000 UTC m=+0.122584271 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  2 09:27:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:21.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:21 np0005465988 nova_compute[236126]: 2025-10-02 13:27:21.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:27:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:21.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:27:22 np0005465988 nova_compute[236126]: 2025-10-02 13:27:22.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:27:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:27:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:23.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:27:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:23.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:25.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:25.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:26 np0005465988 nova_compute[236126]: 2025-10-02 13:27:26.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:27:27.436 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:27:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:27:27.437 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:27:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:27:27.437 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:27:27 np0005465988 nova_compute[236126]: 2025-10-02 13:27:27.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:27:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:27.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:27.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:27:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:29.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:27:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:29.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:31.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:31 np0005465988 nova_compute[236126]: 2025-10-02 13:27:31.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:31.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:32 np0005465988 nova_compute[236126]: 2025-10-02 13:27:32.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:27:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:27:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:33.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:27:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:33.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 09:27:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:35.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 09:27:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:35.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:36 np0005465988 nova_compute[236126]: 2025-10-02 13:27:36.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:37 np0005465988 nova_compute[236126]: 2025-10-02 13:27:37.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:27:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:37.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:37.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:39.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:27:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:39.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:27:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:27:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:41.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:27:41 np0005465988 nova_compute[236126]: 2025-10-02 13:27:41.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:42.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:42 np0005465988 nova_compute[236126]: 2025-10-02 13:27:42.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:27:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:43.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:44.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:44 np0005465988 nova_compute[236126]: 2025-10-02 13:27:44.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:27:44 np0005465988 podman[359117]: 2025-10-02 13:27:44.514262153 +0000 UTC m=+0.054338143 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 09:27:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:27:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:45.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:27:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:27:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:46.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:27:46 np0005465988 nova_compute[236126]: 2025-10-02 13:27:46.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:47 np0005465988 nova_compute[236126]: 2025-10-02 13:27:47.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:27:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:27:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:47.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:27:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:48.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:49 np0005465988 nova_compute[236126]: 2025-10-02 13:27:49.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:27:49 np0005465988 nova_compute[236126]: 2025-10-02 13:27:49.533 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:27:49 np0005465988 nova_compute[236126]: 2025-10-02 13:27:49.534 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:27:49 np0005465988 nova_compute[236126]: 2025-10-02 13:27:49.534 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:27:49 np0005465988 nova_compute[236126]: 2025-10-02 13:27:49.534 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:27:49 np0005465988 nova_compute[236126]: 2025-10-02 13:27:49.535 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:27:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:49.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:49 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:27:49 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2130006771' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:27:49 np0005465988 nova_compute[236126]: 2025-10-02 13:27:49.994 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:27:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:50.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:50 np0005465988 nova_compute[236126]: 2025-10-02 13:27:50.138 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:27:50 np0005465988 nova_compute[236126]: 2025-10-02 13:27:50.140 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3982MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:27:50 np0005465988 nova_compute[236126]: 2025-10-02 13:27:50.140 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:27:50 np0005465988 nova_compute[236126]: 2025-10-02 13:27:50.140 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:27:50 np0005465988 nova_compute[236126]: 2025-10-02 13:27:50.248 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:27:50 np0005465988 nova_compute[236126]: 2025-10-02 13:27:50.248 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:27:50 np0005465988 nova_compute[236126]: 2025-10-02 13:27:50.404 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:27:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:27:50 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1533256654' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:27:50 np0005465988 nova_compute[236126]: 2025-10-02 13:27:50.912 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:27:50 np0005465988 nova_compute[236126]: 2025-10-02 13:27:50.920 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:27:50 np0005465988 nova_compute[236126]: 2025-10-02 13:27:50.974 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:27:50 np0005465988 nova_compute[236126]: 2025-10-02 13:27:50.976 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:27:50 np0005465988 nova_compute[236126]: 2025-10-02 13:27:50.976 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.836s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:27:51 np0005465988 podman[359185]: 2025-10-02 13:27:51.515010794 +0000 UTC m=+0.050539444 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0)
Oct  2 09:27:51 np0005465988 podman[359186]: 2025-10-02 13:27:51.538102814 +0000 UTC m=+0.067444778 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 09:27:51 np0005465988 podman[359184]: 2025-10-02 13:27:51.546572065 +0000 UTC m=+0.083017801 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller)
Oct  2 09:27:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:51.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:51 np0005465988 nova_compute[236126]: 2025-10-02 13:27:51.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:27:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:52.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:27:52 np0005465988 nova_compute[236126]: 2025-10-02 13:27:52.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:27:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:53.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:53 np0005465988 nova_compute[236126]: 2025-10-02 13:27:53.976 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:27:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:27:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:54.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:27:54 np0005465988 nova_compute[236126]: 2025-10-02 13:27:54.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:27:55 np0005465988 nova_compute[236126]: 2025-10-02 13:27:55.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:27:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:55.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:56.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:56 np0005465988 nova_compute[236126]: 2025-10-02 13:27:56.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:57 np0005465988 nova_compute[236126]: 2025-10-02 13:27:57.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:27:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:27:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:57.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:27:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:27:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:27:58.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:27:59 np0005465988 nova_compute[236126]: 2025-10-02 13:27:59.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:27:59 np0005465988 nova_compute[236126]: 2025-10-02 13:27:59.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:27:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:27:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:27:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:27:59.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:00.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:28:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:01.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:28:01 np0005465988 nova_compute[236126]: 2025-10-02 13:28:01.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:02.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:02 np0005465988 nova_compute[236126]: 2025-10-02 13:28:02.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:28:02 np0005465988 nova_compute[236126]: 2025-10-02 13:28:02.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:28:03 np0005465988 nova_compute[236126]: 2025-10-02 13:28:03.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:28:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:03.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:04.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:04 np0005465988 nova_compute[236126]: 2025-10-02 13:28:04.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:28:04 np0005465988 nova_compute[236126]: 2025-10-02 13:28:04.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:28:04 np0005465988 nova_compute[236126]: 2025-10-02 13:28:04.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:28:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:05.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:06.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:06 np0005465988 nova_compute[236126]: 2025-10-02 13:28:06.380 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:28:06 np0005465988 nova_compute[236126]: 2025-10-02 13:28:06.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:07 np0005465988 nova_compute[236126]: 2025-10-02 13:28:07.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:28:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:07.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:28:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:08.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:28:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:28:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:09.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:28:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:10.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:28:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:28:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:28:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:28:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:28:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:28:11 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:28:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:28:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:11.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:28:11 np0005465988 nova_compute[236126]: 2025-10-02 13:28:11.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:12.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:12 np0005465988 nova_compute[236126]: 2025-10-02 13:28:12.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:28:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:13.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:28:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:14.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:28:15 np0005465988 podman[359465]: 2025-10-02 13:28:15.248260135 +0000 UTC m=+0.076946379 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  2 09:28:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:15.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:28:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:16.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:28:16 np0005465988 nova_compute[236126]: 2025-10-02 13:28:16.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:17 np0005465988 nova_compute[236126]: 2025-10-02 13:28:17.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:28:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:17.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:28:17 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:28:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:18.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:28:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:19.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:28:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:20.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:21.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:21 np0005465988 nova_compute[236126]: 2025-10-02 13:28:21.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:22.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:22 np0005465988 nova_compute[236126]: 2025-10-02 13:28:22.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:22 np0005465988 podman[359566]: 2025-10-02 13:28:22.547667655 +0000 UTC m=+0.074115468 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 09:28:22 np0005465988 podman[359565]: 2025-10-02 13:28:22.566222895 +0000 UTC m=+0.101868430 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 09:28:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:28:22 np0005465988 podman[359564]: 2025-10-02 13:28:22.590745035 +0000 UTC m=+0.128769398 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 09:28:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:23.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:24.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:25.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:26.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:27 np0005465988 nova_compute[236126]: 2025-10-02 13:28:26.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:28:27.438 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:28:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:28:27.439 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:28:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:28:27.439 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:28:27 np0005465988 nova_compute[236126]: 2025-10-02 13:28:27.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:28:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:27.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:28:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:28.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:28:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:29.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:28:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:30.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:28:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:31.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:32 np0005465988 nova_compute[236126]: 2025-10-02 13:28:32.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:32.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:32 np0005465988 nova_compute[236126]: 2025-10-02 13:28:32.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:28:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:33.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:34.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:35.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:28:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:36.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:28:37 np0005465988 nova_compute[236126]: 2025-10-02 13:28:37.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:37 np0005465988 nova_compute[236126]: 2025-10-02 13:28:37.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:28:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:37.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:28:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:38.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:28:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:39.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:40.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:41.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:42 np0005465988 nova_compute[236126]: 2025-10-02 13:28:42.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:42.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:42 np0005465988 nova_compute[236126]: 2025-10-02 13:28:42.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:28:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:43.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:44.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:44 np0005465988 nova_compute[236126]: 2025-10-02 13:28:44.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:28:45 np0005465988 podman[359690]: 2025-10-02 13:28:45.549091127 +0000 UTC m=+0.084874225 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 09:28:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:28:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:45.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:28:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:28:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:46.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:28:47 np0005465988 nova_compute[236126]: 2025-10-02 13:28:47.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:47 np0005465988 nova_compute[236126]: 2025-10-02 13:28:47.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:28:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:28:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:47.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:28:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:48.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:28:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:49.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:28:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:50.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:50 np0005465988 nova_compute[236126]: 2025-10-02 13:28:50.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:28:50 np0005465988 nova_compute[236126]: 2025-10-02 13:28:50.520 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:28:50 np0005465988 nova_compute[236126]: 2025-10-02 13:28:50.521 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:28:50 np0005465988 nova_compute[236126]: 2025-10-02 13:28:50.521 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:28:50 np0005465988 nova_compute[236126]: 2025-10-02 13:28:50.521 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:28:50 np0005465988 nova_compute[236126]: 2025-10-02 13:28:50.521 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:28:50 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:28:50 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1215553004' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:28:50 np0005465988 nova_compute[236126]: 2025-10-02 13:28:50.988 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:28:51 np0005465988 nova_compute[236126]: 2025-10-02 13:28:51.178 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:28:51 np0005465988 nova_compute[236126]: 2025-10-02 13:28:51.180 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3967MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:28:51 np0005465988 nova_compute[236126]: 2025-10-02 13:28:51.181 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:28:51 np0005465988 nova_compute[236126]: 2025-10-02 13:28:51.181 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:28:51 np0005465988 nova_compute[236126]: 2025-10-02 13:28:51.254 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:28:51 np0005465988 nova_compute[236126]: 2025-10-02 13:28:51.254 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:28:51 np0005465988 nova_compute[236126]: 2025-10-02 13:28:51.268 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:28:51 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:28:51 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1495290970' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:28:51 np0005465988 nova_compute[236126]: 2025-10-02 13:28:51.732 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:28:51 np0005465988 nova_compute[236126]: 2025-10-02 13:28:51.742 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:28:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:51.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:51 np0005465988 nova_compute[236126]: 2025-10-02 13:28:51.759 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:28:51 np0005465988 nova_compute[236126]: 2025-10-02 13:28:51.765 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:28:51 np0005465988 nova_compute[236126]: 2025-10-02 13:28:51.766 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:28:52 np0005465988 nova_compute[236126]: 2025-10-02 13:28:52.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:28:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:52.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:28:52 np0005465988 nova_compute[236126]: 2025-10-02 13:28:52.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:28:53 np0005465988 podman[359758]: 2025-10-02 13:28:53.525382577 +0000 UTC m=+0.060247952 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, config_id=iscsid)
Oct  2 09:28:53 np0005465988 podman[359759]: 2025-10-02 13:28:53.535281819 +0000 UTC m=+0.067185269 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 09:28:53 np0005465988 podman[359757]: 2025-10-02 13:28:53.561996402 +0000 UTC m=+0.101233052 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 09:28:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:53.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:54.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 09:28:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1013062954' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 09:28:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 09:28:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1013062954' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 09:28:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:55.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:55 np0005465988 nova_compute[236126]: 2025-10-02 13:28:55.766 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:28:55 np0005465988 nova_compute[236126]: 2025-10-02 13:28:55.767 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:28:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:56.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:56 np0005465988 nova_compute[236126]: 2025-10-02 13:28:56.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:28:57 np0005465988 nova_compute[236126]: 2025-10-02 13:28:57.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:57 np0005465988 nova_compute[236126]: 2025-10-02 13:28:57.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:28:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:28:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:28:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:57.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:28:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:28:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:28:58.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:28:59 np0005465988 nova_compute[236126]: 2025-10-02 13:28:59.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:28:59 np0005465988 nova_compute[236126]: 2025-10-02 13:28:59.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:28:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:28:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:28:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:28:59.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:00.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:29:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:01.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:29:02 np0005465988 nova_compute[236126]: 2025-10-02 13:29:02.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:02.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:02 np0005465988 nova_compute[236126]: 2025-10-02 13:29:02.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:29:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:03.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:04.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:04 np0005465988 nova_compute[236126]: 2025-10-02 13:29:04.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:29:05 np0005465988 nova_compute[236126]: 2025-10-02 13:29:05.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:29:05 np0005465988 nova_compute[236126]: 2025-10-02 13:29:05.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:29:05 np0005465988 nova_compute[236126]: 2025-10-02 13:29:05.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:29:05 np0005465988 nova_compute[236126]: 2025-10-02 13:29:05.498 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:29:05 np0005465988 nova_compute[236126]: 2025-10-02 13:29:05.499 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:29:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:05.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:06.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:07 np0005465988 nova_compute[236126]: 2025-10-02 13:29:07.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:07 np0005465988 nova_compute[236126]: 2025-10-02 13:29:07.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:29:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:07.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:08.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:09.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:10.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:11.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:12 np0005465988 nova_compute[236126]: 2025-10-02 13:29:12.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:12.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:12 np0005465988 nova_compute[236126]: 2025-10-02 13:29:12.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:29:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:13.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:14.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:15 np0005465988 podman[359902]: 2025-10-02 13:29:15.771793342 +0000 UTC m=+0.088992913 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 09:29:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:15.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:16.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:17 np0005465988 nova_compute[236126]: 2025-10-02 13:29:17.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:17 np0005465988 nova_compute[236126]: 2025-10-02 13:29:17.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:29:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:17.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:18.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct  2 09:29:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct  2 09:29:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:29:18 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:29:18 np0005465988 nova_compute[236126]: 2025-10-02 13:29:18.494 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:29:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct  2 09:29:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:29:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:29:19 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:29:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:19.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:20.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:21.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:22 np0005465988 nova_compute[236126]: 2025-10-02 13:29:22.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:22.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:22 np0005465988 nova_compute[236126]: 2025-10-02 13:29:22.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:29:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:23.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:24.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:24 np0005465988 podman[360204]: 2025-10-02 13:29:24.528502291 +0000 UTC m=+0.061331997 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 09:29:24 np0005465988 podman[360205]: 2025-10-02 13:29:24.555705865 +0000 UTC m=+0.084271908 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:29:24 np0005465988 podman[360203]: 2025-10-02 13:29:24.614831027 +0000 UTC m=+0.149423954 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 09:29:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:25.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:25 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:29:25 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:29:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:26.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:27 np0005465988 nova_compute[236126]: 2025-10-02 13:29:27.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:29:27.439 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:29:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:29:27.440 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:29:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:29:27.440 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:29:27 np0005465988 nova_compute[236126]: 2025-10-02 13:29:27.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:29:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:27.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:28.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:29.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:30.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:29:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:31.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:29:32 np0005465988 nova_compute[236126]: 2025-10-02 13:29:32.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:32.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:32 np0005465988 nova_compute[236126]: 2025-10-02 13:29:32.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:29:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:33.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:29:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:34.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:29:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:35.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:36.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:37 np0005465988 nova_compute[236126]: 2025-10-02 13:29:37.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:37 np0005465988 nova_compute[236126]: 2025-10-02 13:29:37.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:29:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:37.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:38.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:39.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:40.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #193. Immutable memtables: 0.
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.427285) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 123] Flushing memtable with next log file: 193
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411781427331, "job": 123, "event": "flush_started", "num_memtables": 1, "num_entries": 2353, "num_deletes": 251, "total_data_size": 5821213, "memory_usage": 5888368, "flush_reason": "Manual Compaction"}
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 123] Level-0 flush table #194: started
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411781451727, "cf_name": "default", "job": 123, "event": "table_file_creation", "file_number": 194, "file_size": 3807499, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 93145, "largest_seqno": 95493, "table_properties": {"data_size": 3798053, "index_size": 6003, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 19116, "raw_average_key_size": 20, "raw_value_size": 3779237, "raw_average_value_size": 4007, "num_data_blocks": 262, "num_entries": 943, "num_filter_entries": 943, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759411562, "oldest_key_time": 1759411562, "file_creation_time": 1759411781, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 194, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 123] Flush lasted 24501 microseconds, and 14162 cpu microseconds.
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.451783) [db/flush_job.cc:967] [default] [JOB 123] Level-0 flush table #194: 3807499 bytes OK
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.451810) [db/memtable_list.cc:519] [default] Level-0 commit table #194 started
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.453415) [db/memtable_list.cc:722] [default] Level-0 commit table #194: memtable #1 done
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.453431) EVENT_LOG_v1 {"time_micros": 1759411781453426, "job": 123, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.453452) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 123] Try to delete WAL files size 5811043, prev total WAL file size 5811043, number of live WAL files 2.
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000190.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.455305) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038323833' seq:72057594037927935, type:22 .. '7061786F730038353335' seq:0, type:0; will stop at (end)
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 124] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 123 Base level 0, inputs: [194(3718KB)], [192(12MB)]
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411781455405, "job": 124, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [194], "files_L6": [192], "score": -1, "input_data_size": 16842516, "oldest_snapshot_seqno": -1}
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 124] Generated table #195: 11720 keys, 14758165 bytes, temperature: kUnknown
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411781560839, "cf_name": "default", "job": 124, "event": "table_file_creation", "file_number": 195, "file_size": 14758165, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14683382, "index_size": 44398, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29317, "raw_key_size": 309061, "raw_average_key_size": 26, "raw_value_size": 14479326, "raw_average_value_size": 1235, "num_data_blocks": 1687, "num_entries": 11720, "num_filter_entries": 11720, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759411781, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 195, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.561115) [db/compaction/compaction_job.cc:1663] [default] [JOB 124] Compacted 1@0 + 1@6 files to L6 => 14758165 bytes
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.562841) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 159.6 rd, 139.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 12.4 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(8.3) write-amplify(3.9) OK, records in: 12239, records dropped: 519 output_compression: NoCompression
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.562897) EVENT_LOG_v1 {"time_micros": 1759411781562875, "job": 124, "event": "compaction_finished", "compaction_time_micros": 105511, "compaction_time_cpu_micros": 52529, "output_level": 6, "num_output_files": 1, "total_output_size": 14758165, "num_input_records": 12239, "num_output_records": 11720, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000194.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411781564404, "job": 124, "event": "table_file_deletion", "file_number": 194}
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000192.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411781568708, "job": 124, "event": "table_file_deletion", "file_number": 192}
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.455137) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.568927) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.568938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.568940) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.568942) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:29:41 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:29:41.568945) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:29:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:41.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:42 np0005465988 nova_compute[236126]: 2025-10-02 13:29:42.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:42.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:42 np0005465988 nova_compute[236126]: 2025-10-02 13:29:42.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:29:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:43.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:44.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:45 np0005465988 nova_compute[236126]: 2025-10-02 13:29:45.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:29:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:45.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:46.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:46 np0005465988 podman[360377]: 2025-10-02 13:29:46.520637746 +0000 UTC m=+0.058174726 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 09:29:47 np0005465988 nova_compute[236126]: 2025-10-02 13:29:47.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:47 np0005465988 nova_compute[236126]: 2025-10-02 13:29:47.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:29:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:47.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:48.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:49.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:50.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:51.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:52 np0005465988 nova_compute[236126]: 2025-10-02 13:29:52.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:52.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:52 np0005465988 nova_compute[236126]: 2025-10-02 13:29:52.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:29:52 np0005465988 nova_compute[236126]: 2025-10-02 13:29:52.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:29:52 np0005465988 nova_compute[236126]: 2025-10-02 13:29:52.623 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:29:52 np0005465988 nova_compute[236126]: 2025-10-02 13:29:52.624 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:29:52 np0005465988 nova_compute[236126]: 2025-10-02 13:29:52.625 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:29:52 np0005465988 nova_compute[236126]: 2025-10-02 13:29:52.625 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:29:52 np0005465988 nova_compute[236126]: 2025-10-02 13:29:52.626 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:29:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:29:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2326759658' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:29:53 np0005465988 nova_compute[236126]: 2025-10-02 13:29:53.105 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:29:53 np0005465988 nova_compute[236126]: 2025-10-02 13:29:53.284 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:29:53 np0005465988 nova_compute[236126]: 2025-10-02 13:29:53.285 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3971MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:29:53 np0005465988 nova_compute[236126]: 2025-10-02 13:29:53.286 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:29:53 np0005465988 nova_compute[236126]: 2025-10-02 13:29:53.286 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:29:53 np0005465988 nova_compute[236126]: 2025-10-02 13:29:53.522 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:29:53 np0005465988 nova_compute[236126]: 2025-10-02 13:29:53.523 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:29:53 np0005465988 nova_compute[236126]: 2025-10-02 13:29:53.720 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:29:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:53.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:29:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1388984387' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:29:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:54.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:54 np0005465988 nova_compute[236126]: 2025-10-02 13:29:54.221 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:29:54 np0005465988 nova_compute[236126]: 2025-10-02 13:29:54.228 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:29:54 np0005465988 nova_compute[236126]: 2025-10-02 13:29:54.376 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:29:54 np0005465988 nova_compute[236126]: 2025-10-02 13:29:54.378 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:29:54 np0005465988 nova_compute[236126]: 2025-10-02 13:29:54.379 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:29:55 np0005465988 podman[360447]: 2025-10-02 13:29:55.530288226 +0000 UTC m=+0.060671738 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0)
Oct  2 09:29:55 np0005465988 podman[360448]: 2025-10-02 13:29:55.558759876 +0000 UTC m=+0.076331229 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 09:29:55 np0005465988 podman[360446]: 2025-10-02 13:29:55.566463378 +0000 UTC m=+0.101356020 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:29:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:55.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:56.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:57 np0005465988 nova_compute[236126]: 2025-10-02 13:29:57.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:57 np0005465988 nova_compute[236126]: 2025-10-02 13:29:57.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:29:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:29:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:57.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:29:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:29:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:29:58.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:29:58 np0005465988 nova_compute[236126]: 2025-10-02 13:29:58.374 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:29:58 np0005465988 nova_compute[236126]: 2025-10-02 13:29:58.375 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:29:58 np0005465988 nova_compute[236126]: 2025-10-02 13:29:58.375 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:29:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:29:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:29:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:29:59.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:00.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:00 np0005465988 ceph-mon[76355]: overall HEALTH_OK
Oct  2 09:30:01 np0005465988 nova_compute[236126]: 2025-10-02 13:30:01.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:30:01 np0005465988 nova_compute[236126]: 2025-10-02 13:30:01.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:30:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:01.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:02 np0005465988 nova_compute[236126]: 2025-10-02 13:30:02.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:02.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:02 np0005465988 nova_compute[236126]: 2025-10-02 13:30:02.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:30:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:03.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:04.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:05 np0005465988 nova_compute[236126]: 2025-10-02 13:30:05.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:30:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:05.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:06.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:07 np0005465988 nova_compute[236126]: 2025-10-02 13:30:07.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:07 np0005465988 nova_compute[236126]: 2025-10-02 13:30:07.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:30:07 np0005465988 nova_compute[236126]: 2025-10-02 13:30:07.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:30:07 np0005465988 nova_compute[236126]: 2025-10-02 13:30:07.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:30:07 np0005465988 nova_compute[236126]: 2025-10-02 13:30:07.496 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:30:07 np0005465988 nova_compute[236126]: 2025-10-02 13:30:07.497 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:30:07 np0005465988 nova_compute[236126]: 2025-10-02 13:30:07.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:30:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:30:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:07.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:30:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:08.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:09 np0005465988 nova_compute[236126]: 2025-10-02 13:30:09.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:30:09 np0005465988 nova_compute[236126]: 2025-10-02 13:30:09.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 09:30:09 np0005465988 nova_compute[236126]: 2025-10-02 13:30:09.492 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 09:30:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:09.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:10.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:11 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:11 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:11 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:11.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:12 np0005465988 nova_compute[236126]: 2025-10-02 13:30:12.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:12.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:12 np0005465988 nova_compute[236126]: 2025-10-02 13:30:12.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:30:13 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:13 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:13 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:13.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:14.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:15 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:15 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:15 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:15.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:16.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:17 np0005465988 nova_compute[236126]: 2025-10-02 13:30:17.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:17 np0005465988 podman[360625]: 2025-10-02 13:30:17.53488396 +0000 UTC m=+0.062423319 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 09:30:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:30:17 np0005465988 nova_compute[236126]: 2025-10-02 13:30:17.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:17 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:17 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:17 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:17.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:30:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:18.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:30:19 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:19 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:19 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:19.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:20.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:21 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:21 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:21 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:21.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:22 np0005465988 nova_compute[236126]: 2025-10-02 13:30:22.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:22.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:30:22 np0005465988 nova_compute[236126]: 2025-10-02 13:30:22.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:23 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:23 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:23 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:23.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:24.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:25 np0005465988 podman[360749]: 2025-10-02 13:30:25.667348548 +0000 UTC m=+0.060422461 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:30:25 np0005465988 podman[360748]: 2025-10-02 13:30:25.686579442 +0000 UTC m=+0.086491642 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 09:30:25 np0005465988 podman[360747]: 2025-10-02 13:30:25.686651864 +0000 UTC m=+0.091902347 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  2 09:30:25 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:25 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:25 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:25.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:26.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:26 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:30:26 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:30:26 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:30:27 np0005465988 nova_compute[236126]: 2025-10-02 13:30:27.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:30:27.441 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:30:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:30:27.441 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:30:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:30:27.441 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:30:27 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:30:27 np0005465988 nova_compute[236126]: 2025-10-02 13:30:27.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:27 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:27 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:27 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:27.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:28.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:29 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:29 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:29 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:29.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:30.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:31 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:31 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:31 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:31.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:32 np0005465988 nova_compute[236126]: 2025-10-02 13:30:32.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:32.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:32 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:30:32 np0005465988 nova_compute[236126]: 2025-10-02 13:30:32.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:32 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:30:32 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:30:33 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:33 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:33 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:33.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:34.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:35 np0005465988 nova_compute[236126]: 2025-10-02 13:30:35.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:30:35 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:35 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:35 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:35.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:36.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:37 np0005465988 nova_compute[236126]: 2025-10-02 13:30:37.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:37 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:30:37 np0005465988 nova_compute[236126]: 2025-10-02 13:30:37.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:37 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:37 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:37 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:37.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:38.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:39 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:39 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:39 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:39.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:40.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:41 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:41 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:41 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:41.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:42 np0005465988 nova_compute[236126]: 2025-10-02 13:30:42.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:42.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:42 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:30:42 np0005465988 nova_compute[236126]: 2025-10-02 13:30:42.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #196. Immutable memtables: 0.
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.191132) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 125] Flushing memtable with next log file: 196
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411843191180, "job": 125, "event": "flush_started", "num_memtables": 1, "num_entries": 811, "num_deletes": 251, "total_data_size": 1474523, "memory_usage": 1491448, "flush_reason": "Manual Compaction"}
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 125] Level-0 flush table #197: started
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411843197479, "cf_name": "default", "job": 125, "event": "table_file_creation", "file_number": 197, "file_size": 635511, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 95498, "largest_seqno": 96304, "table_properties": {"data_size": 632308, "index_size": 1046, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8764, "raw_average_key_size": 20, "raw_value_size": 625466, "raw_average_value_size": 1478, "num_data_blocks": 46, "num_entries": 423, "num_filter_entries": 423, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759411783, "oldest_key_time": 1759411783, "file_creation_time": 1759411843, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 197, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 125] Flush lasted 6427 microseconds, and 3318 cpu microseconds.
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.197553) [db/flush_job.cc:967] [default] [JOB 125] Level-0 flush table #197: 635511 bytes OK
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.197583) [db/memtable_list.cc:519] [default] Level-0 commit table #197 started
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.199612) [db/memtable_list.cc:722] [default] Level-0 commit table #197: memtable #1 done
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.199651) EVENT_LOG_v1 {"time_micros": 1759411843199638, "job": 125, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.199683) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 125] Try to delete WAL files size 1470347, prev total WAL file size 1470347, number of live WAL files 2.
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000193.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.200566) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033323632' seq:72057594037927935, type:22 .. '6D6772737461740033353134' seq:0, type:0; will stop at (end)
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 126] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 125 Base level 0, inputs: [197(620KB)], [195(14MB)]
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411843200616, "job": 126, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [197], "files_L6": [195], "score": -1, "input_data_size": 15393676, "oldest_snapshot_seqno": -1}
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 126] Generated table #198: 11650 keys, 11880083 bytes, temperature: kUnknown
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411843283424, "cf_name": "default", "job": 126, "event": "table_file_creation", "file_number": 198, "file_size": 11880083, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11809843, "index_size": 39983, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29189, "raw_key_size": 307797, "raw_average_key_size": 26, "raw_value_size": 11611179, "raw_average_value_size": 996, "num_data_blocks": 1503, "num_entries": 11650, "num_filter_entries": 11650, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759411843, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 198, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.283817) [db/compaction/compaction_job.cc:1663] [default] [JOB 126] Compacted 1@0 + 1@6 files to L6 => 11880083 bytes
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.285232) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 185.6 rd, 143.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 14.1 +0.0 blob) out(11.3 +0.0 blob), read-write-amplify(42.9) write-amplify(18.7) OK, records in: 12143, records dropped: 493 output_compression: NoCompression
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.285252) EVENT_LOG_v1 {"time_micros": 1759411843285242, "job": 126, "event": "compaction_finished", "compaction_time_micros": 82949, "compaction_time_cpu_micros": 35869, "output_level": 6, "num_output_files": 1, "total_output_size": 11880083, "num_input_records": 12143, "num_output_records": 11650, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000197.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411843285541, "job": 126, "event": "table_file_deletion", "file_number": 197}
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000195.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411843288928, "job": 126, "event": "table_file_deletion", "file_number": 195}
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.200496) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.288966) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.288971) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.288973) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.288975) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:30:43 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:30:43.288976) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:30:43 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:43 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:43 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:43.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:44.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:44 np0005465988 nova_compute[236126]: 2025-10-02 13:30:44.496 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:30:44 np0005465988 nova_compute[236126]: 2025-10-02 13:30:44.497 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 09:30:45 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:45 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:45 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:45.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:46.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:47 np0005465988 nova_compute[236126]: 2025-10-02 13:30:47.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:47 np0005465988 nova_compute[236126]: 2025-10-02 13:30:47.491 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:30:47 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:30:47 np0005465988 nova_compute[236126]: 2025-10-02 13:30:47.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:47 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:47 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:47 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:47.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:48.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:48 np0005465988 podman[360958]: 2025-10-02 13:30:48.530497396 +0000 UTC m=+0.061659997 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 09:30:49 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:49 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:49 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:49.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:50.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:51 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:51 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:51 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:51.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:52 np0005465988 nova_compute[236126]: 2025-10-02 13:30:52.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:52.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:52 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:30:52 np0005465988 nova_compute[236126]: 2025-10-02 13:30:52.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:30:53 np0005465988 nova_compute[236126]: 2025-10-02 13:30:53.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:30:53 np0005465988 nova_compute[236126]: 2025-10-02 13:30:53.723 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:30:53 np0005465988 nova_compute[236126]: 2025-10-02 13:30:53.724 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:30:53 np0005465988 nova_compute[236126]: 2025-10-02 13:30:53.724 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:30:53 np0005465988 nova_compute[236126]: 2025-10-02 13:30:53.724 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:30:53 np0005465988 nova_compute[236126]: 2025-10-02 13:30:53.724 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:30:53 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:53 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:53 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:53.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:30:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3996399624' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:30:54 np0005465988 nova_compute[236126]: 2025-10-02 13:30:54.215 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:30:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:54.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:54 np0005465988 nova_compute[236126]: 2025-10-02 13:30:54.378 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:30:54 np0005465988 nova_compute[236126]: 2025-10-02 13:30:54.379 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3995MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:30:54 np0005465988 nova_compute[236126]: 2025-10-02 13:30:54.380 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:30:54 np0005465988 nova_compute[236126]: 2025-10-02 13:30:54.380 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:30:54 np0005465988 nova_compute[236126]: 2025-10-02 13:30:54.655 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 09:30:54 np0005465988 nova_compute[236126]: 2025-10-02 13:30:54.655 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 09:30:54 np0005465988 nova_compute[236126]: 2025-10-02 13:30:54.671 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 09:30:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:30:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2584588499' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:30:55 np0005465988 nova_compute[236126]: 2025-10-02 13:30:55.122 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 09:30:55 np0005465988 nova_compute[236126]: 2025-10-02 13:30:55.130 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 09:30:55 np0005465988 nova_compute[236126]: 2025-10-02 13:30:55.235 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 09:30:55 np0005465988 nova_compute[236126]: 2025-10-02 13:30:55.237 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 09:30:55 np0005465988 nova_compute[236126]: 2025-10-02 13:30:55.237 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.857s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:30:55 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:55 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:55 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:55.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:30:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:56.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:56 np0005465988 podman[361054]: 2025-10-02 13:30:56.531746647 +0000 UTC m=+0.070875943 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd)
Oct  2 09:30:56 np0005465988 podman[361053]: 2025-10-02 13:30:56.540330364 +0000 UTC m=+0.087770019 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 09:30:56 np0005465988 podman[361052]: 2025-10-02 13:30:56.579296946 +0000 UTC m=+0.129464509 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 09:30:57 np0005465988 nova_compute[236126]: 2025-10-02 13:30:57.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:30:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:30:57 np0005465988 nova_compute[236126]: 2025-10-02 13:30:57.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:30:57 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:57 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:57 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:57.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:58 np0005465988 nova_compute[236126]: 2025-10-02 13:30:58.237 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:30:58 np0005465988 nova_compute[236126]: 2025-10-02 13:30:58.238 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:30:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:30:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:30:58.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:30:59 np0005465988 nova_compute[236126]: 2025-10-02 13:30:59.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:30:59 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:30:59 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:30:59 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:30:59.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:31:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:00.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:31:01 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:01 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:01 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:01.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:02 np0005465988 nova_compute[236126]: 2025-10-02 13:31:02.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:31:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:02.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:02 np0005465988 nova_compute[236126]: 2025-10-02 13:31:02.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:31:02 np0005465988 nova_compute[236126]: 2025-10-02 13:31:02.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 09:31:02 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:31:02 np0005465988 nova_compute[236126]: 2025-10-02 13:31:02.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:31:03 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:03 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:03 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:03.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:04.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:05 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:05 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:05 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:05.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:06.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:06 np0005465988 nova_compute[236126]: 2025-10-02 13:31:06.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:31:07 np0005465988 nova_compute[236126]: 2025-10-02 13:31:07.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:31:07 np0005465988 nova_compute[236126]: 2025-10-02 13:31:07.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:31:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:31:07 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:07 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:07 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:07.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:31:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:08.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:31:09 np0005465988 nova_compute[236126]: 2025-10-02 13:31:09.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:31:09 np0005465988 nova_compute[236126]: 2025-10-02 13:31:09.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 09:31:09 np0005465988 nova_compute[236126]: 2025-10-02 13:31:09.476 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 09:31:09 np0005465988 nova_compute[236126]: 2025-10-02 13:31:09.534 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  2 09:31:09 np0005465988 nova_compute[236126]: 2025-10-02 13:31:09.534 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:31:09 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:09 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:09 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:09.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:10.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:12.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:12 np0005465988 nova_compute[236126]: 2025-10-02 13:31:12.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:31:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:12.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:12 np0005465988 nova_compute[236126]: 2025-10-02 13:31:12.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:31:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:31:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:31:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:14.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:31:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:31:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:14.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:31:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:16.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:16.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:17 np0005465988 nova_compute[236126]: 2025-10-02 13:31:17.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:31:17 np0005465988 nova_compute[236126]: 2025-10-02 13:31:17.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:31:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:31:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:18.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:31:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:31:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:18.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:18 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #199. Immutable memtables: 0.
Oct  2 09:31:18 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:18.579205) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:31:18 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 127] Flushing memtable with next log file: 199
Oct  2 09:31:18 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411878579304, "job": 127, "event": "flush_started", "num_memtables": 1, "num_entries": 597, "num_deletes": 256, "total_data_size": 917098, "memory_usage": 928216, "flush_reason": "Manual Compaction"}
Oct  2 09:31:18 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 127] Level-0 flush table #200: started
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411879193591, "cf_name": "default", "job": 127, "event": "table_file_creation", "file_number": 200, "file_size": 604704, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 96309, "largest_seqno": 96901, "table_properties": {"data_size": 601723, "index_size": 952, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 6919, "raw_average_key_size": 18, "raw_value_size": 595657, "raw_average_value_size": 1584, "num_data_blocks": 44, "num_entries": 376, "num_filter_entries": 376, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759411844, "oldest_key_time": 1759411844, "file_creation_time": 1759411878, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 200, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 127] Flush lasted 614428 microseconds, and 3750 cpu microseconds.
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.193648) [db/flush_job.cc:967] [default] [JOB 127] Level-0 flush table #200: 604704 bytes OK
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.193673) [db/memtable_list.cc:519] [default] Level-0 commit table #200 started
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.272974) [db/memtable_list.cc:722] [default] Level-0 commit table #200: memtable #1 done
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.273052) EVENT_LOG_v1 {"time_micros": 1759411879273036, "job": 127, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.273094) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 127] Try to delete WAL files size 913731, prev total WAL file size 913731, number of live WAL files 2.
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000196.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.274352) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373630' seq:72057594037927935, type:22 .. '6C6F676D0034303132' seq:0, type:0; will stop at (end)
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 128] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 127 Base level 0, inputs: [200(590KB)], [198(11MB)]
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411879274530, "job": 128, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [200], "files_L6": [198], "score": -1, "input_data_size": 12484787, "oldest_snapshot_seqno": -1}
Oct  2 09:31:19 np0005465988 podman[361206]: 2025-10-02 13:31:19.554307729 +0000 UTC m=+0.081863178 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 128] Generated table #201: 11502 keys, 12368302 bytes, temperature: kUnknown
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411879635662, "cf_name": "default", "job": 128, "event": "table_file_creation", "file_number": 201, "file_size": 12368302, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12298015, "index_size": 40414, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28805, "raw_key_size": 305648, "raw_average_key_size": 26, "raw_value_size": 12100743, "raw_average_value_size": 1052, "num_data_blocks": 1519, "num_entries": 11502, "num_filter_entries": 11502, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759411879, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 201, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.636093) [db/compaction/compaction_job.cc:1663] [default] [JOB 128] Compacted 1@0 + 1@6 files to L6 => 12368302 bytes
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.823562) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 34.6 rd, 34.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 11.3 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(41.1) write-amplify(20.5) OK, records in: 12026, records dropped: 524 output_compression: NoCompression
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.823632) EVENT_LOG_v1 {"time_micros": 1759411879823606, "job": 128, "event": "compaction_finished", "compaction_time_micros": 361260, "compaction_time_cpu_micros": 30390, "output_level": 6, "num_output_files": 1, "total_output_size": 12368302, "num_input_records": 12026, "num_output_records": 11502, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000200.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411879824058, "job": 128, "event": "table_file_deletion", "file_number": 200}
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000198.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759411879828085, "job": 128, "event": "table_file_deletion", "file_number": 198}
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.274234) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.828163) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.828170) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.828172) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.828174) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:31:19 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:31:19.828176) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:31:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:20.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:31:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:20.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:31:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:22.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:22 np0005465988 nova_compute[236126]: 2025-10-02 13:31:22.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:22.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:22 np0005465988 nova_compute[236126]: 2025-10-02 13:31:22.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:31:23 np0005465988 nova_compute[236126]: 2025-10-02 13:31:23.529 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:31:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:24.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:24.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:26.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:26.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:27 np0005465988 nova_compute[236126]: 2025-10-02 13:31:27.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:31:27.442 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:31:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:31:27.443 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:31:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:31:27.443 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:31:27 np0005465988 podman[361231]: 2025-10-02 13:31:27.525211347 +0000 UTC m=+0.058336161 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 09:31:27 np0005465988 podman[361232]: 2025-10-02 13:31:27.525898487 +0000 UTC m=+0.053875592 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  2 09:31:27 np0005465988 podman[361230]: 2025-10-02 13:31:27.544197084 +0000 UTC m=+0.081547409 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible)
Oct  2 09:31:27 np0005465988 nova_compute[236126]: 2025-10-02 13:31:27.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:31:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:28.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:28.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:30.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:30.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:32.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:32 np0005465988 nova_compute[236126]: 2025-10-02 13:31:32.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:31:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:32.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:31:32 np0005465988 nova_compute[236126]: 2025-10-02 13:31:32.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:31:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:31:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:31:33 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:31:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:34.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:34.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:31:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:36.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:31:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:31:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:36.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:31:37 np0005465988 nova_compute[236126]: 2025-10-02 13:31:37.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:37 np0005465988 nova_compute[236126]: 2025-10-02 13:31:37.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:31:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:38.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:31:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:38.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:31:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:40.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:40.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:41 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:31:41 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:31:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:42.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:42 np0005465988 nova_compute[236126]: 2025-10-02 13:31:42.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:42.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:42 np0005465988 nova_compute[236126]: 2025-10-02 13:31:42.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:31:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:31:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:44.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:31:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:44.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:31:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:46.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:31:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:46.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:47 np0005465988 nova_compute[236126]: 2025-10-02 13:31:47.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:47 np0005465988 nova_compute[236126]: 2025-10-02 13:31:47.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:31:47 np0005465988 nova_compute[236126]: 2025-10-02 13:31:47.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:31:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:31:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:48.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:31:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:31:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:48.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:31:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:50.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:50.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:50 np0005465988 podman[361535]: 2025-10-02 13:31:50.519390407 +0000 UTC m=+0.058686141 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 09:31:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:52.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:52 np0005465988 nova_compute[236126]: 2025-10-02 13:31:52.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:52.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:52 np0005465988 nova_compute[236126]: 2025-10-02 13:31:52.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:31:53 np0005465988 nova_compute[236126]: 2025-10-02 13:31:53.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:31:53 np0005465988 nova_compute[236126]: 2025-10-02 13:31:53.507 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:31:53 np0005465988 nova_compute[236126]: 2025-10-02 13:31:53.507 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:31:53 np0005465988 nova_compute[236126]: 2025-10-02 13:31:53.507 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:31:53 np0005465988 nova_compute[236126]: 2025-10-02 13:31:53.507 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:31:53 np0005465988 nova_compute[236126]: 2025-10-02 13:31:53.508 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:31:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:31:53 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1496869046' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:31:53 np0005465988 nova_compute[236126]: 2025-10-02 13:31:53.982 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:31:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:31:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:54.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.113 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.114 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3970MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.114 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.115 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.195 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.195 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.213 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing inventories for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.229 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating ProviderTree inventory for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.229 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Updating inventory in ProviderTree for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.251 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing aggregate associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.283 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Refreshing trait associations for resource provider 5abd2871-a992-42ab-bb6a-594a92f77d4d, traits: COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.306 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:31:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:31:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:54.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:31:54 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:31:54 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/956431986' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.775 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.782 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.796 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.798 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:31:54 np0005465988 nova_compute[236126]: 2025-10-02 13:31:54.798 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:31:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Oct  2 09:31:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/319318620' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct  2 09:31:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Oct  2 09:31:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/319318620' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct  2 09:31:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:56.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:31:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:56.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:31:57 np0005465988 nova_compute[236126]: 2025-10-02 13:31:57.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:57 np0005465988 nova_compute[236126]: 2025-10-02 13:31:57.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:31:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:31:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:31:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:31:58.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:31:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:31:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:31:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:31:58.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:31:58 np0005465988 podman[361656]: 2025-10-02 13:31:58.529930798 +0000 UTC m=+0.058043012 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 09:31:58 np0005465988 podman[361655]: 2025-10-02 13:31:58.556477813 +0000 UTC m=+0.084586787 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:31:58 np0005465988 podman[361654]: 2025-10-02 13:31:58.556507944 +0000 UTC m=+0.086663147 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:31:59 np0005465988 nova_compute[236126]: 2025-10-02 13:31:59.799 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:31:59 np0005465988 nova_compute[236126]: 2025-10-02 13:31:59.799 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:32:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:00.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:00.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:01 np0005465988 nova_compute[236126]: 2025-10-02 13:32:01.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:32:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:02.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:02 np0005465988 nova_compute[236126]: 2025-10-02 13:32:02.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:02.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:02 np0005465988 nova_compute[236126]: 2025-10-02 13:32:02.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:32:03 np0005465988 nova_compute[236126]: 2025-10-02 13:32:03.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:32:03 np0005465988 nova_compute[236126]: 2025-10-02 13:32:03.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:32:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:04.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:04.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:06.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:06.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:06 np0005465988 nova_compute[236126]: 2025-10-02 13:32:06.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:32:07 np0005465988 nova_compute[236126]: 2025-10-02 13:32:07.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:07 np0005465988 nova_compute[236126]: 2025-10-02 13:32:07.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:32:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:08.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:08.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:10.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:10.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:10 np0005465988 nova_compute[236126]: 2025-10-02 13:32:10.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:32:11 np0005465988 nova_compute[236126]: 2025-10-02 13:32:11.474 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:32:11 np0005465988 nova_compute[236126]: 2025-10-02 13:32:11.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:32:11 np0005465988 nova_compute[236126]: 2025-10-02 13:32:11.475 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:32:11 np0005465988 nova_compute[236126]: 2025-10-02 13:32:11.507 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:32:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:12.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:12 np0005465988 nova_compute[236126]: 2025-10-02 13:32:12.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:12.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:12 np0005465988 nova_compute[236126]: 2025-10-02 13:32:12.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:32:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:14.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:14.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:16.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:16.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:17 np0005465988 nova_compute[236126]: 2025-10-02 13:32:17.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:17 np0005465988 nova_compute[236126]: 2025-10-02 13:32:17.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:32:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:18.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:18.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:20.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:20.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:21 np0005465988 podman[361778]: 2025-10-02 13:32:21.527537943 +0000 UTC m=+0.056084526 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 09:32:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:22.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:22 np0005465988 nova_compute[236126]: 2025-10-02 13:32:22.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:22.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:22 np0005465988 nova_compute[236126]: 2025-10-02 13:32:22.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:32:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:24.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:24.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:26.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:26.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:27 np0005465988 nova_compute[236126]: 2025-10-02 13:32:27.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:32:27.442 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:32:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:32:27.443 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:32:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:32:27.443 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:32:27 np0005465988 nova_compute[236126]: 2025-10-02 13:32:27.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:32:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:28.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:28.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:29 np0005465988 podman[361804]: 2025-10-02 13:32:29.53954305 +0000 UTC m=+0.065946460 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 09:32:29 np0005465988 podman[361803]: 2025-10-02 13:32:29.563924603 +0000 UTC m=+0.094918435 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001)
Oct  2 09:32:29 np0005465988 podman[361805]: 2025-10-02 13:32:29.576555106 +0000 UTC m=+0.095466970 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd)
Oct  2 09:32:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:30.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:30.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:32.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:32 np0005465988 nova_compute[236126]: 2025-10-02 13:32:32.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:32.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:32 np0005465988 nova_compute[236126]: 2025-10-02 13:32:32.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:32:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:34.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:34.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:36.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:36.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:37 np0005465988 nova_compute[236126]: 2025-10-02 13:32:37.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:37 np0005465988 nova_compute[236126]: 2025-10-02 13:32:37.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:32:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:38.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:38.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:40.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:40.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:42.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:42 np0005465988 nova_compute[236126]: 2025-10-02 13:32:42.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:42.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:42 np0005465988 nova_compute[236126]: 2025-10-02 13:32:42.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:43 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:32:43 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:32:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:32:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:44.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:32:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:32:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:32:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:32:44 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:32:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:44.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:46.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:46.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:47 np0005465988 nova_compute[236126]: 2025-10-02 13:32:47.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:47 np0005465988 nova_compute[236126]: 2025-10-02 13:32:47.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:32:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:48.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:48.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:48 np0005465988 nova_compute[236126]: 2025-10-02 13:32:48.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:32:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:50.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:50.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:50 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:32:50 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:32:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:52.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:52 np0005465988 nova_compute[236126]: 2025-10-02 13:32:52.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:52.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:32:52 np0005465988 podman[362111]: 2025-10-02 13:32:52.528747117 +0000 UTC m=+0.062396198 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:32:52 np0005465988 nova_compute[236126]: 2025-10-02 13:32:52.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:32:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:54.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:54.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:54 np0005465988 nova_compute[236126]: 2025-10-02 13:32:54.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:32:54 np0005465988 nova_compute[236126]: 2025-10-02 13:32:54.541 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:32:54 np0005465988 nova_compute[236126]: 2025-10-02 13:32:54.542 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:32:54 np0005465988 nova_compute[236126]: 2025-10-02 13:32:54.542 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:32:54 np0005465988 nova_compute[236126]: 2025-10-02 13:32:54.542 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:32:54 np0005465988 nova_compute[236126]: 2025-10-02 13:32:54.543 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:32:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:32:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3166897853' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:32:55 np0005465988 nova_compute[236126]: 2025-10-02 13:32:55.021 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:32:55 np0005465988 nova_compute[236126]: 2025-10-02 13:32:55.213 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:32:55 np0005465988 nova_compute[236126]: 2025-10-02 13:32:55.214 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3958MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:32:55 np0005465988 nova_compute[236126]: 2025-10-02 13:32:55.214 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:32:55 np0005465988 nova_compute[236126]: 2025-10-02 13:32:55.215 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:32:55 np0005465988 nova_compute[236126]: 2025-10-02 13:32:55.292 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:32:55 np0005465988 nova_compute[236126]: 2025-10-02 13:32:55.293 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:32:55 np0005465988 nova_compute[236126]: 2025-10-02 13:32:55.401 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:32:55 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:32:55 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/731534150' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:32:55 np0005465988 nova_compute[236126]: 2025-10-02 13:32:55.890 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:32:55 np0005465988 nova_compute[236126]: 2025-10-02 13:32:55.898 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:32:55 np0005465988 nova_compute[236126]: 2025-10-02 13:32:55.935 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:32:55 np0005465988 nova_compute[236126]: 2025-10-02 13:32:55.938 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:32:55 np0005465988 nova_compute[236126]: 2025-10-02 13:32:55.939 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.724s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:32:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:56.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:56.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:57 np0005465988 nova_compute[236126]: 2025-10-02 13:32:57.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:57 np0005465988 nova_compute[236126]: 2025-10-02 13:32:57.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:32:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:32:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:32:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:32:58.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:32:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:32:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:32:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:32:58.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.002000058s ======
Oct  2 09:33:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:00.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Oct  2 09:33:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:00.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:00 np0005465988 podman[362231]: 2025-10-02 13:33:00.529700247 +0000 UTC m=+0.059255258 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 09:33:00 np0005465988 podman[362229]: 2025-10-02 13:33:00.551473314 +0000 UTC m=+0.086648857 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:33:00 np0005465988 podman[362230]: 2025-10-02 13:33:00.551536195 +0000 UTC m=+0.081860468 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 09:33:01 np0005465988 nova_compute[236126]: 2025-10-02 13:33:01.941 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:33:01 np0005465988 nova_compute[236126]: 2025-10-02 13:33:01.942 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:33:01 np0005465988 nova_compute[236126]: 2025-10-02 13:33:01.942 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:33:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:02.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:02 np0005465988 nova_compute[236126]: 2025-10-02 13:33:02.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:02.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:02 np0005465988 nova_compute[236126]: 2025-10-02 13:33:02.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:33:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:04.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:04 np0005465988 nova_compute[236126]: 2025-10-02 13:33:04.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:33:04 np0005465988 nova_compute[236126]: 2025-10-02 13:33:04.473 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:33:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:04.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:06.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:06.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:07 np0005465988 nova_compute[236126]: 2025-10-02 13:33:07.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:07 np0005465988 nova_compute[236126]: 2025-10-02 13:33:07.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:33:07 np0005465988 nova_compute[236126]: 2025-10-02 13:33:07.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:33:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:08.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:08.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:10.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:10 np0005465988 nova_compute[236126]: 2025-10-02 13:33:10.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:33:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:10.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:12.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:12 np0005465988 nova_compute[236126]: 2025-10-02 13:33:12.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:12 np0005465988 nova_compute[236126]: 2025-10-02 13:33:12.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:33:12 np0005465988 nova_compute[236126]: 2025-10-02 13:33:12.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:33:12 np0005465988 nova_compute[236126]: 2025-10-02 13:33:12.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:33:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:12.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:12 np0005465988 nova_compute[236126]: 2025-10-02 13:33:12.510 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:33:12 np0005465988 nova_compute[236126]: 2025-10-02 13:33:12.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:33:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:33:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:14.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:14.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:16.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:16.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:17 np0005465988 nova_compute[236126]: 2025-10-02 13:33:17.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:18 np0005465988 nova_compute[236126]: 2025-10-02 13:33:17.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:33:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:18.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:18.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:20.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:20.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:22.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:22 np0005465988 nova_compute[236126]: 2025-10-02 13:33:22.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:22.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:23 np0005465988 nova_compute[236126]: 2025-10-02 13:33:23.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #202. Immutable memtables: 0.
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.034011) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:856] [default] [JOB 129] Flushing memtable with next log file: 202
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759412003034070, "job": 129, "event": "flush_started", "num_memtables": 1, "num_entries": 1448, "num_deletes": 251, "total_data_size": 3325536, "memory_usage": 3370576, "flush_reason": "Manual Compaction"}
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:885] [default] [JOB 129] Level-0 flush table #203: started
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759412003051444, "cf_name": "default", "job": 129, "event": "table_file_creation", "file_number": 203, "file_size": 2171920, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 96906, "largest_seqno": 98349, "table_properties": {"data_size": 2165786, "index_size": 3396, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12988, "raw_average_key_size": 19, "raw_value_size": 2153502, "raw_average_value_size": 3313, "num_data_blocks": 151, "num_entries": 650, "num_filter_entries": 650, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759411879, "oldest_key_time": 1759411879, "file_creation_time": 1759412003, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 203, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 129] Flush lasted 17597 microseconds, and 11133 cpu microseconds.
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.051604) [db/flush_job.cc:967] [default] [JOB 129] Level-0 flush table #203: 2171920 bytes OK
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.051680) [db/memtable_list.cc:519] [default] Level-0 commit table #203 started
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.053260) [db/memtable_list.cc:722] [default] Level-0 commit table #203: memtable #1 done
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.053280) EVENT_LOG_v1 {"time_micros": 1759412003053273, "job": 129, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.053304) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 129] Try to delete WAL files size 3318881, prev total WAL file size 3318881, number of live WAL files 2.
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000199.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.054999) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038353334' seq:72057594037927935, type:22 .. '7061786F730038373836' seq:0, type:0; will stop at (end)
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 130] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 129 Base level 0, inputs: [203(2121KB)], [201(11MB)]
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759412003055062, "job": 130, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [203], "files_L6": [201], "score": -1, "input_data_size": 14540222, "oldest_snapshot_seqno": -1}
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 130] Generated table #204: 11635 keys, 12557795 bytes, temperature: kUnknown
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759412003145231, "cf_name": "default", "job": 130, "event": "table_file_creation", "file_number": 204, "file_size": 12557795, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12486512, "index_size": 41090, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29125, "raw_key_size": 309066, "raw_average_key_size": 26, "raw_value_size": 12286813, "raw_average_value_size": 1056, "num_data_blocks": 1544, "num_entries": 11635, "num_filter_entries": 11635, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759404809, "oldest_key_time": 0, "file_creation_time": 1759412003, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0f53a282-e834-4e21-912d-f4016b84b664", "db_session_id": "MN3GOSCTDLK2IE3HZQ1Z", "orig_file_number": 204, "seqno_to_time_mapping": "N/A"}}
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.145597) [db/compaction/compaction_job.cc:1663] [default] [JOB 130] Compacted 1@0 + 1@6 files to L6 => 12557795 bytes
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.147102) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 161.0 rd, 139.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 11.8 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(12.5) write-amplify(5.8) OK, records in: 12152, records dropped: 517 output_compression: NoCompression
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.147131) EVENT_LOG_v1 {"time_micros": 1759412003147118, "job": 130, "event": "compaction_finished", "compaction_time_micros": 90323, "compaction_time_cpu_micros": 46688, "output_level": 6, "num_output_files": 1, "total_output_size": 12557795, "num_input_records": 12152, "num_output_records": 11635, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000203.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759412003148273, "job": 130, "event": "table_file_deletion", "file_number": 203}
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000201.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759412003151727, "job": 130, "event": "table_file_deletion", "file_number": 201}
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.054911) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.151859) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.151865) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.151867) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.151868) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:33:23 np0005465988 ceph-mon[76355]: rocksdb: (Original Log Time 2025/10/02-13:33:23.151870) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct  2 09:33:23 np0005465988 podman[362357]: 2025-10-02 13:33:23.513951281 +0000 UTC m=+0.050974439 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:33:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:33:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:24.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:24.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:33:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:26.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:26 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:26 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:26 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:26.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:27 np0005465988 nova_compute[236126]: 2025-10-02 13:33:27.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:33:27.444 142124 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:33:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:33:27.444 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:33:27 np0005465988 ovn_metadata_agent[142119]: 2025-10-02 13:33:27.444 142124 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:33:28 np0005465988 nova_compute[236126]: 2025-10-02 13:33:28.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:28 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:33:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:33:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:28.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:28 np0005465988 nova_compute[236126]: 2025-10-02 13:33:28.505 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:33:28 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:28 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:33:28 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:28.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:33:29 np0005465988 ceph-mon[76355]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.0 total, 600.0 interval#012Cumulative writes: 19K writes, 98K keys, 19K commit groups, 1.0 writes per commit group, ingest: 0.19 GB, 0.03 MB/s#012Cumulative WAL: 19K writes, 19K syncs, 1.00 writes per sync, written: 0.19 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1486 writes, 7229 keys, 1486 commit groups, 1.0 writes per commit group, ingest: 14.98 MB, 0.02 MB/s#012Interval WAL: 1486 writes, 1486 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     44.9      2.70              0.43        65    0.042       0      0       0.0       0.0#012  L6      1/0   11.98 MB   0.0      0.8     0.1      0.7       0.7      0.0       0.0   5.6    110.7     95.4      7.12              2.33        64    0.111    530K    34K       0.0       0.0#012 Sum      1/0   11.98 MB   0.0      0.8     0.1      0.7       0.8      0.1       0.0   6.6     80.3     81.5      9.82              2.76       129    0.076    530K    34K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.0     50.9     51.6      1.63              0.30        12    0.136     72K   3095       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.8     0.1      0.7       0.7      0.0       0.0   0.0    110.7     95.4      7.12              2.33        64    0.111    530K    34K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     45.0      2.70              0.43        64    0.042       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 7200.0 total, 600.0 interval#012Flush(GB): cumulative 0.119, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.78 GB write, 0.11 MB/s write, 0.77 GB read, 0.11 MB/s read, 9.8 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.08 GB read, 0.14 MB/s read, 1.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b3e97ad1f0#2 capacity: 304.00 MB usage: 86.50 MB table_size: 0 occupancy: 18446744073709551615 collections: 13 last_copies: 0 last_secs: 0.000776 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(5322,82.76 MB,27.2226%) FilterBlock(129,1.42 MB,0.46848%) IndexBlock(129,2.32 MB,0.762889%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct  2 09:33:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:33:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:30.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:30 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:30 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:30 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:30.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:31 np0005465988 podman[362381]: 2025-10-02 13:33:31.536585935 +0000 UTC m=+0.060865294 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 09:33:31 np0005465988 podman[362380]: 2025-10-02 13:33:31.546296695 +0000 UTC m=+0.084572287 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 09:33:31 np0005465988 podman[362382]: 2025-10-02 13:33:31.555215672 +0000 UTC m=+0.082783635 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 09:33:32 np0005465988 nova_compute[236126]: 2025-10-02 13:33:32.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:33:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:32.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:32 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:32 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:33:32 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:32.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:33 np0005465988 nova_compute[236126]: 2025-10-02 13:33:33.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:33 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:33:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:33:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:34.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:34 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:34 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:33:34 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:34.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:36.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:36 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:36 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:33:36 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:36.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:33:37 np0005465988 nova_compute[236126]: 2025-10-02 13:33:37.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:38 np0005465988 nova_compute[236126]: 2025-10-02 13:33:38.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:38 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:33:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:38.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:38 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:38 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:38 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:38.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:40.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:40 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:40 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:40 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:40.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:42 np0005465988 nova_compute[236126]: 2025-10-02 13:33:42.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:42.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:42 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:42 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:33:42 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:42.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:43 np0005465988 nova_compute[236126]: 2025-10-02 13:33:43.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:43 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:33:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:44.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:44 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:44 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:44 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:44.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:46.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:46 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:46 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:46 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:46.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:47 np0005465988 nova_compute[236126]: 2025-10-02 13:33:47.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:48 np0005465988 nova_compute[236126]: 2025-10-02 13:33:48.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:48 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:33:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:48.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:48 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:48 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:33:48 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:48.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:33:49 np0005465988 nova_compute[236126]: 2025-10-02 13:33:49.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:33:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:50.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:50 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:50 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:33:50 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:50.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct  2 09:33:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:33:51 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct  2 09:33:52 np0005465988 nova_compute[236126]: 2025-10-02 13:33:52.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:52.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:52 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:52 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:52 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:52.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:53 np0005465988 nova_compute[236126]: 2025-10-02 13:33:53.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:53 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:33:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:54.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:54 np0005465988 podman[362637]: 2025-10-02 13:33:54.532860275 +0000 UTC m=+0.062076459 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 09:33:54 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:54 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:33:54 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:54.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:56.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:56 np0005465988 nova_compute[236126]: 2025-10-02 13:33:56.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:33:56 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:56 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:56 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:56.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:33:56 np0005465988 nova_compute[236126]: 2025-10-02 13:33:56.585 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:33:56 np0005465988 nova_compute[236126]: 2025-10-02 13:33:56.586 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:33:56 np0005465988 nova_compute[236126]: 2025-10-02 13:33:56.586 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:33:56 np0005465988 nova_compute[236126]: 2025-10-02 13:33:56.586 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:33:56 np0005465988 nova_compute[236126]: 2025-10-02 13:33:56.586 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:33:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:33:57 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2526761597' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:33:57 np0005465988 nova_compute[236126]: 2025-10-02 13:33:57.051 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:33:57 np0005465988 nova_compute[236126]: 2025-10-02 13:33:57.206 2 WARNING nova.virt.libvirt.driver [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:33:57 np0005465988 nova_compute[236126]: 2025-10-02 13:33:57.208 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3963MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:33:57 np0005465988 nova_compute[236126]: 2025-10-02 13:33:57.208 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:33:57 np0005465988 nova_compute[236126]: 2025-10-02 13:33:57.208 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:33:57 np0005465988 nova_compute[236126]: 2025-10-02 13:33:57.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:57 np0005465988 nova_compute[236126]: 2025-10-02 13:33:57.349 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:33:57 np0005465988 nova_compute[236126]: 2025-10-02 13:33:57.349 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:33:57 np0005465988 nova_compute[236126]: 2025-10-02 13:33:57.416 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 09:33:57 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Oct  2 09:33:57 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2473710781' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct  2 09:33:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:33:57 np0005465988 ceph-mon[76355]: from='mgr.14132 192.168.122.100:0/4553957' entity='mgr.compute-0.fmcstn' 
Oct  2 09:33:57 np0005465988 nova_compute[236126]: 2025-10-02 13:33:57.909 2 DEBUG oslo_concurrency.processutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 09:33:57 np0005465988 nova_compute[236126]: 2025-10-02 13:33:57.915 2 DEBUG nova.compute.provider_tree [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed in ProviderTree for provider: 5abd2871-a992-42ab-bb6a-594a92f77d4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:33:58 np0005465988 nova_compute[236126]: 2025-10-02 13:33:58.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:33:58 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:33:58 np0005465988 nova_compute[236126]: 2025-10-02 13:33:58.086 2 DEBUG nova.scheduler.client.report [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Inventory has not changed for provider 5abd2871-a992-42ab-bb6a-594a92f77d4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:33:58 np0005465988 nova_compute[236126]: 2025-10-02 13:33:58.089 2 DEBUG nova.compute.resource_tracker [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:33:58 np0005465988 nova_compute[236126]: 2025-10-02 13:33:58.089 2 DEBUG oslo_concurrency.lockutils [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.881s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:33:58 np0005465988 systemd-logind[827]: New session 62 of user zuul.
Oct  2 09:33:58 np0005465988 systemd[1]: Started Session 62 of User zuul.
Oct  2 09:33:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:33:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:33:58.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:33:58 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:33:58 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:33:58 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:33:58.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:34:00.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:00 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:00 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:00 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:34:00.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:01 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Oct  2 09:34:01 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3741218328' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct  2 09:34:02 np0005465988 nova_compute[236126]: 2025-10-02 13:34:02.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:34:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:34:02.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:02 np0005465988 podman[363060]: 2025-10-02 13:34:02.533074384 +0000 UTC m=+0.065487157 container health_status 52445b8500d0096562f8bf9d764fab9e891d26517b640925ae3bb3d1c71d7017 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid)
Oct  2 09:34:02 np0005465988 podman[363057]: 2025-10-02 13:34:02.564770237 +0000 UTC m=+0.098061815 container health_status 3c1d9521c8ebc48472b4136130b7ecdbd4fa296045a8183544c1da7f59125cb0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 09:34:02 np0005465988 podman[363061]: 2025-10-02 13:34:02.564847849 +0000 UTC m=+0.095176412 container health_status 91d721052aefc9299d16b5071c55aef53a4da0e89f8cead222325fa8663252c0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct  2 09:34:02 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:02 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:34:02 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:34:02.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:34:02 np0005465988 ceph-mgr[76715]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3158772141
Oct  2 09:34:03 np0005465988 nova_compute[236126]: 2025-10-02 13:34:03.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:34:03 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:34:03 np0005465988 nova_compute[236126]: 2025-10-02 13:34:03.091 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:34:03 np0005465988 nova_compute[236126]: 2025-10-02 13:34:03.092 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:34:03 np0005465988 nova_compute[236126]: 2025-10-02 13:34:03.469 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:34:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:34:04.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:04 np0005465988 nova_compute[236126]: 2025-10-02 13:34:04.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:34:04 np0005465988 nova_compute[236126]: 2025-10-02 13:34:04.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:34:04 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:04 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000028s ======
Oct  2 09:34:04 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:34:04.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Oct  2 09:34:05 np0005465988 ovs-vsctl[363152]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct  2 09:34:06 np0005465988 virtqemud[235689]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct  2 09:34:06 np0005465988 virtqemud[235689]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct  2 09:34:06 np0005465988 virtqemud[235689]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  2 09:34:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:34:06.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:34:06 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7201.0 total, 600.0 interval#012Cumulative writes: 80K writes, 322K keys, 80K commit groups, 1.0 writes per commit group, ingest: 0.32 GB, 0.05 MB/s#012Cumulative WAL: 80K writes, 29K syncs, 2.68 writes per sync, written: 0.32 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 478 writes, 740 keys, 478 commit groups, 1.0 writes per commit group, ingest: 0.24 MB, 0.00 MB/s#012Interval WAL: 478 writes, 235 syncs, 2.03 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct  2 09:34:06 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:06 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:06 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:34:06.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:06 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: cache status {prefix=cache status} (starting...)
Oct  2 09:34:06 np0005465988 lvm[363472]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct  2 09:34:06 np0005465988 lvm[363472]: VG ceph_vg0 finished
Oct  2 09:34:06 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: client ls {prefix=client ls} (starting...)
Oct  2 09:34:07 np0005465988 nova_compute[236126]: 2025-10-02 13:34:07.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:34:07 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: damage ls {prefix=damage ls} (starting...)
Oct  2 09:34:07 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Oct  2 09:34:07 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1711372420' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct  2 09:34:07 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: dump loads {prefix=dump loads} (starting...)
Oct  2 09:34:07 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Oct  2 09:34:08 np0005465988 nova_compute[236126]: 2025-10-02 13:34:08.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:34:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:34:08 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Oct  2 09:34:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Oct  2 09:34:08 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1601998562' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct  2 09:34:08 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Oct  2 09:34:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:34:08.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:08 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Oct  2 09:34:08 np0005465988 nova_compute[236126]: 2025-10-02 13:34:08.475 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:34:08 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:08 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:34:08 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:34:08.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:34:08 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Oct  2 09:34:08 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/397104129' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Oct  2 09:34:08 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Oct  2 09:34:08 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: get subtrees {prefix=get subtrees} (starting...)
Oct  2 09:34:09 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: ops {prefix=ops} (starting...)
Oct  2 09:34:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Oct  2 09:34:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1966899698' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Oct  2 09:34:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Oct  2 09:34:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/248155687' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Oct  2 09:34:09 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Oct  2 09:34:09 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/944630855' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct  2 09:34:09 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: session ls {prefix=session ls} (starting...)
Oct  2 09:34:10 np0005465988 ceph-mds[84851]: mds.cephfs.compute-2.gpiyct asok_command: status {prefix=status} (starting...)
Oct  2 09:34:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Oct  2 09:34:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4177335310' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct  2 09:34:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:34:10.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Oct  2 09:34:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2849923825' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct  2 09:34:10 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:10 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:10 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:34:10.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Oct  2 09:34:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1389248408' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct  2 09:34:10 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Oct  2 09:34:10 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1110813031' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Oct  2 09:34:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Oct  2 09:34:11 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3111688935' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct  2 09:34:11 np0005465988 nova_compute[236126]: 2025-10-02 13:34:11.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:34:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Oct  2 09:34:11 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2736873891' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Oct  2 09:34:11 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Oct  2 09:34:11 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3850657111' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Oct  2 09:34:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Oct  2 09:34:12 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1111808518' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct  2 09:34:12 np0005465988 nova_compute[236126]: 2025-10-02 13:34:12.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:34:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Oct  2 09:34:12 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4134743442' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Oct  2 09:34:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:34:12.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:12 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:12 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.003000086s ======
Oct  2 09:34:12 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:34:12.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000086s
Oct  2 09:34:12 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Oct  2 09:34:12 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1708908358' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct  2 09:34:13 np0005465988 nova_compute[236126]: 2025-10-02 13:34:13.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:34:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:34:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Oct  2 09:34:13 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1670926851' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct  2 09:34:13 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Oct  2 09:34:13 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/206892401' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546562048 unmapped: 59850752 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5970630 data_alloc: 251658240 data_used: 45506560
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19497a000/0x0/0x1bfc00000, data 0x6b9dcd9/0x6db4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546570240 unmapped: 59842560 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546570240 unmapped: 59842560 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546570240 unmapped: 59842560 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546570240 unmapped: 59842560 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.029835701s of 10.683382988s, submitted: 47
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19497a000/0x0/0x1bfc00000, data 0x6b9dcd9/0x6db4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [0,0,0,0,0,0,1])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49cc00 session 0x56127e613c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127eb1d800 session 0x5612813e7860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546578432 unmapped: 59834368 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5969194 data_alloc: 251658240 data_used: 45506560
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546586624 unmapped: 59826176 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546619392 unmapped: 59793408 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1956f0000/0x0/0x1bfc00000, data 0x5e27c77/0x603d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [0,0,0,0,0,0,1])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e9bdc00 session 0x56127f2bbe00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5813740 data_alloc: 251658240 data_used: 40177664
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195713000/0x0/0x1bfc00000, data 0x5e06c54/0x601b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195713000/0x0/0x1bfc00000, data 0x5e06c54/0x601b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195713000/0x0/0x1bfc00000, data 0x5e06c54/0x601b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5814104 data_alloc: 251658240 data_used: 40185856
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.736224651s of 11.216956139s, submitted: 64
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea38400 session 0x56127ea16d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49c800 session 0x56127f33f4a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 59777024 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02c400 session 0x56128329f680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196e23000/0x0/0x1bfc00000, data 0x434dbe2/0x4560000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543047680 unmapped: 63365120 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196e23000/0x0/0x1bfc00000, data 0x434dbe2/0x4560000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543047680 unmapped: 63365120 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196e4d000/0x0/0x1bfc00000, data 0x4323bbf/0x4535000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543047680 unmapped: 63365120 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543047680 unmapped: 63365120 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5507754 data_alloc: 234881024 data_used: 26574848
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543047680 unmapped: 63365120 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543055872 unmapped: 63356928 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543055872 unmapped: 63356928 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128056e400 session 0x561280c0e000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543055872 unmapped: 63356928 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196e4d000/0x0/0x1bfc00000, data 0x4323bbf/0x4535000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543064064 unmapped: 63348736 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5507622 data_alloc: 234881024 data_used: 26574848
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196e4d000/0x0/0x1bfc00000, data 0x4323bbf/0x4535000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543064064 unmapped: 63348736 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.698352814s of 11.074827194s, submitted: 53
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea37400 session 0x56127f3df0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289ecb800 session 0x561280b0a5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543064064 unmapped: 63348736 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1971f9000/0x0/0x1bfc00000, data 0x4323bbf/0x4535000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543064064 unmapped: 63348736 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543072256 unmapped: 63340544 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5240353 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289ecb800 session 0x561280f80000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198a74000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5240353 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5401.0 total, 600.0 interval#012Cumulative writes: 72K writes, 293K keys, 72K commit groups, 1.0 writes per commit group, ingest: 0.29 GB, 0.06 MB/s#012Cumulative WAL: 72K writes, 26K syncs, 2.71 writes per sync, written: 0.29 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5949 writes, 24K keys, 5949 commit groups, 1.0 writes per commit group, ingest: 24.13 MB, 0.04 MB/s#012Interval WAL: 5949 writes, 2301 syncs, 2.59 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198a74000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5240353 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: mgrc ms_handle_reset ms_handle_reset con 0x56127e5c1400
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3158772141
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3158772141,v1:192.168.122.100:6801/3158772141]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: mgrc handle_mgr_configure stats_period=5
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198a74000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538673152 unmapped: 67739648 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198a74000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538681344 unmapped: 67731456 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538681344 unmapped: 67731456 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5240353 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198a74000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538681344 unmapped: 67731456 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538681344 unmapped: 67731456 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198a74000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538689536 unmapped: 67723264 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538689536 unmapped: 67723264 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198a74000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612862acc00 session 0x561285cd0960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128056ec00 session 0x56128127f4a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612862ad800 session 0x561280b0a000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612834c6400 session 0x56127e8c7e00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538689536 unmapped: 67723264 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.682285309s of 23.394338608s, submitted: 33
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5303640 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128056ec00 session 0x561280c0fe00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612862acc00 session 0x561285cd0f00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612862ad800 session 0x56127f8b5e00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289ecb800 session 0x56127e8d4b40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539795456 unmapped: 66617344 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197b48000/0x0/0x1bfc00000, data 0x39d4c63/0x3be6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539803648 unmapped: 66609152 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539836416 unmapped: 66576384 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539836416 unmapped: 66576384 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539836416 unmapped: 66576384 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5365357 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539836416 unmapped: 66576384 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561286920000 session 0x561280f803c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539836416 unmapped: 66576384 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128056ec00 session 0x56127f3ee780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197b48000/0x0/0x1bfc00000, data 0x39d4c63/0x3be6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197b48000/0x0/0x1bfc00000, data 0x39d4c63/0x3be6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612850f3000 session 0x56127f3de1e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea5d000 session 0x561280b0b2c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561295f45000 session 0x56127f2bb0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539844608 unmapped: 66568192 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197b22000/0x0/0x1bfc00000, data 0x39f8c96/0x3c0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539844608 unmapped: 66568192 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539885568 unmapped: 66527232 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5439220 data_alloc: 234881024 data_used: 26955776
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197b22000/0x0/0x1bfc00000, data 0x39f8c96/0x3c0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5476980 data_alloc: 251658240 data_used: 32296960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197b22000/0x0/0x1bfc00000, data 0x39f8c96/0x3c0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197b22000/0x0/0x1bfc00000, data 0x39f8c96/0x3c0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540082176 unmapped: 66330624 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.117702484s of 19.345458984s, submitted: 68
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540090368 unmapped: 66322432 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5476756 data_alloc: 251658240 data_used: 32301056
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545300480 unmapped: 61112320 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545357824 unmapped: 61054976 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196a2e000/0x0/0x1bfc00000, data 0x4aebc96/0x4cff000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5639368 data_alloc: 251658240 data_used: 33820672
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196a2c000/0x0/0x1bfc00000, data 0x4aeec96/0x4d02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.879464149s of 10.096103668s, submitted: 423
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612810bb800 session 0x56127e95e960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5638532 data_alloc: 251658240 data_used: 33828864
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196a2b000/0x0/0x1bfc00000, data 0x4aeecf8/0x4d03000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545382400 unmapped: 61030400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196a2b000/0x0/0x1bfc00000, data 0x4aeecf8/0x4d03000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x244cf9c6), peers [0,1] op hist [0,1])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561280e86c00 session 0x56127f3de3c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612810bac00 session 0x56127e997a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 542588928 unmapped: 63823872 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612806aa400 session 0x56127e8c7a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5424450 data_alloc: 234881024 data_used: 24637440
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612841f8000 session 0x561280b0a960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 542588928 unmapped: 63823872 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289eca800 session 0x56127e799e00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128b8bd800 session 0x561280b0b860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 542588928 unmapped: 63823872 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612806aa400 session 0x561285cd05a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198436000/0x0/0x1bfc00000, data 0x2aaac24/0x2cbc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5270901 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198436000/0x0/0x1bfc00000, data 0x2aaac01/0x2cbb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5270901 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198436000/0x0/0x1bfc00000, data 0x2aaac01/0x2cbb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530120704 unmapped: 76292096 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198436000/0x0/0x1bfc00000, data 0x2aaac01/0x2cbb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5270901 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530128896 unmapped: 76283904 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530128896 unmapped: 76283904 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530128896 unmapped: 76283904 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49d400 session 0x56127e8e83c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127fa4ac00 session 0x56127ea17e00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612841f8400 session 0x56127f33e3c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49d400 session 0x56127ecee780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 24.085548401s of 24.294006348s, submitted: 62
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530112512 unmapped: 76300288 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127fa4ac00 session 0x561280c0e1e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198436000/0x0/0x1bfc00000, data 0x2aaac01/0x2cbb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612806aa400 session 0x56128127e5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128b8bd800 session 0x561281bebe00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128fb9f400 session 0x56127e9a5860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49d400 session 0x5612813e7e00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5309808 data_alloc: 234881024 data_used: 17612800
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198259000/0x0/0x1bfc00000, data 0x2eb2c73/0x30c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5309808 data_alloc: 234881024 data_used: 17612800
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 76398592 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530022400 unmapped: 76390400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530022400 unmapped: 76390400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.395060539s of 11.491009712s, submitted: 29
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530022400 unmapped: 76390400 heap: 606412800 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5326945 data_alloc: 234881024 data_used: 17612800
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198259000/0x0/0x1bfc00000, data 0x2eb2c73/0x30c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561285e00c00 session 0x56127e8d52c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530071552 unmapped: 83689472 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49cc00 session 0x5612813f7a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561286218400 session 0x56127ea1e1e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e9bc400 session 0x561281bea960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530006016 unmapped: 83755008 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289a6dc00 session 0x561280f80780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 83746816 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561295f42400 session 0x5612813f63c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127eb1d800 session 0x561280f81680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19772a000/0x0/0x1bfc00000, data 0x39ded08/0x3bf4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530178048 unmapped: 83582976 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 530178048 unmapped: 83582976 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5457726 data_alloc: 234881024 data_used: 23810048
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19772a000/0x0/0x1bfc00000, data 0x39ded08/0x3bf4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612810ba400 session 0x56127f33e5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561280c31400 session 0x561282a77c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5515354 data_alloc: 251658240 data_used: 31928320
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.525037766s of 13.965292931s, submitted: 61
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 531931136 unmapped: 81829888 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19772a000/0x0/0x1bfc00000, data 0x39ded08/0x3bf4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [0,0,0,1,0,3])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 77807616 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5597972 data_alloc: 251658240 data_used: 32464896
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x56127f8b50e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612805fa400 session 0x56128329f0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 535986176 unmapped: 77774848 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196d2c000/0x0/0x1bfc00000, data 0x43d4d08/0x45ea000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5604106 data_alloc: 251658240 data_used: 32288768
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196d20000/0x0/0x1bfc00000, data 0x43e2d08/0x45f8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561285602800 session 0x56127f8103c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561280674c00 session 0x56128127f4a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561286216400 session 0x56127e8d4b40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536002560 unmapped: 77758464 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612806afc00 session 0x56127e8c6960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 77750272 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5448333 data_alloc: 234881024 data_used: 22208512
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 77750272 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197824000/0x0/0x1bfc00000, data 0x38e3d08/0x3af9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 77750272 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 77750272 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 77750272 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x56127e9945a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197824000/0x0/0x1bfc00000, data 0x38e3d08/0x3af9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.898028374s of 15.850214005s, submitted: 141
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49f400 session 0x56128329f860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561280528c00 session 0x56127f886b40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 77750272 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5447601 data_alloc: 234881024 data_used: 22208512
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533331968 unmapped: 80429056 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533348352 unmapped: 80412672 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e5c0c00 session 0x561280b0ab40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19865f000/0x0/0x1bfc00000, data 0x2aaac73/0x2cbd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533348352 unmapped: 80412672 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533348352 unmapped: 80412672 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19865f000/0x0/0x1bfc00000, data 0x2aaac73/0x2cbd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533348352 unmapped: 80412672 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5299051 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533348352 unmapped: 80412672 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x5612832a3680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49f400 session 0x56127ea1e780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533356544 unmapped: 80404480 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533356544 unmapped: 80404480 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612805fbc00 session 0x56127f8b4780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561295f42800 session 0x561280fa25a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533389312 unmapped: 80371712 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.706570625s of 10.357423782s, submitted: 79
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533389312 unmapped: 80371712 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561289a6c400 session 0x561280f80b40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5296296 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x56127f8b4780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533397504 unmapped: 80363520 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533397504 unmapped: 80363520 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533397504 unmapped: 80363520 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533397504 unmapped: 80363520 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533397504 unmapped: 80363520 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5295481 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533397504 unmapped: 80363520 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533397504 unmapped: 80363520 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533405696 unmapped: 80355328 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533405696 unmapped: 80355328 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533405696 unmapped: 80355328 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5295481 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533405696 unmapped: 80355328 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533405696 unmapped: 80355328 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533405696 unmapped: 80355328 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533413888 unmapped: 80347136 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533413888 unmapped: 80347136 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5295481 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533413888 unmapped: 80347136 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533422080 unmapped: 80338944 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea5d000 session 0x56128329f0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561283958800 session 0x56127f8b50e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561284537800 session 0x561282a77c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612806ae800 session 0x56127f33e5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.242624283s of 17.551719666s, submitted: 16
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x198664000/0x0/0x1bfc00000, data 0x2aaab9f/0x2cba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533422080 unmapped: 80338944 heap: 613761024 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea34c00 session 0x5612813f63c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea5d000 session 0x5612813e7e00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5397174 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128a0d4000 session 0x561281bebe00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19794e000/0x0/0x1bfc00000, data 0x37bfc01/0x39d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02d000 session 0x56128127e5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612862ac800 session 0x56127ecee780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 84525056 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5399012 data_alloc: 234881024 data_used: 17608704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19794e000/0x0/0x1bfc00000, data 0x37bfc01/0x39d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533422080 unmapped: 84541440 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02d000 session 0x56127f33e3c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x19794c000/0x0/0x1bfc00000, data 0x37bfc34/0x39d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 533430272 unmapped: 84533248 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49cc00 session 0x56127e5a54a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128458a800 session 0x561285cd0f00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561285e01400 session 0x561280b0a960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561295f1fc00 session 0x56127e613a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.739593506s of 10.920108795s, submitted: 44
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127e02d000 session 0x561280c0f860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49cc00 session 0x561280c0f2c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128458a800 session 0x561280c0f0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561285e01400 session 0x561280c0e960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea35c00 session 0x561281c77860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 535896064 unmapped: 82067456 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536748032 unmapped: 81215488 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5581490 data_alloc: 251658240 data_used: 31318016
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536748032 unmapped: 81215488 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536748032 unmapped: 81215488 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561286216800 session 0x561281c77e00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196f0e000/0x0/0x1bfc00000, data 0x41fcc44/0x4410000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49e400 session 0x561281c765a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536748032 unmapped: 81215488 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56128a0d5000 session 0x561281c77680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536748032 unmapped: 81215488 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561295f45c00 session 0x561282a772c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536756224 unmapped: 81207296 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5583174 data_alloc: 251658240 data_used: 31318016
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 536756224 unmapped: 81207296 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x196eea000/0x0/0x1bfc00000, data 0x4220c44/0x4434000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537149440 unmapped: 80814080 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539803648 unmapped: 78159872 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539803648 unmapped: 78159872 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.370302200s of 11.185728073s, submitted: 17
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 542154752 unmapped: 75808768 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5701180 data_alloc: 251658240 data_used: 41771008
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 542613504 unmapped: 75350016 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546119680 unmapped: 71843840 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1965eb000/0x0/0x1bfc00000, data 0x4b1fc44/0x4d33000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [0,0,0,0,0,0,1,0,0,34,2])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544186368 unmapped: 73777152 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543498240 unmapped: 74465280 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543498240 unmapped: 74465280 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5757618 data_alloc: 251658240 data_used: 42098688
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1961fb000/0x0/0x1bfc00000, data 0x4f0fc44/0x5123000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543498240 unmapped: 74465280 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x1961f6000/0x0/0x1bfc00000, data 0x4f14c44/0x5128000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544563200 unmapped: 73400320 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546381824 unmapped: 71581696 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545767424 unmapped: 72196096 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195dfb000/0x0/0x1bfc00000, data 0x5307c44/0x551b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 3.628212929s of 10.026289940s, submitted: 155
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545988608 unmapped: 71974912 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5819230 data_alloc: 251658240 data_used: 42954752
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545988608 unmapped: 71974912 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545996800 unmapped: 71966720 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545996800 unmapped: 71966720 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545996800 unmapped: 71966720 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195d84000/0x0/0x1bfc00000, data 0x5386c44/0x559a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545996800 unmapped: 71966720 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5818434 data_alloc: 251658240 data_used: 42975232
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545996800 unmapped: 71966720 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195d62000/0x0/0x1bfc00000, data 0x53a8c44/0x55bc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5819290 data_alloc: 251658240 data_used: 42975232
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195d62000/0x0/0x1bfc00000, data 0x53a8c44/0x55bc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.674377441s of 12.980909348s, submitted: 18
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x195d5f000/0x0/0x1bfc00000, data 0x53abc44/0x55bf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546004992 unmapped: 71958528 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546013184 unmapped: 71950336 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5820618 data_alloc: 251658240 data_used: 42979328
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546013184 unmapped: 71950336 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546013184 unmapped: 71950336 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127ea39800 session 0x561280bd9c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x5612862af000 session 0x561280b0b2c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561295f1fc00 session 0x5612813e7c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49c800 session 0x561282f9f0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546013184 unmapped: 71950336 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x56127f49e400 session 0x561285cd0d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 heartbeat osd_stat(store_statfs(0x197496000/0x0/0x1bfc00000, data 0x3983baf/0x3b94000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546062336 unmapped: 71901184 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546062336 unmapped: 71901184 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5526921 data_alloc: 234881024 data_used: 28925952
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546062336 unmapped: 71901184 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 ms_handle_reset con 0x561286216800 session 0x561282a77a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 399 handle_osd_map epochs [399,400], i have 399, src has [1,400]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 400 ms_handle_reset con 0x56127f49c800 session 0x561281c774a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 400 ms_handle_reset con 0x5612823e1400 session 0x561281beab40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 400 ms_handle_reset con 0x56127eb1dc00 session 0x5612813e72c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546062336 unmapped: 71901184 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546070528 unmapped: 71892992 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546070528 unmapped: 71892992 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 400 heartbeat osd_stat(store_statfs(0x197785000/0x0/0x1bfc00000, data 0x398586a/0x3b98000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5533519 data_alloc: 234881024 data_used: 28995584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 400 heartbeat osd_stat(store_statfs(0x197785000/0x0/0x1bfc00000, data 0x398586a/0x3b98000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5533519 data_alloc: 234881024 data_used: 28995584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 400 heartbeat osd_stat(store_statfs(0x197785000/0x0/0x1bfc00000, data 0x398586a/0x3b98000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 400 ms_handle_reset con 0x56127f39bc00 session 0x561280f81c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.182943344s of 19.346008301s, submitted: 60
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 400 handle_osd_map epochs [400,401], i have 400, src has [1,401]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 401 ms_handle_reset con 0x56128b8bc400 session 0x56128127eb40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 401 heartbeat osd_stat(store_statfs(0x197783000/0x0/0x1bfc00000, data 0x39874b5/0x3b9a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5535477 data_alloc: 234881024 data_used: 29016064
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 401 heartbeat osd_stat(store_statfs(0x197784000/0x0/0x1bfc00000, data 0x39874b5/0x3b9a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 401 heartbeat osd_stat(store_statfs(0x197784000/0x0/0x1bfc00000, data 0x39874b5/0x3b9a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 401 handle_osd_map epochs [401,402], i have 401, src has [1,402]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5544371 data_alloc: 234881024 data_used: 29605888
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x197780000/0x0/0x1bfc00000, data 0x3988ff4/0x3b9d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x19777c000/0x0/0x1bfc00000, data 0x398dff4/0x3ba2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5546969 data_alloc: 234881024 data_used: 29601792
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x19777c000/0x0/0x1bfc00000, data 0x398dff4/0x3ba2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561295f42800 session 0x561280c0ed20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280674c00 session 0x561281c77c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 546078720 unmapped: 71884800 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.279945374s of 15.460276604s, submitted: 45
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x56127f49f400 session 0x56127f33f4a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x19865b000/0x0/0x1bfc00000, data 0x2aaffe4/0x2cc3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5332809 data_alloc: 234881024 data_used: 17637376
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x19865b000/0x0/0x1bfc00000, data 0x2aaffe4/0x2cc3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5332809 data_alloc: 234881024 data_used: 17637376
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x19865b000/0x0/0x1bfc00000, data 0x2aaffe4/0x2cc3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5332809 data_alloc: 234881024 data_used: 17637376
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538247168 unmapped: 79716352 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x19865b000/0x0/0x1bfc00000, data 0x2aaffe4/0x2cc3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538255360 unmapped: 79708160 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538255360 unmapped: 79708160 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x19865b000/0x0/0x1bfc00000, data 0x2aaffe4/0x2cc3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538255360 unmapped: 79708160 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5332809 data_alloc: 234881024 data_used: 17637376
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538255360 unmapped: 79708160 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538255360 unmapped: 79708160 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561295f45800 session 0x561280b0af00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280e86000 session 0x56127f7192c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x56127f49f400 session 0x56128127fe00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280674c00 session 0x56127e798b40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.710439682s of 20.854120255s, submitted: 13
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280e86000 session 0x56128127e000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561295f42800 session 0x5612832a3c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561295f45800 session 0x561281bea5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539435008 unmapped: 78528512 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x56127f49f400 session 0x5612832a21e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280674c00 session 0x5612813e7e00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280e86000 session 0x561285cd0f00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561295f42800 session 0x561280c0f0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561284536800 session 0x561281bea960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x56127f49f400 session 0x561281c77680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280674c00 session 0x561280fa2960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280e86000 session 0x561280bd9860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561284536800 session 0x56127f3ee780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561295f42800 session 0x56128127e5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x56127f49f400 session 0x56128329f4a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x197e44000/0x0/0x1bfc00000, data 0x32c4066/0x34da000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5411862 data_alloc: 234881024 data_used: 17637376
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561289ecb800 session 0x561281beba40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x56128458a800 session 0x56127e8c7a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280e87000 session 0x561280bd9e00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x561280d06000 session 0x561280c0e960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x56127f49f400 session 0x56127e8d4d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539484160 unmapped: 78479360 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539443200 unmapped: 78520320 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x197dfa000/0x0/0x1bfc00000, data 0x330c099/0x3524000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5472521 data_alloc: 234881024 data_used: 24907776
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x197dfa000/0x0/0x1bfc00000, data 0x330c099/0x3524000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5472521 data_alloc: 234881024 data_used: 24907776
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539451392 unmapped: 78512128 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.624403000s of 17.017063141s, submitted: 70
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 542777344 unmapped: 75186176 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5524501 data_alloc: 234881024 data_used: 25411584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x197dfa000/0x0/0x1bfc00000, data 0x330c099/0x3524000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544989184 unmapped: 72974336 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x196b9a000/0x0/0x1bfc00000, data 0x4563099/0x477b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5643323 data_alloc: 234881024 data_used: 25501696
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x196b9a000/0x0/0x1bfc00000, data 0x4563099/0x477b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545234944 unmapped: 72728576 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5633579 data_alloc: 234881024 data_used: 25505792
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.728960991s of 11.184023857s, submitted: 164
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545243136 unmapped: 72720384 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x196ba0000/0x0/0x1bfc00000, data 0x4566099/0x477e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 heartbeat osd_stat(store_statfs(0x196ba0000/0x0/0x1bfc00000, data 0x4566099/0x477e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545243136 unmapped: 72720384 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 ms_handle_reset con 0x5612822d6800 session 0x561280c0eb40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545251328 unmapped: 72712192 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 402 handle_osd_map epochs [402,403], i have 402, src has [1,403]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 403 handle_osd_map epochs [403,403], i have 403, src has [1,403]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 545251328 unmapped: 72712192 heap: 617963520 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 403 ms_handle_reset con 0x561280d07c00 session 0x56127dde54a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 403 ms_handle_reset con 0x561285e01c00 session 0x561281beb2c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 403 ms_handle_reset con 0x561283958800 session 0x56127e994780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 403 heartbeat osd_stat(store_statfs(0x194df5000/0x0/0x1bfc00000, data 0x630fcf2/0x6529000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 403 ms_handle_reset con 0x561283958800 session 0x56127f33e1e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 403 handle_osd_map epochs [403,404], i have 403, src has [1,404]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 403 handle_osd_map epochs [404,404], i have 404, src has [1,404]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 404 ms_handle_reset con 0x56127f49f400 session 0x5612813f7860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557522944 unmapped: 83746816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5894637 data_alloc: 251658240 data_used: 32055296
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 404 handle_osd_map epochs [404,405], i have 404, src has [1,405]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557531136 unmapped: 83738624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x561280d07c00 session 0x561281c76b40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 405 heartbeat osd_stat(store_statfs(0x194df1000/0x0/0x1bfc00000, data 0x631199f/0x652c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557531136 unmapped: 83738624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x56127ea36000 session 0x561280b0ba40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x561289a6d000 session 0x56127f886000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x56127ea36000 session 0x561282a76780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 83730432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 83730432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 405 heartbeat osd_stat(store_statfs(0x194deb000/0x0/0x1bfc00000, data 0x6315676/0x6532000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552140800 unmapped: 89128960 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5885083 data_alloc: 251658240 data_used: 32055296
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552140800 unmapped: 89128960 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.128082275s of 11.069601059s, submitted: 70
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x56128458a800 session 0x561280c0fe00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x561289ecb800 session 0x56128127f2c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552140800 unmapped: 89128960 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552140800 unmapped: 89128960 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x56127ea35400 session 0x56127ea1fa40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x56128056e400 session 0x56127e8c72c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 405 ms_handle_reset con 0x56127ea35400 session 0x561281bea780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 405 heartbeat osd_stat(store_statfs(0x194dec000/0x0/0x1bfc00000, data 0x6315676/0x6532000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552140800 unmapped: 89128960 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 405 handle_osd_map epochs [405,406], i have 405, src has [1,406]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 406 ms_handle_reset con 0x56127ea36000 session 0x56127ea1f4a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 406 ms_handle_reset con 0x561289ecb800 session 0x56127e612960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 406 ms_handle_reset con 0x56128458a800 session 0x56127e5a4b40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 406 ms_handle_reset con 0x561286f37800 session 0x561281c761e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 406 ms_handle_reset con 0x56127ea35400 session 0x561285cd03c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 406 ms_handle_reset con 0x56127ea36000 session 0x56127e8e83c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552148992 unmapped: 89120768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5891905 data_alloc: 251658240 data_used: 32067584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552148992 unmapped: 89120768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 406 heartbeat osd_stat(store_statfs(0x194de7000/0x0/0x1bfc00000, data 0x6317217/0x6536000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552148992 unmapped: 89120768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552148992 unmapped: 89120768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552148992 unmapped: 89120768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552148992 unmapped: 89120768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5891905 data_alloc: 251658240 data_used: 32067584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 406 heartbeat osd_stat(store_statfs(0x194de7000/0x0/0x1bfc00000, data 0x6317217/0x6536000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 406 ms_handle_reset con 0x561295f45000 session 0x56128329eb40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552148992 unmapped: 89120768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 406 handle_osd_map epochs [406,407], i have 406, src has [1,407]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.107981682s of 10.156295776s, submitted: 26
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 406 handle_osd_map epochs [407,407], i have 407, src has [1,407]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 handle_osd_map epochs [407,407], i have 407, src has [1,407]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x561280e87c00 session 0x561281c770e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552173568 unmapped: 89096192 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x561289a6c000 session 0x561280f80000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x56127ea35400 session 0x56128329fe00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x56127ea36000 session 0x5612813f70e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552173568 unmapped: 89096192 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x561280e87c00 session 0x56127f497c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552189952 unmapped: 89079808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 heartbeat osd_stat(store_statfs(0x194de3000/0x0/0x1bfc00000, data 0x6318ed2/0x653a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552198144 unmapped: 89071616 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5897451 data_alloc: 251658240 data_used: 32100352
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552206336 unmapped: 89063424 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x56127eb1d800 session 0x56127e996000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 heartbeat osd_stat(store_statfs(0x194de3000/0x0/0x1bfc00000, data 0x6318ed2/0x653a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552206336 unmapped: 89063424 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552206336 unmapped: 89063424 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 553230336 unmapped: 88039424 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 heartbeat osd_stat(store_statfs(0x194de3000/0x0/0x1bfc00000, data 0x6318ef5/0x653b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x561295f45000 session 0x56127f497680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558751744 unmapped: 82518016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x561289a6d400 session 0x5612832a32c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6001264 data_alloc: 251658240 data_used: 46129152
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558751744 unmapped: 82518016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 heartbeat osd_stat(store_statfs(0x194de3000/0x0/0x1bfc00000, data 0x6318ef5/0x653b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558751744 unmapped: 82518016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558751744 unmapped: 82518016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x5612823e1000 session 0x56127e9974a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.553009987s of 11.791515350s, submitted: 16
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 ms_handle_reset con 0x561286218400 session 0x561285cd0b40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 heartbeat osd_stat(store_statfs(0x194de3000/0x0/0x1bfc00000, data 0x6318ef5/0x653b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558891008 unmapped: 82378752 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 heartbeat osd_stat(store_statfs(0x194de4000/0x0/0x1bfc00000, data 0x73d9e93/0x653a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 407 handle_osd_map epochs [407,408], i have 407, src has [1,408]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 408 handle_osd_map epochs [408,408], i have 408, src has [1,408]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 408 ms_handle_reset con 0x561286921000 session 0x561282a76d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559955968 unmapped: 81313792 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6074674 data_alloc: 251658240 data_used: 46137344
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 408 ms_handle_reset con 0x56127e9bdc00 session 0x56128127f0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 408 heartbeat osd_stat(store_statfs(0x194de1000/0x0/0x1bfc00000, data 0x631ab40/0x653d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559955968 unmapped: 81313792 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 408 heartbeat osd_stat(store_statfs(0x194de1000/0x0/0x1bfc00000, data 0x631ab40/0x653d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559955968 unmapped: 81313792 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 408 heartbeat osd_stat(store_statfs(0x194de1000/0x0/0x1bfc00000, data 0x631ab40/0x653d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559964160 unmapped: 81305600 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 408 handle_osd_map epochs [408,409], i have 408, src has [1,409]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559972352 unmapped: 81297408 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x194ddd000/0x0/0x1bfc00000, data 0x631c67f/0x6540000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559742976 unmapped: 81526784 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6079926 data_alloc: 251658240 data_used: 46387200
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559742976 unmapped: 81526784 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559775744 unmapped: 81494016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559775744 unmapped: 81494016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559775744 unmapped: 81494016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x194ce1000/0x0/0x1bfc00000, data 0x641867f/0x663c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559775744 unmapped: 81494016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6090966 data_alloc: 251658240 data_used: 47108096
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.032366753s of 12.322899818s, submitted: 81
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559775744 unmapped: 81494016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559783936 unmapped: 81485824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559783936 unmapped: 81485824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559923200 unmapped: 81346560 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x194ce0000/0x0/0x1bfc00000, data 0x641867f/0x663c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6101928 data_alloc: 251658240 data_used: 48390144
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x194ce0000/0x0/0x1bfc00000, data 0x641867f/0x663c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x194ce2000/0x0/0x1bfc00000, data 0x641867f/0x663c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x194cdc000/0x0/0x1bfc00000, data 0x641e67f/0x6642000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6104846 data_alloc: 251658240 data_used: 48398336
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.983446121s of 11.115625381s, submitted: 15
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 ms_handle_reset con 0x5612823e0c00 session 0x5612832a2000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 ms_handle_reset con 0x5612810bac00 session 0x56127e612f00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560103424 unmapped: 81166336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 ms_handle_reset con 0x5612822d7800 session 0x56127e798d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x194cdc000/0x0/0x1bfc00000, data 0x641e67f/0x6642000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x195dc9000/0x0/0x1bfc00000, data 0x533363c/0x5554000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5882971 data_alloc: 251658240 data_used: 42958848
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x195dc9000/0x0/0x1bfc00000, data 0x533363c/0x5554000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5891131 data_alloc: 251658240 data_used: 43483136
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 ms_handle_reset con 0x561280c30800 session 0x561282f9e5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 heartbeat osd_stat(store_statfs(0x195dc9000/0x0/0x1bfc00000, data 0x533363c/0x5554000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 409 handle_osd_map epochs [409,410], i have 409, src has [1,410]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 410 handle_osd_map epochs [410,410], i have 410, src has [1,410]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 410 ms_handle_reset con 0x561280e86800 session 0x56127ecee5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558071808 unmapped: 83197952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.808508873s of 10.125526428s, submitted: 52
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 410 ms_handle_reset con 0x5612810bac00 session 0x56127ea1e960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 410 ms_handle_reset con 0x561280c30800 session 0x5612813e65a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 410 ms_handle_reset con 0x5612822d7800 session 0x56127e613c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 410 handle_osd_map epochs [410,411], i have 410, src has [1,411]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 411 handle_osd_map epochs [411,411], i have 411, src has [1,411]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 411 ms_handle_reset con 0x5612823e0c00 session 0x56127dde54a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 77168640 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 411 handle_osd_map epochs [411,412], i have 411, src has [1,412]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 77570048 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 412 ms_handle_reset con 0x561295f42000 session 0x561280b0bc20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 77570048 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6016353 data_alloc: 268435456 data_used: 46612480
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 412 ms_handle_reset con 0x561280c30800 session 0x561280bd9680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 412 heartbeat osd_stat(store_statfs(0x194951000/0x0/0x1bfc00000, data 0x67a7bb7/0x69cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 412 ms_handle_reset con 0x5612810bac00 session 0x561282f9ed20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 412 ms_handle_reset con 0x5612822d7800 session 0x561280f80f00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 77570048 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563707904 unmapped: 77561856 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563707904 unmapped: 77561856 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 412 handle_osd_map epochs [412,413], i have 412, src has [1,413]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 413 ms_handle_reset con 0x56127f49fc00 session 0x56127e9945a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563716096 unmapped: 77553664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 413 heartbeat osd_stat(store_statfs(0x195dbd000/0x0/0x1bfc00000, data 0x533a880/0x5560000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 413 ms_handle_reset con 0x5612810bb800 session 0x56127f810000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563716096 unmapped: 77553664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5877483 data_alloc: 268435456 data_used: 46600192
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 413 ms_handle_reset con 0x5612810bb800 session 0x56127e9a5860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563724288 unmapped: 77545472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563724288 unmapped: 77545472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563724288 unmapped: 77545472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.639679909s of 11.257122040s, submitted: 116
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 413 handle_osd_map epochs [413,414], i have 413, src has [1,414]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 414 handle_osd_map epochs [414,415], i have 414, src has [1,415]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x561286920c00 session 0x5612813e7a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 551043072 unmapped: 90226688 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 heartbeat osd_stat(store_statfs(0x197c5d000/0x0/0x1bfc00000, data 0x3499ff5/0x36bf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 551043072 unmapped: 90226688 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5494105 data_alloc: 234881024 data_used: 18702336
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x561280d06000 session 0x56127e9965a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x561280e87000 session 0x561280c0f860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x56128458b000 session 0x5612813e6960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538820608 unmapped: 102449152 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 538820608 unmapped: 102449152 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 550281216 unmapped: 90988544 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x561280d06000 session 0x561280f803c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x561280e87000 session 0x56127f33e3c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x5612810bb800 session 0x561281c76780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x561286920c00 session 0x561281c77680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x5612862ac800 session 0x56127e8c6960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540090368 unmapped: 101179392 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 heartbeat osd_stat(store_statfs(0x1978f0000/0x0/0x1bfc00000, data 0x380af83/0x3a2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540090368 unmapped: 101179392 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5473510 data_alloc: 218103808 data_used: 8441856
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 heartbeat osd_stat(store_statfs(0x1978f0000/0x0/0x1bfc00000, data 0x380af83/0x3a2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540090368 unmapped: 101179392 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 heartbeat osd_stat(store_statfs(0x1978f0000/0x0/0x1bfc00000, data 0x380af83/0x3a2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540090368 unmapped: 101179392 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x56128056e800 session 0x5612813f72c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540090368 unmapped: 101179392 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x5612834c7c00 session 0x56127ea17e00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 ms_handle_reset con 0x561280d07400 session 0x56127ea16d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.646598816s of 10.607299805s, submitted: 112
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 415 handle_osd_map epochs [415,416], i have 415, src has [1,416]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x5612806aa400 session 0x561280b0b0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540098560 unmapped: 101171200 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540098560 unmapped: 101171200 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5482439 data_alloc: 218103808 data_used: 8450048
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x1978c8000/0x0/0x1bfc00000, data 0x3830ad2/0x3a56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 101408768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x1978c8000/0x0/0x1bfc00000, data 0x3830ad2/0x3a56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x1978c8000/0x0/0x1bfc00000, data 0x3830ad2/0x3a56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5581319 data_alloc: 234881024 data_used: 20127744
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x1978c8000/0x0/0x1bfc00000, data 0x3830ad2/0x3a56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539901952 unmapped: 101367808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5581799 data_alloc: 234881024 data_used: 20140032
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x1978c8000/0x0/0x1bfc00000, data 0x3830ad2/0x3a56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.785026550s of 11.835221291s, submitted: 20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541966336 unmapped: 99303424 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544423936 unmapped: 96845824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x196c65000/0x0/0x1bfc00000, data 0x4485ad2/0x46ab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x196c65000/0x0/0x1bfc00000, data 0x4485ad2/0x46ab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5706971 data_alloc: 234881024 data_used: 21835776
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x196c65000/0x0/0x1bfc00000, data 0x4485ad2/0x46ab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x196c65000/0x0/0x1bfc00000, data 0x4485ad2/0x46ab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5698087 data_alloc: 234881024 data_used: 21835776
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x196c71000/0x0/0x1bfc00000, data 0x4487ad2/0x46ad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.116588593s of 13.764609337s, submitted: 132
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5698635 data_alloc: 234881024 data_used: 21843968
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x196c70000/0x0/0x1bfc00000, data 0x4488ad2/0x46ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x196c70000/0x0/0x1bfc00000, data 0x4488ad2/0x46ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x248df9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5698635 data_alloc: 234881024 data_used: 21843968
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544743424 unmapped: 96526336 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561289a6d400 session 0x561280f814a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x19681f000/0x0/0x1bfc00000, data 0x44c8b35/0x46ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.007428169s of 10.032960892s, submitted: 6
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x19681f000/0x0/0x1bfc00000, data 0x44c8b35/0x46ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5703798 data_alloc: 234881024 data_used: 21852160
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56127eb1dc00 session 0x561285cd1a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56128056dc00 session 0x561280c0f2c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x19681f000/0x0/0x1bfc00000, data 0x44c8b35/0x46ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x19681f000/0x0/0x1bfc00000, data 0x44c8b35/0x46ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5703442 data_alloc: 234881024 data_used: 21852160
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x19681f000/0x0/0x1bfc00000, data 0x44c8b35/0x46ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561285e01400 session 0x56128127f0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56127ea37000 session 0x56127e996780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561289a6cc00 session 0x561280b0a960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.614952087s of 10.691198349s, submitted: 9
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561289a6cc00 session 0x56127e5a5680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544751616 unmapped: 96518144 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5704062 data_alloc: 234881024 data_used: 21852160
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56128056e800 session 0x56127f886b40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x5612806aa400 session 0x56128329e960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x19681f000/0x0/0x1bfc00000, data 0x44c8b35/0x46ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544759808 unmapped: 96509952 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561295f44400 session 0x56127e799680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5390803 data_alloc: 218103808 data_used: 8716288
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x1981e0000/0x0/0x1bfc00000, data 0x2b08b25/0x2d2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x5612834c7c00 session 0x5612813f7a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56127f49fc00 session 0x561280f803c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x1981e0000/0x0/0x1bfc00000, data 0x2b08b25/0x2d2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x5612834c7c00 session 0x561280c0f860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56128056e800 session 0x56127e9965a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5386741 data_alloc: 218103808 data_used: 8450048
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x198220000/0x0/0x1bfc00000, data 0x2ac8ac2/0x2ced000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 537624576 unmapped: 103645184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.404942513s of 12.867709160s, submitted: 60
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561280c31800 session 0x56127e9945a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539107328 unmapped: 102162432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561280d06000 session 0x561280bd9680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x198221000/0x0/0x1bfc00000, data 0x2ac8ac2/0x2ced000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539107328 unmapped: 102162432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56127f49fc00 session 0x56127e798d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539107328 unmapped: 102162432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5401867 data_alloc: 234881024 data_used: 13656064
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x198220000/0x0/0x1bfc00000, data 0x2ac8ad2/0x2cee000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539107328 unmapped: 102162432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56127fa4b800 session 0x5612832a2000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539107328 unmapped: 102162432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x561295f1f800 session 0x5612832a2b40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x5612841f8c00 session 0x561280c0e000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 ms_handle_reset con 0x56128a0d5000 session 0x561282f9f0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539410432 unmapped: 101859328 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 heartbeat osd_stat(store_statfs(0x197679000/0x0/0x1bfc00000, data 0x366fad2/0x3895000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 539410432 unmapped: 101859328 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 416 handle_osd_map epochs [416,417], i have 416, src has [1,417]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127f49fc00 session 0x56127ea1fa40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x197674000/0x0/0x1bfc00000, data 0x367178d/0x3899000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [0,0,0,1])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 544006144 unmapped: 97263616 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5540474 data_alloc: 234881024 data_used: 13664256
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127fa4b800 session 0x56127ea19860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x5612841f8c00 session 0x5612832a21e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561295f1f800 session 0x5612832a30e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127ea5d000 session 0x561280bd8960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127f49fc00 session 0x561280c0f0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x197674000/0x0/0x1bfc00000, data 0x367178d/0x3899000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540819456 unmapped: 100450304 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540819456 unmapped: 100450304 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561286219800 session 0x56127e613c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561284537400 session 0x56127e612780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540819456 unmapped: 100450304 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561285e01400 session 0x56127e613a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.703686714s of 11.037638664s, submitted: 92
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56128056c000 session 0x561281bea960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540819456 unmapped: 100450304 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561284537400 session 0x56127ea16960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127f49fc00 session 0x5612813e63c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540835840 unmapped: 100433920 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5577375 data_alloc: 234881024 data_used: 13672448
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 540712960 unmapped: 100556800 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x196e5c000/0x0/0x1bfc00000, data 0x3e8780f/0x40b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541147136 unmapped: 100122624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541147136 unmapped: 100122624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541147136 unmapped: 100122624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x5612862ae800 session 0x56127ea1e780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56128fb9fc00 session 0x56127e5a41e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541147136 unmapped: 100122624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5636415 data_alloc: 234881024 data_used: 21962752
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x196e5c000/0x0/0x1bfc00000, data 0x3e8780f/0x40b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561285e00400 session 0x56127e9970e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127f49fc00 session 0x56128329e3c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541147136 unmapped: 100122624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x196e5c000/0x0/0x1bfc00000, data 0x3e8780f/0x40b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 541147136 unmapped: 100122624 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 543899648 unmapped: 97370112 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x196e37000/0x0/0x1bfc00000, data 0x3eab81f/0x40d7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 548061184 unmapped: 93208576 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x196e37000/0x0/0x1bfc00000, data 0x3eab81f/0x40d7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24cef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 548061184 unmapped: 93208576 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5728177 data_alloc: 251658240 data_used: 34103296
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.103850365s of 12.316456795s, submitted: 13
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 548061184 unmapped: 93208576 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561284537400 session 0x56127ea1eb40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561285e00400 session 0x561280fa3c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554704896 unmapped: 86564864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56128056dc00 session 0x56127ea17680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555008000 unmapped: 86261760 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 heartbeat osd_stat(store_statfs(0x195ca5000/0x0/0x1bfc00000, data 0x4ed880f/0x5103000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555286528 unmapped: 85983232 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x561295f1f800 session 0x56127e8d43c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555286528 unmapped: 85983232 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5856242 data_alloc: 251658240 data_used: 34967552
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127f49fc00 session 0x561282a76000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555286528 unmapped: 85983232 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 ms_handle_reset con 0x56127f39bc00 session 0x561280f814a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555286528 unmapped: 85983232 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 417 handle_osd_map epochs [417,418], i have 417, src has [1,418]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 418 ms_handle_reset con 0x5612810bb400 session 0x561280f81e00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 418 heartbeat osd_stat(store_statfs(0x195ca6000/0x0/0x1bfc00000, data 0x4ed87ff/0x5102000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555302912 unmapped: 85966848 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 418 ms_handle_reset con 0x561295f42000 session 0x56127e612960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 418 ms_handle_reset con 0x561295f1e400 session 0x56127ea18f00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554328064 unmapped: 86941696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554328064 unmapped: 86941696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5682012 data_alloc: 234881024 data_used: 22839296
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554328064 unmapped: 86941696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.493256569s of 11.112817764s, submitted: 175
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554328064 unmapped: 86941696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 418 ms_handle_reset con 0x5612834c6800 session 0x56127f33e5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 418 ms_handle_reset con 0x5612804a6400 session 0x561282f9eb40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 418 heartbeat osd_stat(store_statfs(0x196852000/0x0/0x1bfc00000, data 0x433343a/0x455c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552828928 unmapped: 88440832 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 418 ms_handle_reset con 0x56127f39a400 session 0x56127f8103c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552828928 unmapped: 88440832 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552828928 unmapped: 88440832 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5689400 data_alloc: 234881024 data_used: 23822336
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552828928 unmapped: 88440832 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552828928 unmapped: 88440832 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 418 ms_handle_reset con 0x561285603800 session 0x56127ecee780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552828928 unmapped: 88440832 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 418 handle_osd_map epochs [418,419], i have 418, src has [1,419]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 418 handle_osd_map epochs [419,419], i have 419, src has [1,419]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 418 heartbeat osd_stat(store_statfs(0x196851000/0x0/0x1bfc00000, data 0x433349c/0x455d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [0,0,0,0,1,0,1])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 418 handle_osd_map epochs [419,419], i have 419, src has [1,419]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x56127f49e800 session 0x56128329ed20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561285e01400 session 0x5612813e74a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561286219800 session 0x5612813f65a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 553877504 unmapped: 87392256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x196851000/0x0/0x1bfc00000, data 0x433349c/0x455d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x56127f39a400 session 0x56127f2bba40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x5612834c6800 session 0x56127e8d4b40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x1964cd000/0x0/0x1bfc00000, data 0x46b4f88/0x48e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [0,0,0,0,0,1])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552853504 unmapped: 88416256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5718846 data_alloc: 234881024 data_used: 23830528
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x5612804a6400 session 0x56128329fe00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x19798d000/0x0/0x1bfc00000, data 0x2e4df88/0x3079000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5471915 data_alloc: 234881024 data_used: 14663680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x197d37000/0x0/0x1bfc00000, data 0x2e4df16/0x3077000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549601280 unmapped: 91668480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x197d37000/0x0/0x1bfc00000, data 0x2e4df16/0x3077000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549609472 unmapped: 91660288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549609472 unmapped: 91660288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5471915 data_alloc: 234881024 data_used: 14663680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549609472 unmapped: 91660288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x197d37000/0x0/0x1bfc00000, data 0x2e4df16/0x3077000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549609472 unmapped: 91660288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561285e01c00 session 0x561281beba40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549609472 unmapped: 91660288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x56128a0d5400 session 0x561282f9e5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549609472 unmapped: 91660288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561280e91c00 session 0x56127f7185a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.033617020s of 22.532424927s, submitted: 104
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561284536800 session 0x5612832a3c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x197d37000/0x0/0x1bfc00000, data 0x2e4df16/0x3077000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5478936 data_alloc: 234881024 data_used: 14663680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x197d0b000/0x0/0x1bfc00000, data 0x2e77f49/0x30a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5505336 data_alloc: 234881024 data_used: 18341888
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x197d0b000/0x0/0x1bfc00000, data 0x2e77f49/0x30a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5505336 data_alloc: 234881024 data_used: 18341888
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x197d0b000/0x0/0x1bfc00000, data 0x2e77f49/0x30a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 549969920 unmapped: 91299840 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.706553459s of 12.811210632s, submitted: 7
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552312832 unmapped: 88956928 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561284536c00 session 0x561281c77680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561280e91c00 session 0x561282f9ef00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561284536800 session 0x5612813f72c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561284536c00 session 0x56127e8d41e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552476672 unmapped: 88793088 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561285e01c00 session 0x561282f9f860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x56128a0d5400 session 0x5612813e7860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561280e91c00 session 0x5612832a3a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561284536800 session 0x561280bd92c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x561284536c00 session 0x56127f886d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5608619 data_alloc: 234881024 data_used: 18997248
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x1971a4000/0x0/0x1bfc00000, data 0x39ddf59/0x3c0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x1971a4000/0x0/0x1bfc00000, data 0x39ddf59/0x3c0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5607251 data_alloc: 234881024 data_used: 18997248
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 ms_handle_reset con 0x56128056e800 session 0x56127f3de780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 heartbeat osd_stat(store_statfs(0x1971a1000/0x0/0x1bfc00000, data 0x39e0f59/0x3c0d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552484864 unmapped: 88784896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 handle_osd_map epochs [419,420], i have 419, src has [1,420]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.388351440s of 10.914187431s, submitted: 75
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552493056 unmapped: 88776704 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 419 handle_osd_map epochs [420,420], i have 420, src has [1,420]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 420 ms_handle_reset con 0x561284536400 session 0x5612832a3a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 420 heartbeat osd_stat(store_statfs(0x1971a1000/0x0/0x1bfc00000, data 0x39e0f59/0x3c0d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552591360 unmapped: 88678400 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 420 handle_osd_map epochs [420,421], i have 420, src has [1,421]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x56127fa4b800 session 0x561280fa2d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5646425 data_alloc: 234881024 data_used: 23064576
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552599552 unmapped: 88670208 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552607744 unmapped: 88662016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552607744 unmapped: 88662016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x197198000/0x0/0x1bfc00000, data 0x39e486d/0x3c14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552607744 unmapped: 88662016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552607744 unmapped: 88662016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x197198000/0x0/0x1bfc00000, data 0x39e486d/0x3c14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5646425 data_alloc: 234881024 data_used: 23064576
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552607744 unmapped: 88662016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552607744 unmapped: 88662016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x197198000/0x0/0x1bfc00000, data 0x39e486d/0x3c14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x56128544c400 session 0x56127ea1f0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x5612810bb800 session 0x561282f9f860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552607744 unmapped: 88662016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552615936 unmapped: 88653824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 552615936 unmapped: 88653824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.998759270s of 11.290341377s, submitted: 9
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5697957 data_alloc: 234881024 data_used: 24182784
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555933696 unmapped: 85336064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555941888 unmapped: 85327872 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #59. Immutable memtables: 15.
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558292992 unmapped: 82976768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195966000/0x0/0x1bfc00000, data 0x40698cf/0x429a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x561285602000 session 0x561281c77680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x56128b8bc400 session 0x5612832a3c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558292992 unmapped: 82976768 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195962000/0x0/0x1bfc00000, data 0x406d8cf/0x429e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558309376 unmapped: 82960384 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x56127fa4b800 session 0x561282f9e5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5707589 data_alloc: 234881024 data_used: 24141824
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x5612810bb800 session 0x561281beba40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558325760 unmapped: 82944000 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558325760 unmapped: 82944000 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558473216 unmapped: 82796544 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558481408 unmapped: 82788352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195949000/0x0/0x1bfc00000, data 0x4092902/0x42c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195949000/0x0/0x1bfc00000, data 0x4092902/0x42c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558489600 unmapped: 82780160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x19593c000/0x0/0x1bfc00000, data 0x409f902/0x42d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x19593c000/0x0/0x1bfc00000, data 0x409f902/0x42d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5715485 data_alloc: 234881024 data_used: 24166400
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558497792 unmapped: 82771968 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.514921188s of 11.821525574s, submitted: 162
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558497792 unmapped: 82771968 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x56128b8bcc00 session 0x561285cd01e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558497792 unmapped: 82771968 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558497792 unmapped: 82771968 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x19593b000/0x0/0x1bfc00000, data 0x40a0902/0x42d3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558505984 unmapped: 82763776 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5729985 data_alloc: 234881024 data_used: 24281088
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558505984 unmapped: 82763776 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558505984 unmapped: 82763776 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558505984 unmapped: 82763776 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x19593a000/0x0/0x1bfc00000, data 0x42cb902/0x42d4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558505984 unmapped: 82763776 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557793280 unmapped: 83476480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5766101 data_alloc: 234881024 data_used: 25194496
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557793280 unmapped: 83476480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557793280 unmapped: 83476480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557793280 unmapped: 83476480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557801472 unmapped: 83468288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.717605591s of 12.977429390s, submitted: 17
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557801472 unmapped: 83468288 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5776481 data_alloc: 234881024 data_used: 25587712
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5781553 data_alloc: 234881024 data_used: 26611712
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557809664 unmapped: 83460096 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558268416 unmapped: 83001344 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5788209 data_alloc: 251658240 data_used: 28356608
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558268416 unmapped: 83001344 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.078017235s of 11.319710732s, submitted: 34
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x56127ea36c00 session 0x56128329eb40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x195687000/0x0/0x1bfc00000, data 0x457e902/0x4587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [0,0,0,0,0,0,0,6])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557383680 unmapped: 83886080 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x561289ecb400 session 0x56127ea183c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557400064 unmapped: 83869696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x196167000/0x0/0x1bfc00000, data 0x3a9f8f2/0x3aa7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557400064 unmapped: 83869696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x196167000/0x0/0x1bfc00000, data 0x3a9f8f2/0x3aa7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557400064 unmapped: 83869696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x196167000/0x0/0x1bfc00000, data 0x3a9f8f2/0x3aa7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5651843 data_alloc: 234881024 data_used: 21692416
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557400064 unmapped: 83869696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557400064 unmapped: 83869696 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x561285e00000 session 0x56128329ed20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x5612862acc00 session 0x56127f8b5a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557416448 unmapped: 83853312 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557416448 unmapped: 83853312 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x561280e91c00 session 0x561281bea5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557416448 unmapped: 83853312 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x196168000/0x0/0x1bfc00000, data 0x3a9f8e2/0x3aa6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5648874 data_alloc: 234881024 data_used: 21688320
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557416448 unmapped: 83853312 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557416448 unmapped: 83853312 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.224549294s of 10.959377289s, submitted: 57
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x196168000/0x0/0x1bfc00000, data 0x3a9f8bf/0x3aa5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x561289ecbc00 session 0x56127ea18b40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557441024 unmapped: 83828736 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557449216 unmapped: 83820544 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557449216 unmapped: 83820544 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x561295f1f400 session 0x56127f2bb0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5621088 data_alloc: 234881024 data_used: 21508096
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 83812352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 heartbeat osd_stat(store_statfs(0x19641c000/0x0/0x1bfc00000, data 0x37ec85d/0x37f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 83812352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x561289eca000 session 0x561280fa2000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 83812352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 handle_osd_map epochs [421,422], i have 421, src has [1,422]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 421 ms_handle_reset con 0x56128b8bec00 session 0x56127e996000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 83804160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 422 ms_handle_reset con 0x5612862af800 session 0x56127e9961e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 422 ms_handle_reset con 0x5612823dec00 session 0x56128127f680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 422 heartbeat osd_stat(store_statfs(0x19641b000/0x0/0x1bfc00000, data 0x35c34a8/0x37f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 83804160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5612415 data_alloc: 234881024 data_used: 21512192
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 83804160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 422 ms_handle_reset con 0x561295f1e000 session 0x561282a76000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 83804160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 83804160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 422 handle_osd_map epochs [422,423], i have 422, src has [1,423]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 6.168501377s of 11.982603073s, submitted: 52
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 422 handle_osd_map epochs [423,423], i have 423, src has [1,423]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 83804160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 423 ms_handle_reset con 0x5612823dec00 session 0x56128329e3c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 423 heartbeat osd_stat(store_statfs(0x196447000/0x0/0x1bfc00000, data 0x3599475/0x37c6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 423 ms_handle_reset con 0x5612862af800 session 0x561285cd0f00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554647552 unmapped: 86622208 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5483119 data_alloc: 234881024 data_used: 14696448
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554647552 unmapped: 86622208 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554655744 unmapped: 86614016 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 423 heartbeat osd_stat(store_statfs(0x196f0b000/0x0/0x1bfc00000, data 0x2ad4fa5/0x2d02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554663936 unmapped: 86605824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 423 heartbeat osd_stat(store_statfs(0x196f0b000/0x0/0x1bfc00000, data 0x2ad4fa5/0x2d02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554663936 unmapped: 86605824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 423 handle_osd_map epochs [423,424], i have 423, src has [1,424]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 424 ms_handle_reset con 0x56127f49fc00 session 0x561280c0fa40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554663936 unmapped: 86605824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5486093 data_alloc: 234881024 data_used: 14696448
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554663936 unmapped: 86605824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 424 ms_handle_reset con 0x561280e87000 session 0x561281beb860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 424 heartbeat osd_stat(store_statfs(0x196f08000/0x0/0x1bfc00000, data 0x2ad6c52/0x2d05000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556236800 unmapped: 85032960 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556236800 unmapped: 85032960 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 424 handle_osd_map epochs [424,425], i have 424, src has [1,425]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556253184 unmapped: 85016576 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556261376 unmapped: 85008384 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5531035 data_alloc: 234881024 data_used: 14704640
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556261376 unmapped: 85008384 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56128b8bfc00 session 0x56127e8d41e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127f49fc00 session 0x561280c0e780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556261376 unmapped: 85008384 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x196abe000/0x0/0x1bfc00000, data 0x2f1f791/0x314f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561280e87000 session 0x561282a774a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556261376 unmapped: 85008384 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.842556000s of 14.146264076s, submitted: 54
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x5612823dec00 session 0x561282f9f4a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556269568 unmapped: 85000192 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561284536400 session 0x56127ecee5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561289ecbc00 session 0x561282f9f4a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554803200 unmapped: 86466560 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5553854 data_alloc: 234881024 data_used: 17833984
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555171840 unmapped: 86097920 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561286920400 session 0x56127ea183c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555171840 unmapped: 86097920 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555171840 unmapped: 86097920 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x196abd000/0x0/0x1bfc00000, data 0x2f1f7c4/0x3151000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555171840 unmapped: 86097920 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561285e00800 session 0x56128329eb40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561289eca400 session 0x561285cd01e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 85860352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561289eca000 session 0x561282f9e5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56128b8be800 session 0x5612832a3a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595329 data_alloc: 234881024 data_used: 17776640
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 85860352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 85860352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x196642000/0x0/0x1bfc00000, data 0x3399826/0x35cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 85860352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 85860352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 85860352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595329 data_alloc: 234881024 data_used: 17776640
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 555409408 unmapped: 85860352 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.714643478s of 13.027671814s, submitted: 42
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x19612b000/0x0/0x1bfc00000, data 0x3828826/0x3a5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,3])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560390144 unmapped: 80879616 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558743552 unmapped: 82526208 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559800320 unmapped: 81469440 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560029696 unmapped: 81240064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5716325 data_alloc: 234881024 data_used: 18632704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560029696 unmapped: 81240064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560029696 unmapped: 81240064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x5612834c7000 session 0x56127f886d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x1958cb000/0x0/0x1bfc00000, data 0x4110826/0x4343000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559931392 unmapped: 81338368 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559931392 unmapped: 81338368 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 6001.0 total, 600.0 interval
Cumulative writes: 77K writes, 314K keys, 77K commit groups, 1.0 writes per commit group, ingest: 0.31 GB, 0.05 MB/s
Cumulative WAL: 77K writes, 28K syncs, 2.70 writes per sync, written: 0.31 GB, 0.05 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 5288 writes, 20K keys, 5288 commit groups, 1.0 writes per commit group, ingest: 21.34 MB, 0.04 MB/s
Interval WAL: 5288 writes, 2082 syncs, 2.54 writes per sync, written: 0.02 GB, 0.04 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.14              0.00         1    0.142       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6001.0 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x56127d005770#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.2e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6001.0 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x56127d005770#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.2e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6001.0 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 me
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560037888 unmapped: 81231872 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5733101 data_alloc: 234881024 data_used: 18702336
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560037888 unmapped: 81231872 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.124117851s of 10.161478043s, submitted: 137
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561094656 unmapped: 80175104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195897000/0x0/0x1bfc00000, data 0x4144826/0x4377000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561094656 unmapped: 80175104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x5612862af800 session 0x561282f9ed20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561295f1f000 session 0x561280fa2780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561094656 unmapped: 80175104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561285e01800 session 0x5612832a23c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561094656 unmapped: 80175104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x5612862acc00 session 0x561285cd10e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5639210 data_alloc: 234881024 data_used: 14770176
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127fa4b000 session 0x56127ea16d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561094656 unmapped: 80175104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127ea35c00 session 0x56128329fe00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561094656 unmapped: 80175104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127e5c0400 session 0x561280b0bc20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127ea3bc00 session 0x561280bd83c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561250304 unmapped: 80019456 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x1961d4000/0x0/0x1bfc00000, data 0x3807826/0x3a3a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 561250304 unmapped: 80019456 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x1961d4000/0x0/0x1bfc00000, data 0x3807826/0x3a3a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563437568 unmapped: 77832192 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5753436 data_alloc: 234881024 data_used: 23560192
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564355072 unmapped: 76914688 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.122691154s of 10.414531708s, submitted: 119
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564363264 unmapped: 76906496 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195c0b000/0x0/0x1bfc00000, data 0x3dc8826/0x3ffb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [0,0,0,0,0,0,1])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564600832 unmapped: 76668928 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195b83000/0x0/0x1bfc00000, data 0x3e4f826/0x4082000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564600832 unmapped: 76668928 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564600832 unmapped: 76668928 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195b83000/0x0/0x1bfc00000, data 0x3e4f826/0x4082000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5769892 data_alloc: 234881024 data_used: 23609344
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564600832 unmapped: 76668928 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564600832 unmapped: 76668928 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195b83000/0x0/0x1bfc00000, data 0x3e4f826/0x4082000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [0,0,0,0,0,0,1])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564600832 unmapped: 76668928 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563929088 unmapped: 77340672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127ea3b800 session 0x56127ecefa40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563929088 unmapped: 77340672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561286f37c00 session 0x56127f8b4780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127ea36000 session 0x561281c76780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5826668 data_alloc: 234881024 data_used: 23732224
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564879360 unmapped: 76390400 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.159999847s of 10.091464996s, submitted: 89
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565927936 unmapped: 75341824 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x19536c000/0x0/0x1bfc00000, data 0x4667826/0x489a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565256192 unmapped: 76013568 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565264384 unmapped: 76005376 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565264384 unmapped: 76005376 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5834934 data_alloc: 234881024 data_used: 24035328
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565264384 unmapped: 76005376 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565264384 unmapped: 76005376 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195361000/0x0/0x1bfc00000, data 0x467a826/0x48ad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565264384 unmapped: 76005376 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565272576 unmapped: 75997184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565272576 unmapped: 75997184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195361000/0x0/0x1bfc00000, data 0x467a826/0x48ad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5834950 data_alloc: 234881024 data_used: 24035328
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565272576 unmapped: 75997184 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195361000/0x0/0x1bfc00000, data 0x467a826/0x48ad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x25fef9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565288960 unmapped: 75980800 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.978540421s of 11.017058372s, submitted: 14
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561285e01800 session 0x56127e8d4d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x56127f49c400 session 0x56127f886780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 565297152 unmapped: 75972608 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x5612823e1c00 session 0x561280b0a5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554508288 unmapped: 86761472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554532864 unmapped: 86736896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5633869 data_alloc: 234881024 data_used: 14770176
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554467328 unmapped: 86802432 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554491904 unmapped: 86777856 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x196001000/0x0/0x1bfc00000, data 0x35cc7f3/0x37fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x196001000/0x0/0x1bfc00000, data 0x35cc7f3/0x37fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561280e86c00 session 0x561280fa30e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554532864 unmapped: 86736896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x5612804a6000 session 0x561282f9e000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554532864 unmapped: 86736896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x5612862ae400 session 0x561281beba40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 ms_handle_reset con 0x561285603800 session 0x56127e9974a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5640039 data_alloc: 234881024 data_used: 14770176
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fdc000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5641451 data_alloc: 234881024 data_used: 14827520
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554557440 unmapped: 86712320 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554557440 unmapped: 86712320 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fdc000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554557440 unmapped: 86712320 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5641451 data_alloc: 234881024 data_used: 14827520
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554557440 unmapped: 86712320 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554557440 unmapped: 86712320 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fdc000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554557440 unmapped: 86712320 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.556997299s of 20.875722885s, submitted: 282
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554565632 unmapped: 86704128 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fda000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554565632 unmapped: 86704128 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5656051 data_alloc: 234881024 data_used: 15351808
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fda000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5656051 data_alloc: 234881024 data_used: 15347712
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fdc000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fdc000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 heartbeat osd_stat(store_statfs(0x195fdc000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5661011 data_alloc: 234881024 data_used: 16093184
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554500096 unmapped: 86769664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 425 handle_osd_map epochs [425,426], i have 425, src has [1,426]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.668267250s of 13.934776306s, submitted: 10
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195fdc000/0x0/0x1bfc00000, data 0x35f0816/0x3822000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x561285e01c00 session 0x561282a76f00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554516480 unmapped: 86753280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554516480 unmapped: 86753280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x561280e86c00 session 0x56127f887680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612804a6000 session 0x56127e997a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554516480 unmapped: 86753280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195fcd000/0x0/0x1bfc00000, data 0x36a84d1/0x3830000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5676383 data_alloc: 234881024 data_used: 16031744
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554516480 unmapped: 86753280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554516480 unmapped: 86753280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554516480 unmapped: 86753280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554532864 unmapped: 86736896 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56128056d000 session 0x56127e8c7c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612806aa400 session 0x56127e997e00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x561286219c00 session 0x561282f9e960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612804a6000 session 0x561280b0b0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56128056d000 session 0x561281beab40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554541056 unmapped: 86728704 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5719484 data_alloc: 234881024 data_used: 16035840
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554541056 unmapped: 86728704 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195aef000/0x0/0x1bfc00000, data 0x3b874d1/0x3d0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554541056 unmapped: 86728704 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554541056 unmapped: 86728704 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x561284537800 session 0x56128127f4a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554541056 unmapped: 86728704 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56127fa4a800 session 0x561282f9e960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554549248 unmapped: 86720512 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612841f9c00 session 0x56127e8c7c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.336693764s of 13.468007088s, submitted: 36
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56127fa4a800 session 0x56127e997a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5719484 data_alloc: 234881024 data_used: 16035840
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554557440 unmapped: 86712320 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195aef000/0x0/0x1bfc00000, data 0x3b874d1/0x3d0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554565632 unmapped: 86704128 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554573824 unmapped: 86695936 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56128056d000 session 0x5612813e72c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554573824 unmapped: 86695936 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x561284537800 session 0x561281c77c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554573824 unmapped: 86695936 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195ae9000/0x0/0x1bfc00000, data 0x3b914d1/0x3d15000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56127e5c1c00 session 0x5612813e7c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56128458ac00 session 0x561285cd1c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5751069 data_alloc: 234881024 data_used: 19804160
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195ae9000/0x0/0x1bfc00000, data 0x3b914d1/0x3d15000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554582016 unmapped: 86687744 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554590208 unmapped: 86679552 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554590208 unmapped: 86679552 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195ae8000/0x0/0x1bfc00000, data 0x3b914e1/0x3d16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195ae8000/0x0/0x1bfc00000, data 0x3b914e1/0x3d16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554598400 unmapped: 86671360 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554598400 unmapped: 86671360 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5751653 data_alloc: 234881024 data_used: 19816448
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554598400 unmapped: 86671360 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554598400 unmapped: 86671360 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 554598400 unmapped: 86671360 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.716708183s of 12.770442009s, submitted: 14
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557015040 unmapped: 84254720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x1956b0000/0x0/0x1bfc00000, data 0x3fc84e1/0x414d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556990464 unmapped: 84279296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5810889 data_alloc: 234881024 data_used: 19959808
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556990464 unmapped: 84279296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556990464 unmapped: 84279296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556990464 unmapped: 84279296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556990464 unmapped: 84279296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556351488 unmapped: 84918272 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x19554f000/0x0/0x1bfc00000, data 0x412a4e1/0x42af000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5823081 data_alloc: 234881024 data_used: 20680704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556351488 unmapped: 84918272 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x19552b000/0x0/0x1bfc00000, data 0x414e4e1/0x42d3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556351488 unmapped: 84918272 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556351488 unmapped: 84918272 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x19552b000/0x0/0x1bfc00000, data 0x414e4e1/0x42d3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556359680 unmapped: 84910080 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 556359680 unmapped: 84910080 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.680471420s of 12.101355553s, submitted: 82
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x19552b000/0x0/0x1bfc00000, data 0x414e4e1/0x42d3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5823465 data_alloc: 234881024 data_used: 20676608
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557408256 unmapped: 83861504 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557408256 unmapped: 83861504 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557408256 unmapped: 83861504 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557416448 unmapped: 83853312 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56127e5c1c00 session 0x561280b0ad20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56127fa4a800 session 0x56128127e5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612862af800 session 0x56127f7192c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 557424640 unmapped: 83845120 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195525000/0x0/0x1bfc00000, data 0x41544e1/0x42d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5823276 data_alloc: 234881024 data_used: 20680704
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558473216 unmapped: 82796544 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558473216 unmapped: 82796544 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612804a6000 session 0x56127f887680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558407680 unmapped: 82862080 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558407680 unmapped: 82862080 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x195513000/0x0/0x1bfc00000, data 0x43274d1/0x42eb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558407680 unmapped: 82862080 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612822d6400 session 0x56127ea1fa40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56127e5c1c00 session 0x561280c0f0e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5827202 data_alloc: 234881024 data_used: 20672512
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558432256 unmapped: 82837504 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x56127fa4a800 session 0x561282a76f00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.686823845s of 11.059775352s, submitted: 71
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 heartbeat osd_stat(store_statfs(0x1955c9000/0x0/0x1bfc00000, data 0x427146f/0x4234000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 ms_handle_reset con 0x5612862af800 session 0x56127e8c74a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558440448 unmapped: 82829312 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 handle_osd_map epochs [426,427], i have 426, src has [1,427]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 426 handle_osd_map epochs [427,427], i have 427, src has [1,427]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 427 ms_handle_reset con 0x5612804a6000 session 0x561280c0e3c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 427 ms_handle_reset con 0x5612823de400 session 0x56127f887e00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 427 ms_handle_reset con 0x56127e5c1c00 session 0x5612813e65a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 427 ms_handle_reset con 0x56127fa4a800 session 0x56127e613c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558465024 unmapped: 82804736 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 427 heartbeat osd_stat(store_statfs(0x1955d4000/0x0/0x1bfc00000, data 0x3ff426e/0x4228000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x263ff9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558465024 unmapped: 82804736 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 427 ms_handle_reset con 0x561286219800 session 0x56127e8e9680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 427 ms_handle_reset con 0x5612862ad400 session 0x561280b0b4a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558489600 unmapped: 82780160 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [O-0] New memtable created with log file: #60. Immutable memtables: 0.
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 427 ms_handle_reset con 0x5612804a6000 session 0x5612832a2780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5808686 data_alloc: 234881024 data_used: 20566016
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559554560 unmapped: 81715200 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 427 heartbeat osd_stat(store_statfs(0x19445a000/0x0/0x1bfc00000, data 0x3fd024b/0x4203000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559562752 unmapped: 81707008 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559562752 unmapped: 81707008 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 427 handle_osd_map epochs [427,428], i have 427, src has [1,428]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559562752 unmapped: 81707008 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 428 ms_handle_reset con 0x56127e5c1c00 session 0x56128329f680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 428 handle_osd_map epochs [428,429], i have 428, src has [1,429]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 429 handle_osd_map epochs [429,429], i have 429, src has [1,429]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 429 ms_handle_reset con 0x5612804a6000 session 0x56127f2ba1e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 429 ms_handle_reset con 0x56127fa4a800 session 0x56128329f860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560619520 unmapped: 80650240 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 429 ms_handle_reset con 0x561280c31800 session 0x56127f3de3c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5675450 data_alloc: 234881024 data_used: 15044608
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558989312 unmapped: 82280448 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558989312 unmapped: 82280448 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 429 heartbeat osd_stat(store_statfs(0x194f4d000/0x0/0x1bfc00000, data 0x34da9d5/0x370f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 429 heartbeat osd_stat(store_statfs(0x194f4d000/0x0/0x1bfc00000, data 0x34da9d5/0x370f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5675450 data_alloc: 234881024 data_used: 15044608
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 429 heartbeat osd_stat(store_statfs(0x194f4d000/0x0/0x1bfc00000, data 0x34da9d5/0x370f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.988192558s of 15.492597580s, submitted: 96
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 429 handle_osd_map epochs [429,430], i have 429, src has [1,430]
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 82272256 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5686916 data_alloc: 234881024 data_used: 15859712
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559005696 unmapped: 82264064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194f4b000/0x0/0x1bfc00000, data 0x34dc514/0x3712000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559005696 unmapped: 82264064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559005696 unmapped: 82264064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194f46000/0x0/0x1bfc00000, data 0x34e2514/0x3718000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559005696 unmapped: 82264064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559005696 unmapped: 82264064 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612862af800 session 0x561282a774a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563896 data_alloc: 234881024 data_used: 11071488
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612862af800 session 0x56127ea18b40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559030272 unmapped: 82239488 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559030272 unmapped: 82239488 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559030272 unmapped: 82239488 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563896 data_alloc: 234881024 data_used: 11071488
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5564056 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559038464 unmapped: 82231296 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5564056 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 82223104 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559054848 unmapped: 82214912 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5564056 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559054848 unmapped: 82214912 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559054848 unmapped: 82214912 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559054848 unmapped: 82214912 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612806aa400 session 0x561280c0eb40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56127e5c2800 session 0x561280c0e5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612834c6800 session 0x561280fa2000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612841f9400 session 0x5612832a30e0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.846471786s of 31.994029999s, submitted: 58
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56127e5c2800 session 0x56127f4c2f00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612806aa400 session 0x561282f9fc20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612834c6800 session 0x561281bea5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612862af800 session 0x561282f9e3c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612822d6000 session 0x561282f9f680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d8000/0x0/0x1bfc00000, data 0x324f524/0x3486000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5623978 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561295f1fc00 session 0x561282f9fa40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d8000/0x0/0x1bfc00000, data 0x324f524/0x3486000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612862afc00 session 0x56128329e780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612804a7c00 session 0x56127e798000
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56128b8bec00 session 0x56127e6132c0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5627085 data_alloc: 234881024 data_used: 11083776
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d6000/0x0/0x1bfc00000, data 0x324f557/0x3488000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d6000/0x0/0x1bfc00000, data 0x324f557/0x3488000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d6000/0x0/0x1bfc00000, data 0x324f557/0x3488000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5674605 data_alloc: 234881024 data_used: 17743872
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d6000/0x0/0x1bfc00000, data 0x324f557/0x3488000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d6000/0x0/0x1bfc00000, data 0x324f557/0x3488000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x1951d6000/0x0/0x1bfc00000, data 0x324f557/0x3488000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5674605 data_alloc: 234881024 data_used: 17743872
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559063040 unmapped: 82206720 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.016057968s of 19.102005005s, submitted: 18
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 560381952 unmapped: 80887808 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559685632 unmapped: 81584128 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5729831 data_alloc: 234881024 data_used: 18255872
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c05000/0x0/0x1bfc00000, data 0x381f557/0x3a58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c05000/0x0/0x1bfc00000, data 0x381f557/0x3a58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5729423 data_alloc: 234881024 data_used: 18276352
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612823e0800 session 0x5612813e7680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56127ea38c00 session 0x56127e8d4d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c00000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5729423 data_alloc: 234881024 data_used: 18276352
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c00000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559620096 unmapped: 81649664 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559628288 unmapped: 81641472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559628288 unmapped: 81641472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561284536000 session 0x561282f9e960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5729583 data_alloc: 234881024 data_used: 18280448
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559628288 unmapped: 81641472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559628288 unmapped: 81641472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559628288 unmapped: 81641472 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c00000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c00000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5730223 data_alloc: 234881024 data_used: 18345984
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c00000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5730223 data_alloc: 234881024 data_used: 18345984
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c00000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559636480 unmapped: 81633280 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.599523544s of 31.511573792s, submitted: 47
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194bfe000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5733487 data_alloc: 234881024 data_used: 18886656
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194bfe000/0x0/0x1bfc00000, data 0x3824557/0x3a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5733647 data_alloc: 234881024 data_used: 18890752
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559570944 unmapped: 81698816 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561286f36400 session 0x561281beab40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56128458b800 session 0x56128329e5a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194c02000/0x0/0x1bfc00000, data 0x3824547/0x3a5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56127ea38000 session 0x56127f886d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5574928 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 82468864 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5574928 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5574928 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 82460672 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5574928 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195946000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5574928 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561280c31800 session 0x56127f3eef00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56127ea38400 session 0x5612813e7e00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612834c7000 session 0x5612813e7c20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 558817280 unmapped: 82452480 heap: 641269760 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561289a6cc00 session 0x56127f887680
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 32.226699829s of 32.331203461s, submitted: 22
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612804a7000 session 0x561280fa3a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56127ea38400 session 0x561280f80f00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194814000/0x0/0x1bfc00000, data 0x3c1254d/0x3e4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561280c31800 session 0x561285cd1860
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612834c7000 session 0x56128127fe00
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561289a6cc00 session 0x561280bd8960
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194814000/0x0/0x1bfc00000, data 0x3c1254d/0x3e4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 85934080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 85934080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 85934080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194814000/0x0/0x1bfc00000, data 0x3c12586/0x3e4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 85934080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5712775 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194814000/0x0/0x1bfc00000, data 0x3c12586/0x3e4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561285603c00 session 0x5612832a3a40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 85934080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56127ea38400 session 0x56127f2bad20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 85934080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561280c31800 session 0x561281c76d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194814000/0x0/0x1bfc00000, data 0x3c12586/0x3e4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612834c7000 session 0x561280b0ba40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 85901312 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559046656 unmapped: 85901312 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194812000/0x0/0x1bfc00000, data 0x3c125b9/0x3e4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5844871 data_alloc: 251658240 data_used: 29020160
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194812000/0x0/0x1bfc00000, data 0x3c125b9/0x3e4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5844871 data_alloc: 251658240 data_used: 29020160
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 559882240 unmapped: 85065728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.580867767s of 18.858497620s, submitted: 53
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x194812000/0x0/0x1bfc00000, data 0x3c125b9/0x3e4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563322880 unmapped: 81625088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5973445 data_alloc: 251658240 data_used: 30908416
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x193887000/0x0/0x1bfc00000, data 0x4b975b9/0x4dd1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563765248 unmapped: 81182720 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x19385c000/0x0/0x1bfc00000, data 0x4bba5b9/0x4df4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x19385c000/0x0/0x1bfc00000, data 0x4bba5b9/0x4df4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x19385c000/0x0/0x1bfc00000, data 0x4bba5b9/0x4df4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5992247 data_alloc: 251658240 data_used: 31162368
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x561289a6cc00 session 0x561281c774a0
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56128056d000 session 0x561281c76780
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x19386a000/0x0/0x1bfc00000, data 0x4bba5b9/0x4df4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5980779 data_alloc: 251658240 data_used: 31174656
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x19386a000/0x0/0x1bfc00000, data 0x4bba5b9/0x4df4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.893699646s of 14.352492332s, submitted: 153
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5981071 data_alloc: 251658240 data_used: 31178752
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x19386a000/0x0/0x1bfc00000, data 0x4bba5b9/0x4df4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56128b8bc000 session 0x561280b0ba40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x5612810ba000 session 0x56127ea1fa40
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563978240 unmapped: 80969728 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195944000/0x0/0x1bfc00000, data 0x2ae15a9/0x2d1a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5597967 data_alloc: 234881024 data_used: 11083776
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 ms_handle_reset con 0x56128b8bec00 session 0x561281c76d20
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564011008 unmapped: 80936960 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564011008 unmapped: 80936960 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564011008 unmapped: 80936960 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564011008 unmapped: 80936960 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564011008 unmapped: 80936960 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564019200 unmapped: 80928768 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564019200 unmapped: 80928768 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564019200 unmapped: 80928768 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564027392 unmapped: 80920576 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564043776 unmapped: 80904192 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564043776 unmapped: 80904192 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564043776 unmapped: 80904192 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564068352 unmapped: 80879616 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564068352 unmapped: 80879616 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564068352 unmapped: 80879616 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564068352 unmapped: 80879616 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564068352 unmapped: 80879616 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564068352 unmapped: 80879616 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564076544 unmapped: 80871424 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564076544 unmapped: 80871424 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564076544 unmapped: 80871424 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564092928 unmapped: 80855040 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564109312 unmapped: 80838656 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564109312 unmapped: 80838656 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564109312 unmapped: 80838656 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564109312 unmapped: 80838656 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564109312 unmapped: 80838656 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564109312 unmapped: 80838656 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:13 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564109312 unmapped: 80838656 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564125696 unmapped: 80822272 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564125696 unmapped: 80822272 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564133888 unmapped: 80814080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564133888 unmapped: 80814080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564133888 unmapped: 80814080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564142080 unmapped: 80805888 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564166656 unmapped: 80781312 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564174848 unmapped: 80773120 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564174848 unmapped: 80773120 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564174848 unmapped: 80773120 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564183040 unmapped: 80764928 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564183040 unmapped: 80764928 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564183040 unmapped: 80764928 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564183040 unmapped: 80764928 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564191232 unmapped: 80756736 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564256768 unmapped: 80691200 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'config diff' '{prefix=config diff}'
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'config show' '{prefix=config show}'
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'counter dump' '{prefix=counter dump}'
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'counter schema' '{prefix=counter schema}'
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563953664 unmapped: 80994304 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563789824 unmapped: 81158144 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563224576 unmapped: 81723392 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'log dump' '{prefix=log dump}'
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'perf dump' '{prefix=perf dump}'
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563224576 unmapped: 81723392 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'perf schema' '{prefix=perf schema}'
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563208192 unmapped: 81739776 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563208192 unmapped: 81739776 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563216384 unmapped: 81731584 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563216384 unmapped: 81731584 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563216384 unmapped: 81731584 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563216384 unmapped: 81731584 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563216384 unmapped: 81731584 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563216384 unmapped: 81731584 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563216384 unmapped: 81731584 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563216384 unmapped: 81731584 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563224576 unmapped: 81723392 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563224576 unmapped: 81723392 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563232768 unmapped: 81715200 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563240960 unmapped: 81707008 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563240960 unmapped: 81707008 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563240960 unmapped: 81707008 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563240960 unmapped: 81707008 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563240960 unmapped: 81707008 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563240960 unmapped: 81707008 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563257344 unmapped: 81690624 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563257344 unmapped: 81690624 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563257344 unmapped: 81690624 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563257344 unmapped: 81690624 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563257344 unmapped: 81690624 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563257344 unmapped: 81690624 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563257344 unmapped: 81690624 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563265536 unmapped: 81682432 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563265536 unmapped: 81682432 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563265536 unmapped: 81682432 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563265536 unmapped: 81682432 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563265536 unmapped: 81682432 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563265536 unmapped: 81682432 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563265536 unmapped: 81682432 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563265536 unmapped: 81682432 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563273728 unmapped: 81674240 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563281920 unmapped: 81666048 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563281920 unmapped: 81666048 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563290112 unmapped: 81657856 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563290112 unmapped: 81657856 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563290112 unmapped: 81657856 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563290112 unmapped: 81657856 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563290112 unmapped: 81657856 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563290112 unmapped: 81657856 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563298304 unmapped: 81649664 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563298304 unmapped: 81649664 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563306496 unmapped: 81641472 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563306496 unmapped: 81641472 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563306496 unmapped: 81641472 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563306496 unmapped: 81641472 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563306496 unmapped: 81641472 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563306496 unmapped: 81641472 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563322880 unmapped: 81625088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563322880 unmapped: 81625088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563322880 unmapped: 81625088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563322880 unmapped: 81625088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563322880 unmapped: 81625088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563322880 unmapped: 81625088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563322880 unmapped: 81625088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563331072 unmapped: 81616896 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563339264 unmapped: 81608704 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563339264 unmapped: 81608704 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563339264 unmapped: 81608704 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563339264 unmapped: 81608704 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563339264 unmapped: 81608704 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563347456 unmapped: 81600512 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563347456 unmapped: 81600512 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563347456 unmapped: 81600512 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563355648 unmapped: 81592320 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563355648 unmapped: 81592320 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563355648 unmapped: 81592320 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563355648 unmapped: 81592320 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563355648 unmapped: 81592320 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563355648 unmapped: 81592320 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563355648 unmapped: 81592320 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563363840 unmapped: 81584128 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563372032 unmapped: 81575936 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563372032 unmapped: 81575936 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563372032 unmapped: 81575936 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563372032 unmapped: 81575936 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563380224 unmapped: 81567744 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563380224 unmapped: 81567744 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563388416 unmapped: 81559552 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563388416 unmapped: 81559552 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563404800 unmapped: 81543168 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563404800 unmapped: 81543168 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563404800 unmapped: 81543168 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563404800 unmapped: 81543168 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563404800 unmapped: 81543168 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563404800 unmapped: 81543168 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563404800 unmapped: 81543168 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563404800 unmapped: 81543168 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563421184 unmapped: 81526784 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563421184 unmapped: 81526784 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563421184 unmapped: 81526784 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563421184 unmapped: 81526784 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563421184 unmapped: 81526784 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563421184 unmapped: 81526784 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563429376 unmapped: 81518592 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563429376 unmapped: 81518592 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563429376 unmapped: 81518592 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563437568 unmapped: 81510400 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 81502208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 81502208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 81502208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 81502208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 81502208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 81502208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563453952 unmapped: 81494016 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563453952 unmapped: 81494016 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563462144 unmapped: 81485824 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563462144 unmapped: 81485824 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 81477632 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 81477632 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 81477632 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 81477632 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 81469440 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 81469440 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 81469440 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 81469440 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 81469440 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 81469440 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 81469440 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 81469440 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563494912 unmapped: 81453056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563494912 unmapped: 81453056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563494912 unmapped: 81453056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563494912 unmapped: 81453056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563494912 unmapped: 81453056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563494912 unmapped: 81453056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563503104 unmapped: 81444864 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563503104 unmapped: 81444864 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563511296 unmapped: 81436672 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563511296 unmapped: 81436672 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563519488 unmapped: 81428480 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563519488 unmapped: 81428480 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563519488 unmapped: 81428480 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563519488 unmapped: 81428480 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563519488 unmapped: 81428480 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 81420288 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 81420288 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 81420288 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 81420288 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 81420288 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 81420288 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 81420288 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 81420288 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 81412096 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 81412096 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 81412096 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563544064 unmapped: 81403904 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563544064 unmapped: 81403904 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563552256 unmapped: 81395712 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563552256 unmapped: 81395712 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563560448 unmapped: 81387520 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563568640 unmapped: 81379328 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563576832 unmapped: 81371136 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563576832 unmapped: 81371136 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563576832 unmapped: 81371136 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563576832 unmapped: 81371136 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563576832 unmapped: 81371136 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563576832 unmapped: 81371136 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563576832 unmapped: 81371136 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563593216 unmapped: 81354752 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563380224 unmapped: 81567744 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563380224 unmapped: 81567744 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563380224 unmapped: 81567744 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563380224 unmapped: 81567744 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563380224 unmapped: 81567744 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563380224 unmapped: 81567744 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563380224 unmapped: 81567744 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563396608 unmapped: 81551360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563396608 unmapped: 81551360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563396608 unmapped: 81551360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563396608 unmapped: 81551360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563396608 unmapped: 81551360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563396608 unmapped: 81551360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563396608 unmapped: 81551360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563396608 unmapped: 81551360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563412992 unmapped: 81534976 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563412992 unmapped: 81534976 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563412992 unmapped: 81534976 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563412992 unmapped: 81534976 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563412992 unmapped: 81534976 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563412992 unmapped: 81534976 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563421184 unmapped: 81526784 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563437568 unmapped: 81510400 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6601.0 total, 600.0 interval#012Cumulative writes: 79K writes, 321K keys, 79K commit groups, 1.0 writes per commit group, ingest: 0.32 GB, 0.05 MB/s#012Cumulative WAL: 79K writes, 29K syncs, 2.69 writes per sync, written: 0.32 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2103 writes, 7442 keys, 2103 commit groups, 1.0 writes per commit group, ingest: 6.64 MB, 0.01 MB/s#012Interval WAL: 2103 writes, 881 syncs, 2.39 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563437568 unmapped: 81510400 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563437568 unmapped: 81510400 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563437568 unmapped: 81510400 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563437568 unmapped: 81510400 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563437568 unmapped: 81510400 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563437568 unmapped: 81510400 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563437568 unmapped: 81510400 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563437568 unmapped: 81510400 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 81502208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 81502208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 81502208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 81502208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 81502208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563453952 unmapped: 81494016 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563453952 unmapped: 81494016 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563462144 unmapped: 81485824 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 81477632 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 81477632 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 81477632 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 81477632 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 81477632 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 81477632 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 81469440 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 81469440 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563486720 unmapped: 81461248 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563486720 unmapped: 81461248 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563486720 unmapped: 81461248 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563486720 unmapped: 81461248 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563494912 unmapped: 81453056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595662 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563494912 unmapped: 81453056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563494912 unmapped: 81453056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563494912 unmapped: 81453056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563503104 unmapped: 81444864 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 350.994384766s of 352.413848877s, submitted: 54
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195945000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563503104 unmapped: 81444864 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595414 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563544064 unmapped: 81403904 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563544064 unmapped: 81403904 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563625984 unmapped: 81321984 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563650560 unmapped: 81297408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563658752 unmapped: 81289216 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563658752 unmapped: 81289216 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563658752 unmapped: 81289216 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563658752 unmapped: 81289216 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563658752 unmapped: 81289216 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563658752 unmapped: 81289216 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563658752 unmapped: 81289216 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563658752 unmapped: 81289216 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563666944 unmapped: 81281024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563666944 unmapped: 81281024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563666944 unmapped: 81281024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563666944 unmapped: 81281024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563666944 unmapped: 81281024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563666944 unmapped: 81281024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563666944 unmapped: 81281024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563666944 unmapped: 81281024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563675136 unmapped: 81272832 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563675136 unmapped: 81272832 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563675136 unmapped: 81272832 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563675136 unmapped: 81272832 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563675136 unmapped: 81272832 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563675136 unmapped: 81272832 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563683328 unmapped: 81264640 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563683328 unmapped: 81264640 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563683328 unmapped: 81264640 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563683328 unmapped: 81264640 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563683328 unmapped: 81264640 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563683328 unmapped: 81264640 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563691520 unmapped: 81256448 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563691520 unmapped: 81256448 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563691520 unmapped: 81256448 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563691520 unmapped: 81256448 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 81248256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 81248256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 81248256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 81248256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 81248256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 81248256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 81248256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563699712 unmapped: 81248256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563707904 unmapped: 81240064 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563707904 unmapped: 81240064 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563707904 unmapped: 81240064 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 234881024 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563707904 unmapped: 81240064 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563707904 unmapped: 81240064 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563707904 unmapped: 81240064 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563707904 unmapped: 81240064 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563707904 unmapped: 81240064 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563716096 unmapped: 81231872 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563724288 unmapped: 81223680 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563724288 unmapped: 81223680 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563724288 unmapped: 81223680 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563724288 unmapped: 81223680 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563724288 unmapped: 81223680 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563724288 unmapped: 81223680 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563724288 unmapped: 81223680 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563732480 unmapped: 81215488 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563732480 unmapped: 81215488 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563732480 unmapped: 81215488 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563732480 unmapped: 81215488 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563732480 unmapped: 81215488 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563732480 unmapped: 81215488 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563740672 unmapped: 81207296 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563740672 unmapped: 81207296 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 81199104 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 81199104 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 81199104 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 81199104 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 81199104 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 81199104 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563748864 unmapped: 81199104 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563757056 unmapped: 81190912 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563757056 unmapped: 81190912 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563757056 unmapped: 81190912 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563757056 unmapped: 81190912 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563757056 unmapped: 81190912 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563765248 unmapped: 81182720 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563765248 unmapped: 81182720 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563765248 unmapped: 81182720 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563765248 unmapped: 81182720 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563781632 unmapped: 81166336 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563781632 unmapped: 81166336 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563781632 unmapped: 81166336 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563781632 unmapped: 81166336 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563781632 unmapped: 81166336 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563781632 unmapped: 81166336 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563781632 unmapped: 81166336 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563781632 unmapped: 81166336 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563781632 unmapped: 81166336 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563789824 unmapped: 81158144 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563789824 unmapped: 81158144 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563789824 unmapped: 81158144 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563789824 unmapped: 81158144 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563789824 unmapped: 81158144 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563789824 unmapped: 81158144 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563789824 unmapped: 81158144 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563798016 unmapped: 81149952 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563798016 unmapped: 81149952 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563798016 unmapped: 81149952 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563798016 unmapped: 81149952 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563798016 unmapped: 81149952 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563798016 unmapped: 81149952 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563798016 unmapped: 81149952 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563798016 unmapped: 81149952 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563814400 unmapped: 81133568 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563814400 unmapped: 81133568 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563814400 unmapped: 81133568 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563814400 unmapped: 81133568 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563814400 unmapped: 81133568 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563814400 unmapped: 81133568 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563814400 unmapped: 81133568 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563822592 unmapped: 81125376 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563822592 unmapped: 81125376 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563822592 unmapped: 81125376 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563822592 unmapped: 81125376 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563822592 unmapped: 81125376 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563822592 unmapped: 81125376 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563822592 unmapped: 81125376 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563830784 unmapped: 81117184 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563847168 unmapped: 81100800 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563847168 unmapped: 81100800 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563847168 unmapped: 81100800 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563847168 unmapped: 81100800 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563847168 unmapped: 81100800 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563847168 unmapped: 81100800 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563847168 unmapped: 81100800 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563855360 unmapped: 81092608 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563855360 unmapped: 81092608 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563855360 unmapped: 81092608 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563855360 unmapped: 81092608 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563863552 unmapped: 81084416 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563863552 unmapped: 81084416 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563863552 unmapped: 81084416 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563863552 unmapped: 81084416 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563863552 unmapped: 81084416 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563879936 unmapped: 81068032 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563879936 unmapped: 81068032 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563879936 unmapped: 81068032 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563879936 unmapped: 81068032 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563879936 unmapped: 81068032 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563879936 unmapped: 81068032 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563879936 unmapped: 81068032 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563888128 unmapped: 81059840 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563896320 unmapped: 81051648 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563896320 unmapped: 81051648 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563896320 unmapped: 81051648 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563896320 unmapped: 81051648 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563896320 unmapped: 81051648 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563896320 unmapped: 81051648 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563896320 unmapped: 81051648 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563912704 unmapped: 81035264 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563912704 unmapped: 81035264 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563912704 unmapped: 81035264 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563912704 unmapped: 81035264 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563912704 unmapped: 81035264 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563912704 unmapped: 81035264 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563920896 unmapped: 81027072 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563920896 unmapped: 81027072 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563920896 unmapped: 81027072 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563929088 unmapped: 81018880 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563937280 unmapped: 81010688 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563937280 unmapped: 81010688 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563937280 unmapped: 81010688 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563937280 unmapped: 81010688 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563937280 unmapped: 81010688 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563937280 unmapped: 81010688 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563937280 unmapped: 81010688 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563945472 unmapped: 81002496 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563945472 unmapped: 81002496 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563945472 unmapped: 81002496 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563945472 unmapped: 81002496 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563945472 unmapped: 81002496 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563953664 unmapped: 80994304 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563953664 unmapped: 80994304 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563961856 unmapped: 80986112 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563961856 unmapped: 80986112 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563961856 unmapped: 80986112 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563961856 unmapped: 80986112 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563961856 unmapped: 80986112 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563970048 unmapped: 80977920 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563986432 unmapped: 80961536 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563986432 unmapped: 80961536 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563986432 unmapped: 80961536 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563986432 unmapped: 80961536 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563994624 unmapped: 80953344 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563994624 unmapped: 80953344 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563994624 unmapped: 80953344 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 563994624 unmapped: 80953344 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564002816 unmapped: 80945152 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564019200 unmapped: 80928768 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564019200 unmapped: 80928768 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564019200 unmapped: 80928768 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564019200 unmapped: 80928768 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564019200 unmapped: 80928768 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564019200 unmapped: 80928768 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564019200 unmapped: 80928768 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564019200 unmapped: 80928768 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564035584 unmapped: 80912384 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564043776 unmapped: 80904192 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564051968 unmapped: 80896000 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564060160 unmapped: 80887808 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564076544 unmapped: 80871424 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564076544 unmapped: 80871424 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564076544 unmapped: 80871424 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564076544 unmapped: 80871424 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564084736 unmapped: 80863232 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564092928 unmapped: 80855040 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564101120 unmapped: 80846848 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564117504 unmapped: 80830464 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564125696 unmapped: 80822272 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564125696 unmapped: 80822272 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564125696 unmapped: 80822272 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564133888 unmapped: 80814080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564133888 unmapped: 80814080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564133888 unmapped: 80814080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564133888 unmapped: 80814080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564133888 unmapped: 80814080 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564150272 unmapped: 80797696 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564150272 unmapped: 80797696 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564150272 unmapped: 80797696 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564150272 unmapped: 80797696 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564150272 unmapped: 80797696 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564150272 unmapped: 80797696 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564158464 unmapped: 80789504 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564166656 unmapped: 80781312 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564166656 unmapped: 80781312 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564166656 unmapped: 80781312 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564166656 unmapped: 80781312 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564166656 unmapped: 80781312 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564166656 unmapped: 80781312 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564166656 unmapped: 80781312 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564174848 unmapped: 80773120 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564174848 unmapped: 80773120 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564183040 unmapped: 80764928 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564183040 unmapped: 80764928 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564183040 unmapped: 80764928 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564183040 unmapped: 80764928 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564183040 unmapped: 80764928 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564191232 unmapped: 80756736 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564191232 unmapped: 80756736 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564191232 unmapped: 80756736 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564191232 unmapped: 80756736 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564191232 unmapped: 80756736 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564199424 unmapped: 80748544 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564199424 unmapped: 80748544 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564199424 unmapped: 80748544 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564199424 unmapped: 80748544 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564215808 unmapped: 80732160 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564215808 unmapped: 80732160 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564224000 unmapped: 80723968 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564224000 unmapped: 80723968 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564224000 unmapped: 80723968 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564224000 unmapped: 80723968 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564224000 unmapped: 80723968 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564224000 unmapped: 80723968 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564224000 unmapped: 80723968 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564224000 unmapped: 80723968 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564232192 unmapped: 80715776 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564232192 unmapped: 80715776 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564232192 unmapped: 80715776 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564232192 unmapped: 80715776 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564232192 unmapped: 80715776 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564232192 unmapped: 80715776 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564232192 unmapped: 80715776 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564248576 unmapped: 80699392 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564248576 unmapped: 80699392 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564248576 unmapped: 80699392 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564248576 unmapped: 80699392 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564248576 unmapped: 80699392 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564248576 unmapped: 80699392 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564248576 unmapped: 80699392 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564248576 unmapped: 80699392 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564256768 unmapped: 80691200 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564256768 unmapped: 80691200 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564264960 unmapped: 80683008 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564264960 unmapped: 80683008 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564264960 unmapped: 80683008 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564264960 unmapped: 80683008 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564264960 unmapped: 80683008 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564264960 unmapped: 80683008 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564273152 unmapped: 80674816 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564273152 unmapped: 80674816 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564273152 unmapped: 80674816 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564273152 unmapped: 80674816 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564273152 unmapped: 80674816 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564273152 unmapped: 80674816 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564289536 unmapped: 80658432 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564297728 unmapped: 80650240 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564305920 unmapped: 80642048 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564305920 unmapped: 80642048 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564305920 unmapped: 80642048 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564314112 unmapped: 80633856 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564314112 unmapped: 80633856 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564314112 unmapped: 80633856 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564314112 unmapped: 80633856 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564314112 unmapped: 80633856 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564330496 unmapped: 80617472 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564330496 unmapped: 80617472 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564330496 unmapped: 80617472 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564330496 unmapped: 80617472 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564330496 unmapped: 80617472 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564330496 unmapped: 80617472 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564338688 unmapped: 80609280 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564338688 unmapped: 80609280 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564346880 unmapped: 80601088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564346880 unmapped: 80601088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564346880 unmapped: 80601088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564346880 unmapped: 80601088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564346880 unmapped: 80601088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564346880 unmapped: 80601088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564346880 unmapped: 80601088 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564363264 unmapped: 80584704 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564371456 unmapped: 80576512 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564371456 unmapped: 80576512 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564371456 unmapped: 80576512 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564371456 unmapped: 80576512 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564371456 unmapped: 80576512 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564379648 unmapped: 80568320 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564379648 unmapped: 80568320 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564379648 unmapped: 80568320 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564387840 unmapped: 80560128 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564387840 unmapped: 80560128 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564387840 unmapped: 80560128 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564387840 unmapped: 80560128 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564387840 unmapped: 80560128 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564387840 unmapped: 80560128 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564387840 unmapped: 80560128 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564387840 unmapped: 80560128 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564404224 unmapped: 80543744 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564404224 unmapped: 80543744 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564404224 unmapped: 80543744 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564412416 unmapped: 80535552 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564412416 unmapped: 80535552 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564412416 unmapped: 80535552 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564412416 unmapped: 80535552 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564412416 unmapped: 80535552 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564420608 unmapped: 80527360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564420608 unmapped: 80527360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564420608 unmapped: 80527360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564420608 unmapped: 80527360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564420608 unmapped: 80527360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564420608 unmapped: 80527360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564420608 unmapped: 80527360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564420608 unmapped: 80527360 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564428800 unmapped: 80519168 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564428800 unmapped: 80519168 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564436992 unmapped: 80510976 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564445184 unmapped: 80502784 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564453376 unmapped: 80494592 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564453376 unmapped: 80494592 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564453376 unmapped: 80494592 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 80478208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 80478208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 80478208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 80478208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 80478208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 80478208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 80478208 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564477952 unmapped: 80470016 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564477952 unmapped: 80470016 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564486144 unmapped: 80461824 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564486144 unmapped: 80461824 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564486144 unmapped: 80461824 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564486144 unmapped: 80461824 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564494336 unmapped: 80453632 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564494336 unmapped: 80453632 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564494336 unmapped: 80453632 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564502528 unmapped: 80445440 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564510720 unmapped: 80437248 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564510720 unmapped: 80437248 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564510720 unmapped: 80437248 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564510720 unmapped: 80437248 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564510720 unmapped: 80437248 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564510720 unmapped: 80437248 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564510720 unmapped: 80437248 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564510720 unmapped: 80437248 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564518912 unmapped: 80429056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564518912 unmapped: 80429056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564518912 unmapped: 80429056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564518912 unmapped: 80429056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564518912 unmapped: 80429056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564518912 unmapped: 80429056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564518912 unmapped: 80429056 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564535296 unmapped: 80412672 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564551680 unmapped: 80396288 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564559872 unmapped: 80388096 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564559872 unmapped: 80388096 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564559872 unmapped: 80388096 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564559872 unmapped: 80388096 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564559872 unmapped: 80388096 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564559872 unmapped: 80388096 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564559872 unmapped: 80388096 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564568064 unmapped: 80379904 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564568064 unmapped: 80379904 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564568064 unmapped: 80379904 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564568064 unmapped: 80379904 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564568064 unmapped: 80379904 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564568064 unmapped: 80379904 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564568064 unmapped: 80379904 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564568064 unmapped: 80379904 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564576256 unmapped: 80371712 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564576256 unmapped: 80371712 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564576256 unmapped: 80371712 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564576256 unmapped: 80371712 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564576256 unmapped: 80371712 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564576256 unmapped: 80371712 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564584448 unmapped: 80363520 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564592640 unmapped: 80355328 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564592640 unmapped: 80355328 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564592640 unmapped: 80355328 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564592640 unmapped: 80355328 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564592640 unmapped: 80355328 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564617216 unmapped: 80330752 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564617216 unmapped: 80330752 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564617216 unmapped: 80330752 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564617216 unmapped: 80330752 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564617216 unmapped: 80330752 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564617216 unmapped: 80330752 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564617216 unmapped: 80330752 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564617216 unmapped: 80330752 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564625408 unmapped: 80322560 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564625408 unmapped: 80322560 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564633600 unmapped: 80314368 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564633600 unmapped: 80314368 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564641792 unmapped: 80306176 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564641792 unmapped: 80306176 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564641792 unmapped: 80306176 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564641792 unmapped: 80306176 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564641792 unmapped: 80306176 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564641792 unmapped: 80306176 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564641792 unmapped: 80306176 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564649984 unmapped: 80297984 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564658176 unmapped: 80289792 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564658176 unmapped: 80289792 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564658176 unmapped: 80289792 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564658176 unmapped: 80289792 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564658176 unmapped: 80289792 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564658176 unmapped: 80289792 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564666368 unmapped: 80281600 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564674560 unmapped: 80273408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564674560 unmapped: 80273408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564674560 unmapped: 80273408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564674560 unmapped: 80273408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564674560 unmapped: 80273408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564674560 unmapped: 80273408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564674560 unmapped: 80273408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564674560 unmapped: 80273408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564690944 unmapped: 80257024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564690944 unmapped: 80257024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564690944 unmapped: 80257024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564690944 unmapped: 80257024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564690944 unmapped: 80257024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564690944 unmapped: 80257024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564690944 unmapped: 80257024 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564699136 unmapped: 80248832 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564699136 unmapped: 80248832 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564715520 unmapped: 80232448 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564723712 unmapped: 80224256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564723712 unmapped: 80224256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564723712 unmapped: 80224256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564723712 unmapped: 80224256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564723712 unmapped: 80224256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564723712 unmapped: 80224256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564723712 unmapped: 80224256 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564740096 unmapped: 80207872 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564740096 unmapped: 80207872 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564740096 unmapped: 80207872 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564740096 unmapped: 80207872 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564740096 unmapped: 80207872 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564740096 unmapped: 80207872 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564748288 unmapped: 80199680 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564748288 unmapped: 80199680 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564756480 unmapped: 80191488 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564756480 unmapped: 80191488 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564756480 unmapped: 80191488 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564756480 unmapped: 80191488 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564756480 unmapped: 80191488 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564756480 unmapped: 80191488 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564756480 unmapped: 80191488 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564756480 unmapped: 80191488 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564764672 unmapped: 80183296 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564764672 unmapped: 80183296 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564764672 unmapped: 80183296 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564772864 unmapped: 80175104 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564781056 unmapped: 80166912 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564781056 unmapped: 80166912 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564781056 unmapped: 80166912 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564781056 unmapped: 80166912 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564789248 unmapped: 80158720 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564789248 unmapped: 80158720 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564789248 unmapped: 80158720 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564789248 unmapped: 80158720 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564789248 unmapped: 80158720 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564789248 unmapped: 80158720 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564789248 unmapped: 80158720 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 7201.0 total, 600.0 interval
Cumulative writes: 80K writes, 322K keys, 80K commit groups, 1.0 writes per commit group, ingest: 0.32 GB, 0.05 MB/s
Cumulative WAL: 80K writes, 29K syncs, 2.68 writes per sync, written: 0.32 GB, 0.05 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 478 writes, 740 keys, 478 commit groups, 1.0 writes per commit group, ingest: 0.24 MB, 0.00 MB/s
Interval WAL: 478 writes, 235 syncs, 2.03 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564797440 unmapped: 80150528 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564797440 unmapped: 80150528 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: bluestore.MempoolThread(0x56127d0e3b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5595342 data_alloc: 218103808 data_used: 11075584
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564805632 unmapped: 80142336 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564805632 unmapped: 80142336 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'config diff' '{prefix=config diff}'
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'config show' '{prefix=config show}'
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564674560 unmapped: 80273408 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'counter dump' '{prefix=counter dump}'
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'counter schema' '{prefix=counter schema}'
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564477952 unmapped: 80470016 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: mgrc ms_handle_reset ms_handle_reset con 0x561280fcd400
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3158772141
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3158772141,v1:192.168.122.100:6801/3158772141]
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: mgrc handle_mgr_configure stats_period=5
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: osd.2 430 heartbeat osd_stat(store_statfs(0x195947000/0x0/0x1bfc00000, data 0x2ae1514/0x2d17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2759f9c6), peers [0,1] op hist [])
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: prioritycache tune_memory target: 4294967296 mapped: 564461568 unmapped: 80486400 heap: 644947968 old mem: 2845415832 new mem: 2845415832
Oct  2 09:34:14 np0005465988 ceph-osd[79039]: do_command 'log dump' '{prefix=log dump}'
Oct  2 09:34:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Oct  2 09:34:14 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4278751497' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct  2 09:34:14 np0005465988 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 09:34:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:34:14.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:14 np0005465988 nova_compute[236126]: 2025-10-02 13:34:14.473 2 DEBUG oslo_service.periodic_task [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:34:14 np0005465988 nova_compute[236126]: 2025-10-02 13:34:14.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 09:34:14 np0005465988 nova_compute[236126]: 2025-10-02 13:34:14.474 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 09:34:14 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:14 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:34:14 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:34:14.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:34:14 np0005465988 nova_compute[236126]: 2025-10-02 13:34:14.600 2 DEBUG nova.compute.manager [None req-d60008bb-0e37-49fb-babb-308fbc7c2269 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  2 09:34:14 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Oct  2 09:34:14 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/771179353' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct  2 09:34:15 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Oct  2 09:34:15 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3386432754' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Oct  2 09:34:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Oct  2 09:34:16 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2866728388' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Oct  2 09:34:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:34:16.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:16 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:16 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:16 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:34:16.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:16 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Oct  2 09:34:16 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1087908314' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Oct  2 09:34:17 np0005465988 nova_compute[236126]: 2025-10-02 13:34:17.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:34:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Oct  2 09:34:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/182568512' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Oct  2 09:34:17 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Oct  2 09:34:17 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2380963739' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Oct  2 09:34:18 np0005465988 nova_compute[236126]: 2025-10-02 13:34:18.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:34:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:34:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Oct  2 09:34:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1043042639' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Oct  2 09:34:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Oct  2 09:34:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4053814205' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Oct  2 09:34:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:34:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:34:18.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:34:18 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:18 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:34:18 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:34:18.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:34:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Oct  2 09:34:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/18255196' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Oct  2 09:34:18 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Oct  2 09:34:18 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3114995335' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Oct  2 09:34:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0) v1
Oct  2 09:34:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2127147942' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Oct  2 09:34:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Oct  2 09:34:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3763429194' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Oct  2 09:34:19 np0005465988 systemd[1]: Starting Hostname Service...
Oct  2 09:34:19 np0005465988 systemd[1]: Started Hostname Service.
Oct  2 09:34:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Oct  2 09:34:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3509026750' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct  2 09:34:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Oct  2 09:34:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/92235263' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Oct  2 09:34:19 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0) v1
Oct  2 09:34:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2222229722' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Oct  2 09:34:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct  2 09:34:19 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct  2 09:34:20 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Oct  2 09:34:20 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/577632163' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Oct  2 09:34:20 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct  2 09:34:20 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct  2 09:34:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:34:20.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:20 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:20 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:20 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:34:20.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:21 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0) v1
Oct  2 09:34:21 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2813790633' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Oct  2 09:34:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0) v1
Oct  2 09:34:22 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1726994292' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Oct  2 09:34:22 np0005465988 nova_compute[236126]: 2025-10-02 13:34:22.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:34:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:34:22.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0) v1
Oct  2 09:34:22 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3762790924' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct  2 09:34:22 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:22 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:22 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:34:22.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:22 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0) v1
Oct  2 09:34:22 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1426765085' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Oct  2 09:34:23 np0005465988 nova_compute[236126]: 2025-10-02 13:34:23.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:34:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Oct  2 09:34:23 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct  2 09:34:23 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct  2 09:34:23 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0) v1
Oct  2 09:34:23 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4069982995' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Oct  2 09:34:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.001000029s ======
Oct  2 09:34:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.100 - anonymous [02/Oct/2025:13:34:24.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct  2 09:34:24 np0005465988 radosgw[82571]: ====== starting new request req=0x7fcca5eec6f0 =====
Oct  2 09:34:24 np0005465988 radosgw[82571]: ====== req done req=0x7fcca5eec6f0 op status=0 http_status=200 latency=0.000000000s ======
Oct  2 09:34:24 np0005465988 radosgw[82571]: beast: 0x7fcca5eec6f0: 192.168.122.102 - anonymous [02/Oct/2025:13:34:24.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct  2 09:34:24 np0005465988 ceph-mon[76355]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0) v1
Oct  2 09:34:24 np0005465988 ceph-mon[76355]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/564563942' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Oct  2 09:34:24 np0005465988 podman[365965]: 2025-10-02 13:34:24.698021686 +0000 UTC m=+0.064318883 container health_status daf94764b1f6357b8e1906c6ae258d2089a6b258ef2b88619df38c084b7c2501 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
